I work with a franchise network of SABs (service-area businesses), and I've had a few of them say they're getting phone calls from SEO vendors claiming that the zip code meta tag on their websites (which usually lists 20-100 different zip codes) is going to get them blacklisted by Google.

While I have my doubts about it getting them blacklisted, I'm open to the idea that there may be a better practice for local SEO.

My question is: does anyone here know whether Google still uses the zip code HTML meta tag? Is it a wasted effort, or does it still influence your ranking/relevance at the local level?
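For context, the tags in question look roughly like this (made-up example values; the exact names vary by site and CMS, and the geo.position/ICBM ones are the old geotagging convention rather than anything Google documents):

<!-- Example only: "zipcode" is not a standard meta name, and tag names vary by CMS. -->
<meta name="zipcode" content="84601, 84604, 84606, 84660">
<meta name="geo.region" content="US-UT">
<meta name="geo.placename" content="Provo, Utah">
<meta name="geo.position" content="40.2338;-111.6585">
<meta name="ICBM" content="40.2338, -111.6585">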
 
Great idea. The biggest problem we run into is that the CMS we're using doesn't give individual owners that kind of access (let alone the understanding of how to implement it).

It's a step in the right direction, though.

Thanks!
 
We have a thread going that discusses a similar topic (JSON-LD implementation) - https://www.localsearchforum.com/google-local/43511-json-schema.html

You could use Google Tag Manager to implement the different sets of JSON-LD markup and tell each one to fire only on the respective franchise page. It's a bit of a time investment to set up initially, depending on how many offices you have, but it's simple to manage afterward.
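For example, each location could get its own Custom HTML tag holding just its JSON-LD (placeholder business details below), fired by a Page View trigger where Page Path matches that franchise's page:

<!-- GTM Custom HTML tag for one franchise page (placeholder details). -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Franchise - Provo Office",
  "telephone": "+1-801-555-0100",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Provo",
    "addressRegion": "UT",
    "postalCode": "84601",
    "addressCountry": "US"
  }
}
</script>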

What industry is this client in?
 
We're in the carpet, upholstery, and tile cleaning space. Largest brand in the world, actually, but individual franchise owners use a variety of different website CMSes and platforms, so some of them do better than others.
 
Hardcoding stuff like that can be a PITA. Use Tag Manager so you don't need to change the back end each time, and update with schema.org JSON-LD markup.

You could use Organization, LocalBusiness, and Service markup.

Service - https://schema.org/Service

You could probably use the GeneralContractor type to be even more specific for this kind of business too - https://schema.org/GeneralContractor
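For one location, a rough sketch of that markup could look something like this (placeholder business details; GeneralContractor is a subtype of LocalBusiness, and areaServed accepts plain-text zip codes):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "GeneralContractor",
  "name": "Example Franchise - Carpet & Upholstery Cleaning",
  "url": "https://www.example.com/locations/provo",
  "telephone": "+1-801-555-0100",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Provo",
    "addressRegion": "UT",
    "postalCode": "84601",
    "addressCountry": "US"
  },
  "areaServed": ["84601", "84604", "84606", "84660"],
  "makesOffer": {
    "@type": "Offer",
    "itemOffered": {
      "@type": "Service",
      "name": "Carpet, upholstery, and tile cleaning"
    }
  }
}
</script>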

Let us know how it works out
 
Google Tag Manager won't work if you are trying to serve this from a single GTM container. There are practical limits on the number of rules and tags, plus an overall container size limit of 200 KB. It would be quite challenging to fit a valid rule set and address tables into that limit for more than a few dozen locations - plus, it would be a serious pain to update. Remember, ALL of the container's contents are downloaded and the rules are evaluated on the client side.

The good news is that GTM is not needed. You can load the JSON-LD dynamically from any external script. That is why JSON-LD works from GTM -- it's just injecting JavaScript; there's nothing special about GTM. See https://developers.google.com/search/docs/guides/intro-structured-data

"Also, Google can read JSON-LD data when it is dynamically injected into the page's contents, such as by JavaScript code or embedded widgets in your content management system."

All you need is an externally loaded JavaScript file that calls back to the server, pulls the relevant information for the page, and then injects the appropriate JSON-LD into the DOM.
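A minimal sketch of that script (the endpoint URL and response shape here are made up - it just assumes the server returns the finished JSON-LD for the requested path):

<script>
(function () {
  // Ask the server for the JSON-LD that matches the current page.
  // Hypothetical endpoint; in practice this is whatever service holds
  // the per-franchise data.
  var endpoint = 'https://www.example.com/api/jsonld?path=' +
    encodeURIComponent(window.location.pathname);

  var xhr = new XMLHttpRequest();
  xhr.open('GET', endpoint, true);
  xhr.onload = function () {
    if (xhr.status !== 200) { return; }
    // Inject the returned JSON-LD into the DOM as a script tag,
    // which is exactly what the quoted Google doc says it can read.
    var tag = document.createElement('script');
    tag.type = 'application/ld+json';
    tag.text = xhr.responseText;
    document.head.appendChild(tag);
  };
  xhr.send();
})();
</script>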

That said, the SEO people claiming Google is about to take some action are simply crooks attempting to defraud the franchisees. Period.

Google simply ignores meta tags it isn't interested in. Keeping the old-school ICBM/geo.position tags and ZIP tags on pages has zero effect. Considering how widespread they were, if Google were going to do something that drastic, they would announce it years in advance. As it is, they made one single statement several years ago that they ignore the meta keywords tag, and have said nothing else.
 
