Linda Buquet
Moderator
Local Search Expert
I've posted several times about rank tracking programs being off since that damn bird flew the coop.

The reason in part is that most ranking trackers are using Classic Google Maps.

As I've explained in post #2 of this thread, Classic uses the OLD pre-Pigeon algo, so the ranking order can be significantly different from today's post-Pigeon reality.

However, what I didn't realize was why the software companies can't just switch over to the NEW Google Maps, which is Pigeonized.

Myles from BrightLocal explains in a customer support thread here:

Ranking reports are incorrect for US-based companies

Customer said:

Has anyone else noticed that ranking reports are incorrect for US-based companies since the last Google update? We have sent several emails to support here on BL but have not received an answer. Can anyone comment on this, please?

Myles replied:

This issue was picked up by our team and is being worked on. We take the accuracy of our reports & data very seriously and monitor them very closely.

However, the significance of Google's latest change requires a very considered response from us. There are a number of factors at play which make the solution highly complex. I'll explain these to give you an understanding of them.

Our current tool tracks map rankings using the older version of Maps, which currently has a slightly different algo from the new Maps that Google uses. This change came about in the Pigeon update.

The reasons we haven't switched over to tracking the new Maps yet are as follows:

1. Time-based issue - as with most Google changes there is an initial change and then a bedding-down period. We are waiting to see how Pigeon beds down and don't want to make a significant change only to find out that it was either unnecessary or needs to change again with the next update.

2. No URL in listings - the new Maps listings don't carry a website URL, so we can't track based on website URL.

3. No Google+ Local URL - there is also no Google+ Local URL in the Maps listings.

Therefore the only way to identify a listing is by business name, which is not nearly as accurate. Most of our customers track using URLs, so they'll need to update their reports to add a business name.

4. JavaScript - the new Maps is 100% driven by JavaScript. This makes scraping results very difficult and requires a complete redevelopment of our Maps ranking engine.

With all these factors at play, it's a major decision to switch to tracking the new Maps, and we're holding off until we know it's what Google will stick with mid-to-long term.

Please let me know if you have any questions & I'm happy to expand on the above points if it would help.
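
To make points 2-4 concrete, here's a rough sketch of what a tracker is up against: render the JavaScript-driven Maps page in a real browser, then fall back to fuzzy business-name matching because there's no URL to key on. This is illustrative only - the CSS selector and threshold are assumptions, not BrightLocal's actual code:

```python
# Illustrative sketch only - the ".result-title" selector and the 0.85
# threshold are assumptions, not BrightLocal's actual implementation.
from difflib import SequenceMatcher

from selenium import webdriver               # new Maps is JS-rendered (point 4),
from selenium.webdriver.common.by import By  # so a real browser is needed

def name_similarity(a: str, b: str) -> float:
    """Fuzzy-compare two business names, ignoring case and extra whitespace."""
    norm = lambda s: " ".join(s.lower().split())
    return SequenceMatcher(None, norm(a), norm(b)).ratio()

def find_maps_rank(maps_url: str, target_name: str, threshold: float = 0.85):
    """Return the 1-based rank of the closest name match, or None.

    Because new Maps listings carry no website or Google+ Local URL
    (points 2 and 3), the business name is the only key available.
    """
    driver = webdriver.Firefox()
    try:
        driver.get(maps_url)
        # Placeholder selector for whatever element holds each listing's
        # name in the rendered list view.
        titles = [el.text for el in
                  driver.find_elements(By.CSS_SELECTOR, ".result-title")]
    finally:
        driver.quit()

    for rank, title in enumerate(titles, start=1):
        if name_similarity(title, target_name) >= threshold:
            return rank
    return None
```

Name matching is fragile in exactly the way Myles warns: "Joe's Pizza" and "Joe's Pizza & Subs" score as near-identical, so a rank can easily get attributed to the wrong listing.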

Thanks for the explanation, Myles - those are all really good points to know.

Hat tip to Joy for sharing this on G+.

For TONS more insights about that dirty bird, check out our #Pigeon hashtag.

What do you think???
 
Hi Linda. Thanks for re-sharing that post here and I am happy to answer any questions that your readers have.

I should point out that this doesn't affect organic rankings or the tracking of Maps/Google+ Local results in Google's 'web' search results. These are not hampered by the same issues - it's just the pure Maps results, as you see on a page such as this - https://www.google.com/maps/search/personal+injury+attorney

I'll bookmark this page and return to it with any updated info when I have it. And of course I'll answer questions in the meantime.
 
Thanks Myles, I appreciate it.

Yes, I should have clarified that this problem does not affect page 1 tracking in the local pack.

It only comes into play if you rank too low to be in the 3-pack or 7-pack; then the tracking software goes out to Maps to see where you rank, since page one stops there. There is no other way to see whether you rank #9 or #20 in local except to go out to Maps.
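
So the lookup logic a tracker follows is roughly this - a sketch of the flow Linda describes, with made-up example data, not any vendor's actual code:

```python
def local_rank(pack, maps_list, business):
    """Page 1 only shows the pack; deeper ranks must come from Maps."""
    if business in pack:                  # visible in the 3- or 7-pack
        return pack.index(business) + 1
    if business in maps_list:             # only Maps reveals #8, #9, #20...
        return maps_list.index(business) + 1
    return None                           # not ranking at all

# Hypothetical data: our client is #9, so the pack misses them entirely.
pack = ["Acme Law", "Smith & Co", "Jones LLP"]
maps_list = pack + ["Firm D", "Firm E", "Firm F",
                    "Firm G", "Firm H", "Our Client"]
print(local_rank(pack, maps_list, "Our Client"))   # -> 9
```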
 
Thanks Darren, good to know!

I think Mark from Places Scout figured out a workaround and has accurate results too. Can't remember exactly how they're doing it, though.
 
The bigger problem we see is different data centres returning different results (as you so astutely pointed out). One day you'll be ranking #1, and the next day you're nowhere to be found. It's really bizarre and results in a lot of confusion and questions from our users.
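
One way to see that flux for yourself is to sample the same query repeatedly and look at the spread; `fetch_rank` below is a hypothetical stand-in for whatever scraping routine you use:

```python
import statistics

def rank_spread(fetch_rank, samples=10):
    """Run the same rank check several times; different runs may hit
    different data centres and return different positions."""
    ranks = [r for r in (fetch_rank() for _ in range(samples))
             if r is not None]
    if not ranks:
        return None  # never found - the "nowhere to be found" days
    return {"best": min(ranks), "worst": max(ranks),
            "median": statistics.median(ranks)}
```

A wide best-to-worst gap on the same day is a strong hint you're bouncing between data centres rather than actually moving in the rankings.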
 
Cray Cray Pigeon!

This is one of the many times I'm so glad I don't have SMB clients.

Must be so hard trying to help them understand it's nothing the consultant did, it's just this crazy unpredictable algo.
 
Yes, I too have seen that the new vs. old Maps rankings are wildly different. Many customers reported it, and it was hurting their reporting big time.

So we released the latest version, 2.8.0, of Places Scout, which supports gathering data from the new Google Maps and has an option to revert back to the old Maps if desired. It seems to be working very well based on customer feedback.

@Myles - FYI - the website URL and +Local page link are in the HTML of the results in the new Maps, though I'm not sure if they're populated post-JavaScript execution. Some results don't have either, but most of them do.
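
Mark's observation suggests a hybrid matching strategy: key on the website URL when the rendered HTML carries one, and fall back to the business name when it doesn't. A sketch with assumed field names, not Places Scout's actual code:

```python
def matches_listing(result: dict, target_url: str, target_name: str) -> bool:
    """Prefer the website URL when present; fall back to the name.

    'website' and 'name' are assumed field names for whatever your
    scraper extracts from each rendered Maps listing.
    """
    url = result.get("website")          # missing on some new-Maps results
    if url:
        return url.rstrip("/").lower() == target_url.rstrip("/").lower()
    return result["name"].strip().lower() == target_name.strip().lower()
```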

As Darren mentioned as well, I have seen that different data centers are still using different algos (based on pack analysis), though I suspect that soon all will be congruent and that the data center issue will only be apparent during rollouts of new algo updates.

One other big thing I am noticing between the new vs. the old Maps is how the new Maps changes the zoom factor dramatically when doing keyword-only searches with a set Google location. For example:

  • Search using just the keyword (no geo-modifier in the keyword)
  • Set a location
  • Click the 'Maps' link to go to the new Maps
  • Click the 'See Results in List View' link
  • As you start to page through the results, usually after page 3 or so, the map starts to zoom out dramatically outside of the location you set, so results start appearing from cities around your set Google location. This doesn't happen nearly as much in the old Maps.
  • However, if you include the geo-modifier in the keyword, the map doesn't zoom out or shift around nearly as much as it does when searching keyword-only.

Also, I have noticed that if you are logged into your Google account, the 'See Results in List View' link is no longer present and you have to page through the results in the drop-down box of results that first appears.

Just some more interesting observations... and I'm sure there will be plenty more to follow.
 
Thanks Mark, I had meant to ping you about this thread as I knew you'd have some great insights. Thanks for sharing all that - good to know Scout has the flexibility to work with the new or old Maps as needed.
 
@Mark - thanks for that very useful update. We do see the website link in the HTML, but not as frequently as we'd like for reliable rank matching. I have also seen that link come and go, which is strange. Not sure if that's another data center issue impacting results.

Certainly the inconsistency between data centers makes running consistent reports much trickier. We don't see much variance in organic & pack results - just the pure Maps results. Are you seeing the same thing in your tests?


@Linda - with your years of Google-watching experience, what's your expectation about these data center inconsistencies? Do you think Google will roll out the same algo across all data centers and these inconsistencies will disappear or be reduced?

And what sort of time frame would you expect for this to happen after a big update like Pigeon?

Any insights you can share about the thought process Google goes through when doing an update would be very interesting (please point me to an existing post if you've covered this before - I couldn't find one :) )

Thanks Linda
 

Hi Myles,

I don't really consider myself a Google update expert, but honestly I thought this crazy flux would last 3 weeks max. With most previous updates there seems to be some flakiness for just a couple of weeks as the algo settles in. But I've never seen flux this crazy or for this long.

My best guess is either: A) they are still testing, or B) this algo is one that needs to be trained and it's taking a while.

No idea how long it will last because, like I said, if I had to bet on 7/24 how long it would be in flux, I would have guessed 3 weeks.

But yes, once it's tested or trained, as the case may be, I expect it to settle into something consistent across Google Maps, browsers, and data centers.
 
