I have to admit this is way over my head and I don't have time to dig in... But I think it could shed some light on Pigeon.
CCarter, can you summarize here what you are trying to explain there, since it all has to do with local? Not the part about you and the competition, just what the SERPs are doing.
Yeah, it's a bit involved since it draws upon several factors from previous conversations and analysis.
First let me go into monitoring and the volatility metrics for background. We monitor the volatility of each keyword and display it for users in the main interface when looking at the SERPs.
The volatility score is calculated as 1 movement up or down equaling 1% of the top 10 results of a keyword (even though we monitor the top 20 to 28, the volatility metric only takes into account the top 10). So if the URL in the #1 position moved to #10, that would be a 9% movement - and vice versa, if the URL at #10 moved to #1, that would be a 9% movement. That score would be if only 1 URL moved compared to the previous day's positions. As more URLs move up and down for that keyword, the total volatility number increases: if #1 moves to #10 that's 9%, and then #2 moves to #8 that's 6%. If nothing else moved in that day's comparison to the previous day, the total for that example would be 15% (9% + 6%).
Now if ANYTHING drops out of the top 10, that's automatically the MAX of 10% - so if #1 goes below the #10 spot (#11 to infinity), that's a total of 10%. We keep it simple - so if ALL of the top 10 drop out of the top 10 results, that would equate to 100% volatility.
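To make the scoring rule concrete, here's a minimal sketch in Python of how a score like this could be computed, assuming each day's top 10 is stored as an ordered list of URLs. The function name and data layout are mine for illustration, not SERPWoo's actual code:

```python
def daily_volatility(yesterday_top10, today_top10):
    """Sketch of the volatility score described above.

    Each URL that stays in the top 10 contributes 1% per position moved;
    any URL that falls out of the top 10 contributes the 10% maximum.
    Both inputs are lists of URLs ordered from position #1 to #10.
    """
    score = 0
    for old_pos, url in enumerate(yesterday_top10, start=1):
        if url in today_top10:
            new_pos = today_top10.index(url) + 1
            score += abs(old_pos - new_pos)   # 1% per position moved
        else:
            score += 10                       # dropped out of the top 10: max 10%
    return score                              # 0-100, read as a percentage

# Two URLs swapping #1 and #2 -> 1% + 1% = 2% volatility for the day.
yesterday = ["url1", "url2", "url3", "url4", "url5", "url6", "url7", "url8", "url9", "url10"]
today     = ["url2", "url1", "url3", "url4", "url5", "url6", "url7", "url8", "url9", "url10"]
print(daily_volatility(yesterday, today))     # 2
```

If all 10 URLs drop out, each contributes the 10% max, which is where the 100% ceiling comes from.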
I normally don't like seeing anything above 20% - unless a major update is happening - because that would mean lots of strong movement. If you see 7-day averages of 20% or more, that would be the equivalent of two URLs in the top 10 dropping out of the top 10 EVERY DAY for 7 days - that's really incredible in my opinion.
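To put that 7-day threshold in numbers (purely illustrative, not real data): two URLs dropping out of the top 10 contribute 2 x 10% = 20% on a given day, so seven straight days of that - ignoring any other shuffling - averages out to exactly 20%:

```python
# Hypothetical numbers only: two top-10 drop-outs per day = 2 * 10% = 20% daily.
daily_scores = [20] * 7
seven_day_avg = sum(daily_scores) / len(daily_scores)
print(seven_day_avg)   # 20.0 -- the level I consider really incredible
```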
To give a comparison, the keyword "Garcinia Cambogia" has a running 28-day average of 13% volatility - that's a manually monitored keyword. "Cheap Viagra" has a 28-day average of 17% - that's a highly spammy and volatile niche. "Replica Gucci" is the most competitive niche I've seen in my life, but has an average of only 13% volatility. For a visual, here is a screenshot of the last 300 days of 'replica gucci'. All that white area is URLs which are no longer ranking:
To see the URLs from the white area, set the filter to "All URLs within the View Range" - you'll see there are so many URLs that the chart is completely unreadable (the reason for the focus date):
So in that date range, there have been tons of URLs ranking and then disappearing - lots of activity, spammers and blackhats fighting one another, yet averaging less than 20%. So when I say above 20% is high, that's REALLY high movement for the top 10, since even the most volatile niches aren't seeing averages above 20%.
So we have a global volatility metric - soon available to all users - where we are able to see the volatility of all the SERPs we are monitoring in SERPWoo. Here is a screenshot of today's:
It's worth noting Moz uses the same methodology, except they only monitor 1,000 selected, private keywords for their "Mozcast". Here is a screenshot of today's Mozcast:
Algoroo.com is another Google monitoring service that monitors an "unknown" number of keywords and does not state HOW they come up with their metrics, but here is their screenshot of today:
Unlike us, neither Mozcast nor Algoroo displays today's data; whether that's bad or good is still to be determined, since we have a running progress chart.
So that ends the "rounding out the whole picture" of volatility. Now to come to what I was talking about at BHU (blackhatunderground dot net): we implemented volatility metrics for each project, so users can group keywords together - for clients, related terms, or whatever - and see the volatility for that set of data, like so:
^^ If you look closely, you'll notice something is way off when comparing the Celebrities and Christmas Holiday projects versus these "pseudo local terms". That's what the BHU post was focusing on, because those terms are completely out of control. Now looking at the original Global Volatility screenshot of SERPWoo, you'll notice this update started on August 25th (I'm not sure if it's Google Pigeon related, but it's movement seen in SERPWoo, Mozcast, and Algoroo around the 25th to the 28th - SW just seems to be picking up more sensitivity, possibly due to our userbase, or because we calculate based on all data versus a selected set like the other guys).
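For those curious how a project-level number like that could be rolled up, here's a rough sketch assuming it's simply the per-keyword volatility averaged across whatever keywords you've grouped into a project; the function and numbers are illustrative only, not our internal code or real data:

```python
def project_volatility(keyword_scores):
    """Average the latest daily volatility score across a project's keywords.

    `keyword_scores` maps keyword -> list of daily volatility percentages,
    oldest to newest.
    """
    latest = [scores[-1] for scores in keyword_scores.values() if scores]
    return sum(latest) / len(latest) if latest else 0.0

# Made-up numbers purely for illustration.
pseudo_local_project = {
    "plumbers":      [22, 85, 91],
    "garage doors":  [18, 78, 88],
    "phone service": [15, 90, 94],
}
print(project_volatility(pseudo_local_project))   # 91.0 -- completely out of control
```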
On 8/25, these "pseudo local terms" all started going crazy. First, what are "pseudo local terms"? They are terms that are entered nationally or globally but should ALWAYS, 100% of the time, really return local results. "Pseudo local terms" include localized and specialized repairmen - think air conditioning, generator repair, plumbers, garage doors, windshield repair. They also include specialized attorney keyword phrases - think injury or ticket attorneys - and other regional services like cellphone service, internet service, phone service, etc.
Now it needs to be pointed out that the reason we use "pseudo" is because there is no location-based word inputted by users when searching: if they are in Colorado looking for phone service, they aren't typing in "Denver phone service" or "Colorado phone service", they are simply typing in "phone service" - generic and broad.
Whenever someone types in these "pseudo local terms" in incognito mode - and this is where you have to pay close attention to what I am saying - Google will at times return localized results in incognito mode, even with pws=0 and all those crazy local-neutralizing variables - DURING AN UPDATE.
Now, when SW pulls data from Google, we use a variety of IPs and double- and triple-check rankings in a given pull for accuracy. So if we see something off, the system will check and re-check just to make sure those are indeed the results. "Pseudo local terms" are really special outlier terms if you think about it from a macro level: they should always return regional results so the user has the best experience, but we as a crawler tell Google not to - and it complies, except when these updates happen.
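Conceptually, the check-and-re-check idea looks something like the sketch below. `fetch_top10` is a hypothetical stand-in for whatever actually pulls the results (with personalization neutralized, e.g. pws=0); the majority-vote logic and retry counts are my illustration, not the real crawler:

```python
from collections import Counter

def fetch_top10(keyword, ip):
    """Hypothetical stand-in for pulling the top 10 results for a keyword
    from a given IP with localization/personalization neutralized."""
    raise NotImplementedError

def verified_top10(keyword, ips, min_agreement=2):
    """Pull the same keyword from several IPs and keep the ranking the
    pulls agree on; anything that looks off gets re-checked."""
    pulls = [tuple(fetch_top10(keyword, ip)) for ip in ips]
    ranking, votes = Counter(pulls).most_common(1)[0]
    if votes < min_agreement:
        # Pulls disagree -- re-check before trusting what came back.
        pulls += [tuple(fetch_top10(keyword, ip)) for ip in ips]
        ranking, votes = Counter(pulls).most_common(1)[0]
    return list(ranking)
```

When even the re-checked pulls keep returning localized results, that's how we know it isn't a fluke of a single IP or a single request.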
Here are the 4 terms I'm monitoring in my "pseudo local terms" project:
--
--
--
Here you are seeing those spikes occur. There is the regular data, or as I called them, the "old folks", and then the new areas (white areas, because they are not ranking on the set focus date - today, Sept 12th, 2014), where those pulls were including some localized results even though we specified not to.
Those spikes are so disruptive they are causing volatility up to 90%+ - that's serious. If you were a website in those rankings you'd see a constant flux of traffic day by day - HOWEVER, remember that since these "pseudo local terms" are in a special category, websites are getting regional searches already, so they will not necessarily see a flux in their traffic in THIS special case. The flux is happening in what Google is returning, but since "pseudo local terms" really return regional results 99% of the time, webmasters won't see this. This is one of those outliers of the algo which we are able to see, but it doesn't really affect a webmaster at their level, since they get regional traffic anyways and searchers are getting personalized results. We are seeing an anomaly where we can detect when a Google update is occurring across the map, because of this one 'glitch'.
I think the anomaly is due to databases fighting with one another: there is missing data that has to be accounted for when pulling the results, so they fill in data based on your IP address. Now you'll notice that in the last two days the spikes have leveled off in all 4 examples. This is the first time I've seen this, and it could indicate that the major update that started on 8/25/2014 is finally coming to an end - or has ended, and Google has turned off the "switch" for the update.
What's important to understand is: if these were not "pseudo local terms", and your website was ranking nationally and your SERPs were seeing this, you'd be getting traffic one day, then nothing the next - a constant pendulum back and forth, which is crazy for any business. Looking at the volatility pie charts for the individual keywords, you are able to see the crazy averages and movements that NO website owner wants to see. That's why I say anything above 20% is really out-of-control SERPs, and you might as well figure something else out for traffic, because there is no chance. But since the "pseudo local terms" are regionalized, webmasters aren't seeing that flux.
So bottom line: it appears that we can see when Google is pushing an update through their system, through this crack/anomaly in their settings. This is a great thing for us and our users.
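As a toy sketch of what "seeing the update" could look like in practice: watch the pseudo-local project's daily volatility and flag any stretch of days that stays way above the normal range. The threshold, run length, and series here are made-up numbers, not what we actually use:

```python
def flag_update_windows(daily_vol, spike=50, run=2):
    """Return the dates inside any stretch of `run`+ consecutive days whose
    volatility is at or above `spike` percent.

    `daily_vol` is a list of (date, volatility %) pairs in chronological order.
    """
    flagged, streak = [], []
    for date, vol in daily_vol:
        if vol >= spike:
            streak.append(date)
        else:
            if len(streak) >= run:
                flagged.extend(streak)
            streak = []
    if len(streak) >= run:
        flagged.extend(streak)
    return flagged

# Made-up series: quiet, then the kind of spike we saw starting 8/25.
series = [("8/23", 14), ("8/24", 16), ("8/25", 72), ("8/26", 90), ("8/27", 88), ("8/28", 35)]
print(flag_update_windows(series))   # ['8/25', '8/26', '8/27']
```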
Another thing we talked about in that BHU post was the ability to filter, so if people want to see what is ranking, and why, at the moment, they can filter the results accordingly. So if you think higher PR is a factor, or lower Alexa, or backlinks or social signals, you can filter with our pre-determined filters (soon to be customizable). Here is an example of the very first "pseudo local term" filtered for URLs WITH social signals:
This shows there is only 1 URL ranking for the term with social signals, and even that is simply 1 LinkedIn share. So the majority of the ranking URLs do not have ANY social signals whatsoever:
This might just be an anomaly, and users are able to make their own determination whether social signals make a difference for their niche, keyword, or industry, or whether they don't. In this case, social signals aren't a ranking factor for any of the URLs which are ranking, except the one with the single LinkedIn share.
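To make the filtering idea concrete, here's a tiny sketch of the "URLs WITH social signals" filter; the records and field names are invented for illustration, not SERPWoo's actual schema or data:

```python
# Invented example records -- not real SERPWoo data or field names.
serp = [
    {"position": 1, "url": "example-repair-site.com", "facebook": 0, "twitter": 0, "linkedin": 0},
    {"position": 2, "url": "another-repair-site.net", "facebook": 0, "twitter": 0, "linkedin": 1},
    {"position": 3, "url": "third-repair-site.org",   "facebook": 0, "twitter": 0, "linkedin": 0},
]

social_fields = ("facebook", "twitter", "linkedin")
with_social = [row for row in serp if any(row[f] > 0 for f in social_fields)]

for row in with_social:
    print(row["position"], row["url"])   # only the one URL with the single LinkedIn share
```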
That probably was a lot to take in, but a quick summary: monitoring "pseudo local terms" allows us to see when a Google update is REALLY occurring, and in the aftermath you can use various filters to dig deep into the winners and losers to see what makes a difference, what doesn't, and how you can move forward with your own SEO campaign.