More threads by HoosierBuff

Hi all,

I have a new client in the office furniture space. They are a local player. For many searches, Google shows a national set of results ("desk chair," for example), while other searches trigger a local snack pack and more local results.

In doing my research, I want to find their biggest opportunities. The searches that are national in nature aren't much of an opportunity. Sure, a person could add a modifier like "desk chairs near Nashville, TN," but that traffic will be thinner. The searches that automatically turn on the 3-pack are better opportunities (I think).

So, in looking over hundreds of potential keywords, I'm trying to figure out which ones are local and which ones are not. Manually typing them in is time-consuming. Does anyone have a way of doing this easily? Am I thinking about this incorrectly?
 
We track 10,000+ KWs and, with some basic math, create a percent chance that a given query will produce a local box.

I suspect you could do something similar with just about any rank-checking service that will provide you with the raw SERP files. It should be pretty straightforward for a few hundred KWs. The downside is that you have to re-run this every so often, because Google is constantly pushing buttons and pulling levers, but at least it would give you a straight edge (for a period of time).
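To sketch what that "basic math" could look like: given a set of raw SERP files per keyword, flag each file for local-pack markers and compute the fraction of runs that popped a pack. The marker strings below are assumptions, not a stable Google contract; they change often, so verify them against current SERP HTML before relying on them.

```python
# Markers that *may* indicate a local pack in a raw Google SERP file.
# These strings are assumptions and change often -- spot-check them
# against real SERP HTML before trusting the numbers.
LOCAL_PACK_MARKERS = ("More places", "maps.google.com/maps")

def has_local_pack(html: str) -> bool:
    """Heuristic: the SERP counts as having a local pack if any marker appears."""
    return any(marker in html for marker in LOCAL_PACK_MARKERS)

def local_pack_rate(serps_by_keyword: dict) -> dict:
    """Map each keyword to the fraction of its stored SERPs that showed a pack."""
    rates = {}
    for kw, serps in serps_by_keyword.items():
        hits = sum(has_local_pack(s) for s in serps)
        rates[kw] = hits / len(serps) if serps else 0.0
    return rates
```

Run it over several dated crawls per keyword and you get the percent-chance figure described above; keywords near 100% are reliably local-intent, keywords near 0% are national.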

The other option (not paying someone) is to fire up a Python scraper with a bunch of proxies and keep firing away until you have what you need. With a rank-checking service, though, you can likely perform this analysis across multiple user agents more easily.
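A minimal sketch of the DIY route, assuming you supply your own proxy pool: build the search URL for each keyword and round-robin requests across proxies. The proxy addresses here are placeholders, the actual fetch is left out, and you should throttle requests heavily in practice.

```python
import itertools
import urllib.parse

# Placeholder proxy pool -- substitute your own endpoints.
PROXIES = ["http://proxy-a:8080", "http://proxy-b:8080"]

def serp_url(keyword: str) -> str:
    """Build a Google search URL for a keyword."""
    return "https://www.google.com/search?q=" + urllib.parse.quote_plus(keyword)

def assign_proxies(keywords):
    """Pair each keyword with the next proxy in a round-robin rotation."""
    rotation = itertools.cycle(PROXIES)
    return [(kw, next(rotation)) for kw in keywords]
```

Each (keyword, proxy) pair would then be fetched (e.g., with `requests.get(url, proxies={"http": proxy, "https": proxy})`) and the response HTML fed into the local-pack check; rotating user agents per request is the same idea applied to headers.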

There may already be some of this data in the wild, but I'm not aware of it.
 
We track 10,000+ KWs and, with some basic math, create a percent chance that a given query will produce a local box.

So you're using an equation to predict local box/map pack or to predict local intent in the client's area?
 
So you're using an equation to predict local box/map pack or to predict local intent in the client's area?

Both, really. It helps us predict which KWs are more than likely going to pop a local box; we figure those are 'local intent' queries, and we gear the site content at those terms/topics/'ideas' based on a client's service offering.

We don't look specifically at one metro; we look at hundreds of them and classify them based on population.
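The metro classification could be as simple as bucketing by population tier; the thresholds below are illustrative, not the poster's actual cutoffs.

```python
def population_bucket(population: int) -> str:
    """Classify a metro into a coarse size tier (thresholds are illustrative)."""
    if population >= 1_000_000:
        return "large"
    if population >= 250_000:
        return "mid"
    return "small"
```

Tracking local-box rates per tier lets you ask whether a keyword that pops a pack in large metros also does so in small ones.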
 
I'm looking at different ways to do keyword research, so I'm really interested in the methodology. Could you explain how you're doing this in a little more detail? What kind of variables go into the math equation, and how are you tagging the % of traffic as local intent?
 
So just FYI, SEMrush will tell you the % of a domain's keywords that trigger specific SERP features (e.g., Answer Box, Packs), and will also tell you what those keywords are, what their search volumes are, etc.
 
If I remember right, the new keyword tool Moz came out with two months ago (or rather, the update to their old tool) lets you search through your list of keywords and organize them (among other ways) by which ones trigger different SERP features. Sounds like exactly what you're looking for, though I'm sure it's not the only tool that does it.
 
As a note, on long-tail searches, whether a pack is triggered is variable, based on the signals Google is getting from third-party local sites as to what is "important." Thus a phrase that triggers a pack in one area might not in another. I have tested triggering packs and been successful.
 
