I have a client with 700+ gas station locations, and we are discovering a variety of problems on Google Maps:

1. Old gas stations that are still listed as open on Google Maps at the same address as the client.
2. Convenience store partners that are incorrectly categorized as gas stations, creating two businesses with the same category, same phone number, and same address.

Here is an example of a duplicate:
https://maps.app.goo.gl/CitzGtr6ysHLB6MNA
https://maps.app.goo.gl/fbbcQ3pwhsPSVXcUA

I have attempted to get AI tools to help with this effort, but none appear to be able to crawl or access Google Maps, which is where most of the issues are coming from. I'm trying to scale this and look at all locations, but I'm curious if anyone has a recommendation.
 
If I had to solve this problem, I would:
  • build a list of all the gas stations in my client's portfolio in a Google Sheet
  • add the zip code for each location as a column in the sheet
  • work with a rank tracker API (I like DataForSEO for ad hoc problems like this)
  • build a script that performs a search in each zip code for "gas station"
  • store the results in Google BigQuery
  • build a Looker Studio report to help me look at each city and find problems
  • compile a list of problem locations
  • go through normal GBP processes to report the problems
I know this is a pain-in-the-behind amount of engineering, but this is that kind of problem to solve.
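The script step above could be sketched roughly like this. The search function is a stand-in: DataForSEO's real endpoints, auth, and response fields differ, so treat the function name and result shape here as assumptions, not their API.

```python
# Sketch of the "search every zip code" loop. The SERP call is a
# placeholder -- swap in your rank tracker API of choice; the canned
# response shape here is an assumption for illustration only.
import json

def search_gas_stations(zip_code: str) -> list[dict]:
    """Placeholder for a real SERP/maps API call.

    A real implementation would send a '"gas station" near {zip_code}'
    query to the rank tracker API and return the local results. Canned
    data is returned here so the pipeline can be exercised end to end.
    """
    return [
        {"name": "Example Fuel Stop", "address": f"123 Main St, {zip_code}"},
        {"name": "Example C-Store", "address": f"123 Main St, {zip_code}"},
    ]

def crawl_portfolio(zip_codes: list[str]) -> list[dict]:
    """Run the search for every zip code and tag each row with its query."""
    rows = []
    for z in zip_codes:
        for result in search_gas_stations(z):
            rows.append({**result, "query_zip": z})
    return rows

rows = crawl_portfolio(["75001", "30301"])
# Newline-delimited JSON is a format BigQuery can load directly.
ndjson = "\n".join(json.dumps(r) for r in rows)
```

From there, loading the NDJSON into a BigQuery table gets you to the Looker Studio step.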
 
I think that ValueSERP or ScaleSERP (both are Traject companies) might even allow you to upload a CSV of all your locations to perform the searches: https://trajectdata.com/serp/value-serp-api/. They can also export the results straight to BigQuery for you, which would solve nearly all of the engineering problems.

Looking at their docs for the https://docs.trajectdata.com/valueserp/search-api/results/google/places endpoint, I see that they return the Maps data you could use to find problematic locations by comparing the values found in the gps_coordinates.latitude, gps_coordinates.longitude, address, or extensions.[3] fields.
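As a minimal sketch of that comparison: flag any returned listing that sits on top of one of your client's coordinates but has a different name. The `gps_coordinates.latitude`/`longitude` field names follow the docs above; the `title` field and the client-location dict shape are my assumptions.

```python
# Flag SERP listings that share a client location's coordinates but
# carry a different name -- likely stale or duplicate GBPs.
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371000 * asin(sqrt(a))

def flag_problems(client_loc, serp_results, radius_m=50):
    """Return listings within radius_m of the client location whose
    name doesn't match the client's own listing."""
    flagged = []
    for r in serp_results:
        d = haversine_m(client_loc["lat"], client_loc["lng"],
                        r["gps_coordinates"]["latitude"],
                        r["gps_coordinates"]["longitude"])
        if d <= radius_m and r["title"] != client_loc["name"]:
            flagged.append(r)
    return flagged
```

A 50-meter radius is a guess; gas station parcels are large, so you may want to widen it and review the extra hits by hand.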

You could also highlight duplicates by making a table in Looker Studio that contains any ONE of those fields plus a Record Count field (i.e., just two fields in the table), then adding a filter to only show rows where Record Count > 1. That would surface every address that has more than one GBP at it.

I bet if you reached out to their team with your exact problem, they'd help you set up the data pull.
 
