MrAnderson
Member, joined Mar 8, 2024
I've run into a couple of different agencies using the exact same content templates across thousands of clients, where all the services, sub-services, and several other miscellaneous pages are identical. In some cases they change the business name and city in the text; in others the content is exactly the same with only different title tags and meta descriptions. Searching Google for a random paragraph from these sites surfaces a good chunk of their clients; Google shows all the duplicates in the SERPs for that one paragraph, totaling thousands of pages per agency.

Another network I found sells a product nationwide across a number of different sites. They created a page for each major product for almost every city in the US, changing only the city name across thousands of city pages for numerous products, and they mirrored this setup across a handful of sites in the same niche.

The problem is that these hyper-specific service or city pages have continued to rank locally despite being over-the-top duplicates, at a scale where the businesses clearly could not afford unique content. There doesn't seem to be any filtering of duplicate content; instead, it's rewarded.

In the past, I've submitted spam reports to Google with zero effect.

Is there a better path to a manual intervention now that Google is trying to prove a point with their latest update?
 
I don't think you should bother reporting it. The fact is, the majority of content on local business websites is really the same stuff, just reworded. I don't think Google is aiming these updates at local businesses because, generally speaking, their sites are good for users.
 

On a similar note: I try to write original, informative, and helpful content, and I see the type of content the OP describes on competitor websites. Those sites are indexed.

The fact is, the majority of content on local business websites is really the same stuff, just reworded.
Example: on plumbing websites, I got good rankings for pages about a specific town/city and its water supply/quality.
A few years later (last year) I did the same with pages for water heaters. One or several towns that share a water supply high in minerals such as calcium and magnesium will have residents replacing water heaters more frequently than towns whose water supply has fewer minerals. On one website I had about thirteen such pages which, yes, were worded differently, but which also had unique content, including data from each town's official water quality report.

Some got indexed, some didn't.

Can anyone suggest how to get those pages indexed?

TIA
 
Make sure high traffic pages on the site link to them.
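One quick way to check whether important pages are actually reachable through internal links is to compare the URLs in your sitemap against the set of pages your internal links point to. Here's a minimal sketch in Python; the URLs and link map are made-up examples, not real pages, and in practice you'd build the link map from a crawl of your own site:

```python
# Minimal sketch: flag "orphan" pages that are listed in the sitemap but
# receive no internal links. All URLs below are hypothetical examples.

def find_orphans(sitemap_urls, internal_links):
    """internal_links maps each page URL to the set of URLs it links to."""
    linked_to = set()
    for targets in internal_links.values():
        linked_to.update(targets)
    # The homepage is reachable by definition, so exclude it.
    return sorted(u for u in sitemap_urls if u not in linked_to and u != "/")

sitemap = {"/", "/water-heaters/springfield", "/water-heaters/shelbyville"}
links = {
    "/": {"/water-heaters/springfield"},
    "/water-heaters/springfield": {"/"},
}

print(find_orphans(sitemap, links))  # ['/water-heaters/shelbyville']
```

Pages that come back as orphans are good candidates for links from your high-traffic pages; crawlers discover and recrawl them more readily when they're linked from pages that already get crawled often.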

Appreciate the reply. I do have internal links. Maybe I can do better.

Are you suggesting that Google's crawler following links may be a better way to get pages indexed than sitemaps or submitting URLs in GSC?

Thanks for this forum.
 
If that's the case, you could try an indexing tool like IndexMeNow.
 
