Hi all,

[If this question belongs in the multi-location channel or somewhere else, please advise ;-) ]

We're offering a local search solution for various types of attorneys and related financial consultants. The program is layered on top of an existing service that is currently running in about 100 locations across the U.S.

We know from experience that clients almost never write effective, long-form, optimized content themselves, yet they need *something* to match local search intent. On the other hand, creating 25,000 words of unique content for each client would be cost-prohibitive.

The idea is to include a corpus of standard content (blog posts) as part of the package — each client would get the same content, with minor location tweaks. Each post would be professionally written to compete at a national level in terms of length, optimization, keyword clustering, etc.

So here it is: we all know the mantra about duplicate content, but we're wondering whether it's a legitimate concern in this case, since clients would only be competing in their local markets (say, Denver vs. Colorado Springs vs. Fort Collins). In other words, the bar is set lower for local search than it is for national search. We're thinking the benefit of the content to clients in local markets would outweigh any concerns about duplicate content.

Do you agree? Is there any problem including non-unique content in this context? Please let me know. Thank you!
 

Phil Rozek

Local Search Expert
@Michael Charvet, I would say there's no inherent problem in using that kind/amount of boilerplate content, at least in terms of rankings. The fear of a duplicate-content "penalty" seems to be a myth. Sure, it's extremely common that pages full of duplicate content don't rank, but there's always at least one other clear reason for that. To wit:

1. It's competing against other, similar pages on the same site that do rank.

2. It's competing against a similar page on another site, perhaps owned by a competitor who works with the same duplicate-content-producing outfit, and the competitor has an edge in terms of inbound links or longevity in the search results. In other words, the duplicate content doesn't put anyone ahead or behind, and so the rankings come down to other factors.

3. It's so crappy it never gets enough good links to rank for national or international terms.

4. It's basically on-topic, but isn't focused enough on the service(s) or the service/catchment area. As in, it's got 1,500 words on (in the attorney example) notable Supreme Court cases, but no content on the specific types of cases the attorneys handle, where clients come from, which offices/attorneys handle those cases, FAQs, case results, reviews, or anything else that local searchers (and Google) actually care about.

The duplicate nature of it is a red herring, I'd say. What I'd look out for are the other problems (above).

If you use that duplicate content and the rankings improve and your clients get clients, great. Both of those things may very well happen, because that plays out all the time. But if either one doesn't happen, then I'd say it's still fine to use some amount of duplicate content, but this time as a base that you add more-bespoke content onto.
 
