Hi all,
[If this question belongs in the multi-location channel, or somewhere else please advise ;-) ]
We're offering a local search solution for various types of attorneys and related financial consultants. The program is layered on top of an existing service that is currently running in about 100 locations across the U.S.
We know from experience that clients almost never write effective, long-form, optimized content themselves, yet they need *something* to match local search intent. On the other hand, creating 25,000 words of unique content for each client would be cost-prohibitive.
The idea is to include a corpus of standard content (blog posts) as part of the package — each client would get the same content, with minor location tweaks. Each post would be professionally written to compete at a national level in terms of length, optimization, keyword clustering, etc.
So here it is . . . we all know the mantra about duplicate content, but we're wondering whether it's a legitimate concern in this case, since clients would only be competing in their own local markets — say, Denver vs. Colorado Springs vs. Fort Collins. In other words, the bar is set lower for local search than for national search. We're thinking the benefit of the content to clients in local markets would outweigh any concerns about duplicate content.
Do you agree? Is there any problem including non-unique content in this context? Please let me know. Thank you!