Elliot
I'm having trouble deciding how to structure the content on a website for a service-based company with locations in multiple states.

The root domain, we'll call it myservice.com, contains generalized product and service information that is non-location specific.
e.g. myservice.com/services/roto-rooting

My plan is to build out the unique city landing pages with unique localized content ON the landing page (myservice.com/atlanta) BUT duplicate content on the information pages (myservice.com/services/roto-rooting is basically going to be the same as myservice.com/atlanta/services/roto-rooting).

My first question is, should I ditch this method? My goal is to obviously rank the local city in local search results. If this method of siloing is viable, my second question is, what's the best way to write the content? Should I continue to use duplicate content + canonical pointing back to the original page? Or should the service and product pages be unique and spun with localized terms?
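For reference, the canonical approach described here would look something like this on the city-specific service page (URLs are the hypothetical ones from this thread):

```html
<!-- On myservice.com/atlanta/services/roto-rooting -->
<!-- Tells search engines the root service page is the preferred version -->
<link rel="canonical" href="https://myservice.com/services/roto-rooting" />
```

One thing worth weighing: a page canonicalized to another URL is generally consolidated into that target and is unlikely to rank independently, which cuts against the goal of ranking each city page on its own.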

Thanks in advance for any advice!
 
I have been following with interest a website that is very similar to what you are describing. It is a service-based business with a presence across the United States, and, more interestingly, it ranks at the top of search results for most of its cities.

As far as content goes, the website doesn't even link to the various city pages from the homepage. Instead, I presume discovery happens through a sitemap submitted via Google Webmaster Tools (GWT). The content is not different across the various pages either; the only change is the name of the location.
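An XML sitemap for such a site would simply enumerate the city URLs so search engines can discover pages that aren't linked internally; a minimal sketch (hypothetical URLs from this thread):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://myservice.com/atlanta</loc></url>
  <url><loc>https://myservice.com/atlanta/services/roto-rooting</loc></url>
  <!-- one <url> entry per city and service page -->
</urlset>
```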

What works for them is the terrific website authority they have built compared to the local competitors.

So to answer your question, I think content duplication may not be a factor in this case. Or possibly the website I have been monitoring is an outlier thanks to the terrific authority.
 
Thanks for the response!

So in that website's case, are they structuring the content the same way I plan to? My next question is: is the localized page (mydomain.com/atlanta) ranking well in Atlanta because, even though the content is duplicated, it reads as a content-rich page for that local search? Or is that irrelevant next to the domain authority?
 
I would say it is likely due to the domain authority.

A follow-up question would be: How do the other area specific pages rank?

You may find that Google only ranks one of the duplicated pages well while the rest flounder. I would also check Bing to see whether each location page is even indexed; in my experience, Bing is much harsher about duplicated content.
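A quick way to spot-check indexation in both Google and Bing is the site: search operator, e.g.:

```text
site:myservice.com/atlanta
site:myservice.com inurl:roto-rooting
```

The first query shows which pages under the Atlanta path are indexed; the second shows how many copies of the duplicated service page each engine has kept.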

---

If you're worried about duplicate content across a LARGE number of service-area-specific pages on the primary domain (myservice.com), consider a sub-domain approach, using tactical links to pass authority to the sub-domains where they make sense.

atlanta.myservice.com
newyork.myservice.com
orlando.myservice.com

Using sub-domains should help insulate your primary domain from any thin-content penalties Google might levy over duplicated content across your service-area pages, since Google treats each sub-domain as a separate website. For this tactic to be worthwhile, you really need an excessive amount of duplication (or a future risk of it).

The sub-domains may not rank as well as those same pages found in a sub-directory (in the short-term), but you'll have some area specific websites you can link-build for without risking a manual action on the primary domain. #tradeoffs
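On the hosting side, a wildcard setup keeps per-city sub-domains cheap to add; a minimal nginx sketch (server names and paths are hypothetical, not from the thread):

```nginx
# Matches atlanta.myservice.com, newyork.myservice.com, etc.
server {
    listen 80;
    server_name ~^(?<city>[a-z]+)\.myservice\.com$;
    root /var/www/cities/$city;  # one document root per city site
    index index.html;
}
```

A wildcard DNS record (*.myservice.com) pointing at the same server completes the setup, so launching a new city is just a matter of adding its content directory.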

Hope this helps!
 
Not the OP, but that is an interesting suggestion. I worry about a couple of things here, though:

1. Since subdomains are treated as independent websites, the impact of the duplication is likely to be higher (the pages no longer belong to the same website, so they may not get the benefit of the doubt).

2. Each of the various subdomains (and the main domain) now has thin content, instead of the original setup where we had dozens of pages on one website, each catering to a city.

3. Building domain authority is harder, since the backlink profile is now spread thinly across all these subdomains instead of being focused on one main website.
 
There certainly are trade-offs (which you listed); however, I'm not convinced of #2.

With a franchise of 300+ locations, this was the only way our team found to prevent thin-content penalties while still targeting a large number of service areas.

Google seems (to me at least) to be more forgiving of duplicate content across different websites vs on the same domain (we use a large amount of stock content). But I'm willing to listen if you have an alternate experience.

Sorry OP, didn't mean to hijack your thread. :(
 
I would be interested to hear from people who have worked on similar projects. I have not personally worked with local businesses across so many locations, so my only point of reference is one such business that I see consistently ranked for each of the cities it is present in. Its content is pretty much the same across all cities (except for the city name and the phone numbers). I don't mind sharing the link here if that is acceptable.

Google seems (to me at least) to be more forgiving of duplicate content across different websites vs on the same domain (we use a large amount of stock content).

Thanks a lot for sharing your experience, but that sounds counter-intuitive to me. Google is more likely to encounter duplicate content within a domain than across domains. For example, WordPress blogs regularly repeat the same text across a blog post, its tag page, its category page, and the archives, and this does not trigger duplicate-content penalties. However, if the same content appears across several domains, it is quite logical to conclude it was plagiarized from one main source.
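For what it's worth, the usual way WordPress sites avoid that within-domain duplication is to keep the tag, category, and date archives out of the index with a robots meta tag on those templates (most SEO plugins do this automatically):

```html
<!-- Output on tag/category/date archive pages only -->
<meta name="robots" content="noindex,follow">
```

The "follow" directive keeps crawlers passing link equity through the archives even though the archive pages themselves are excluded from the index.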
 
