Hi all,

I'd love to get your thoughts on Joy's article from last week:

In the article, boilerplate service-area pages were deleted and redirected to the home page, and soon after the deletion, rankings improved.

There is some sense to this, and you hear case studies where a client removed a bunch of boilerplate or other pages and rankings increased... but:

What is the mechanism that caused this? I would think that if you had a bunch of bad pages, Google would simply ignore them, and the damage would only come from wasting link equity on those pages. I could understand it if the boilerplate pages were linked across the entire site and diluted the PageRank, but that does not seem to be the case here (Joy said the pages were not heavily linked). It seems like diluting link equity played no role in this.

The way this case study seems to work is that, by removing the bad pages, Google evaluated the domain more favorably overall and ranked everything higher.
 
I would think that if you had a bunch of bad pages, Google would simply ignore them, and the damage would only come from wasting link equity on those pages.
It's that, plus possibly wasting internal linking opportunities on pages that are unlikely to rank, plus spreading out relevant content over many pages when you could instead consolidate all the good stuff and put it to work on one page or a few pages.
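
To make the dilution point concrete, here's a minimal sketch: a toy PageRank-style power iteration over two hypothetical internal-link structures - a "colander" where the home page links out to one money page plus eight thin city pages, and a "funnel" where the thin pages are gone. The site graphs and damping factor are illustrative assumptions, not anyone's actual site:

```python
# Toy PageRank power iteration (pure Python, no dependencies).
# links maps each page to the list of pages it links to.

def pagerank(links, iterations=50, d=0.85):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - d) / n for p in pages}
        for p, outs in links.items():
            if not outs:  # dangling page: spread its rank evenly
                for q in pages:
                    new[q] += d * rank[p] / n
            else:
                share = d * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
        rank = new
    return rank

# "Colander": home spreads links across the money page and 8 thin city pages.
colander = {"home": ["money"] + [f"city{i}" for i in range(8)], "money": ["home"]}
colander.update({f"city{i}": ["home"] for i in range(8)})

# "Funnel": thin pages removed; home links only to the money page.
funnel = {"home": ["money"], "money": ["home"]}

print("colander money page:", round(pagerank(colander)["money"], 3))  # ~0.06
print("funnel money page:  ", round(pagerank(funnel)["money"], 3))    # 0.5
```

Ranks sum to 1 within each graph, so the comparison is really about the money page's share of the site's total equity - a much bigger slice once the thin pages stop soaking it up.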

It's a case-by-case question, but if none of the above happened in this case, I would guess the increase was due to something else. I've never found that the act of content pruning alone produces a bump; there's usually more going on, and it's those other actions that seem to produce the bump.
 
You're both spot on - the change was a bit like our old "SEO link colander vs. SEO link funnel" analogy that we use when we talk about link equity on a website.

A lot of internal links went to A LOT of poor-quality, low-visibility pages (a colander with lots of holes). We harnessed that link equity and pointed it at ONE Social Security disability insurance (SSDI) lawyer page (a funnel with one hole).

We might have been able to spend a bunch of time improving those pages - but in all honesty, most of them targeted areas outside a reasonable service area, and it wasn't worth the effort to get pages ranking in tiny towns hours away from the client's brick and mortar.

When we removed the poor-quality pages, changed all of our internal links to point to the one SSDI page, and removed the multitude of links that went to the poor pages (plugged up the colander), that was pretty much the only change that correlated with the beginning of the rise in rankings.
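
If you're replicating this kind of cleanup, it's worth spot-checking that the removed pages actually 301 where you intend. A small Python sketch using the requests library - the URLs are hypothetical placeholders, not the client's real pages:

```python
# Spot-check that removed pages 301-redirect to the expected target.
# URLs below are hypothetical placeholders.
import requests

redirects = {
    "https://example.com/ssdi-lawyer-smalltown/": "https://example.com/",
    "https://example.com/ssdi-lawyer-tinyville/": "https://example.com/",
}

for old_url, expected in redirects.items():
    resp = requests.get(old_url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    ok = resp.status_code == 301 and location.rstrip("/") == expected.rstrip("/")
    print(f"{'OK ' if ok else 'FAIL'} {resp.status_code} {old_url} -> {location}")
```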

Hope this helps explain the strategy more!
 
Thanks for the additional insight.

I have a client right now that created boilerplate specialty pages for 9 cities, with 5 specialty pages per city (mold removal Elmhurst, mold removal Aurora, mold testing Elmhurst, mold testing Aurora, etc.).

The pages are all exact copies with only the title changed. I'm of a mind to delete them, though they really aren't linked to that much, so I don't see that we would get quite the same gain.
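
Before deleting, it can help to quantify just how duplicate those pages really are. A quick sketch using Python's difflib - the URLs are hypothetical and the tag-stripping is deliberately crude, but ratios near 1.0 mean the body copy is essentially identical:

```python
# Pairwise similarity of suspected duplicate pages (hypothetical URLs).
# Tag-stripping via regex is crude but fine for a rough comparison.
import re
from difflib import SequenceMatcher
from itertools import combinations

import requests

urls = [
    "https://example.com/mold-removal-elmhurst/",
    "https://example.com/mold-removal-aurora/",
    "https://example.com/mold-testing-elmhurst/",
]

def visible_text(url):
    html = requests.get(url, timeout=10).text
    return re.sub(r"\s+", " ", re.sub(r"<[^>]+>", " ", html)).strip()

texts = {url: visible_text(url) for url in urls}
for a, b in combinations(urls, 2):
    ratio = SequenceMatcher(None, texts[a], texts[b]).ratio()
    print(f"{ratio:.2f}  {a}  vs  {b}")
```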
 
So - this tactic CAN work - @Colan Nielsen spoke about it at our last LocalU event. Duplicate pages are not necessarily a deal-breaker - BUT - they still need the normal marketing love we give pages. Adding a few at a time can also help - that way there aren't 45 new pages for Google to see all at once (which, I'd think, makes the duplicate content more visible to them). Each page needs:

  • Internal links
  • External links
If they don't get any traffic, then start tweaking the content:
  • Add some FAQ (see the sketch after this list)
  • Add an <ol> or <ul>
  • Make the ones that are struggling more unique
I'm not a big fan of this type of content creation - but it CAN work with lower-competition keywords.
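
On the "add some FAQ" suggestion - if you also want the FAQ marked up as structured data, here's a minimal Python sketch that emits schema.org FAQPage JSON-LD. The questions and answers are placeholders to swap for real, city-specific ones:

```python
# Emit FAQPage JSON-LD for an FAQ section (placeholder Q&A - swap in real,
# city-specific questions before publishing).
import json

faqs = [
    ("How long does mold removal take?",
     "Most residential jobs take one to three days, depending on the affected area."),
    ("Do I need mold testing before removal?",
     "Testing identifies the species and extent of growth so the remediation plan fits the problem."),
]

schema = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": question,
            "acceptedAnswer": {"@type": "Answer", "text": answer},
        }
        for question, answer in faqs
    ],
}

print('<script type="application/ld+json">')
print(json.dumps(schema, indent=2))
print("</script>")
```

Paste the emitted block into the page alongside the visible FAQ copy.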
 
This is very interesting to bring up, because I may have gotten the wrong takeaway from the LocalU event. The information presented made me think that duplicate content was not being scored as negatively as before, that opening a site up to (for example) a multitude of similar Service Area pages would potentially not be as detrimental as before, and that you might actually see increases by doing this.

However, Joy's report seemed to say the exact opposite. It seems like the age-old process will probably stay true: keep adding until you see metrics plateau or decrease, and then you'll know you've hit your limit.

Long story short: is anyone taking this advice to mean "duplicate content Service Area pages / Service pages are back on the menu for success"?
 
I think the difference is in how the content was added and treated. Duplicate-ish content can work up to a point - but you have to treat it well: link it, promote it, test it and make sure it converts, optimize it, track it, get it traffic, watch its progress, and tweak it if it doesn't start ranking.

If you plaster up 50 identical pages and ONLY add in some half-baked internal linking, it's very unlikely to be successful.
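
On the "track it and tweak what isn't working" step, here's a sketch of one way to flag the laggards: read a Search Console pages export and list pages that aren't earning clicks or impressions. The CSV column names ("Top pages", "Clicks", "Impressions") are assumed from a typical export - adjust to match your file:

```python
# Flag pages from a Search Console pages export that get little to no traffic.
# Column names are assumed from a typical export - adjust to your file.
import csv

CLICK_FLOOR = 1
IMPRESSION_FLOOR = 10

with open("search_console_pages.csv", newline="") as f:
    for row in csv.DictReader(f):
        clicks = int(row["Clicks"].replace(",", ""))
        impressions = int(row["Impressions"].replace(",", ""))
        if clicks < CLICK_FLOOR and impressions < IMPRESSION_FLOOR:
            print(f"Needs work: {row['Top pages']} "
                  f"({clicks} clicks, {impressions} impressions)")
```

Pages that surface here are the candidates for the FAQ additions and uniqueness tweaks mentioned above.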
 
