Google's Matt Cutts: 25-30% Of The Web's Content Is Duplicate Content & That's Okay
Search Engine Land
December 16, 2013

Matt [Cutts] said that somewhere between 25% and 30% of the content on the web is duplicative. Of all the web pages and content across the internet, over one-quarter of it is repetitive or duplicative.

But Cutts says you don't have to worry about it. Google doesn't treat duplicate content as spam. It is true that Google only wants to show one of those pages in their search results, which may feel like a penalty if your content is not chosen, but it is not.

Google takes all the duplicates and groups them into a cluster. Then Google will show the best of the results in that cluster.
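Purely as an illustration of that idea, here is a toy Python sketch: group pages by a normalized-content fingerprint and show only the highest-scoring page from each group. The hash-based grouping and the quality scores are my own stand-ins, not Google's actual algorithm.

[CODE]
# Toy sketch: group duplicate pages into clusters and surface only the
# "best" one per cluster. The content hash and quality scores are
# made-up stand-ins, not how Google actually does it.
from collections import defaultdict
import hashlib

pages = [
    {"url": "https://site-a.example/tomatoes", "content": "How to grow tomatoes at home.", "quality": 0.9},
    {"url": "https://site-b.example/tomatoes", "content": "How to grow   tomatoes at home.", "quality": 0.4},
    {"url": "https://site-c.example/roses", "content": "Watering schedules for roses.", "quality": 0.7},
]

def fingerprint(text):
    """Normalize case and whitespace, then hash, so exact copies collide."""
    normalized = " ".join(text.lower().split())
    return hashlib.sha1(normalized.encode("utf-8")).hexdigest()

clusters = defaultdict(list)
for page in pages:
    clusters[fingerprint(page["content"])].append(page)

# For each duplicate cluster, show only the highest-quality page.
for dupes in clusters.values():
    best = max(dupes, key=lambda p: p["quality"])
    print(best["url"])
[/CODE]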

Matt Cutts did say that Google reserves the right to penalize a site that is excessively duplicating content in a manipulative manner. But overall, duplicate content is normal and not spam.

[video=youtube;mQZY7EmjbMA]http://www.youtube.com/watch?feature=player_embedded&v=mQZY7EmjbMA[/video]
 
Matt Cutts may say that, but Google doesn't always do what it says :(

In a niche we are involved with, there is a page ranking that is just a word-for-word copy of a Wikipedia article, except that it has four comma-separated keyword phrases added to the end of each paragraph.
 
???

How does that contradict what Cutts says in the video?
 
Hi DJB

Didn't he say that if it is manipulative or not adding value, it won't rank...
 
From the quote above:

Matt Cutts did say that Google reserves the right to penalize a site that is excessively duplicating content in a manipulative manner. But overall, duplicate content is normal and not spam.

In my experience, what that means is that if you are scraping entire websites, or even duplicating your own content across different exact match domains (EMDs) as a way of trying to gain an SEO advantage, sooner or later they will intervene, probably with a manual penalty of some sort. But in general, they expect and tolerate duplication: they'll just select and rank the best match for a specific search and ignore the others.

That's not to say that some sites don't get away with those manipulations for a while, though. I've seen that repeatedly over the years. But when such a site does get slapped, it gets slapped hard.

In your example:

In a niche we are involved with, there is a page ranking that is just a word-for-word copy of a Wikipedia article, except that it has four comma-separated keyword phrases added to the end of each paragraph.

I would wonder how long that site has been up and how long it has been ranking well with that strategy. It may be that they've just managed to stay below the radar so far. It's also possible that, if that's only one page on the site, there is sufficient merit and authority in the rest of the site to pull it up.
 
Duplicate content does not rank well. Many times we have taken content from a canned client website, rewritten it, and it has ranked very well.

The only piece of duplicate content that will rank well is the original. I would imagine they determine which piece is the original by date, but also through authoritative links.
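If you wanted to mimic that guess in code, it might look something like this toy Python snippet, where the "original" is simply the duplicate with the earliest publish date, with referring domains as the tie-breaker. The fields and the weighting are assumptions on my part, not anything Google has confirmed.

[CODE]
# Toy sketch: pick the "original" among duplicates by earliest publish
# date, breaking ties with the number of referring domains. Purely a
# guess at the heuristic described above, not Google's real signals.
from datetime import date

duplicates = [
    {"url": "https://original.example/guide", "published": date(2012, 3, 1), "referring_domains": 40},
    {"url": "https://scraper.example/guide", "published": date(2013, 6, 5), "referring_domains": 2},
]

def originality_key(page):
    # Earlier date wins; more referring domains wins ties.
    return (page["published"], -page["referring_domains"])

original = min(duplicates, key=originality_key)
print(original["url"])  # -> https://original.example/guide
[/CODE]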

Duplicate content is an issue. If you have it, that page won't rank well.
 