Dan Foland

SEO Director at Postali
If you aren't aware yet, Google has confirmed a bug in its systems that has been de-indexing pages.

John Mueller confirmed the bug in a recent tweet and said that it had been fixed. However, people are still experiencing the issue. Just an hour ago (15 hours after Mueller said the bug was fixed), I noticed that one of my client's homepages was missing from the index, even though Google Search Console said it was indexed. Fortunately, using the URL Inspection tool in Google Search Console, I was able to request indexing, which worked and shot the page back up into the top 5.

My suggestion: Google each of your priority pages, and if a page doesn't show up where you'd expect, manually request that Google re-index it.
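If you have more than a handful of priority pages, the check above can be semi-automated. This is only a sketch, not anything from Google: it assumes you keep a list of your priority URLs and a list of URLs Google is currently returning for your site (e.g., collected from manual site: searches or exported from a rank tracker). `find_missing` is a hypothetical helper for flagging which pages to resubmit via the URL Inspection tool.

```python
# Sketch: flag priority URLs that appear to be missing from Google's index,
# so you know which ones to resubmit in Search Console.
# Assumes you supply both lists yourself; this does not query Google.

def normalize(url: str) -> str:
    """Strip scheme and trailing slash so comparisons aren't brittle."""
    url = url.strip().lower()
    url = url.replace("https://", "").replace("http://", "")
    return url.rstrip("/")

def find_missing(priority_urls, indexed_urls):
    """Return the priority URLs that are absent from the indexed set."""
    indexed = {normalize(u) for u in indexed_urls}
    return [u for u in priority_urls if normalize(u) not in indexed]

if __name__ == "__main__":
    priority = ["https://example.com/", "https://example.com/services"]
    seen_in_google = ["example.com"]
    print(find_missing(priority, seen_in_google))
```

Anything this flags still needs a manual check in Search Console before you resubmit, since a page can rank poorly without being de-indexed.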
 

Despite Google saying on Saturday morning that it was fully resolved, it was not.

John Mueller from Google chimed back in on Sunday morning after many webmasters still didn’t see improvements and said on Twitter “Just a short update on this — indeed, it does look like there are still some pages that need to be reprocessed. Our systems are making good progress here, but it’s taking longer than I initially expected.”
 
Here's some speculation that I have heard: it was not a bug, but an update that did not go according to plan. Google was testing an algorithmic way to de-index spammy sites, and it was too aggressive.

However, the counterpoint is: if that were true, why would it be so easy to get a site back into the index through Search Console?

But then again, most spam & blackhat sites don't connect to Search Console because they don't want Google tracking them.
 
Literally ran into this today. A new client emailed me about a page of theirs getting removed, and I didn't even realize this bug was still ongoing. Resubmitting the page in Search Console instantly fixed it.
 
Now that is slightly concerning. But can't say it's not possible.

Where did you hear that?
 
Blackhat chatter, nothing concrete. I think some were getting concerned and were looking at all possibilities.

Pretty interesting and intelligent observation, IMO. Not sure it's true, but it's creative nonetheless.
 
@Yan Gilbert

I saw this same theory from a blackhat SEO (very possibly the same one).

I don't see any evidence this is true. I had a few sites that had pages de-indexed, and none of them did anything remotely spammy. Additionally, high-authority websites such as Wikipedia, Amazon, and Facebook experienced the issue too.

Surprisingly, I can't find any analysis where people identified a pattern in the pages that were de-indexed. There has to be some reason these specific pages were affected. This Moz post is the best analysis I've found, though it still doesn't explain why it happened.
 
