Rich Owings
Okay, this has happened to me twice in two days, so I want to see if anyone has any insights to share.

I go into Google Search Console and see a bunch of obviously hacked URLs (pharma-related) showing up as 404s. But there is no malware warning, and a Sucuri scan shows no issues.

I asked the first client about it yesterday. They told me that yes, the site had been hacked, but it had since been cleaned up and they had even moved to a new host. Yet Google is still coming across these URLs in recent crawls, even though they 404 and have been removed from the site.

Any idea why? And should I be concerned?
 
I'm dealing with that now. Google should eventually drop those from its index. You definitely don't want to 301 redirect them, so let them be.

But do check the "sites linking to you" report because there you may find a bunch of spam sites with backlinks to yours. Submit a disavow for those.
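
For reference, the disavow file Google takes is just a plain text list, one entry per line. Something like this (the domains here are made up, obviously):

# spam domains found in the "sites linking to you" report
domain:spam-pharma-example.com
domain:more-link-spam-example.net
https://another-spam-example.org/one-bad-page.html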
 
I had the same issue with a WordPress website. What I did was clean up all the files that had malware code injections: I reviewed each file and changed the file permissions to 644. I also saw some files called emad.html that had been added to some of the folders. Once I removed all the malware, I added security software to scan the site daily for any issues moving forward. That fixed it, and I did not see any more spam 404 URLs in Webmaster Tools. I did not disavow the URLs I saw there, though. I would be curious to see what happens if you decide to do that.
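
For anyone doing a similar sweep by hand, this is roughly what I mean, sketched in Python. The docroot path and the injection patterns are only examples, not a complete signature list:

import os
import re

DOCROOT = "/var/www/html"  # example path; point this at your WordPress install
# A couple of injection patterns often seen in hacked PHP files (examples only)
SUSPICIOUS = re.compile(rb"eval\s*\(\s*base64_decode|gzinflate\s*\(\s*base64_decode")

for root, dirs, files in os.walk(DOCROOT):
    for name in files:
        path = os.path.join(root, name)
        if name.endswith(".php"):
            with open(path, "rb") as fh:
                if SUSPICIOUS.search(fh.read()):
                    print("suspicious:", path)
        os.chmod(path, 0o644)  # normalize file permissions to 644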
 
Thanks for taking the time to share your thoughts. I looked through GSC and didn't find any links from spam sites. I'm waiting to hear back from the webmaster to see if they can uncover anything and will update here if I learn more.
 
We had an issue with that.

I would just mark them as fixed. It takes time but they will be removed eventually.
 
We are dealing with this issue right now too on a few websites: no malware warning, Sucuri shows no issues, and our programmers cannot find anything. On one website we have been battling this issue for well over a year; we mark the errors as fixed in GSC and they just come back. We do disavow all spammy links to the site.

Although Google states that 404 errors do not affect ranking, we have seen on at least three websites that once the issue starts, organic search traffic plummets. Over time it bounces back, but the GSC 404 errors don't go away, even after marking them as fixed, resubmitting sitemaps to Google, etc.

Any further insight into this issue? My team has done a lot of research without any clear solution.
 
We have that issue with one client in particular. They came to us with over 6,000 not-found problems in GSC, almost all 404s, and a lot of them were for non-existent URLs on the client's site with pretty awful porn terms. That makes me think they were hacked at some time long ago, but we're not certain about that. Anyway, that sort of thing accounts for more than 90% of the problems.

In a couple of other forums, we received advice to return 410s for the porn-term 404s. I created a few regex-based rules to do this and then marked the 6,000 or so as "fixed" in GSC. Now, months later, we're back up to 3,600 or so not-founds in GSC again, some as 404s and some as 410s.

Did we do the best thing, following that advice?

I would just disavow the URLs that are sending us those porn-term links, but it's not feasible to extract that many of them manually, since each not-found item is often linked to from more than one URL.

Does anyone know of a tool that will automate the extraction of those linking URLs so that we can build a disavow list from them?
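
To show what I mean, here is a rough sketch of the kind of script I'm picturing (it assumes the GSC links export is a CSV with the linking page URL in the first column; I haven't checked that against the current export format):

import csv
from urllib.parse import urlparse

domains = set()
with open("links.csv", newline="") as f:   # hypothetical name for the GSC export
    reader = csv.reader(f)
    next(reader, None)                     # skip the header row
    for row in reader:
        if row and row[0].startswith("http"):
            domains.add(urlparse(row[0]).netloc)

with open("disavow.txt", "w") as out:
    for domain in sorted(domains):
        out.write("domain:" + domain + "\n")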

Thanks!
 
Hey Rich,

The proper way to deal with pages that no longer exist, or pages you don't want in the index any longer, is to use a 410 response, not a 301 or 302.

I recommend the WordPress plugin called "410 for WordPress" to do this simply.

Doing so will remove those pages from the index in days instead of waiting for Google to figure it out.
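
If you'd rather not add another plugin, a couple of lines of mod_rewrite can do the same job. This is only a sketch, assuming Apache and that the hacked URLs share a recognizable pattern; the /pharmacy/ prefix below is just an example:

RewriteEngine On
# Answer 410 Gone for anything under the example /pharmacy/ path
RewriteRule ^pharmacy/ - [G]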

I had two new clients in the past 90 days that were hacked and had between 1,500 and 3,500 hacked pages created.

We deleted the pages, added 410s for all of them, and marked them as fixed in GSC. Then we waited a couple of days for them to drop out of the index and a couple of weeks for the manual penalty from the hack to be removed.

I usually get a message in GSC from Google notifying me that manual actions for the hack have been removed within 14 days of using 410s and adding Wordfence for security.

 
...I recommend the WordPress plugin called "410 for WordPress" to do this simply...

Wow, thanks, Cody. I did not know about that plugin. If it works as advertised, it should be a great help for us on a couple of our sites.

This is a great find. :D
 
Great stuff.

I'll report back now on what we have going on with our own problem. And please tell me what you think of my solution.

This client had a Joomla site using the K2 extension, through which hackers got in and placed pages with backlinks to their money pages. They hacked thousands of sites like this, maybe tens of thousands, and then created a link wheel: site B points to our client's site, which then points to their money site, and so on, to put it simply.

The result is my client has over 13,000 BAD backlinks pointing at her.

Her hacked pages are now long gone, but the links still remain on other hacked sites and in robot-generated blog and forum comments.

Now, to make it worse, most of those bad backlinks point to dynamic variations of her home page URL with a userid that keeps changing in the query string. That means new URLs are generated each time; they don't just point to the same one over and over. And the creation of these backlinks is on autopilot: hundreds of new backlinks are auto-generated every month. The way it looks, I may have to disavow forever.

Digest that a moment and you'll get a big YUCCH.

K2 no longer exists on the new WordPress site, so all those links were either bouncing or redirecting "traffic" to the home page, which, of course, was not generating 404s. As a result, the home page shows over 13,000 backlinks pointing at it even though the URLs all look different (userid=1, userid=2, etc.). And each time they create new links, they create new userids! So it is a never-ending story.

I have a redirect now that takes all the variations of that URL and points them at a 404, like this:

# Match any request whose query string starts with option= (the old Joomla/K2 URL pattern)
RewriteCond %{QUERY_STRING} ^option=(.*)$
# 301 it to /404; the trailing ? drops the original query string
RewriteRule ^(.*)$ /404? [R=301,L]

And I purposely made that /404 page non-existent, so it generates a server 404 to tell Google that all those URLs, and those to come, don't exist.

My hope is that Google will discount the backlinks because they are 404s, AND that the 404s will bounce a message back to the hackers to REMOVE her from their wheel of death, assuming they are paying attention with some alert script.

Currently there is only ONE 404 in Webmaster Tools: my /404 page.

I'd love to get other expert opinions if you have them.

Rich, if you don't have spammy backlinks, thank your lucky stars!
 
I should add that when I started, there were about 400 pages in the index. There are now 199, but there should only be about 25. The rest are the non-existent hacked pages.
 
Kathy

I haven't developed or re-optimized a Joomla site in years, but I would give the same advice about using a 410 response for all the pages created by the hack.

Hopefully you still have a CSV or Excel sheet with those bad URLs. They will be gone from GSC if you marked them as fixed.

I'm sure there is a plugin for 410s.

Same goes for security plugins. I have used Sucuri and Wordfence. I like that Wordfence doesn't send upsell emails, so it is my preference. I'm sure Joomla must have an acceptable alternative.

It may go without saying, but a site:example.com search works well to see which pages Google still has in the index.

Regarding Backlinks & Disavow Tool

I highly recommend following Marie Haynes' advice: moz.com/blog/guide-to-googles-disavow-tool

 
Thanks, Cody. I don't think the plugin will help me, because there is a never-ending stream of new home page URL variations, with 13,000 to start with. I don't want to enter all of those by hand.

Also, I can't 301 them individually for the same reason. Otherwise, I'd have a list of 301s a mile long, and it would keep growing maybe for eternity.

As for the disavow recommendation, the same thing applies: I will be disavowing for eternity, or for as long as the hackers keep auto-generating these unique URLs. That is not a good solution, but I am doing it. I just want it to end, and I'm hoping my strategy will do that.

But perhaps I need to return a 410 instead of a 301 in my redirect script. How, though? Here's what's there now:

RewriteCond %{QUERY_STRING} ^option=(.*)$
RewriteRule ^(.*)$ /404? [R=301,L]

I don't think I can simply make all of those 410s, because they all land on the home page. Wouldn't that make the home page a 410? Or are the server and Google smarter than that?

Or should I redirect them first so they don't point to the home page, and then 410 them? If so, what code should I use for that?

Do you see my logic? I'm attempting to redirect them all to one page, in this case the /404 page. I can make that anything; it can be /badpage. Then if I 410 that page, doesn't that de-index everything that was redirecting to it?


Thanks again.
 
Any further insight into this issue? My team has done a lot of research without any clear solution.

I'd prefer to get to the bottom of the issue, so if the client is okay with it, I'd sign them up with Sucuri and let those guys figure it out.
 
Direct message me, Kathy, and I'll give you my number if you'd like to discuss in further detail.

 
I'd prefer to get to the bottom of the issue, so if the client is okay with it, I'd sign them up with Sucuri and let those guys figure it out.

We have worked with Sucuri on multiple sites with this issue, and SiteLock on one; neither company could find anything wrong in the websites. However, the 404s in GSC are still rising. It's crazy.
 
Sucuri wouldn't find anything wrong; you've already cleaned it up! What you are left with is the same thing we were: automated backlink generation pointing at pages that don't exist. If you can, do what we did. We had thousands of 404s; now there is only one.

You can see what I did above. Cody recommends using a 410, but I just left it the way I had it because it seems to be accomplishing what I want just fine.
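
If I ever do switch, I believe the change would just be having mod_rewrite answer 410 directly instead of redirecting. Something like this (untested on her site), using the [G] flag, so the home page itself, with no option= query string, is untouched:

RewriteCond %{QUERY_STRING} ^option=(.*)$
# Return 410 Gone for any URL carrying the old Joomla option= query string
RewriteRule ^(.*)$ - [G]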


In addition, the number of bad backlinks pointing to her site has dropped by 2.7K, and I hope that will continue to go down as their machine figures out that those links just bounce.

She is also on page 1 of the SERPs for 50 important keywords where she was nowhere to be found before.
 
