Hey guys, a local plumber here has a site that ranked very well for the past five years, until last year when his stats started slowly dropping each month. We were baffled by this because we were still consistently publishing one blog post and doing backlinking every month, without missing a single month.
I only recently noticed that it was his blog pages that had been failing to index all this time, so that's my fault for not spotting it sooner. GSC shows 59 blog pages not indexed, so I reached out to a WP coder who works on getting pages indexed. Here's what he said when I asked why the pages were no longer being indexed:
The issue is crawling, and also crawl budget. Crawl budget is the number of pages Googlebot crawls and indexes on a website within a given timeframe. So if your number of pages exceeds your site's crawl budget, you're going to have pages on your site that aren't indexed, and they won't get indexed until the crawl budget resets.
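For what it's worth, before accepting a crawl-budget explanation it can pay to rule out simpler causes first: blog URLs missing from the XML sitemap, or pages carrying a noindex meta tag. Here's a minimal Python sketch of both checks (the example.com URLs and sitemap below are hypothetical, not the plumber's actual site):

```python
import re
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def urls_in_sitemap(sitemap_xml):
    """Return the set of <loc> URLs listed in a sitemap document."""
    root = ET.fromstring(sitemap_xml)
    return {loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")}

def has_noindex(html):
    """True if the page HTML carries a robots noindex meta tag."""
    pattern = r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex'
    return re.search(pattern, html, re.IGNORECASE) is not None

# Hypothetical sitemap and blog URLs -- swap in the real site's data.
sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/blog/fix-leaky-tap/</loc></url>
</urlset>"""

blog_urls = ["https://example.com/blog/fix-leaky-tap/",
             "https://example.com/blog/unblock-drain/"]
missing = [u for u in blog_urls if u not in urls_in_sitemap(sitemap)]
print(missing)  # the second URL is absent from the sitemap
```

If any of the 59 URLs show up as missing from the sitemap or as noindexed, that would explain the GSC report without any crawl-budget theory at all.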
So I then asked what needs to be done to reset the crawl budget, and whether that would solve the problem so there are no more indexing issues. His response:
Nothing... we can't do anything about it. They pile up over time; we can try to index them. Most of these URLs have not been crawled properly, so I will re-crawl them, which should result in indexing. My work only affects current pages, so there won't be any fix for future posts or pages. But if you have a lot of URLs already indexed, those will help new ones get indexed, and the crawler won't have to worry about the old URLs.
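His last point, that already-indexed URLs help new ones get indexed, usually comes down to internal linking: Googlebot discovers new posts through links from pages it already crawls. A rough sketch of pulling the internal links out of a page's HTML (regex-based with hypothetical URLs; fine for a quick audit, not a substitute for a real HTML parser):

```python
import re
from urllib.parse import urljoin, urlparse

def internal_links(html, base_url):
    """Collect same-host link targets from anchor tags (regex sketch)."""
    host = urlparse(base_url).netloc
    found = set()
    for href in re.findall(r'<a[^>]+href=["\']([^"\']+)["\']', html,
                           re.IGNORECASE):
        absolute = urljoin(base_url, href)
        if urlparse(absolute).netloc == host:
            found.add(absolute)
    return found

# Hypothetical homepage HTML -- the idea is that an already-indexed
# page linking to a new blog post gives Googlebot a path to find it.
homepage = ('<a href="/blog/unblock-drain/">New post</a> '
            '<a href="https://twitter.com/plumber">Twitter</a>')
print(internal_links(homepage, "https://example.com/"))
```

If the new blog posts only ever appear in a blog archive page and nowhere else, adding links to them from strong, already-indexed pages is a concrete step that doesn't depend on any "crawl budget reset".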
So guys, any thoughts on all of this please? Is he 100% accurate in everything he has said, that nothing can be done to make sure all future pages will be indexed?