5 Reasons Why Google Is Not Indexing Your Pages and How to Fix Them
Have you noticed a decline in the number of indexed pages of your website reported by Google? In this article, we'll look at why your indexed page count can decrease and how you can tackle it.
It is important to get your pages indexed, as most website traffic comes from search engines, particularly Google, which handles more than 90 percent of search traffic.
Indexing helps a website or page actually appear in search engine results, which is typically the first step to ranking and generating traffic.
How to check how many pages are indexed?
- enter site:yourdomain.com in Google search; the reported result count is a rough estimate of how many of your pages are indexed
- check the status of your XML sitemap submission on Google Search Console
Now let’s discuss and analyze how to diagnose the issue and look at possible solutions to fix it.
If your indexed pages count starts decreasing, the following could be the reason:
- Google can’t crawl your pages
- Google finds your pages irrelevant
- Google may have penalized you for one or more issues
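The first reason, Google being unable to crawl your pages, is often caused by an overly restrictive robots.txt. As a sketch, you can check crawl permissions offline with Python's standard-library robotparser (the domain, paths, and robots.txt content below are hypothetical):

```python
import urllib.robotparser

# Hypothetical robots.txt content; in practice, fetch it from
# https://yourdomain.com/robots.txt
robots_txt = """\
User-agent: *
Disallow: /private/

User-agent: Googlebot
Disallow: /drafts/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# When a specific Googlebot group exists, it overrides the * group for Googlebot.
print(rp.can_fetch("Googlebot", "https://yourdomain.com/blog/post"))   # True
print(rp.can_fetch("Googlebot", "https://yourdomain.com/drafts/new"))  # False
```

If a page you want indexed comes back False here, fixing the robots.txt rule is the first step.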
A few tips on how to diagnose and fix the problem:
- Page load:
Check whether your pages load properly with a 200 HTTP status.
Did the server experience long or frequent downtime?
Did the domain recently expire or get renewed late?
Action item: Use a free HTTP header status checking tool to verify the response. The correct status for an indexable page is 200.
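If you prefer to check the status yourself, a minimal sketch using only the Python standard library (the function name and example URL are our own, not from any particular tool):

```python
import urllib.request
import urllib.error

def check_status(url, timeout=10):
    """Return the HTTP status code for a URL (follows redirects).

    A HEAD request is enough: we only need the status line, not the body.
    """
    req = urllib.request.Request(
        url, method="HEAD", headers={"User-Agent": "status-checker/1.0"}
    )
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        # 4xx/5xx responses raise HTTPError; the code is still the status.
        return e.code

# Example (placeholder domain):
# check_status("https://www.example.com/")  # 200 means the page is reachable
```

A 200 means the page is reachable; anything in the 4xx or 5xx range needs fixing before Google can index it.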
- Duplicate content:
Duplicate content is content that appears in more than one location on the web. It wastes crawl budget, which can eventually lead to de-indexing of pages.
Check for duplicate content and verify your canonical tags, 301 redirects, and noindex tags. A wrong tag can get a page de-indexed.
Action item: All the above-mentioned tags are important; double-check that every one of them has been implemented properly.
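For reference, these are the two tags to look for in a page's <head>; the URL is a placeholder:

```html
<!-- Canonical: tells Google which URL is the authoritative version of this
     content. A canonical pointing at the wrong URL can drop this page from
     the index in favor of that URL. -->
<link rel="canonical" href="https://www.example.com/preferred-page/">

<!-- noindex: explicitly asks search engines NOT to index this page.
     Make sure it is absent from pages you want indexed. -->
<meta name="robots" content="noindex">
```

A stray noindex or a canonical tag copied onto the wrong template is one of the most common causes of a sudden drop in indexed pages.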
- URL change:
Have you recently altered or modified your page URLs? Search engines may only know the old URLs; if those are not redirected correctly, many pages can end up de-indexed.
Action item: Set up 301 redirects so that any links to your old URLs point to the preferred ones.
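As a sketch, assuming an Apache server with mod_rewrite enabled (the paths and domains are placeholders), 301 redirects in an .htaccess file look like this:

```apache
# Redirect a single moved page permanently (301) to its new URL
Redirect 301 /old-page/ https://www.example.com/new-page/

# Or redirect an entire old domain to the new one, preserving paths
RewriteEngine On
RewriteCond %{HTTP_HOST} ^old-domain\.com$ [NC]
RewriteRule ^(.*)$ https://www.new-domain.com/$1 [R=301,L]
```

The 301 status tells Google the move is permanent, so it transfers the old URL's signals to the new one instead of de-indexing it.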
- Search Engine Bots:
Sometimes what search engine spiders see differs from what visitors see. The content served to the spider may have been cloaked intentionally, or injected by a third-party hacker to promote their own hidden links and stealthily drive traffic to their site.
The worst case is pages infected with malware: Google will automatically de-index such a page as soon as it is detected.
Action item: Use Google Search Console's URL Inspection tool (the successor to the fetch and render feature) to see whether Googlebot views the same content as you do.
- Page timeout:
Servers have bandwidth restrictions. If your server is hitting its bandwidth limit, pages will time out and search engine bots will not be able to crawl the site properly, which can result in your pages being de-indexed.
Action item: Upgrade your server bandwidth, or reduce page weight, so that pages respond before bots give up.
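To spot slow or timing-out pages before Googlebot does, you can time a fetch with a hard timeout; a minimal sketch using only the Python standard library (the function name and URL are placeholders of our own):

```python
import socket
import time
import urllib.error
import urllib.request

def timed_fetch(url, timeout=5.0):
    """Fetch a URL with a hard timeout.

    Returns (status, elapsed_seconds), or (None, elapsed_seconds) if the
    request timed out or otherwise failed.
    """
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            resp.read()  # download the full body, like a crawler would
            status = resp.status
    except (socket.timeout, TimeoutError, urllib.error.URLError):
        status = None
    return status, time.monotonic() - start

# Example (placeholder URL):
# status, secs = timed_fetch("https://www.example.com/")
```

If status comes back as None, or the elapsed time is close to your timeout, the page is at risk of timing out for crawlers too.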
Even though indexed page count is not the only performance indicator for search engine optimization, only indexed pages can show up in Google search results and drive organic traffic to your website.
Therefore it is important that search engines can adequately crawl and index your pages. Unless the search bot can crawl them properly, your web pages won't be found in the search engine results pages for relevant keywords.