In the Google SEO office hours hangout, Google’s John Mueller was asked why Google did not crawl enough web pages.
The person who asked the question explained that Google's crawling was not keeping pace with an extremely large website.
John Mueller explained why Google may not be able to crawl enough pages.
Google Crawl Budget
GoogleBot is the name of Google's crawler, which moves from one page to another, indexing pages for ranking.
But because the web is so vast, Google's strategy is to index only high-quality web pages and skip low-quality ones. The amount of crawling Google allots to a given site over a period of time is commonly called its crawl budget.
How Does GoogleBot Decide Crawl Budget?
The person asking the question had a site with hundreds of thousands of pages.
But Google was only crawling about 2,000 web pages per day, a rate far too slow for a site of that size.
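The scale problem here is easy to quantify. A minimal sketch, using the roughly 2,000 pages per day mentioned in the hangout and illustrative page counts (the site's exact size was not given):

```python
# Rough crawl-time arithmetic for a large site.
PAGES_PER_DAY = 2_000  # approximate rate reported in the hangout

def days_to_full_crawl(total_pages, pages_per_day=PAGES_PER_DAY):
    """Days needed for Google to crawl every page once, rounding up."""
    return -(-total_pages // pages_per_day)  # ceiling division

# Illustrative sizes for a "hundreds of thousands of pages" site:
print(days_to_full_crawl(200_000))  # 100 days for 200,000 pages
print(days_to_full_crawl(500_000))  # 250 days for 500,000 pages
```

At that pace, months can pass before new or updated pages are even seen, which is why the question was raised in the first place.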
Site Quality Can Impact GoogleBot Crawl Budget
Google’s John Mueller then mentioned the quality of the website.
Poor website quality can hold GoogleBot back from crawling the site thoroughly.
Factors Affecting How Many Pages Google Crawls
There are other factors, not mentioned in the hangout, that can affect how many pages Google crawls.
For example, a website hosted on a shared server may not be able to deliver pages to Google fast enough, because other websites on the same server may be consuming too many resources and slowing the server down for every site it hosts.
Another reason may be that the server is being hit by rogue bots, slowing the website down. John Mueller recommends checking the speed at which the server delivers web pages.
Make sure to check outside normal daytime hours, because many crawlers like Google crawl in the early hours of the morning, since that is usually a less disruptive time, with fewer visitors on the site.
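One simple way to act on this advice is to time how long the server takes to deliver a page, and compare runs taken at peak hours with runs taken in the early morning. A minimal sketch using only the Python standard library; the URL shown is a placeholder, and the one-second "slow" threshold is an assumption, not a Google figure:

```python
import time
from statistics import mean
from urllib.request import urlopen

def time_page_fetch(url, attempts=3):
    """Fetch a URL several times, returning each fetch's duration in seconds."""
    timings = []
    for _ in range(attempts):
        start = time.perf_counter()
        with urlopen(url) as response:
            response.read()  # read the full body, like a crawler would
        timings.append(time.perf_counter() - start)
    return timings

def summarize_timings(timings, slow_threshold=1.0):
    """Return (average fetch time, count of fetches over the threshold)."""
    return mean(timings), sum(1 for t in timings if t > slow_threshold)

# Hypothetical usage: run once at peak hours, once in the early morning,
# and compare the averages.
# avg, slow_count = summarize_timings(time_page_fetch("https://example.com/page"))
```

If the early-morning average is much better than the daytime one, visitor load or noisy neighbors on a shared server are likely eating into the speed GoogleBot sees.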