Crawl Budget
The number of pages a search engine bot will crawl on your website within a given timeframe, determined by crawl rate limit and crawl demand.
What is Crawl Budget?
Crawl budget refers to the finite resources search engines allocate to crawling your website. It is determined by two factors: the crawl rate limit, which is the maximum crawling speed that will not degrade your server performance, and crawl demand, which is how much Google wants to crawl your site based on its popularity and freshness.
For most small to medium websites, crawl budget is rarely a concern because search engines can easily crawl the entire site. However, for large websites with thousands or millions of pages (such as e-commerce stores, news sites, or platforms with user-generated content), crawl budget becomes a critical optimization factor. If important pages are not crawled frequently enough, they may not be indexed, or they may appear with outdated content in search results.
Optimizing crawl budget involves several strategies: keeping your XML sitemap updated so it lists only indexable pages, using robots.txt to block low-value pages from being crawled, fixing broken links and redirect chains, improving site speed so pages can be fetched faster, reducing duplicate content, and handling pagination properly. The Crawl Stats report in Google Search Console shows how efficiently Googlebot is crawling your site.
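As a sketch of the robots.txt strategy above, a large site might steer crawlers away from URL patterns that waste crawl budget. The paths and parameter names here are hypothetical examples, not a universal recipe:

```text
# Hypothetical robots.txt for a large e-commerce site.
# Blocks crawl-wasting URL patterns while leaving product pages open.
User-agent: *
# Internal-search and faceted-navigation URLs that generate near-duplicates
Disallow: /search
Disallow: /*?sort=
Disallow: /*?sessionid=
# Cart and checkout pages have no value in search results
Disallow: /cart
Disallow: /checkout

Sitemap: https://www.example.com/sitemap.xml
```

Note that a robots.txt Disallow rule prevents crawling, not indexing: a blocked URL that search engines already know about can still appear in results, so this technique should be paired with a clean sitemap and, where needed, noindex directives on pages that remain crawlable.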