Google uses sophisticated algorithms to determine the optimal crawl rate for a site. Our goal is to crawl as many pages from your site as we can on each visit without overwhelming your server's bandwidth. In some cases, Google's crawling of your site may place a heavy load on your infrastructure, or cause unwanted costs during an outage. To alleviate this, you may decide to reduce the number of requests made by Googlebot.
[null,null,["上次更新時間:2025-02-17 (世界標準時間)。"],[[["Google automatically adjusts crawl rate to avoid overloading your server, but you can reduce it further if needed."],["Temporarily reducing crawl rate can be achieved by returning 500, 503, or 429 HTTP response codes, but this impacts content freshness and discovery."],["For longer-term crawl rate reduction, file a special request with Google; however, increasing the rate isn't possible."],["Before reducing crawl rate, consider optimizing your website structure for better crawling efficiency as this might resolve the issue."],["Extended use of error codes to control crawling may lead to URLs being dropped from Google's index, so it's crucial to use this method cautiously."]]],["Google's crawlers may need to be slowed if they overload a site. Common causes for increased crawling include inefficient site structure, like faceted navigation. For urgent reductions, return `500`, `503`, or `429` HTTP status codes to crawler requests; this will lower the crawl rate, but can negatively affect site indexing if done for too long. Alternatively, if returning errors isn't viable, submit a special request specifying an optimal crawl rate. Note: reducing the crawl rate will result in slower updates of existing pages.\n"]]