The legacy Search Console feature that lets site owners control how fast Google crawls their website is going away on January 8, 2024. The crawl rate limiter tool launched about 15 years ago, but Google never migrated it to the new version of Search Console. Gary Illyes from Google said, “with the improvements we’ve made to our crawling logic and other tools available to publishers, its usefulness has dissipated.”
Gary noted that the tool was slow to react to setting changes; in fact, Google told us years ago that a change could take a day to kick in, and by then your server may be toast (okay, kidding about the toast). Gary wrote, “the rate limiter tool had a much slower effect; in fact it may have taken over a day for the new limits to be applied on crawling.”
He also said very few site owners used the tool, and that when they did, it often didn’t do what they expected. “Fortunately though, site owners rarely had to resort to using the tool, and those who have, in many cases set the crawling speed to the bare minimum.”
Google is also setting the minimum crawling speed of Googlebot to a lower rate, “comparable to the old crawl rate limits,” Gary explained. This is to “continue honoring the settings that some site owners have set in the past if the Search interest is low, and our crawlers don’t waste the site’s bandwidth,” he said.
“Googlebot reacts to how the site – or more specifically the server handling the site – responds to Googlebot’s HTTP requests. For example, if the server persistently returns HTTP 500 status codes for a range of URLs, Googlebot will automatically, and almost immediately slow down crawling. Similarly, Googlebot slows down automatically if the response time for requests gets significantly longer. If you do experience unusually heavy crawling that your site can’t manage on its own, refer to this help article,” he explained.
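To illustrate the kind of signal Googlebot reacts to, here is a minimal sketch of a server that returns HTTP 503 when it is under heavy load. This is not Google’s code, and the threshold, window, and Retry-After values are hypothetical illustrations, not anything Google prescribes:

```python
# Toy sketch: a server that signals overload with HTTP 503 so that
# well-behaved crawlers back off. MAX_REQUESTS_PER_WINDOW and
# WINDOW_SECONDS are hypothetical knobs for illustration only.
import time
from collections import deque
from http.server import BaseHTTPRequestHandler, HTTPServer

MAX_REQUESTS_PER_WINDOW = 50   # hypothetical overload threshold
WINDOW_SECONDS = 10            # sliding window for counting requests

recent_requests = deque()      # timestamps of recent hits

class ThrottlingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        now = time.time()
        # Drop timestamps that have fallen out of the sliding window.
        while recent_requests and now - recent_requests[0] > WINDOW_SECONDS:
            recent_requests.popleft()
        recent_requests.append(now)

        if len(recent_requests) > MAX_REQUESTS_PER_WINDOW:
            # Overloaded: a 503 (or 500/429) tells crawlers to slow down.
            self.send_response(503)
            self.send_header("Retry-After", "120")  # advisory, in seconds
            self.end_headers()
            self.wfile.write(b"Server busy, please slow down.\n")
            return

        # Normal response path.
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"Normal page content.\n")

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), ThrottlingHandler).serve_forever()
```

Returning a 503 with a Retry-After header is a standard way to ask crawlers to back off temporarily without blocking them outright, which fits the behavior Gary describes: Googlebot slows down on its own when it sees persistent server errors or degrading response times.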
For now, you can still access the tool in Search Console; it looks like this:
[Screenshot: the current crawl rate limiter tool]
Here is how it looked when it launched:
[Screenshot: the crawl rate limiter at launch]
Forum discussion at X.
Source: Seroundtable.com