In the latest SEO office hours, Google's John Mueller said that Googlebot does not generally try variations of URLs to see if they work. So if you have a URL like domain.com/pagegoeshere1/, Google won’t try domain.com/pagegoeshere2/ just out of curiosity.
To be fair, this is not new: I covered the same question in 2018, when Google’s John Mueller also said Googlebot doesn’t make up URLs. But it has been several years, and it is a common SEO question I see come up from time to time.
John was asked at the 9:50 mark, “Does Google crawl subfolders in a URL path which don’t have pages? Would it be a problem?”
John responded, “Google systems generally don’t just try variations of URLs out; they rely on links to discover new URLs.”
Here is the video:
Here is the transcript:
Does Google crawl subfolders in a URL path which don’t have pages? Would it be a problem?
I’ve seen variations of this over time. It’s very common to have URLs with paths that don’t actually exist. The easy answer is that Google systems generally don’t just try variations of URLs out; they rely on links to discover new URLs.
This means that unless you’re linking to those subdirectories, most likely Google wouldn’t learn about them and try them. That said, even if Google were to try them and they returned a 404, that’s totally fine. Having pages on your site return 404 when they’re not used is expected and not a sign of a problem.
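To illustrate John's point that 404s on made-up URL variations are harmless, here is a minimal sketch (not Google's actual crawler, and the paths are the hypothetical ones from the example above): a tiny local server that serves one real page and correctly returns 404 for any guessed variation.

```python
# Sketch only: a local server with one real page. Any other path,
# such as a guessed URL variation, returns a 404 -- which is expected
# behavior, not a problem.
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.error import HTTPError
from urllib.request import urlopen

KNOWN_PATHS = {"/pagegoeshere1/"}  # the only page that actually exists

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in KNOWN_PATHS:
            body = b"real page"
            self.send_response(200)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)  # unknown variation: 404 is the right answer

    def log_message(self, *args):
        pass  # silence per-request logging

def fetch_status(url):
    """Return the HTTP status code for a GET request to url."""
    try:
        return urlopen(url).status
    except HTTPError as err:
        return err.code

# Bind to an ephemeral port and serve in a background thread.
server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_address[1]}"

linked = fetch_status(base + "/pagegoeshere1/")   # the real, linked page
guessed = fetch_status(base + "/pagegoeshere2/")  # a made-up variation
print(linked, guessed)  # 200 404
server.shutdown()
```

The real page answers 200 and the guessed variation answers 404, mirroring the behavior John describes as expected and not a sign of a problem.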
Forum discussion at YouTube.
Source link: Seroundtable.com