Google Says Disallowing UTM Parameters In URLs Won’t Help With Crawling Or Ranking


Google’s John Mueller said on Reddit that disallowing URLs with UTM parameters in robots.txt won’t improve crawling or ranking in Google Search. He added that a site should try to keep its internal URLs clean and consistent, and that over time, canonical tags should take care of external links that carry UTM parameters.
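For context, UTM parameters are analytics tracking tags appended to a URL’s query string, and the Reddit question was about blocking such URLs with a robots.txt rule. A minimal sketch of both, using a hypothetical example.com URL (Googlebot treats * as a wildcard in robots.txt patterns):

https://example.com/page?utm_source=newsletter&utm_medium=email&utm_campaign=spring

User-agent: *
Disallow: /*utm_source=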

When asked about disallowing such URLs, John wrote, “I doubt you’d see any visible effects in crawling or ranking from this. (And if there’s no value from doing it, why do it?)”

He added:

Generally speaking, I’d still try to improve the site so that irrelevant URLs don’t need to be crawled (internal linking, rel-canonical, being consistent with URLs in feeds). I think that makes sense in terms of having things cleaner & easier to track – it’s good website-hygiene. If you have random parameter URLs from external links, those would get cleaned up with rel-canonical over time anyway, I wouldn’t block those with robots.txt. If you’re producing random parameter URLs yourself, say within the internal linking, or from feeds submissions, that’s something I’d clean up at the source, rather than blocking it with robots.txt.

tldr: clean website? yes. block random crufty URLs from outside? no.
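To illustrate the rel-canonical cleanup John describes, a page reached through a UTM-tagged external link can point search engines back to its clean URL. A minimal sketch, again with a hypothetical example.com page:

<!-- served at https://example.com/page?utm_source=newsletter -->
<link rel="canonical" href="https://example.com/page">

With that tag in place, Google consolidates signals from the parameterized URL onto the clean one over time, which is also why blocking those URLs in robots.txt is counterproductive: a blocked page can’t be crawled, so Google never sees the canonical hint.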

This is all very similar to previous advice from John Mueller that I have quoted in earlier stories.

Forum discussion at Reddit.



Source: Seroundtable.com
