Have you ever had a site that ranks for unrelated keywords, queries you do not want to rank for? John Mueller from Google said that if you do, you may want to make your title and content clearer if they are too ambiguous.
He said you can either ignore the fact that it ranks for those queries or try to improve the content overall. He added, “Sometimes pages rank for unexpected things, you can’t prevent it, and that won’t negatively affect the rest of your site.”
John wrote that in response to a question on LinkedIn.
The question was from Álvaro Pichó Torres:
To prevent my website and/or specific URLs from showing up in impressions for certain queries/searches, which is better: a meta noindex in the head, or blocking via robots.txt?
My exact case: I have done an SEO audit about the industrial sector of metal coatings, and my website comes up in SERPs and Google Search Console Impressions for ‘metal coatings workshop’, which messes up my ReferenceQueries, and I want to remove them without deleting the post.
John Mueller replied that blocking Google won’t really help here, but improving the content might. He said:
If you noindex or robots.txt disallow the page, it won’t appear for normal searches either. I’d just ignore it, or make title / description a bit clearer if they’re ambiguous. Sometimes pages rank for unexpected things, you can’t prevent it, and that won’t negatively affect the rest of your site.
If you want to have the page blocked from indexing completely, then noindex is the right mechanism.
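For reference, the noindex Mueller mentions is a robots meta tag placed in the page’s head. A generic sketch (this snippet is not from the thread):

```html
<!-- Inside the page's <head>: asks search engines to drop this URL
     from the index while still allowing it to be crawled -->
<meta name="robots" content="noindex">
```

By contrast, a robots.txt Disallow only blocks crawling: Google may still index the blocked URL from external links, and it can never see a noindex on a page it isn’t allowed to crawl. That is why blocking doesn’t solve the unwanted-query problem, while noindex removes the page from search entirely.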
What would you do?
Forum discussion at LinkedIn.
Source link: Seroundtable.com