Google has updated, in a big way, its “Remove images hosted on your site from search results” help documentation page. Google said, “Part of our ongoing efforts to keep our documentation accurate, we updated the documentation for image removals with more precise language, and addressed some documentation feedback.”
So what changed?
Previously, this section read:
For quick removal, use the Removals tool to remove images hosted on your site from Google’s search results within hours.
Now that section reads:
For emergency image removal
To quickly remove images hosted on your site from Google’s search results, use the Removals tool. Keep in mind that unless you also remove the images from your site or otherwise block the images as described in the non-emergency image removal section, the images may resurface in Google’s search results once the removal request expires.
Google added this new section:
If you don’t have access to the site that’s hosting your images (for example a CDN) or your CMS doesn’t provide a way to block images with the noindex X-Robots-Tag HTTP header or robots.txt, you might need to delete the images altogether from your site.
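If your server does let you set response headers, the noindex X-Robots-Tag blocking Google refers to typically looks something like the following sketch for Apache with mod_headers enabled (the file extensions here are just placeholders; adjust them to the images you want blocked):

<FilesMatch "\.(png|jpe?g|gif|webp)$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>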
Previously, this section read:
To prevent images from your site appearing in Google’s search results, add a robots.txt file to the root of the server that blocks the image. While it takes longer to remove an image from search results than it does to use the Removals tool, it gives you more flexibility and control through the use of wildcards or subpath blocking. It also applies to all search engines, whereas the Remove URLs tool only applies to Google.
Now that section reads:
To prevent images from your site appearing in Google’s search results, add a robots.txt file to the root of the site that hosts the image, for example https://yoursite.example.com/robots.txt. While it takes longer to remove an image from Google’s search results using robots.txt rules than it does to use the Removals tool, it gives you more flexibility and control through the use of wildcards or subpath blocking. It also applies to all search engines, whereas the Removals tool only applies to Google.
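As a rough sketch, a robots.txt file at that root which blocks a hypothetical /photos/ subpath for any crawler that honors robots.txt could look like this:

User-agent: *
Disallow: /photos/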
Previous:
The next time Google crawls your site, we’ll see this rule and drop your image from our search results.
Rules may include special characters for more flexibility and control. The * character matches any sequence of characters, and patterns may end in $ to indicate the end of a path.
Now:
The next time Google crawls the dogs.jpg image, we’ll see this rule and drop your image from our search results.
To remove multiple images on your site from Google’s index, add a disallow rule for each image, or if the images share a common pattern such as a suffix in the filename, use the * character in the filename. For example:
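Rules along those lines, with placeholder paths and filenames, would look something like this:

User-agent: Googlebot-Image
Disallow: /images/dogs.jpg
Disallow: /images/cats.jpg

Or, with a wildcard matching a shared filename pattern (a trailing $ can be used to anchor the match to the end of the path):

User-agent: Googlebot-Image
Disallow: /images/pet-photo-*.jpg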
Google also added more examples of using the wildcard character in the filename.
Previous:
To remove all the images on your site from our index, place the following robots.txt file in your server root:
Now:
To remove all the images on your site from our index, place the following rule in your robots.txt file:
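The rule in question is the standard Googlebot-Image block for the entire site:

User-agent: Googlebot-Image
Disallow: /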
Google also added this section:
Note that adding the noimageindex robots tag to a particular page will also prevent images embedded in that page from getting indexed. However, if the same images also appear in other pages, they might get indexed through those pages. To make sure a particular image is blocked no matter where it appears, use the noindex X-Robots-Tag HTTP response header.
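For reference, the page-level tag goes in the embedding page’s HTML head, roughly like this:

<meta name="robots" content="noimageindex">

whereas the image-level block is an X-Robots-Tag: noindex HTTP response header served with the image file itself, along the lines of the Apache sketch shown earlier.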
There were a few more tweaks as well that I did not cover.
Forum discussion at X.