Depending on the current indexing, two steps are needed: ensure that the site will not be re-indexed, and ensure that it no longer appears in Google search results returning a 404 error:
Disable indexing
To ensure that the site is permanently removed from Google search results, create a file named robots.txt at the domain root to block crawler access to the site:
Content of robots.txt:
User-agent: *
Disallow: /
These directives tell the crawler that nothing on this site should be indexed.
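As a quick sanity check, you can parse those two directives with Python's standard urllib.robotparser and confirm they block every path for every user agent. This is only a local sketch: the rules are parsed from a string rather than fetched from a live domain, and example.com is a placeholder.

```python
# Verify that the robots.txt directives above block all crawlers.
# The rules are parsed from a string here instead of being fetched
# from a real domain; example.com is a placeholder.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Every path should be disallowed for any user agent.
print(parser.can_fetch("Googlebot", "https://example.com/"))          # False
print(parser.can_fetch("Googlebot", "https://example.com/any/page"))  # False
```

The same parser can be pointed at a live site with set_url() and read() once the file is deployed.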
Remove the site and its pages from the search results
After blocking crawler access to the domain, follow these steps to remove the domain in question from Google search results:
Log in to Google Webmaster Tools
Choose the intended site from the list of existing sites
In the side menu, choose Google Index
Choose the sub-option Remove URLs
On that page, there is a button on the right side to create a new removal request:
After clicking the button, enter the address you want removed:
If a box appears asking what kind of removal you want, choose removal from both the cache and the search results.
This way, you not only block crawler access to the site, but also ensure that the site in question will no longer appear in the search results, avoiding the possibility of 404 errors.
Part of these instructions can be found on this Google Webmaster Tools Help page.