Google continues to index deleted pages

I work at a company that runs a Magento 1.9 store, and in order to improve our organic search results we are trying to remove hundreds of pages indexed by Google. However, I am facing two difficulties.

Shop: lojamastertoys.com.br

1) The "Categories" page is generic: it lists all products, whether for boys or girls. This category and all of its subpages were therefore set to noindex, nofollow in the meta robots tag, yet they are still indexed.

2) In robots.txt, rules were added to block, for example, the category page and all of its subpages, as well as the My Account, Contact, About, and Wishlist pages and product search (catalogsearch), among others, but after several days they are still indexed.

Below is my robots.txt. Maybe I did something wrong, or you may know another way to de-index these pages.

User-agent: Googlebot-Image
User-agent: googlebot
User-agent: google
User-agent: bingbot
User-agent: bing
Disallow: /404
Disallow: /app
Disallow: /cgi-bin
Disallow: /downloader
Disallow: /errors
Disallow: /includes
Disallow: /lib
Disallow: /magento
Disallow: /pkginfo
Disallow: /report
Disallow: /scripts
Disallow: /shell
Disallow: /stats
Disallow: /var
Disallow: /index.php
Disallow: /catalogsearch
Disallow: /catalogsearch/*
Disallow: /catalog/*
Disallow: /catalog/product_compare
Disallow: /catalog/product_compare/*
Disallow: /catalog/category/view
Disallow: /catalog/category/view/*
Disallow: /catalog/product/view
Disallow: /catalog/product/view/*
Disallow: /catalog/product/gallery
Disallow: /catalog/product/gallery/*
Disallow: /checkout
Disallow: /control
Disallow: /contacts
Disallow: /customer
Disallow: /customize
Disallow: /newsletter
Disallow: /poll
Disallow: /review
Disallow: /sendfriend
Disallow: /tag
Disallow: /wishlist
Disallow: /cron.php
Disallow: /cron.sh
Disallow: /error_log
Disallow: /install.php
Disallow: /LICENSE.html
Disallow: /LICENSE.txt
Disallow: /LICENSE_AFL.txt
Disallow: /STATUS.txt
Disallow: /*?SID=
Disallow: /*?*SID=
Disallow: /politica-de-trocas
Disallow: /categorias
Disallow: /categorias/*
Disallow: /sobre
Disallow: /contato
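One way to sanity-check whether these Disallow rules actually match the URLs in question is Python's `urllib.robotparser` (a sketch: only an excerpt of the rules is reproduced, the agent group is simplified to `User-agent: *`, and `example.com` stands in for the store's domain):

```python
from urllib import robotparser

# A minimal excerpt of the robots.txt above (prefix rules only).
rules = """User-agent: *
Disallow: /catalogsearch
Disallow: /categorias
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# A Disallow rule is a prefix match, so it already covers every
# path beneath it; the trailing /* variants add nothing.
print(rp.can_fetch("googlebot", "https://example.com/catalogsearch/result/"))  # False
print(rp.can_fetch("googlebot", "https://example.com/categorias/meninas"))     # False
print(rp.can_fetch("googlebot", "https://example.com/produto-x"))              # True
```

Checks like these suggest the duplicated `/*` variants in the file can be dropped without changing what is blocked.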

  • Are you sure you need the * after the path, as in "/catalogsearch/*"? As far as I know, declaring just "Disallow: /catalogsearch/" already covers every path under that folder.

  • I added the duplicated rules (the /* variants) today: since it wasn't working, I tried forcing it this way. I believe the practice is wrong, but I needed to test it.

  • Any test is worth trying at this point, heh... One more thing: remove all the links from your sitemap, then generate a new sitemap and submit the updated one to Google.
