Hello, I found a robots.txt file with the following directives inside it:
User-agent: Googlebot
Allow: *
User-agent: Bingbot
Allow: *
User-agent: facebot
Allow: *
User-agent: *
Disallow: /
My question is: won't that last Disallow rule cancel out all of the Allow rules above it?
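For what it's worth, here is a minimal sketch using Python's standard-library urllib.robotparser to check how a parser that follows the classic robots exclusion protocol reads these groups (the example.com URLs and the bot name SomeOtherBot are made up; Google's own matcher can differ in edge cases, but not on this group precedence):

from urllib.robotparser import RobotFileParser

# The robots.txt from the question, verbatim.
robots = """User-agent: Googlebot
Allow: *
User-agent: Bingbot
Allow: *
User-agent: facebot
Allow: *
User-agent: *
Disallow: /"""

rp = RobotFileParser()
rp.parse(robots.splitlines())

# A bot with its own named group obeys that group, not "User-agent: *"...
print(rp.can_fetch("Googlebot", "https://example.com/some/page"))     # True
# ...while any bot without a named group falls into the catch-all.
print(rp.can_fetch("SomeOtherBot", "https://example.com/some/page"))  # False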
I thought that Disallow: / denied access to all internal directories: something like publicHTML would still be accessible, but the bot would not have access to publicHTML/Configuration. Like, the bot would only see what is at the root, and not the folders inside it... But I guess I had that wrong, heh – hugocsl
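As a side note, Disallow matches by path prefix, and "/" is a prefix of every URL path, so Disallow: / blocks the root and the top-level folders as well, not only the nested ones. A minimal sketch, again with the stdlib parser and a made-up example.com host:

from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /"])

# "/" is a prefix of every path, so everything is blocked,
# the root and top-level folders included.
print(rp.can_fetch("AnyBot", "https://example.com/"))                          # False
print(rp.can_fetch("AnyBot", "https://example.com/publicHTML"))                # False
print(rp.can_fetch("AnyBot", "https://example.com/publicHTML/Configuration"))  # False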
@hugocsl yes, this blocks everything, but as for the "order", the "explicit" Allow apparently gives preference to the "named" bot, although it depends on how you write it; reviewing it, in this case the answer comes down to "a wildcard" versus "not the wildcard". So maybe I will soon need to rework the answer to detail examples of how to write this, but I need to be sure of everything, because I only had Google's support texts as a basis, and maybe I "missed" something, beyond what I still need to add. – Guilherme Nascimento
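On the "preference for the named bot" point: under the robots exclusion standard (RFC 9309), a crawler obeys only the group whose User-agent line matches it most specifically, and falls back to the * group otherwise; the order of the groups in the file does not matter. Below is a hypothetical sketch of that selection step, approximating the match with a longest-substring rule as common parsers do (select_group and the groups dict are illustrative names, not a real library API):

# Hypothetical sketch of RFC 9309 group selection, not a real library API.
def select_group(groups: dict[str, list[str]], agent: str) -> list[str]:
    agent = agent.lower()
    best_name, best_len = None, -1
    for name in groups:
        token = name.lower()
        # A named group applies when its token occurs in the bot's name;
        # the longest (most specific) match wins.
        if token != "*" and token in agent and len(token) > best_len:
            best_name, best_len = name, len(token)
    # Only bots with no matching named group fall back to "*".
    return groups[best_name] if best_name else groups.get("*", [])

groups = {
    "Googlebot": ["Allow: *"],
    "Bingbot": ["Allow: *"],
    "facebot": ["Allow: *"],
    "*": ["Disallow: /"],
}

print(select_group(groups, "Googlebot"))   # ['Allow: *']    (the named group wins)
print(select_group(groups, "UnknownBot"))  # ['Disallow: /'] (falls back to "*")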