What is the point of using robots.txt on GitHub Pages?


According to this Stack Overflow answer, it is possible to use robots.txt to keep search engines from indexing pages that the webmaster does not want indexed.

How is that possible?

Through a custom domain.
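
As a rough sketch of what that looks like (the domain and paths below are hypothetical, not taken from the linked answer), a robots.txt served from the root of the custom domain could ask compliant crawlers to skip certain paths:

    # Hypothetical robots.txt at https://example.com/robots.txt,
    # with example.com as a custom domain pointed at a GitHub Pages site.
    # Asks well-behaved crawlers not to index the listed paths.
    User-agent: *
    Disallow: /private/
    Disallow: /drafts/

Note that crawlers only read robots.txt from the root of a host, and that the file is advisory: crawlers that honor it will skip those paths, but the files themselves stay publicly reachable.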

However, if the purpose of robots.txt is, for example, to mark off a private area of the site, what is the point of trying to "hide" any content with it, when everything can be viewed freely on the GitHub platform anyway?

For the content to be effectively hidden, would it then be necessary to pay for GitHub and use the private repositories feature?
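
To illustrate the scope behind that doubt (a hedged sketch; the username and file name are made up): a robots.txt on a GitHub Pages site only covers that site's own URLs, while the repository view of the same files on github.com is governed by github.com's own robots.txt:

    # Hypothetical scope of a robots.txt on a GitHub Pages site.
    #
    # Covered by https://username.github.io/robots.txt:
    #   https://username.github.io/secret.html
    #
    # NOT covered (repository view, subject to github.com's own robots.txt):
    #   https://github.com/username/username.github.io/blob/main/secret.html

So "hiding" a page from crawlers on the Pages site does not hide the underlying repository; only a private repository would do that.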

  • Jonathas, I recommend reading the tag descriptions before adding them to your question; this question is unrelated to "Git", and the "Search" tag is about algorithms.

  • Indeed, I apologize for that; I'll pay more attention, and thank you for the warning. Can you think of any other tag that fits the question? I try to add as many tags as possible to gain visibility, so I end up making these mistakes.

  • The bots will index the "sources" on github.com; they just won't index the GitHub Pages site. So if you're using GitHub Pages for your own website, you'll be blocking some bots. As for github.com itself, what gets indexed is the files and "sources". I'm not sure how to explain it well, but that's the basic difference.

No answers
