How to use a robots.txt on GitHub Pages


I have a repository on GitHub with a site published via GitHub Pages, and I want search engines to index only the site's homepage. On a regular site I could add a robots.txt to the root folder, but GitHub doesn't give me access to the domain's root folder, only to the repository folder. What do I do?

*Meta tags won't work here, because I also want Googlebot to skip other non-HTML files that are in the repository.

  • There is a thread on SOen that discusses this; have you already taken a look?

1 answer



If it's GitHub Pages, your site has a domain like silas333.github.io, so just upload a robots.txt to the root of your repository. I have a website on GitHub Pages and it works normally: https://inphinit.github.io/robots.txt
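For the case in the question (only the homepage indexed), a minimal sketch of such a robots.txt might look like this. Note that the `$` end-of-URL wildcard is a Google extension, not part of the original robots.txt convention, so this assumes the crawler is Googlebot or another bot that supports it:

```text
# Allow only the homepage; block everything else.
# "$" (end-of-URL match) is a Googlebot extension, not standard robots.txt.
User-agent: *
Allow: /$
Disallow: /
```

Crawlers that don't support `$` may interpret `Allow: /$` differently, so it's worth verifying the rules with Google's robots.txt testing tools after deploying.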

What you can't do is create a robots.txt for your repository page, which is something totally different from the GitHub Pages domain. That is to say, you have no control over the github.com domain, but you do control your own .github.io subdomain.
