How do I restrict search robots to only the pages they need using robots.txt?
I have read a lot of articles on writing a robots.txt file, but I still don't understand it. It would be easier for me to block everything except the pages I specify. What is the best way to do this?
What specifically do you not understand? Everywhere it says the same thing:
disallow
The disallow directive defines specific paths that should be inaccessible to the specified crawlers. If no path is specified, it is ignored.
disallow: [path]
allow
The allow directive defines the paths that should be accessible to the specified crawlers. If no path is specified, it is ignored.
allow: [path]
To disallow EVERYTHING except the pages you specify:
user-agent: *
allow: /url-1.html
allow: /url-2.html
allow: /2020-08-28/*
disallow: /
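For the URLs listed above, the allow rules win because Google and Yandex apply the most specific (longest) matching rule; everything else falls under disallow: /. As a quick local sanity check, here is a minimal Python sketch using the standard urllib.robotparser module (the example.com URLs are placeholders based on the example above). Note that urllib.robotparser matches paths literally and does not understand wildcards such as /2020-08-28/*, so wildcard rules are better verified in Google Search Console or Yandex.Webmaster.

from urllib.robotparser import RobotFileParser

# The rules from the example above; the user-agent line is required
# so parsers know which crawlers the group applies to.
rules = """\
user-agent: *
allow: /url-1.html
allow: /url-2.html
disallow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Only the explicitly allowed page is fetchable; everything else is blocked.
print(parser.can_fetch("*", "https://example.com/url-1.html"))       # True
print(parser.can_fetch("*", "https://example.com/some-other.html"))  # False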