Ruby on Rails
Anton, 2016-09-09 23:43:44

How to create /robots.txt for a Rails application?

Rails 5.0.0.
The robots.txt file is located in the application's public directory, alongside files such as sitemap.xml.gz and the ownership-verification files for Google and Yandex. All of these files are reachable at site.ru/file.format in both development and production.
But for some reason the problem occurs only with robots.txt. At first it was available only in development; now it is not available at all. I re-uploaded it to the server, which did not help.
The command
curl http://site.ru/robots.txt
returns an empty response.
What's going on?
UPD
Now I see the following.
In development mode:

curl http://site.ru/robots.txt
User-agent: *
Disallow:
Disallow: /user
Disallow: /admin

Sitemap:

In production mode it is the same as above, but that is all coming from the cache. Yandex cannot reach the file when the site runs in production mode. What's going on?
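As an aside for anyone hitting the same symptom: a common Rails-side fallback is to serve robots.txt through a controller, so it works regardless of static-file settings. A minimal sketch; the controller name and route are illustrative, not from this thread:

```ruby
# config/routes.rb
get '/robots.txt', to: 'robots#show'

# app/controllers/robots_controller.rb
class RobotsController < ApplicationController
  # Reads public/robots.txt and returns it as text/plain,
  # bypassing the static file server entirely.
  def show
    render plain: File.read(Rails.root.join('public', 'robots.txt'))
  end
end
```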


2 answers
Karim Kyatlottyavi, 2016-09-10
@constXife

Perhaps something is intercepting the request for robots.txt. Check routes.rb, run rake routes, and disable any extraneous gems; if you are using nginx, check that there are no handlers for it there either. Look carefully: the interception does not necessarily mention robots.txt explicitly, it could be some hard-to-spot wildcard route.
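To make the wildcard scenario concrete, here is a hypothetical routes.rb (not taken from the question) whose catch-all would swallow /robots.txt whenever the request reaches the router, i.e. when Rails' static file middleware is not serving public/:

```ruby
# config/routes.rb (hypothetical illustration)
Rails.application.routes.draw do
  # ... application routes ...

  # A catch-all like this matches /robots.txt too once the request
  # falls through to the router instead of being served from public/:
  get '*path', to: 'pages#show'
end
```

In a Rails console, `Rails.application.routes.recognize_path('/robots.txt')` shows which route would claim the request.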

Falsealarm, 2016-10-08
@Falsealarm

If you are using nginx, robots.txt must be covered by the nginx config, since in production the web server, not Rails, is responsible for serving static files.

server {
    .....
    location ~ ^/(assets|fonts|system)/|favicon\.ico|robots\.txt {
        gzip_static on;
        expires max;
        add_header Cache-Control public;
    }
    .....
}
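A likely root cause for the production-only symptom, assuming a standard Rails 5 setup: in production, Rails serves files from public/ itself only when the RAILS_SERVE_STATIC_FILES environment variable is set; otherwise it expects the web server (nginx, as above) to do it. A minimal plain-Ruby sketch of that default check:

```ruby
# Mirrors Rails 5's production default for config.public_file_server.enabled
# (an assumption about the questioner's config, not taken from the thread).
serve_static = !ENV['RAILS_SERVE_STATIC_FILES'].to_s.empty?

# When this is false and nginx has no rule for robots.txt, the request
# falls through to the Rails router and can come back empty.
puts(serve_static ? 'Rails serves public/ files itself' : 'public/ must be served by nginx')
```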
