How do I force search robots to crawl URLs with GET parameters?
I need the crawler to go through pages like these:
https://example.com/link?link=1
https://example.com/link?link=2
https://example.com/link?link=3
https://example.com/link?link=4 ...
How?
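Whatever search engine is involved, the URLs first have to be discoverable. A common way to expose a list of parameterized URLs like these is a sitemap.xml. A minimal sketch in Python, assuming the four example URLs above (adjust N to the real number of pages):

```python
# Sketch: generate a sitemap.xml listing the parameterized URLs.
from xml.sax.saxutils import escape

N = 4  # assumption: set this to the real number of ?link= pages

urls = [f"https://example.com/link?link={i}" for i in range(1, N + 1)]
entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)

sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>\n"
)

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(sitemap)
```

Reference the file from robots.txt with a `Sitemap: https://example.com/sitemap.xml` line and submit it in both webmaster panels; URLs that differ only by query string count as separate pages as long as they return distinct content.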
If we are talking about Yandex, go to the Yandex.Webmaster panel, "Indexing" section. Here's a screenshot: https://cln.sh/jCzWpG
1. Submit the pages for re-crawling
2. Add them to monitoring (a programmatic alternative is sketched below)
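If there are many URLs, Yandex also accepts submissions via the IndexNow protocol, so you don't have to click through the panel page by page. A minimal sketch, assuming you have already generated an IndexNow key and hosted it as a .txt file in the site root ("your-indexnow-key" is a placeholder):

```python
# Sketch: submit the example URLs to Yandex via IndexNow.
# Prerequisite: a key file at https://example.com/<KEY>.txt containing the key.
import json
import urllib.request

KEY = "your-indexnow-key"  # placeholder: replace with your generated key
payload = {
    "host": "example.com",
    "key": KEY,
    "keyLocation": f"https://example.com/{KEY}.txt",
    "urlList": [f"https://example.com/link?link={i}" for i in range(1, 5)],
}

req = urllib.request.Request(
    "https://yandex.com/indexnow",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json; charset=utf-8"},
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    print(resp.status)  # 200 or 202 means the submission was accepted
```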
If it's Google, go to Google Search Console, the "URL Inspection" tool. Here is a screenshot: https://cln.sh/MDy93M
1. Open the tool and enter the page address
2. Click the "Request indexing" button (an API-based alternative is sketched below)
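Google also has an Indexing API, but with a caveat: Google officially supports it only for pages with JobPosting or BroadcastEvent structured data, so for ordinary pages the "Request indexing" button above is the supported route. A hedged sketch, assuming a service account with the Indexing API enabled and added as an owner in Search Console ("service-account.json" is a placeholder path):

```python
# Sketch: notify Google of updated URLs via the Indexing API.
# Requires: pip install google-auth requests
from google.oauth2 import service_account
from google.auth.transport.requests import AuthorizedSession

SCOPES = ["https://www.googleapis.com/auth/indexing"]
ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

credentials = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES  # placeholder key-file path
)
session = AuthorizedSession(credentials)

for i in range(1, 5):  # adjust the range to your real URLs
    body = {
        "url": f"https://example.com/link?link={i}",
        "type": "URL_UPDATED",  # tells Google the page was added or updated
    }
    resp = session.post(ENDPOINT, json=body)
    print(resp.status_code, resp.json())
```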