Search Engine Optimization
dNertyco, 2021-06-21 19:03:58

How do I force search robots to fetch pages via GET requests?

I need to make the crawler fetch the following URLs:
https://example.com/link?link=1
https://example.com/link?link=2
https://example.com/link?link=3
https://example.com/link?link=4 ...

How?



1 answer
Rinat Haisman, 2021-07-02
@Rinat_Haisman

If we are talking about Yandex: open the Yandex.Webmaster panel and go to the "Indexing" section. Here is a screenshot: https://cln.sh/jCzWpG
1. Submit the pages for re-crawling
2. Add them to monitoring
If it's Google: open Google Search Console and go to the "URL Inspection" section. Here is a screenshot: https://cln.sh/MDy93M
1. Enter the page address in the inspection field
2. Click the "Request indexing" button
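Both tools above submit URLs one at a time. For a batch of parameterized URLs like the ones in the question, the usual complement is to list them in a sitemap.xml and submit that file in the same webmaster panels. A minimal sketch in Python (the URL pattern is taken from the question; the `build_sitemap` helper is illustrative, not part of any tool):

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return a minimal sitemap.xml document listing the given URLs."""
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>\n"
    )

# The parameterized URLs from the question
urls = [f"https://example.com/link?link={i}" for i in range(1, 5)]
print(build_sitemap(urls))
```

Save the output as sitemap.xml at the site root and reference it from robots.txt (`Sitemap: https://example.com/sitemap.xml`), then submit it in Yandex.Webmaster or GSC.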
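A prerequisite for either approach is that robots.txt does not block the parameterized URLs. This can be checked with Python's standard-library `urllib.robotparser` (the robots.txt rules below are made up for illustration):

```python
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
# Parse robots.txt contents directly; these rules are an assumed example
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# /link?link=... is not covered by any Disallow rule, so crawling is allowed
print(rp.can_fetch("*", "https://example.com/link?link=1"))
```

In production you would point the parser at the live file with `rp.set_url("https://example.com/robots.txt")` followed by `rp.read()`.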
