robots.txt
Greg Plitt, 2020-05-30 12:15:52

How to block a URL containing # (hash) in robots.txt?

The site is full of pages like these:
****.**/p/sharp-00042277
****.**/p/sharp-00042277#!path=odnotonnye-tkani
****.**/p/sharp-00042277?_escaped_fragment_=path=odnotonnye-tkani

I want to block the duplicates from indexing and leave only the original page.

Rule:
Disallow: /p/*?

It blocks the last URL from indexing, but I can't manage to block the URL that contains "#!path=".

I tried
Disallow: /p/*path

But the URL stays open; this directive only works if the "#" is removed from the URL.

So, what rule can be used to block a URL of this form:
****.**/p/sharp-00042277#!path=odnotonnye-tkani



1 answer
SagePtr, 2020-05-30
@bit24yes

Nothing will work here: everything after the # symbol is never transmitted to the server, so robots.txt rules cannot match it.
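
A minimal sketch, assuming the duplicates follow the old AJAX-crawling ?_escaped_fragment_= pattern shown in the question: robots.txt can only match the part of the URL the crawler actually requests, so only the query-string variant can be disallowed:

User-agent: *
# The fragment (#!path=...) never reaches the server, so no rule can match it.
# Only the requested query-string variant can be blocked:
Disallow: /p/*?_escaped_fragment_=

The broader rule from the question, Disallow: /p/*?, already covers this variant as well.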
