Wget
jurim763, 2020-01-09 21:55:06

How to save exact copies of relative links with wget command?

There is a site that needs to be mirrored locally while keeping an exact copy of its relative URLs. All posts end in .html, but after downloading, the browser shows "[email protected]=134.html" instead. What needs to be added to the command `wget -r -k -l 3 -p -E -nc` to get an exact copy of the relative links in the browser? I removed -k and added and removed various other options, but it did not help. I have searched everywhere and still don't understand how to download an exact copy while preserving the exact links. Please tell me what the problem is. It is not necessary that every file open in the browser, for example files that have no extension.


1 answer
Dimonchik, 2020-01-09
@dimonchik2013

Here are all the parameters; I usually go through them. Some can change behavior depending on others. Explicitly listing the types of content to download (that is, images and so on) directly helps.
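Following the answer's suggestion to review the full option list and name the content types explicitly, here is a minimal sketch. The URL is a placeholder, and the flag choices are assumptions about the asker's setup: on Windows builds, wget defaults to `--restrict-file-names=windows`, which replaces `?` with `@` in saved filenames, which is one plausible source of names like "...@id=134.html".

```shell
# Mirror a site while keeping relative links working locally.
# https://example.com/ is a placeholder, not the asker's actual site.
# --restrict-file-names=unix avoids the Windows-style '?' -> '@'
#   substitution in saved filenames.
# -A explicitly lists accepted file types, as the answer suggests.
wget -r -l 3 -p -E -k -nc \
     --restrict-file-names=unix \
     -A 'html,css,js,png,jpg,jpeg,gif' \
     https://example.com/
```

Note that `-E` (--adjust-extension) itself renames files by appending `.html`, and `-k` (--convert-links) rewrites links to match the names actually saved on disk, so these two are usually kept together rather than removed.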
