Browsers
mapatka, 2021-06-16 15:02:04

How to download web pages?

Good day! I'm ashamed to ask, but how do I download web pages?
The essence of the problem: if I save a page from a website through the browser, the source code contains normal text.
But if I feed Download Master a list of links, the pages download, yet inside the text looks like this:
Охрана

Is there some plugin or extension that I can feed a list of links and have it save them for me?
curl, wget: I'm behind a proxy, and I don't have enough brains to set them up. Is there anything quicker? With a GUI :)
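The garbled text in the saved pages is most likely HTML numeric character references (e.g. &#1054; for the Cyrillic letter О), which a browser renders but a plain downloader stores verbatim. As a minimal sketch (the entity string below is an assumed example, not taken from the actual files), Python's standard html.unescape can decode them back into readable text:

```python
import html

# Assumed example of what a downloaded file might contain:
# numeric character references instead of readable Cyrillic text.
garbled = "&#1054;&#1093;&#1088;&#1072;&#1085;&#1072;"

# html.unescape converts character references back to plain text.
print(html.unescape(garbled))  # prints "Охрана"
```

The same call handles named entities (&amp;, &lt;, etc.) as well, so it can be run over a whole downloaded file in one pass.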


2 answer(s)
Zettabyte, 2021-06-16
@Zettabyte

Curl, wget - I'm behind a proxy, and I don't have enough brains to set them up

wget -e use_proxy=yes -e https_proxy=https://mapatka:[email protected]:443 https://dirty-videos.com/wild-anal.html

You can also put all of this in a .wgetrc file:
use_proxy = on
http_proxy = http://1.2.3.4:3128
https_proxy = https://5.6.7.8:443

And run wget -e use_proxy=yes -e https_proxy=$proxy ...
P.S. For a GUI there is WinWGet.
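curl takes a proxy in much the same way. A minimal sketch, reusing the placeholder proxy addresses from the .wgetrc example above (not tested against a real proxy):

```shell
# Point curl at an HTTP proxy explicitly (address is a placeholder)
curl -x http://1.2.3.4:3128 -o page.html https://example.com/

# Or export the environment variables that both curl and wget honor
export http_proxy=http://1.2.3.4:3128
export https_proxy=https://5.6.7.8:443

# Then download every URL listed in urls.txt, one per line
wget -i urls.txt
```

The -i urls.txt form covers the "feed it a list of links" part of the question without any extra tooling.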

CityCat4, 2021-06-17
@CityCat4

Curl, wget - I'm behind a proxy, and I don't have enough brains to set them up.

That is, the problem is not in the lack of software, but in the brain :)
man curl
man wget
There is a global /etc/wgetrc file and a per-user .wgetrc file; settings for wget are written in them.
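curl has an analogous per-user config file, ~/.curlrc, whose entries mirror the long command-line options. A minimal sketch with placeholder values:

```
# ~/.curlrc, read by curl on every run
proxy = "http://1.2.3.4:3128"
proxy-user = "user:password"
```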
