Parsing
TRXXX, 2021-10-06 17:25:16

Parsing sites, how to make it easier?

Hello!
I have a personal need to pull information every day from sites hosted on the same domain. The information on them is public; there is no captcha and no registration.
What is required: in one loop, go through 30+ links of the same type (changing some value in them in the loop, or just feeding in a list of them), open each page, copy its HTML, and append it to the HTML saved from the previous iterations. That is, at the end of the loop there will be one HTML page built from all 30 links. Then I will look through it manually for the information I need.
I used to write in Delphi and could brush it up, but that takes time.
Can someone suggest a solution? The parsing programs that turn up in search do something other than what I want: they save to a database, or to separate files, or something else. 30 separate files are not needed, since I can already get the same result by changing a few characters in the browser's address bar, but that takes a lot of time.
If someone has a solution (a recommendation for a program, a script, or Delphi sources), the operating system does not matter (Windows or Linux). I will be grateful.



1 answer
Alexey Gorbunov, 2021-10-06
@leha_gorbunov

In Linux, create a file with the extension .sh and write 30 lines like this in it, one per link:

wget --quiet -O - http://example.com/link1 >> output.txt

where output.txt is the output file.
Then run it in a terminal:

bash scriptfile.sh
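
To avoid typing 30 nearly identical lines, the same idea can be written as a loop. This is just a minimal sketch, assuming the links differ only by a number at the end; the URL http://example.com/page?id=N and the file names are placeholders, substitute your real links:

#!/bin/bash
# fetch.sh - download 30 pages of the same type into one HTML file

: > output.html                                          # start with an empty output file

for i in $(seq 1 30); do
    # --quiet suppresses progress output; -O - writes the downloaded page to stdout,
    # which >> appends to the combined output file
    wget --quiet -O - "http://example.com/page?id=$i" >> output.html
done

Run it with bash fetch.sh and open output.html in a browser. If the links do not follow a numeric pattern, put them one per line in a text file (say, links.txt) and loop over that file instead with while read -r url; do wget --quiet -O - "$url" >> output.html; done < links.txt.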
