How to load multiple files in Bash and merge them into one?
I need to fetch content from several sites and combine everything into one file, preserving order. Each site corresponds to a number 1, 2, 3, ... 1000, and in the resulting file the data from site 1 must come first, then from site 2, and so on (writing the n-th site's data before the (n-k)-th site's is invalid). The responses can arrive in any order, so simply appending to a file with >> will not work.
I wrote this script:
rm -rf /tmp/content
mkdir /tmp/content
for i in "${!urls[@]}"; do
    mkfifo "/tmp/content/$i"
    curl -s "${urls[$i]}" > "/tmp/content/$i" &
done
rm -f /tmp/output
for i in "${!urls[@]}"; do
    cat "/tmp/content/$i" >> /tmp/output
done
rm -rf /tmp/content
mkfifo "/tmp/content/$i"
curl -s "${urls[$i]}" > "/tmp/content/$i" &
Bash is not doing anything incredible here: opening a FIFO for writing blocks until some process opens it for reading. Each backgrounded curl therefore sits blocked on the > redirection until the cat loop reaches its FIFO, so instead of N parallel requests of 1-2 seconds each (which would let the whole download plus merge finish in a couple of seconds), the requests effectively run one at a time.
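A minimal sketch demonstrating this blocking open (the FIFO path here is just an example):

mkfifo /tmp/demo.fifo
echo hello > /tmp/demo.fifo &   # the writer blocks: nothing has opened the read end yet
sleep 1; jobs                   # the background job is still shown as running
cat /tmp/demo.fifo              # opening the read end unblocks the writer and prints "hello"
rm /tmp/demo.fifo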
rm -rf /tmp/content
mkdir /tmp/content
for i in "${!urls[@]}"; do
    curl -s "${urls[$i]}" > "/tmp/content/$i" &   # regular file, not a FIFO
done
wait   # block until every background curl has exited
rm -f /tmp/output
for i in "${!urls[@]}"; do
    cat "/tmp/content/$i" >> /tmp/output          # concatenate in index order
done
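Both scripts assume an indexed array urls defined elsewhere; a minimal placeholder definition (these URLs are assumptions, substitute your own list) would be:

urls=(
    "https://example.com/page1"
    "https://example.com/page2"
    "https://example.com/page3"
)

wait with no arguments blocks until all background jobs of the current shell have exited, so every file under /tmp/content is complete before the concatenation loop starts; the loop then reads the files in index order, which makes the output order match the URL order regardless of which request finished first.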
I analyzed this in detail here: https://klondike-studio.ru/blog/bitrixtar/?sphrase... About 80% of it should suit your case.