How to bulk download images?
Hello, and Happy New Year.
I want to find the most efficient way to download images from a specific site.
The site has a main feed that contains all the images.
There are also tags (for example, the tag "New Year"), and each tag spans many pages — in this case 185, numbered 185, 184, 183, and so on.
The goal is to download every image in the feed while skipping avatars and similar junk.
I understand that ideally one would write a script for this, but I can't manage that yet (though pointers on what is easiest to learn in order to write such a script — HTML is just an example — would be welcome).
Alternatively, are there existing solutions (for example, browser extensions) that can handle this?
If the images have direct links, just write a simple script that walks through them according to some pattern using wget.
For example, there is one site with boobs (I won't advertise it). To download a set I like from it, I first scroll it to the end in the browser, and then I run this crude, rough-hewn script:
#!/bin/bash
# Download a numbered set of images into its own directory,
# pausing a random 5-14 seconds between requests so as not to hammer the site.
if [ $# -lt 2 ]; then
    echo "Usage: n_down <set_number> <image_count>"
    exit 1
fi
if [ ! -d "$1" ]; then
    mkdir "$1"
fi
cd "$1" || exit 1
_start=1
while [ "$_start" -le "$2" ]; do
    if [ ! -e "$_start.jpg" ]; then
        echo "Downloading picture $_start.jpg"
        wget the-url-you-are-downloading-from   # substitute the site's own URL pattern here
        _sleep=$((5 + RANDOM % 10))
        sleep "$_sleep"
    fi
    _start=$((_start + 1))
done
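Since the question mentions a tag spread over 185 numbered pages, the same idea can be applied one level up: generate the page URLs first, then fetch them. A minimal sketch, where `example.com` and the `/tag/new-year/page/N` path are made-up placeholders — the real site will have its own URL scheme:

```shell
# Print the URL of every page of a tag, counting down from the highest
# page number to 1, the way the site in the question numbers them.
page_urls() {
    for p in $(seq "$1" -1 1); do
        echo "https://example.com/tag/new-year/page/$p"
    done
}

# Each page could then be fetched politely, with the same random pause
# as in the script above (commented out so the sketch runs offline):
# page_urls 185 | while read -r url; do wget "$url"; sleep $((5 + RANDOM % 10)); done
```

The download line is left commented out on purpose: check the generated URLs first, then uncomment once the pattern matches the real site.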
There are the Barracuda extensions, which are purely for pictures; there are a bunch of others that let you graphically select an area of the page and extract the images from it; and there is the browser cache (no idea — maybe FF has separate caches per profile by now), from which you can pull the files directly with the right software.
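If you would rather stay on the command line than install an extension, the image URLs can be scraped out of a saved page and piped into wget. A minimal sketch, assuming the page is saved as `page.html` and that filtering by `.jpg`/`.png` extension is enough to skip avatars (which are often `.gif` or served from a different path — adjust the filter for the real site):

```shell
# Pull direct .jpg/.jpeg/.png URLs out of the src= attributes of a saved
# HTML page, one URL per line.
extract_image_urls() {
    grep -oE 'src="[^"]+\.(jpe?g|png)"' "$1" | sed 's/^src="//; s/"$//'
}

# Usage (commented out so the sketch is safe to run offline):
# extract_image_urls page.html | wget -i - --wait=5
```

The `--wait=5` flag makes wget pause between downloads, matching the polite-delay idea from the script in the other answer.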