How to optimize and compress images for the web?
I need to sort through, optimize, and compress about 3 GB of images.
There is an optimizer on the web, https://tinypng.com/, and I really like how it compresses: 40-60% on average. Is there something similar under Linux that achieves a comparable result?
I specifically need software that works in batch mode, from the command line, since there are about 50-60k files and there is no way to go through them by hand.
Maybe someone knows what the service above uses to optimize images? There is also PageSpeed Insights from Google, which also compresses decently. What do they use? Is it really some kind of in-house development?
Right now I'm looking at ImageMagick, but something tells me that this kind of compression result cannot be achieved with it.
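Something along the lines of the sketch below is what an ImageMagick batch pass would look like (the -quality 82 value and the ./images path are illustrative placeholders):

```bash
# Batch JPEG recompression with ImageMagick's mogrify.
# -strip drops metadata; -quality 82 is an arbitrary starting point.
# For very large trees (50k+ files), feed file lists via find instead
# of a shell glob to avoid hitting the argument-length limit.
mkdir -p ./out
mogrify -path ./out -strip -interlace Plane -sampling-factor 4:2:0 \
        -quality 82 ./images/*.jpg
```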
For PNG - only pngquant (pngquant.org); for lossless JPEG - jpegtran (jpegclub.org/jpegtran). Typical settings are sketched below.
For lossy JPEG compression, you can take just about anything.
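A minimal batch sketch for both tools, assuming current pngquant and jpegtran command-line options; the 65-80 quality range is an assumption, not the answerer's actual settings:

```bash
# Lossy PNG quantization with pngquant, overwriting files in place;
# --quality=65-80 is an assumed range, tune to taste
find ./images -iname '*.png' -print0 |
  xargs -0 -n 50 pngquant --quality=65-80 --skip-if-larger --ext .png --force

# Lossless JPEG optimization with jpegtran (via a temp file, since
# jpegtran cannot safely overwrite its own input)
find ./images -iname '*.jpg' -print0 |
  while IFS= read -r -d '' f; do
    jpegtran -copy none -optimize -progressive -outfile "$f.tmp" "$f" \
      && mv "$f.tmp" "$f"
  done
```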
pngcrush (or optipng, which was inspired by it) and jpgcrush, to optimize PNG and JPG respectively.
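If optipng is the pick, a batch run could look like this sketch (the -o2 effort level and the ./images path are assumptions):

```bash
# Lossless recompression of all PNGs with optipng;
# -o2 is a moderate optimization level, -strip all removes metadata
find ./images -iname '*.png' -exec optipng -o2 -strip all {} \;
```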
The gulp task runner also has plugins for compressing and optimizing images: imagemin and imagemin-pngquant. Simple and no magic.
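A rough setup sketch, assuming the CommonJS-era APIs of gulp-imagemin v7 and imagemin-pngquant v8 (newer releases are ESM-only, so the version pinning here is an assumption):

```bash
# Install gulp with the imagemin plugins (pinned versions are assumed)
npm install --save-dev gulp gulp-imagemin@7 imagemin-pngquant@8

# Write a minimal gulpfile that compresses src/images into dist/images
cat > gulpfile.js <<'EOF'
const gulp = require('gulp');
const imagemin = require('gulp-imagemin');
const pngquant = require('imagemin-pngquant');

gulp.task('images', () =>
  gulp.src('src/images/**/*')
    .pipe(imagemin([pngquant({ quality: [0.6, 0.8] })]))
    .pipe(gulp.dest('dist/images'))
);
EOF

npx gulp images
```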
Try this (JPEG only): jpeg-recompress with the smallfry metric, an analogue of the popular JPEGmini. Just what you need: command line.
Binaries (for Windows, Mac, Linux) here
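A batch sketch, assuming the jpeg-recompress CLI from the jpeg-archive project; --quality medium is an assumed setting, not from the answer:

```bash
# Lossy JPEG recompression with jpeg-recompress using the smallfry
# metric; each file is rewritten via a temp file
find ./images -iname '*.jpg' -print0 |
  while IFS= read -r -d '' f; do
    jpeg-recompress --method smallfry --quality medium "$f" "$f.tmp" \
      && mv "$f.tmp" "$f"
  done
```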
You can also use cloud solutions for optimizing and storing photos; almost all have free plans, which should be enough for this job. A list:
https://i.onthe.io - API, up to 100,000 photos free (storage + optimization)
cloudinary.com - API, up to 75,000 free
Other solutions: www.resrc.it, https://kraken.io/