How to speed up the return of dynamically generated thumbnails in PHP + GD?
The essence of the question is as follows: there is a project to which a huge number of photos with different retention periods are uploaded. They can be small images as well as photos with a resolution of 5000*5000. Since an image can be deleted even a second after upload, I decided that pre-generating thumbnails is a waste of resources and CPU time, because about 5-6 different sizes of these thumbnails are needed. As all images are .jpg, I settled on GD. I threw together something like this script (thumb.php):
<?php
$from_name = dirname(__FILE__) . $_GET['name']; // kept simple, no validation here
$thumb_x = 280; // required width; in practice I also pass it as a parameter
if (!file_exists($from_name)) die("Source image does not exist!");
list($orig_x, $orig_y, $orig_img_type, $img_sizes) = @GetImageSize($from_name); // get the original dimensions
// calculate the thumbnail dimensions with proportional scaling
$thumb_y = floor($orig_y * $thumb_x / $orig_x);
$per_x = $orig_x / $thumb_x;
$per_y = $orig_y / $thumb_y;
if ($per_y < $per_x) {
    $thumb_x = $orig_x / $per_y;
} else {
    $thumb_y = $orig_y / $per_x;
}
// create an empty canvas, fill it with a white background and copy the image onto it
$ni = ImageCreateTrueColor($thumb_x, $thumb_y);
$white = imagecolorallocate($ni, 255, 255, 255);
imagefilledrectangle($ni, 0, 0, $thumb_x, $thumb_y, $white);
$im = ImageCreateFromJpeg($from_name);
imagepalettecopy($ni, $im);
imagecopyresampled($ni, $im, 0, 0, 0, 0, $thumb_x, $thumb_y, $orig_x, $orig_y);
header("Content-type: image/jpeg");
ImageJPEG($ni, NULL, 85); // last parameter is the quality (0-100)
?>
<a class="group" rel="group1" href="/photos/uploaded/8316e6749c40428ab140c79a7dbbf789.jpg">
<img src="/thumb.php?name=/photos/uploaded/8316e6749c40428ab140c79a7dbbf789.jpg">
</a>
Generate only the largest of the thumbnails in advance. It will be far smaller than the 5000*5000 monster, and the other sizes can be generated from it much faster than from the original; and wherever that intermediate image can be used as-is, that is even better, faster and smarter. A sketch of this idea follows below.
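A minimal sketch of this approach: a single intermediate copy is produced once at upload time, and all smaller sizes are resampled from it instead of the huge original. The 1200 px width, the file paths and the makeThumb() helper are all illustrative assumptions, not taken from the answer.
<?php
// Hypothetical helper: create a proportional JPEG copy with the given width.
function makeThumb(string $src, string $dst, int $width, int $quality = 85): void {
    [$ox, $oy] = getimagesize($src);
    $height = (int) floor($oy * $width / $ox);
    $in  = imagecreatefromjpeg($src);
    $out = imagecreatetruecolor($width, $height);
    imagecopyresampled($out, $in, 0, 0, 0, 0, $width, $height, $ox, $oy);
    imagejpeg($out, $dst, $quality);
    imagedestroy($in);
    imagedestroy($out);
}

// At upload time: one medium-sized intermediate from the huge original.
makeThumb('/photos/uploaded/orig.jpg', '/photos/cache/orig_1200.jpg', 1200);

// All smaller sizes are then built from the 1200 px copy, which is far
// cheaper than decoding the 5000x5000 source every time.
foreach ([640, 280, 120] as $w) {
    makeThumb('/photos/cache/orig_1200.jpg', "/photos/cache/orig_{$w}.jpg", $w);
}
?>
Decoding the intermediate JPEG is much cheaper than decoding the 5000*5000 original, so the smaller sizes can even be produced on the fly if necessary.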
"You need about 5-6 different sizes." Why on earth are there so many? Two, well, three at most; the rest can simply be scaled down with HTML to the desired size. Definitely store them ready-made.
Generating images is inherently slow, since it is limited by server resources (CPU), and it is unlikely you can speed it up much on the fly.
The most sensible option is to cache the generated thumbnails on disk.
You can keep thumb.php as an intermediate layer: on the first request generate the thumbnail and save the result to disk, and on repeated requests check whether the thumbnail file is already on disk and, if it is, read it and return it (see the sketch below).
But the fastest and best option is still to cache the thumbnails and serve them directly by the web server (without PHP), for example by producing the required sizes right at upload time.
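A minimal sketch of such a check-then-generate thumb.php, assuming a writable cache directory (the /photos/cache path is an assumption) and reusing the resizing idea from the question:
<?php
// Hypothetical cached variant of thumb.php: serve from disk if the
// thumbnail already exists, otherwise generate it once and store it.
$name     = basename($_GET['name']);            // strip any path, no directory traversal
$width    = 280;
$source   = __DIR__ . '/photos/uploaded/' . $name;
$cacheDir = __DIR__ . '/photos/cache';          // assumed writable directory
$cached   = "$cacheDir/{$width}_{$name}";

if (!is_file($source)) {
    http_response_code(404);
    exit('Source image does not exist!');
}

if (!is_file($cached)) {
    [$ox, $oy] = getimagesize($source);
    $height = (int) floor($oy * $width / $ox);
    $in  = imagecreatefromjpeg($source);
    $out = imagecreatetruecolor($width, $height);
    imagecopyresampled($out, $in, 0, 0, 0, 0, $width, $height, $ox, $oy);
    imagejpeg($out, $cached, 85);               // write once to the cache
    imagedestroy($in);
    imagedestroy($out);
}

header('Content-Type: image/jpeg');
header('Content-Length: ' . filesize($cached));
readfile($cached);                              // cheap on repeated requests
?>
Once the files are in the cache directory, the web server can also be pointed at it directly, so PHP is only involved on a cache miss.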
You can do without PHP at all and use the ngx_http_image_filter_module
https://habr.com/en/post/94435/
https://nginx.org/en/docs/http/ngx_http_image_filt...
Cache, use a cache. Generate the thumbnails at the least busy time, write them to disk, and then serve them to clients from the cache. On a task like this you will never have enough CPU otherwise.
I don't understand why go to all this trouble and generate different sizes on every request. Just wondering.
You do need a cache, but not on the local disk; use a tool specialized for this, for example S3.
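A minimal sketch of that approach, assuming the AWS SDK for PHP v3 installed via Composer; the bucket name, region, object key and the generateThumbnail() helper are all hypothetical:
<?php
require 'vendor/autoload.php';   // aws/aws-sdk-php, assumed installed via Composer

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$s3  = new S3Client(['version' => 'latest', 'region' => 'eu-central-1']);   // assumed region
$key = 'thumbs/280_8316e6749c40428ab140c79a7dbbf789.jpg';                   // illustrative key

try {
    // Cache hit: the thumbnail is already in the bucket.
    $object = $s3->getObject(['Bucket' => 'my-thumbnail-bucket', 'Key' => $key]);
    header('Content-Type: image/jpeg');
    echo $object['Body'];
} catch (S3Exception $e) {
    // Cache miss: generate with GD as in the question, then upload once.
    $jpegBytes = generateThumbnail();            // hypothetical GD wrapper returning JPEG bytes
    $s3->putObject([
        'Bucket'      => 'my-thumbnail-bucket',
        'Key'         => $key,
        'Body'        => $jpegBytes,
        'ContentType' => 'image/jpeg',
    ]);
    header('Content-Type: image/jpeg');
    echo $jpegBytes;
}
?>
With the thumbnails in S3 they can also be served straight from the bucket or through a CDN, bypassing PHP entirely on subsequent requests.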