PHP
Alexey Medvedev, 2020-03-06 15:40:48

How to speed up the return of dynamically generated thumbnails in PHP + GD?

The essence of the question is as follows: there is a project into which a huge number of photos with different storage lifetimes are uploaded, anywhere from small images to photos at 5000*5000. Since an image can be deleted even a second after uploading, I decided that pre-generating thumbnails would be a waste of resources and CPU time, because each image needs about 5-6 thumbnails of different sizes. Since all images are .jpg, I decided to use GD and threw together a script like this (thumb.php):

<?php
// for simplicity, no input validation here (in production the 'name'
// parameter must be sanitized to prevent path traversal)
$from_name = dirname(__FILE__) . $_GET['name'];
$thumb_x = 280; // target width (I also pass it as a parameter)

if (!file_exists($from_name)) die("Source image does not exist!");

// read the original dimensions
list($orig_x, $orig_y, $orig_img_type, $img_sizes) = @getimagesize($from_name);

// compute the thumbnail dimensions, preserving the aspect ratio
$thumb_y = floor($orig_y * $thumb_x / $orig_x);
$per_x = $orig_x / $thumb_x;
$per_y = $orig_y / $thumb_y;
if ($per_y < $per_x) {
  $thumb_x = $orig_x / $per_y;
} else {
  $thumb_y = $orig_y / $per_x;
}
$thumb_x = (int) round($thumb_x);
$thumb_y = (int) round($thumb_y);

// create an empty canvas, fill it with white, and draw the resized image onto it
$ni = imagecreatetruecolor($thumb_x, $thumb_y);
$white = imagecolorallocate($ni, 255, 255, 255);
imagefilledrectangle($ni, 0, 0, $thumb_x, $thumb_y, $white);
$im = imagecreatefromjpeg($from_name);
imagecopyresampled($ni, $im, 0, 0, 0, 0, $thumb_x, $thumb_y, $orig_x, $orig_y);
header("Content-type: image/jpeg");
imagejpeg($ni, null, 85); // last parameter is the quality (0-100)
imagedestroy($im);
imagedestroy($ni);
?>


From the client side it looks like this:
<a class="group" rel="group1" href="/photos/uploaded/8316e6749c40428ab140c79a7dbbf789.jpg">
    <img src="/thumb.php?name=/photos/uploaded/8316e6749c40428ab140c79a7dbbf789.jpg">
</a>


With 15-20 such tags on a page, fetching all the pictures takes > 5 seconds, even though each thumbnail is only ~10-15 KB.

Is there any way to speed this up? If the quality is set to 100, loading can take 30+ seconds.
Is there a smarter and simpler way? Or should I just save the thumbnails and serve them statically?

UPD:
1) Even with the quality reduced to 5%, page loading takes 2.5-3 seconds, which is still very long, given that each thumbnail is no more than 5 KB.
2) Static images are served almost instantly, so the bottleneck is definitely the on-the-fly processing.


5 answers
FanatPHP, 2020-03-06
@FanatPHP

Pre-generate only the largest of the previews. It will be deliberately smaller than the 5000x5000 monster, and the other previews can then be generated from it much faster than from the original. And wherever that one image will do as-is, use it directly — better, faster and smarter.
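A rough sketch of this idea as a hypothetical upload-time step (the `make_master_preview` helper and the 1200px master width are my assumptions, not from the answer):

```php
<?php
// Sketch of the suggestion above: at upload time, generate one "master"
// preview that is much smaller than the 5000x5000 original; all smaller
// thumbnails are then resampled from this master, which is far cheaper.
const MASTER_WIDTH = 1200; // assumed value

// Proportionally scale ($ow, $oh) down so the width becomes $target_w.
function scaled_size(int $ow, int $oh, int $target_w): array
{
    if ($ow <= $target_w) {
        return [$ow, $oh]; // never upscale
    }
    return [$target_w, (int) round($oh * $target_w / $ow)];
}

// Hypothetical upload-time step (requires the GD extension).
function make_master_preview(string $src, string $dst): void
{
    [$ow, $oh] = getimagesize($src);
    [$w, $h] = scaled_size($ow, $oh, MASTER_WIDTH);
    $im = imagecreatefromjpeg($src);
    $master = imagecreatetruecolor($w, $h);
    imagecopyresampled($master, $im, 0, 0, 0, 0, $w, $h, $ow, $oh);
    imagejpeg($master, $dst, 85);
    imagedestroy($im);
    imagedestroy($master);
}
```

thumb.php would then open the master file instead of the original; resampling a 1200px image down to 280px reads roughly 17x fewer source pixels than resampling a 5000px one.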

ThunderCat, 2020-03-06
@ThunderCat

You need about 5-6 pieces of different sizes.
Why the hell so many? 2, well, 3 at most; scale the rest down with plain HTML/CSS to the desired display size. Definitely keep them pre-generated.
UPD: use webp for thumbs
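A hedged sketch of the webp suggestion — serve WebP only when the browser advertises support in its Accept header, falling back to JPEG otherwise (the helper names `accepts_webp` and `send_thumb` are mine, not from the answer):

```php
<?php
// Browsers that can decode WebP advertise it in the Accept request header.
function accepts_webp(string $accept_header): bool
{
    return strpos($accept_header, 'image/webp') !== false;
}

// Hypothetical output step for thumb.php (requires GD built with WebP
// support); $ni is the finished GD image from imagecopyresampled().
function send_thumb($ni, string $accept_header): void
{
    if (accepts_webp($accept_header) && function_exists('imagewebp')) {
        header('Content-Type: image/webp');
        imagewebp($ni, null, 80); // usually smaller than JPEG at comparable quality
    } else {
        header('Content-Type: image/jpeg');
        imagejpeg($ni, null, 85);
    }
}
```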

nokimaro, 2020-03-06
@nokimaro

Image generation is inherently slow because it is CPU-bound, and it is unlikely you can speed it up much on the fly.
The most sensible option is to cache the generated previews on disk.
You can keep thumb.php as an intermediary: on the first request, generate the thumbnail and save the result to disk; on repeated requests, check whether the thumbnail file is already on disk and, if so, just read and return it.
But the fastest and best option is still to pre-generate the previews and serve them directly from the web server (without PHP) — for example, create all the needed sizes immediately at upload time.
You can do without PHP at all and use the ngx_http_image_filter_module
https://habr.com/en/post/94435/
https://nginx.org/en/docs/http/ngx_http_image_filt...
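A minimal sketch of the disk-cache variant of thumb.php described above (the cache directory, the md5-based key, and the `generate_thumbnail` helper are my assumptions):

```php
<?php
// Disk cache for generated thumbnails: the first request generates and
// stores the file; repeated requests are served straight from disk.
const CACHE_DIR = __DIR__ . '/cache'; // assumed location

// Deterministic cache file path for a source name + thumbnail width.
function cache_path(string $name, int $width): string
{
    return CACHE_DIR . '/' . md5($name . '@' . $width) . '.jpg';
}

// Sketch of the request flow; the actual GD resizing is the same code
// as in the question's thumb.php, wrapped in a hypothetical helper.
function serve_thumb(string $name, int $width): void
{
    $cached = cache_path($name, $width);
    if (!is_file($cached)) {
        $ni = generate_thumbnail($name, $width); // hypothetical helper
        if (!is_dir(CACHE_DIR)) {
            mkdir(CACHE_DIR, 0755, true);
        }
        imagejpeg($ni, $cached, 85); // pay the CPU cost only once
        imagedestroy($ni);
    }
    header('Content-Type: image/jpeg');
    readfile($cached); // repeated requests take only this cheap path
}
```

Better still, point the web server (e.g. nginx `try_files`) at the cache directory first, so repeated requests never reach PHP at all.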

AlexisKmetik, 2020-03-07
@AlexisKmetik

Cache, use a cache. Generate at the least busy time, write to disk, then serve clients from the cache. No amount of CPU will be enough for a task like this.
I don't understand why you need all this trouble of producing different sizes on every request? Just wondering.

OnYourLips, 2020-03-07
@OnYourLips

You need to use a cache, but not the local disk — a tool specialized for this, for example S3.
