PHP
stopkilling_dolphins, 2019-06-25 10:56:57

Cleaning up php script while processing images?

Hello. There is a MariaDB database that stores pictures in a column of type LONGBLOB. I know this is bad practice, but for now there is no way around it. Each picture weighs roughly 1 MB.
My task is to create a preview for each picture and save it to the database as well, rather than generating it on the fly.
I sketched a script that, in principle, does what is needed, but its memory appetite is huge: for ~1600 records it consumes about 2 GB. The Apache server is local, so of course I can add more memory, but what if there are ten times as many records? (And there will be.)
The script itself:

<?php
include("bd.php");

// Fetch every row at once -- this buffers all the blobs in memory,
// which is where the 2 GB goes.
$query = "SELECT * FROM images";
if (!$result = mysqli_query($con, $query)) {
    exit(mysqli_error($con));
}

if (mysqli_num_rows($result) > 0) {
    $total = 0;
    while ($row = mysqli_fetch_assoc($result)) {

        $data = $row['content'];
        $guid = $row['guid'];
        $sort = $row['sort'];
        $image = imagecreatefromstring($data);
        $image = imagescale($image, 300); // 300 px wide preview

        // Capture the JPEG output into a string via the output buffer
        ob_start();
        imagejpeg($image);
        $contents = ob_get_clean();

        $contentsthumb = base64_encode($contents);
        $total++;
        $sqlin = "INSERT INTO images_thumb (guid, content, sort)
                  VALUES ('$guid', '$contentsthumb', '$sort')";

        if (!mysqli_query($con, $sqlin)) {
            echo "Error: " . $sqlin . "<br>" . mysqli_error($con);
        }

        imagedestroy($image);
    }
    mysqli_free_result($result);
    echo "Records added: " . $total;
}

mysqli_close($con);
?>

A quick Google search turned up solutions where, once memory consumption reaches a certain threshold, the script writes its current position to a temporary file, stops, and is restarted from that position.
Is this the only solution, or does anyone know a better option?
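For reference, the checkpoint idea described above can be sketched roughly like this. This is only an illustration of the approach, not tested code; the file name, threshold, and helper names (`loadOffset`, `checkpointIfNeeded`) are all made up for the example.

```php
<?php
// Sketch of the checkpoint-and-restart approach: when memory use passes a
// threshold, record the current row offset in a temp file and exit; the
// next run resumes from that offset. Names and values are illustrative.

const CHECKPOINT = __DIR__ . '/thumb_progress.txt';
const MEM_LIMIT  = 256 * 1024 * 1024; // 256 MB, adjust to taste

// Where did the previous run stop? (0 if this is the first run)
function loadOffset(): int
{
    return is_file(CHECKPOINT) ? (int) file_get_contents(CHECKPOINT) : 0;
}

// Returns true if the caller should save progress and exit now.
function checkpointIfNeeded(int $offset): bool
{
    if (memory_get_usage(true) < MEM_LIMIT) {
        return false; // still under budget, keep going
    }
    file_put_contents(CHECKPOINT, (string) $offset);
    return true;
}
```

The main loop would call `checkpointIfNeeded($offset)` after each record and `exit` when it returns true; a cron job or a simple wrapper script restarts the process until the table is done.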


1 answer(s)
Pavel Belyaev, 2019-06-25
@PavelBelyaev

The simplest solution is SQL_CALC_FOUND_ROWS plus LIMIT, for example LIMIT 0, 10 (start at row zero and take 10); a follow-up query then tells you how many rows matched before the limit was applied. Although if you simply need all records, with no condition on whether they are processed or not, a plain COUNT(*) is easier. The point is to take, say, 10 records from the database, process them, then take the next 10: LIMIT 10, 10, then LIMIT 20, 10, LIMIT 30, 10, and so on. Knowing the total record count, you iterate in small batches, exactly like product pagination. Look at how pagination is done on websites and do the same.
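The batching described above could look roughly like this. It is an untested sketch that reuses the `$con` connection and table layout from the question; the helper names `batchLimits` and `processAllThumbnails` are invented for the example.

```php
<?php
// Process the images table in small LIMIT batches so that only a handful
// of blobs are ever held in memory at once.

// Pure helper: the sequence of LIMIT clauses needed to cover $total rows.
function batchLimits(int $total, int $batchSize): array
{
    $limits = [];
    for ($offset = 0; $offset < $total; $offset += $batchSize) {
        $limits[] = "LIMIT $offset, $batchSize";
    }
    return $limits;
}

function processAllThumbnails(mysqli $con, int $batchSize = 10): int
{
    // A plain COUNT(*) is enough here, since every row needs processing.
    $res   = mysqli_query($con, "SELECT COUNT(*) AS cnt FROM images");
    $total = (int) mysqli_fetch_assoc($res)['cnt'];
    mysqli_free_result($res);

    $done = 0;
    foreach (batchLimits($total, $batchSize) as $limit) {
        $result = mysqli_query($con, "SELECT guid, content, sort FROM images $limit");
        while ($row = mysqli_fetch_assoc($result)) {
            $image = imagescale(imagecreatefromstring($row['content']), 300);

            ob_start();
            imagejpeg($image);
            $thumb = base64_encode(ob_get_clean());
            imagedestroy($image);

            $sqlin = "INSERT INTO images_thumb (guid, content, sort)
                      VALUES ('{$row['guid']}', '$thumb', '{$row['sort']}')";
            mysqli_query($con, $sqlin);
            $done++;
        }
        mysqli_free_result($result);
    }
    return $done;
}
```

With a batch size of 10 this issues LIMIT 0, 10, then LIMIT 10, 10, LIMIT 20, 10, and so on, so peak memory stays around ten blobs regardless of table size.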
