PHP
alekssamos, 2019-08-21 05:11:23

Long-running processing: use a queue?

I have a PHP script, for example one that processes photos; each photo takes about six seconds.
But it only manages to process eight of them before the shared hosting terminates the script, despite set_time_limit.
Are there ways to solve this without renting my own VDS server, for example?
Some kind of queue, so that each photo is processed in a separate request.
Order matters, and the results need to be returned as soon as possible, so a cron job that runs once a minute won't do.
Is there a solution?


1 answer
Rsa97, 2019-08-21
@alekssamos

Option 1: Use curl_multi_exec to call the handlers locally for all photos at once. The downside is a large peak load.
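A minimal sketch of the curl_multi_exec fan-out, assuming a per-photo handler reachable at something like process.php?photo=N — the URL scheme and the processAll name are illustrative, not taken from the question:

```php
<?php
// Sketch only: fan out one local HTTP request per photo via curl_multi,
// so each photo is processed by its own short-lived PHP request.
// The ?photo=N URL scheme is an assumption for illustration.

function processAll(array $photoIds, string $handlerUrl): array
{
    $mh = curl_multi_init();
    $handles = [];
    foreach ($photoIds as $id) {
        $ch = curl_init($handlerUrl . '?photo=' . urlencode((string)$id));
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 30);
        curl_multi_add_handle($mh, $ch);
        $handles[$id] = $ch;
    }

    // Drive all transfers until every handler has finished.
    do {
        $status = curl_multi_exec($mh, $running);
        if ($running) {
            curl_multi_select($mh); // wait for activity instead of busy-looping
        }
    } while ($running && $status === CURLM_OK);

    $results = [];
    foreach ($handles as $id => $ch) {
        $results[$id] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);

    return $results; // keyed by photo id, so the original order is preserved
}
```

Note that all requests run concurrently, which is exactly the peak-load problem mentioned above: eight photos mean eight simultaneous six-second workers.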
Option 2: After receiving the images, return to the client a list of links to the individual handlers, so that it calls each one as soon as processing of the previous image completes. The downside is that the client may close the page before all pictures are processed. To cover that case, you can add a cron job that picks up anything left unprocessed.
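The server side of option 2 can be sketched as follows; the handlerLinks function name and the /process.php path are assumptions for illustration:

```php
<?php
// Sketch only: after the upload, hand the client one handler URL per photo
// as JSON. The client fetches the first URL and, when it completes, the
// next one, so each photo runs in its own short request, in order.
// The /process.php?photo=N scheme is an assumption, not the asker's code.

function handlerLinks(array $photoIds, string $baseUrl): string
{
    $links = [];
    foreach ($photoIds as $id) {
        $links[] = $baseUrl . '/process.php?photo=' . urlencode((string)$id);
    }
    // JSON_UNESCAPED_SLASHES keeps the URLs readable in the response.
    return json_encode($links, JSON_UNESCAPED_SLASHES);
}

// The cron fallback then only has to handle photos the client never
// requested, e.g. rows where a `processed` flag is still 0.
```

Because each handler request processes a single photo in roughly six seconds, no single request ever approaches the hosting's execution limit.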
