How to implement file processing in parts (e.g. for import)?
I need to import records for a site; the import file will be either XML or CSV.
The problem is that processing a large file takes a long time: the script gets killed by PHP's execution time limit, or the user may inadvertently close the page while processing is in progress.
I want PHP to process one line, return a success message, and then have the client send a request to process the next line.
Based on this I can build both a progress bar and a completion percentage: it will be possible to see how many lines have been processed, so a file of any size can be handled this way.
Is there a solution to this problem that is, on the one hand, simple and, on the other, universal?
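To make the idea concrete, roughly what I have in mind (a minimal sketch only; the file path, parameter names and CSV parsing are assumptions):

```php
<?php
// Hypothetical endpoint: process a single line per request.
// The client stores the returned offset and sends it back with the next request.
$offset = (int)($_GET['offset'] ?? 0);

$file = '/tmp/import.csv';                 // assumed location of the uploaded file
$fh = fopen($file, 'rb');
fseek($fh, $offset);

$line = fgets($fh);
if ($line !== false) {
    $row = str_getcsv($line);              // parse one CSV row
    // ... INSERT $row into the database here ...
}

header('Content-Type: application/json');
echo json_encode([
    'done'   => feof($fh),
    'offset' => ftell($fh),                // client sends this back for the next line
    'total'  => filesize($file),           // lets the client compute a percentage
]);
fclose($fh);
```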
You can upload the entire file to the server with a form that shows the upload percentage (done through APC).
Remove the script's execution time limit with set_time_limit(0);
And keep it running after the browser is closed with ignore_user_abort(true);
Read the file line by line so you don't load it all into RAM, and execute an SQL INSERT for each line.
Processing can be monitored by creating an additional table that records the upload identifier, the total number of lines in the file, and the number of lines processed; a separate page can then poll this table to show progress.
To increase speed, it is better to INSERT not one row at a time but at least 100 at once.
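A rough sketch of this approach, assuming PDO, a three-column records table and an import_progress table (import_id, total_lines, processed_lines); all table and column names here are invented for illustration:

```php
<?php
set_time_limit(0);          // no execution time limit
ignore_user_abort(true);    // keep running even if the browser is closed

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$importId = 42;                                  // assumed identifier of this upload
$batch = [];
$processed = 0;

$fh = fopen('/path/to/upload.csv', 'rb');
while (($line = fgets($fh)) !== false) {         // read line by line, RAM usage stays flat
    $batch[] = str_getcsv($line);
    $processed++;

    if (count($batch) >= 100) {                  // insert in chunks of 100, not one by one
        insertBatch($pdo, $batch);
        $batch = [];
        // update the progress row so a separate page can poll it
        $pdo->prepare('UPDATE import_progress SET processed_lines = ? WHERE import_id = ?')
            ->execute([$processed, $importId]);
    }
}
if ($batch) {
    insertBatch($pdo, $batch);
    $pdo->prepare('UPDATE import_progress SET processed_lines = ? WHERE import_id = ?')
        ->execute([$processed, $importId]);
}
fclose($fh);

// Build one multi-row INSERT for the whole batch (3 columns per row assumed).
function insertBatch(PDO $pdo, array $rows): void
{
    $placeholders = implode(',', array_fill(0, count($rows), '(?, ?, ?)'));
    $stmt = $pdo->prepare("INSERT INTO records (col1, col2, col3) VALUES $placeholders");
    $stmt->execute(array_merge(...$rows));
}
```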
Search around for this; it is a very common case.
As a rule, the handler is left running in the background and stores the "current line" somewhere, for example in memcache.
Your script with the progress bar then looks into memcache and returns that number to the client.
Something like this.
And don't forget to start the handler after the file is uploaded. =) Some use fork, some use task queues; whichever is more comfortable for you.
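A minimal sketch of the memcache part, assuming the Memcached extension (the key name and import id are arbitrary):

```php
<?php
// In the background handler: record the current line after each iteration.
$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);

$totalLines = 100000;                        // assumed, counted in advance
for ($lineNo = 1; $lineNo <= $totalLines; $lineNo++) {
    // ... process one line of the file here ...
    $mc->set('import:42:current', $lineNo);  // "42" is an assumed import id
}

// In the progress script polled by the browser: just read the counter back.
$current = (int)$mc->get('import:42:current');
echo json_encode(['current' => $current, 'total' => $totalLines]);
```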
You are going about this the wrong way. For tasks like this there are so-called task queues: the file is uploaded to the server, a task ("process the file") is added to the queue, the daemon is notified, and the upload script ends there, showing a "please wait" message. Meanwhile the daemon sees the new task, starts the processing script, and periodically updates the task's progress in a table.
The client then polls a simple script that reads the progress percentage from that table and displays it.
I strongly recommend that you at least familiarize yourself with existing solutions before reinventing the wheel.
If your free hosting does not let you run daemons, it is easier to switch to a normal VPS than to invent inefficient workarounds.
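For illustration, the polling script on the client-facing side could look roughly like this, assuming a tasks table with id, total_lines and processed_lines columns (names invented here):

```php
<?php
// progress.php — the simple script the client polls; reads the percentage from the tasks table.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

$stmt = $pdo->prepare('SELECT processed_lines, total_lines FROM tasks WHERE id = ?');
$stmt->execute([(int)($_GET['task_id'] ?? 0)]);
$task = $stmt->fetch(PDO::FETCH_ASSOC) ?: ['processed_lines' => 0, 'total_lines' => 0];

header('Content-Type: application/json');
echo json_encode([
    'percent' => $task['total_lines'] > 0
        ? round(100 * $task['processed_lines'] / $task['total_lines'])
        : 0,
]);
```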