How to solve the time limit problem when parsing a large XML file?
When parsing an XML feed with a large number of goods (about 10,000 items) in PHP using XMLReader, the script hits a "Maximum execution time exceeded" error after 30 seconds. Of course, I could raise the allowed execution time to an hour, for example, but is there a more elegant way to do the parsing?
Alternatively, you can read the XML first, convert it to an array and save it, then slice the array into, say, 5 parts and process them in iterations. That is, there are two workers: one converts the file into an array, the other processes the arrays and writes them to the database (see the sketch below). This handles fairly large files, but if the file is too big to fit into RAM, a good solution is described here:
stackoverflow.com/questions/911663/parsing-huge-xm...
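A minimal sketch of that two-worker split, assuming a flat <items><item>…</item></items> feed; the file names, the element name and the chunks/ directory are placeholders, not from the question:

<?php
// Worker 1: read the whole XML into an array and save it in slices.
$xml   = simplexml_load_file('goods.xml');
$items = [];
foreach ($xml->item as $item) {
    $items[] = (array) $item;
}
foreach (array_chunk($items, (int) ceil(count($items) / 5)) as $i => $chunk) {
    file_put_contents("chunks/part_$i.json", json_encode($chunk));
}

// Worker 2: on each iteration, load one slice and write it to the database.
$chunk = json_decode(file_get_contents('chunks/part_0.json'), true);
foreach ($chunk as $row) {
    // INSERT $row into the database here
}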
Move this process out into a separate daemon, and have the web request only record information about the pending task (see the sketch below).
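A sketch of that split, assuming the queue is simply an import_jobs table; the table, columns, DSN and file path are made up for illustration:

<?php
// Web request: only register the job and respond immediately.
$pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');
$pdo->prepare('INSERT INTO import_jobs (file, status) VALUES (?, ?)')
    ->execute(['/uploads/goods.xml', 'pending']);

// Daemon (a separate long-running CLI script): pick up pending jobs without a web timeout.
set_time_limit(0);
while (true) {
    $job = $pdo->query("SELECT * FROM import_jobs WHERE status = 'pending' LIMIT 1")
               ->fetch(PDO::FETCH_ASSOC);
    if ($job) {
        // parse $job['file'] with XMLReader here, then mark the job as done
        $pdo->prepare('UPDATE import_jobs SET status = ? WHERE id = ?')
            ->execute(['done', $job['id']]);
    } else {
        sleep(5);
    }
}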
ini_set('max_execution_time', 300); // 300 seconds = 5 minutes
In general, it's better to think about not processing such huge files in a single pass (see the sketch below).
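One way to do that is to handle the feed in portions and remember the position between runs, so no single request has to walk all 10,000 items. In this sketch the element name 'offer', the batch size and 'progress.txt' are assumptions:

<?php
// Process at most 500 items per run; store the offset between runs.
$offset = (int) @file_get_contents('progress.txt');
$batch  = 500;
$seen   = 0;
$done   = 0;

$reader = new XMLReader();
$reader->open('goods.xml');
while ($reader->read()) {
    if ($reader->nodeType === XMLReader::ELEMENT && $reader->localName === 'offer') {
        if ($seen++ < $offset) {
            continue;                 // already handled in a previous run
        }
        $offer = simplexml_load_string($reader->readOuterXml());
        // save $offer to the database here
        if (++$done >= $batch) {
            break;                    // stop well before the time limit
        }
    }
}
$reader->close();
file_put_contents('progress.txt', $offset + $done);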
Move the processing into a separate script and run it from the console or via cron; there is usually no execution time limit there. For example:
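A minimal sketch; the script name, path and schedule are placeholders:

<?php
// import.php — standalone script, run from the console or by cron, e.g.:
//   */30 * * * * php /var/www/scripts/import.php
// The CLI SAPI defaults to max_execution_time = 0, so the 30-second limit does not apply.
set_time_limit(0);   // explicit, in case php.ini overrides the default

// ...parse the XML feed with XMLReader here, as in the sketches above...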