PHP
Ivan Antonov, 2015-10-02 11:10:48

How do I keep a huge cron script from taking the site down?

The script does the following:
1. Downloads a large gz archive.
2. Unpacks it and reads the nested XML.
3. Inserts the objects from the XML into the database and downloads photos.

Monitoring shows the following server load:
[screenshot: server load graph]
Under this load the site becomes unavailable.

Tell me what to do. Is it possible to limit the number of processes the script uses?
And what should be avoided in the script itself?

5 answers
Ivan Antonov, 2015-10-05
@antonowano

Found the problem in the script. The simplexml_load_file function was eating all the RAM by loading the entire huge XML into memory, which caused the server to hang. After the script was rewritten to use the XMLReader class, it stopped consuming all the RAM. I followed this example: how to use XMLReader.
Thanks to Adamos for the tip:

I used XMLReader to parse the file.
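A minimal sketch of that streaming approach with XMLReader, assuming a feed of repeated elements; the element names ("object", "name") are hypothetical, not from the original feed:

```php
<?php
// Stream a large XML file with XMLReader so only one node is held in
// memory at a time, instead of simplexml_load_file loading the whole tree.
// Element names here are illustrative assumptions.

$file = tempnam(sys_get_temp_dir(), 'xml');
file_put_contents($file,
    '<catalog><object><name>a</name></object><object><name>b</name></object></catalog>');

$names  = [];
$reader = new XMLReader();
$reader->open($file);
while ($reader->read()) {
    if ($reader->nodeType === XMLReader::ELEMENT && $reader->name === 'object') {
        // Materialize only the current <object> subtree (bounded memory)
        $node    = simplexml_load_string($reader->readOuterXml());
        $names[] = (string) $node->name;
    }
}
$reader->close();
unlink($file);

print_r($names);
```

Memory stays roughly constant regardless of file size, since only the current subtree is ever parsed into an object.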

Thomas Storm, 2015-10-02
@v_sadist

Hello, topic starter.
Where does the script download the archive from? Network load is worth looking into as well.
Post the script; it will help us think the solution through.

mureevms, 2015-10-02
@mureevms

At what step does the load occur? If it's during unpacking, then see How to extract certain files from a tar archive
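For the unpacking step specifically, here is a sketch of stream-decompressing a plain .gz file in bounded memory (assuming the archive is gzip-compressed XML, not a tar; the file names are illustrative):

```php
<?php
// Decompress a .gz file in fixed-size chunks so memory use stays constant,
// instead of gzdecode()-ing the whole archive into a string.

$src = tempnam(sys_get_temp_dir(), 'gz');
$dst = tempnam(sys_get_temp_dir(), 'xml');
file_put_contents($src, gzencode(str_repeat('<row/>', 1000))); // sample data

$in  = gzopen($src, 'rb');
$out = fopen($dst, 'wb');
while (!gzeof($in)) {
    fwrite($out, gzread($in, 8192)); // 8 KB chunks: constant memory use
}
gzclose($in);
fclose($out);
unlink($src);
```

The decompressed file on disk can then be fed to XMLReader without ever holding the whole document in RAM.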

neol, 2015-10-02
@neol

A load average of 3.5 on 8 cores does not look like a reason for the site to be unavailable.
Maybe the problem is a table lock during the database update?
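If locking is the culprit, a common mitigation is to commit in small batches so the table is never locked for long. A sketch with PDO and an in-memory SQLite table (the schema and batch size are hypothetical):

```php
<?php
// Insert rows in batches inside short transactions so other queries can
// run between commits, instead of one long table-locking transaction.

$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE objects (id INTEGER PRIMARY KEY, name TEXT)');
$stmt = $pdo->prepare('INSERT INTO objects (name) VALUES (?)');

foreach (array_chunk(range(1, 1000), 200) as $batch) {
    $pdo->beginTransaction();
    foreach ($batch as $n) {
        $stmt->execute(["obj$n"]);
    }
    $pdo->commit();  // release the lock between batches
    usleep(10000);   // brief pause gives the site's queries a window (illustrative)
}
```

The batch size is a trade-off: larger batches finish faster overall, smaller ones keep each lock window shorter.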

yarofon, 2015-10-02
@yarofon

Run the cron job on another server. For example, spin up an EC2 instance on Amazon (if the job doesn't run frequently).
