Why does my script die when fetching data through cURL?
I have a script that writes ~67k records to a database (MySQL 5.7). The data comes from an API via cURL requests: the API returns 20 elements per page (per request), 3371 pages in total.
At first everything seems fine: the loop walks through the API response, the parsed data is written to the database, then the next request is made for the next page, and so on. But at around page 19-20 the script stops writing data to the database, as if it freezes.
I understand this is a typical problem and that something is clearly getting stuck somewhere. How do I solve it, and what should I pay attention to?
This is what the official example of an API request looks like (in my code it is wrapped in a loop):
<?php
$curl = curl_init();
curl_setopt_array($curl, array(
    CURLOPT_URL => "https://api.dev/?include_null_first_air_dates=false&timezone=Europe%2FMoscow&page=1&sort_by=popularity.desc&language=ru-RU",
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_ENCODING => "",
    CURLOPT_MAXREDIRS => 10,
    CURLOPT_TIMEOUT => 30,
    CURLOPT_HTTP_VERSION => CURL_HTTP_VERSION_1_1,
    CURLOPT_CUSTOMREQUEST => "GET",
    // note: the generated snippet also set CURLOPT_POSTFIELDS => "{}",
    // but a GET request should not carry a body, so it is dropped here
));
$response = curl_exec($curl);
$err = curl_error($curl);
curl_close($curl);

if ($err) {
    echo "cURL Error #:" . $err;
} else {
    echo $response;
}
Answer the question
Maybe there is simply a limit on the script's execution time (`max_execution_time`)?
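If that is the cause, the limit can be checked and lifted at the top of the script (a sketch; the 512M memory value is an arbitrary example, not taken from the question):

```php
<?php
// Under the CLI SAPI, max_execution_time is already 0 (unlimited);
// under Apache/FPM it typically defaults to 30 seconds.
echo "current limit: " . ini_get('max_execution_time') . "s" . PHP_EOL;

set_time_limit(0);                 // 0 = no execution time limit
ini_set('memory_limit', '512M');   // long imports can also hit the memory limit
```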
Run the script in the console, not in the browser, and log events. That way you will see exactly where the script dies.
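A minimal way to follow that advice (a sketch; the helper name is made up): timestamp every step to stderr, so the last line printed tells you exactly which page it stopped on.

```php
<?php
// Hypothetical helper: prefix each message with a timestamp, write it to stderr,
// and return the formatted line.
function logProgress(string $msg): string
{
    $line = date('H:i:s') . ' ' . $msg;
    fwrite(STDERR, $line . PHP_EOL);
    return $line;
}

logProgress('page 19 fetched');
logProgress('page 19 inserted (20 rows)');
```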