There are 100k URLs whose contents need to be parsed. How do you crawl them 5 at a time in parallel (PHP + cURL)?
The database contains a list of 100,000 URLs.
Going through them one by one in a single-threaded loop would take a very long time.
How can I work through the whole list while fetching 5 URLs at a time?
I'm primarily interested in cURL + PHP.
You can use any HTTP client, for example Guzzle.
Under the hood it still uses curl_multi_* anyway.
use GuzzleHttp\Pool;
use GuzzleHttp\Client;
use GuzzleHttp\Psr7\Request;

$client = new Client();

$requests = function ($total) {
    $uri = 'http://127.0.0.1:8126/guzzle-server/perf';
    for ($i = 0; $i < $total; $i++) {
        yield new Request('GET', $uri);
    }
};

$pool = new Pool($client, $requests(100), [
    'concurrency' => 5,
    'fulfilled' => function ($response, $index) {
        // this is delivered to each successful response
    },
    'rejected' => function ($reason, $index) {
        // this is delivered to each failed request
    },
]);

// Initiate the transfers and create a promise
$promise = $pool->promise();

// Force the pool of requests to complete.
$promise->wait();
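The example above hits Guzzle's local test server; for the real task, the generator would yield one request per URL from the database. A minimal sketch, assuming a PDO connection and a `urls` table with a `url` column (both names are placeholders for your schema):

$requests = function (PDO $pdo) {
    // Stream rows instead of loading all 100k URLs into memory at once.
    $stmt = $pdo->query('SELECT url FROM urls');
    while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
        yield new Request('GET', $row['url']);
    }
};

$pool = new Pool($client, $requests($pdo), [
    'concurrency' => 5,
    // same 'fulfilled' / 'rejected' callbacks as above
]);
$pool->promise()->wait();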
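Since Guzzle ultimately drives curl_multi_* anyway, here is a minimal sketch of the same rolling-window idea with the raw API, assuming $urls is the array of URLs loaded from the database; the window size of 5 and the parsing stub are illustrative:

function crawl(array $urls, int $concurrency = 5): void
{
    $multi  = curl_multi_init();
    $active = 0;

    $addHandle = function (string $url) use ($multi, &$active): void {
        $ch = curl_init($url);
        curl_setopt_array($ch, [
            CURLOPT_RETURNTRANSFER => true,
            CURLOPT_FOLLOWLOCATION => true,
            CURLOPT_TIMEOUT        => 30,
            CURLOPT_PRIVATE        => $url, // remember which URL this handle fetches
        ]);
        curl_multi_add_handle($multi, $ch);
        $active++;
    };

    // Fill the initial window of 5 parallel transfers.
    while ($active < $concurrency && $urls) {
        $addHandle(array_shift($urls));
    }

    do {
        curl_multi_exec($multi, $running);
        curl_multi_select($multi); // block until some transfer has activity

        // Harvest finished transfers and refill the window.
        while ($info = curl_multi_info_read($multi)) {
            $ch  = $info['handle'];
            $url = curl_getinfo($ch, CURLINFO_PRIVATE);

            if ($info['result'] === CURLE_OK) {
                $body = curl_multi_getcontent($ch);
                // ... parse $body for $url here ...
            }

            curl_multi_remove_handle($multi, $ch);
            curl_close($ch);
            $active--;

            if ($urls) {
                $addHandle(array_shift($urls));
            }
        }
    } while ($active > 0);

    curl_multi_close($multi);
}

Either way, the key point is the rolling window: a new transfer starts as soon as any of the 5 finishes, instead of sending batches of 5 and waiting for the slowest one in each batch.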