Is PHP multicurl reliable for a large number of requests? What are the alternatives?
Good evening everyone!
I need to make several calls to a PHP script (with different parameters) in parallel, i.e. at the same time. I implemented it with multicurl, but there may be as many as 1000 calls at once. Will that cause problems?
Will multicurl hang after firing off 1000 requests?
What would be a more robust way to solve this task in PHP?
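One common way to keep curl_multi from choking on 1000 handles at once is to process the URLs in batches, so only a limited number of transfers are in flight at any time. A minimal sketch, assuming the curl extension is available; `fetch_all` and `$batchSize` are illustrative names, not part of any API:

```php
<?php
// Sketch: run many requests through curl_multi in bounded batches.
// Only $batchSize transfers are active at once, which limits memory
// and socket usage compared to adding all 1000 handles up front.
function fetch_all(array $urls, int $batchSize = 100): array
{
    $results = [];
    foreach (array_chunk($urls, $batchSize, true) as $chunk) {
        $mh = curl_multi_init();
        $handles = [];
        foreach ($chunk as $key => $url) {
            $ch = curl_init($url);
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
            curl_setopt($ch, CURLOPT_TIMEOUT, 10);
            curl_multi_add_handle($mh, $ch);
            $handles[$key] = $ch;
        }
        // Drive all transfers in this batch to completion.
        do {
            $status = curl_multi_exec($mh, $active);
            if ($active) {
                curl_multi_select($mh); // wait for activity instead of busy-looping
            }
        } while ($active && $status === CURLM_OK);
        foreach ($handles as $key => $ch) {
            $results[$key] = curl_multi_getcontent($ch);
            curl_multi_remove_handle($mh, $ch);
            curl_close($ch);
        }
        curl_multi_close($mh);
    }
    return $results;
}
```

With a batch size of 100–300 this stays inside the range that multicurl is reported to handle comfortably, at the cost of some wall-clock time versus firing everything at once.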
1000 requests in parallel is actually quite a lot.
It can run out of memory, saturate bandwidth (if it's not a dedicated server), slow everything down, or trip anti-DDoS protection.
There are many nuances, so test it.
I would rethink the architecture, either minimizing the number of requests (if that's possible at all) or introducing queues.
Hypothetically, it's reliable.
I would use Guzzle:
docs.guzzlephp.org/en/latest/quickstart.html#concu...
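The concurrent-requests section of the Guzzle docs describes a request `Pool` that caps how many requests run at once. A minimal sketch based on that API (the URL, total count, and concurrency value are illustrative; Guzzle must be installed via Composer, e.g. `composer require guzzlehttp/guzzle`, so this is not runnable standalone):

```php
<?php
require 'vendor/autoload.php';

use GuzzleHttp\Client;
use GuzzleHttp\Pool;
use GuzzleHttp\Psr7\Request;

$client = new Client();

// Generator yields requests lazily, so 1000 requests are never
// all held in memory as open connections at the same time.
$requests = function (int $total) {
    for ($i = 0; $i < $total; $i++) {
        yield new Request('GET', 'http://example.com/script.php?p=' . $i);
    }
};

$pool = new Pool($client, $requests(1000), [
    'concurrency' => 25, // at most 25 requests in flight
    'fulfilled' => function ($response, $index) {
        // handle a successful response for request $index
    },
    'rejected' => function ($reason, $index) {
        // handle a failed request (timeout, connection error, ...)
    },
]);

// Start the transfers and wait for the whole pool to finish.
$pool->promise()->wait();
```

The `concurrency` option is what makes this more robust than raw multicurl for large request counts: the pool drains the generator gradually instead of opening 1000 sockets at once.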
In my experience, multicurl copes fine with about 300 requests; 1000 it can't handle - a pile of them die on timeout. 500 it can probably manage, but effectively it's still around 300.
pthreads could be a solution: https://github.com/krakjoe/pthreads , but there isn't much documentation and the developer doesn't always answer questions.