PHP
Igor Samokhin, 2016-01-27 18:55:23

Is PHP's curl_multi reliable for a large number of requests? What are the alternatives?

Good evening everyone!
I need to make several calls to a PHP script (with different parameters) in parallel, i.e. at the same time. I did it with curl_multi, but I may have 1000 calls at once. Will there be any problems?
Will curl_multi hang after firing off 1000 requests?
What would be a more robust solution for this task in PHP?
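For reference, curl_multi is usually kept stable at this scale by limiting how many handles are in flight (a rolling window) instead of adding all 1000 at once. A minimal sketch, assuming PHP 8 (where curl handles are objects, so `spl_object_id()` can key them); the function name, the window size of 50, and the timeout are placeholders, not anything from the thread:

```php
<?php
// Hypothetical helper: fetch many URLs with a bounded number of
// parallel curl handles, so 1000 jobs never run all at once.
function fetchAll(array $urls, int $maxConcurrent = 50): array
{
    $multi   = curl_multi_init();
    $queue   = $urls;   // URLs still waiting for a handle
    $handles = [];      // active handle id => [handle, url]
    $results = [];      // url => response body (false on error)

    $addHandle = function () use (&$queue, &$handles, $multi) {
        $url = array_shift($queue);
        $ch  = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 30);
        curl_multi_add_handle($multi, $ch);
        $handles[spl_object_id($ch)] = [$ch, $url];
    };

    // Fill the initial window.
    while ($queue && count($handles) < $maxConcurrent) {
        $addHandle();
    }

    do {
        curl_multi_exec($multi, $running);
        curl_multi_select($multi, 1.0); // wait for activity, don't busy-loop

        // Collect finished transfers and top the window back up.
        while ($info = curl_multi_info_read($multi)) {
            $ch = $info['handle'];
            [, $url] = $handles[spl_object_id($ch)];
            $results[$url] = curl_multi_getcontent($ch);
            curl_multi_remove_handle($multi, $ch);
            curl_close($ch);
            unset($handles[spl_object_id($ch)]);
            if ($queue) {
                $addHandle();
            }
        }
    } while ($running || $handles);

    curl_multi_close($multi);
    return $results;
}
```

With this shape, memory stays bounded by the window size rather than by the total job count.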

Answer the question


4 answers
Dmitry Entelis, 2016-01-27
@grigor007

1000 requests in parallel is actually quite a lot.
You can run into memory limits, bandwidth limits (if it's not a dedicated server), slowdowns, and anti-DDoS protection.
There are many nuances; test it.
I would rethink the architecture, either minimizing the number of requests (if that's possible at all) or introducing queues.
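The queue idea above can be illustrated with a toy in-process sketch: the producer enqueues jobs and a worker drains them in small batches, so nothing ever fires 1000 calls at once. In production the queue would be an external broker (Redis, RabbitMQ, Gearman) and the worker a separate process; `drainQueue` and the batch size here are made up for illustration:

```php
<?php
// Toy queue/worker split: jobs are processed in batches of $batchSize
// instead of all at once. $handler would do one curl call per job.
function drainQueue(SplQueue $queue, int $batchSize, callable $handler): int
{
    $processed = 0;
    while (!$queue->isEmpty()) {
        $batch = [];
        while (!$queue->isEmpty() && count($batch) < $batchSize) {
            $batch[] = $queue->dequeue();
        }
        foreach ($batch as $job) {
            $handler($job);
            $processed++;
        }
        // A real worker would sleep here, or block on the broker
        // (e.g. BRPOP in Redis) waiting for the next batch.
    }
    return $processed;
}
```

The key property is that concurrency is set by the worker, not by however many jobs the producer happens to submit.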

Mikhail Osher, 2016-01-27
@miraage

Hypothetically it's reliable.
I would use Guzzle:
docs.guzzlephp.org/en/latest/quickstart.html#concu...
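The Guzzle feature being pointed at is `GuzzleHttp\Pool`, which caps how many requests are in flight. A sketch based on the documented API (requires `composer require guzzlehttp/guzzle`; the example URL and the concurrency of 25 are illustrative, not from the answer):

```php
<?php
require 'vendor/autoload.php';

use GuzzleHttp\Client;
use GuzzleHttp\Pool;
use GuzzleHttp\Psr7\Request;

$client = new Client(['timeout' => 30]);

// A generator yields requests lazily, so 1000 Request objects are
// never held in memory at once.
$requests = function (int $total) {
    for ($i = 0; $i < $total; $i++) {
        yield new Request('GET', "http://example.com/script.php?id={$i}");
    }
};

$pool = new Pool($client, $requests(1000), [
    'concurrency' => 25, // only 25 requests in flight at any moment
    'fulfilled' => function ($response, $index) {
        // handle a successful response
    },
    'rejected' => function ($reason, $index) {
        // handle a failure (timeout, DNS error, ...)
    },
]);

// Start the transfers and block until all 1000 complete.
$pool->promise()->wait();
```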

Dimonchik, 2016-01-27
@dimonchik2013

PyCurl runs fine at about 300 requests; it can't handle 1000 - a bunch of them drop off with timeouts. It can probably manage 500, but effectively that's still about 300.

astrotrain, 2016-01-31
@astrotrain

One option is pthreads: https://github.com/krakjoe/pthreads , though the documentation is sparse and the developer doesn't always answer questions.
