Parsing
muhasa, 2019-09-28 23:09:14

How to send hundreds of requests at the same time?

Folks, here's the situation: a friend of mine has a project that pulls online game results from a remote European site. He does it in the simplest way possible: once a minute, cron runs scripts that fire off the requests. In 1.php, for example, there is one main request for all games, and then a loop makes an additional request per game to a URL that returns extra fields. So a single file can end up making 20-30 requests. At roughly 1 request per second, that comes to about 20-30 seconds per parsing script.
Now the task is to speed all of this up, because another game can generate up to 100 requests in a single file, and that no longer fits into the one-minute interval: the file would take about 2 minutes to run.
The first thing that comes to mind is asynchronous or parallel requests (they're the same thing, right?).
The parser is written in PHP, and from what I've read online, PHP is supposedly not very good at parallel requests. So, a few questions:
1) Is PHP really a dead end for this problem, or is that opinion just outdated? I know there are libraries like GuzzleHTTP, that cURL has curl_multi_exec, and that there are libraries claiming decent async support in general, but is that what I'm looking for here?
2) I've heard that node.js is good for this kind of task; is that true? I've heard the same about Python: write a daemon in it and forget about it. Which of these technologies handles this task better? I'd prefer Python, since I've dabbled in it before.
Thanks for your help!


1 answer
Andrey Dugin, 2019-09-28
@adugin

Python asyncio & aiohttp; also check whether the site transmits data via a websocket (you can see this in the browser).
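To illustrate the suggestion above, here is a minimal sketch of fetching many URLs concurrently with asyncio and aiohttp. The URLs and the concurrency limit are made up for the example; adapt them to the real site and its rate limits.

```python
import asyncio
import aiohttp

# Hypothetical per-game URLs; replace with the real endpoints.
URLS = [f"https://example.com/api/game/{i}" for i in range(100)]

async def fetch(session: aiohttp.ClientSession, url: str, sem: asyncio.Semaphore) -> str:
    # The semaphore caps how many requests are in flight at once
    # so we don't hammer the remote site.
    async with sem:
        async with session.get(url) as resp:
            resp.raise_for_status()
            return await resp.text()

async def main() -> None:
    sem = asyncio.Semaphore(20)  # at most 20 concurrent requests
    async with aiohttp.ClientSession() as session:
        tasks = [fetch(session, url, sem) for url in URLS]
        # return_exceptions=True keeps one failed request from killing the whole batch
        results = await asyncio.gather(*tasks, return_exceptions=True)
    for url, result in zip(URLS, results):
        status = "error" if isinstance(result, Exception) else f"{len(result)} bytes"
        print(url, status)

if __name__ == "__main__":
    asyncio.run(main())
```

With this approach, 100 requests limited to 20 at a time finish in a few seconds rather than the ~100 seconds a sequential loop would take, assuming the remote server answers in about a second per request.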
