What is the best way to run a lot of long-polling HTTP requests concurrently?
I need to receive events via VK LongPoll from multiple accounts at the same time (about 90). I already have a function that sends the requests and processes the responses, but only for a single user. At first the solution seemed obvious: run the function for each user in a separate thread with threading. Then I learned about the GIL and started to doubt whether that many threads would end up running too slowly. Doing the same thing with multiprocessing is probably even worse in terms of overhead.

Maybe I'm worrying for nothing, and threads that spend most of their time waiting for a response from the server won't actually "slow down"? Or should the function be rewritten with aiohttp (although I'm afraid that will be too difficult for me)? What is the best approach if I run the script on a server with only a few CPU cores (2-4)?
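For reference, here is a minimal sketch of the aiohttp variant the question mentions: one coroutine per account, all sharing a single thread, so the GIL is irrelevant while the requests are in flight. The ACCOUNTS list, the handle_update callback, and the LongPoll parameters (server, key, ts, wait) are placeholders for illustration; the real values come from the VK API (messages.getLongPollServer) and may differ for a particular setup.

```python
import asyncio
import aiohttp

# Placeholder: per-account LongPoll credentials, normally obtained
# beforehand via the VK API method messages.getLongPollServer.
ACCOUNTS = [
    {"server": "https://im.vk.com/im123", "key": "abc", "ts": 1},
    # ... ~90 entries
]

def handle_update(account: dict, update: list) -> None:
    # Placeholder for the existing per-event processing function.
    print(account["server"], update)

async def longpoll_loop(session: aiohttp.ClientSession, account: dict) -> None:
    """Poll one account's LongPoll server in an endless loop."""
    ts = account["ts"]
    while True:
        params = {
            "act": "a_check",
            "key": account["key"],
            "ts": ts,
            "wait": 25,  # the server holds the request open up to 25 seconds
            "mode": 2,
            "version": 3,
        }
        async with session.get(account["server"], params=params) as resp:
            data = await resp.json(content_type=None)
        ts = data.get("ts", ts)
        for update in data.get("updates", []):
            handle_update(account, update)

async def main() -> None:
    timeout = aiohttp.ClientTimeout(total=90)
    async with aiohttp.ClientSession(timeout=timeout) as session:
        # One coroutine per account; all run in a single OS thread.
        await asyncio.gather(*(longpoll_loop(session, a) for a in ACCOUNTS))

if __name__ == "__main__":
    asyncio.run(main())
```

On the threading doubt: the GIL is released while a thread is blocked on network I/O, so ~90 threads that mostly sit waiting for the server are also cheap and should fit comfortably on a 2-4 core machine; the asyncio version above simply avoids the per-thread memory overhead.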
I have about 300 VK accounts on LongPoll and nothing even starts to choke; I think it will keep working without problems up to tens of thousands.
Actually, I phrased the question a little wrong, sorry: I was interested not in the speed of operation but in resource consumption.
In the end I wrote the LongPoll part in Golang, while I still process the requests in Python (not every event gets processed); it has become much better.