Which library should I use?
I have a task: generate random numbers and add them to some shared container or pool of random numbers.
I start the process and get output like: 475 123 654 789 654 231 ...
While this process is running, I want to start another process (another instance of the same program) so that the random numbers from the second process also go into the same pool.
While those are running, I want to be able to add a couple more of the same (the same code, just launched as several instances in several independent terminal windows).
If I kill one of the processes, the others should keep running.
What Python library should I use for this?
The built-in multiprocessing module doesn't seem to support this: there the number of processes is set directly in the program code (a constant value), not by launching new instances.
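For reference, the closest thing I found in the standard library is multiprocessing.managers, which can expose a shared object over a socket to separately launched processes. Here is a minimal sketch (the port 50000, the authkey, and the file name pool_demo.py are arbitrary choices of mine); its drawback is that the server process becomes a single point of failure, which conflicts with the requirement above:

```python
# pool_demo.py — a minimal sketch, not production code.
# Run "python pool_demo.py serve" once, then launch any number of
# plain "python pool_demo.py" instances from separate terminals.
import random
import sys
import time
from multiprocessing.managers import BaseManager

ADDRESS = ('localhost', 50000)  # arbitrary assumption
AUTHKEY = b'secret'             # arbitrary assumption

class PoolManager(BaseManager):
    pass

def serve():
    pool = []  # the shared pool lives in this one server process
    PoolManager.register('get_pool', callable=lambda: pool)
    server = PoolManager(address=ADDRESS, authkey=AUTHKEY).get_server()
    print('pool server listening on', ADDRESS)
    server.serve_forever()

def work():
    PoolManager.register('get_pool')
    manager = PoolManager(address=ADDRESS, authkey=AUTHKEY)
    manager.connect()
    pool = manager.get_pool()  # proxy; method calls go over the socket
    while True:
        n = random.randint(0, 999)
        pool.append(n)  # appended to the list inside the server process
        print(n, end=' ', flush=True)
        time.sleep(1)

if __name__ == '__main__':
    if len(sys.argv) > 1 and sys.argv[1] == 'serve':
        serve()
    else:
        work()
```

Killing any worker is harmless here, but killing the server takes the pool down with it.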
In general, I want a distributed network of processes where it is easy to add and remove processes, each process knows about the others, and they all work on one common task together.
I'm ready to consider options NOT in Python if they are easier to implement. What the process actually does isn't important; random numbers were just the first thing that came to mind.
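To make the requirement concrete, here is a sketch of the same idea with an external store, e.g. Redis (pip install redis), assuming a Redis server on localhost:6379 and an arbitrary key name 'random_pool'. Every instance launched from any terminal appends to the same list, and killing any one instance never affects the others:

```python
# random_worker.py — a sketch; launch as many instances as you like.
import random
import time

import redis

r = redis.Redis(host='localhost', port=6379, db=0)

while True:
    n = random.randint(0, 999)
    r.rpush('random_pool', n)  # atomically append to the shared pool
    print(n, 'pool size:', r.llen('random_pool'))
    time.sleep(1)
```

The trade-off is that the shared state now lives in Redis rather than in any of the worker processes, so the workers themselves are fully independent.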