Why is Python multiprocessing unstable?
Hello, community. My attempts to get to grips with multiprocessing in Python have hit a serious snag. I would be very grateful for any help in analysing the problem, or for recommendations on how to fix the code.
In a program for mathematical calculations I use multiprocessing for one part of the work. The parallel block of code looks like this:
worker_count = multiprocessing.cpu_count()
jobs = []
print "---> Starting multiprocessing #", series_number
for i in xrange(worker_count):
    s = solver.get_solution(copied_system)
    p = multiprocessing.Process(target=combinatorial_set.find_nearest_set_point, args=(s, result_queue))
    jobs.append(p)
    p.start()
    print p
for w in jobs:
    w.join()
    print w
results = []
while not result_queue.empty():
    results.append(result_queue.get())
for i in xrange(len(results)):
    print results[i], is_solution(copied_system, results[i])
    if is_solution(copied_system, results[i]):
        func_value = f(results[i])
        experiment_valid_points[func_value] = results[i]
# End of parallel
print "---> End of multiprocessing #", series_number
Good thing you tagged Windows; that explains everything. Under Windows there is no easy way to "fork" a process when multiprocessing.Process is called, so a rather involved emulation of that behavior is performed instead: the target function is pulled out of its module and run in a separate interpreter, and all of its parameters are serialized, passed across, and deserialized before target is called, while module initialization in the new interpreter is done only partially (only the global context is initialized). More about this is written, for example, here; there is another very good article where this mechanism is discussed in detail, but I can't find the link right now.
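To see what that re-import means in practice, here is a minimal sketch (Python 2, same as the question; the file name demo_spawn.py and the work function are made up for illustration). Under Windows the child interpreter imports the module again, so the module-level print fires a second time, and without the __name__ guard the child would itself try to spawn processes.

# demo_spawn.py -- hypothetical minimal example
import multiprocessing

print "importing module"          # under Windows this also prints in the child

def work(x, out_queue):
    # runs in the child interpreter; receives only bare data
    out_queue.put(x * x)

if __name__ == '__main__':
    # without this guard the re-imported module would start
    # spawning processes again inside every child
    q = multiprocessing.Queue()
    p = multiprocessing.Process(target=work, args=(2, q))
    p.start()
    print q.get()                 # -> 4
    p.join()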
Briefly, how to prepare multiprocessing for Windows:
1. Spawn the processes (multiprocessing.Process()) as early as possible in the code.
2. Do not pass in args any complex objects with "behavior" (except for objects from multiprocessing itself, which it knows how to pass correctly), only bare data: primitives, or objects consisting only of primitives, which are serialized without side effects.
3. Do not wait on process.join(); just read the results from the output Pipe. They will be read only once they actually get there, and what happens to the process afterwards no longer matters (you can put a return right after writing to the Pipe in the child process). A sketch of all three points together follows below.
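Put together, the skeleton of the block from the question might look roughly like this. This is only a sketch under assumptions: solver, combinatorial_set, copied_system and find_nearest_set_point are the asker's own names, I'm assuming solver.get_solution returns something that can be reduced to a plain tuple of numbers, and that each worker puts exactly one result into the queue.

import multiprocessing
import solver                     # the question's own modules (assumed importable)
import combinatorial_set

def run_series(copied_system, series_number):
    worker_count = multiprocessing.cpu_count()
    result_queue = multiprocessing.Queue()
    jobs = []
    print "---> Starting multiprocessing #", series_number
    for i in xrange(worker_count):
        # reduce the starting point to bare data before handing it to the child
        s = tuple(solver.get_solution(copied_system))
        p = multiprocessing.Process(
            target=combinatorial_set.find_nearest_set_point,
            args=(s, result_queue))
        jobs.append(p)
        p.start()
    # get() blocks until each child has written its result, so we do not
    # depend on join() or on the unreliable empty() check
    results = [result_queue.get() for _ in xrange(worker_count)]
    for p in jobs:
        p.join()
    print "---> End of multiprocessing #", series_number
    return results

if __name__ == '__main__':
    # build copied_system here, then e.g.:
    # points = run_series(copied_system, 1)
    pass

The important differences from the original block are that everything which spawns processes sits behind the __name__ guard, the arguments are plain tuples, and the results are read with blocking get() calls instead of polling empty() after join().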