Why is MongoDB not writing to multiple threads?
While studying MongoDB, I came across the possibility of using sequential numeric _id values (1, 2, 3, ...) instead of the auto-generated _id:
import random
from multiprocessing import Pool

from pymongo import MongoClient

def get_next_sequence(collection, name):
    return collection.find_and_modify({'_id': name}, update={'$inc': {'seq': 1}}, new=True).get('seq')

def insert_in_db():
    client = MongoClient(mongo_url)
    db = client['']
    collection = db['']
    print(collection.insert_one({'_id': get_next_sequence(collection, 'userid'), 'value': f'{random.randint(10000, 2147483647)}'}))
    client.close()

with Pool(processes=200) as pool:
    for _ in range(100000):
        pool.apply_async(insert_in_db)
    pool.close()
    pool.join()
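The counter trick above relies on the fact that a $inc update on a single document is atomic. A minimal in-memory sketch of the same idea (FakeCounter and allocate_ids are hypothetical stand-ins; a lock plays the role of MongoDB's atomic update) shows why every concurrent caller still receives a distinct value:

```python
import threading

class FakeCounter:
    """In-memory stand-in (hypothetical) for the {'_id': name, 'seq': n}
    counter document; the lock plays the role of MongoDB's atomic $inc."""
    def __init__(self):
        self._seq = 0
        self._lock = threading.Lock()

    def next_seq(self):
        # Increment-and-read must be one indivisible step,
        # just like find_and_modify with {'$inc': {'seq': 1}}.
        with self._lock:
            self._seq += 1
            return self._seq

def allocate_ids(counter, n_threads=8, per_thread=100):
    """Have many workers pull ids concurrently and collect them all."""
    ids = []
    ids_lock = threading.Lock()

    def worker():
        for _ in range(per_thread):
            new_id = counter.next_seq()
            with ids_lock:
                ids.append(new_id)

    threads = [threading.Thread(target=worker) for _ in range(n_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return ids
```

Without the atomicity (drop the lock in next_seq) two workers can read the same seq and collide, which is exactly why the pattern funnels the increment through a single atomic operation.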
You never retrieve the execution result from the processes created by pool.apply_async. This is bad practice: code running in a child process may raise exceptions that should be handled in the main process, and they stay invisible until you call .get() on the returned AsyncResult. Read the multiprocessing documentation for details. Example:
import multiprocessing

def f():
    raise ValueError()

with multiprocessing.Pool() as pool:
    for _ in range(10):
        pool.apply_async(f)  # no errors

with multiprocessing.Pool() as pool:
    for _ in range(10):
        result = pool.apply_async(f)
        result.get(timeout=1)  # raises ValueError
In your case, MongoClient or the .insert_one / .find_and_modify queries most likely throw an exception related to an exceeded timeout (see the optional arguments of mongo_client and the exceptions documentation), but you never see it because the results are never fetched.
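A straightforward fix for the original loop, then, is to keep every AsyncResult and call .get() in the parent so child exceptions surface. A minimal sketch, with a made-up task standing in for the real Mongo insert (task, run, and the failure rule are illustrative assumptions, not the asker's code):

```python
import multiprocessing

def task(i):
    # Stand-in (hypothetical) for insert_in_db: fails for some inputs,
    # the way a Mongo timeout or duplicate-key error would.
    if i % 5 == 0:
        raise ValueError(f'simulated failure for {i}')
    return i * 2

def run(n=20):
    # 'fork' is POSIX-only; pick a start method your platform supports.
    ctx = multiprocessing.get_context('fork')
    succeeded, failed = [], []
    with ctx.Pool(processes=4) as pool:
        # Keep every AsyncResult so no outcome is silently dropped.
        pending = [pool.apply_async(task, (i,)) for i in range(n)]
        for res in pending:
            try:
                succeeded.append(res.get(timeout=30))  # child exceptions re-raise here
            except ValueError as exc:
                failed.append(exc)
    return succeeded, failed

if __name__ == '__main__':
    ok, bad = run()
    print(f'{len(ok)} succeeded, {len(bad)} failed')
```

With this pattern a Mongo timeout raised inside insert_in_db would propagate to the main process at the .get() call instead of vanishing silently.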