Python
Arsen Abakarov, 2018-02-17 11:20:14

Gracefully closing third-party resources in a multi-process environment?

I am writing a small program that, at a certain point, needs to work with Docker entities; I do this with docker.py.
The program can be run through the Celery scheduler, so there may be many application instances,
and containers have to be shared between those instances.
On exit the program must clean up its garbage in Docker, so I decided to add a handler for sys.exit and the OS signals,
but you can't just kill everything at once, since other instances may still be running. So I decided to keep bookkeeping on the Redis side: which container is used by whom, and so on. I created something like a manager and gave it these responsibilities.
But now it turns out that on exit we also have to talk to Redis.
How reasonable is it to do such things on program exit? There may be a network problem, and then we will have a problem shutting down.
The application will eventually be shipped as a package, and I would not want to push resource cleanup onto the package's client.
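Here is a minimal sketch of the approach described above, assuming a Redis set per instance as the ownership record. The class name, key scheme ("containers:<instance_id>"), and instance id format are illustrative assumptions, not taken from the question:

```python
# Sketch: a Redis-backed registry that records which instance owns which
# container, plus signal/atexit cleanup that degrades gracefully when Redis
# is unreachable at shutdown.
import atexit
import logging
import signal
import socket
import sys

import docker
import redis
from docker import errors as docker_errors

log = logging.getLogger(__name__)


class ContainerRegistry:
    """Tracks container ownership in Redis; falls back to local state on errors."""

    def __init__(self, redis_url="redis://localhost:6379/0"):
        # Hypothetical instance id; any unique-per-process value works.
        self.instance_id = f"{socket.gethostname()}:{id(self)}"
        self.key = f"containers:{self.instance_id}"
        self.redis = redis.Redis.from_url(redis_url)
        self.docker = docker.from_env()
        self._local = set()  # local copy in case Redis is down at exit

    def register(self, container_id):
        self._local.add(container_id)
        try:
            self.redis.sadd(self.key, container_id)
        except redis.exceptions.ConnectionError:
            log.warning("Redis unavailable; ownership tracked locally only")

    def cleanup(self, *_):
        # Prefer the Redis view, but never let a network error block cleanup.
        try:
            owned = {c.decode() for c in self.redis.smembers(self.key)}
        except redis.exceptions.ConnectionError:
            owned = set(self._local)
        for cid in owned:
            try:
                self.docker.containers.get(cid).remove(force=True)
            except docker_errors.NotFound:
                pass  # already gone
        try:
            self.redis.delete(self.key)
        except redis.exceptions.ConnectionError:
            pass


registry = ContainerRegistry()
atexit.register(registry.cleanup)
# SIGTERM triggers sys.exit(0) so the atexit handler still runs.
signal.signal(signal.SIGTERM, lambda signum, frame: sys.exit(0))
```

Each instance only removes containers it registered itself, so other running instances are untouched; if Redis is unreachable at exit, the locally tracked set still allows a best-effort cleanup instead of failing the shutdown.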

1 answer(s)
Vladimir, 2018-02-17
@vintello

I think you are not the only one with this problem, and most likely someone has already solved it.
