How to properly organize logging in multiprocess scripts?
What is the best way to do logging in scripts that run in multiple processes (not threads)? I would like everything to be logged into a single file, rather than each process writing its own. The only approach I have found online is to create a queue: each process puts log lines into the queue, and a loop in the main process pulls them out and writes them to the file. Are there other approaches? It also seems strange that the standard logging library does not already ship a ready-made implementation of something like this.
The key requirement is that it works under Windows, so the multiprocessing-logging library is not an option.
Why does it have to be a single file? Because the logs are later read by an analyzer that can only work with one file, and there is no way to change that.
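For what it's worth, the queue approach described above does exist in the standard library since Python 3.2, as `logging.handlers.QueueHandler` and `QueueListener`. A minimal sketch (file name `app.log` and the formatter are illustrative choices, not anything the question specifies); it works under Windows because the queue is passed to the workers as a `Process` argument:

```python
import logging
import logging.handlers
import multiprocessing


def worker(queue, n):
    # Workers never touch the log file: they push records into the
    # shared queue via QueueHandler.
    root = logging.getLogger()
    root.handlers = [logging.handlers.QueueHandler(queue)]
    root.setLevel(logging.INFO)
    logging.info("message from worker %d", n)


def main():
    queue = multiprocessing.Queue()

    # Only the main process writes to the file.
    file_handler = logging.FileHandler("app.log")
    file_handler.setFormatter(
        logging.Formatter("%(asctime)s %(processName)s %(levelname)s %(message)s")
    )

    # QueueListener drains the queue in a background thread and
    # forwards each record to the file handler.
    listener = logging.handlers.QueueListener(queue, file_handler)
    listener.start()

    procs = [
        multiprocessing.Process(target=worker, args=(queue, i)) for i in range(3)
    ]
    for p in procs:
        p.start()
    for p in procs:
        p.join()

    listener.stop()  # flushes any records still in the queue


if __name__ == "__main__":
    main()
```

After the run, `app.log` contains one line per worker, all written by a single handler in the main process.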
Set up (or hack together) a syslog analogue for Windows:
listen on an ip:port or a socket and write everything received to a file.
Another option: write to separate files, then aggregate them with Logstash or even the full ELK stack.