Interaction between several Python scripts?
The setup: a main script generates a file with the data (JSON), and several auxiliary scripts check every N seconds whether the file has changed and re-read it if necessary.
It works, but the architecture is shaky. In addition, when the file is read while it is being written, the auxiliary scripts catch an exception (invalid JSON). It is handled, of course, and after N seconds the file is read again.
What options are there for interprocess communication in Python?
Note that the scripts run on both Windows and Linux, so platform-specific options won't do.
Installing MongoDB/MySQL is a last resort.
A peculiarity of the file: each new version completely replaces the previous data.
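Since each new version fully replaces the old data, the partial-read problem itself can be avoided by writing to a temporary file and renaming it over the old one: `os.replace` is atomic on both Windows and Linux when source and destination are on the same filesystem. A minimal sketch (function names are illustrative):

```python
import json
import os
import tempfile

def write_json_atomically(path, data):
    """Write data to path so readers never see a half-written file."""
    # Create the temp file in the same directory so the rename
    # stays on one filesystem (a requirement for atomicity).
    dir_name = os.path.dirname(os.path.abspath(path))
    fd, tmp_path = tempfile.mkstemp(dir=dir_name, suffix=".tmp")
    try:
        with os.fdopen(fd, "w", encoding="utf-8") as f:
            json.dump(data, f)
            f.flush()
            os.fsync(f.fileno())  # push the bytes to disk before the rename
        os.replace(tmp_path, path)  # atomic swap of the whole file
    except BaseException:
        os.unlink(tmp_path)
        raise

def read_json(path):
    with open(path, encoding="utf-8") as f:
        return json.load(f)
```

With this, a reader always sees either the previous complete version or the new one, never a truncated JSON.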
There is no need to poll the file; it is enough to use inotify from the OS itself.
For Python there are cross-platform implementations of file-change tracking built on inotify-like subsystems. The tracking scripts will receive an event after the file has been written and closed, even without any communication with the main process.
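One such cross-platform implementation is the third-party `watchdog` package (named here as an assumption; any similar library works), which wraps inotify on Linux and the native change-notification APIs on Windows and macOS. A sketch of the tracking side:

```python
from watchdog.events import FileSystemEventHandler
from watchdog.observers import Observer

class BaseFileHandler(FileSystemEventHandler):
    """Calls a callback whenever the watched file changes."""

    def __init__(self, filename, callback):
        self.filename = filename
        self.callback = callback

    def on_any_event(self, event):
        # Events arrive for the whole directory; keep only our file.
        if str(event.src_path).endswith(self.filename):
            self.callback(event)

def watch_file(directory, filename, callback):
    """Start a background observer; caller should stop() and join() it."""
    observer = Observer()
    observer.schedule(BaseFileHandler(filename, callback),
                      path=directory, recursive=False)
    observer.start()
    return observer
```

Usage: `obs = watch_file(".", "base.json", lambda e: reload_data())`, and `obs.stop(); obs.join()` on shutdown; the callback replaces the every-N-seconds polling loop.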
Beyond that, the scripts can talk to each other either over HTTP (each publishing a small web server of its own) or over any home-grown RPC protocol on top of TCP or Unix sockets.
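The RPC variant needs nothing beyond the standard library. A minimal sketch with `xmlrpc` (the payload and function name are made up for illustration), where the main script publishes the current data and auxiliary scripts pull it over TCP:

```python
import threading
from xmlrpc.client import ServerProxy
from xmlrpc.server import SimpleXMLRPCServer

# --- "main" script side: publish the current data over TCP ---
current_data = {"version": 1, "items": ["a", "b"]}  # hypothetical payload

def get_data():
    return current_data

# Port 0 lets the OS pick a free port; a real setup would fix the port.
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
server.register_function(get_data)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# --- "auxiliary" script side: ask for the data instead of polling a file ---
client = ServerProxy(f"http://127.0.0.1:{port}")
data = client.get_data()
```

This works identically on Windows and Linux; a plain `socket`-based protocol or a tiny HTTP endpoint would do the same job.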
As the simplest option, you can put the JSON into SQLite: this requires no separate server application, the standard `sqlite3` module is enough.
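A sketch of the SQLite route, assuming a single row that holds the whole document plus a version counter (the schema and function names are illustrative): the write is transactional, so readers never see a half-written document, and they re-parse the JSON only when the version has actually moved.

```python
import json
import sqlite3

def init(conn):
    conn.execute(
        "CREATE TABLE IF NOT EXISTS base ("
        " id INTEGER PRIMARY KEY CHECK (id = 1),"
        " version INTEGER NOT NULL,"
        " body TEXT NOT NULL)"
    )

def publish(conn, data):
    """Main script: replace the single row inside a transaction."""
    with conn:  # commits on success, rolls back on error
        conn.execute(
            "INSERT INTO base (id, version, body) VALUES (1, 1, ?) "
            "ON CONFLICT(id) DO UPDATE SET"
            " version = version + 1, body = excluded.body",
            (json.dumps(data),),
        )

def fetch(conn, known_version):
    """Auxiliary script: return (version, data), data is None if unchanged."""
    row = conn.execute("SELECT version, body FROM base WHERE id = 1").fetchone()
    if row is None or row[0] == known_version:
        return known_version, None
    return row[0], json.loads(row[1])
```

Note the `ON CONFLICT` upsert needs SQLite 3.24+, which any recent Python ships with.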
You can also use flock on the file (though this does not seem to work on all systems), or create a flag file after the JSON is written, whose presence tells the applications about the change (as a variant, they can react to a change of the flag file's timestamp).
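The timestamp variant of the flag file fits in a few lines of the standard library (the flag path is a made-up example): the main script touches the flag only after `base.json` is completely written, and the auxiliary scripts compare mtimes instead of parsing possibly half-written JSON.

```python
import os

def touch_flag(path):
    """Main script: bump the flag's mtime after the JSON is fully written."""
    with open(path, "a"):
        os.utime(path, None)  # set mtime to "now"

def changed_since(last_mtime, path):
    """Auxiliary script: (changed?, new_mtime) based on the flag's mtime."""
    try:
        mtime = os.path.getmtime(path)
    except FileNotFoundError:
        return False, last_mtime  # flag not created yet: nothing to read
    return mtime != last_mtime, mtime
```

This is still polling, but the poll is a single cheap `stat` call, and the JSON is only re-read when the flag has actually moved.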