Delphi
KEKSOV, 2013-09-17 01:16:05

Ready-made implementation of IPC with buffering?

Two processes run on the same computer. Process A collects data from the equipment; it is very fast and has to be handled very carefully, so writing the data to MySQL is moved out into a separate Process B. The peculiarity of the task is that the database may be unavailable at times (the server was shut down, the network dropped, etc.), yet the data must not be lost: when the database comes back, the data should still be written there. So Process A has to save the data into some intermediate buffer (a file), and Process B has to read it from that buffer (and delete it after a successful write to the database). This suggests implementing something like a MemoryStream on top of a memory-mapped file, with synchronization via semaphores/events. Another option is an SQLite database, but I don't want to drag it in to solve such a narrow task.
Perhaps someone has come across a similar problem and can suggest a ready-made solution.
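Roughly what I have in mind, as a minimal sketch of the Process A side (Windows; the file name acq_buffer.dat, the object names AcqMap/AcqMutex/AcqEvent, the fixed 16-byte sample and the lack of error handling are all placeholders, not a finished solution):

program ProcessASketch;
{$APPTYPE CONSOLE}

uses
  Windows;

const
  BUF_SIZE = 64 * 1024; // capacity of the shared buffer, in bytes

type
  PSharedBuf = ^TSharedBuf;
  TSharedBuf = packed record
    Used: Cardinal;                        // number of valid bytes in Data
    Data: array[0..BUF_SIZE - 1] of Byte;  // raw samples from the equipment
  end;

var
  hFile, hMap, hMutex, hEvent: THandle;
  Buf: PSharedBuf;
  Sample: array[0..15] of Byte;            // placeholder for one measurement
begin
  // Back the mapping with a real file so buffered data survives a restart.
  hFile := CreateFile('acq_buffer.dat', GENERIC_READ or GENERIC_WRITE,
    FILE_SHARE_READ or FILE_SHARE_WRITE, nil, OPEN_ALWAYS, FILE_ATTRIBUTE_NORMAL, 0);
  hMap := CreateFileMapping(hFile, nil, PAGE_READWRITE, 0, SizeOf(TSharedBuf), 'AcqMap');
  hMutex := CreateMutex(nil, False, 'AcqMutex');        // guards Used/Data
  hEvent := CreateEvent(nil, False, False, 'AcqEvent'); // "new data" signal for Process B
  Buf := MapViewOfFile(hMap, FILE_MAP_WRITE, 0, 0, 0);
  try
    FillChar(Sample, SizeOf(Sample), 0); // ...acquire a real sample from the equipment here...

    WaitForSingleObject(hMutex, INFINITE);
    try
      if Buf^.Used + SizeOf(Sample) <= BUF_SIZE then
      begin
        Move(Sample, Buf^.Data[Buf^.Used], SizeOf(Sample));
        Inc(Buf^.Used, SizeOf(Sample));
      end;
      // else: the buffer is full because Process B has not drained it to MySQL yet
    finally
      ReleaseMutex(hMutex);
    end;
    SetEvent(hEvent); // wake Process B; it copies the data out and resets Used to 0
  finally
    UnmapViewOfFile(Buf);
    CloseHandle(hEvent);
    CloseHandle(hMutex);
    CloseHandle(hMap);
    CloseHandle(hFile);
  end;
end.

Process B would open the same named objects, wait on AcqEvent, copy the block out under the mutex, and only then try MySQL, so an unreachable database simply leaves the data sitting in the file until it comes back.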

2 answers
Ilya Evseev, 2013-09-18
@IlyaEvseev

1) The universal option is Redis, i.e. a fast client-server NoSQL store that keeps data in RAM and snapshots it to disk.
2) If something extremely simple is required, you can create a pipe with a buffer of the maximum size.
stackoverflow.com/questions/4624071/pipe-buffer-size-is-4k-or-64k
Process A reads data from the equipment and writes it to the pipe; Process B reads the pipe and writes to the database. The buffering is done by Linux (a Windows named-pipe sketch of this is given after the list).
The main points to pay attention to:
- the maximum buffer size of the pipe,
- the possibility of losing data if Process B crashes.
3) If you write directly from A into a MySQL table of the MEMORY or ARCHIVE type using INSERT DELAYED, is the speed satisfactory?
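For the asker's Delphi/Windows side, option 2 could look roughly like the named-pipe sketch below; the pipe name '\\.\pipe\acq', the 16-byte sample and the absence of error handling are assumptions, not a finished implementation.

unit PipeSketch;

interface

procedure PushSample;        // called in Process A after each measurement
procedure DrainPipeToMySQL;  // the main loop of Process B

implementation

uses
  Windows;

procedure PushSample;
var
  hPipe: THandle;
  Sample: array[0..15] of Byte;
  Written: DWORD;
begin
  FillChar(Sample, SizeOf(Sample), 0); // ...fill with real equipment data here...
  // Re-opening the pipe per sample is only for brevity; keep the handle open in real code.
  hPipe := CreateFile('\\.\pipe\acq', GENERIC_WRITE, 0, nil, OPEN_EXISTING, 0, 0);
  if hPipe <> INVALID_HANDLE_VALUE then
  try
    WriteFile(hPipe, Sample, SizeOf(Sample), Written, nil);
  finally
    CloseHandle(hPipe);
  end;
end;

procedure DrainPipeToMySQL;
var
  hPipe: THandle;
  Chunk: array[0..15] of Byte;
  ReadBytes: DWORD;
begin
  // 64 KB inbound buffer: the OS does the buffering, as with the Linux pipe.
  hPipe := CreateNamedPipe('\\.\pipe\acq', PIPE_ACCESS_INBOUND,
    PIPE_TYPE_BYTE or PIPE_READMODE_BYTE or PIPE_WAIT,
    1, 0, 64 * 1024, 0, nil);
  ConnectNamedPipe(hPipe, nil);
  while ReadFile(hPipe, Chunk, SizeOf(Chunk), ReadBytes, nil) and (ReadBytes > 0) do
  begin
    // INSERT Chunk into MySQL here; if the database is unreachable,
    // spool Chunk to a local file so it is not lost - the pipe alone
    // does not protect against a crash of Process B.
  end;
  CloseHandle(hPipe);
end;

end.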

mayorovp, 2013-09-17
@mayorovp

Why doesn't a second thread within the same process suit you?
