JavaScript
SEOD, 2018-08-03 18:01:18

How to deal with parallel reading and writing of data to a file under Node.js?

There are two independent Node.js applications. One writes data to a file via fs.writeFileSync, and the other reads the same file via fs.readFileSync.
The file is about 70-100 KB. Writes happen every few seconds.
Reads happen at random times. The problem is that the writer does not always manage to finish the file in time, so I sometimes read only a piece of it, and after such a "successful" read the program keeps working as if nothing had happened :) but for me, corrupted data is critical.
Is this really a known problem, or am I implementing the writing and reading incorrectly?
Of course, I could fall back on storing an md5 checksum of the content at the beginning of the file.
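Roughly what I have in mind (the file path and function names here are just placeholders):

const crypto = require('crypto');
const fs = require('fs');

// Writer: prepend the md5 of the payload so a torn read can be detected.
function writeWithChecksum(filePath, payload) {
  const hash = crypto.createHash('md5').update(payload).digest('hex');
  fs.writeFileSync(filePath, hash + '\n' + payload);
}

// Reader: recompute the md5 and reject data that does not match.
function readWithChecksum(filePath) {
  const raw = fs.readFileSync(filePath, 'utf8');
  const sep = raw.indexOf('\n');
  if (sep === -1) return null;                  // file truncated mid-write
  const expected = raw.slice(0, sep);
  const payload = raw.slice(sep + 1);
  const actual = crypto.createHash('md5').update(payload).digest('hex');
  return actual === expected ? payload : null;  // null = partial or corrupted read
}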


1 answer
Lynn "Coffee Man", 2018-08-03
@Lynn

Write to a temporary file nearby, and then rename. Renaming is an atomic operation, so the reading process will either read the old file or the new one, but never half of the new file.
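A minimal sketch of that pattern in Node.js (writeAtomic and the file paths are illustrative names, not from the question):

const fs = require('fs');
const path = require('path');

// Writer: write to a temp file in the same directory, then rename over the target.
function writeAtomic(targetPath, data) {
  const tmpPath = path.join(
    path.dirname(targetPath),
    '.' + path.basename(targetPath) + '.tmp'
  );
  fs.writeFileSync(tmpPath, data);
  fs.renameSync(tmpPath, targetPath); // atomically replaces the old file
}

// Reader stays exactly as before; it always sees a complete file.
const contents = fs.readFileSync('/path/to/data.json', 'utf8');

Keep the temp file in the same directory as the target: the rename is only atomic within one filesystem, and fs.rename fails with EXDEV if source and destination are on different filesystems.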
