How to work with large files in continuous reading mode?
Good day!
There is a program that writes data to a file. Only the format of each line is known (for example, "record time;number;number;number"), so the line length is not fixed. The program does not write at regular intervals: it may write once per second or once every ten seconds. My task is to read the data from this file and insert it into a database. How can this be implemented? The file can be large, and I must write only new data; rows that have already been written should be ignored. When my program starts, any rows that have not yet been written to the database must be read from the file and inserted. Which direction should I dig in? So far I have no idea how this can be done.
Language: C++/Qt
Linux has tail -f. You can look at how it is implemented and do the same thing yourself, or simply run tail -f /path/to/file | myprogram and have your program read lines from stdin and write them to the database.
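A minimal sketch of the reading side, assuming the program is started as tail -f /path/to/file | myprogram and that every line follows the "record time;number;number;number" format from the question; storeInDatabase() here is just a hypothetical placeholder for the real INSERT:

```cpp
// Reads "record time;number;number;number" lines from stdin (e.g. piped in through
// tail -f) and hands them to a hypothetical storeInDatabase() placeholder.
#include <iostream>
#include <sstream>
#include <string>

static void storeInDatabase(const std::string &timestamp, double a, double b, double c)
{
    // Replace with a real INSERT (QSqlQuery, libpq, ...).
    std::cout << "insert: " << timestamp << " " << a << " " << b << " " << c << "\n";
}

int main()
{
    std::string line;
    while (std::getline(std::cin, line)) {          // blocks until tail -f delivers a new line
        std::istringstream fields(line);
        std::string timestamp, n1, n2, n3;
        if (std::getline(fields, timestamp, ';') &&
            std::getline(fields, n1, ';') &&
            std::getline(fields, n2, ';') &&
            std::getline(fields, n3, ';')) {
            storeInDatabase(timestamp, std::stod(n1), std::stod(n2), std::stod(n3));
        }
    }
    return 0;
}
```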
On Windows, it seems to me, it is worth digging towards fseek(). For example, store the timestamp of the last processed record, read the last 5-10% of the file via fseek, and insert only the entries that are newer. Clumsy, but it will work.
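A rough sketch of that idea, except that it remembers the byte offset of the last fully processed line between runs instead of a date (the sidecar file name data.log.offset and the processLine() helper are assumptions for illustration, not part of any existing API):

```cpp
// On each pass, seek to the previously saved offset and read only what was appended.
#include <cstdio>
#include <string>

static long loadOffset(const char *path)
{
    long off = 0;
    if (FILE *f = std::fopen(path, "r")) { std::fscanf(f, "%ld", &off); std::fclose(f); }
    return off;
}

static void saveOffset(const char *path, long off)
{
    if (FILE *f = std::fopen(path, "w")) { std::fprintf(f, "%ld", off); std::fclose(f); }
}

static void processLine(const std::string &line)
{
    // Parse "record time;number;number;number" here and do the actual INSERT.
    std::printf("new record: %s", line.c_str());
}

void readNewRecords(const char *dataPath, const char *offsetPath)
{
    FILE *f = std::fopen(dataPath, "r");
    if (!f)
        return;

    std::fseek(f, loadOffset(offsetPath), SEEK_SET);   // skip everything handled earlier

    char buf[4096];
    while (std::fgets(buf, sizeof buf, f)) {
        std::string line(buf);
        if (line.empty() || line.back() != '\n')
            break;                                     // incomplete last line, next run will get it
        processLine(line);
        saveOffset(offsetPath, std::ftell(f));         // persist progress after each full line
    }
    std::fclose(f);
}
```

Calling readNewRecords() periodically (or on startup) then picks up exactly the rows added since the previous call.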
Dig towards indexing: split the data into several files by recording time, for example.
Or store a "recording time -> offset inside the file" mapping somewhere separately.
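A minimal sketch of such a separate index, assuming the first field of every line is the recording time; the sidecar file name data.log.idx would be an assumption. With it you can later scan the small index and seek straight to the first record newer than the last one already in the database:

```cpp
// Scans the data file once and appends "recording time;byte offset" pairs to a
// separate index file, so a later run can look up a time and seek directly to it.
#include <fstream>
#include <string>

void buildIndex(const std::string &dataPath, const std::string &indexPath)
{
    std::ifstream data(dataPath);
    std::ofstream index(indexPath, std::ios::app);

    std::string line;
    std::streamoff lineStart = data.tellg();                 // byte position of the current line
    while (std::getline(data, line)) {
        const std::string timestamp = line.substr(0, line.find(';')); // first field is the time
        index << timestamp << ';' << lineStart << '\n';
        lineStart = data.tellg();                            // start of the next line
    }
}
```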
Yes, you could even use some kind of pipe: have the data written simultaneously (or in turn) both to the file and to your application.
Well, there is nothing wrong with a large file as such. In your case you only need to read the most recent records, i.e. not the whole file, only its end.
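Since the question mentions Qt, here is a sketch of reading only the tail with QFile::seek() plus QFileSystemWatcher, remembering the number of processed bytes in QSettings (the key lastOffset and the organisation/application names are assumptions). Note that QFileSystemWatcher is not guaranteed to emit fileChanged for every single append on all platforms, so a periodic QTimer re-check may be needed as a fallback:

```cpp
// Catch up on startup, then re-read only the appended part whenever the file changes.
#include <QCoreApplication>
#include <QDebug>
#include <QFile>
#include <QFileSystemWatcher>
#include <QSettings>

static void readNewLines(const QString &path)
{
    QSettings settings("MyCompany", "LogImporter");          // assumed organisation/app names
    qint64 offset = settings.value("lastOffset", 0).toLongLong();

    QFile file(path);
    if (!file.open(QIODevice::ReadOnly | QIODevice::Text))
        return;
    if (offset > file.size())
        offset = 0;                                          // file was truncated or replaced
    file.seek(offset);

    while (!file.atEnd()) {
        const QByteArray line = file.readLine();
        if (!line.endsWith('\n'))
            break;                                           // incomplete line, wait for the rest
        qDebug() << "insert into DB:" << line.trimmed();     // replace with a real QSqlQuery INSERT
        offset = file.pos();
    }
    settings.setValue("lastOffset", offset);
}

int main(int argc, char *argv[])
{
    QCoreApplication app(argc, argv);
    const QString path = "/path/to/file";                    // path taken from the question

    readNewLines(path);                                      // rows missed while we were down

    QFileSystemWatcher watcher;
    watcher.addPath(path);
    QObject::connect(&watcher, &QFileSystemWatcher::fileChanged,
                     [path](const QString &) { readNewLines(path); });

    return app.exec();
}
```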