C++ / C#
Dmitry Korolev, 2017-05-18 21:38:03

How can I process a large file without using a lot of memory?

I have a synchronous operation that iterates over a 100 MB file and processes it in a for loop. It looks like this:
https://pastebin.com/v83kvDRB
It seems everything was accounted for, but here is the problem: with each pass of the for loop the program's memory usage keeps growing, just as it did back when both the reading stream and the writing stream were inside the for loop. I'm going for speed with a synchronous operation, since the application doesn't need to stay responsive in the meantime.

1 answer
Alejandro Esquire, 2017-05-19
@adressmoeistranici

Didn't look at the link, sorry. Well, for example, text editors are written either to try to load the whole file into memory at once (which is why opening a multi-gigabyte file in such an editor is fraught), or in a smarter way, where only a small fragment is loaded into memory and the next one is loaded on demand. Also, you probably need...
