Programming
isvirin, 2013-06-13 01:25:02

Slowdowns when working with large files in Windows 7

There is a system that was ported from Linux to Windows: under Linux it ran perfectly well, but under Windows inexplicable phenomena began to appear.

The system stores its data in a large file on the file system; for definiteness, say the file is 30 GB. We structure this file internally ourselves, placing and reading data at offsets we manage. Under Linux the role of such a file was played by a block device, which we used directly at a low level, without any file system. Under Windows the customer really wanted us to create a file on his existing NTFS file system. This is where our troubles began...
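Roughly speaking, the difference between the two backends looks something like this; this is only a minimal sketch, and the device and file paths are made up for illustration, not taken from the real project:

```cpp
#ifdef _WIN32
#include <windows.h>

// Windows: the customer wants an ordinary file on an existing NTFS volume.
HANDLE open_storage()
{
    return CreateFileW(L"D:\\storage\\data.bin",   // hypothetical path
                       GENERIC_READ | GENERIC_WRITE,
                       0,                          // no sharing
                       nullptr,
                       OPEN_ALWAYS,                // create it on the first run
                       FILE_ATTRIBUTE_NORMAL,
                       nullptr);
}
#else
#include <fcntl.h>

// Linux: the same role was played by a raw block device, no file system at all.
int open_storage()
{
    return open("/dev/sdb1", O_RDWR);              // hypothetical device
}
#endif
```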

I should say right away that we place the file in locations where Windows 7 officially allows user applications to put it. We already stepped on the rake of being unable to put the file in the root of the system drive, licked our wounds, fixed it and moved on.

First, we created an ordinary huge file and tried to work with it in the ways available to us: either seeking inside it and performing normal read-write operations, or, alternatively, mapping the regions we need into memory and working with them directly. We ran into the following: if you try to work with a file the program has only just created, read-write operations on it hang for an indefinitely long time, while with mapping the writes do go through without blocking, but system memory consumption (not our process!) grows terribly; apparently Windows piles the data we write into some cache of its own before it reaches the HDD.
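To make the two approaches concrete, here is a minimal Win32 sketch of what is meant by "positioning plus read-write" and "mapping the regions we need"; the helper names are invented for illustration and are not our actual code:

```cpp
#include <windows.h>
#include <cstdint>

// Variant 1: position the file pointer and read through the handle.
bool read_at(HANDLE file, uint64_t offset, void* buf, DWORD size)
{
    LARGE_INTEGER pos;
    pos.QuadPart = static_cast<LONGLONG>(offset);
    if (!SetFilePointerEx(file, pos, nullptr, FILE_BEGIN))
        return false;
    DWORD read = 0;
    return ReadFile(file, buf, size, &read, nullptr) && read == size;
}

// Variant 2: map the region we need and work with it as ordinary memory.
// The offset must be a multiple of the allocation granularity (usually 64 KiB).
void* map_region(HANDLE file, uint64_t offset, SIZE_T size)
{
    HANDLE mapping = CreateFileMappingW(file, nullptr, PAGE_READWRITE, 0, 0, nullptr);
    if (!mapping)
        return nullptr;
    void* view = MapViewOfFile(mapping, FILE_MAP_READ | FILE_MAP_WRITE,
                               static_cast<DWORD>(offset >> 32),
                               static_cast<DWORD>(offset & 0xFFFFFFFF),
                               size);
    CloseHandle(mapping);   // the view keeps the mapping object alive
    return view;            // release later with UnmapViewOfFile(view)
}
```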

We managed to resolve the situation, or so it seemed, by creating sparse ("leaky") files. We really do create genuinely sparse files, which the Windows file properties dialog clearly confirms: the file size is 30 GB, yet it occupies only a few kilobytes on disk. All the problems seemed to be solved: the file is created quickly, work with it goes through both read-write and mapping, and no locks or memory growth are observed. Everything seemed fine, but we noticed an unpleasant phenomenon that has now become critical:

1. ...based on information read from this file).
2. As soon as we restart the program and it starts working with the already existing file, inexplicable slowdowns begin when working with the file. In reality, the very same read operations run dozens of times slower.
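For reference, creating such a sparse file through the Win32 API looks roughly like this; a minimal sketch, where the path and the 30 GB size are only examples:

```cpp
#include <windows.h>
#include <winioctl.h>

HANDLE create_sparse_file(const wchar_t* path, LONGLONG size_bytes)
{
    HANDLE file = CreateFileW(path, GENERIC_READ | GENERIC_WRITE, 0, nullptr,
                              CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, nullptr);
    if (file == INVALID_HANDLE_VALUE)
        return file;

    // Mark the file as sparse so unwritten regions take no disk space.
    DWORD bytes = 0;
    DeviceIoControl(file, FSCTL_SET_SPARSE, nullptr, 0, nullptr, 0, &bytes, nullptr);

    // Set the logical size without allocating the clusters.
    LARGE_INTEGER size;
    size.QuadPart = size_bytes;
    SetFilePointerEx(file, size, nullptr, FILE_BEGIN);
    SetEndOfFile(file);
    return file;
}

// Usage: create_sparse_file(L"D:\\storage\\data.bin", 30LL * 1024 * 1024 * 1024);
```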

What could it be? In which direction should we dig? What black magic should we cast?

Thanks in advance for your replies.


3 answer(s)
Andrey Kravchuk, 2013-06-13
@WhiteD

The thing is that when you open a file, write data to it and then read it back, you are actually reading from the cache. When you open the file a second time, you are reading that data from disk. Hence the time difference.
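One way to check whether the cache is indeed responsible is to open the file with FILE_FLAG_NO_BUFFERING, so every read really goes to the disk, and compare the timings of both runs; a minimal sketch, keeping in mind that with this flag offsets, buffer addresses and read sizes must be multiples of the volume sector size:

```cpp
#include <windows.h>

// Open the file bypassing the system cache (hypothetical helper).
HANDLE open_uncached(const wchar_t* path)
{
    return CreateFileW(path, GENERIC_READ, FILE_SHARE_READ, nullptr,
                       OPEN_EXISTING, FILE_FLAG_NO_BUFFERING, nullptr);
}
```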

Evgeny Kunitsyn, 2013-06-13
@LordMerlin

Perhaps you should still pay attention to the database.

v2v, 2013-06-14
@v2v

The described problems with the ordinary huge file are a little alarming.
Perhaps the slowdowns are related to mixing normal read-write operations with memory mappings of the same file.
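If the two paths really are mixed, one thing worth trying is to flush the mapped view explicitly before reading the same region back through the file handle; a minimal sketch, assuming `view`, `view_size` and `file` come from the question's mapping code:

```cpp
#include <windows.h>

void flush_mapped_writes(void* view, SIZE_T view_size, HANDLE file)
{
    FlushViewOfFile(view, view_size);  // write out the dirty pages of the view
    FlushFileBuffers(file);            // make sure they actually reach the disk
}
```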
