MySQL
Maxim Anarchistov, 2014-03-31 10:07:37

How to store a large (2400*1800) two-dimensional array in a database?

There are several two-dimensional arrays with dimensions of 2400*1800 and larger; the values are numbers 0-255. (Essentially it is a map like in Terraria; each value is the type of the tile at that point.) Reads from this storage must return, on request, a 100*100 sub-array from any part of the array.
The problem is that the simplest ways to store and query the 4+ million values per array (2400*1800 = 4,320,000 cells, roughly 4 MB at one byte per cell) are rather expensive in terms of performance.
The data is dynamic.
So several candidate solutions have come up:
1. Simply create a table for each array in MySQL (or PostgreSQL) and hope to the last that performance will not collapse.
2. Split the array into 100*100 sub-arrays and store each one as formatted text in a TEXT/BLOB column. How well this approach performs in practice is unclear.
3. Use SQLite and the like.
4. Use non-tabular storage, for example a plain file without any structured markup.
5. Use XML or something similar.
6. Something else.
So, which is better?


6 answers
rPman, 2014-03-31
@rPman

Your task is solved very effectively by a regular file on disk (opened with lazy writes disabled, or with a forced flush after each write). If your toolkit (the programming language and its libraries) allows it, open the file via memory mapping.
No index is needed for such queries, because a cell can be addressed trivially at offset (x + maxx * y). A 100*100 block request then turns into 100 quick reads of 100 bytes each. If it is justified, you can store 100*100 blocks instead of individual cells; then a read pulls in up to 4 times more data, but in only four reads. However, if the requested screen-sized block already fits into the read-ahead buffer of the operating system (the file system driver), this block layout gains you nothing.
Any other method will be either slow to write (for example, storing 100*100 blocks in the database instead of cells, so that a block request has to read the four neighboring blocks), or slow to read (one record per cell) and wasteful in disk space.
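
A minimal sketch of this approach, assuming the map is stored row-major as raw bytes in a flat file and that POSIX memory mapping is available; the file name, coordinates and sizes are illustrative, and error handling is kept short:

#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>
#include <unistd.h>

#define MAP_W 2400
#define MAP_H 1800
#define BLOCK 100

int main(void)
{
    int fd = open("map.bin", O_RDONLY);
    if (fd < 0)
        return 1;

    /* map the whole file; the OS pages in only the parts we touch */
    unsigned char *map = mmap(NULL, (size_t)MAP_W * MAP_H,
                              PROT_READ, MAP_PRIVATE, fd, 0);
    if (map == MAP_FAILED)
        return 1;

    /* fetch the 100x100 block whose top-left corner is (x0, y0):
       cell (x, y) lives at offset y * MAP_W + x, so the block is
       100 contiguous runs of 100 bytes each */
    long x0 = 300, y0 = 500;
    static unsigned char block[BLOCK][BLOCK];
    for (long row = 0; row < BLOCK; row++)
        memcpy(block[row], map + (y0 + row) * MAP_W + x0, BLOCK);

    printf("tile (%ld, %ld) = %d\n", x0, y0, block[0][0]);

    munmap(map, (size_t)MAP_W * MAP_H);
    close(fd);
    return 0;
}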

Denis Morozov, 2014-03-31
@morozovdenis

The idea with the file seems attractive to me.
Here's how I see it (I'll simplify to a 5*5 map):
create a file of 25 bytes filled with zeros (a setup sketch follows right after the code below),
and write a read/write function for a sub-array, let it be 2*2:

pseudo code, spelled out here as plain C:

#include <stdbool.h>
#include <stdio.h>

/* write a 2x2 block whose top-left corner is (start_x, start_y)
   into a map stored row-major in file f, width cells per row */
bool write_block(FILE *f, const unsigned char data[2][2],
                 long start_x, long start_y, long width)
{
        fseek(f, start_y * width + start_x, SEEK_SET);
        fwrite(data[0], 1, 2, f);                     /* write two elements */
        fseek(f, (start_y + 1) * width + start_x, SEEK_SET);
        fwrite(data[1], 1, 2, f);                     /* write two elements */
        return true;
}

/* read a 2x2 block into result (C cannot return arrays by value) */
void read_block(FILE *f, unsigned char result[2][2],
                long start_x, long start_y, long width)
{
        fseek(f, start_y * width + start_x, SEEK_SET);
        fread(result[0], 1, 2, f);                    /* read two elements */
        fseek(f, (start_y + 1) * width + start_x, SEEK_SET);
        fread(result[1], 1, 2, f);                    /* read two elements */
}
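
A possible setup and usage sketch to go with the functions above: it creates the 5*5 file of 25 zero bytes mentioned at the start, writes a 2*2 block at an arbitrary position and reads it back (the file name and sample values are illustrative):

/* uses write_block / read_block from the sketch above */
#include <stdio.h>

int main(void)
{
    /* create a 5x5 map: a file of 25 zero bytes */
    unsigned char zeros[5 * 5] = {0};
    FILE *f = fopen("map5x5.bin", "w+b");
    if (!f)
        return 1;
    fwrite(zeros, 1, sizeof zeros, f);

    /* write a 2x2 block at (1, 2) and read it back */
    const unsigned char out[2][2] = { {7, 8}, {9, 10} };
    unsigned char in[2][2];
    write_block(f, out, 1, 2, 5);
    read_block(f, in, 1, 2, 5);
    printf("%d %d / %d %d\n", in[0][0], in[0][1], in[1][0], in[1][1]);

    fclose(f);
    return 0;
}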

There is no protection against failures, though.
Also, look at NoSQL databases that can store arrays.

Sergey, 2014-03-31
@begemot_sun

IMHO, it depends on your workload. I would settle on the 100*100 blocks. And why do you need to store the map in a database at all? What kinds of queries do you need?

Vitaly Zheltyakov, 2014-03-31
@VitaZheltyakov

If the data is dynamic, or you need to search across cells (all of them), then simply make a 2400*1800 table in the database; MySQL should cope without slowdowns.
If the data is not dynamic and cell search is not needed, then a file is the better choice.

svd71, 2014-03-31
@svd71

And why doesn't an ordinary table with three fields suit you?

create table dimmension (
    dimIdx1 int not null,
    dimIdx2 int not null,
    val char
);
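
For illustration, a hypothetical sketch of fetching a 100*100 block from such a table via the MySQL C API; the connection parameters are placeholders, and in practice the table would also want a composite primary key on (dimIdx1, dimIdx2) so the range query can use an index:

#include <mysql/mysql.h>   /* header path may differ by platform */
#include <stdio.h>

int main(void)
{
    MYSQL *conn = mysql_init(NULL);
    if (!mysql_real_connect(conn, "localhost", "user", "password",
                            "gamedb", 0, NULL, 0)) {
        fprintf(stderr, "connect failed: %s\n", mysql_error(conn));
        return 1;
    }

    /* range query for the 100x100 block whose corner is (300, 500) */
    const char *sql =
        "SELECT dimIdx1, dimIdx2, val FROM dimmension "
        "WHERE dimIdx1 BETWEEN 300 AND 399 "
        "  AND dimIdx2 BETWEEN 500 AND 599";
    if (mysql_query(conn, sql)) {
        fprintf(stderr, "query failed: %s\n", mysql_error(conn));
        mysql_close(conn);
        return 1;
    }

    MYSQL_RES *res = mysql_store_result(conn);
    MYSQL_ROW row;
    while ((row = mysql_fetch_row(res)) != NULL)
        printf("(%s, %s) = %s\n", row[0], row[1], row[2]);  /* x, y, value */

    mysql_free_result(res);
    mysql_close(conn);
    return 0;
}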

Vitaly, 2014-04-11
@vipuhoff

Store the map as a bitmap image: 255*255*255 colors is enough to encode your object codes 100500 times over, and you read it back just like a normal picture. With the right approach even such an "array" of 25000 by 25000 will work faster than the database variant :)
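
A sketch of this idea, assuming the map has been saved as a grayscale image with one byte per tile and that the single-header stb_image library is available; the file name and coordinates are illustrative:

#define STB_IMAGE_IMPLEMENTATION
#include "stb_image.h"        /* single-header image loader */
#include <stdio.h>

int main(void)
{
    int w, h, channels;
    /* request 1 channel so every pixel is exactly the 0-255 tile code */
    unsigned char *map = stbi_load("map.png", &w, &h, &channels, 1);
    if (!map)
        return 1;

    /* the loaded image is just a flat array: tile (x, y) is map[y * w + x] */
    int x = 300, y = 500;
    printf("tile (%d, %d) = %d\n", x, y, map[y * w + x]);

    stbi_image_free(map);
    return 0;
}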
