SQL
Cornelian, 2017-04-10 16:46:40

What is the best way to store large tabular data on the server?

Good afternoon.
I've been tasked with building a site around a database of scientific measurement results.
The initial data is a set of CSV files. One column is a DateTime; the others are drawn from a known set. Each file has ~300,000 rows and ~150 columns (~400 MB), and the total volume is 15-20 GB.
On request, the server must return a generated CSV containing the selected date range and columns from the requested file.
Does it make sense to store this data in something like PostgreSQL instead of CSV files? How fast would a query returning ~100,000 rows and 100 columns be? Would the answer to the first question change if the total volume grew to 2 TB?
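The workflow described above (select a date range and a subset of columns, emit CSV) maps directly onto a SQL query. A minimal sketch of that flow, using Python's built-in SQLite as a lightweight stand-in for PostgreSQL; the table name, column names, and sample values are hypothetical:

```python
import csv
import io
import sqlite3

# Hypothetical mini-dataset standing in for one of the ~400 MB CSV files:
# a DateTime column plus a couple of measurement columns.
rows = [
    ("2017-01-01 00:00:00", 1.5, 20.1),
    ("2017-01-01 01:00:00", 1.7, 19.8),
    ("2017-01-02 00:00:00", 2.1, 21.3),
    ("2017-01-03 00:00:00", 2.4, 22.0),
]

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE measurements (ts TEXT PRIMARY KEY, temp REAL, pressure REAL)"
)
conn.executemany("INSERT INTO measurements VALUES (?, ?, ?)", rows)

# Select a date range and a subset of columns in one query.
selected = conn.execute(
    "SELECT ts, temp FROM measurements WHERE ts >= ? AND ts < ? ORDER BY ts",
    ("2017-01-01", "2017-01-03"),
).fetchall()

# Stream the result back out as CSV.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["DateTime", "temp"])
writer.writerows(selected)
print(buf.getvalue())
```

With PostgreSQL the query text would be essentially the same, and a driver cursor could stream the 100,000-row result set into the CSV writer without holding it all in memory.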


1 answer
dinegnet, 2017-04-10
@Cornelian

Of course you should use a database server here: PostgreSQL, for example.
The volumes you mention are simply trivial for it.
That said, plain CSV is fine as long as you only need to return contiguous chunks of a CSV file, rather than look up data scattered across the files.
When extracting data in a scattered fashion, a specialized DBMS is the better choice.
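The contiguous-vs-scattered distinction above comes down to indexing: a flat CSV can only be scanned front to back, while a DBMS can use a B-tree index on the timestamp to jump straight to the requested range. A small illustration of that effect, again using SQLite as a stand-in for PostgreSQL (where `CREATE INDEX` and `EXPLAIN` behave analogously); the table and index names are made up:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE measurements (ts TEXT, value REAL)")
conn.executemany(
    "INSERT INTO measurements VALUES (?, ?)",
    [(f"2017-01-{d:02d} 00:00:00", float(d)) for d in range(1, 31)],
)

# Without an index, a date-range predicate forces a full table scan,
# just like grepping through the whole CSV file.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM measurements WHERE ts >= '2017-01-10'"
).fetchall()

# With a B-tree index on the timestamp, the engine seeks directly
# to the start of the range and reads only the matching rows.
conn.execute("CREATE INDEX idx_ts ON measurements (ts)")
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM measurements WHERE ts >= '2017-01-10'"
).fetchall()

print(plan_before)  # full scan
print(plan_after)   # index search
```

On the volumes in the question (hundreds of thousands of rows per file), this is the difference between reading 400 MB per request and reading only the rows that end up in the response.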
