Web development
Pantene742, 2020-05-30 16:48:40

What is the best way to calculate statistics for large tables?

I am a front-end developer, but I would like to build this project full-stack. I will use Angular + Nest + ????(some_data_base). I need to select data by certain criteria (supplier country, port, and others) from a table that will hold a million or more records, and run calculations over that data. As I understand it, these calculations should be pushed into SQL as much as possible, so the database does the work instead of the application fetching raw rows. For specialized calculations I understand I will still have to do the work programmatically, but for something as simple as a sum over a field, I think calculating in the database is the ideal option.
Results for particular search criteria cannot be cached, since there are many filters (date from/to and so on). And this is not highload: a small number of users, so it should stay cheap.
* For example: the user needs a pie chart of supplier countries for a certain period, and there can be 100k supply records in that period. The chart is built by summing the delivered tonnage for each country.
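The pie-chart example above is exactly the case where one aggregate query does all the work. A minimal sketch, assuming a hypothetical flat `supplies(country, tons, shipped_on)` table (the names and schema are made up for illustration); SQLite is used here only because it is self-contained, and the same `SELECT` works unchanged in PostgreSQL:

```python
import sqlite3

# Hypothetical schema for illustration: one flat "supplies" table.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE supplies (
        country    TEXT NOT NULL,
        tons       REAL NOT NULL,
        shipped_on TEXT NOT NULL   -- ISO date, e.g. '2020-05-30'
    )
""")
conn.executemany(
    "INSERT INTO supplies VALUES (?, ?, ?)",
    [
        ("China",  120.0, "2020-01-10"),
        ("China",   80.0, "2020-02-15"),
        ("India",   50.0, "2020-01-20"),
        ("Brazil",  30.0, "2021-03-01"),  # outside the filter period
    ],
)

# The database sums per country; only a handful of
# (country, total) rows ever reach the application.
rows = conn.execute(
    """
    SELECT country, SUM(tons) AS total_tons
    FROM supplies
    WHERE shipped_on BETWEEN ? AND ?
    GROUP BY country
    ORDER BY total_tons DESC
    """,
    ("2020-01-01", "2020-12-31"),
).fetchall()
print(rows)  # [('China', 200.0), ('India', 50.0)]
```

Even if 100k rows match the period, the result set handed to the front end is one row per country, which is all the chart needs.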
I think PostgreSQL would be the best choice.
Please tell me how you would solve this problem. I have worked on the back end only a little; the hardest part of SQL for me was JOINs in queries. I don't know the best way to approach this.
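Since JOINs are the pain point: in a normalized schema the country would live on the supplier, and the JOIN just looks it up for each supply row before grouping. A sketch under assumed, made-up table names (`suppliers`, `supplies`), again in SQLite for self-containment:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Hypothetical normalized schema: country lives on the supplier,
    -- each supply row only references the supplier by id.
    CREATE TABLE suppliers (
        id      INTEGER PRIMARY KEY,
        name    TEXT NOT NULL,
        country TEXT NOT NULL
    );
    CREATE TABLE supplies (
        id          INTEGER PRIMARY KEY,
        supplier_id INTEGER NOT NULL REFERENCES suppliers(id),
        port        TEXT NOT NULL,
        tons        REAL NOT NULL,
        shipped_on  TEXT NOT NULL
    );
    INSERT INTO suppliers VALUES (1, 'Acme', 'China'), (2, 'Globex', 'India');
    INSERT INTO supplies VALUES
        (1, 1, 'Shanghai', 120.0, '2020-01-10'),
        (2, 1, 'Ningbo',    80.0, '2020-02-15'),
        (3, 2, 'Mumbai',    50.0, '2020-01-20');
""")

# The JOIN attaches each supply's country, then GROUP BY sums per country.
rows = conn.execute(
    """
    SELECT sup.country, SUM(s.tons) AS total_tons
    FROM supplies AS s
    JOIN suppliers AS sup ON sup.id = s.supplier_id
    WHERE s.shipped_on BETWEEN ? AND ?
    GROUP BY sup.country
    ORDER BY total_tons DESC
    """,
    ("2020-01-01", "2020-12-31"),
).fetchall()
print(rows)  # [('China', 200.0), ('India', 50.0)]
```

The extra filters (port, supplier, etc.) are just additional `AND` conditions in the `WHERE` clause, each bound as a parameter rather than concatenated into the SQL string.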


1 answer
Sergey Gornostaev, 2020-05-30
@Pantene742

from a table that will hold a million or more records

This is not big data. Any DBMS will cope with such volumes, even primitive SQLite.
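The claim is easy to check: a million synthetic rows in an in-memory SQLite database, aggregated over a one-month range, with an index on the date column to keep the range scan cheap. All names and data below are made up for the demonstration:

```python
import random
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE supplies (
        country    TEXT NOT NULL,
        tons       REAL NOT NULL,
        shipped_on TEXT NOT NULL
    )
""")

# One million synthetic supply records across five countries.
countries = ["China", "India", "Brazil", "Germany", "USA"]
random.seed(0)
conn.executemany(
    "INSERT INTO supplies VALUES (?, ?, ?)",
    (
        (
            random.choice(countries),
            random.uniform(1.0, 100.0),
            f"2020-{random.randint(1, 12):02d}-{random.randint(1, 28):02d}",
        )
        for _ in range(1_000_000)
    ),
)

# An index on the filtered column keeps date-range queries fast
# as the table grows.
conn.execute("CREATE INDEX idx_supplies_shipped_on ON supplies (shipped_on)")

rows = conn.execute("""
    SELECT country, SUM(tons) AS total_tons
    FROM supplies
    WHERE shipped_on BETWEEN '2020-06-01' AND '2020-06-30'
    GROUP BY country
""").fetchall()
print(len(rows), "countries aggregated over ~1/12 of a million rows")
```

On any modern machine this aggregation returns in well under a second, which supports the point: at this scale the choice of DBMS matters far less than indexing the columns you filter on.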
