SQL
last7, 2015-12-06 18:40:18

Which database should I choose for working with complex filters (WHERE conditions)?

Suppose there is an online store that collects information about customer behavior, and that data already amounts to 50M rows in the database (Postgres). The task is to see, in (near) real time, how many customers did such-and-such. For example: bought a TV, browsed the teapot catalog 5 times, has a particular last name, and last visited the site 4 months ago. In other words, aggregate the data by arbitrary filters that are not known in advance; a quick look at the spec showed there will be at least 10-12 conditions.
Ideas so far: build one huge table with all the data and then select from it with plain WHERE / GROUP BY; the table would be large, but JOINs could be avoided. There was also the idea of taking MongoDB and its aggregation pipeline, but people write that it is not that fast. A columnar database was suggested, but in my opinion there is not that much data, so it is hardly worth it.
Has anyone dealt with this? What technologies should I look at?
PS "Real time" here means 15 seconds at most.
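To make the "one wide table, plain WHERE / GROUP BY" idea from the question concrete, here is a minimal sketch in Python with an in-memory SQLite database standing in for Postgres. The table name, columns, and sample rows are invented for illustration only:

```python
import sqlite3

# In-memory SQLite stands in for Postgres; the schema is hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE customer_events (
        customer_id INTEGER,
        last_name   TEXT,
        event_type  TEXT,   -- e.g. 'purchase', 'catalog_view'
        category    TEXT,   -- e.g. 'tv', 'teapot'
        event_date  TEXT    -- ISO date string
    )
""")
conn.executemany(
    "INSERT INTO customer_events VALUES (?, ?, ?, ?, ?)",
    [
        (1, "Ivanov", "purchase",     "tv",     "2015-08-01"),
        (1, "Ivanov", "catalog_view", "teapot", "2015-08-02"),
        (2, "Petrov", "catalog_view", "teapot", "2015-11-30"),
    ],
)

# Ad-hoc filters become plain WHERE clauses on the wide table;
# GROUP BY gives per-customer counts in one pass, with no JOINs.
rows = conn.execute("""
    SELECT customer_id, COUNT(*) AS teapot_views
    FROM customer_events
    WHERE event_type = 'catalog_view' AND category = 'teapot'
    GROUP BY customer_id
    ORDER BY customer_id
""").fetchall()
print(rows)  # [(1, 1), (2, 1)]
```

With proper indexes (or partial indexes in Postgres) on the most common filter columns, this shape can stay within a 15-second budget on tens of millions of rows, though that depends heavily on the actual filters.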


3 answer(s)
Puma Thailand, 2015-12-06
@opium

Well, just let Postgres handle all of it.

romy4, 2015-12-06
@romy4

Is this a reports feature? If so, then each report gets its own aggregation table.
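A hedged sketch of what "its own aggregation table per report" could look like, again using SQLite and invented names: the raw events are rolled up once (for example by a periodic job) into a small summary table, and the report then reads only the roll-up instead of the 50M-row raw table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (customer_id INTEGER, category TEXT, event_date TEXT);
    INSERT INTO events VALUES
        (1, 'tv',     '2015-12-01'),
        (2, 'tv',     '2015-12-01'),
        (1, 'teapot', '2015-12-02');

    -- One aggregation table per report: here, a daily roll-up by category.
    -- In Postgres this could be a materialized view refreshed on a schedule.
    CREATE TABLE report_daily_by_category AS
        SELECT event_date, category, COUNT(*) AS n
        FROM events
        GROUP BY event_date, category;
""")

# The report queries the small roll-up, not the raw event table.
rows = conn.execute("""
    SELECT event_date, category, n
    FROM report_daily_by_category
    ORDER BY event_date, category
""").fetchall()
print(rows)  # [('2015-12-01', 'tv', 2), ('2015-12-02', 'teapot', 1)]
```

The trade-off: queries become very fast, but the roll-up only answers the filter combinations it was built for, which clashes with the "filters not known in advance" requirement.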

Sergey, 2015-12-06
@begemot_sun

You probably need some kind of OLAP cube.
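For a rough idea of what an OLAP cube does, here is a toy Python sketch (data and dimension names are invented): it precomputes counts for every combination of dimension values, with `*` meaning "all", so any filter combination is answered by a single lookup instead of a scan:

```python
from collections import Counter
from itertools import combinations

# Toy fact rows with two dimensions: (category, region).
facts = [("tv", "eu"), ("tv", "us"), ("teapot", "eu")]
dims = ("category", "region")

# Build the cube: for each fact, count it under every roll-up key,
# where a dimension is either kept or replaced by '*' (= "all values").
cube = Counter()
for row in facts:
    for r in range(len(dims) + 1):
        for kept in combinations(range(len(dims)), r):
            key = tuple(row[i] if i in kept else "*" for i in range(len(dims)))
            cube[key] += 1

print(cube[("tv", "*")])   # TVs across all regions -> 2
print(cube[("*", "eu")])   # all events in the EU region -> 2
print(cube[("*", "*")])    # grand total -> 3
```

Real OLAP engines do this at scale with smarter storage, but the principle is the same: pay the aggregation cost up front so that slicing by any dimension combination is cheap at query time.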
