Which database to choose to work with complex filters (WHERE conditions)?
Suppose there is an online store that collects data on customer behavior, and that data already amounts to 50M rows in the database (Postgres). The task is to see, in (near) real time, how many customers did such-and-such. For example: bought a TV, viewed the kettle catalog 5 times, has a certain last name, and last visited the site 4 months ago. In other words, aggregate the data by arbitrary filters that are not known in advance; a quick look at the requirements spec showed there will be at least 10-12 conditions.
Various ideas came up: build one huge table with all the data and then select with plain WHERE / GROUP BY; the table would be large, but it would eliminate JOINs. There was also the idea of taking MongoDB and its aggregation pipeline, but people write that it is not that fast. I was advised to take a columnar database, but in my view there is not that much data, so it is hardly worth it.
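The wide-table idea above boils down to pre-joining everything into one row per customer and composing the WHERE clause from whatever filters the user picks. A minimal sketch of that approach (in Python with an in-memory SQLite table; all column names here are hypothetical, not from the original question):

```python
import sqlite3

# Hypothetical wide "customer facts" table: one row per customer,
# pre-joined so queries need only WHERE / GROUP BY, no JOINs.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE customer_facts (
        customer_id        INTEGER PRIMARY KEY,
        bought_tv          INTEGER,  -- 0/1 flag
        kettle_views       INTEGER,  -- times the kettle catalog was viewed
        last_name          TEXT,
        months_since_visit INTEGER
    )
""")
conn.executemany(
    "INSERT INTO customer_facts VALUES (?, ?, ?, ?, ?)",
    [
        (1, 1, 5, "Ivanov", 4),
        (2, 0, 2, "Petrov", 1),
        (3, 1, 7, "Sidorov", 4),
    ],
)

def count_customers(filters):
    """Compose a WHERE clause from ad-hoc (column, op, value) filters.

    Columns and operators are whitelisted and values are bound as
    parameters, so arbitrary user-chosen filters stay safe.
    """
    allowed_cols = {"bought_tv", "kettle_views", "last_name",
                    "months_since_visit"}
    allowed_ops = {"=", "!=", ">", ">=", "<", "<="}
    clauses, params = [], []
    for column, op, value in filters:
        if column not in allowed_cols or op not in allowed_ops:
            raise ValueError(f"filter not allowed: {column} {op}")
        clauses.append(f"{column} {op} ?")
        params.append(value)
    sql = "SELECT COUNT(*) FROM customer_facts"
    if clauses:
        sql += " WHERE " + " AND ".join(clauses)
    return conn.execute(sql, params).fetchone()[0]

# The example from the question: bought a TV, viewed the kettle
# catalog at least 5 times, last visited 4 months ago.
print(count_customers([
    ("bought_tv", "=", 1),
    ("kettle_views", ">=", 5),
    ("months_since_visit", "=", 4),
]))  # -> 2
```

On 50M rows in Postgres the same pattern applies unchanged; whether it fits the 15-second budget then depends on indexing (partial/multicolumn indexes on the hot filter columns) rather than on the query shape.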
If you have dealt with something similar, which technologies should I look at?
P.S. "Real time" here means 15 seconds at most.