How can Postgresql achieve the performance of ElasticSearch?
There is an analytics system that receives a large volume of metrics. ElasticSearch currently handles this, and it works very fast, especially given the amount of data. Now the system needs to be extended, and a relational model clearly suggests itself. Mixing the two is of course possible, but if we could get by with a single system, that would be great.
What are the tasks?
Aggregate data as close to real time as possible, at volumes of 1,500,000 - 2,000,000 records per day; most often the data is needed for a day, a week, a month, or 60 days. Build reports on top of this data - graphs and tables. Filter by a large number of fields - 20-40 now, 100 or more in the future.
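For context, the workload described above could be modeled in PostgreSQL roughly like this. This is a minimal sketch, not a recommendation: the table, column, and label names are all assumptions, and a real schema would depend on the actual metrics.

```sql
-- A hypothetical metrics table: a timestamp, a metric name, a value,
-- and a JSONB column holding the long tail of filterable fields.
CREATE TABLE metrics (
    ts          timestamptz      NOT NULL,
    metric_name text             NOT NULL,
    value       double precision NOT NULL,
    labels      jsonb            NOT NULL DEFAULT '{}'
) PARTITION BY RANGE (ts);

-- One partition per day keeps day/week/month/60-day scans and
-- retention (dropping old partitions) cheap.
CREATE TABLE metrics_2024_01_01 PARTITION OF metrics
    FOR VALUES FROM ('2024-01-01') TO ('2024-01-02');

-- BRIN is very compact for append-only, time-ordered data;
-- a GIN index supports arbitrary key/value filters on labels.
CREATE INDEX ON metrics USING brin (ts);
CREATE INDEX ON metrics USING gin (labels);

-- Example report query: hourly aggregates for one day, filtered
-- by two of the label fields.
SELECT date_trunc('hour', ts) AS hour, avg(value) AS avg_value
FROM metrics
WHERE ts >= '2024-01-01' AND ts < '2024-01-02'
  AND labels @> '{"region": "eu", "status": "ok"}'
GROUP BY 1
ORDER BY 1;
```

The JSONB-plus-GIN approach is one way to handle "20-40 fields, later 100+" without adding a column per field; dedicated columns with B-tree indexes would be faster for the handful of fields that are always filtered on.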
The question is: is this feasible, and if so, how? I have no experience with Postgres. It seems like a huge monster in which you can do anything, but you need to know how. In which direction should I dig, which "engines" should I choose, and so on?