PostgreSQL
Artem Smirnov, 2017-10-26 21:57:32

What architectural solution is suitable for storing and working with geo-coordinate data under high loads?

Good day to all!
I'm working on a high-load system in which a significant part of the data is geographic coordinates (latitude/longitude pairs in degrees, with a precision of 0.00001 degrees).
The project is designed according to microservice architecture.
The system has several tables (entities) for which GPS tracking is organized; geocoordinates are written to tables associated with these entities (currently there are 2 such entities). Insert requests for geocoordinates arrive at about 5 million per minute for each entity.
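For context, a tracking table looks roughly like this (the names and the exact set of columns are made up for illustration, not the real schema):

    CREATE TABLE entity_positions (
        id          bigserial   PRIMARY KEY,
        entity_id   bigint      NOT NULL,
        recorded_at timestamptz NOT NULL DEFAULT now(),
        -- geography(Point, 4326) stores lon/lat in degrees (WGS 84)
        position    geography(Point, 4326) NOT NULL
    );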
The current task is to read statistics on several indicators, grouped by geocoordinates. The statistics must also be filterable by every field, including by geocoordinates and by radius in km around a given point.
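In other words, the kind of filter I mean is roughly the following (entity_positions, the coordinates and the radius are just placeholders):

    -- all points within 5 km of a given coordinate;
    -- ST_DWithin on geography takes the radius in meters and can use a GiST index
    SELECT *
    FROM entity_positions
    WHERE ST_DWithin(
        position,
        ST_SetSRID(ST_MakePoint(37.61556, 55.75222), 4326)::geography,  -- lon, lat of the reference point
        5000
    );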
Question #1: what is the best way to organize the calculation and storage of such statistics?
At the moment, each statistics "report" has its own table. The database is PostgreSQL with the PostGIS extension.
The statistical "report" tables have GiST indexes, which make it possible to quickly filter rows by distance from a given geocoordinate (Index Scan).
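The indexes are created in the usual way, roughly like this (report_geo_stats and geo are placeholder names):

    CREATE INDEX report_geo_stats_geo_gist
        ON report_geo_stats USING gist (geo);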
The data in the statistical "report" tables is recalculated on a timer.
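To make that concrete, one recalculation pass is conceptually something like this (the grouping into ~0.01-degree grid cells and the table/column names are only an illustration of "grouped by geocoordinates", not the actual report logic):

    BEGIN;
    TRUNCATE report_geo_stats;
    INSERT INTO report_geo_stats (geo, points)
    SELECT ST_SnapToGrid(position::geometry, 0.01) AS geo,  -- snap to a ~1 km grid cell
           count(*)                                AS points
    FROM entity_positions
    GROUP BY 1;
    COMMIT;

In reality the aggregation is more complex, but the pattern is the same: a full recalculation on a timer into a separate, GiST-indexed table.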
Question #2: is the current solution adequate, and can we keep working with it going forward?
Thanks in advance!
