Best practice for geoservice?
I got a task: a portal with tracking. The task is fairly well-worn, but new to me.
Are there any established practices for this? I'm mainly interested in the choice of database and the overall architecture.
The estimated maximum is about 700 thousand records per day.
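For a sense of scale, 700 thousand records per day is a fairly modest write rate. A quick back-of-envelope check (plain arithmetic; the 10x peak factor is my own rough assumption, not a figure from the question):

```python
# Back-of-envelope write rate for 700,000 tracker records per day.
RECORDS_PER_DAY = 700_000
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

avg_writes_per_sec = RECORDS_PER_DAY / SECONDS_PER_DAY
print(f"average: {avg_writes_per_sec:.1f} writes/sec")  # ~8.1 writes/sec

# Traffic is rarely uniform across the day; a 10x peak allowance
# (an assumption, not data) still stays well under 100 writes/sec.
peak_writes_per_sec = avg_writes_per_sec * 10
print(f"assumed 10x peak: {peak_writes_per_sec:.0f} writes/sec")
```

Practically any mainstream database handles this rate comfortably, so the choice is driven more by the query side (history search, geo queries) than by write throughput.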
I did something similar, though on .NET. There are no particular problems on the server side apart from the web part itself: there was a service that listened to the trackers, wrote their messages to the database, and pushed each message to the client page over WebSockets. In the database, besides the standard user tables, there were a couple of extra tables: one for trackers and one for their messages, with trackers tied to users.
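The schema described above can be sketched roughly like this (a minimal illustration using SQLite; all table, column, and sample values are my own invention, not taken from the original .NET service):

```python
import sqlite3

# Minimal sketch of the schema described above: standard users table,
# plus trackers tied to users and messages tied to trackers.
# All names and values are illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE users (
    id   INTEGER PRIMARY KEY,
    name TEXT NOT NULL
);
CREATE TABLE trackers (
    id      INTEGER PRIMARY KEY,
    user_id INTEGER NOT NULL REFERENCES users(id),
    imei    TEXT NOT NULL UNIQUE          -- hardware identifier
);
CREATE TABLE messages (
    id         INTEGER PRIMARY KEY,
    tracker_id INTEGER NOT NULL REFERENCES trackers(id),
    lat        REAL NOT NULL,
    lon        REAL NOT NULL,
    ts         TEXT NOT NULL              -- ISO-8601 timestamp
);
-- History queries are per tracker and time-ordered, so index on both.
CREATE INDEX idx_messages_tracker_ts ON messages (tracker_id, ts);
""")

# A tracker sends a position; the service writes it to the database and
# could then push the same message to subscribed clients over WebSockets.
conn.execute("INSERT INTO users (id, name) VALUES (1, 'alice')")
conn.execute("INSERT INTO trackers (id, user_id, imei) VALUES (1, 1, '350000000000001')")
conn.execute(
    "INSERT INTO messages (tracker_id, lat, lon, ts) VALUES (?, ?, ?, ?)",
    (1, 55.7558, 37.6173, "2024-01-01T12:00:00Z"),
)

row = conn.execute(
    "SELECT lat, lon FROM messages WHERE tracker_id = ?", (1,)
).fetchone()
print(row)  # (55.7558, 37.6173)
```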
The main trouble is the web part: making it interactive, with a genuinely usable interface and enough features for the user to see more than just markers on a map. You want the ability to define a polygon and get alerts on entering/leaving it or on approaching key points, convenient storage of and search through the history, drawing a clean trajectory, and a pile of other things besides.
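The polygon enter/exit alerts boil down to a point-in-polygon test on each incoming position plus remembering the previous state. A minimal sketch using the standard ray-casting algorithm (the fence coordinates are made up; for small fences, treating lat/lon as planar coordinates is an acceptable approximation):

```python
def point_in_polygon(lat, lon, polygon):
    """Ray-casting test: count edge crossings of a horizontal ray from the point.

    `polygon` is a list of (lat, lon) vertices of a simple polygon, small
    enough that treating the coordinates as planar is acceptable.
    """
    inside = False
    n = len(polygon)
    for i in range(n):
        lat1, lon1 = polygon[i]
        lat2, lon2 = polygon[(i + 1) % n]
        # Does this edge cross the horizontal line through our latitude?
        if (lat1 > lat) != (lat2 > lat):
            cross_lon = lon1 + (lat - lat1) * (lon2 - lon1) / (lat2 - lat1)
            if lon < cross_lon:
                inside = not inside
    return inside

def geofence_event(was_inside, lat, lon, polygon):
    """Return ('enter' | 'exit' | None, new_state) given the previous state."""
    now_inside = point_in_polygon(lat, lon, polygon)
    if now_inside and not was_inside:
        return "enter", now_inside
    if was_inside and not now_inside:
        return "exit", now_inside
    return None, now_inside

# Made-up rectangular geofence; first point arrives inside it.
fence = [(55.70, 37.55), (55.70, 37.70), (55.80, 37.70), (55.80, 37.55)]
event, state = geofence_event(False, 55.75, 37.62, fence)
print(event)  # enter
```

In production you would keep the per-tracker `was_inside` state server-side and fire the alert (email, push, WebSocket message) on the transition, not on every point.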
In half a day I assembled the following service: https://github.com/cutecare/gas-map . Under the hood:
1. Elasticsearch accepts the data (JSON over POST) and feeds it both to the map and to analytics (Kibana).
2. Yandex Maps renders the data pulled from Elasticsearch as circles on the map.
It works very quickly; 700 thousand records per day is child's play for Elasticsearch. There are plenty of alternatives, of course.
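Feeding such a service is just HTTP POSTs of JSON documents into an Elasticsearch index. A sketch of what a single tracker point might look like (the index name and field names here are my assumptions, not taken from gas-map):

```python
import json
from datetime import datetime, timezone

# Sketch of one tracker point as an Elasticsearch document.
# Index and field names are illustrative; gas-map may use different ones.
def make_point_doc(tracker_id, lat, lon, ts=None):
    return {
        "tracker_id": tracker_id,
        # Elasticsearch's geo_point field type accepts an object
        # with explicit "lat" and "lon" keys.
        "location": {"lat": lat, "lon": lon},
        "@timestamp": (ts or datetime.now(timezone.utc)).isoformat(),
    }

doc = make_point_doc("truck-17", 55.7558, 37.6173)
print(json.dumps(doc, indent=2))

# Shipping it would be a plain POST, e.g. with the `requests` library
# (assuming Elasticsearch on its default port and an index named "tracks"):
#   requests.post("http://localhost:9200/tracks/_doc", json=doc)
```

With the `location` field mapped as `geo_point`, both the map overlay and Kibana's geo visualizations can query it directly.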