PHP
Ivan Palamarchuk, 2017-10-29 23:45:56

Delayed saving of a large amount of statistics?

Hello!
I need help with this. I understand that the current implementation is about as inefficient as it gets, and I'd like to approach the problem from another angle.
The service collects simple user-activity statistics: we save user_id, app_id (web / mobile / ios / android, etc.) and the date. Each "mark" is saved only once, enforced by a unique index; to avoid hitting the DBMS needlessly, we also set a flag in Redis recording that the mark has already been written. In the background, we update the statistics every 3-5 minutes (saving the already aggregated data into another database).
The crux of the matter: user activity has grown recently, and it often happens that Redis has not yet saved the flag by the time 2-3 write attempts for the same mark come in. A possible solution is to buffer this data somewhere and flush the whole pool to the database once every 1-2 minutes, but what is a sane way to do that, given more than 1 million unique users per day?
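The buffering idea described above can be sketched as follows. This is a hypothetical illustration in plain Python: the in-memory `set` stands in for a Redis set (which would be filled via `SADD` in production), and `flush()` stands in for the periodic batch `INSERT` into the DBMS; the names `track`/`flush` are made up for the example.

```python
# Sketch only: buffer (user_id, app_id, date) marks in a dedup set,
# then flush the whole batch to the DBMS every 1-2 minutes.
buffer = set()          # stands in for a Redis set (SADD)
flushed_batches = []    # stands in for batched INSERTs into the DB

def track(user_id, app_id, day):
    # O(1) in-memory write; no DBMS round-trip per hit,
    # and repeat hits from the same user deduplicate for free
    buffer.add((user_id, app_id, day))

def flush():
    # one multi-row INSERT instead of one INSERT per hit
    if buffer:
        flushed_batches.append(sorted(buffer))
        buffer.clear()

# the same user hitting the service three times produces one mark
for _ in range(3):
    track(42, "web", "2017-10-29")
track(43, "ios", "2017-10-29")
flush()
print(len(flushed_batches[0]))  # 2 unique marks in the batch
```

The point of the batch is that duplicate write attempts are absorbed in memory, so the 2-3 concurrent writes per mark never reach the database.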


1 answer(s)
terrier, 2017-10-30

A million users per day is about 11 requests per second, which is pocket change for PostgreSQL on even halfway decently configured hardware; you don't need any Redis.
Fine, users arrive unevenly over the day and some of the requests are duplicate inserts, but in any case a simple
INSERT ... ON CONFLICT DO NOTHING
is absolutely enough for you.
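As a runnable illustration of the answer's suggestion, here is the clause in action against SQLite (version 3.24+ supports the same `ON CONFLICT DO NOTHING` upsert syntax as PostgreSQL, so the statement carries over verbatim). The `activity` table and its columns are a hypothetical schema matching the question's description.

```python
import sqlite3

# Hypothetical schema for the question's activity marks; the unique
# index is what the conflict clause resolves against.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE activity (
        user_id INTEGER NOT NULL,
        app_id  TEXT    NOT NULL,
        day     TEXT    NOT NULL,
        UNIQUE (user_id, app_id, day)
    )
""")

sql = "INSERT INTO activity VALUES (?, ?, ?) ON CONFLICT DO NOTHING"
# duplicate inserts are silently skipped instead of raising an error
for _ in range(3):
    conn.execute(sql, (42, "web", "2017-10-29"))

count = conn.execute("SELECT COUNT(*) FROM activity").fetchone()[0]
print(count)  # 1
```

With this clause the database itself handles the 2-3 concurrent write attempts per mark, so the Redis dedup flag becomes unnecessary.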
