Redis

Alexander Savchuk, 2016-03-10 08:41:39

How to organize deferred processing of statistics?

Hello.
There is a simple system for recording link click statistics. After performing a redirect, the frontend RPUSHes the click data (LinkID, session, timestamp) onto a key named "clicks_queue". Every 5 seconds a worker wakes up, extracts all the clicks from the queue with LRANGE clicks_queue 0 -1, and writes the changes, grouped by link, to the database. Under light load everything is fine, but as soon as the load increases, the logs and the statistics start to diverge. I suspect that while the worker is reading data from the queue, the frontend keeps adding more clicks; when the worker has retrieved all the elements and clears the queue with LTRIM, the newly added data is deleted as well. What am I doing wrong, and what is the best way to organize deferred processing of statistics?
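
(For reference, the worker loop is roughly the following; a minimal sketch assuming a Python worker with redis-py, where store_grouped_by_link() is a hypothetical stand-in for the real DB write:)

# Minimal sketch of the current worker, assuming redis-py.
import time
import redis

r = redis.Redis()

def store_grouped_by_link(clicks):
    ...  # hypothetical: group the raw clicks by LinkID and write to the DB

while True:
    clicks = r.lrange("clicks_queue", 0, -1)  # read all queued clicks
    store_grouped_by_link(clicks)
    # Race: anything the frontend RPUSHes after the LRANGE above
    # is wiped here together with the already-processed entries.
    r.ltrim("clicks_queue", 1, 0)  # LTRIM with start > end empties the list
    time.sleep(5)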
Thanks in advance.


2 answers
Dimonchik, 2016-03-10
@dimonchik2013

Look into a dedicated queue server.
I don't know the details of how RPUSH behaves under load, but in Python this kind of job is usually handled by ZeroMQ, or Gearman at worst.

@mgyk, 2016-03-10

Pull data and clear the list inside a transaction:
redis.io/topics/transactions

MULTI
LRANGE clicks_queue 0 -1
DEL clicks_queue
EXEC
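
In application code the same transaction can be expressed through a pipeline. A minimal sketch with Python and redis-py (the client library is an assumption; the key name comes from the question):

import redis

r = redis.Redis()

def drain_clicks():
    # transaction=True wraps the queued commands in MULTI/EXEC, so the
    # read and the delete execute atomically on the server.
    pipe = r.pipeline(transaction=True)
    pipe.lrange("clicks_queue", 0, -1)
    pipe.delete("clicks_queue")
    clicks, _ = pipe.execute()
    return clicks

Since the block is atomic, any click the frontend pushes during the drain lands in the fresh, empty list and is picked up on the next run instead of being lost.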
