MySQL
Dima Barsukov, 2015-05-15 14:26:58

How to avoid duplicate rows in MySQL under concurrent queries?

The problem is this: we need to log the UserAgent of each request in real time, so post-factum analysis of the server logs is not an option. There is a table uag with two fields, id and value. When we encounter a new UserAgent, we insert it into the table, it gets an id, and we write that id to the log.
The trouble is that several concurrent requests can sometimes arrive from the same client, in which case the "does this row already exist?" check fails and we end up with duplicates. A unique index/constraint on the value would solve this, but the maximum key length won't allow it (UserAgents can be quite long), and LOCK TABLES tends to cause assorted silly errors.
So far the only idea is a mutex table, but that seems overcomplicated. Are there any workable options?


2 answers
Denormalization, 2015-05-15
@Denormalization

If you can't put a unique index on the UA itself, why not make its hash unique instead? Add one more column, store the hash of the UA there, and put the unique index on that column.
Any available hash function will do.
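A minimal sketch of that idea, using SQLite in place of MySQL (table and column names are illustrative). In MySQL you would store something like UNHEX(SHA2(value, 256)) in a BINARY(32) column with a UNIQUE index, and use INSERT IGNORE instead of INSERT OR IGNORE:

```python
import hashlib
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE uag (
        id    INTEGER PRIMARY KEY AUTOINCREMENT,
        value TEXT NOT NULL,
        hash  BLOB NOT NULL UNIQUE  -- SHA-256 of value: fixed-size, indexable
    )
""")

def uag_id(conn, value):
    """Return the id for a UserAgent, inserting it if it is new."""
    h = hashlib.sha256(value.encode()).digest()
    # The insert is a no-op on a duplicate hash, so concurrent writers
    # cannot create two rows for the same UserAgent.
    conn.execute("INSERT OR IGNORE INTO uag (value, hash) VALUES (?, ?)",
                 (value, h))
    return conn.execute("SELECT id FROM uag WHERE hash = ?",
                        (h,)).fetchone()[0]

ua = "Mozilla/5.0 (X11; Linux x86_64)"
first = uag_id(conn, ua)
second = uag_id(conn, ua)  # same UA again: no duplicate row is created
```

The unique index on the hash is what actually closes the race: even if two requests pass the "not found" check simultaneously, only one insert succeeds.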

Puma Thailand, 2015-05-15
@opium

Use an upsert (INSERT ... ON DUPLICATE KEY UPDATE) and everything will be fine —
with a unique key on the user agent, of course.
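A sketch of the upsert variant, again on SQLite with illustrative names (this still assumes the hash column from the previous answer, since the unique key has to fit in an index). The rough MySQL equivalent is INSERT ... ON DUPLICATE KEY UPDATE id = LAST_INSERT_ID(id), which also hands back the existing row's id:

```python
import hashlib
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE uag (
        id    INTEGER PRIMARY KEY AUTOINCREMENT,
        value TEXT NOT NULL,
        hash  BLOB NOT NULL UNIQUE
    )
""")

def upsert_uag(conn, value):
    """Race-free get-or-create: one upsert, then read back the id."""
    h = hashlib.sha256(value.encode()).digest()
    # On a duplicate key the upsert touches the existing row instead of
    # failing; the SELECT then returns whichever row "won", ours or a
    # concurrent writer's.
    conn.execute(
        "INSERT INTO uag (value, hash) VALUES (?, ?) "
        "ON CONFLICT(hash) DO UPDATE SET value = excluded.value",
        (value, h),
    )
    return conn.execute("SELECT id FROM uag WHERE hash = ?",
                        (h,)).fetchone()[0]

a = upsert_uag(conn, "UA-1")
b = upsert_uag(conn, "UA-1")  # duplicate request: same id, one row
```

Compared to INSERT IGNORE, the upsert form never silently swallows other errors, which makes it the safer default for this kind of get-or-create.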
