What database should I use for a large number of writes, reads, and tables?
Hello. I need to implement visit statistics and then read them back.
Per day there are ~750,000 calls to the script, with 13 SELECT and UPDATE queries against 2 tables.
The queries are simple:
SELECT count(*) as users_more_3 FROM `users` WHERE date=? AND script_show>2 LIMIT 1
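To make the workload concrete, here is a minimal sketch of that query against SQLite (standing in for the real database; the schema and sample data are assumptions inferred from the query above). A composite index on (date, script_show) lets the count be answered from the index alone:

```python
import sqlite3

# In-memory stand-in for the real database; schema inferred from the query.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (date TEXT, script_show INTEGER)")
# Composite index covering both WHERE conditions of the count query.
conn.execute("CREATE INDEX idx_users_date_show ON users (date, script_show)")

# Hypothetical sample data.
conn.executemany(
    "INSERT INTO users (date, script_show) VALUES (?, ?)",
    [("2012-05-01", 1), ("2012-05-01", 3), ("2012-05-01", 5)],
)

# The query from the question: visitors with more than 2 views on a given day.
(users_more_3,) = conn.execute(
    "SELECT count(*) FROM users WHERE date = ? AND script_show > 2",
    ("2012-05-01",),
).fetchone()
print(users_more_3)  # 2
```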
Too few specifics.
Look at redis.io/commands/INCR ,
or do the caching by hand,
or use MySQL itself,
or PHP with memcached,
or nginx-level tools.
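The Redis INCR suggestion above can be sketched as a per-day counter key. This is only an illustration: the key naming scheme is my own assumption, and FakeRedis is an in-memory stand-in exposing the same incr() method as the real redis-py client:

```python
from collections import defaultdict
from datetime import date

class FakeRedis:
    """In-memory stand-in for redis.Redis; the real client has the same incr()."""
    def __init__(self):
        self.store = defaultdict(int)

    def incr(self, name, amount=1):
        self.store[name] += amount
        return self.store[name]

def record_visit(r, script_id, day=None):
    # One counter key per script per day, e.g. "visits:13:2012-05-01"
    # (hypothetical naming scheme).
    day = day or date.today().isoformat()
    return r.incr(f"visits:{script_id}:{day}")

r = FakeRedis()  # swap for redis.Redis(host="localhost") in production
record_visit(r, 13, "2012-05-01")
record_visit(r, 13, "2012-05-01")
print(record_visit(r, 13, "2012-05-01"))  # 3
```

Reading the daily total back is then a single GET per key instead of a count over raw rows.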
If MySQL is a hard requirement, try HandlerSocket : it speeds up both writes and reads accordingly, but requires learning a new API.
A short description with links is in my other answer to a related question: Question about connecting a caching system?
For this query you can keep one pre-aggregated table row per day instead of counting raw rows on every read.
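A minimal sketch of that one-row-per-day idea, using SQLite as a stand-in (table and column names are assumptions): each qualifying visit bumps the day's counter, so the daily read becomes a single-row lookup:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# One pre-aggregated row per day instead of counting raw rows on every read.
conn.execute(
    "CREATE TABLE daily_stats ("
    "  day TEXT PRIMARY KEY,"
    "  users_more_3 INTEGER NOT NULL DEFAULT 0)"
)

def bump(day):
    # Create the day's row if missing, then increment its counter.
    conn.execute("INSERT OR IGNORE INTO daily_stats (day) VALUES (?)", (day,))
    conn.execute(
        "UPDATE daily_stats SET users_more_3 = users_more_3 + 1 WHERE day = ?",
        (day,),
    )

for _ in range(3):
    bump("2012-05-01")

(n,) = conn.execute(
    "SELECT users_more_3 FROM daily_stats WHERE day = ?", ("2012-05-01",)
).fetchone()
print(n)  # 3
```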
In general, if you have many parameters and need to store everything, you can simply append the events linearly to a file: a single server with buffered writes can log on the order of 10 million records per second. Worker processes then slowly rake through these files and put the final numbers into, say, Redis for fast reads, while the raw data is packed into archives in case full analytics is ever needed.
As a result, the database will not be the bottleneck; the server will instead run into HTTP request parsing if you use PHP, Python, etc., since that tops out at around 100k requests per second. Still, PHP will cope with your volumes.
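A rough sketch of the append-then-rake pipeline described above (the file format and all names are my assumptions): the hot path only appends one line, and a worker later folds the file into per-day counters:

```python
import json
from collections import Counter
from pathlib import Path
from tempfile import TemporaryDirectory

def log_visit(log_path, script_id, day):
    # Hot path: a plain append, no database involved.
    with open(log_path, "a") as f:
        f.write(json.dumps({"script": script_id, "day": day}) + "\n")

def aggregate(log_path):
    # Worker: rake the raw file into final per-day counters
    # (in production these would go to Redis; raw files to archives).
    counts = Counter()
    with open(log_path) as f:
        for line in f:
            rec = json.loads(line)
            counts[(rec["day"], rec["script"])] += 1
    return counts

with TemporaryDirectory() as d:
    path = Path(d) / "visits.log"
    for _ in range(3):
        log_visit(path, 13, "2012-05-01")
    log_visit(path, 7, "2012-05-01")
    print(aggregate(path)[("2012-05-01", 13)])  # 3
```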