PHP
serjioms, 2015-11-09 20:44:02

How to save a large volume of statistics with PHP + MySQL?

+ The service receives about 800,000 POST/GET requests with data per day (about 600 requests per minute, excluding peaks). Each request carries up to 10 parameters that must be stored in the database; this data is needed to build live statistics.
+ The data must be stored in a MySQL database (intermediate solutions are acceptable).
+ The server runs a PHP + MySQL stack.
How do I implement the saving of the statistical data these requests generate most reliably, without losses or blocking? What should I read on this topic, and where?


3 answer(s)
Immortal_pony, 2015-11-09
@serjioms

For example, like this:
1. A receiver writes the raw incoming data into a staging table.
2. An aggregator reads batches of raw rows, aggregates them, and writes report-ready data into separate tables. Processed raw rows are marked as such.
3. An archiver watches the raw table and moves old processed rows into archive tables. After the archiver runs, the raw table is optimized.
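The three steps above imply a raw (staging) table, report tables, and an archive. A minimal schema sketch; the thread only names `userStat`, so every other table and column name here is an assumption:

```sql
-- Raw staging table: one row per incoming request, parameters stored as JSON.
CREATE TABLE userStat (
    id        BIGINT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    data      TEXT NOT NULL,                 -- raw request parameters as JSON
    processed TINYINT(1) NOT NULL DEFAULT 0, -- set by the aggregator
    createdAt TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
    KEY idx_processed (processed, id)        -- lets the aggregator pick batches cheaply
) ENGINE=InnoDB;

-- Report-ready aggregates, e.g. daily counts per parameter value (illustrative).
CREATE TABLE userStatDaily (
    statDate   DATE NOT NULL,
    paramName  VARCHAR(64) NOT NULL,
    paramValue VARCHAR(255) NOT NULL,
    hits       INT UNSIGNED NOT NULL DEFAULT 0,
    PRIMARY KEY (statDate, paramName, paramValue)
) ENGINE=InnoDB;

-- The archiver moves old processed rows here before optimizing userStat.
CREATE TABLE userStatArchive LIKE userStat;
```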

serjioms, 2015-11-09
@serjioms

I'm considering a solution like this. How fast would it be?

<?php

    $mysql = \mysql::_init();

    // Raw request payload, taken from the site's own request wrapper
    $rawData = \request::is_post ? \request::get_raw_post_data : \request::get_raw_get_data;

    $mysql->query("
        INSERT INTO `userStat` ( `data` )
        VALUES ( '" . $mysql->real_escape_string( json_encode( $rawData ) ) . "' )
    ");

Then a cron job runs and processes the accumulated data into the final statistics.
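The cron step can aggregate a batch in memory before writing anything back. A minimal sketch of just the counting logic, assuming the JSON layout from the snippet above (`$rows` mimics what `SELECT id, data FROM userStat WHERE processed = 0` might return; the function name and aggregation keys are illustrative, not from the thread):

```php
<?php
// Aggregate a batch of raw JSON rows into per-parameter value counts.
function aggregateBatch(array $rows): array
{
    $counts = [];
    foreach ($rows as $row) {
        $params = json_decode($row['data'], true);
        if (!is_array($params)) {
            continue; // skip a malformed row rather than losing the whole batch
        }
        foreach ($params as $key => $value) {
            $counts[$key][(string)$value] = ($counts[$key][(string)$value] ?? 0) + 1;
        }
    }
    return $counts;
}

// Example: two requests sharing one parameter value.
$batch = [
    ['id' => 1, 'data' => '{"page":"home","utm":"mail"}'],
    ['id' => 2, 'data' => '{"page":"home"}'],
];
print_r(aggregateBatch($batch));
```

The resulting counts can then be flushed in a single transaction (e.g. with `INSERT ... ON DUPLICATE KEY UPDATE`), and the batch's raw rows marked as processed.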

Kirill, 2015-11-09
@kshvakov

pinba.org
