PHP
Ivan, 2021-01-02 20:19:51

How to optimize these queries in Laravel?

The queries take too much time and memory because there is a lot of data in the database. I need to speed them up somehow. I have rewritten the code many times and still don't know how to make it faster.

I have three graphs on the main page. The user picks a date range there, and I fetch data for every day in it.

First, I convert the date to the desired format:

if ($from) {
    $dateFrom = Date::createFromFormat('d.m.Y G:i', $from);
}
if ($to) {
    $dateTo = Date::createFromFormat('d.m.Y G:i', $to);
}

I create an array in which I will add the values:
$tableDate = [
    [
        'count_sales' => [],
    ],
    [
        'new_users' => [],
    ],
    [
        'success_payments' => [],
    ],
    'days' => [],
];

Then I get the number of days between these dates:
// All days from the first to the last.
$count_days = floor(abs(strtotime($dateFrom) - strtotime($dateTo)) / 86400) + 1;
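(A side note on this step: arithmetic with 86400-second days can be off by an hour across DST transitions. A plain-PHP alternative using a DateTime diff, shown here with hard-coded sample dates, avoids that:)

```php
// Sketch: count inclusive days between two dates without 86400-second
// arithmetic. The '!' in the format resets the time fields to 00:00.
$dateFrom = DateTimeImmutable::createFromFormat('!d.m.Y', '01.12.2021');
$dateTo   = DateTimeImmutable::createFromFormat('!d.m.Y', '10.12.2021');

$count_days = $dateFrom->diff($dateTo)->days + 1; // +1 to include both endpoints
echo $count_days; // 10
```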

After that, I run queries through the query builder. I deliberately group the results by date so I can parse them later.
$count_sales = ActiveService::orderBy('created_at')
    ->whereBetween('updated_at', [$dateFrom, $dateTo])
    ->get('created_at')
    ->groupBy(function ($events) {
        return Carbon::parse($events->created_at)->format('d.m.Y');
    });

$new_users = User::orderBy('created_at')
    ->whereBetween('created_at', [$dateFrom, $dateTo])
    ->get('created_at')
    ->groupBy(function ($events) {
        return Carbon::parse($events->created_at)->format('d.m.Y');
    });

$success_payments = HistoryPayment::orderBy('created_at')
    ->whereBetween('created_at', [$dateFrom, $dateTo])
    ->where('operation', 'top_up')
    ->where('status_transaction', 'success')
    ->where('payment_id', '!=', '0')
    ->select(['amount', 'created_at'])
    ->get()
    ->groupBy(function ($events) {
        return Carbon::parse($events->created_at)->format('d.m.Y');
    });


Here is what I get (using $count_sales as an example):

[screenshot: 61d1dd1cb9d01824793669.png — the collection grouped by date]

Now I do the following (explained below):
for ($i = 0; $i < $count_days; $i++) {
    $day = date('d.m.Y', strtotime($dateFrom) + 86400 * $i);
    array_push($tableDate['days'], $day);

    $tableDate[0]['count_sales'][$i] = 0;
    $tableDate[1]['new_users'][$i] = 0;
    $tableDate[2]['success_payments'][$i] = 0;

    // Number of successful activations (sales) per day.
    if ($count_sales->get($day) !== null) {
        $count = $count_sales->get($day)->count();
        $tableDate[0]['count_sales'][$i] = $count;
    }

    // Number of registrations per day.
    if ($new_users->get($day) !== null) {
        $count = $new_users->get($day)->count();
        $tableDate[1]['new_users'][$i] = $count;
    }

    // Sum of successful payments per day.
    if ($success_payments->get($day) !== null) {
        $amounts = $success_payments->get($day);
        foreach ($amounts as $value) {
            $tableDate[2]['success_payments'][$i] += $value->amount;
        }
    }
}


I create a loop that iterates over each day, so the $day variable is a specific day in the d.m.Y format. As you can see, I have checks like:

if ($count_sales->get($day) !== null)

That's because the queries above (see the screenshot) return only the days that actually have records, and it can easily happen that on some date, say 22.12.2021, nothing was bought or topped up. So I first check whether there is data for the day currently being processed, and if there is, I fill it in. At the end, I just return everything:

return new Collection([
    'tableDate' => $tableDate,
]);
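(The per-day loop could also be written more compactly with CarbonPeriod plus the collection get() default, which removes the null checks entirely. A sketch, assuming $dateFrom/$dateTo are Carbon-compatible objects as in the code above and that Laravel's collect() helper is available:)

```php
use Carbon\CarbonPeriod;

// Sketch: CarbonPeriod steps through the range one day at a time.
foreach (CarbonPeriod::create($dateFrom, $dateTo) as $i => $date) {
    $day = $date->format('d.m.Y');
    $tableDate['days'][] = $day;

    // get($key, $default) falls back to an empty collection,
    // so count() and sum() simply return 0 for missing days.
    $tableDate[0]['count_sales'][$i]      = $count_sales->get($day, collect())->count();
    $tableDate[1]['new_users'][$i]        = $new_users->get($day, collect())->count();
    $tableDate[2]['success_payments'][$i] = $success_payments->get($day, collect())->sum('amount');
}
```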


As I said, the queries take a lot of time and memory. It can easily end in a 500 error, or the page just loads for 30 seconds to a couple of minutes.

I would appreciate any advice, or a pointer to what I did wrong and how it can be done better.

4 answers
Vyacheslav Plisko, 2022-01-03
@youmixx

As far as I remember, ->get() executes the query and returns a collection. That is a lot of data, and your project will die on it. Use groupBy as part of the query and aggregate on the DB side before the get() call, not after it, when you are no longer working with a query builder but with a huge collection.
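A sketch of what DB-side grouping could look like for two of the queries (model and column names are taken from the question; the DATE() expression and day-keyed output are assumptions, adjust to your schema):

```php
// Aggregate per day inside the database instead of loading all rows.
$count_sales = ActiveService::whereBetween('updated_at', [$dateFrom, $dateTo])
    ->selectRaw('DATE(created_at) as day, COUNT(*) as total')
    ->groupBy('day')
    ->orderBy('day')
    ->pluck('total', 'day'); // e.g. ['2021-12-22' => 17, ...]

$success_payments = HistoryPayment::whereBetween('created_at', [$dateFrom, $dateTo])
    ->where('operation', 'top_up')
    ->where('status_transaction', 'success')
    ->where('payment_id', '!=', '0')
    ->selectRaw('DATE(created_at) as day, SUM(amount) as total')
    ->groupBy('day')
    ->orderBy('day')
    ->pluck('total', 'day');
```

With this, the PHP loop only has to look up a per-day number instead of counting and summing whole model collections.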

vism, 2021-01-02
@vism

This is a task, not a question.
If it's a question, then: rewrite the code so that a single database query is built and the data is read from it.
No problem, I've done this many times.
And if it's a task, then it's $200/h and in a few hours we'll put together the query you need :)

Alex Wells, 2022-01-03
@Alex_Wells

Write optimized database queries. Grouping on the query side is mandatory. There's a 99% chance this solves your problem, because you almost certainly don't have that much data. And if you do, EXPLAIN and indexes will help.
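For instance, the indexes could be added in a migration like this (a sketch; the table names `users` and `history_payments` are guessed from Laravel naming conventions, and the right index depends on what EXPLAIN shows for your actual queries):

```php
use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

return new class extends Migration {
    public function up(): void
    {
        Schema::table('users', function (Blueprint $table) {
            // Supports the whereBetween('created_at', ...) range scan.
            $table->index('created_at');
        });
        Schema::table('history_payments', function (Blueprint $table) {
            // Composite index: equality columns first, range column last.
            $table->index(['operation', 'status_transaction', 'created_at']);
        });
    }

    public function down(): void
    {
        Schema::table('users', fn (Blueprint $t) => $t->dropIndex(['created_at']));
        Schema::table('history_payments', fn (Blueprint $t) => $t->dropIndex(
            ['operation', 'status_transaction', 'created_at']
        ));
    }
};
```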

Sergey, 2022-01-03
@KingstonKMS

1. Add the right indexes to the tables.
2. Build the queries in a MySQL client, debug them, and make sure they run in minimal time.
3. Rebuild those queries with the framework's tools.
* Select from the database only the columns that will actually be used.
