Nikita Zelikov, 2021-12-21 15:27:17
PHP

API optimization, what to use?

Good day, comrades.
Here's my question: I wrote a couple of classes for working with an API. I fetch data from the API and store it in the database (MySQL) as expected; the interface is built with Blade + Livewire.
When the page renders, another API call happens inside a for loop, with an if to skip entries that lack the data I need. The whole run takes about 30 s, which is TERRIBLY slow.
Can I somehow significantly reduce the time this run takes?

Render class

for ($cart_min; $cart_min <= $cart_max; $cart_min++) {
    // One image query per room, on every render (N+1 queries)
    $image = DB::table('room_images')
        ->where('Room_id', $this->room[$cart_min]->Room_id)
        ->get();

    $price = new CartController();
    $this->price_new = $price->price($this->room[$cart_min]->cafe_id, 40, $this->room[$cart_min]->Room_id);
    $this->price_old = round($this->price_new * 1.2);

    // No price for this room: extend the range by one and skip it,
    // so the page still ends up with the same number of cards
    if ($this->price_new == 0) {
        $cart_max += 1;
        continue;
    }

    $this->array[] = [$this->room[$cart_min], $image, $this->price_old, $this->price_new];
}


In the price class the logic is simple: I fetch the values via the API and then pull the data I need out of the response in a loop, matching by my IDs.
How can I optimize this? Should I replace file_get_contents with something else, or run the calls asynchronously somehow? For now I don't understand which approach to use.)


2 answers
Vladimir Obabkov, 2021-12-21
@Enroller

I would not "shove" in any particular method.
If you are interested in parallelizing work with external APIs, there is multicurl (PHP's curl_multi_* functions), and that is what you should look at first.
To be honest, on a first pass over the code there is only one answer: rewrite the question with a detailed description and bigger code excerpts (with comments). Optimization questions are not easy even for good code, and here (no offense meant) it is hard to get a clear picture of your setup.
As far as I can see (though I develop on Symfony with Doctrine), you query the database inside a loop. That is really bad, and it's not clear why you do it.
There are batch queries: flatten the list of parameters and go ahead. If there is a lot of data, organize the work in chunks and wrap the database calls in a transaction; then everything is brought into RAM at once.
Accesses like $this->room[$cart_min] are better done safely through isset().
Ideally, drop these positional arrays and describe a more or less clear model of the process with abstractions; the overall clarity will improve a lot, and speed will hardly suffer. )
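A minimal sketch of that curl_multi approach, assuming the price API is reachable over plain HTTP GET (the URLs passed in are hypothetical; adapt them to the real endpoint):

```php
<?php
// Sketch: fire several API requests in parallel with curl_multi,
// instead of one blocking file_get_contents() call per room.
function fetchAll(array $urls): array
{
    $mh = curl_multi_init();
    $handles = [];

    foreach ($urls as $key => $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 10);
        curl_multi_add_handle($mh, $ch);
        $handles[$key] = $ch;
    }

    // Drive all transfers until every handle has finished.
    do {
        $status = curl_multi_exec($mh, $running);
        if ($running) {
            curl_multi_select($mh);
        }
    } while ($running && $status === CURLM_OK);

    // Collect the response bodies, keyed the same way as the input URLs.
    $results = [];
    foreach ($handles as $key => $ch) {
        $results[$key] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);

    return $results;
}
```

The total wall time then approaches that of the slowest single request rather than the sum of all of them.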
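Following that last point, a hypothetical sketch of what replacing the positional array [$room, $image, $price_old, $price_new] with a small model class could look like (all names here are invented, and the 1.2 markup factor is taken from the question's code):

```php
<?php
// Hypothetical DTO replacing the positional array
// [$room, $image, $price_old, $price_new]. Requires PHP 8.1+.
final class RoomCard
{
    public function __construct(
        public readonly object $room,
        public readonly object $images,
        public readonly float $priceOld,
        public readonly float $priceNew,
    ) {}

    // Derive the "old" price in one place instead of
    // scattering a magic "* 1.2" around the render code.
    public static function withMarkup(
        object $room,
        object $images,
        float $priceNew,
        float $markup = 1.2
    ): self {
        return new self($room, $images, round($priceNew * $markup), $priceNew);
    }
}
```

Named properties like $card->priceNew are much harder to mix up than positional indices like $item[3].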

rPman, 2021-12-21
@rPman

The reason for the slowness is an incorrectly structured request: something that can be computed on the SQL server you are, for some reason, computing on the backend.
Even without studying the code closely, it is clear that instead of making 100500 queries, one per $this->room[$cart_min]->Room_id, it is enough to collect those identifiers into a list and form one query that loads all room_images at once (the list of rooms to process does not depend on the response from the SQL server): select ... from ... where id in (1,2,3,4,5,...)
If the number of identifiers is under a thousand or so, this works as-is; if there are more, think about it — maybe let the server manage the list of identifiers itself, storing them in some kind of helper table.
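A sketch of that batch query in plain PDO (table and column names are taken from the question; $pdo is assumed to be an existing connection). One IN (...) query replaces the per-room query, and the rows are then grouped in memory:

```php
<?php
// Sketch: collect every Room_id, run ONE query with IN (...),
// then group the rows by room so later lookups hit memory, not the DB.
function fetchImagesGrouped(PDO $pdo, array $roomIds): array
{
    if ($roomIds === []) {
        return [];
    }

    // One "?" placeholder per id keeps the query parameterized.
    $placeholders = implode(',', array_fill(0, count($roomIds), '?'));
    $stmt = $pdo->prepare(
        "SELECT * FROM room_images WHERE Room_id IN ($placeholders)"
    );
    $stmt->execute(array_values($roomIds));

    $grouped = [];
    foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
        $grouped[$row['Room_id']][] = $row;
    }
    return $grouped;
}
```

Inside the render loop, the images for a room are then just $grouped[$room->Room_id] ?? []. In Laravel's query builder, the same idea is DB::table('room_images')->whereIn('Room_id', $roomIds)->get()->groupBy('Room_id').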
