PHP
Vlad, 2021-07-07 21:09:47

How to cache requests to the service API correctly?

Every visit to a PHP page of the site calls a certain third-party service through its API to fetch an array of data. But the service limits the number of API calls per day for one account. I noticed that I burned through all the available calls very quickly, even though there were far fewer visits to the site than calls to the service, even counting all the search engines and other bots: roughly 100 people visited the site, for example, yet roughly 500 requests went to the service.
Here is the call code:

//Store specific keys from the array in variables
$key1 = getData()['Key1'];
$key2 = getData()['Key2'];
$key3 = getData()['Key3'];
$key4 = getData()['Key4'];
$key5 = getData()['Key5'];
$key6 = getData()['Key6'];

//Send a request to the service and get the array of data
function getData() {
  return curl('https://api.example.com/json_array');
}
//I won't paste the curl function here, it's an ordinary curl request;
//its real name is different, curl() is just used as an example.

//Then I manipulate this data in some way...


Only now, while writing this question, did I realize I had done something silly: I should have called the function once, stored its result in a variable, and then taken the values from that variable.
$myarray = getData();

$key1 = $myarray['Key1'];
$key2 = $myarray['Key2'];
$key3 = $myarray['Key3'];
$key4 = $myarray['Key4'];
$key5 = $myarray['Key5'];
$key6 = $myarray['Key6'];


So I decided to try the following: once I have already called the service, save the array in a cookie, and in all subsequent cases read the array from that cookie whenever the data is needed again, so the service isn't hit constantly.

function getData() {
  if (!empty($_COOKIE['arrayData'])) {
    return unserialize($_COOKIE['arrayData']);
  } else {
    $arrayData = curl('https://api.example.com/json_array');
    setcookie('arrayData', serialize($arrayData));
    return curl('https://api.example.com/json_array');
    //And here, it turns out, I did something silly again: I made the curl request twice.
    //It's probably more correct like this:
    //return $arrayData;
  }
}


For now I decided to do it through cookies only; I can also add saving to the session in the same way. But here's the problem: for ordinary visitors this might have worked, but the site is also hit by all sorts of search bots, crawlers and other automated pests, which keep triggering the function and its request to the service.
And as I understand it, some bots have cookies and/or sessions disabled (I have a screenshot, but I'm not allowed to post links =( ), so for those bots this method won't work: after every page refresh the cookies are empty again and the request is sent over and over =(.

[screenshot: 60e5ed9106668045676838.png]

Please tell a newbie what the proper (and, if possible, simpler) way is to cache these requests to the service for new visitors, so that they are kept to a minimum. Ideally, only one request to fetch the array should be sent per new visitor, and after that the data should be taken from a cache, so that even if some robot or other pest refreshes the page many times, the request is not sent anew each time but is cached on the first visit and served from the cache afterwards. The cache lifetime can even be unlimited, as long as it is the same device.
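For illustration only, here is a minimal sketch of what such a cache could look like if the array were saved in a file on the server instead of in the visitor's cookie, so it no longer depends on the client at all. It assumes the same curl() helper as above; the cache path and the "never expires" behaviour are made-up details, not something from the original code.

//Sketch: cache the API response in a server-side file shared by all visitors
function getData() {
  $cacheFile = __DIR__ . '/cache/arrayData.ser'; //hypothetical cache location

  if (is_file($cacheFile)) {
    //Cache hit: every visitor, including bots, reads the saved copy
    return unserialize(file_get_contents($cacheFile));
  }

  //Cache miss: one real request to the API, then save the result for everyone
  $arrayData = curl('https://api.example.com/json_array');
  file_put_contents($cacheFile, serialize($arrayData), LOCK_EX);

  return $arrayData;
}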

Whoever answers on topic, thank you very much, and a plus to your karma!

P.S. By the way, am I right that in this example there would be no difference in speed between fetching the array from the site via file_get_contents and via curl?
And am I right that if I initialize, say, 5 variables that each call this function, the function is executed 5 times and 5 requests are made?


1 answer(s)
Exploding, 2021-07-10
@michisvlad

Sessions and cookies work differently: with cookies the data is stored on the client side, whereas with sessions it's the other way around and only the session identifier travels over the network, so if cookies don't work for you, sessions should work as usual.
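As a rough sketch of that idea (this is not code from the question or the answer, and it assumes the questioner's curl() helper), keeping the array in the session instead of a cookie could look like this:

session_start(); //must be called before any output

function getData() {
  if (!empty($_SESSION['arrayData'])) {
    return $_SESSION['arrayData']; //already fetched earlier in this session
  }

  $arrayData = curl('https://api.example.com/json_array'); //assumed helper
  $_SESSION['arrayData'] = $arrayData; //stored server-side, only the session ID travels

  return $arrayData;
}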
You can try storing it in the session, but it's better not to: what will you do once the number of unique visitors exceeds your limit of 500 requests? Redo everything?
I assume you're using a free API with a daily request limit and no API auth key? If so, I would move all of it to the client side: a regular AJAX request to the API which, on success, also sends the resulting data to your PHP handler via AJAX, and in that handler you can already use sessions so the client doesn't re-send requests to the API after reloading the page.
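A hypothetical shape of that PHP handler (the file name, the POST field and the JSON format are assumptions for illustration, not something given in the answer):

//save_data.php - receives the data the browser fetched from the API via AJAX
session_start();

if (empty($_SESSION['arrayData']) && isset($_POST['arrayData'])) {
  $decoded = json_decode($_POST['arrayData'], true); //data posted as a JSON string
  if (is_array($decoded)) {
    $_SESSION['arrayData'] = $decoded; //kept for the rest of the session
  }
}

//Other pages can then read $_SESSION['arrayData'] instead of calling the API again.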
