PHP
Fayozzhon Berdiev, 2015-12-24 16:19:18

How to solve a problem with shared hosting, CSV, and PHP for the New Year?

Good day, dear colleague, comrade-in-arms, friend, brother, or simply whoever is reading this!
First of all, I would like to take this opportunity to wish you a Happy New Year!
My wishes for you:
All the best!
Code that compiles, cross-browser layouts, 100% uptime, a higher CTR, and all the best! :-)

A task landed on me over the holidays.
We run a drug-database startup: the database must be updated every day by importing CSV files. At first, a standard loop could read the whole file into PHP memory and send it straight to the database as a BATCH INSERT via ActiveRecord.
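For context, the original "read the whole file, then batch-insert" approach can be sketched roughly like this (a minimal illustration using plain PDO with SQLite instead of ActiveRecord; the table and column names are made up):

```php
<?php
// Sketch of the original approach: load all rows into memory, then send
// them to the database in a single multi-row INSERT (a "batch insert").
// Uses PDO + SQLite so the example is self-contained; the real code
// presumably used ActiveRecord against MySQL.

$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE drugs (name TEXT, price REAL)');

// Pretend these rows came from fgetcsv() over the import file.
$rows = [
    ['Aspirin', 3.50],
    ['Ibuprofen', 5.20],
    ['Paracetamol', 2.10],
];

// Build one INSERT with a (?, ?) placeholder group per row.
$placeholders = implode(',', array_fill(0, count($rows), '(?, ?)'));
$stmt = $pdo->prepare("INSERT INTO drugs (name, price) VALUES $placeholders");
$stmt->execute(array_merge(...$rows));

echo $pdo->query('SELECT COUNT(*) FROM drugs')->fetchColumn(); // prints 3
```

This works fine while the file fits in memory, which is exactly the assumption that broke as the file grew.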

Some time passed, the number of connected pharmacies grew, and accordingly the import file became large.
I decided to split the file into pieces and import them in a loop.
But because this is shared hosting, Apache dives into a deep 500 partway through the import loop.
How do I correctly implement a cron task so that it imports all the files completely?
For example, apt_1.csv through apt_14.csv, and the number of files can vary.
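For reference, a cron-driven import usually means running the script through the PHP CLI rather than through Apache, so the web server's 500 errors and request timeouts stop being a factor. A hypothetical crontab entry (all paths here are examples, not from the question):

```shell
# Run the importer every 5 minutes via PHP CLI, bypassing Apache entirely.
# Adjust the PHP binary path and script path to your hosting.
*/5 * * * * /usr/bin/php /home/user/site/import.php >> /home/user/import.log 2>&1
```

Many shared hosts expose exactly this kind of crontab through their control panel.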

I will be glad for your tips and discussion; please note that "switch to a VDS/VPS" will not be accepted as an answer.

Thank you very much in advance


1 answer
Alex Safonov, 2015-12-24
@elevenelven

Import from the file line by line: say, 100 lines every 5 minutes.
Store the line pointer (the current position in the file) in the database or memcached.
-------------
You can also try throttling via usleep():

$z = 0;
foreach ($data as $k => $v) {
    // ... process one row here ...
    $z++;
    if ($z === 50) {
        $z = 0;
        usleep(200000); // pause 0.2 s after every 50 rows
    }
}
