PHP
Vladimir Kokhan, 2019-03-18 23:02:52

How to upload a large XML file to a database?

Good day everyone.
There is a fairly large XML file, roughly 50,000 lines. I receive it as a file upload from the user's computer:

$data = new \SimpleXMLElement($request->file('feed'), null, true);

Next, it needs to be loaded into the database. I do it the usual way:
foreach ($data as $item) {
    $information = new Data([
        'type'  => $item->type,
        ...............
        ...............
    ]);
    $information->save();
}

On my local machine in OpenServer everything runs fine, but on the hosting there are problems: the browser shows the error
504 Gateway Time-out
In the server logs
[Mon Mar 18 22:49:55.049174 2019] [cgi:error] [pid 27832] (70008)Partial results are valid but processing is incomplete: [client 37.115.231.73:37864] AH01225: Error reading request entity data, referer: https://vkohan.tk/feed/create?id=1

As I understand it, I need to increase the max_execution_time parameter (in OpenServer it is set to 180), but I would prefer not to involve the hoster.
I was advised to split the file into parts and load them via queues, but I'm not very experienced with that yet.
Can you tell me how to organize the import properly?
Thanks
Laravel Version 5.8
PHP 7.1
MySQL - 5.7



2 answer(s)
Talyan, 2019-03-18
@flapflapjack

Just split the XML file into several smaller XML files (even in a text editor) and import them one at a time: one script run loads one piece of the XML.
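If splitting the file by hand is impractical, another way to get the same "process in pieces" effect is to stream-parse the feed with PHP's built-in XMLReader and flush rows in batches, so the whole document is never in memory at once. A minimal sketch; the `<feed>`/`<item>` element names and the `type` field are assumptions based on the question's code, and the tiny inline feed stands in for the real 50,000-line file:

```php
<?php
// Stand-in feed; in the real code you would pass the uploaded file's
// path to XMLReader::open() instead of a string to XMLReader::XML().
$xml = '<feed>'
     . '<item><type>a</type></item>'
     . '<item><type>b</type></item>'
     . '<item><type>c</type></item>'
     . '</feed>';

$reader    = XMLReader::XML($xml);
$batchSize = 2;        // in production something like 500-1000
$batch     = [];
$batches   = [];       // collected here for illustration; in Laravel
                       // you would call Data::insert($batch) instead

// Advance the cursor to the first <item>.
while ($reader->read() && $reader->localName !== 'item');

while ($reader->localName === 'item') {
    // Expand only the current <item> into SimpleXML; the rest of the
    // document is never held in memory at the same time.
    $item    = new SimpleXMLElement($reader->readOuterXML());
    $batch[] = ['type' => (string) $item->type];

    if (count($batch) >= $batchSize) {
        $batches[] = $batch;   // e.g. Data::insert($batch);
        $batch     = [];
    }
    $reader->next('item');     // jump to the next sibling <item>
}
if ($batch) {
    $batches[] = $batch;       // flush the remainder
}
```

This keeps memory flat regardless of file size; each batch insert is also far cheaper than one `save()` per row.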

Adamos, 2019-03-18
@Adamos

$informations = [];
foreach ($data as $item) {
  // insert() expects plain attribute arrays, not Data model instances,
  // and SimpleXML values should be cast to string
  $informations[] = [
    'type'  => (string) $item->type,
    ...............
    ...............
  ];
}
Data::insert($informations);
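One caveat with a single bulk insert: at ~50,000 rows, one `insert()` call can bump into MySQL's `max_allowed_packet` or the prepared-statement placeholder limit. A hedged sketch of splitting the rows into fixed-size chunks first; the chunk size of 500 and the `$insert` callback are illustrative stand-ins (in Laravel the callback body would be `Data::insert($chunk)`):

```php
<?php
// Split $rows into chunks of $chunkSize and hand each chunk to the
// given insert callback; returns how many insert queries were issued.
function insertInChunks(array $rows, $chunkSize, callable $insert)
{
    $queries = 0;
    foreach (array_chunk($rows, $chunkSize) as $chunk) {
        $insert($chunk);   // e.g. Data::insert($chunk);
        $queries++;
    }
    return $queries;
}

// Demo data: 1200 fake rows shaped like the question's records.
$rows = array_map(function ($i) {
    return ['type' => "t$i"];
}, range(1, 1200));

$inserted = [];
$queries  = insertInChunks($rows, 500, function ($chunk) use (&$inserted) {
    $inserted = array_merge($inserted, $chunk);
});
```

With 1200 rows and a chunk size of 500 this issues three queries instead of one oversized statement, while still being vastly faster than 1200 individual `save()` calls.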
