How do I properly parse a large XML file and save the data?
I need to "read" a large XML file and add the values to the database.
For parsing I use XMLReader + DOMDocument, and as a result I get an array like this:
[
    'categories' => [
        0 => [
            'name' => 'Category',
            'id' => 1456,
            'parent_id' => 284
        ]
    ],
    'products' => [
        0 => [
            'category_id' => 1456,
            'id' => 135,
            'available' => true,
            'price' => 22,
            ...
        ],
        ....
    ]
]
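For context, here is a minimal sketch of that XMLReader + DOMDocument combination; the file name and the <category> element with its attributes are assumptions based on the sample array above, not the real feed structure.

<?php
// Minimal sketch, assuming the feed contains elements like
// <category id="1456" parentId="284">Category</category>;
// adjust the names to the actual file structure.
$reader = new XMLReader();
$reader->open('feed.xml');

$doc  = new DOMDocument();
$data = ['categories' => [], 'products' => []];

while ($reader->read()) {
    // expand() converts only the current node to DOM, so memory use stays flat
    if ($reader->nodeType === XMLReader::ELEMENT && $reader->localName === 'category') {
        $node = $doc->importNode($reader->expand(), true);
        $data['categories'][] = [
            'name'      => $node->textContent,
            'id'        => (int) $node->getAttribute('id'),
            'parent_id' => (int) $node->getAttribute('parentId'),
        ];
    }
}
$reader->close();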
Besides insert queries (to add a product, link it to a category, add pictures, properties, etc.), select queries are also needed to check for uniqueness.
The fastest way to insert a huge amount of data into MySQL is to import it from a CSV file.
In a similar task, where I had to process a 10+ GB XML file and insert 150+ million rows, I solved it in two stages:
1. Stream-parsing the XML, so as not to run out of memory, and writing the results to CSV (sketched below).
2. Importing the data from the CSV into the database via LOAD DATA LOCAL INFILE.
No other insertion method I tried could beat loading from a CSV file for speed.
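A rough sketch of stage 1, assuming hypothetical <offer> elements and file names. Note that fputcsv's defaults (comma separator, double-quote enclosure, \n line endings) match the FIELDS/LINES clauses of the LOAD DATA statement below.

<?php
// Stage 1 sketch: stream-parse the XML and append rows to a CSV file.
// File and element names are assumptions for illustration.
$reader = new XMLReader();
$reader->open('feed.xml');
$csv = fopen('products.csv', 'w');

// Header row, skipped later by IGNORE 1 ROWS
fputcsv($csv, ['id', 'category_id', 'available', 'price']);

while ($reader->read()) {
    if ($reader->nodeType === XMLReader::ELEMENT && $reader->localName === 'offer') {
        $offer = new SimpleXMLElement($reader->readOuterXml());
        fputcsv($csv, [
            (int) $offer['id'],
            (int) $offer->categoryId,
            ((string) $offer['available'] === 'true') ? 1 : 0,
            (float) $offer->price,
        ]);
    }
}

fclose($csv);
$reader->close();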
LOAD DATA LOCAL INFILE '{$csv_file}'
INTO TABLE `{$table_tmp}`
FIELDS TERMINATED BY ','
ENCLOSED BY '\"'
LINES TERMINATED BY '\\n'
IGNORE 1 ROWS;
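One caveat: MySQL rejects LOCAL INFILE unless it is enabled on both the server (the local_infile variable, off by default in recent versions) and the client connection. A sketch of running the statement above from PHP via PDO; the DSN, credentials, and table names are placeholders.

<?php
// Sketch: executing the import from PHP. PDO::MYSQL_ATTR_LOCAL_INFILE
// enables LOCAL INFILE on the client side; DSN and names are placeholders.
$pdo = new PDO('mysql:host=localhost;dbname=shop;charset=utf8mb4', 'user', 'pass', [
    PDO::MYSQL_ATTR_LOCAL_INFILE => true,
]);

$csv_file  = 'products.csv';   // hypothetical path
$table_tmp = 'products_tmp';   // hypothetical staging table

$pdo->exec("
    LOAD DATA LOCAL INFILE '{$csv_file}'
    INTO TABLE `{$table_tmp}`
    FIELDS TERMINATED BY ','
    ENCLOSED BY '\"'
    LINES TERMINATED BY '\\n'
    IGNORE 1 ROWS
");

Loading into a staging table first (which the {$table_tmp} name hints at) also covers the uniqueness check from the question: instead of a SELECT per row, the rows can be moved in one bulk INSERT ... SELECT with ON DUPLICATE KEY UPDATE (or INSERT IGNORE) against a unique index.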