How to import a large JSON file (18 GB) into MySQL?
There is an 18 GB JSON file. It is essentially a list of objects, one per line, and the file as a whole is not valid JSON.
It looks like this:
{"address": "Gabriele-Tergit-Promenade 19, Berlin", "amenity_groups": [{"amenities": ...}
{"address": "Passeig De Gracia, 68, Barcelona", "amenity_groups": [{"amenities": ...}
...
{"address": "Great Cumberland Place, London", "amenity_groups": [{"amenities": ...}
I need to get all of this into MySQL. What are the best ways to do it?
You don't need to load the whole file into memory; you can read it as a stream, line by line:
<?php

$fp = fopen('big-file.json', 'r');
if (!is_resource($fp)) {
    throw new \RuntimeException('Failed to open file for reading');
}

while (!feof($fp)) {
    $line = fgets($fp, 32768); // set a limit per record
    if ($line === false || trim($line) === '') {
        // skip empty lines
        continue;
    }
    $data = json_decode($line, true, 512, JSON_THROW_ON_ERROR);
    // ... write to the database (see the sketch below)
}

fclose($fp);
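A minimal sketch of what the "write to the database" step might look like with PDO; the `hotels` table with an `address` column and a JSON column `doc`, as well as the connection details, are assumptions made here for illustration:

<?php

// Hypothetical table:
// CREATE TABLE hotels (id INT AUTO_INCREMENT PRIMARY KEY, address VARCHAR(255), doc JSON);
$pdo = new PDO('mysql:host=localhost;dbname=test;charset=utf8mb4', 'user', 'password', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);
$stmt = $pdo->prepare('INSERT INTO hotels (address, doc) VALUES (:address, :doc)');

// Inside the while loop, after json_decode():
$stmt->execute([
    ':address' => $data['address'] ?? null,
    ':doc'     => trim($line), // the raw JSON line; MySQL validates it itself
]);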
One of the first solutions that comes to mind is to use SplFileObject.
Open it as a file and write to the database in batches of 500/1000 rows (see the sketch after the snippet below).
$file = './file.json';
$spl = new SplFileObject($file);

// e.g. jump straight to an arbitrary line without loading the file into memory
$spl->seek(177777);
echo $spl->current();
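A sketch of the batching idea: insert row by row with a prepared statement and commit a transaction every 500 lines. The `hotels` table with a JSON column `doc`, the batch size, and the connection details are assumptions, not part of the original answer.

<?php

$pdo = new PDO('mysql:host=localhost;dbname=test;charset=utf8mb4', 'user', 'password', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);
$stmt = $pdo->prepare('INSERT INTO hotels (doc) VALUES (?)');

$spl = new SplFileObject('./file.json');
$spl->setFlags(SplFileObject::READ_AHEAD | SplFileObject::SKIP_EMPTY | SplFileObject::DROP_NEW_LINE);

$count = 0;
$pdo->beginTransaction();
foreach ($spl as $line) {
    $stmt->execute([$line]);
    if (++$count % 500 === 0) {
        $pdo->commit();            // flush a batch of 500 rows
        $pdo->beginTransaction();
    }
}
$pdo->commit(); // commit the last, possibly incomplete batch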
This is a file in JSONL format.
JSON is a native type for MySQL 5.7+, and each line can be inserted into a JSON column as is, without decoding it in PHP first. You can also load it into a temporary table and then extract the specific fields you need from there.
That is, it is enough to read the lines and, using yield, feed them into a buffer. When the buffer reaches a given size, say 1000 lines, execute INSERT INTO ... VALUES (...), flushing its contents (a sketch follows at the end of this answer).
https://www.w3schools.com/sql/sql_insert.asp
https://www.w3schools.com/sql/sql_insert_into_sele...
Or import the file using the MySQL Shell utility.
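A minimal sketch of the yield/buffer approach described above, assuming a hypothetical `hotels` table with a JSON column `doc`; the raw lines go into the column without json_decode(), and each flush is a single multi-row INSERT:

<?php

// Read the JSONL file lazily, one line at a time
function readLines(string $path): \Generator
{
    $fp = fopen($path, 'r');
    while (($line = fgets($fp)) !== false) {
        $line = trim($line);
        if ($line !== '') {
            yield $line;
        }
    }
    fclose($fp);
}

$pdo = new PDO('mysql:host=localhost;dbname=test;charset=utf8mb4', 'user', 'password', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

// Flush the buffer with one multi-row INSERT; MySQL validates the JSON itself
$flush = function (array $buffer) use ($pdo): void {
    $placeholders = implode(', ', array_fill(0, count($buffer), '(?)'));
    $pdo->prepare("INSERT INTO hotels (doc) VALUES $placeholders")
        ->execute($buffer);
};

$buffer = [];
foreach (readLines('big-file.json') as $line) {
    $buffer[] = $line;
    if (count($buffer) >= 1000) {
        $flush($buffer);
        $buffer = [];
    }
}
if ($buffer !== []) {
    $flush($buffer); // leftover lines
}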