PHP
microf, 2020-05-04 08:09:44

How to avoid adding duplicates in MySQL when importing from CSV?

Good afternoon. I am importing a large table from CSV, about 2,050,000 rows. I split the file in EmEditor into chunks of at most 10,000 lines each and import them with a simple script via exec:

for ($i = 1; $i <= 205; $i++) {
    $fileNameWithPrefix = "data-2020-04-05_$i.csv";
    addDataToTable($dbconn, $table_name, $fileNameWithPrefix);
}

function addDataToTable($dbt, $table, $file) {
    if ($file) {
        try {
            $fieldseparator = ";";
            $lineseparator = " "; // defined but never used in the statement below
            $dbt->exec(
                "LOAD DATA LOCAL INFILE "
                . $dbt->quote($file)
                . " INTO TABLE `$table` FIELDS TERMINATED BY "
                . $dbt->quote($fieldseparator)
            );
            echo 'Data added';
            return;
        } catch (PDOException $e) {
            die('Nothing added: ' . $e->getMessage());
        }
    } else {
        echo 'No file';
    }
}
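For reference, with the separator above, the statement each iteration sends to MySQL ends up roughly like this (the table name below is a placeholder for whatever $table_name holds; since $lineseparator is never put into the query, MySQL falls back to its default line terminator):

LOAD DATA LOCAL INFILE 'data-2020-04-05_1.csv'
INTO TABLE `mytable`
FIELDS TERMINATED BY ';';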

And I end up with 3,600,000 rows in the table, so something is being loaded more than once. There are no duplicates in the CSV; they appear only in the database.
Is this just because I hacked the script together on my knee with $i++, or is there something I don't know about LOAD DATA INFILE?
Before loading I drop the table and create it anew.
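For what it's worth, one common way to make such an import idempotent is to declare a UNIQUE key and load with the IGNORE keyword, so rows that arrive a second time are silently skipped instead of duplicated. This is only a sketch: it assumes the table has a column (or combination of columns) that uniquely identifies a row, here hypothetically called `id`, and `mytable` again stands in for the real table name:

-- assumed: `id` uniquely identifies a row in the CSV data
ALTER TABLE `mytable` ADD UNIQUE KEY `uq_id` (`id`);

LOAD DATA LOCAL INFILE 'data-2020-04-05_1.csv'
IGNORE INTO TABLE `mytable`
FIELDS TERMINATED BY ';';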
