How to optimize parsing of large csv files?
There is a 300 MB CSV file with 200,000 lines of the form:
text text; "text in double quotes; text in double quotes; text in double quotes"; text; text
text text; "text in double quotes; text in double quotes; text in double quotes"; text; text; text; text
text text; text; text; "text in double quotes;text in double quotes; text in double quotes"; text; text
$handle = fopen($file_save . $name_csv, "r");
$data_import = array();
if (empty($handle) === false) {
    // every parsed row is accumulated in memory, so the array grows to the size of the whole file
    while (($data = fgetcsv($handle, 0, ";")) !== FALSE) {
        array_push($data_import, $data);
    }
    fclose($handle);
}
Considering that you then push this data into a database, insert each row into the database as soon as it is read and do the rest of the logic in SQL. That will be orders of magnitude faster and far less memory-hungry.
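A minimal sketch of that approach, assuming a PDO connection and a hypothetical table import with columns col1..col5 (the connection details, table and column names are illustrative, not taken from the question):

// Sketch: insert each row as it is read instead of collecting all rows in memory.
// $pdo, the table `import` and its columns col1..col5 are illustrative assumptions.
$pdo  = new PDO('mysql:host=localhost;dbname=test;charset=utf8mb4', 'user', 'password');
$stmt = $pdo->prepare('INSERT INTO import (col1, col2, col3, col4, col5) VALUES (?, ?, ?, ?, ?)');

$handle = fopen($file_save . $name_csv, "r");
if ($handle !== false) {
    $pdo->beginTransaction();                       // a single transaction speeds up bulk inserts
    while (($data = fgetcsv($handle, 0, ";")) !== false) {
        // the sample rows have a varying number of fields, so trim/pad each row to 5 values
        $stmt->execute(array_pad(array_slice($data, 0, 5), 5, null));
    }
    $pdo->commit();
    fclose($handle);
}

With this layout PHP never holds more than one row at a time, and the database does the heavy lifting.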
Read about generators: "Why do we need Generators (yield) in php?" (I gave that link only as an example, it was simply the first one I found, and it happens to show CSV processing there as well).
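A minimal sketch of a generator-based reader, reusing the $file_save and $name_csv variables from the question (the function name readCsvRows is just an illustration):

// Sketch: a generator yields one parsed row at a time, so memory use stays flat
// no matter how large the file is. The function name readCsvRows is illustrative.
function readCsvRows(string $path, string $separator = ';'): Generator
{
    $handle = fopen($path, 'r');
    if ($handle === false) {
        return;
    }
    try {
        while (($row = fgetcsv($handle, 0, $separator)) !== false) {
            yield $row;
        }
    } finally {
        fclose($handle);                            // runs even if the consumer stops early
    }
}

// Usage: process each row as it arrives (e.g. insert it into the database).
foreach (readCsvRows($file_save . $name_csv) as $row) {
    // handle $row here
}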