Uploading Excel to a MySQL database on Laravel: how to import a large amount of data?
There is a Laravel site, and the customer wants to add bulk loading of data into the database from Excel spreadsheets.
I tried to wire in this package: www.maatwebsite.nl/laravel-excel/docs . Testing with a small amount of data (2 sheets of 20 records) works; testing with larger data (3 sheets of about 100 records) hangs badly and demands ever more memory and time. So I set memory_limit = 1024M and max_execution_time = 96000 in php.ini and went out to the store.
When I came back, the browser had loaded nothing, and in the Apache logs there was: PHP Fatal error: Allowed memory size of 1073741824 bytes exhausted (tried to allocate 72 bytes) in /var/www/html/instabasket/vendor/phpoffice/phpexcel/Classes/PHPExcel/Cell.php on line 583. And that was it.
The problem is that the customer has 2000 Excel records, and Apache dies at 300.
Has anyone faced this problem? What are the solutions?
Laravel Framework version 4.2.17
PHP 5.5.9-1ubuntu4.13 (cli)
Apache/2.4.7 (Ubuntu)
Ubuntu 14.04.3 LTS
I have worked with this package and did not run into such a problem.
Does the file have to be XLS? Maybe just convert it to CSV, read it line by line with fgetcsv, and write to the database? In my opinion, nothing is simpler; a sketch is below.
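A minimal sketch of that approach, assuming plain PDO, a hypothetical products table with name and price columns, and a comma-delimited CSV; the DSN, credentials, paths, and column mapping are all placeholders to adjust to your schema:

<?php
// Stream the CSV line by line with fgetcsv and run all inserts
// inside one transaction. Table, columns and DSN are assumptions.
$pdo = new PDO(
    'mysql:host=localhost;dbname=instabasket;charset=utf8',
    'user',
    'password'
);
$stmt = $pdo->prepare('INSERT INTO products (name, price) VALUES (?, ?)');

$handle = fopen('/var/www/uploads/import.csv', 'r');
if ($handle === false) {
    die('Cannot open CSV file');
}

$pdo->beginTransaction(); // one transaction instead of one autocommit per row
while (($row = fgetcsv($handle)) !== false) {
    $stmt->execute(array($row[0], $row[1]));
}
$pdo->commit();
fclose($handle);

Memory stays flat because only one row is held in memory at a time, so 2000 records or 2 million makes no difference.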
I'll join the chorus "for CSV". MySQL has a wonderful LOAD DATA INFILE command; for example, loading a file of 4.5 million (!) lines takes minutes.
In this scenario: upload the file to the server, run xls2csv, then run LOAD DATA with the path to the file. Profit. A sketch is below.
Keep in mind that on Ubuntu MySQL cannot read from every directory! You have to edit the AppArmor profile, but that's not a problem.
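The flow described above, as a rough sketch; the paths, table name, and column list are assumptions, and the xls2csv utility ships with the catdoc package:

<?php
// Convert the uploaded XLS to CSV on the server, then hand the
// CSV to MySQL. Paths, table and columns are placeholders.
$xls = '/var/www/uploads/import.xls';
$csv = '/var/www/uploads/import.csv';

// xls2csv comes from the catdoc package (apt-get install catdoc)
exec(sprintf('xls2csv %s > %s', escapeshellarg($xls), escapeshellarg($csv)));

$pdo = new PDO(
    'mysql:host=localhost;dbname=instabasket;charset=utf8',
    'user',
    'password',
    array(PDO::MYSQL_ATTR_LOCAL_INFILE => true) // needed for the LOCAL variant
);

// LOCAL makes the client send the file over the connection, which also
// sidesteps the AppArmor directory restriction mentioned above.
$pdo->exec("
    LOAD DATA LOCAL INFILE '/var/www/uploads/import.csv'
    INTO TABLE products
    FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
    LINES TERMINATED BY '\\n'
    (name, price)
");

Without LOCAL, the MySQL server reads the file itself, which requires the FILE privilege and a directory the server (and AppArmor) is allowed to read from.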
This question is asked here constantly; I alone have answered it twice already. Why don't you search? XLS -> CSV -> LOAD DATA [LOCAL] INFILE
Recently, for a project I'm working on, a parser was commissioned from another person: it took xlsx files as input, scanned each file in turn, went to the target site, parsed the values, and created a new file with the new values. At first he used PHPExcel, and naturally everything fell over almost immediately. It loads the file into memory in its entirety and won't release it no matter how you unset() things. With each file the free RAM shrinks noticeably; a 5 MB file turns into 200 MB in memory. In short, that didn't work, and he wired in https://github.com/box/spout
where everything is read as a stream (at least judging by the code) and there are no memory problems: https://github.com/box/spout#reader (a reader sketch follows at the end of this answer).
In the end I still had to convert everything to TSV and read it line by line, because there was not a single well-formed Excel file among them.
If the operation is one-time, you can use LOAD DATA INFILE, as advised above.
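For reference, a reader sketch along the lines of Spout's documented 2.x API (ReaderFactory with sheet and row iterators); the file path is a placeholder, and the README linked above has the authoritative API for whichever version you install:

<?php
require 'vendor/autoload.php';

use Box\Spout\Reader\ReaderFactory;
use Box\Spout\Common\Type;

// Spout streams the spreadsheet instead of loading it whole,
// so memory stays low even on large files.
$reader = ReaderFactory::create(Type::XLSX);
$reader->open('/var/www/uploads/import.xlsx');

foreach ($reader->getSheetIterator() as $sheet) {
    foreach ($sheet->getRowIterator() as $row) {
        // $row is an array of cell values; insert it into the DB here
    }
}

$reader->close();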
Well, actually, after converting to CSV, everything went like clockwork, yes.
Thank you all, gentlemen!