How to optimize PDO to work with MySQL tables of 45,000,000 rows?
I have a table with 45 million rows. I need to split it by a field that stores the year the record was created (there are about 6-7 distinct years in total, I don't remember exactly, and the number of records per year varies). I do all of this with PDO, and there is not enough memory. None of the workarounds I found online helped. php.ini is already tuned, both from the script side and in the actual config. Can you suggest anything for solving this kind of problem? Unfortunately, I cannot increase the amount of memory; at most I could try to bring up memcached, but I'm not sure that's a good idea, since a single row of the result carries quite a lot of data. I also tried playing with query buffering, but
->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false)
does not help. Tell me, what do you do in such cases?
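For context, the relevant part of my setup looks roughly like this (a simplified sketch; the connection details and the table and column names are placeholders, not my real schema):

```php
<?php
// Placeholder connection; real credentials and DSN differ.
$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8mb4', 'user', 'pass', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

// Unbuffered mode: MySQL streams rows to the driver instead of
// handing over the whole result set at once.
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

$stmt = $pdo->query('SELECT id, year_field, data FROM big_table');

// Rows are consumed one at a time; calling fetchAll() here would
// still pull everything into PHP memory and defeat the purpose.
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    // process $row
}
```

Note that with an unbuffered result set the same connection cannot run other statements until all rows have been consumed, so writing into the per-year tables needs a second connection.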
Are you running out of memory because you are trying to select all 45 million rows at once?
Use LIMIT and a cursor.
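A minimal sketch of the chunked (LIMIT-based) approach, assuming an auto-increment primary key `id` and placeholder table/column names; the actual schema will differ:

```php
<?php
// Stream the table in fixed-size chunks using keyset pagination on the
// primary key, so memory stays bounded regardless of table size.
$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8mb4', 'user', 'pass', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

$chunkSize = 10000;
$lastId = 0;

$stmt = $pdo->prepare(
    'SELECT id, year_field, data
       FROM big_table
      WHERE id > :last_id
      ORDER BY id
      LIMIT ' . $chunkSize
);

do {
    $stmt->execute(['last_id' => $lastId]);
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);

    foreach ($rows as $row) {
        // e.g. route the row into a per-year table based on $row['year_field']
        $lastId = $row['id'];
    }
} while (count($rows) === $chunkSize);
```

Seeking by `id > :last_id` instead of using OFFSET keeps each chunk query fast even deep into the table; the unbuffered (cursor-style) read shown in the question is the other way to keep memory flat.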