How to deal with very large data in MySQL?
Hello, dear programmers.
--
Background: I somehow ended up helping a customer with a table in CSV. I especially felt sorry for the poor fellow who had compiled that table.
Perhaps I won't surprise anyone, but in 6 years of programming for the web, this table seemed to me too big for MySQL.
I decided to take a screenshot so there would be no talk of me making it up, or of this being impossible, and that's how I found out that the column limit in MySQL is 1500 columns, against my 137. Doesn't seem like much, but it was enough for me. To say that I was shocked is to say nothing. Naturally, I converted the Excel table to CSV the old-fashioned way, but a surprise was waiting for me even before I got to phpMyAdmin )) - the table was in windows-1251 encoding!!!
Naturally, it looked more like a message from aliens than a table of goods.
After converting it to UTF-8, intelligible Russian letters appeared. Then I ran into the limit of 300 seconds.
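(An aside: as far as I know, MySQL can decode windows-1251 files directly during import, so the manual conversion step can be skipped entirely. A minimal sketch, assuming a hypothetical table goods and a semicolon-delimited file goods.csv with Windows line endings:

    LOAD DATA LOCAL INFILE 'goods.csv'
    INTO TABLE goods
    CHARACTER SET cp1251  -- decode the file as windows-1251
    FIELDS TERMINATED BY ';' OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\r\n'
    IGNORE 1 LINES;       -- skip the header row

The CHARACTER SET clause only tells the server how to decode the file; the rows are stored in whatever character set the table's columns use. Note that LOCAL has to be allowed on both client and server, via local_infile=1.)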
How was I to know there were 76 thousand rows!!! Maybe that's not so scary for some, but the table weighed almost 50 MB. Colleagues in the trade reassured me: you haven't worked with a 40 TB database yet!!!
Damn optimists.
Well, OK, I thought, since the limit is 300 seconds, I rummaged through my OpenServer bookmarks and opened the "MySQL Manager" program, a lightweight one that has helped me out more than once. However, to upload the CSV, the database needed a table to upload into, and there wasn't one. In phpMyAdmin, when I used to upload Excel files for the OpenCart CMS, a new table was created right away without me creating anything; it only remained to set the column names and their properties. Great, so I had to make the table manually, all 137 columns; for some reason the manager refused to create them and kept spitting out errors, and there was no time to figure out why. Only then did I load in the 76 thousand rows of data. What am I, sick or something, to fill in that much by hand? )))
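(For the record, a table like that can also be created with a script rather than by hand. A minimal sketch with made-up column names, since the real 137 are unknown here:

    CREATE TABLE goods (
      id    INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
      sku   VARCHAR(64),
      name  VARCHAR(255),
      price DECIMAL(10,2)
      -- ... and so on for the remaining columns
    ) DEFAULT CHARACTER SET utf8mb4;

Once the table exists, a single LOAD DATA statement like the one sketched above pulls in all 76 thousand rows in one go; nothing needs to be typed in by hand.)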
As a result, I fiddled with this table for half an hour. Don't scold me too much; it was the first time I had run into such a volume, but I got through it thanks to my experience. Nothing is impossible for a person, especially a stubborn one.
QUESTION: Is there an adequate solution to problems like these? Recommend some programs. And can I upload the finished table to the server (production, not local) using "MySQL Manager", or will I have to fire up the PuTTY console?
Current versions of phpMyAdmin easily get around the 300-second limit by breaking the data insertion into parts. If your file still doesn't manage to upload within 300 seconds, you can try packing it into gz, for example. And the data here is not "very large" but actually quite small (the number of columns is, IMHO, not entirely adequate, but that doesn't really change anything).
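(To illustrate what "breaking the insertion into parts" means, a sketch with the same hypothetical goods table: instead of one giant statement carrying all the rows, the import is issued as many smaller INSERTs, each finishing well within the limit:

    INSERT INTO goods (sku, name, price) VALUES
      ('A-0001', 'First item', 10.00),
      ('A-0002', 'Second item', 12.50);
    -- ... next batch as a separate statement ...
    INSERT INTO goods (sku, name, price) VALUES
      ('A-0003', 'Third item', 9.90);

With autocommit on, each statement commits on its own, so a timeout mid-import loses only the current batch rather than the whole file.)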
HeidiSQL is suitable for working with large tables, at least compared to phpMyAdmin.
Also, MariaDB.
I've already tried Heidi; it's a fast program: it edits and runs SQL queries quickly, doesn't lag, has a clear interface, and can connect to a real server, even over a secure connection.
I haven't tried Maria yet, but I've often heard about it from people who work with large amounts of data.
Heidi opens tables much faster than the browser-based phpMyAdmin.
The pluses include the following:
You don't need to re-enter your login and password every 1440 seconds.
Operations are performed instantly, and at the bottom of the program you can see every query that has been executed.
It's easier to copy the elements you need.
You can connect to multiple servers in one window.
You can connect via an SSH tunnel or configure SSL.
You can manage user privileges.
You can import text files.
You can export table rows as CSV, HTML, XML, SQL, LaTeX, Wiki Markup, and PHP Array.