PHP
davin4u, 2016-07-14 11:34:07

How to optimize a MySQL query, and is it necessary?

Good afternoon, I want to ask more experienced colleagues for advice :)
There is a database of companies: a main table with company data, a table with staff, a table with employee data, and a couple more tables with additional data, about 6 tables in total. The main companies table has 1.5 million records; the others have from 1 to 2.5 million each.
Task: the user enters keywords, and the system must search all these tables for matches (LIKE and MATCH are used; yes, I have heard that full-text search does not work well in MySQL, but alas, that was in place before me). The algorithm works like this: first one keyword is searched for and everything irrelevant is discarded, then the next keyword is searched for among what was found earlier.
I tried several approaches:
1. Used LEFT JOIN to join all the tables involved (this was effective, but produced duplicate results, and trying to remove them with GROUP BY or DISTINCT significantly increased execution time)
2. Used nested subqueries
3. Split the query into pieces: searched for each keyword in turn and, in PHP, substituted the intermediate result into the next query via WHERE IN (...) (see the sketch below)
Well, and various other variations.
The best we managed to get it down to is 30 seconds.
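A simplified sketch of approach 3; the table and column names here (companies, staff, name, full_name) are illustrative, not the real schema:

-- Step 1: collect company ids that match the first keyword
SELECT DISTINCT c.id
FROM companies c
LEFT JOIN staff s ON s.company_id = c.id
WHERE c.name LIKE '%keyword1%'
   OR s.full_name LIKE '%keyword1%';

-- Step 2: PHP substitutes the ids from step 1 and repeats the search
-- for the next keyword only among those companies
SELECT DISTINCT c.id
FROM companies c
LEFT JOIN staff s ON s.company_id = c.id
WHERE c.id IN (101, 202, 303)   -- ids collected in step 1 (example values)
  AND (c.name LIKE '%keyword2%' OR s.full_name LIKE '%keyword2%');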
If there is someone who is well versed in this and has a lot of experience, I will be VERY glad for advice
I am interested in the following:
1. Should MySQL, in general, cope adequately with this volume of data? It does not seem that large to me.
2. How do experienced people usually handle such situations? Is the query itself optimized, and if so, by what methods?
3. Or maybe MySQL simply cannot search quickly across this many tables at this volume (which I doubt), and other search tools such as ElasticSearch are needed?


5 answers
riot26, 2016-01-29
@riot26

code.stephenmorley.org/php/sending-files-using-curl

Oleg Krasavin, 2016-01-29
@okwinza

$data = array( // $data is defined here...
    "file1" => "/image.png"
);
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $urlUploadServer);
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS, $postData); // ...but $postData is passed here
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$response = curl_exec($ch);

Maxim Fedorov, 2016-07-14
@davin4u

1. For MySQL this is not a large amount of data, provided everything is implemented correctly.
2. You need to choose the tool according to the task. Your task is to search for data across all the tables, and MySQL is not really suited for that. Use ElasticSearch or Sphinx; all attempts to get by with standard queries will not lead anywhere. Even if you optimize something now and it works faster, what will happen in a year, two, or five, when the data volume grows?

davin4u, 2016-07-14
@davin4u

I also found an interesting thing: views. As far as I understand, a view is a virtual table built from a SELECT over other tables. What I did not understand is whether that selection is kept in memory all the time, which could help speed things up, or whether it is just a mechanism to simplify working with the database, i.e. at the moment the view is queried, the same query that was used to create the view is executed and its result is expanded?
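For reference: MySQL views are not materialized; each query against a view re-executes (or merges in) its defining SELECT, so by itself a view does not cache or precompute anything. A minimal sketch, again with hypothetical table names:

-- Hypothetical view over the companies and staff tables
CREATE VIEW company_search AS
SELECT c.id, c.name, s.full_name
FROM companies c
LEFT JOIN staff s ON s.company_id = c.id;

-- Each query against the view runs the SELECT above anew;
-- nothing is kept in memory between requests.
SELECT id FROM company_search WHERE name LIKE '%keyword%';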

napa3um, 2016-07-14
@napa3um

www.mysql.ru/docs/man/Fulltext_Search.html, if you do not want to set up a separate ElasticSearch (which has somewhat more full-text search features that you will not need for your task anyway).
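A minimal sketch of the built-in full-text search the link describes; table and column names are illustrative:

-- One-time: a full-text index on the searched column
-- (MyISAM always supported FULLTEXT; InnoDB supports it since MySQL 5.6)
ALTER TABLE companies ADD FULLTEXT INDEX ft_companies_name (name);

-- MATCH ... AGAINST can use that index, unlike LIKE '%...%', which cannot use an index at all
SELECT id, name
FROM companies
WHERE MATCH(name) AGAINST('+keyword1 +keyword2' IN BOOLEAN MODE);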
