Vyacheslav Uspensky, 2020-02-17 03:45:37
MySQL

Are there ways to optimize a large table with sparse data?

The situation is as follows: there is a table with 200 million rows and four columns (CHAR(32), CHAR(32), CHAR(32), CHAR(64)).
The first three fields are UIDs; the last is a text identifier.
Queries select on all fields, and the key length is 127.
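For reference, the layout described above can be sketched as DDL. The table and column names here are hypothetical (the question gives only the column types), and the composite key is one plausible reading of "sampling by all fields":

```sql
-- Hypothetical schema matching the description:
-- three CHAR(32) UID columns plus a CHAR(64) text identifier.
CREATE TABLE events (
    uid_a   CHAR(32) NOT NULL,
    uid_b   CHAR(32) NOT NULL,
    uid_c   CHAR(32) NOT NULL,
    text_id CHAR(64) NOT NULL,
    KEY idx_all (uid_a, uid_b, uid_c, text_id)  -- wide composite key
) ENGINE=InnoDB;
```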

The point is that selection from this table is done through a WHERE EXISTS against about 13 thousand rows.
You can imagine the size of an index built on such keys. MySQL has no analogue of hstore or jsonb, and no partial indexes; partitioning doesn't help either. I've racked my brains over how to get out of this.
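The access pattern described (WHERE EXISTS against roughly 13 thousand rows) presumably looks something like the sketch below; the `candidates` table and the join columns are assumptions, since the question does not show the actual query:

```sql
-- Sketch: filter the big table by a ~13k-row candidate set.
SELECT e.text_id
FROM events e               -- hypothetical 200M-row table
WHERE EXISTS (
    SELECT 1
    FROM candidates c       -- hypothetical ~13k-row table
    WHERE c.uid_a = e.uid_a
      AND c.uid_b = e.uid_b
      AND c.uid_c = e.uid_c
);
```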


1 answer(s)
xmoonlight, 2020-02-17

1. Create an index on each field.
2. Optimize the query so that substring comparisons are evaluated last, and use LIMIT when fetching the data.
And show an example query...
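A minimal sketch of what this answer suggests, assuming the hypothetical `events` table from the question: one single-column index per field, equality predicates narrowing the set before any substring check, and LIMIT on the fetch. The literal values are placeholders:

```sql
-- One index per field, as suggested in point 1:
CREATE INDEX idx_uid_a ON events (uid_a);
CREATE INDEX idx_uid_b ON events (uid_b);
CREATE INDEX idx_uid_c ON events (uid_c);

-- Point 2: cheap indexed equality first, substring match last, result capped:
SELECT text_id
FROM events
WHERE uid_a = '0123456789abcdef0123456789abcdef'  -- placeholder UID
  AND text_id LIKE '%foo%'   -- substring check runs only on the narrowed set
LIMIT 100;
```

Note that a `LIKE '%...%'` pattern cannot use a B-tree index itself, which is why the indexed equality conditions should do the narrowing first.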
