MySQL
Alexey, 2016-02-29 14:19:53

Choosing a DBMS for large data?

Actually, the question is about choosing a DBMS, or about optimizing the current one.
Right now everything runs on MySQL.
In 3 months one table (InnoDB) has grown to 7 GB, and queries take a very long time.
It stores every click on ads, and the volume will keep growing.
The current structure is: id, campaign, banner, user, date, utm_source, utm_campaign, etc.
When selecting by an arbitrary date range (for example, a month) and grouping by the campaign, banner, user, and date fields, the query takes 30 seconds or more.
Index: date, campaign, banner, user.
I tried moving everything to MongoDB; the result is not much better, about 6-10 seconds, even though I did not transfer the utm tag data.
The data must be stored for at least 3 months, ideally half a year.
The rest of the data is aggregated into a summary table, but that part is simple and raises no questions.
If anyone has experience with this, please share. Links to relevant articles are welcome.
VPS server: Ubuntu 15.10, 4 cores, 16 GB RAM, 1000 GB SSD
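The slow query described above probably looks something like this (a hedged reconstruction; the table name `clicks` and the exact column list are assumptions based on the description, not the author's actual schema):

```sql
-- Hypothetical shape of the slow query: arbitrary date range,
-- grouped by campaign, banner, user, and date.
SELECT campaign, banner, user, date, COUNT(*) AS clicks
FROM clicks
WHERE date BETWEEN '2016-01-01' AND '2016-01-31'
GROUP BY campaign, banner, user, date;
```

With the index (date, campaign, banner, user), the range condition on the leading `date` column prevents the rest of the index from helping the GROUP BY, which is one plausible reason the query scans so much.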


4 answer(s)
Sergey, 2016-02-29
Protko @Fesor

Did you set up indexes? Did you run EXPLAIN on the queries? Something doesn't add up here.
For your task, MySQL plus some tuning is enough. Also, you can store ready-made aggregations instead of recomputing them on every request.
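Both suggestions can be sketched concretely (an illustrative sketch under assumptions: the table/column names follow the question, and the summary-table name and schema are hypothetical, not the author's actual setup):

```sql
-- 1) Check the query plan to see whether the index is actually used:
EXPLAIN SELECT campaign, banner, user, date, COUNT(*)
FROM clicks
WHERE date BETWEEN '2016-01-01' AND '2016-01-31'
GROUP BY campaign, banner, user, date;

-- 2) Keep a ready-made daily aggregation instead of scanning raw clicks
--    every time; refresh it periodically (e.g. from cron).
--    Name and column types are assumptions.
CREATE TABLE clicks_daily (
  date     DATE NOT NULL,
  campaign INT  NOT NULL,
  banner   INT  NOT NULL,
  user     INT  NOT NULL,
  clicks   INT  NOT NULL,
  PRIMARY KEY (date, campaign, banner, user)
);

INSERT INTO clicks_daily (date, campaign, banner, user, clicks)
SELECT date, campaign, banner, user, COUNT(*)
FROM clicks
WHERE date = CURDATE() - INTERVAL 1 DAY
GROUP BY date, campaign, banner, user;
```

Monthly reports then read from `clicks_daily`, which stays orders of magnitude smaller than the raw click log.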

beduin01, 2016-02-29
@beduin01

Look at this database https://www.arangodb.com/

Dimonchik, 2016-02-29
@dimonchik2013

Use what you already know:
indexes, background calculation, storing frequently requested data in separate tables.
Longer term, it's worth getting comfortable with Postgres, of course; it's suitable for almost all tasks, but mastering it takes time.
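On the retention requirement from the question (keep 3-6 months of data), one technique worth naming explicitly is MySQL range partitioning by month: date-range scans touch only the relevant partitions, and expiring an old month is a cheap metadata operation. A sketch under assumptions (the table name is from the question's description; note that MySQL requires the partitioning column to be part of every unique key, including the primary key, so this assumes a primary key like (id, date)):

```sql
-- Illustrative monthly partitioning; partition names and boundaries
-- are examples, not the author's actual setup.
ALTER TABLE clicks
PARTITION BY RANGE (TO_DAYS(date)) (
  PARTITION p201512 VALUES LESS THAN (TO_DAYS('2016-01-01')),
  PARTITION p201601 VALUES LESS THAN (TO_DAYS('2016-02-01')),
  PARTITION p201602 VALUES LESS THAN (TO_DAYS('2016-03-01')),
  PARTITION pmax    VALUES LESS THAN MAXVALUE
);

-- Dropping a whole expired month is nearly instant:
ALTER TABLE clicks DROP PARTITION p201512;
```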

un1t, 2016-02-29
@un1t

7 GB is not much. You are almost certainly missing indexes.
