What database is better to use for logging and reporting?
A small aside: I haven't done development in a long time (I moved over to support, so I'm not up on current trends, etc.), but I have some ideas. I'm asking for advice.
The project has one task (an idea):
we need to write a lot of log records from various sources (the records vary depending on the number of "sensors"). After some time, say a month, the data should be deleted. Over a month one group can accumulate anywhere from 9k records (1 source) to roughly 90k (10 sources), depending on how many sources it has; sources can appear and disappear, and there can be N such groups (steady growth is expected, how much is unknown, but assume at least up to 500). From the collected data we need to build reports by groups of sources (by source group type), and on the client push the resulting collections into charts (graphs). There are two kinds of queries: everything in the retention period for a group (into a graph), and the current readings.
Something like this.
What is better to use for this: MySQL, MongoDB, something else?
If MongoDB, what indexes should be created at all, and is it even worth it? For now I'm only thinking of a TTL index (expireAfterSeconds); see the sketch below.
PS: Insert speed is what matters; selects will happen much less often than inserts.
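A minimal sketch of that TTL setup with pymongo, assuming a local MongoDB; the database, collection, and field names (telemetry, sensor_readings, group_id, created_at) are invented for illustration:

```python
# pip install pymongo
from datetime import datetime, timezone

from pymongo import ASCENDING, MongoClient

client = MongoClient("mongodb://localhost:27017")  # assumed local instance
db = client["telemetry"]          # hypothetical database name
readings = db["sensor_readings"]  # hypothetical collection name

# TTL index: MongoDB's background task removes documents once
# `created_at` is older than expireAfterSeconds (~30 days here).
readings.create_index(
    [("created_at", ASCENDING)],
    expireAfterSeconds=30 * 24 * 3600,
)

# A compound index to serve both read patterns (whole retention
# period per group, and latest readings) without full scans.
readings.create_index([("group_id", ASCENDING), ("created_at", ASCENDING)])

# Inserts just need to carry the indexed timestamp.
readings.insert_one({
    "group_id": 42,
    "source_id": 7,
    "value": 23.5,
    "created_at": datetime.now(timezone.utc),
})
```

Note that the TTL monitor runs roughly once a minute, so expired documents disappear with some delay; for log retention that's usually fine.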
Postgres for everything
That works out to only about 100M records per month.
Insert performance suffers in any database if the disk can't keep up with the incoming write stream.
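Since retention is a fixed window, time-based partitioning in Postgres fits naturally: dropping a month-old partition is a near-instant metadata operation, far cheaper than a mass DELETE plus VACUUM. A rough sketch with psycopg2; the table, columns, month names, and connection string are assumptions, not from the question:

```python
# pip install psycopg2-binary
import psycopg2

conn = psycopg2.connect("dbname=telemetry user=app")  # assumed connection
conn.autocommit = True
cur = conn.cursor()

# Parent table partitioned by the record timestamp.
cur.execute("""
    CREATE TABLE IF NOT EXISTS sensor_log (
        group_id   int         NOT NULL,
        source_id  int         NOT NULL,
        value      double precision,
        created_at timestamptz NOT NULL
    ) PARTITION BY RANGE (created_at);
""")

# One partition per month (example month shown); an index per
# partition serves both "whole period per group" and "current
# readings" queries.
cur.execute("""
    CREATE TABLE IF NOT EXISTS sensor_log_2024_05
        PARTITION OF sensor_log
        FOR VALUES FROM ('2024-05-01') TO ('2024-06-01');
""")
cur.execute("""
    CREATE INDEX IF NOT EXISTS sensor_log_2024_05_grp_ts
        ON sensor_log_2024_05 (group_id, created_at);
""")

# Retention: drop the previous month's partition instead of DELETE.
cur.execute("DROP TABLE IF EXISTS sensor_log_2024_04;")
```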
90k records per group per month? With 500 groups that's 45,000,000 records a month?
Yes, anything from plain MySQL to the trendy ClickHouse will handle that.
As for Mongo, well, give it a try, but it seems to me its performance is actually quite poor for this.
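For completeness, a sketch of what the ClickHouse variant could look like: a MergeTree table with a TTL clause handles the one-month retention by itself. The names and the local connection are assumptions:

```python
# pip install clickhouse-driver
from datetime import datetime

from clickhouse_driver import Client

client = Client("localhost")  # assumed local ClickHouse server

# MergeTree ordered by (group_id, created_at) makes both read
# patterns cheap range scans; TTL expires rows after a month.
client.execute("""
    CREATE TABLE IF NOT EXISTS sensor_log (
        group_id   UInt32,
        source_id  UInt32,
        value      Float64,
        created_at DateTime
    )
    ENGINE = MergeTree
    ORDER BY (group_id, created_at)
    TTL created_at + INTERVAL 1 MONTH
""")

# ClickHouse prefers batched inserts; insert rows in large chunks
# rather than one at a time to get its write throughput.
client.execute(
    "INSERT INTO sensor_log (group_id, source_id, value, created_at) VALUES",
    [(42, 7, 23.5, datetime.utcnow())],
)
```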