How to process and analyze logs?
Good day
Modern applications generate a huge volume of logs, and a natural question arises: how do you analyze that much data? grep over files helps in plenty of cases, of course, but I would still like some kind of smart search that can surface correlations and dependencies.
Share your experience (or a link to someone else's): how do you solve this problem?
It seems this article has something that might help you:
habrahabr.ru/post/150657/
... or maybe not)
This may be overkill for your case (a cannon against sparrows, as they say), but there is a separate class of systems for this: SIEM. Personally, I have worked with Cisco MARS appliances (end-of-sale has been announced, but they are still being sold) and RSA enVision.
The complexes mentioned above have a distributed architecture, with the option of placing local log collectors at remote sites.
These systems can not only raise an event when a particular message arrives, but also create incidents from chains of events. They ship with pre-built sets of "parsers" for turning messages into events, as well as templates for checking compliance with security standards such as PCI DSS.
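To make the "chains of events" idea concrete, here is a minimal Python sketch of a SIEM-style correlation rule. The event name, threshold, and window are made up for illustration, not taken from any real product: fire an incident when one source produces too many failed logins within a short window.

```python
import time
from collections import defaultdict, deque

WINDOW = 60      # seconds; hypothetical rule parameters
THRESHOLD = 5    # failures per window per source

recent = defaultdict(deque)  # source_ip -> timestamps of recent failures

def on_event(ts, event_type, source_ip):
    """Feed one already-parsed event into the correlation rule."""
    if event_type != "login_failed":
        return None
    q = recent[source_ip]
    q.append(ts)
    # drop timestamps that fell out of the sliding window
    while q and ts - q[0] > WINDOW:
        q.popleft()
    if len(q) >= THRESHOLD:
        q.clear()  # reset so the incident fires once per burst
        return f"INCIDENT: possible brute force from {source_ip}"
    return None

# usage: replay a burst of failures from one address
now = time.time()
for i in range(5):
    alert = on_event(now + i, "login_failed", "10.0.0.7")
    if alert:
        print(alert)
```

A real SIEM keeps many such rules and evaluates them over events coming from all collectors, but the mechanics are essentially this.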
Take any log analyzer, awstats or Sawmill for example (google them); everything works practically out of the box.
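For what it's worth, what those analyzers do is mostly counting. A rough Python sketch of the kind of aggregation awstats runs over an Apache/nginx combined-format access log (the file name is just an example):

```python
import re
from collections import Counter

# ip ident user [date] "METHOD url proto" status ...
LINE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(\S+) (\S+) [^"]*" (\d{3}) ')

urls, statuses = Counter(), Counter()
with open("access.log") as f:          # example path
    for line in f:
        m = LINE.match(line)
        if not m:
            continue
        ip, method, url, status = m.groups()
        urls[url] += 1
        statuses[status] += 1

print("top pages:", urls.most_common(10))
print("status codes:", statuses.most_common())
```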
Well, that is exactly what happens in SIEM systems, except there it is based on vendor-supplied parsers (plus you can add your own), and on top of that there are correlation rules that form an incident when needed.
But even once these systems are up and running there are a lot of false positives, so you have to keep tuning the system...
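A custom "parser" in the sense used above is just a rule that turns a raw log line into a structured event. A hypothetical sketch for sshd failure lines (the regex and field names are my own, not from any particular SIEM), producing events the correlation rule above could consume:

```python
import re

SSHD_FAIL = re.compile(
    r"Failed password for (?:invalid user )?(?P<user>\S+) "
    r"from (?P<ip>\S+)"
)

def parse(line):
    """Turn one raw sshd line into a structured event, or None."""
    m = SSHD_FAIL.search(line)
    if m:
        return {"type": "login_failed",
                "user": m.group("user"),
                "ip": m.group("ip")}
    return None  # unparsed lines are one big source of noise

print(parse("sshd[123]: Failed password for root from 10.0.0.7 port 22 ssh2"))
# -> {'type': 'login_failed', 'user': 'root', 'ip': '10.0.0.7'}
```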
* Flume
* Graylog2
* LogStash
* Kibana, a UI for LogStash
* Scribe
* Fluentd
* elasticsearch

I can't give more advice than that.
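To give an idea of how the elasticsearch part of that stack gets used, here is a minimal sketch that talks to its REST API directly. It assumes a node on localhost:9200 and an index named "logs" (both assumptions); in practice LogStash or Fluentd does the shipping and Kibana issues the queries for you.

```python
import json
import urllib.request

ES = "http://localhost:9200"

def post(path, payload):
    """POST a JSON payload to the elasticsearch REST API."""
    req = urllib.request.Request(
        ES + path,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    return json.load(urllib.request.urlopen(req))

# index one event (the shipper's job); refresh=true makes it
# immediately searchable, which you'd skip in production
post("/logs/_doc?refresh=true",
     {"ts": "2013-05-14T10:00:00",
      "host": "web1",
      "message": "connection timed out"})

# full-text search instead of grep over files
hits = post("/logs/_search",
            {"query": {"match": {"message": "timed out"}}})
for h in hits["hits"]["hits"]:
    print(h["_source"])
```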