Vadim Yakovlev, 2015-02-21 23:00:18
Cisco

Problem with Logstash processing logs from a Cisco ASA. Tips, ideas?

Given:
Ubuntu Server 14.04 LTS
Logstash 1.4.2
OpenJDK Runtime Environment (IcedTea 2.5.4) (7u75-2.5.4-1~trusty1)
ElasticSearch 1.4.4
Kibana 4.0
Logstash config:

input {
  tcp {
    type => "syslog"
    port => 5140
  }
}

input {
  udp {
    type => "syslog"
    port => 5140
  }
}

input {
  # Receive Cisco ASA logs on UDP port 5141
  udp {
    port => 5141
    type => "cisco-asa"
  }
}

filter {
  if [type] == "cisco-asa" {
    # Split the syslog part and Cisco tag out of the message
    grok {
      match => ["message", "%{CISCO_TAGGED_SYSLOG} %{GREEDYDATA:cisco_message}"]
    }

    # Parse the syslog severity and facility
    syslog_pri { }

    # Parse the date from the "timestamp" field to the "@timestamp" field
    date {
      match => ["timestamp",
        "MMM dd HH:mm:ss",
        "MMM  d HH:mm:ss",
        "MMM dd yyyy HH:mm:ss",
        "MMM  d yyyy HH:mm:ss"
      ]
    }

    # Clean up redundant fields if parsing was successful
    if "_grokparsefailure" not in [tags] {
      mutate {
        rename => ["cisco_message", "message"]
        remove_field => ["timestamp"]
      }
    }

    # Extract fields from each of the detailed message types
    # The patterns provided below are included in Logstash since 1.2.0
    grok {
      match => [
        "message", "%{CISCOFW106001}",
        "message", "%{CISCOFW106006_106007_106010}",
        "message", "%{CISCOFW106014}",
        "message", "%{CISCOFW106015}",
        "message", "%{CISCOFW106021}",
        "message", "%{CISCOFW106023}",
        "message", "%{CISCOFW106100}",
        "message", "%{CISCOFW110002}",
        "message", "%{CISCOFW302010}",
        "message", "%{CISCOFW302013_302014_302015_302016}",
        "message", "%{CISCOFW302020_302021}",
        "message", "%{CISCOFW305011}",
        "message", "%{CISCOFW313001_313004_313008}",
        "message", "%{CISCOFW313005}",
        "message", "%{CISCOFW402117}",
        "message", "%{CISCOFW402119}",
        "message", "%{CISCOFW419001}",
        "message", "%{CISCOFW419002}",
        "message", "%{CISCOFW500004}",
        "message", "%{CISCOFW602303_602304}",
        "message", "%{CISCOFW710001_710002_710003_710005_710006}",
        "message", "%{CISCOFW713172}",
        "message", "%{CISCOFW733100}"
      ]
    }
  }
}


#Syslog
filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}

output {
  elasticsearch_http { host => "localhost" }
}

Problem:
Excerpt from /var/log/logstash/logstash.log:
{:timestamp=>"2015-02-21T23:37:01.319000+0400", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Feb 21 2015 22:37:01", :exception=>java.lang.IllegalArgumentException: Invalid format: "Feb 21 2015 22:37:01", :level=>:warn}
{:timestamp=>"2015-02-21T23:37:01.334000+0400", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Feb 21 2015 22:37:01", :exception=>java.lang.IllegalArgumentException: Invalid format: "Feb 21 2015 22:37:01", :level=>:warn}
{:timestamp=>"2015-02-21T23:37:01.370000+0400", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Feb 21 2015 22:37:01", :exception=>java.lang.IllegalArgumentException: Invalid format: "Feb 21 2015 22:37:01", :level=>:warn}
{:timestamp=>"2015-02-21T23:37:01.388000+0400", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Feb 21 2015 22:37:01", :exception=>java.lang.IllegalArgumentException: Invalid format: "Feb 21 2015 22:37:01", :level=>:warn}
{:timestamp=>"2015-02-21T23:37:01.458000+0400", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Feb 21 2015 22:37:01", :exception=>java.lang.IllegalArgumentException: Invalid format: "Feb 21 2015 22:37:01", :level=>:warn}
{:timestamp=>"2015-02-21T23:37:01.461000+0400", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Feb 21 2015 22:37:01", :exception=>java.lang.IllegalArgumentException: Invalid format: "Feb 21 2015 22:37:01", :level=>:warn}
{:timestamp=>"2015-02-21T23:37:01.533000+0400", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Feb 21 2015 22:37:01", :exception=>java.lang.IllegalArgumentException: Invalid format: "Feb 21 2015 22:37:01", :level=>:warn}
{:timestamp=>"2015-02-21T23:37:01.540000+0400", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Feb 21 2015 22:37:01", :exception=>java.lang.IllegalArgumentException: Invalid format: "Feb 21 2015 22:37:01", :level=>:warn}
{:timestamp=>"2015-02-21T23:37:01.595000+0400", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Feb 21 2015 22:37:01", :exception=>java.lang.IllegalArgumentException: Invalid format: "Feb 21 2015 22:37:01", :level=>:warn}
{:timestamp=>"2015-02-21T23:37:01.650000+0400", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Feb 21 2015 22:37:01", :exception=>java.lang.IllegalArgumentException: Invalid format: "Feb 21 2015 22:37:01", :level=>:warn}
{:timestamp=>"2015-02-21T23:37:01.658000+0400", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Feb 21 2015 22:37:01", :exception=>java.lang.IllegalArgumentException: Invalid format: "Feb 21 2015 22:37:01", :level=>:warn}

The same warning keeps repeating, and after three days I still cannot figure out where my mistake is.
The logs themselves are visible in Kibana. Nothing besides the ASA sends logs to this host. Any advice, or at least a hint on where to dig, would be appreciated.
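
For what it's worth, a stripped-down config along these lines could feed one raw ASA line through the same grok and date filters via stdin and print the result (just a sketch; the elasticsearch output is swapped for a stdout output with the rubydebug codec):

input {
  stdin {
    type => "cisco-asa"
  }
}

filter {
  if [type] == "cisco-asa" {
    # Same split as in the main config: syslog header and Cisco tag, rest into cisco_message
    grok {
      match => ["message", "%{CISCO_TAGGED_SYSLOG} %{GREEDYDATA:cisco_message}"]
    }
    # Same candidate date formats as in the main config
    date {
      match => ["timestamp",
        "MMM dd HH:mm:ss",
        "MMM  d HH:mm:ss",
        "MMM dd yyyy HH:mm:ss",
        "MMM  d yyyy HH:mm:ss"
      ]
    }
  }
}

output {
  # Print parsed events to the console instead of sending them to Elasticsearch
  stdout { codec => rubydebug }
}

Run it with bin/logstash -f asa-test.conf (the file name is arbitrary) and paste one raw line exactly as the ASA sends it; the resulting @timestamp and tags fields show whether the date patterns matched.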

2 answers
Armenian Radio, 2015-02-21
@gbg

It says right there that the date format is not correct. Try writing the year marker in capital letters.
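
Assuming that refers to the Joda-Time tokens the date filter uses, the change would look roughly like this (a sketch only, not tested against your setup):

filter {
  date {
    # Year marker written in capitals (Joda-Time "year of era" token)
    match => ["timestamp",
      "MMM dd HH:mm:ss",
      "MMM  d HH:mm:ss",
      "MMM dd YYYY HH:mm:ss",
      "MMM  d YYYY HH:mm:ss"
    ]
  }
}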

Valentin, 2015-02-21
@vvpoloskin

I look at the config and see the date formats specified for parsing. Then I look at the error output, and indeed there is not a single pattern in which the date would begin with the year.
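
If it helps to check that, one way (just a sketch) is to trim the match list down to the single pattern that is shaped like the logged value and see whether the warning disappears:

filter {
  date {
    # The logged value is "Feb 21 2015 22:37:01": month, two-digit day, year, time
    match => ["timestamp", "MMM dd yyyy HH:mm:ss"]
  }
}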
