JSON
SchmeL, 2017-02-09 16:44:39

How to create a JSON template for Elasticsearch?

I have an ELK stack set up. The application (a Play Framework app) exposes its log over HTTP in JSON format.
Logstash reads it with the http_poller input, configured like this:

input {
  http_poller {
    urls => {
      finac => {
        method => get
        url => "http://192.168.1.100:9010/health/status"
        headers => {
          Accept => "application/json"
        }
      }
    }
    request_timeout => 20
    interval => 60
    tags => ["finac"]
    type => "app"
    codec => "json"
    metadata_target => "http_poller_metadata"
  }
}

filter {
  if "finac" in [tags] {
    json {
      source => "message"
    }
  }
}

output {
  if "finac" in [tags] {
    elasticsearch {
      codec => "json"
      hosts => "localhost:9200"
      index => "debug-%{+YYYY.MM.dd}"
    }
    stdout { codec => rubydebug }
  }
}

But when I try to write to Elasticsearch, I get the following error:
{:timestamp=>"2017-02-09T16:07:20.880000+0300", :message=>"Failed action. ", :status=>400, :action=>["index", {:_id=>nil,
..........
{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"mapper [play.CorePlugin.monitors.max] of different type, current_type [double], merged_type [long]"}}}}, :level=>:warn}

As I understand it, Elasticsearch cannot reconcile the two numeric types for that field and rejects the document. The advice I keep finding online is to write an index template for the JSON.
For example, like this:
{
  "template": "userkeyword*",
  "order": 1,
  "mappings": {
    "logs": {
      "dynamic_templates": [{
        "string_template": {
          "match": "*",
          "match_mapping_type": "string",
          "mapping": {
            "type": "string",
            "index": "not_analyzed"
          }
        }
      }],
      "properties": {
        "keyword": {
          "type": "string",
          "analyzer": "keyword_analyzer",
          "fields": {
            "autocomplete": {
              "type": "string",
              "search_analyzer": "autocomplete_search_analyzer",
              "analyzer": "autocomplete_index_analyzer"
            },
            "readingform": {
              "type": "string",
              "search_analyzer": "readingform_search_analyzer",
              "analyzer": "readingform_index_analyzer"
            }
          }
        }
      }
    }
  }
}
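Adapting that idea to my setup, I imagine the template would look roughly like this (just a sketch: the debug-* pattern and the app type come from my Logstash output, and mapping every auto-detected long to double should at least avoid the long/double merge conflict):

{
  "template": "debug-*",
  "order": 1,
  "mappings": {
    "app": {
      "dynamic_templates": [
        {
          "longs_as_doubles": {
            "match_mapping_type": "long",
            "mapping": {
              "type": "double"
            }
          }
        }
      ]
    }
  }
}

If this is the right direction, I assume it can be loaded with curl -XPUT 'localhost:9200/_template/debug' -d @template.json, or pointed to via the template option of the elasticsearch output.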

The application whose JSON log I want to push into Elasticsearch contains about 2000 components.
Are there any automated tools for converting the generated JSON log into a template?
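For now, the only workaround I see is to coerce the one field from the error by hand in Logstash (a sketch; the field path is taken from the error message), but doing that for every metric is obviously not an option, hence the question about generating the template automatically:

filter {
  if "finac" in [tags] {
    # force the conflicting metric to a single numeric type
    mutate {
      convert => { "[play][CorePlugin][monitors][max]" => "float" }
    }
  }
}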
