Fenderas, 2016-09-18 17:12:45

How to fix errors when importing data from MySQL to Elasticsearch?

Elasticsearch Distribution Details
==================================
Installed Packages
Name : elasticsearch
Arch : noarch
Version : 2.4.0
Release : 1
Size : 29 M
Repo : installed
From repo : elasticsearch-2.x
Logstash Distribution Details
=============================
Installed Packages
Name : logstash
Arch : noarch
Epoch : 1
Version : 2.4.0
Release : 1
Size : 137 M
Repo : installed
From repo : logstash-2.4
Good afternoon. There is a small problem with the Logstash + Elasticsearch bundle.
At the moment I use this bundle to generate analytics for the OTRS HelpDesk system.
Logstash uses the JDBC plugin to pull data directly from a MySQL database; the data is then processed a little and sent to Elasticsearch, after which we visualize it through Kibana.
The query and the rules for sending to Elasticsearch can be found in the configuration file otrs-statistics.conf.
Everything worked great until I decided to add a second entity to the analytics and load its data into Elasticsearch as well.
I added one more configuration file, otrs-statistics_actions.conf, which runs a slightly different SELECT against the database and sends the data to a different index.
Separately, each file works great, but when I hand both files to Logstash, magic starts that I can't figure out.
Here is the output of a query against the index before the second file (otrs-statistics.conf) was connected, i.e. while only otrs-statistics_actions.conf was running:

curl -get http://a5a5a5a5a5:9200/logstash-otrs_tickets_actions-2016.09/_search?size=1
{"took":1,"timed_out":false,"_shards":{"total":1,"successful":1,"failed":0},"hits":{"total":608,"max_score":1.0,"hits":[{"_index":"logstash-otrs_tickets_actions-2016.09","_type":"logs","_id":"123282","_score":1.0,"_source":{"ticket_id":4342,"ticket_number":"1111111","ticket_status":"closed successful","ticket_type":"Hourly_Rate","agent_name":"Ivan","ticket_action_id":123282,"ticket_action_spent_time":3.0,"ticket_action_create_time":"2016-09-01T03:18:02.000000Z","ticket_customer_id":"00-000666","customer_company_name":"Bari","customer_user_name":"Bari","customer_user_email":"[email protected]","@version":"1","@timestamp":"2016-09-01T03:18:02.000Z"}}

Here is the output of the same query against the index with both files connected:
curl -get http://a5a5a5a5a5:9200/logstash-otrs_tickets_actions-2016.09/_search?size=1
{"took":1,"timed_out":false,"_shards":{"total":1,"successful":1,"failed":0},"hits":{"total":608,"max_score":1.0,"hits":[{"_index":"logstash-otrs_tickets_actions-2016.09","_type":"logs","_id":"123282","_score":1.0,"_source":{"ticket_id":4342,"ticket_number":"1111111","ticket_status":"closed successful","ticket_type":"Hourly_Rate","agent_name":"Ivan","ticket_action_id":123282,"ticket_action_spent_time":3.0,"ticket_action_create_time":"2016-09-01T03:18:02.000000Z","ticket_customer_id":"00-000666","customer_company_name":"Bari","customer_user_name":"Bari","customer_user_email":"[email protected]","@version":"1","@timestamp":"2016-09-01T03:18:02.000Z","ticket_time_registration":"not registred","otrs_link":"https://a5a5a5a5a5/otrs/index.pl?Action=AgentTicketZoom;TicketID=4342"}}]}}

For some reason, fields from the other query (otrs_link and ticket_time_registration) are being added to these documents. Can you tell me
which direction to dig to fix this miracle? Thank you.
The contents of the logstash config - otrs-statistics_actions.conf
input {
  jdbc {
    jdbc_driver_library => "/tmp/mysql-connector-java-5.1.36/mysql-connector-java-5.1.36-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"

    jdbc_connection_string => "jdbc:mysql://blabla:3306/otrs"

    jdbc_user => "blabla"
    jdbc_password => "blabla"

    schedule => "*/3 * * * *"

    last_run_metadata_path => "/tmp/logstash/.logstash_jdbc_last_run"

    statement => "
          запрос
                "
  }
}

filter {
        date {
                match => [ "ticket_action_create_time", "yyyy-MM-dd HH:mm:ss Z","ISO8601" ]
                target => "@timestamp"
        }

        if [ticket_action_spent_time] {
                ruby {
                        code => "event['ticket_action_spent_time'].gsub!(/^%%/,'')"
                }
                ruby {
                        code => "event['ticket_action_spent_time'].gsub!(/%%.*/,'')"
                }
        }

        mutate {
            convert => [ "ticket_action_spent_time", "float" ]
        }
}

output {
    elasticsearch {
                hosts => "blabla:9200"
                document_id => "%{ticket_action_id}"
                index => "logstash-otrs_tickets_actions-%{+YYYY.MM}"
        }

        #stdout {
        #       codec => rubydebug
        #}
}

The contents of the logstash config - otrs-statistics.conf
input {
  jdbc {
    jdbc_driver_library => "/tmp/mysql-connector-java-5.1.36/mysql-connector-java-5.1.36-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"

    jdbc_connection_string => "jdbc:mysql://blabla:3306/otrs"

    jdbc_user => "blabla"
    jdbc_password => "blabla"

    schedule => "*/3 * * * *"

    last_run_metadata_path => "/tmp/logstash/.logstash_jdbc_last_run"

    statement => "
    запрос
    "
  }
}

filter {
        date {
                match => ["ticket_create_time","yyyy-MM-dd HH:mm:ss Z","ISO8601"]
                target => "@timestamp"
        }

        # Price each employee's working hour
        if [agent_name] in ["blabla", "blabla", "blabla"] {
                mutate {
                        add_field => { "cost_hour" => 10}
                }
        }
        else if [agent_name] in ["blabla", "blabla"] {
                mutate {
                        add_field => { "cost_hour" => 20}
                }
        }

        else if [agent_name] == "blabla" {
                mutate {
                        add_field => { "cost_hour" => 30 }
                }
        }

    if [cost_hour] {
                mutate {
                        convert => [ "cost_hour", "float" ]
                }
    }

        # Check whether time was recorded for the ticket
        mutate {
                add_field => {"ticket_time_registration" => ""}
        }
        if [time_accounting_time_unit] {
                mutate {
                        convert => [ "time_accounting_time_unit", "float" ]
                        update => { "ticket_time_registration" => "registred" }
                }
        }
        else {
                mutate {
                        update => { "ticket_time_registration" => "not registred" }
                }
        }

        # Convert the field from minutes to hours
    if [ticket_transportation_time] {
                mutate {
                        convert => [ "ticket_transportation_time", "float" ]
                }
                ruby {
                        code => "event['ticket_transportation_time']/=60.0"
                }
    }

        # If the ticket is open, drop the close time
    if ([ticket_status] in ["open", "new"]) {
                mutate {
                        remove_field => [ "ticket_close_time" ]
                }
    }

        # Cosmetic tweaks
    if [agent_name] {
                mutate {
                        gsub => [agent_name, "IT Support", "Default Agent"]
                }
    }

        if [ticket_service] {
                mutate {
                        gsub => [ticket_service,"blabla::","IT "]
                        gsub => [ticket_service,"blabla::","ERP "]
                }
    }

        # Add a field for jumping straight to the ticket from Kibana
        mutate {
                add_field => {"otrs_link" => "https://blabla/otrs/index.pl?Action=AgentTicketZoom;TicketID=%{ticket_id}"}
        }
}

output {
    elasticsearch {
                hosts => "blabla:9200"
                document_id => "%{ticket_number}"
                index => "logstash-otrs_tickets-%{+YYYY.MM}"
        }

        #stdout {
    #   codec => rubydebug
    #}
}


1 answer

Max, 2016-11-18
@MaxDukov

Logstash has a single stream of inputs-filters-outputs.
I.e. if you load two configs at the same time, it will read from both SELECTs and write every event to both indexes.
Use
type => "sometype" in the input, and then check

if [type] == "sometype" {
  ...
}

in the filter and output blocks.
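
Applied to the configs in the question, the idea looks roughly like this (a sketch only: the type name `otrs_actions` is made up, and the elided jdbc settings stay exactly as in the original otrs-statistics_actions.conf):

```
input {
  jdbc {
    # ... jdbc_* settings, schedule and statement as in the original file ...
    type => "otrs_actions"    # tag every event produced by this input
  }
}

filter {
  # only events tagged by this input pass through these filters
  if [type] == "otrs_actions" {
    date {
      match => [ "ticket_action_create_time", "yyyy-MM-dd HH:mm:ss Z", "ISO8601" ]
      target => "@timestamp"
    }
  }
}

output {
  # ... and only these events are written to this index
  if [type] == "otrs_actions" {
    elasticsearch {
      hosts => "blabla:9200"
      document_id => "%{ticket_action_id}"
      index => "logstash-otrs_tickets_actions-%{+YYYY.MM}"
    }
  }
}
```

otrs-statistics.conf then gets the same treatment with its own type value. One caveat: in the 2.x elasticsearch output the event type is, by default, also used as the document `_type`, so documents will stop landing under `_type: logs` unless `document_type => "logs"` is set explicitly on the output.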
