I have a Logstash configuration that gets data from a MySQL database and sends it to Elasticsearch.

This is my configuration:
```
input {
  jdbc {
    clean_run => true
    jdbc_driver_library => "/usr/share/java/mysql-connector-java.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://IP:PORT/DATABASE"
    jdbc_user => "myuser"
    jdbc_password => "mypassword"
    use_column_value => true
    tracking_column => "field1"
    schedule => "*/2 * * * *"
    statement => "SELECT * FROM test"
  }
}

# need something here before the 'output' section?

output {
  elasticsearch {
    hosts => ["http://ELASTICSEARCH_IP:PORT"]
    index => "myindexname"
    document_id => "%{field1}"
  }
}
```
Everything’s working fine, but I need to add some columns whose values depend on other column values, so I tried writing a Python script to do this. Is there a way to execute a Python script to add/edit columns before the data are sent to Elasticsearch? Do I need the `filter` option?
EDIT: For example, I use my Python script to:

- create a week-number column based on a datetime field.
- create a month-number column based on a datetime field.
- edit the ‘name’ column and replace some special characters (‘/’, ‘-’, ‘:’, etc.)
- create a linear trendline based on another column.
- create a moving average line based on another column.
- replace some column values (e.g. replace ‘y’ with ‘yes’ and ‘n’ with ‘no’).
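Some of the simpler items above (character replacement, value substitution) don’t need an external script at all; Logstash’s built-in `mutate` filter can do them. A minimal sketch, assuming the field names `name` and `answer` (the `answer` field is hypothetical, standing in for whichever column holds the ‘y’/‘n’ values):

```
filter {
  mutate {
    # gsub takes flat triples: field, regex, replacement
    gsub => [
      "name", "[/:\-]", " ",    # strip special characters from 'name'
      "answer", "^y$", "yes",   # replace 'y' with 'yes'
      "answer", "^n$", "no"     # replace 'n' with 'no'
    ]
  }
}
```

The `filter` block sits between `input` and `output` in the pipeline configuration.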
Answer
I finally did it with a `ruby` filter. Thanks mates for your help!
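For anyone landing here: the computed columns (week number, month number, etc.) can be derived inline with the `ruby` filter. A minimal sketch, assuming a datetime column named `created_at` (that field name is an assumption, not from the original config):

```
filter {
  ruby {
    code => "
      require 'time'
      # derive week/month numbers from a datetime column ('created_at' is an assumed name)
      ts = event.get('created_at')
      if ts
        t = Time.parse(ts.to_s)
        event.set('week_number', t.strftime('%V').to_i)  # ISO week number
        event.set('month_number', t.month)
      end
    "
  }
}
```

Anything more involved, such as trendlines or moving averages over multiple rows, needs state across events, which is easier to handle with a `ruby` filter using an init block or by computing the values in the SQL `statement` itself.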