Logstash: Data Ingestion
I would like to ingest data from CSV and JSON files with Logstash. The destination could be Elasticsearch, but it could also be a simple transformation from one JSON file into another, slightly simpler JSON file.
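As a first sketch of that goal, a pipeline like the following could read a CSV and emit JSON lines. This is my own sketch, not from the tutorial: the paths and column names are hypothetical placeholders, and the csv filter assumes the columns are known up front.

```
input {
  file {
    path => "/tmp/input.csv"        # hypothetical input path
    start_position => "beginning"
    sincedb_path => "/dev/null"     # re-read the file on every run (handy while testing)
  }
}
filter {
  csv {
    columns => ["id", "name", "value"]   # hypothetical column names
  }
}
output {
  file {
    path => "/tmp/output.json"      # hypothetical output path
    codec => "json_lines"           # one JSON document per line
  }
}
```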
Let's see some Logstash samples. I am following the official Logstash tutorial.
Logstash Commands
Some useful logstash commands
# List installed plugins:
$ /usr/local/Cellar/logstash/6.1.1/bin/logstash-plugin list
Sample #01: Simple Logstash sample
The most basic Logstash pipeline uses the -e option (configuration given on the command line). When you run it, wait until Logstash is ready… (I had to wait almost a minute…)
$ /usr/local/Cellar/logstash/6.1.1/bin/logstash -e 'input { stdin { } } output { stdout {} }'
Ok, “Hello Logstash worked!”
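Since my goal is JSON, a small variation of the same one-liner can already parse JSON from stdin. This is my own variation, not part of the tutorial:

```
$ /usr/local/Cellar/logstash/6.1.1/bin/logstash -e 'input { stdin { codec => json } } output { stdout { codec => rubydebug } }'
```

Typing a JSON line such as {"user": "john"} should be printed back as a structured event, field by field, thanks to the rubydebug codec.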
Sample #02: Parsing logs with Logstash
Following the official tutorial, this step asks me to install Filebeat, so I will install it from the official guide.
Let's start with Filebeat. On macOS I have two options: the first is the typical curl download, but I prefer the brew installation alternative:
$ curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-6.1.1-darwin-x86_64.tar.gz
$ tar xzvf filebeat-6.1.1-darwin-x86_64.tar.gz

# Another installation option: (http://brewformulas.org/Filebeat)
$ brew install filebeat
So, let's use brew. The brew filebeat service is ready; let's continue:
$ /usr/local/Cellar/filebeat/6.1.1/bin/filebeat -e -c filebeat.yml -d "publish"
I have to replace the default "filebeat.yml" on my machine.
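A minimal sketch of that replacement filebeat.yml (Filebeat 6.x syntax) could look like this; the log path is a hypothetical placeholder for whatever file you want to ship:

```
filebeat.prospectors:
- type: log
  paths:
    - /path/to/logstash-tutorial.log   # hypothetical path to the sample log
output.logstash:
  hosts: ["localhost:5044"]
```

Note that the default config ships with output.elasticsearch enabled; only one output may be active at a time, so it has to be commented out before pointing Filebeat at Logstash.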
We can see that it tries to connect, and after that: connection refused.
OK… I will stop this example at this point; I don't fully understand why Filebeat can't connect to Logstash on port 5044 (possible solution in this link).
SOLUTION:
Based on the official doc:
- (this link to configure Logstash) and
- (this link to configure Beats to send to Logstash instead of Elasticsearch)
a) Start Logstash with the Beats config:
input {
  beats { port => 5044 }
}
# filter is optional.
# filter { }
output {
  elasticsearch {
    hosts => "localhost:9200"
    manage_template => false
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}
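Assuming the config above is saved as first-pipeline.conf (the filename used in the official tutorial; adjust to yours), Logstash can be started with it like this. The --config.reload.automatic flag makes Logstash pick up config edits without a restart:

```
$ /usr/local/Cellar/logstash/6.1.1/bin/logstash -f first-pipeline.conf --config.reload.automatic
```

Once Logstash reports that the Beats input is listening on port 5044, Filebeat should be able to connect.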
Next step: show the Beats logs on Kibana
We can feed not only Logstash but also Elasticsearch through the pipeline… Let's show it on Kibana:
But it should have a better visualization (like in the official tutorial).
OK… let's stop this log at this point; I was able to connect Beats with the ELK stack…
I will add some new examples later…
See you soon.