Logstash to MongoDB
2 min read · Jan 6, 2018
I would like to send data from a CSV file to a collection in MongoDB (mlab cloud).
Let's start.
First, what are the default plugins? Is there a MongoDB one available? Nope:
╰─$ /usr/local/logstash/6.1.1/bin/logstash-plugin list | grep db
logstash-filter-jdbc_streaming
logstash-input-jdbc
╰─$ /usr/local/logstash/6.1.1/bin/logstash-plugin list
logstash-codec-cef
logstash-codec-collectd
logstash-codec-dots
logstash-codec-edn
logstash-codec-edn_lines
logstash-codec-es_bulk
logstash-codec-fluent
logstash-codec-graphite
logstash-codec-json
logstash-codec-json_lines
logstash-codec-line
logstash-codec-msgpack
logstash-codec-multiline
logstash-codec-netflow
logstash-codec-plain
logstash-codec-rubydebug
logstash-filter-aggregate
logstash-filter-anonymize
logstash-filter-cidr
logstash-filter-clone
logstash-filter-csv
logstash-filter-date
logstash-filter-de_dot
logstash-filter-dissect
logstash-filter-dns
logstash-filter-drop
logstash-filter-elasticsearch
logstash-filter-fingerprint
logstash-filter-geoip
logstash-filter-grok
logstash-filter-jdbc_streaming
logstash-filter-json
logstash-filter-kv
logstash-filter-metrics
logstash-filter-mutate
logstash-filter-ruby
logstash-filter-sleep
logstash-filter-split
logstash-filter-syslog_pri
logstash-filter-throttle
logstash-filter-translate
logstash-filter-truncate
logstash-filter-urldecode
logstash-filter-useragent
logstash-filter-xml
logstash-input-beats
logstash-input-dead_letter_queue
logstash-input-elasticsearch
logstash-input-exec
logstash-input-file
logstash-input-ganglia
logstash-input-gelf
logstash-input-generator
logstash-input-graphite
logstash-input-heartbeat
logstash-input-http
logstash-input-http_poller
logstash-input-imap
logstash-input-jdbc
logstash-input-kafka
logstash-input-pipe
logstash-input-rabbitmq
logstash-input-redis
logstash-input-s3
logstash-input-snmptrap
logstash-input-sqs
logstash-input-stdin
logstash-input-syslog
logstash-input-tcp
logstash-input-twitter
logstash-input-udp
logstash-input-unix
logstash-output-cloudwatch
logstash-output-csv
logstash-output-elasticsearch
logstash-output-email
logstash-output-file
logstash-output-graphite
logstash-output-http
logstash-output-kafka
logstash-output-lumberjack
logstash-output-nagios
logstash-output-null
logstash-output-pagerduty
logstash-output-pipe
logstash-output-rabbitmq
logstash-output-redis
logstash-output-s3
logstash-output-sns
logstash-output-sqs
logstash-output-stdout
logstash-output-tcp
logstash-output-udp
logstash-output-webhdfs
logstash-patterns-core
Step #01: Install the MongoDB Logstash plugin
The official plugin is documented at:
- https://github.com/logstash-plugins/logstash-output-mongodb
- https://www.elastic.co/guide/en/logstash/current/plugins-outputs-mongodb.html
Plugins that are not bundled by default are easy to install: just run bin/logstash-plugin install logstash-output-mongodb
╰─$ /usr/local/Cellar/logstash/6.1.1/bin/logstash-plugin install logstash-output-mongodb
Validating logstash-output-mongodb
Installing logstash-output-mongodb
Installation successful
╰─$ /usr/local/Cellar/logstash/6.1.1/bin/logstash-plugin list | grep db
logstash-filter-jdbc_streaming
logstash-input-jdbc
logstash-output-mongodb
OK! Now we have the Logstash MongoDB output plugin available!
Step #02: Configure the MongoDB Logstash plugin
mongodb {
  id => "my_mongodb_plugin_id"
  collection => "bitcoin"
  database => "uela"
  uri => "mongodb://<USER>:<PASS>@ddddddd.mlab.com:63156/uela"
  codec => "json"
}
With that configuration, I was able to populate the MongoDB collection on mlab.
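For reference, a complete pipeline for the CSV use case could look like the sketch below. Note that the file path, the column names, and the mutate conversions are assumptions for illustration; only the mongodb output block comes from my actual setup.

```conf
input {
  file {
    # Hypothetical path to the bitcoin historical data CSV (adjust to yours)
    path => "/tmp/bitcoin_history.csv"
    start_position => "beginning"
    # Reprocess the file on every run (handy while testing)
    sincedb_path => "/dev/null"
  }
}

filter {
  csv {
    # Assumed column names for illustration
    columns => ["date", "open", "high", "low", "close"]
  }
  mutate {
    # Store prices as numbers instead of strings
    convert => {
      "open"  => "float"
      "high"  => "float"
      "low"   => "float"
      "close" => "float"
    }
  }
}

output {
  mongodb {
    id => "my_mongodb_plugin_id"
    collection => "bitcoin"
    database => "uela"
    uri => "mongodb://<USER>:<PASS>@ddddddd.mlab.com:63156/uela"
    codec => "json"
  }
}
```

After running Logstash with this config, you can verify the inserts from the mlab shell with db.bitcoin.find().limit(5)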
Let's look at the bitcoin historical data now stored in the collection.
Let's stop the post at this point.
I believe I have reached the goal I set when I started writing this article.