ELK Stack, from Zero to Demo (OSX)

Pablo Ezequiel Inchausti
7 min read · Jan 4, 2018


Do you want to get started with the ELK Stack? Does it seem complex? Don't be afraid, keep reading! In just a few minutes we will see how easy it is to install the ELK Stack on OSX and get a first demo running. Stay focused and let's get started!

Step #00: Services on OSX with Homebrew

On OSX you can use launchctl to start and stop services based on a .plist file, but you can also do it with Homebrew, which may turn out easier for you (there is a nice post on ivanturkovic.com about it).

$ brew tap gapple/services
$ brew services list
$ brew services start <service>
$ brew info <service>
# Job control, if you run a service in the foreground instead:
[CTRL + Z]   # suspend the foreground process
$ jobs       # list suspended/background jobs
$ fg         # resume the most recent job in the foreground
$ fg %1      # resume job number 1 in the foreground

From here on, we will manage the stack with Homebrew.

Step #01: Install Logstash on OSX

To install Logstash on OSX there are several alternatives. I will use Homebrew's brew install logstash, probably one of the most popular ways to do it (about Homebrew itself, you can see the official explanation).

$ brew install logstash

We can see that it installs Logstash under the internal folder "/usr/local/Cellar".

After the installation we can start it as a service with homebrew:

$ brew services list
$ brew services start logstash

With Homebrew, the main directory to consider for the Logstash config is:

/usr/local/Cellar/logstash/6.1.1/libexec/config

Alternatively, we can pass the config as a command-line parameter. We may not even need Logstash running as a service yet, just available to run after the installation... let's continue with the ELK stack.
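As a quick sanity check of passing the config by command line, here is a minimal sketch (the pipeline file and its path are hypothetical examples, not from the original post): a trivial pipeline that reads stdin and prints each event to stdout.

```shell
# Write a minimal pipeline config (hypothetical example: stdin in, stdout out)
cat > /tmp/minimal.conf <<'EOF'
input { stdin { } }
output { stdout { codec => rubydebug } }
EOF

# Run it manually (requires logstash on the PATH; CTRL+C to stop):
#   logstash -f /tmp/minimal.conf
# Or pass the whole pipeline inline with -e:
#   logstash -e 'input { stdin { } } output { stdout { } }'

# Quick check that the config file was written
grep -q "stdin" /tmp/minimal.conf && echo "pipeline config written"
```

Typing a line into the running pipeline should echo it back as a structured event, which is enough to confirm the install works before wiring up real inputs.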

Step #02: Install Elasticsearch on OSX

As the next step we will install Elasticsearch, and there are two possibilities: the first one is downloading it from the official Elastic website, and the second one, again, Homebrew.

I will install Elasticsearch with Homebrew; the formula is available:

$ brew install elasticsearch

After the installation, the config files live in Homebrew's usual locations (brew info elasticsearch reports them).

The idea is to start Elasticsearch as a service with Homebrew:

$ brew services start elasticsearch
$ brew info elasticsearch
$ tail -f  /usr/local/var/log/elasticsearch/elasticsearch_pabloinchausti.log

Elasticsearch is alive, and we can get its info at http://localhost:9200/
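The same check from the terminal (the fallback message is just my own addition for when the service isn't up yet):

```shell
# Query the Elasticsearch root endpoint; print a note if it's not reachable
curl -s http://localhost:9200/ || echo "elasticsearch is not running on :9200"

# The cluster health endpoint is also handy once it's up:
curl -s 'http://localhost:9200/_cat/health?v' || true
```

When the service is running, the root endpoint returns a small JSON document with the node name, cluster name, and version.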

Step #03: Install Kibana on OSX

Again, we will install Kibana with Homebrew; the formula is available on http://brewformulas.org/Kibana

$ brew install kibana

The Kibana service has its config file at /usr/local/etc/kibana/kibana.yml, but at first every line is commented out (we can uncomment some lines to change Kibana's default config options).

Some default Kibana options:

  • Kibana listens on localhost:5601 (http://localhost:5601 and http://localhost:5601/status)
  • it connects to Elasticsearch on port 9200
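For reference, these are the two settings in /usr/local/etc/kibana/kibana.yml you would most commonly uncomment and tweak (a sketch for the Kibana 6.x of this post, where the setting was still named elasticsearch.url; later versions renamed it):

```yaml
# /usr/local/etc/kibana/kibana.yml — every line ships commented out
#server.port: 5601
#elasticsearch.url: "http://localhost:9200"
```

Leaving them commented keeps the defaults shown in the list above.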

Step #04: Loading Some Data on the ELK Stack

Ok, at this point I have installed Kibana, Elasticsearch and Logstash, all of them with default config values. I would like to feed some data into this ELK Stack to play with the visualization dashboards...

I will follow the loading-sample-data step of the official tutorial, so I download and unzip the datasets:

Understanding “Mappings” according to official doc:

Mapping is the process of defining how a document, and the fields it contains, are stored and indexed. For instance, we use mappings to define which fields should be treated as full text, which fields contain numbers or dates, and the format of date values.

With the sample "shakespeare" dataset, we create the index and its mapping in Kibana DevTools (or directly against Elasticsearch), and we do the same for the logs indexes.
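The index creation can also be done with curl instead of DevTools. This is a sketch of the shakespeare mapping as I recall it from the 6.x getting-started tutorial (field names and types should be double-checked against the official doc; the temp file path is my own choice):

```shell
# Write the shakespeare mapping to a temp file (fields as in the 6.x tutorial)
cat > /tmp/shakespeare_mapping.json <<'EOF'
{
  "mappings": {
    "doc": {
      "properties": {
        "speaker": { "type": "keyword" },
        "play_name": { "type": "keyword" },
        "line_id": { "type": "integer" },
        "speech_number": { "type": "integer" }
      }
    }
  }
}
EOF

# Create the index (requires Elasticsearch running on :9200):
#   curl -H 'Content-Type: application/json' -XPUT 'localhost:9200/shakespeare?pretty' \
#        --data-binary @/tmp/shakespeare_mapping.json

# Local sanity check that the JSON body is well-formed:
python3 -m json.tool < /tmp/shakespeare_mapping.json > /dev/null && echo "mapping JSON is valid"
```

The logs indexes get an analogous PUT, with a geo_point mapping for the coordinates field, per the same tutorial.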

The accounts data set doesn’t require any mappings, so at this point we’re ready to use the Elasticsearch bulk API to load the data sets with the following commands:

$ curl -H 'Content-Type: application/x-ndjson' -XPOST 'localhost:9200/bank/account/_bulk?pretty' --data-binary @accounts.json
$ curl -H 'Content-Type: application/x-ndjson' -XPOST 'localhost:9200/shakespeare/doc/_bulk?pretty' --data-binary @shakespeare_6.0.json
$ curl -H 'Content-Type: application/x-ndjson' -XPOST 'localhost:9200/_bulk?pretty' --data-binary @logs.jsonl


What else? We should now see the new indexes listed:

GET /_cat/indices?v
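The same listing works from the terminal with curl (the fallback message is my own addition for when Elasticsearch isn't reachable):

```shell
# List all indices with their health, doc counts and sizes
curl -s 'http://localhost:9200/_cat/indices?v' || echo "elasticsearch is not reachable on :9200"
```

With the three bulk loads done, the bank, shakespeare and logstash-2015.05.* indexes should appear in the output with non-zero docs.count values.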

Step #05: Discovering the data

First we installed the ELK Stack, then we fed Elasticsearch with three datasets (Shakespeare, logs and accounts), and now we are going to discover the data:


#05.01 Let's create the shakes* index pattern:

#05.02 Let's create the bank* index pattern:

#05.03 Let's create the logstash-2015.05* index pattern:

In this case the index contains time-based events, so we select the @timestamp field as the time filter field. This index has almost 100 fields.

With the timestamp configured, the index pattern is ready.

Step #06: Visualizing the data

In the Visualize tab we build a chart, pick an aggregation over a field, and hit Apply Changes.

Final Words

Ok, I will finish the post at this point.

We saw how to install the ELK Stack on OSX, mainly with Homebrew, and then how to ingest some data with Logstash, index it in Elasticsearch and visualize it with Kibana.

I hope you enjoyed it and learned something new about the ELK Stack.

Regards! That was easy! See you in the next one!

Pablo

Useful links

When you are dealing with Kibana and Elasticsearch, some useful links are:

Resources
