Installing ELK Stack (latest) on OSX (Sierra)



Step #01: Install all prerequisites for ELK

Homebrew

If you do not already have Homebrew, open the terminal and run the following command:
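At the time of writing, the Homebrew installer was the following one-liner (see brew.sh for the current command):

```
/usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
```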

Then update brew with the command
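The update command is simply:

```
brew update
```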

The ELK Stack requires Java 8 to be installed.

To verify what version of Java you have, run
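The version check is:

```
java -version
```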

If Java 8 is not already installed, download it from Oracle's Java SE 8 download page and install it.
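Alternatively, at the time this guide was written, Java 8 could also be installed through Homebrew Cask (note that these tap and cask names have since changed):

```
brew tap caskroom/versions
brew cask install java8
```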

After the installation, run java -version again to verify that Java was successfully installed.
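The output should look similar to the following (exact build numbers will differ):

```
java version "1.8.0_171"
Java(TM) SE Runtime Environment (build 1.8.0_171-b11)
Java HotSpot(TM) 64-Bit Server VM (build 25.171-b11, mixed mode)
```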


Step #02: Elasticsearch Installation

Now that our environment is ready, we can install Elasticsearch, the first component of the ELK stack.

We choose to install Elasticsearch with brew:
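Using the Homebrew formula that was current at the time:

```
brew install elasticsearch
```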


Open Safari (or any other browser) and verify that Elasticsearch has been correctly installed by connecting to http://localhost:9200

If everything went as expected, you will see something like:
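An illustrative, abbreviated response follows; the node name, cluster name, and UUID will differ on your machine:

```
{
  "name" : "Qrv2FPB",
  "cluster_name" : "elasticsearch_brew",
  "version" : {
    "number" : "6.2.4",
    "lucene_version" : "7.2.1"
  },
  "tagline" : "You Know, for Search"
}
```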

Step #03: Logstash Installation

The next ELK stack component to be installed is Logstash:
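Again via Homebrew:

```
brew install logstash
```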


Now we can start Logstash via Homebrew:
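Assuming the Homebrew services integration is installed, Logstash can be started as a background service with:

```
brew services start logstash
```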


Since we haven’t configured a Logstash pipeline yet, starting Logstash will not result in anything meaningful. We will return to Logstash configuration later in this guide.

Step #04: Kibana Installation

The next component to be installed is Kibana, the UI of the ELK stack.
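Once more via Homebrew:

```
brew install kibana
```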


Now we can start Kibana via Homebrew:
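As with Logstash:

```
brew services start kibana
```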


Step #05: Verify the installation of all components

If nothing went wrong, all ELK stack components should now be up and running.
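One quick check, assuming all three components were started with brew services, is to list the Homebrew-managed services and confirm that each shows a started status:

```
brew services list
```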


Unless your setup differs from this guide, in which case Kibana's configuration files may need editing before it can connect to the Elasticsearch instance, we should now be able to view Kibana live.

Open Safari (or your preferred browser) and connect to http://localhost:5601. You will see Kibana’s welcome page.

At this point, we do not have any data stored in our Elasticsearch instance.

Step #06: Logstash Configuration

Before we can send our first OSX logs to Elasticsearch, we must configure Logstash properly.

To test our Logstash installation, we can run a basic Logstash pipeline.
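A minimal smoke-test pipeline, as shown in the Logstash documentation, reads from stdin and echoes events to stdout:

```
logstash -e 'input { stdin { } } output { stdout {} }'
```

Once the pipeline has started, anything typed into the terminal is processed as an event.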


When we see the “pipeline started successfully” message, we can type a test message (for example, hello world) at the command prompt.

Logstash adds timestamp and IP address information to the message, and we should see output like the following on the console:
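For example, after typing hello world, the console output should look roughly like this (the timestamp and address will differ):

```
2018-05-22T09:15:32.123Z 192.168.0.10 hello world
```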

Logstash configuration files are written in a JSON-like format and are usually located in /etc/logstash/conf.d. But since we installed Logstash with Homebrew, all its conf files are now located in /usr/local/Cellar/logstash/6.2.4/libexec/config.

There are three types of elements in a typical Logstash pipeline: inputs, filters, and outputs. Input plugins consume data from a source, filter plugins modify the data according to our needs, and output plugins write the data to Elasticsearch.

Let’s start with an input configuration file called 00-syslog-input.conf:

Go to /usr/local/Cellar/logstash/6.2.4/libexec/bin

Create the 00-syslog-input.conf file

Add input, filter and output configuration to the file.
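A minimal sketch of such a file (the log path and grok pattern are illustrative; the index name syslog matches the index we will select in Kibana later):

```
input {
  file {
    path => "/var/log/system.log"
    type => "syslog"
    start_position => "beginning"
  }
}

filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGLINE}" }
    }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "syslog"
  }
}
```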

Exit editor and run Logstash from the current directory (/usr/local/Cellar/logstash/6.2.4/libexec/bin).
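Assuming the file was created in the current directory, the command is:

```
logstash -f 00-syslog-input.conf
```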

Logstash will now start feeding Elasticsearch with logs.

Step #07: Configure Kibana index

Open Safari again (or your preferred browser) and connect to http://localhost:5601. We can now see that a single index (syslog) is already there.

Now we enter syslog as the index pattern.

Next, we choose a time filter field name and create the index pattern.

Congratulations! We can now open the Discover Tab and see our logs.

 
