ELK Logging Platform Setup and Configuration



Centralized logging is critical in organizations: it makes fault finding far easier, and it is often required for auditing and compliance purposes. Getting tools that do the job at a reasonable price can be hard, though. Commercial solutions such as Splunk cost large amounts of money and are often not feasible for smaller companies.

There are, however, alternatives. One of them is the ELK stack, which is free and easily deployed. ELK comprises three applications: Elasticsearch, which stores and indexes your log files; Logstash, which can be used both as a forwarder (not covered in this article) and as a server to receive and store log files; and Kibana, the front-end dashboard that we use to search through our logs.

Installation and configuration of Elasticsearch

For the purposes of this article, I will assume that you are working from a CentOS 7 x64 server, which is fully up-to-date. Note also that all commands below should be run as root in the terminal.

The first thing we need to do is install openjdk as a dependency.
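A minimal sketch of the install, assuming the `java-1.8.0-openjdk` package available in the CentOS 7 base repositories:

```shell
# Install OpenJDK 8, which the Elastic stack requires as its Java runtime
yum install -y java-1.8.0-openjdk

# Confirm the JVM is available
java -version
```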

Now that we have Java installed we can go ahead and start installing Elasticsearch. Let’s first import their gpg key.
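The Elastic packages are signed with Elastic's public GPG key, which can be imported with rpm:

```shell
# Import the Elastic GPG signing key so yum can verify the packages
rpm --import https://artifacts.elastic.co/GPG-KEY-elasticsearch
```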

Next we will add the elasticsearch repository.
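The exact repository definition depends on which stack version you want; the sketch below assumes the 6.x package line (adjust the version in the name and baseurl if you are installing a different release):

```shell
# Create the yum repository definition for the Elastic 6.x packages
cat <<'EOF' > /etc/yum.repos.d/elasticsearch.repo
[elasticsearch-6.x]
name=Elasticsearch repository for 6.x packages
baseurl=https://artifacts.elastic.co/packages/6.x/yum
gpgcheck=1
gpgkey=https://artifacts.elastic.co/GPG-KEY-elasticsearch
enabled=1
autorefresh=1
type=rpm-md
EOF
```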

Now we can go ahead and install Elasticsearch.
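With the repository in place, the install is a single yum command:

```shell
# Install Elasticsearch from the Elastic repository added above
yum install -y elasticsearch
```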

For security before we start, it is best to restrict access to localhost. This is fine for our use case, as we’ll have everything on one box. However, if you intend to have Kibana or Logstash on separate servers, you would be better doing this with firewall rules.
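One way to do this is to set `network.host` in the main configuration file. The sketch below appends the setting; if your `elasticsearch.yml` already contains a `network.host` line, edit that line instead of appending a duplicate:

```shell
# Bind Elasticsearch to localhost only, so it is not reachable remotely
echo 'network.host: localhost' >> /etc/elasticsearch/elasticsearch.yml
```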

We can now go ahead and start Elasticsearch, and make it start on boot.
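With systemd on CentOS 7, that is:

```shell
# Pick up the newly installed unit file, enable at boot, and start now
systemctl daemon-reload
systemctl enable elasticsearch
systemctl start elasticsearch
```

You can confirm it is up with `curl http://localhost:9200`, which should return a short JSON document describing the node.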

If you are using a box with less than 2GB of RAM, you may also need to adjust /etc/elasticsearch/jvm.options and lower the -Xms and -Xmx heap values (they should match) to, say, 1g.
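A sketch of that edit, assuming the stock defaults of `-Xms2g`/`-Xmx2g` (the minimum and maximum heap should be kept equal, so change both):

```shell
# Lower the JVM heap from the 2g default to 1g for a small box
sed -i 's/^-Xms2g/-Xms1g/; s/^-Xmx2g/-Xmx1g/' /etc/elasticsearch/jvm.options
systemctl restart elasticsearch
```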

We now have Elasticsearch configured and ready to go, so let’s move on to Kibana.

Installation and configuration of Kibana

We can now go ahead and add the Kibana repository. This uses the same GPG key we configured earlier for Elasticsearch, so we can just add the repository.
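In the 6.x package line, Kibana ships from the same baseurl as Elasticsearch, so this repo file mirrors the earlier one (the assumption here, as before, is a 6.x install):

```shell
# Create the yum repository definition for Kibana
cat <<'EOF' > /etc/yum.repos.d/kibana.repo
[kibana-6.x]
name=Kibana repository for 6.x packages
baseurl=https://artifacts.elastic.co/packages/6.x/yum
gpgcheck=1
gpgkey=https://artifacts.elastic.co/GPG-KEY-elasticsearch
enabled=1
autorefresh=1
type=rpm-md
EOF
```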

Now that we have added the repository, we can install Kibana.
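Again, a single yum command:

```shell
# Install Kibana from the Elastic repository
yum install -y kibana
```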

Prior to starting Kibana, we need to configure it to listen on all interfaces so we can access it remotely.
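Kibana's bind address is controlled by `server.host` in /etc/kibana/kibana.yml. The sketch below appends the setting; if the file already has a (commented or set) `server.host` line, edit that line instead:

```shell
# Make Kibana listen on all interfaces rather than just localhost
echo 'server.host: "0.0.0.0"' >> /etc/kibana/kibana.yml
```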

We can now go ahead and start Kibana and make it start at boot.
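As with Elasticsearch:

```shell
# Enable Kibana at boot and start it now
systemctl daemon-reload
systemctl enable kibana
systemctl start kibana
```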

We now have Kibana ready to go!

Installation and configuration of Logstash

Finally, we will install Logstash. Again, we already have the elasticsearch gpg key added, so we can go straight to creating the repository and installing it.
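Again assuming the 6.x package line, the repository definition looks like this:

```shell
# Create the yum repository definition for Logstash
cat <<'EOF' > /etc/yum.repos.d/logstash.repo
[logstash-6.x]
name=Elastic repository for 6.x packages
baseurl=https://artifacts.elastic.co/packages/6.x/yum
gpgcheck=1
gpgkey=https://artifacts.elastic.co/GPG-KEY-elasticsearch
enabled=1
autorefresh=1
type=rpm-md
EOF
```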

Now that the repository is added, go ahead and install Logstash.
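One more yum command:

```shell
# Install Logstash from the Elastic repository
yum install -y logstash
```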

We can now go ahead and set up our Logstash configuration file. This will consist of an input, which will be syslog, and an output, which will be Elasticsearch. In our example, we will create /etc/logstash/conf.d/syslog.conf with the following content:
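A minimal sketch of such a pipeline, written out via a heredoc. It listens with the syslog input on port 10514 (the non-standard port used in this article; ports below 1024 would require Logstash to run as root) and writes to the local Elasticsearch. The `hosts` value and the `index` naming pattern here are illustrative assumptions:

```shell
# Write a minimal syslog-in, Elasticsearch-out pipeline
cat <<'EOF' > /etc/logstash/conf.d/syslog.conf
input {
  syslog {
    port => 10514
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logstash-syslog-%{+YYYY.MM.dd}"
  }
}
EOF
```

You can sanity-check the file before starting the service with `/usr/share/logstash/bin/logstash --config.test_and_exit -f /etc/logstash/conf.d/syslog.conf`.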

We can now go ahead and start Logstash and enable it to start at boot. (Please note that it may take a few minutes for it to start up.)
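The same systemd pattern once more:

```shell
# Enable Logstash at boot and start it now (startup can take a few minutes)
systemctl daemon-reload
systemctl enable logstash
systemctl start logstash
```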

Final Thoughts

We now have ELK up and running and ready for you to push your logs into. When you browse to Kibana at http://&lt;your-server-ip&gt;:5601, Kibana should detect the Logstash indexes automatically (provided logs have already been processed by Logstash and stored in Elasticsearch) and prompt you to pick the time field it will index on, which is usually @timestamp. Once you have clicked “Create” and been redirected to the next page, go to the “Discover” tab on the left, and you will be able to view your logs coming in.

With everything now set up, you can point your servers to ship syslog to your Logstash host, and you will see the logs appear in Kibana. Remember that we are using the non-standard port 10514 when configuring this.

If you have any questions, please leave comments on this article and I will try and get back to you.


Casey Rogers is an experienced Senior Infrastructure Engineer with more than 8 years’ experience working with some of the UK’s biggest companies and multinationals. Largely self-taught, Casey has long had an interest in IT and has been brought up around technology.

