Install & Configure Elastic Curator for Index Management

After you have set up your ELK Stack and have been using it for a while (see Step-By-Step Install ELK Stack), a question should start creeping into your head: how do I stop my Elasticsearch indexes from growing endlessly?  You could occasionally go into Kibana and delete the indexes via the GUI found there, but we’re sysadmins!  We automate things!  Luckily, Elastic provides a utility for managing indexes named Curator, which is easily run as a cron job.  Win!  Be sure to visit the Elastic Curator page to get an idea of what you can do with it and roughly how it is configured.  In this example we are going to configure Curator to delete all indexes beginning with winlogbeat- and filebeat- that are older than 90 days, so let’s get to setting that up.  I will be showing you the most recent version as of this writing, Curator 5.5.4.

Installing & Configuring Curator

1.  Start by downloading the DEB package which is hidden on the APT install page.  I usually place these in /tmp for easy cleanup.
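Grab it with wget; note that the exact URL below is an assumption based on Elastic’s usual package layout for Curator 5.x on Debian-based systems, so copy the real link from the APT install page:

```shell
cd /tmp
wget https://packages.elastic.co/curator/5/debian9/pool/main/e/elasticsearch-curator/elasticsearch-curator_5.5.4_amd64.deb
```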

2.  Once you have that downloaded, let’s install it.
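The install is a one-liner with dpkg (adjust the filename to match the package you actually downloaded):

```shell
sudo dpkg -i /tmp/elasticsearch-curator_5.5.4_amd64.deb
```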

3.  Curator will now be installed at /opt/elasticsearch-curator, but we don’t want to mess with anything in there.  I created a hidden directory named .curator in my home directory, which the documentation suggests as the default location, and then created a configuration file for Curator in that directory.
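For example:

```shell
mkdir ~/.curator
touch ~/.curator/curator.yml
```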

I placed the following configuration in the curator.yml file.
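A minimal client configuration along these lines follows the format shown in the Curator documentation (replace username with your actual user):

```yaml
client:
  hosts:
    - 127.0.0.1
  port: 9200
  use_ssl: False
  ssl_no_validate: False
  timeout: 30

logging:
  loglevel: INFO
  logfile: /home/username/.curator/log.log
  logformat: default
  blacklist: ['elasticsearch', 'urllib3']
```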

This is pretty straightforward.  It tells Curator to connect to Elasticsearch on localhost ( on port 9200 without SSL, and to send basic log info to a file at /home/username/.curator/log.log.

4.  Now that Curator knows what to connect to, we need to tell it what to do.  We do this with an action file, which I creatively named action.yml.

I placed the following configuration in the action.yml based off of the example found here in the documentation.
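A delete_indices action along these lines matches what we want.  The pattern and age filters below are my adaptation of the documented example; adjust the timestring if your indexes use a different date format than YYYY.MM.DD:

```yaml
actions:
  1:
    action: delete_indices
    description: >-
      Delete winlogbeat- and filebeat- indices older than 90 days,
      based on the date in the index name.
    options:
      ignore_empty_list: True
      disable_action: False
    filters:
    - filtertype: pattern
      kind: regex
      value: '^(winlogbeat-|filebeat-).*$'
    - filtertype: age
      source: name
      direction: older
      timestring: '%Y.%m.%d'
      unit: days
      unit_count: 90
```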

5.  Now we can test it with the --dry-run argument.  This logs what Curator WOULD do if it were run without the --dry-run argument; it does not actually perform the actions.
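Assuming the config and action files live in ~/.curator as above:

```shell
curator --config ~/.curator/curator.yml --dry-run ~/.curator/action.yml
```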

Check the log to confirm that all indexes older than 90 days would have been deleted.

6.  Assuming that all checks out, we just have to make a cron job to run this thing for us.  I like to run it daily, but it’s dealer’s choice.  Start by making your script in the appropriate directory.
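The location is up to you; here I’ll assume /usr/local/bin and a script name of curator-cleanup.sh (both are just examples):

```shell
sudo touch /usr/local/bin/curator-cleanup.sh
sudo chmod +x /usr/local/bin/curator-cleanup.sh
sudo nano /usr/local/bin/curator-cleanup.sh
```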

And added the following:
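Something along these lines, pointing at the config and action files created earlier (the paths and the assumption that the DEB placed a curator binary on your PATH at /usr/bin/curator are mine; replace username with your actual user):

```shell
#!/bin/bash
# Run Curator with the config and action files set up earlier.
/usr/bin/curator --config /home/username/.curator/curator.yml /home/username/.curator/action.yml
```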

7.  Now we just need to add the actual cronjob.
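Editing root’s crontab gives us the run-as-root behavior we want:

```shell
sudo crontab -e
```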

Append the following to the bottom of the file:
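Assuming the script was saved as /usr/local/bin/curator-cleanup.sh (an example path), a daily 6am entry looks like this:

```
0 6 * * * /usr/local/bin/curator-cleanup.sh
```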

This configures the cronjob to run the script that we made in step 6 every day at 6am as root.  Curator is now set up to manage your indexes!

Step-By-Step Install ELK Stack on Ubuntu 18.04

Elasticsearch, Logstash, and Kibana (aka ELK Stack) are very powerful tools for storing, analyzing, and visualizing log data in a centralized location.  That being said, it can be quite the headache to actually get up and running if it is your first experience with it.  Having spent the time poring over the documentation provided by Elastic (which I must say is quite impressive) and struggling through getting the ELK stack up and running, I figured I would make a step-by-step guide.  Some things to note before we get started:

  • This will be for the current most recent version, 6.3.2, released on July 24, 2018.
  • I will set up an nginx reverse proxy to access Kibana.
  • I will not be including SSL setup.
  • I will be installing all components of the stack with DEB packages.
  • Java 8 is required prior to ELK setup.  See my post Install Java JDK/JRE on Ubuntu Server without APT.
    • Java 10 is not compatible with Logstash 6.3.2.  I learned this the hard way; take my word for it.
  • I will be showing BEATS configurations in a separate post.  This will be only the ELK stack setup.

I am not using APT repositories for anything because I have been burned by the upgrade process in the past with ELK, so I just manually upgrade as necessary.  Now, let’s get this thing started.

Install & Configure Elasticsearch

1.  Start by navigating to the Elastic downloads page and selecting Elasticsearch.

2.  Login to your Ubuntu box and download the DEB package.  I put it in tmp for easy cleanup.
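The 6.3.2 DEB is available from Elastic’s artifacts host:

```shell
cd /tmp
wget https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-6.3.2.deb
```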

3.  Now download the checksum and compare against your downloaded package.  It should return OK.
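Elastic publishes a SHA-512 checksum alongside each package:

```shell
wget https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-6.3.2.deb.sha512
shasum -a 512 -c elasticsearch-6.3.2.deb.sha512
```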

4.  Install Elasticsearch.
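```shell
sudo dpkg -i elasticsearch-6.3.2.deb
```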

5.  Open the Elasticsearch config file found at /etc/elasticsearch/elasticsearch.yml and uncomment/edit the following settings.
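The relevant lines, using example names you should replace with your own:

```yaml
cluster.name: my-cluster-name
node.name: my-node-name
path.data: /var/lib/elasticsearch
path.logs: /var/log/elasticsearch
network.host: localhost
```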

This will configure your cluster name as my-cluster-name, your node (or server) name as my-node-name, your data storage location as /var/lib/elasticsearch, your log location as /var/log/elasticsearch, and your host as localhost (  These are pretty default settings and I don’t see many reasons to change them, but do so if you wish.

6.  Restart/Reload the service, daemon, and enable the service.
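```shell
sudo systemctl daemon-reload
sudo systemctl enable elasticsearch.service
sudo systemctl restart elasticsearch.service
```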

7.  Test Elasticsearch.
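A quick curl against the REST API on port 9200 is enough:

```shell
curl -X GET "localhost:9200"
```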

Which should return:
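Something like the following, abridged here; your cluster UUID, build hashes, and other fields will differ:

```json
{
  "name" : "my-node-name",
  "cluster_name" : "my-cluster-name",
  "version" : {
    "number" : "6.3.2"
  },
  "tagline" : "You Know, for Search"
}
```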


Install & Configure Kibana

1.  Download Kibana DEB package from Elastic downloads page.
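```shell
cd /tmp
wget https://artifacts.elastic.co/downloads/kibana/kibana-6.3.2-amd64.deb
```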

2.  Install Kibana.
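```shell
sudo dpkg -i kibana-6.3.2-amd64.deb
```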

3.  Open the Kibana config file found at /etc/kibana/kibana.yml and uncomment the following:
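Once uncommented, the defaults look like this; since nginx will proxy to Kibana locally, server.host stays on localhost:

```yaml
server.port: 5601
server.host: "localhost"
elasticsearch.url: "http://localhost:9200"
```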

4.  Restart/Reload daemon, enable and start the service.
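```shell
sudo systemctl daemon-reload
sudo systemctl enable kibana.service
sudo systemctl start kibana.service
```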


Install & Configure Nginx (Source)

1.  Install nginx.
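```shell
sudo apt update
sudo apt install nginx
```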

2.  Setup user for basic authentication.
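One way to do this is with openssl, writing an htpasswd-style file for nginx.  The kibanaadmin username and file path below are just examples:

```shell
echo "kibanaadmin:$(openssl passwd -apr1)" | sudo tee -a /etc/nginx/htpasswd.users
```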

Enter a password for the user when prompted.

3.  Configure nginx by clearing /etc/nginx/sites-available/default and inputting the following:
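A minimal reverse-proxy server block along these lines works (replace example.com with your server’s name or IP, and adjust the auth file path if you used a different one):

```nginx
server {
    listen 80;

    server_name example.com;

    auth_basic "Restricted Access";
    auth_basic_user_file /etc/nginx/htpasswd.users;

    location / {
        proxy_pass http://localhost:5601;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
    }
}
```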

This will configure nginx as a reverse-proxy for Kibana, while also requiring the username and password set up in step two.

4.  Test nginx configuration and restart service.
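```shell
sudo nginx -t
sudo systemctl restart nginx
```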


Install & Configure Logstash

1.  Download the Logstash DEB package from the Elastic downloads page.
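```shell
cd /tmp
wget https://artifacts.elastic.co/downloads/logstash/logstash-6.3.2.deb
```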

2.  Install Logstash.
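```shell
sudo dpkg -i logstash-6.3.2.deb
```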

3.  Create the file /etc/logstash/conf.d/10-beats.conf and input the following:
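A beats input listening on port 5044 with SSL off:

```conf
input {
  beats {
    port => 5044
    ssl => false
  }
}
```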

This will configure Logstash to listen for beats applications on port 5044 without requiring SSL.

4.  Create the file /etc/logstash/conf.d/50-output.conf and input the following:
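An elasticsearch output that builds the index name from the beat metadata and the event date:

```conf
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
  }
}
```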

This will configure Logstash to output beats data to Elasticsearch on this host, to an index whose name is determined by the specified variables; in this case, the beats application name plus the date.  Ex. winlogbeat-2018.08.23.

5.  Test your Logstash configuration.
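Logstash ships a config test flag for this:

```shell
sudo /usr/share/logstash/bin/logstash --path.settings /etc/logstash --config.test_and_exit
```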

6.  Restart and enable Logstash service.
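```shell
sudo systemctl restart logstash.service
sudo systemctl enable logstash.service
```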


At this point you should now have a functional ELK server that will accept input from BEATS!