CloudForms / ManageIQ automation and centralized logging with Logstash, Kibana and Elasticsearch

  • laurent 

CloudForms / ManageIQ has a very powerful automation engine. There is basically nothing you can’t do with it. You could even trigger an API call to make coffee during a provisioning run. But what if your provisioning fails?

Every step executed by the CloudForms automation engine gets logged to /var/www/miq/vmdb/log/evm.log or /var/www/miq/vmdb/log/automation.log. As you can imagine, these logs are very verbose, which in one way is awesome (you have all the data), but all that data also makes troubleshooting difficult.


If you do not have centralized logging in place today, you would have to SSH to the CloudForms appliance and tail the logs to troubleshoot a failed run.

This post will help you build a Logstash server on RHEL 7 with a log filter for the CloudForms evm and automation logs. This will empower you to filter the logs by task_id, Next Stage, etc., which gives you a complete history of exactly what happened during a provisioning run. For example, you could go check the clone_options that were passed during provisioning.


You can find all the code snippets as an install script here.
1. Add the hostname to /etc/hosts (this is important as the hostname needs to resolve)
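The original snippet is not shown, but a minimal /etc/hosts entry might look like this (the IP and elk.example.com are placeholders for your own address and hostname):

```text
192.168.1.10   elk.example.com   elk
```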

2. Install a RHEL 7 server and register it to RHN. You will need the following repos:
rhel-7-server-optional-beta-rpms (for golang, to create a working SSL cert)
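Assuming the box is registered with subscription-manager, enabling the repos could look like this (the base repo ID is an assumption; the optional-beta repo is the one named above):

```shell
subscription-manager register
subscription-manager repos --enable=rhel-7-server-rpms \
                           --enable=rhel-7-server-optional-beta-rpms
```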

3. Disable firewalld and ipv6
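A sketch of this step; the sysctl approach to disabling IPv6 is an assumption, as the original commands are not shown:

```shell
systemctl stop firewalld
systemctl disable firewalld

# Disable IPv6 persistently via sysctl (one common approach)
cat >> /etc/sysctl.conf <<'EOF'
net.ipv6.conf.all.disable_ipv6 = 1
net.ipv6.conf.default.disable_ipv6 = 1
EOF
sysctl -p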

4. Set SELinux to permissive mode
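For example:

```shell
# Switch the running system to permissive immediately
setenforce 0
# Persist the setting across reboots
sed -i 's/^SELINUX=.*/SELINUX=permissive/' /etc/selinux/config
```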

5. Add the elastic search repo
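The repo file from the Elasticsearch 1.x era looked roughly like this (the exact version in the baseurl depends on when you follow along; drop it in /etc/yum.repos.d/elasticsearch.repo):

```ini
[elasticsearch-1.4]
name=Elasticsearch repository for 1.4.x packages
baseurl=http://packages.elasticsearch.org/elasticsearch/1.4/centos
gpgcheck=1
gpgkey=http://packages.elasticsearch.org/GPG-KEY-elasticsearch
enabled=1
```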

6. Install Elasticsearch, Java and Apache
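Something like the following; the OpenJDK version is an assumption for RHEL 7 of that era:

```shell
yum -y install java-1.7.0-openjdk elasticsearch httpd
```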

7. Disable dynamic scripts for elasticsearch
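In Elasticsearch 1.x this was a single setting appended to the config:

```yaml
# /etc/elasticsearch/elasticsearch.yml
script.disable_dynamic: true
```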

8. Configure systemd to start Elasticsearch on boot
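For example:

```shell
systemctl daemon-reload
systemctl enable elasticsearch.service
systemctl start elasticsearch.service
```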

9. Now that we have Elasticsearch running we can install Kibana. Download and unpack the Kibana tarball, and make sure to create a /var/www/kibana3/pub dir, which will serve as the directory for the client certs and scripts.
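A sketch of this step; the download URL is the historical Kibana 3.0.1 location and may no longer resolve:

```shell
cd /tmp
curl -O https://download.elasticsearch.org/kibana/kibana/kibana-3.0.1.tar.gz
tar xzf kibana-3.0.1.tar.gz
mkdir -p /var/www/kibana3
cp -R kibana-3.0.1/* /var/www/kibana3/
# pub/ will hold the client certs and scripts served to the forwarders
mkdir -p /var/www/kibana3/pub
```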

10. Change the port for Kibana in kibana-3.0.1/config.js from 9200 to 80, as we will proxy all traffic through port 80 in Apache
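The relevant line in config.js ends up looking like this:

```javascript
// kibana-3.0.1/config.js -- talk to Apache on port 80 instead of
// hitting Elasticsearch on 9200 directly
elasticsearch: "http://"+window.location.hostname+":80",
```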

11. All the config for Kibana is now done. Let’s create the Apache config
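A minimal sketch of such a vhost, assuming the paths from the earlier steps; the ServerName is a placeholder, and the proxied endpoints are the ones Kibana 3 typically needs:

```apache
<VirtualHost *:80>
  # Placeholder hostname -- use your own
  ServerName elk.example.com
  DocumentRoot /var/www/kibana3

  # Proxy the Elasticsearch endpoints Kibana 3 uses through port 80
  ProxyPass /_aliases http://127.0.0.1:9200/_aliases
  ProxyPassReverse /_aliases http://127.0.0.1:9200/_aliases
  ProxyPass /_nodes http://127.0.0.1:9200/_nodes
  ProxyPassReverse /_nodes http://127.0.0.1:9200/_nodes
  ProxyPassMatch ^/(.*/_search|.*/_mapping|kibana-int/.*)$ http://127.0.0.1:9200/$1

  <Location />
    AuthType Basic
    AuthName "Kibana"
    AuthUserFile /etc/httpd/conf.d/kibana.htpasswd
    Require valid-user
  </Location>
</VirtualHost>
```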

12. Set a username and password for the web interface
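Assuming the htpasswd file path from the vhost above (the username is up to you):

```shell
htpasswd -c /etc/httpd/conf.d/kibana.htpasswd admin
```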

13. Start apache and make sure it’s started on boot

14. Add the logstash repo
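The Logstash 1.4-era repo file looked roughly like this (version in the baseurl is an assumption; put it in /etc/yum.repos.d/logstash.repo):

```ini
[logstash-1.4]
name=logstash repository for 1.4.x packages
baseurl=http://packages.elasticsearch.org/logstash/1.4/centos
gpgcheck=1
gpgkey=http://packages.elasticsearch.org/GPG-KEY-elasticsearch
enabled=1
```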

15. Install logstash

16. As we are going to use logstash-forwarder, we want a secure connection. To make this happen, and to avoid running into any SSL issues, we need to install the golang package from the beta channel so that we can build the cert generator.
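lc-tlscert comes from the log-courier project; a sketch of fetching and building it (the source path inside the repo has moved between releases, so check the checkout):

```shell
yum -y install golang git
git clone https://github.com/driskell/log-courier.git
cd log-courier
# Build the cert generator -- the exact source path may differ per release
go build -o lc-tlscert ./src/lc-tlscert
```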

17. Once you have built lc-tlscert, execute it and answer all the questions to create an SSL cert.

18. Create an ssl directory in /etc/logstash and copy the two keys into the newly created directory.
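For example (the filenames are whatever you chose when answering the lc-tlscert questions; selfsigned.* is an assumption):

```shell
mkdir -p /etc/logstash/ssl
cp selfsigned.crt selfsigned.key /etc/logstash/ssl/
```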

19. Copy the created crt to the kibana3 pub folder

20. Next we need to create the input configuration for lumberjack. We will listen on port 5000
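A minimal lumberjack input for Logstash 1.4, assuming the cert filenames from step 18:

```conf
# /etc/logstash/conf.d/01-lumberjack-input.conf
input {
  lumberjack {
    port            => 5000
    ssl_certificate => "/etc/logstash/ssl/selfsigned.crt"
    ssl_key         => "/etc/logstash/ssl/selfsigned.key"
  }
}
```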

21. Finally, let’s write the filter for evm.log and automation.log. You will get task_id and miq_msg fields to filter on
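A sketch of such a filter, not the author's exact one: EVM log lines carry a Q-task_id([...]) marker, which a grok pattern can split into the task_id and the remaining message. The type names assume the forwarder tags events as evm/automation:

```conf
# /etc/logstash/conf.d/10-miq-filter.conf
filter {
  if [type] == "evm" or [type] == "automation" {
    grok {
      # Pull the Q-task_id([...]) marker into task_id,
      # keep the rest of the line as miq_msg
      match => [ "message",
                 "Q-task_id\(\[%{DATA:task_id}\]\)%{GREEDYDATA:miq_msg}" ]
      tag_on_failure => [ "_no_task_id" ]
    }
  }
}
```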

22. Last, we need to tell Logstash where to send its data to.
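With Elasticsearch on the same host, a Logstash 1.4-style output could be as simple as:

```conf
# /etc/logstash/conf.d/30-output.conf
output {
  elasticsearch {
    host     => "localhost"
    protocol => "http"
  }
}
```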

23. Restart and enable logstash to start on boot
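For example:

```shell
systemctl restart logstash
systemctl enable logstash
```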

24. We can now prepare the client configs. cd to /var/www/kibana3/pub/ (the directory created in step 9) and download the following files

25. In the same directory add this script

26. Log in to the CloudForms appliance, download and run the
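The script itself is not shown, but the logstash-forwarder config it sets up on the appliance would look roughly like this — the server name and CA path are assumptions; the log paths are the CloudForms logs named at the top of the post:

```json
{
  "network": {
    "servers": [ "elk.example.com:5000" ],
    "ssl ca": "/etc/pki/tls/certs/selfsigned.crt"
  },
  "files": [
    {
      "paths": [ "/var/www/miq/vmdb/log/evm.log" ],
      "fields": { "type": "evm" }
    },
    {
      "paths": [ "/var/www/miq/vmdb/log/automation.log" ],
      "fields": { "type": "automation" }
    }
  ]
}
```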

27. Start the logstash forwarder
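For example:

```shell
systemctl start logstash-forwarder
systemctl enable logstash-forwarder
```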

28. You just finished the setup. Log in to your ELK server; you should already have data you can work with.
