ansible-container: fixing the "mysql_config: command not found" issue

It has to be noted that ansible-container is not a supported Red Hat product, so everything you see here is unsupported. If you want to use Ansible to package containers, look at the Ansible Playbook Bundle in OpenShift.

I thought it would be a good idea to see where ansible-container currently stands. My goal was to create a WordPress container backed by MariaDB. Here is the GitHub repo with my files:

Based on the documentation, I was supposed to be able to build my project simply by running ansible-container build. Instead, this resulted in the following error:

Collecting mysql-python (from -r /_ansible/build/ansible-requirements.txt (line 3))
Downloading (108kB)
Complete output from command python setup.py egg_info:
sh: mysql_config: command not found
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/tmp/pip-build-FeCnAk/mysql-python/setup.py", line 17, in <module>
    metadata, options = get_config()
  File "/tmp/pip-build-FeCnAk/mysql-python/setup_posix.py", line 43, in get_config
    libs = mysql_config("libs_r")
  File "/tmp/pip-build-FeCnAk/mysql-python/setup_posix.py", line 25, in mysql_config
    raise EnvironmentError("%s not found" % (mysql_config.path,))
EnvironmentError: mysql_config not found

This is kind of weird, as my Ansible role installs MySQL-python. It seems that this doesn't matter: judging by the requirements path in the error (/_ansible/build/ansible-requirements.txt), pip is compiling MySQL-python inside the build container, which lacks the MySQL client development files that provide mysql_config.
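Since the failure happens while pip compiles MySQL-python, a quick sanity check is whether mysql_config is on the PATH at all in the environment doing the build. The package names suggested below (mariadb-devel, gcc) are my assumption for a RHEL/CentOS base image, not something taken from the original build:

```shell
# Check whether the MySQL client development files that MySQL-python
# needs at build time are available; mysql_config ships with them.
if command -v mysql_config >/dev/null 2>&1; then
    echo "mysql_config found: $(command -v mysql_config)"
else
    # On RHEL/CentOS the headers typically come from mariadb-devel
    # (or mysql-devel); pip also needs a C compiler for the extension.
    echo "mysql_config missing - try: yum install -y mariadb-devel gcc"
fi
```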

A workaround to get this working is to pass --use-local-python, which makes the build use the Python you already have on the system:

[root@host87 wordpress]# ansible-container build --use-local-python

Once this builds correctly, you can run:

[root@host87 wordpress]# ansible-container run 

You can now go to http://localhost (or the IP where the containers are running) and should see the WordPress install page.
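As a quick check before opening a browser, you can probe the site with curl. The localhost URL is the one assumed above; substitute the host that is actually running the containers:

```shell
# Print the HTTP status code from WordPress; expect a 30x redirect to
# the install page, or 200 once setup is complete.
curl -s -o /dev/null -w '%{http_code}\n' http://localhost/ 2>/dev/null || \
    echo "no response - are the containers running?"
```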

If you want to push this image to OpenShift, you can do the following. (Be aware that in my case OpenShift was not able to run the image, failing with Error: InvalidImageName and Failed to apply default image tag:.)

Log in to OpenShift and create a new project (be aware that the project name has to match your ansible-container project name):

oc new-project wordpress

For ansible-container to work with OpenShift, grant the user the following roles so that they can access the registry:

oadm policy add-role-to-user system:registry bob
oadm policy add-role-to-user admin bob -n wordpress
oadm policy add-role-to-user system:image-builder bob
oadm policy add-cluster-role-to-user cluster-admin bob

Now bob can push images to OpenShift.

Then, as bob, you can retrieve your API token:

oc whoami -t

With the token you can authenticate, so that you can push the images to the registry:

oc login --token $(oc whoami -t)
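If you prefer to push images with the docker client directly (rather than via ansible-container's --push-to), the same token can be reused to log in to the integrated registry. The registry address below is a hypothetical placeholder, not taken from this setup:

```shell
# Log the docker client into the OpenShift integrated registry with the
# OpenShift session token.
REGISTRY=172.30.1.1:5000   # placeholder: substitute your cluster's registry address
if command -v oc >/dev/null 2>&1 && command -v docker >/dev/null 2>&1; then
    docker login -u bob -p "$(oc whoami -t)" "$REGISTRY"
else
    echo "oc and docker clients are required"
fi
```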

Then push the images to the OpenShift registry and generate the deployment configuration:

[root@host82 wordpress]# ansible-container --engine openshift deploy --push-to --username bob --password $(oc whoami -t) --roles-path ./roles
Parsing conductor CLI args.
Engine integration loaded. Preparing push.	engine=OpenShift™
The push refers to a repository []
Layer already exists
20180120192555: digest: sha256:faa6707abb3876ecf26e0580f2bd5c416bd45fdb79f39f5984a2d7e04c3d91fc size: 741
The push refers to a repository []
Mounted from wordpress/wordpress-db
20180120192707: digest: sha256:df8486c732a5092b420338dc2f9871d8554b096136b9bc0f3d2a834c05be094a size: 742
Conductor terminated. Cleaning up.	command_rc=0 conductor_id=93f470df51e1334d34fe5a18c8a73050beb44c085bfd0b7e934f4b872f14181d save_container=False
Parsing conductor CLI args.
Engine integration loaded. Preparing deploy.	engine=OpenShift™
Verifying image for db
Verifying image for wordpress
ansible-galaxy 2.5.0
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-galaxy
  python version = 2.7.5 (default, Aug  4 2017, 00:39:18) [GCC 4.8.5 20150623 (Red Hat 4.8.5-16)]
Using /etc/ansible/ansible.cfg as config file
Opened /root/.ansible_galaxy
Processing role ansible.kubernetes-modules
Opened /root/.ansible_galaxy
- downloading role 'kubernetes-modules', owned by ansible
- downloading role from
- extracting ansible.kubernetes-modules to /root/ansible-container-wordpress/wordpress/ansible_deployment/roles/ansible.kubernetes-modules
- ansible.kubernetes-modules (v0.3.1-6) was installed successfully
Conductor terminated. Cleaning up.	command_rc=0 conductor_id=7d2ca733add8d6042be3f154af9ef973a8bcb8ea8d33df3da1b23824a0c0212f save_container=False

If this was successful, you should be able to run the generated deployment playbook:

[root@host82 wordpress]# ansible-playbook ./ansible_deployment/wordpress.yml --tags start
 [WARNING]: Could not match supplied host pattern, ignoring: all

 [WARNING]: provided hosts list is empty, only localhost is available

PLAY [Manage the lifecycle of wordpress on OpenShift™] *****************************************************************************************

TASK [Create project wordpress] ****************************************************************************************************************
ok: [localhost]

TASK [Create service] **************************************************************************************************************************
changed: [localhost]

TASK [Create service] **************************************************************************************************************************
changed: [localhost]

TASK [Create deployment, and scale replicas up] ************************************************************************************************
changed: [localhost]

TASK [Create deployment, and scale replicas up] ************************************************************************************************
changed: [localhost]

TASK [Create route] ****************************************************************************************************************************
changed: [localhost]

TASK [Create route] ****************************************************************************************************************************
changed: [localhost]

PLAY RECAP *************************************************************************************************************************************
localhost                  : ok=7    changed=6    unreachable=0    failed=0