Set Up a MySQL + Python Docker Dev Stack

Apr 06, 2019
 

Learning Curve

The Pain

The first time I set up an app to connect to a MySQL database, I spent at least a full hour fiddling with the connection string before throwing my hands up in frustration and taking a walk. I find walking to be very therapeutic, so I calmed down and figured out that I was mixing up the ports. I had a MySQL container and a Node.js container. I already had something running on port 3306 on that computer, so I exposed MySQL's port as 3307, but then tried to connect to it from the Node.js container as localhost:3307.

Figuring it Out

Now I can say: well, dummy, all the containers in a docker-compose stack talk to one another because Docker does magic with networking, the hostname is the same as the service name, and the port is the default internal port of the application. Hindsight and all that.
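To make the mix-up concrete, here is a sketch of the two connection URLs (credentials and names are placeholders). From inside a compose network, containers reach each other by service name and internal container port, not by the port mapped on the host:

```python
from urllib.parse import urlsplit

# What I tried first: the host-side port mapping. Inside a container,
# localhost is the container itself, so this goes nowhere.
wrong = "mysql://user:password@localhost:3307/employees"

# What actually works: the compose service name as the hostname, plus
# the internal MySQL port.
right = "mysql://user:password@mysql_db:3306/employees"

parts = urlsplit(right)
print(parts.hostname, parts.port)  # mysql_db 3306
```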

Onwards with an Example!

Docker-Compose networking MAGIC

If you read my learning curve shenanigans above, you already have a general idea of how this works. When you spin up a docker-compose stack, it acts as if there are IT goblins living in your computer, creating hostnames and connections and whatnot.

Let's say I am using the standard MySQL Docker image (as one should!) to serve up my database. The default port of MySQL is 3306, and the hostname is whatever name I give to the service.

version: '3'
services:
  mysql_db:
    image: mysql:5.7
    environment:   # Set up mysql database name and password
      MYSQL_ROOT_PASSWORD: password
      MYSQL_DATABASE: employees
      MYSQL_USER: user
      MYSQL_PASSWORD: password
    networks:
      - app-tier

networks:
  app-tier:
    driver: bridge
YAML

Now, I could spin this up as is by putting the above docker-compose.yml file in a directory and running:

docker-compose up -d
docker-compose exec mysql_db bash
Bash

Notice that the service name, on line 3 of our docker-compose config, is mysql_db, and that this matches the name we pass to the docker-compose exec command.

 

MySQL Docker Image Environment Variables

You'll notice we set some environment variables starting with MYSQL_. These are configuration values read by the init process in the MySQL Docker container and used to spin up your database and set the initial username and passwords. If your stack is anything but a dev environment, you would want to supply your passwords through secrets, or at least through a .env file.
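As a sketch (with a made-up password), moving the root password into a .env file could look like this; docker-compose reads a .env file sitting next to docker-compose.yml and substitutes ${VARS} in the config:

```yaml
# .env (next to docker-compose.yml; keep it out of version control):
#   MYSQL_ROOT_PASSWORD=a-much-better-password

# docker-compose.yml then references the variable instead of the literal:
services:
  mysql_db:
    image: mysql:5.7
    environment:
      MYSQL_ROOT_PASSWORD: ${MYSQL_ROOT_PASSWORD}
```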

Test the Connection

If you haven't used docker-compose exec, it is a super handy way to drop into a shell in your container. You can also use it for more involved commands like database backups, or even to treat a Docker image as a precompiled binary and simply execute a single program in it. For now we will use it simply to test our MySQL database connection.

➜ docker-compose exec mysql_db bash
root@6700d86766a1:/# mysql -u root -h mysql_db -p
Enter password:
Welcome to the MySQL monitor.  Commands end with ; or \g.
Your MySQL connection id is 2
Server version: 5.7.25 MySQL Community Server (GPL)

Copyright (c) 2000, 2019, Oracle and/or its affiliates. All rights reserved.

Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.

Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.
Bash

For the sake of illustration I used the hostname, mysql_db, here to connect to the MySQL database, but I could just as well have used localhost, since this is the actual container our database sits in. For the password, type in the password you set as the MYSQL_ROOT_PASSWORD environment variable.

Instead of root I could also have used the user and password I defined in the MYSQL_USER / MYSQL_PASSWORD environment variables.
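Incidentally, docker-compose exec also covers the backup use case mentioned earlier. A sketch (the -T flag turns off pseudo-TTY allocation so the dump redirects cleanly to a file on the host):

```bash
# Dump the employees database from the running mysql_db container
# to a file on the host machine.
docker-compose exec -T mysql_db mysqldump -u root -ppassword employees > backup.sql
```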

Connect to the Database From A Python App

Here is where the fun happens. We can spin up a Docker image for our Python app by defining the packages we need in an environment.yml and building it. For more information on this process, see my full tutorial at Develop a Python Flask App With Docker.

For this particular example we need sqlalchemy and a MySQL connection library. I like to throw ipython in there too, because I get kind of twitchy when I don't have it. IPython has pretty much replaced bash for me at this point. If you prefer Jupyter notebooks, you may want to add that too.

name: connect
channels:
  - conda-forge
  - defaults
dependencies:
  - python=3.6
  - ipython
  - sqlalchemy
  # Depending on your exact setup a different mysql connector may work better than another
  - pymysql
  - mysqlclient
  - mysql-connector-python
YAML

I noticed that, unfortunately, the MySQL connectors on conda mostly acted a bit funny, except for pymysql. That was totally fine for my case, but if you require a different driver I would recommend testing it out first with conda, and then, if that doesn't work, installing it with pip.

I have a pretty standardized Dockerfile format I use for Python apps. I create my stack in a conda env YAML definition, copy it over to my Docker container, and install it through conda. I use the base miniconda image, but installing miniconda is a very straightforward process if you are working from a different container type.

FROM continuumio/miniconda3:4.5.11

RUN apt-get update -y; apt-get upgrade -y; apt-get install -y vim-tiny vim-athena ssh mysql-client

COPY environment.yml environment.yml

RUN conda env create -f environment.yml
RUN echo "alias l='ls -lah'" >> ~/.bashrc
RUN echo "source activate connect" >> ~/.bashrc

# Setting these environmental variables is the functional equivalent of running 'source activate my-conda-env'
ENV CONDA_EXE /opt/conda/bin/conda
ENV CONDA_PREFIX /opt/conda/envs/connect
ENV CONDA_PYTHON_EXE /opt/conda/bin/python
ENV CONDA_PROMPT_MODIFIER (connect)
ENV CONDA_DEFAULT_ENV connect
ENV PATH /opt/conda/envs/connect/bin:/opt/conda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
Bash

You may wonder why I added source activate connect to my ~/.bashrc AND set the corresponding conda environment variables. The answer is convenience. I can use my conda env in the Docker build itself, say to install pip packages that didn't install correctly as conda packages. I've found that dealing with the Docker shell doesn't always work as I expect, and setting the conda env this way ensures it works the way I think it should.
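For instance, since the PATH set above puts the env's bin directory first, a later build step could fall back to pip for a driver that misbehaved as a conda package (mysqlclient here is just a hypothetical example):

```dockerfile
# pip now resolves to /opt/conda/envs/connect/bin/pip, so this installs
# into the 'connect' env rather than the base conda env.
RUN pip install mysqlclient
```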

Add your Python App to your Stack

You can quite easily add a new service to your docker-compose stack. The whole thing looks like this:

version: '3'

# Run as
# docker-compose build; docker-compose up -d
# Check with
# docker ps
# Then check the logs with
# docker logs --tail 50 $service_name
# docker-compose images
# docker-compose logs --tail 20 $service_name

services:

  mysql_db:
    image: mysql:5.7
    environment:   # Set up mysql database name and password
      MYSQL_ROOT_PASSWORD: password
      MYSQL_DATABASE: employees
      MYSQL_USER: user
      MYSQL_PASSWORD: password
    networks:
      - app-tier

  python_app:
    build:
      context: .
      dockerfile: Dockerfile
    depends_on:
      - mysql_db
    networks:
      - app-tier
    command:
      tail -f /dev/null


networks:
  app-tier:
    driver: bridge
YAML

Specifying the command as tail -f /dev/null in the compose file is a cheap trick that lets me keep all my configuration in the same compose file and exec commands on the python_app container. With docker-compose you can only execute commands on running containers; without this command, the python_app container would build, start, and immediately exit.

Connect to your MySQL DB from your Python App

Let's open up a shell in our python_app service, and test out the connection from there.

➜ docker-compose exec python_app bash
(connect) root@716c9f30f2d3:/# ipython
Python 3.6.7 | packaged by conda-forge | (default, Feb 28 2019, 09:07:38)
Type 'copyright', 'credits' or 'license' for more information
IPython 7.4.0 -- An enhanced Interactive Python. Type '?' for help.

In [1]: from sqlalchemy import create_engine
In [2]: mysql_conn_str = "mysql+pymysql://root:password@mysql_db:3306/employees"
   ...: engine = create_engine(mysql_conn_str)
   ...: connection = engine.connect()
   ...: q = connection.execute('SHOW DATABASES')
   ...: available_tables = q.fetchall()
In [3]: available_tables
Out[3]:
[('information_schema',),
 ('employees',),
 ('mysql',),
 ('performance_schema',),
 ('sys',)]
Python

As you can see, we connected to mysql_db in the docker-compose stack using the credentials we specified earlier, and we were able to execute a command. Once you have these basics down, the sky is the limit!
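Hardcoding credentials in the connection string is fine for a throwaway dev stack; a small step up is assembling the string from environment variables, which docker-compose can inject per service. A sketch using made-up variable names (DB_HOST and friends are my invention, not anything the MySQL image sets), with defaults matching the compose file above:

```python
import os

# Made-up variable names; the defaults match the dev compose file above.
host = os.environ.get("DB_HOST", "mysql_db")
port = os.environ.get("DB_PORT", "3306")
user = os.environ.get("DB_USER", "root")
password = os.environ.get("DB_PASSWORD", "password")
database = os.environ.get("DB_NAME", "employees")

mysql_conn_str = f"mysql+pymysql://{user}:{password}@{host}:{port}/{database}"
print(mysql_conn_str)
```

In the compose file, these would go under the python_app service's environment: key, just like the MYSQL_* variables on the mysql_db service.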

Wrap Up

That's it! I hope I demonstrated just how easy it is to create highly configurable software stacks using docker-compose.

I always recommend playing around with the code yourself, so please grab the source code and get teching!
