Set Up a MySQL + Python Docker Dev Stack
Apr 06, 2019
Learning Curve
The Pain
The first time I set up an app to connect to a MySQL database, I spent at least a full hour fiddling with the connection string before throwing my hands up in frustration and taking a walk. I find walking to be very therapeutic, so I calmed down and figured out that I was mixing up the ports. I had a MySQL container and a Node.js container. I already had something running on port 3306 on that computer, so I exposed MySQL on port 3307, but then tried to connect to it from the Node.js container as localhost:3307.
Figuring it Out
Now I can say: well, dummy, all the containers in a docker-compose stack talk to one another because Docker does magic with networking. The hostname is the same as the service name, and the port is the default internal port of the application. Hindsight and all that.
Onwards with an Example!
Docker-Compose networking MAGIC
If you read my learning curve shenanigans above, you will generally know how this works. When you spin up a docker-compose stack, it kind of acts as if there are IT goblins living in your computer, creating hostnames and connections and whatnot.
Let's say I am using the standard MySQL Docker image (as one should!) to serve up my database. The default port of MySQL is 3306, and the hostname is whatever name I give to the service.
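A minimal docker-compose.yml for this might look something like the following; the image tag, database name, and credentials here are placeholder assumptions:

```yaml
version: '3'
services:
  mysql_db:
    image: mysql:5.7
    environment:
      MYSQL_ROOT_PASSWORD: my_secret_password
      MYSQL_DATABASE: app_db
      MYSQL_USER: app_user
      MYSQL_PASSWORD: db_user_password
    ports:
      # Map the container's default MySQL port to the host
      - "3306:3306"
```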
Now, I could spin this up as-is by putting the above docker-compose.yml file in a directory and running:
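```bash
# Build (if needed) and start the whole stack in the background
docker-compose up -d
```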
Notice that the service name on line 3 of our docker-compose config is mysql_db, and that this matches both the hostname we will connect to later and the name we give to the docker-compose exec command.
MySQL Docker Image Environment Variables
You'll notice we set some environment variables starting with MYSQL_. These are configuration variables read by the init process in the MySQL Docker container and used to spin up your database and set the initial username and password. If your stack is anything but a dev environment, you would want to supply your passwords through secrets, or at least through a .env file.
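The .env route is the easier of the two for a dev setup: docker-compose automatically reads a .env file sitting next to the compose file and substitutes any variables referenced in it. A quick sketch, reusing the placeholder variable from above:

```yaml
# With a .env file next to docker-compose.yml containing, e.g.:
#   MYSQL_ROOT_PASSWORD=my_secret_password
# docker-compose substitutes the value in automatically:
environment:
  MYSQL_ROOT_PASSWORD: ${MYSQL_ROOT_PASSWORD}
```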
Test the Connection
If you haven't used docker-compose exec, it is a super handy way to drop into a shell in your container. You can also use it for more involved commands like database backups, or even to use a docker image as a precompiled binary and simply execute a single program on it. For now we will use it simply to test our MySQL database connection.
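Something like the following, using the root user we configured above:

```bash
# Drop into a shell inside the running MySQL container
docker-compose exec mysql_db bash
# Then, from that shell, connect with the mysql client
mysql -h mysql_db -u root -p
```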
For the sake of illustration I used the hostname mysql_db here to connect to the MySQL database, but I could just as well have used localhost, since this is the actual container our database sits on. For the password, type in the password you set as the MYSQL_ROOT_PASSWORD environment variable.
Instead of root, I could also have used the user and password I defined in the MYSQL_USER / MYSQL_PASSWORD environment variables.
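For instance, with the placeholder names from the compose sketch:

```bash
# Connect as the application user instead of root
mysql -h mysql_db -u app_user -p
```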
Connect to the Database From a Python App
Here is where the fun happens. We can spin up a Docker image for our Python app by defining the packages we need in an environment.yml and building an image from it. For more information on this process, see my full tutorial at Develop a Python Flask App With Docker.
For this particular example we need sqlalchemy and a MySQL connection library. I like to throw ipython in there too, because I get kind of twitchy when I don't have it. IPython has pretty much replaced bash for me at this point. If you prefer Jupyter notebooks, you may want to add that too.
I noticed that, unfortunately, most of the MySQL connectors on conda acted a bit funny, except for pymysql. That was totally fine for my case, but if you require a different driver, I would recommend testing it out first with conda, and then, if that doesn't work, installing it with pip.
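Here is a sketch of what that environment.yml could look like; the Python version is an assumption, and the env name connect is the one used in the Dockerfile below:

```yaml
name: connect
channels:
  - conda-forge
  - defaults
dependencies:
  - python=3.7
  - sqlalchemy
  - pymysql
  - ipython
```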
I have a pretty standardized Dockerfile format I use for Python apps. I define my stack in a conda env YAML definition, copy it over to my Docker container, and install it through conda. I use the base miniconda image, but installing miniconda is a very straightforward process if you are working from a different container type.
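A sketch of that Dockerfile, assuming the environment.yml above; the PATH line stands in for the conda environment variables I mention next:

```dockerfile
FROM continuumio/miniconda3

# Copy over the conda env definition and create the env
COPY environment.yml /tmp/environment.yml
RUN conda env create -f /tmp/environment.yml

# Activate the env for interactive shells...
RUN echo "source activate connect" >> ~/.bashrc
# ...and expose it to subsequent build steps via the PATH
ENV PATH /opt/conda/envs/connect/bin:$PATH
```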
You may wonder why I added source activate connect to my ~/.bashrc AND set the corresponding conda environment variables. The answer is convenience. I can use my conda env in the Docker build itself, say to install pip packages that didn't install correctly as conda packages. I've found that dealing with the Docker shell doesn't always work as I expect, and setting the conda env this way ensures it works the way I think it should.
Add your Python App to your Stack
You can quite easily add a new service to your docker-compose stack. The whole thing looks like this:
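Sketched here with the same placeholder image and credentials as before:

```yaml
version: '3'
services:
  mysql_db:
    image: mysql:5.7
    environment:
      MYSQL_ROOT_PASSWORD: my_secret_password
      MYSQL_DATABASE: app_db
      MYSQL_USER: app_user
      MYSQL_PASSWORD: db_user_password
    ports:
      - "3306:3306"
  python_app:
    build: .
    # Keep the container alive so we can exec into it later
    command: tail -f /dev/null
    depends_on:
      - mysql_db
```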
Specifying the command as tail -f /dev/null in the compose file is a cheap trick so I can keep all my configuration in the same compose file and exec commands on the python_app container. With docker-compose you can only execute commands on running containers; without this command, the python_app container would build and then immediately exit.
Connect to your MySQL DB from your Python App
Let's open up a shell in our python_app service, and test out the connection from there.
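Something along these lines, using sqlalchemy with the pymysql driver and the placeholder credentials from the compose file:

```bash
docker-compose exec python_app bash
```

Then, inside the container (ipython works nicely here):

```python
from sqlalchemy import create_engine, text

# Hostname is the compose service name; user, password, and
# database are the placeholder values from the compose file
engine = create_engine(
    "mysql+pymysql://app_user:db_user_password@mysql_db:3306/app_db"
)

with engine.connect() as conn:
    print(conn.execute(text("SELECT VERSION()")).fetchone())
```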
As you can see, we connected to the mysql_db service in the docker-compose stack using the credentials we specified earlier, and were able to execute a query. Once you have these basics down, the sky is the limit!
Wrap Up
That's it! I hope I demonstrated just how easy it is to create highly configurable software stacks using docker-compose.
I always recommend playing around with the code yourself, so please grab the source code and get teching!