Setup a Redis + Python Docker Dev Stack

May 25, 2019

Query a Redis Data Store from a Python Application

Redis is one of my favorite technologies to use when I'm building web apps. It's just super. If you haven't used it, it's sort of like a database without the structure, and it's blazing fast because it lives in memory. Have you ever set up a web application that took longer than the millisecond of patience people have when using the internet? There's a good bet you could cache quite a few requests and speed the whole thing up!

I used to do all kinds of nonsense where I would save things to temporary files and pick them back up later, or had cron jobs stuffing computations into pickle files. No more! Now I stick all of that in a Redis in-memory cache, which is way better and super fast!

By the end of this blog post you will have all the tools you need to query a Redis database, with some dummy data, from a Python application, all without installing a single thing besides Docker onto your computer!

Learning Curve

If you read my Setup a MySQL + Python Docker Dev Stack post, you'll know all about how much trouble I had figuring out that pesky connection string. It was such a nitpicky little detail, but so crucial!

That single lesson, while very frustrating, has held true with every database I have ever connected to. It doesn't matter whether it's MySQL, Postgres, MongoDB, or a Redis cache: you just need to know the user, password, host, and port!
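Put together, those pieces slot neatly into a connection URL. Here's a quick sketch using the redis-py client; the hostname redis and db number 0 anticipate the compose stack we'll build below:

import redis

# Generic shape: scheme://user:password@host:port/db
# 'redis' is the docker-compose service name; 6379 is the default port
r = redis.from_url('redis://redis:6379/0')
Python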

Onwards with an Example!

Docker-Compose networking MAGIC

When you spin up a docker-compose stack it kind of acts as if there are IT goblins living in your computer and creating hostnames and connections and whatnot. 

Let's say I am using the standard Redis Docker image (as one should!) to serve up my awesome in-memory data store. The default port for Redis is 6379, and the hostname is whatever name I give to the service. Once you know that, figuring out the connection string is a breeze!

version: '3'

services:
  redis:
    image: redis
    ports:
      - 6379
YAML

Now, I could spin this up as-is by putting the above docker-compose.yml file in a directory and running:

docker-compose up -d
Bash
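If the stack came up cleanly, a quick sanity check should show the redis container running:

docker-compose ps
Bash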

Redis Docker Image Environment Variables and Security

By default Redis is fairly open and easy to get started with. Of course, if you are using this in a production setting you should always ensure it is locked down!
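For example, the stock image will happily take Redis server flags as the container command, so one simple way to require a password is to set it right in the service definition (this snippet is just a sketch, and changeme is a placeholder, not a recommendation!):

  redis:
    image: redis
    command: ["redis-server", "--requirepass", "changeme"]
    ports:
      - 6379
YAML

Any client then has to supply the same password when connecting, e.g. redis.Redis(host='redis', port=6379, password='changeme'). With the Redis side of the stack squared away, here is the Dockerfile I use for the Python app that will talk to it: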

# Base image with conda preinstalled
FROM continuumio/miniconda3:4.5.11

# A few quality-of-life packages for poking around inside the container
RUN apt-get update -y; apt-get upgrade -y; apt-get install -y vim-tiny vim-athena ssh

# Copy in the conda environment definition and build the env
COPY environment.yml environment.yml
RUN conda env create -f environment.yml

RUN echo "alias l='ls -lah'" >> ~/.bashrc
RUN echo "source activate connect" >> ~/.bashrc

# Setting these environment variables is the functional equivalent of running 'source activate connect'
ENV CONDA_EXE /opt/conda/bin/conda
ENV CONDA_PREFIX /opt/conda/envs/connect
ENV CONDA_PYTHON_EXE /opt/conda/bin/python
ENV CONDA_PROMPT_MODIFIER (connect)
ENV CONDA_DEFAULT_ENV connect
ENV PATH /opt/conda/envs/connect/bin:/opt/conda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
Dockerfile

Connect to Redis From A Python App

Here is where the fun happens. We can spin up a Docker image for our Python app by defining the packages we need in an environment.yml and building it. For more information on this process you can see my full tutorial at Develop a Python Flask App With Docker.

In this particular example we need a Redis connection library. I like to throw ipython in there too, because it's just so useful.
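Here's a minimal environment.yml sketch that fits this setup; the env name has to be connect to match the Dockerfile above, and the exact package versions are just my assumptions:

name: connect
channels:
  - conda-forge
dependencies:
  - python=3.7
  - redis-py
  - ipython
YAML

With the environment defined, the full docker-compose.yml ties the two services together: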

version: '3'

# Run as
# docker-compose build; docker-compose up -d
# Check with
# docker ps
# Then check the logs with
# docker logs --tail 50 $container_id
# docker-compose images
# docker-compose logs --tail 20 python_app


services:

  redis:
    image: redis
    ports:
      - 6379
    networks:
      - app-tier

  python_app:
    build:
      context: .
      dockerfile: Dockerfile
    depends_on:
      - redis
    networks:
      - app-tier
    command:
      tail -f /dev/null


networks:
  app-tier:
    driver: bridge
YAML

I have a pretty standardized Dockerfile format I use for Python apps. I create my stack in a conda env YAML definition, copy it over to my Docker container, and install it through conda. I use the base Miniconda image, but installing Miniconda is a very straightforward process if you are working from a different container type.
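If you do start from a different base image, a minimal sketch of installing Miniconda yourself (using Anaconda's published latest-Linux installer URL; the debian:buster-slim base is just an example) looks something like this:

FROM debian:buster-slim

# Fetch and run the Miniconda installer non-interactively into /opt/conda,
# mirroring the layout of the continuumio/miniconda3 image
RUN apt-get update -y && apt-get install -y wget bzip2 \
    && wget -q https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh -O /tmp/miniconda.sh \
    && bash /tmp/miniconda.sh -b -p /opt/conda \
    && rm /tmp/miniconda.sh

ENV PATH /opt/conda/bin:$PATH
Dockerfile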

Specifying the command as tail -f /dev/null in the compose file is a cheap trick so I can keep all my configurations in the same compose file and exec commands on the python_app container. With docker-compose you can only execute commands on running containers; without this command the python_app container would build, start, and immediately exit.

You may wonder why I added the source activate connect to my ~/.bashrc AND set the corresponding conda environment variables. The answer is convenience. I can use my conda env in the Docker build itself, say to install pip packages that didn't install correctly as conda packages. I've found that dealing with the Docker shell doesn't always work as I expect, and setting the conda env this way ensures it works the way I think it should! ;-)
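For instance, because /opt/conda/envs/connect/bin comes first on PATH, a pip install during the build lands in the connect env rather than the system Python. A one-line sketch you could tack onto the Dockerfile:

# This runs the connect env's pip, thanks to the PATH set above
RUN pip install redis
Dockerfile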

Connect to Your Redis Data Store From Your Python App

Let's open up a shell in our python_app service, and test out the connection from there.

docker-compose exec python_app bash
ipython
Bash

import redis

# The hostname is the docker-compose service name; 6379 is Redis's default port
r = redis.Redis(host='redis', port=6379, db=0)

# Set a key, then read it back
r.set('foo', 'bar')
print(r.get('foo'))  # b'bar' -- redis-py returns bytes by default
Python

As you can see, we connected to the Redis data store in the docker-compose stack and were able to execute commands against it. Once you have these basics down, the sky is the limit!
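To bring it back to the caching pitch at the top of the post, here's a small sketch of the cache-aside pattern; the key name and the 300-second TTL are arbitrary choices of mine:

import time
import redis

r = redis.Redis(host='redis', port=6379, db=0)

def expensive_computation():
    time.sleep(2)  # stand-in for a slow query or calculation
    return 'result'

def cached_result():
    # Cache-aside: try Redis first, recompute and store on a miss
    value = r.get('expensive-key')
    if value is None:
        value = expensive_computation()
        r.setex('expensive-key', 300, value)  # expire after 5 minutes
    return value
Python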

Wrap Up

That's it! I hope I demonstrated just how easy it is to create highly configurable software stacks using docker-compose.

I always recommend grabbing the source code and playing around with it yourself!
