This article originally appeared on my blog
The joys of a new job.
You get there on your first day and after the introductions, you are given your first task: to get the project up and running on your dev box.
The task comes with a link to the wiki, which has THE instructions:
- Download X
- Install Y
- Run “./build --special-flag”
- Copy Z to W
- Make sure foo but not bar
- ...
You know how it is going to end. And just in case you had any doubts, your new boss just confirms it:
“It’s probably a bit out of date, so can you please update it if you find any issues?”
Two days later, you emerge triumphant, proud of your achievement, and with a smile on your face: Your updated instructions are so accurate that the next person joining will be forever thankful.
Or maybe not
The instructions on the wiki seem like a good enough solution, given that each member of the team needs to pass the ordeal just once ... except when:
- You get a new and shiny dev box.
- Your dev box dies.
- You move to another project.
- Six months later, you go back to the original project. Nothing works anymore.
- A new feature requires ZooKeeper, and now everybody has to configure a local ZooKeeper cluster!
- Your operating system is upgraded and “insert-your-fav-database-here” doesn't want to start anymore. Welcome to DLL hell.
- That old project that your team still maintains needs Maven 2.1.1 and Postgres 7.1, but the current project requires Maven > 3.3 and at least Postgres 9.5 with the PostPic and OpenFTS extensions.
- Polyglot Microservices company! Install maven, gradle, npm, yarn, node, make, go, ruby, rake, lein, sbt, rebar, cassandra, mongodb, redis, postgresql, couchbase…
But sometimes, some component, for some unknown reason, somehow stops working, and it won't ever start again.
Enter Docker Compose
Docker Compose is a tool that lets you define a multi-container system in a single file and run it in Docker, creating a private network to isolate the system.
A Docker Compose file that will start a Postgres and a Redis is as simple as:
```yaml
version: "3"
services:
  postgres:
    image: postgres:9.5
    volumes:
      - ./db-provision:/docker-entrypoint-initdb.d/
    ports:
      - 5432:5432
  redis:
    image: redis:3.2.9
```
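And when that new feature needs an extra component, say the ZooKeeper node mentioned earlier, it is a few more lines in the same file rather than a new wiki page. A minimal sketch, assuming the official zookeeper image is good enough for local development, to be added under the services section above:

```yaml
  # Hypothetical addition: a single-node ZooKeeper for local development.
  zookeeper:
    image: zookeeper:3.4
    ports:
      - 2181:2181   # default ZooKeeper client port
```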
You configure the containers for your project, and Docker Compose creates a new environment, mapping some of the ports and files between the containers and your dev box.
Using Docker Compose to set up and run our local development environment gives us:
| What | How |
|---|---|
| One command to set up and start the system | `docker-compose up` |
| One command to upgrade or add new components | `docker-compose up --build` |
| One command to stop the system | `docker-compose stop` |
| One command to clean up any trace of the system | `docker-compose down --remove-orphans --rmi all` |
| A precise and repeatable process | The Docker Compose file is executed by a machine, so there is no room for imprecision. Just make sure that you pin the images to specific, immutable versions. |
| A process that never gets out of date | As the team uses Docker Compose every day for development, and any change to the system must be made in the Docker files, there is no opportunity for the instructions to go stale. |
| A process that can be version controlled | The Docker files are plain text: they live alongside the rest of the project source code, they are peer reviewed, and changes can be diffed and rolled back. |
| A process that allows for experimentation | As it is easy to destroy and recreate the system, and as the Docker files can be rolled back, there is no risk in trying new versions of components or fiddling with the configuration. |
| A way of running more than one version of the same project, or more than one project, at the same time | Unfortunately, this probably requires some manual fiddling to remap some of the ports exposed in the Docker Compose file, but at least all the ports are in one file (see the sketch after this table). After that, it is as simple as: `docker-compose -p my-project-one up -d; docker-compose -p my-project-again up` |
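As a sketch of what that manual fiddling can look like (the 15432 host port is an arbitrary choice, not a detail from the article): in the second checkout, change only the host side of any published port so it does not clash with the copy that is already running, and give each run its own project name with `-p`.

```yaml
# Hypothetical docker-compose.yml for a second checkout of the same project.
# Only the host side of the port mapping changes; the container port stays 5432.
version: "3"
services:
  postgres:
    image: postgres:9.5
    ports:
      - 15432:5432   # the first checkout keeps 5432:5432
  redis:
    image: redis:3.2.9
```

The `-p` flag keeps the containers and the private network of each run separate, so the two systems do not interfere with each other.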
A production-like environment or not?
When talking about Docker, one of the benefits usually mentioned is that it allows one to run a production-like environment anywhere.
Indeed, this is a big benefit. Your CI server should run its tests against a production-like version, and you should be able to run such a version locally. But for the day-to-day development experience, you want a setup that allows for a fast feedback cycle.
For example, you don’t want to minify your JavaScript files for each and every change that you make, and you probably want some sort of auto refresh.
Fast feedback is just more important.
This development environment should include any build tools that your project requires, so your setup instructions should not have any “install maven/npm” step. Those tools should come within a Docker container. This way, everybody in the team will be using the same tool version on the same operating system version.
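As a sketch of what that can look like (the node image tag, the ./frontend path, the `watch` script and port 3000 are assumptions for illustration, not details from the project above), the JavaScript toolchain can itself be a service in the Compose file, with the source mounted in so that changes on the host are picked up immediately:

```yaml
version: "3"
services:
  # Hypothetical service: run the frontend toolchain inside a container,
  # so nobody installs node/npm locally and the whole team shares one version.
  frontend:
    image: node:8                # pinned tool version
    working_dir: /app
    volumes:
      - ./frontend:/app          # mount the source so host edits reach the watcher
    command: sh -c "npm install && npm run watch"   # assumes a "watch" script in package.json
    ports:
      - 3000:3000                # assumed dev-server port
```

The same idea applies to Maven, Gradle or any other tool from the list above: the tool lives in a container, while the source stays on your dev box.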
Conclusion
Even if you don’t use Docker in production, it is well worth using it just to make setting up or upgrading a dev environment an uneventful process.
This process should be as close to `docker-compose up` as possible. Nothing more.
Don’t treat your dev box as some big global mutable variable. Use Docker Compose to treat it as a nice immutable one.
Here you have a detailed example of how to Docker Compose a local developer environment.