Leandro Proença

Using Docker in development the right way

If you're not proficient in Docker, if topics like containers and virtual machines still feel a bit "fuzzy", or if you struggle to work with Docker in development but want to learn and work using containers, this article is for you.

A few weeks ago I wrote an article, Thinking like containers, where I introduced containers and explained the problem they solve.

Production usage

The most popular use of containers is in production environments, because the team can pack the application into an image containing the runtime and all the needed dependencies.

Such a process helps to deploy the application in isolation and makes it server-agnostic, meaning it can technically be deployed with ease to any cloud provider in the world.

Containers follow a standard. They will run the same way anywhere.

Development usage

However, some people advocate for containers and use them in development too.

One way to do it is by downloading the application image used in production and running the container locally.
Pretty cool, because it helps to replicate bugs with ease, since the container doesn't care whether it's running on a remote server at AWS or on your local machine. The runtime, the dependencies and the application itself are exactly the same as in production.
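
To make it concrete, pulling and running a production image locally looks roughly like this (the registry and tag below are illustrative, not from any real project):

docker pull registry.example.com/my_app:v1.2.3
docker run -p 8080:8080 registry.example.com/my_app:v1.2.3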

Unless you are trying to replicate some very specific bug, though, you don't need to download the bloated production image locally.

Using Docker the wrong way

Install Docker

Consider the following scenario:

  • You start working on a new project
  • They already use containers (Docker) in production
  • You configure your local environment based on the image declared in the Dockerfile

All is ok here.

  • You run docker-compose up, which then starts building the application image, installing the hundreds of dependencies the application needs
  • Afterwards, your server is running at localhost:8080. Great, you check it and start coding

Everything's pretty ok right here.

But after writing some code, you want to see it in action. You run docker-compose up again, and that's where you face your worst nightmare: it installs all the dependencies over and over again, every time you start up the server.
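
To make the pain concrete, here is a hypothetical docker-compose.yml that behaves exactly like this (the service name and commands are illustrative, not taken from any real project). Because the dependencies are installed in the startup command instead of being baked into the image or cached in a volume, they get reinstalled on every single docker-compose up:

services:
  app:
    build: .
    # dependencies installed at startup: this runs again on every up
    command: sh -c "npm install && npm start"
    ports:
      - "8080:8080"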

You then conclude that Docker and all its container party are a pure waste of time. You give up and install the whole application environment on your host machine.

Good luck with that.

How about fixing the Dockerfile?

Yes, chances are that the Dockerfile is not following best practices, which makes using containers in development very difficult.

In this article I won't cover the best practices for writing a good Dockerfile, but it will certainly be covered in a future post.

I'll focus on another aspect.

Forget how those real projects are using Docker

It sounds counterintuitive at first, but my argument is that if you start using Docker today thinking containers work exactly like you see in your company's projects, you are doomed.

Containers go beyond that. I suggest first learning how containers work. Experiment with them. Try out different things and technologies using them.

Then, only then, you can use containers on real projects the right way.

What's the right way then?

Let's suppose you don't have NodeJS installed on your host. Normally, you would first install NodeJS for your operating system, configure it and do a couple of other things before being able to run:



node hello_world.js



But using Docker, you don't need to install anything on your host computer but Docker itself. By doing so, you can run your command from inside a container:



# mount the current directory so the container can actually see hello_world.js
docker run --rm -v "$(pwd)":/app -w /app node hello_world.js



In terms of performance, it takes almost the same time as running from the host. The difference is unnoticeable.

It also gives you the ability to have a "version manager" out-of-the-box:



docker run --rm -v "$(pwd)":/app -w /app node:10 hello_world.js
docker run --rm -v "$(pwd)":/app -w /app node:12 hello_world.js



Now there's no longer any need to change your version manager every three years just because everyone is using "a fancy new cool version manager".

Your host machine will thank you.

Tips for using containers (Docker) effectively in development

In the upcoming sections I'll share some tips that may help you understand the problem containers solve.

Image !== container

Try to really understand and use containers, not images. Only then, learn how images work. Images are your last resort.
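
A minimal sketch of the distinction, using the official node image (the container name dev-node is illustrative): the image is pulled once and is immutable, while a container created from it keeps its state between runs:

# the image: downloaded once, never changes
docker pull node:18

# a container: an instance of that image, with its own state
docker create --name dev-node -it node:18 bash

# restart the same container whenever you need it;
# nothing is rebuilt or reinstalled
docker start -ai dev-node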

Learn volumes

Mastering volumes will save your life. Seriously.

Learn how they work and how they can effectively boost your productivity.

They are not as hard as they seem to be.
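
For example, a bind mount makes your source code instantly visible inside the container, with no image rebuild, and a named volume keeps data alive between runs (the paths and the deps_cache name are illustrative):

# bind-mount the current directory: host edits show up immediately
docker run --rm -v "$(pwd)":/app -w /app node:18 node hello_world.js

# a named volume survives container restarts; handy for
# caching installed dependencies
docker volume create deps_cache
docker run --rm -v "$(pwd)":/app -v deps_cache:/app/node_modules -w /app node:18 npm install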

Learn the Docker network

Containers are isolated by design. You use them because you don't want to mess with your host computer.

But in real projects containers need intercommunication. Learn how to take advantage of the Docker network and let your containers talk to each other.
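
A minimal sketch, assuming a Postgres container (the network and container names are illustrative): containers attached to the same user-defined network reach each other by name:

# create a user-defined network
docker network create my_net

# start a database on that network
docker run -d --name db --network my_net -e POSTGRES_PASSWORD=secret postgres:15

# any other container on the same network can reach it by the name "db"
docker run --rm --network my_net postgres:15 pg_isready -h db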

Use docker CLI first. Then docker-compose

The Docker reference documentation is pretty good and provides almost all the information you need to get your projects running on Docker.

Use the docker CLI heavily. Suffer. Feel the pain on the command-line.

Then, only then, go to docker-compose and truly understand how the docker-compose CLI helps you even more on a daily basis.
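
To give you an idea of the pain, and of what docker-compose will later save you from, this is the kind of thing you end up typing by hand (every name and flag below is illustrative):

# wiring a two-service stack manually with the docker CLI
docker network create pet_net
docker volume create pet_data
docker run -d --name db --network pet_net -e POSTGRES_PASSWORD=secret -v pet_data:/var/lib/postgresql/data postgres:15
docker run -d --name web --network pet_net -p 8080:8080 -v "$(pwd)":/app -w /app node:18 npm start

Once those flags have hurt you enough times, docker-compose becomes obvious: it's the same flags, written down once in a YAML file.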

Build a pet project using Docker

This is a perfect exercise for learning Docker. Resist the impulse to install or use something from your host. Put your web server in a container. Put your database in a container.

Build a real full-stack pet project from scratch; this is the best way to get comfortable using Docker.
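
As a starting point, the compose file for such a project can be as small as this (service names, images and ports are illustrative, not a prescription):

# docker-compose.yml
services:
  web:
    image: node:18
    working_dir: /app
    volumes:
      - .:/app
    command: npm start
    ports:
      - "8080:8080"
    depends_on:
      - db
  db:
    image: postgres:15
    environment:
      POSTGRES_PASSWORD: secret
    volumes:
      - pet_data:/var/lib/postgresql/data

volumes:
  pet_data: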

You won't regret it, and you'll never go back.

Conclusion

In this article I tried to explain, technically, why I think Docker is misunderstood by many developers.

Arguments such as "Docker is too much" or "Docker is only useful for production" usually come from a lack of understanding. There are very well documented best practices around Docker in development that, if correctly applied, refute those arguments.

Of course, at the end of the day, using Docker in development is not mandatory. It's just a tool, and choosing it is similar to saying you prefer coding in Vim or VSCode.

Top comments (12)

Raghavan alias Saravanan Muthu

Hi, thanks for sharing the good insights from your experience on one of the hottest topics in the industry these days! This page dev.to/leandronsp/thinking-like-co... throws a 404. Pls check that.

Leandro Proença

thanks for the feedback! just fixed it, cheers!

Raghavan alias Saravanan Muthu

Great, thank you for the quick action and confirmation.

Venkatesh KL

That's an interesting perspective @leandronsp.
We've used docker primarily for maintaining the same database version across different engineers without any setup dilemma. That worked pretty well for us; however, we recently started noticing too much friction when onboarding a new engineer. So this looks pretty good for our case.
Can you share a sample of how you've achieved complete development in docker?

Giovanni Lenoci

We use docker + docker-compose in our team (php + db + nginx).

We use a makefile to automate the setup and be up and running with a few commands.
No friction for new dev team members (maybe a little if you're starting from scratch with docker, but today it's a required skill).

We start from a base dockerfile that we extend for development (adding, for example, xdebug support).
We deploy production images through k8s derived from the same base images.

We literally deploy the same local environment to prod.
For me it's a life-changing way to work.

Venkatesh KL

Ohh great. I'm still a little curious about a few things, mainly this question: does the dev server also run within docker, in real time? Don't mind if it's a bad question.
I'm very curious because we have lots of microservices, each holding its own micro front-end, which is handled by a top-level front-end script.
However, it has its own limitations, as we can't run multiple microservices together due to port issues in our scripts. So this would be an ideal solution for us, where we'll have multiple docker containers running in isolation.

Please let me know if you've a working example. Thanks

Giovanni Lenoci

If you have many microfrontends in prod, I assume each one will be accessible on a different hostname.

You can do the same on a local machine by mapping hosts in your /etc/hosts (or the equivalent under windows); any address from 127.0.0.1 through 127.255.255.254 maps to loopback:

127.0.0.2 microfrontend1.app
127.0.0.3 microfrontend2.app

In your docker-compose file you can expose every microfrontend in this way:

node1:
  ports:
    - "127.0.0.2:80:8080"

node2:
  ports:
    - "127.0.0.3:80:8080"

node1 will respond to microfrontend1.app in your local browser and node2 to microfrontend2.app.

With this configuration you get rid of the limits around already-used ports, because there are two different hosts.

Hope this helps

Venkatesh KL

That makes sense. Thanks a ton 👏

Venkatesh KL

The only difference is that we have a proxy service at the top level which hides all the micro apps from the external world, so that's something I have to consider while trying it out.
I'll give it a shot, thanks for the motivation 👏

Rob Higgins

I've been working on my own version of docker dev environments in my very limited free time (there's no documentation): github.com/freshstacks/home. This is basically where I keep my evolving dev environment, with vim/tmux/vscode/zsh configuration and shortcuts baked in. This is the first time I've shared it. Notable lessons learned:

1) Create a docker volume first and run all operations inside that volume (I use a custom git clone command that saves repos by owner/project inside the docker volume); dockerize services and force them to also run inside this volume. HUGE speed increase compared to a host bind mount.

2) To use docker in docker as a non-root user, attach the docker sock at a different file address (/var/run/host.docker.sock); then, after you start the dev container, run a docker exec as root, using socat to copy the docker sock to the normal location with the "nonroot" user's permissions.

3) You can use vscode to attach to the dev container, or, when feeling old school, fire up tmux and vim with all my plugins and config.

Venkatesh KL

Let me check it out
Thanks

Lucas Macedo

Great post. Thank you!