Introduction
In the previous article, we learned:
- Basics of Docker
- Docker Images
- Docker Containers
- Building Node.js Image and Running it in Container
Having a separate environment for our Node.js application is great for the basics, but you have to rebuild the image and run the container again after every little change you make.
What if you could sync your development environment so that as soon as you change your code, the change is reflected the next time you hit your API? Sounds great for development, right?
Enter Bind Mounts
The official docs say: "When you use a bind mount, a file or directory on the host machine is mounted into a container. The file or directory is referenced by its absolute path on the host machine." You can read the official details in the Docker documentation.
In simpler terms, it synchronizes a directory on your host machine with a directory inside the container, so that whatever change you make on your host is immediately reflected in the container.
This means if you change anything in your files, they will automatically be changed in the running container.
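Before we wire this into our Node.js app, here is a quick throwaway demo of the idea (my own example, not from the article; it assumes a Bash-like shell where $(pwd) works):
# Mount the current host directory at /data inside a temporary Alpine container
# and list it; the container reads the host files directly, not a copy.
docker run --rm -v "$(pwd):/data" alpine ls /data
Any file you add on the host would show up the next time you run this, because both sides are looking at the same directory.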
Nodemon and package.json
Since you come from the Node.js world, you will know about nodemon, which monitors the files in the project directory and restarts the Node.js app as soon as a file change is detected.
I had nodemon installed globally and ran into an issue with the Docker container, so please make sure you have nodemon in your local node_modules by running this command:
npm install --save-dev nodemon
You will now have it in devDependencies. Next, edit scripts in package.json as follows (the -L flag turns on nodemon's legacy polling mode, which detects file changes more reliably inside containers):
"scripts": {
"start": "node index.js",
"dev": "nodemon -L index.js",
"test": "echo \"Error: no test specified\" && exit 1"
},
Finally, your package.json should look like this:
{
"name": "docker",
"version": "1.0.0",
"description": "",
"main": "index.js",
"scripts": {
"start": "node index.js",
"dev": "nodemon -L index.js",
"test": "echo \"Error: no test specified\" && exit 1"
},
"author": "",
"license": "ISC",
"dependencies": {
"express": "^4.18.2"
},
"devDependencies": {
"nodemon": "^3.0.1"
}
}
We are now done with the package.json changes. Let's move on to updating the Dockerfile.
Updating Dockerfile
Updating Dockerfile
From the previous article, we had this Dockerfile:
FROM node:12.18.3-alpine3.12
WORKDIR /app
COPY package.json /app
RUN npm install
COPY . /app
EXPOSE 3000
CMD ["node", "index.js"]
Now that we have updated package.json, and since we want to run the app in a development environment, the command we want the container to run is:
npm run dev
To bake this into the Dockerfile, replace the last line with:
CMD ["npm", "run", "dev"]
The overall Dockerfile should now look like this:
FROM node:12.18.3-alpine3.12
WORKDIR /app
COPY package.json /app
RUN npm install
COPY . /app
ENV PORT 3000
EXPOSE $PORT
CMD ["npm", "run", "dev"]
We have also made these two changes:
ENV PORT 3000
EXPOSE $PORT
This sets a PORT environment variable to 3000 directly in the Dockerfile and exposes that port.
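For the PORT variable to matter, the Express app should read it when it starts up. Here is a minimal sketch of what index.js could look like; the article does not show the full file, so treat the route text and logging here as assumptions:
const express = require('express');
const app = express();

// Read the port from the PORT environment variable set in the Dockerfile,
// falling back to 3000 when it is not defined (e.g. when running locally).
const port = process.env.PORT || 3000;

app.get('/', (_, res) => {
  res.send('Hello Shameel!');
});

app.listen(port, () => {
  console.log(`Server listening on port ${port}`);
});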
Building Docker Image
Since we have changed package.json, we have to rebuild our image with the command we learned earlier:
docker build -t shameel-node-image .
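If you want to confirm the image was rebuilt, you can list it by name:
# Shows the repository, tag, image ID and creation time of the rebuilt image.
docker image ls shameel-node-image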
Running Docker Container
From the previous article, we had learned this docker command:
docker run -p 3000:3000 -d --name shameel-node-app shameel-node-image
This runs a container named shameel-node-app from shameel-node-image in detached mode and maps the container's internal port 3000 to port 3000 on the host machine.
Now, you will need to use a new flag, -v:
docker run -v <hostMachinePath>:<containerPath> -p 3000:3000 -d --name shameel-node-app shameel-node-image
(If the container from the earlier run is still around, remove it first with docker rm -f shameel-node-app, since container names must be unique.)
On Windows, you can give it like this:
docker run -v "D:\shameel-node-docker":/app -p 3000:3000 --name shameel-node-app shameel-node-image
You have to give the absolute path on the host machine. In this case, D:\shameel-node-docker is the absolute path where my application lives on my Windows machine.
But that makes the command quite long. We need a cleaner way to get the absolute path.
In Windows CMD, you can get it using %cd%, and in Bash you can get it with $(pwd).
For Bash
I tried it using Git Bash; it should work the same way in Bash on Linux.
docker run -v "$(pwd):/app" -p 3000:3000 -d --name shameel-node-app shameel-node-image
The container ID was printed as output, which means the container started running.
For Windows CMD
I ran this command and it worked like a charm in CMD:
docker run -v "%cd%:/app" -d -p 3000:3000 --name shameel-node-app shameel-node-image
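If you happen to use PowerShell instead of CMD, ${PWD} should work as the current-directory substitution. This variant is my own assumption and is not covered in the original walkthrough:
docker run -v "${PWD}:/app" -d -p 3000:3000 --name shameel-node-app shameel-node-image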
Inspecting the Mount
You have now successfully mounted the location from your host machine into the container, and you may want to verify it.
To do that, the syntax of the command is:
docker inspect <container-name>
In my case, it becomes:
docker inspect shameel-node-app
Then look for Mounts in the output.
If you do not want to go through everything and only want the Mounts section, run the command below:
docker inspect -f "{{ .Mounts }}" shameel-node-app
You should be able to see the bind mount from your host machine path to the path inside the container.
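If you prefer structured output, the format string can also emit JSON using the json template function built into docker inspect:
docker inspect -f "{{ json .Mounts }}" shameel-node-app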
If you have Docker Desktop, you can also select your container and click on Bind mounts to check this.
Avoid node_modules synchronization
Currently, if you delete node_modules on your host machine, it will also disappear from the container. That is not the behavior we want, because node_modules is already installed in the image by our Dockerfile. So we need a way to synchronize everything except node_modules.
We can do that with an additional -v flag pointing at a more specific path, which creates an anonymous volume at /app/node_modules so the bind mount does not override it. I ran this command in CMD:
docker run -v "%cd%:/app" -v /app/node_modules -d -p 3000:3000 --name shameel-node-app shameel-node-image
You can verify it from Docker Desktop as well: everything other than node_modules appears as a bind mount, while node_modules is installed by the npm install command in our Dockerfile and is no longer synced with the host machine.
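Since -v /app/node_modules creates an anonymous volume, you can also spot it from the CLI; it shows up in the volume list with a long random hash as its name:
docker volume ls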
Verifying Development Environment
Currently we have our simple Express app running, with a single route that returns a greeting.
After making a slight change to the route and saving the file:
app.get('/', (_, res) => {
  res.send('Hello Shameel! How are you?');
});
In the Docker Desktop logs, you can see that nodemon restarted, and as soon as we refresh the page, we see the change reflected.
You can check the logs from the CLI with this command as well:
docker logs <container>
For me, it is:
docker logs shameel-node-app
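If you want the logs to keep streaming while you edit and save files, add the -f (follow) flag:
docker logs -f shameel-node-app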
Why You Should Use a Read-Only Bind Mount
Currently, the mount is two-way: if you make any changes in the /app directory inside the container, they are reflected on your host machine.
That is risky, because if someone somehow got access to your container, they could easily plant something malicious on your main machine.
The only change to the command is the addition of :ro, which makes the mount read-only from the container's side; changes on the host still flow into the container.
In Bash, the command would look like this:
docker run -v "$(pwd):/app:ro" -p 3000:3000 -d --name shameel-node-app shameel-node-image
In CMD, the command would look like this:
docker run -v "%cd%:/app:ro" -p 3000:3000 -d --name shameel-node-app shameel-node-image
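A quick way to convince yourself the mount really is read-only (a hypothetical check of my own, not from the article) is to try writing a file inside it from within the container; the command should fail with a "Read-only file system" error:
docker exec shameel-node-app touch /app/test.txt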
Conclusion
In this blog, we learned how to make Node.js development easier with the help of Docker by incorporating bind mounts, so that our dev changes are reflected as soon as we save a file.
Happy coding and containerizing!
Follow me for more such content:
LinkedIn: https://www.linkedin.com/in/shameeluddin/
Github: https://github.com/Shameel123