3 ways to dockerize an existing Node.js app

Imagine you have a Node.js app you would like to run from within a Docker container. Maybe you want to check whether it still works on ‘another’ machine, or it’s a test run before adopting containers as your way of delivering software. Reasons may vary.

In order to have something tangible, let’s pick a hello.js app which prints out the ubiquitous ‘Hello World’.
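A minimal sketch of such an app, listening on port 8080 (the port the Docker commands below publish), could look like this:

    // hello.js - replies with 'Hello World' to every request
    var http = require('http');

    http.createServer(function (req, res) {
      res.writeHead(200, { 'Content-Type': 'text/plain' });
      res.end('Hello World\n');
    }).listen(8080);

    console.log('Server running at http://127.0.0.1:8080/');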

[Screenshot: ‘Hello World’ in the browser at localhost]
How do we put it into a container?

The following three approaches assume you know Docker basics.

First approach: mount project files directly into the container’s file system.

Docker provides a -v command line argument that can be used for mounting paths from the host file system into the container’s. All we need to do is take a container with Node.js preinstalled (why bother installing it ourselves?), mount the project folder to an arbitrary place in the container’s file system and start the app. Fortunately for us, the official Docker registry has a node image that will do the trick. As usual, we don’t have to download it ourselves, Docker will do that automatically. So, here it goes.
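A sketch along these lines should do (assuming the project sits in the current directory; /helloapp is an arbitrary mount point I picked):

    # mount the current directory into the container and run the app from there
    docker run -ti -v "$PWD":/helloapp -p 8080:8080 node node /helloapp/hello.js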

As I’m running this example on a Mac, Docker resides in a virtual machine and I can’t use 127.0.0.1 to connect to the app anymore. I have to use the VM’s IP instead, and docker-machine ip can tell me which one it is (mine says it’s 192.168.99.100). Go back to Chrome, copy-paste the IP and voilà, it’s really working:

[Screenshot: ‘Hello World’ in the browser at 192.168.99.100]
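For reference, getting that IP is a one-liner (assuming the docker-machine VM is called default, as it usually is):

    docker-machine ip default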

This is definitely not the approach you’d use in a production environment, but for quick and dirty proof-of-concept tasks it’s perfect. As a downside, you cannot move such a container to another host without also copying the mounted files, but the next approach can fix that.

Second approach: copy project files into the container.

Docker’s cp command can copy files between the host and container file systems. This sounds ridiculously simple, so I decided to complicate the example a little bit:

  1. Start a node container in interactive mode:

    docker run -ti -p8080:8080 node bash
    As usual, -ti  stands for interactive terminal.

  2. Press Ctrl+p, then Ctrl+q to detach from the container while keeping it running in the background.

  3. Find the container ID by executing docker ps. In my case it’s db8ce50cfd72.

  4. Copy the project files into the container (see the sketch right after this list).

    Btw, you don’t have to type the whole container ID in any Docker command. The first three characters are usually enough, assuming no other container ID starts with them.

  5. Go back to the container: docker attach db8

  6. Start the hello app: node /helloapp/hello.js
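As for step 4, the copy can be done with something like this (assuming the project files are in the current directory and reusing the abbreviated container ID):

    # copy the current project directory into /helloapp inside the running container
    docker cp . db8:/helloapp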

Behold! Hello world is working again.

Unlike with the first approach, this container is self-sufficient. Once you’ve committed it into a new image with the docker commit command (a one-liner, sketched below), you can move it to different hosts. On the other hand, if your app changes its message from “Hello World” to “Goodbye cruel world”, you’ll have to repeat all these steps again. When the content can change, there’s a third way.
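For reference, the commit could look like this (helloapp:manual is just a name I made up):

    docker commit db8 helloapp:manual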

Third approach: use a Dockerfile to build a new image with the project files baked in.

A Dockerfile makes it possible to describe the image structure in plain text format and then ‘compile’ it into a real image. If you think about it, everything we want to achieve can be described in just four steps:

  1. Take the existing node image,
  2. copy the project files into it,
  3. open port 8080,
  4. run the app.

Here’s how we’d describe that in a Dockerfile.
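A sketch of such a Dockerfile (assuming the app again ends up in /helloapp):

    FROM node:latest
    COPY . /helloapp
    EXPOSE 8080
    ENTRYPOINT ["node", "/helloapp/hello.js"]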

It’s an almost perfect match. We took the latest node image, copied the files, configured the container to allow connections to port 8080 and set up the entry point: whenever the container starts, launch hello.js. FROM, COPY, EXPOSE and ENTRYPOINT are just a few of the keywords you can use.

Now, if I build this Dockerfile:

docker build -t helloapp:latest .

I’ll get a helloapp image tagged as latest (but I could tag it with an arbitrary version number instead, e.g. 0.1-beta) with the project files baked into it. A quick check that the image is really there:

docker images

I can start it (docker run -d helloapp), move it between machines, delete it and then rebuild it again. The Dockerfile acts like source code for the image, which I can add to a version control system (VCS) along with the other project files.

Of these three approaches, the first two are good for occasional quick tasks, but a Dockerfile is the way to go when you need to build images repeatedly. It’s VCS friendly, it automates an otherwise tedious image creation process, and when some of the dependencies change, you’re just a few keystrokes away from rebuilding the image.
