Using Docker Compose for Local Development

26 Jan 2021 • 6 min read

In this post, you will see how to use Docker Compose to set up a local development environment for your project or application.

If you want to follow along, you will need to install Docker Compose first. The steps can be found here. All the code in this post was executed with Docker Compose version 1.27.4.

The examples listed are in the context of APIs and server side applications. If you are building a different type of application, fear not! The same concepts and ideas can be applied to other types of development.

The problem

When building applications, it’s often a requirement to integrate with multiple tools, services, or external software in general.

For example, a Node.js application can connect to a database, a Redis instance, the Firebase API, or even a second application (thinking of a microservices scenario).

The more dependency services you have, the harder it gets to configure and manage a development workflow on your computer. To list a few issues:

  • Complex setup. The “getting started” section of your project can become an extensive list of software to install. Onboarding new developers takes more time as the project grows.
  • Version management. If one project uses v1 of a dependency service and another is on v3, for example, it can be hard to work on both projects simultaneously and switch between versions.
  • Complex initialization. In a microservices scenario for example, running all services can require a lot of coordination. It’s easy to get lost between multiple terminal windows and commands to run.

Meet Docker Compose

Docker Compose is a command-line tool for running applications with multiple Docker containers.

Using the YAML syntax, you can declare a list of services and run them with a single command — docker-compose up. Services can be created from a local build or a Docker image from a remote repository.

Docker Compose solves all the problems listed above. It works in a declarative way — you specify the exact environment needed for your project, and it will take care of turning it into reality. The environment created by Docker Compose is completely isolated and can be torn down with a single command.

Below you will see how to use Docker Compose in different scenarios to build reproducible development environments.

Example #1: Node.js application with a Postgres database

To get started, all you need is a docker-compose.yaml file. The only required field is the version field.

# docker-compose.yaml

version: '3.8'

# ...more code to come...
1. Basic Docker Compose file, without any services.

This should be enough to make the docker-compose up command work (but it’s not doing anything yet).
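At this point you can also sanity-check the file: docker-compose config validates it and prints the resolved configuration.

$ docker-compose config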

In the same file, create a service from the Postgres image:

# docker-compose.yaml

version: '3.8'

+ services:
+  my-database:
+    image: postgres:11.4-alpine
+    environment:
+      POSTGRES_USER: root
+      POSTGRES_PASSWORD: password
+      POSTGRES_DB: db
+    ports:
+      - 5432:5432
2. Docker Compose file with a Postgres database service.

In the code above, my-database is the name of the service being created.

The values listed in environment are related to the Postgres image. They specify how the database should be configured. See other options here.

The ports field maps port 5432 of the container to port 5432 on the host (your computer). This makes the database accessible on localhost:5432.

Run the following command in your terminal to create and start the service:

$ docker-compose up --detach

Notice I am running docker-compose up with an additional --detach flag. This runs the containers in the background and gives you back control of your terminal.

The output will look like this:

Creating network "app_default" with the default driver
Creating app_my-database_1 ... done

This means the Postgres database is running on your machine. Thanks to the magic of containers, this works even if you have never installed Postgres on your machine! How cool is that?
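You can confirm this with docker-compose ps, which lists the running services. And if you happen to have the psql client installed, you can connect to the database directly:

$ docker-compose ps
$ psql postgresql://root:password@localhost:5432/db -c 'select version();'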

You can test the setup above with a simple Node.js application:

// index.js

const knex = require('knex') // `knex` and `pg` need to be installed from npm!

;(async () => {
  const connection = knex({
    client: 'pg',
    connection: 'postgresql://root:password@localhost:5432/db',
  })

  const result = await connection.raw('select 1+1 as sum')

  console.log(`The result is: ${result.rows[0].sum}`)

  // Close the connection pool so the script can exit
  await connection.destroy()
})()
3. Simple Node.js script to test the connection to the database created with Docker Compose.
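Run the script with Node.js and you should see the computed sum:

$ node index.js
The result is: 2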

To stop and remove the containers created, run:

$ docker-compose down
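Note that this keeps the data volumes around by default. If you also want to remove them (and start from a clean slate next time), add the --volumes flag:

$ docker-compose down --volumes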

Example #2: Postgres and Redis

Docker Compose can also run multiple services at once. Let’s extend the previous example with a Redis database:

# docker-compose.yaml
version: '3.8'

services:

  my-database:
    image: postgres:11.4-alpine
    environment:
      POSTGRES_USER: root
      POSTGRES_PASSWORD: password
      POSTGRES_DB: db
    ports:
      - 5432:5432

+  redis:
+    image: redis:6.0.10
+    ports:
+      - 6379:6379
4. Docker Compose file with a Redis and Postgres database.

The code above follows the same logic as the first example. There is no need to pass any values in environment, since those were specific to the Postgres image.

Now, running docker-compose up will start both the Redis and the Postgres instance in one go. You can connect to the Redis database on localhost:6379.
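You can test the Redis service with a script similar to the Postgres one. Here is a minimal sketch, assuming the ioredis client (any Redis client from npm would work):

// redis-test.js

const Redis = require('ioredis') // `ioredis` needs to be installed from npm!

;(async () => {
  const redis = new Redis('redis://localhost:6379')

  // PING is the simplest way to verify the connection works
  const pong = await redis.ping()
  console.log(`Redis says: ${pong}`) // Redis says: PONG

  await redis.quit()
})()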

Example #3: Local builds

Services can be created from a local Dockerfile instead of external images.

This is useful if you want to start applications alongside your dependency services (or just applications, without any dependencies). For example, you could have a database and a server both starting with the docker-compose up command.

The YAML file looks like this:

# docker-compose.yaml
version: '3.8'

services:
  my-api:
    build:
      context: .
      dockerfile: Dockerfile
    environment:
      DATABASE_URL: postgresql://root:password@my-database:5432/db
    depends_on: [my-database]
    ports:
      - '8080:8080'

  my-database:
    image: postgres:11.4-alpine
    environment:
      POSTGRES_USER: root
      POSTGRES_PASSWORD: password
      POSTGRES_DB: db
    ports:
      - 5432:5432
5. Docker Compose file with a Postgres database and a local application.

The syntax is slightly different for local builds. Instead of image, we are using the build property. The context field points to the directory where your Dockerfile lives. By default, Docker Compose looks for a file named Dockerfile, but this behaviour can be changed using the dockerfile property.
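For reference, the Dockerfile for the my-api service could look something like this. It's a minimal sketch, assuming a plain Node.js app that listens on port 8080; adapt it to your own project:

# Dockerfile

FROM node:14-alpine

WORKDIR /app

# Install dependencies first so this layer is cached between builds
COPY package*.json ./
RUN npm install

COPY . .

EXPOSE 8080
CMD ["node", "index.js"]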

In the my-api service, the DATABASE_URL environment variable points at the my-database service. Note the hostname: inside the network created by Docker Compose, services reach each other by service name, not localhost. The property depends_on tells Docker Compose that my-api should only be started after my-database is started.

About depends_on: there is a difference between started and ready. The property depends_on only waits for the first. If you need to wait for a service to be ready, there are other ways to control the startup order with Docker Compose.
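One option is to add a healthcheck to the database service and have your startup logic wait until it reports healthy. A minimal sketch (whether depends_on can consume healthcheck conditions depends on your Compose version):

# docker-compose.yaml (excerpt)

  my-database:
    image: postgres:11.4-alpine
    healthcheck:
      # pg_isready ships with the Postgres image and reports readiness
      test: ['CMD-SHELL', 'pg_isready -U root -d db']
      interval: 5s
      timeout: 5s
      retries: 5
    # ...rest of the service config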

There is one problem with local builds, though. Because of how Docker works, any change to your code requires the container to be rebuilt. Even with a good image caching strategy, this process can slow down development a lot.

To solve this, you can mount a path from the host machine into the container using the volumes property:

# docker-compose.yaml
version: "3.8"

services:
  my-api:
    build:
      context: .
      dockerfile: Dockerfile
    ports:
      - "8080:8080"
+    volumes:
+      - .:/app

# ...rest of the code
6. Using `volumes` to mount your local code in the container.

This mounts the code in the current directory to /app inside the container. Any changes will be available on the fly inside the container, without having to rebuild the image.
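One caveat for Node.js projects: if dependencies are installed inside the image (as in the Dockerfile sketch above), the bind mount will shadow the container's node_modules with whatever is on your host. A common workaround is to add an anonymous volume for that path:

+    volumes:
+      - .:/app
+      - /app/node_modules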

Conclusion

The biggest advantage of integrating Docker Compose into your workflow is reproducibility. You will get the same local environment whether you run it on your computer, a coworker’s laptop, or a CI/CD virtual machine.

Using Docker Compose for local development workflows is listed as one of the use cases on Docker’s webpage, but I suspect it’s still not a widely known technique. It requires some basic understanding of containers and Docker, which are not straightforward concepts. Hopefully I was able to clarify things a bit!

Thanks for reading! Follow me on Twitter for more updates!
