The web service is responsible for the React frontend and the Django API. I find that combining the whole web tier into one service is the simplest approach, as opposed to deploying a separate service for the frontend.
It uses a volume to map the contents of the web directory to the WORKDIR defined for that image. This is what allows changes on your machine to be reflected directly in the Docker container, letting you keep the usual development cycle without rebuilding the container for every change.
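The web Dockerfile itself isn't reproduced in this section, but for context, here's a minimal sketch of the part that matters for the volume mapping. The base image, dependency steps, and entrypoint are assumptions; the only thing that has to line up is the WORKDIR matching the /usr/src/web/ target in docker-compose.yml:

# web/Dockerfile (sketch; base image and install steps are assumptions)
FROM python:3.8-alpine

# must match the right-hand side of the ./web/:/usr/src/web/ volume in docker-compose.yml
WORKDIR /usr/src/web

# install Python dependencies
# (real requirements like psycopg2 may need extra build packages on alpine; omitted here)
COPY requirements.txt .
RUN pip install -r requirements.txt

# copy the project; during development the bind mount overrides this copy anyway
COPY . .

# wait for PostgreSQL before starting Django (covered below)
ENTRYPOINT ["/usr/src/web/entrypoint.sh"]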
The following ports are then mapped to their respective counterparts on the host machine: 8000 for the Django API, 3000 for the React dev server, and 35729 for the NodeJS livereload.
Environment variables are then loaded from the .env.dev file, allowing further customization if necessary. Lastly, stdin_open is enabled to allow us to run the React dev server in interactive mode, giving us invaluable feedback as we write code.
Here’s what the docker-compose.yml looks like for the web service:
version: '3.7'

services:
  web:
    container_name: web
    build:
      context: ./web
      dockerfile: Dockerfile
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - ./web/:/usr/src/web/
    ports:
      - 8000:8000
      - 3000:3000
      - 35729:35729
    env_file:
      - ./.env.dev
    stdin_open: true
Next up is the db service. This is the PostgreSQL server that our Django API communicates with for data persistence. We use the stock 12.0-alpine image and define the credentials through environment variables. The container is then attached to a volume in order to persist the data across shutdowns and restarts.
We’ll also make the web service depend on the db service. In conjunction with the web/entrypoint.sh file, this ensures that Django won’t try to connect to the PostgreSQL service while it’s still booting up.
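The contents of web/entrypoint.sh aren't reproduced here, but a minimal sketch of the idea might look like the following. The SQL_HOST and SQL_PORT variables are hypothetical names assumed to come from .env.dev, and netcat is just one convenient way to poll the port until PostgreSQL accepts connections:

#!/bin/sh
# web/entrypoint.sh (sketch) -- block until PostgreSQL is reachable, then run the given command.
# SQL_HOST and SQL_PORT are hypothetical variable names expected to be set in .env.dev.

echo "Waiting for PostgreSQL at $SQL_HOST:$SQL_PORT..."
while ! nc -z "$SQL_HOST" "$SQL_PORT"; do
  sleep 0.1
done
echo "PostgreSQL is up."

# hand off to whatever command docker-compose passed in (python manage.py runserver 0.0.0.0:8000)
exec "$@"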
Here’s what the final docker-compose.yml looks like after adding the db service:
version: '3.7'

services:
  web:
    container_name: web
    build:
      context: ./web
      dockerfile: Dockerfile
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - ./web/:/usr/src/web/
    ports:
      - 8000:8000
      - 3000:3000
      - 35729:35729
    env_file:
      - ./.env.dev
    stdin_open: true
    depends_on:
      - db

  db:
    container_name: db
    image: postgres:12.0-alpine
    volumes:
      - postgres_data:/var/lib/postgresql/data/
    environment:
      - POSTGRES_USER=dev_user
      - POSTGRES_PASSWORD=dev_password
      - POSTGRES_DB=dev

volumes:
  postgres_data:
We could run the services we defined with nothing but Docker installed on our machine, but that won’t be suitable for a productive development setup.
I suggest having npm installed in order to build the node_modules on our host machine, which is then accessible through the volume in the web service. This enables IDE autocompletion for the React part.
Python, on the other hand, is much more forgiving in this regard. There are a couple of high-quality extensions for your favorite IDE that allow you to connect to a remote server (in this case, our Docker container). Here are some helpful articles to get you started:
To get started, we need to run the following commands:
$ cp .env .env.dev # <- 0
$ chmod +x web/entrypoint.sh # <- 1
$ npm install --prefix ./web/frontend/ # <- 2
$ docker-compose up -d --build # <- 3
$ docker-compose exec web npm start --prefix ./frontend/ # <- 4
0. The .env file should be good enough as is, but you can reconfigure it as needed.
1. Makes the web/entrypoint.sh script executable.
2. Installs node_modules on the host machine, so your IDE can pick them up.
3. Builds the images and starts the services in detached mode.
4. Starts the React dev server in interactive mode inside the web container.

You can now access the React application on http://localhost:3000/ and the Django API on http://localhost:8000/. Click on the button to test if your frontend is integrated with the API:
If you get the “API Integration Works!” alert, you’re good to go!
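If you prefer a quick sanity check from the terminal before clicking around, you can poke both ports directly. The article only mentions the root URLs, so a 404 body from Django here is fine; we just want to see both servers respond:

$ curl -I http://localhost:3000/   # React dev server
$ curl -I http://localhost:8000/   # Django API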
In this article, we took a look at a development setup for a React+Django+PostgreSQL stack using Docker containers. We went over the architecture of this setup, the details of each service, and how the services connect to the host (and to one another).
For the next article, we’ll implement a Continuous Integration (CI) pipeline for our stack.
I’ll see you then.
Got any feedback or suggestions? Feel free to send me an email or a tweet.
Ciao!