When I build the container and I check the files that should have been ignored, most of them haven’t been ignored.
This is my folder structure.
```
Root/
    data/
    project/
        __pycache__/
        media/
        static/
    app/
        __pycache__/
        migrations/
        templates/
    .dockerignore
    .gitignore
    .env
    docker-compose.yml
    Dockerfile
    requirements.txt
    manage.py
```
Let's say I want to ignore the `__pycache__` and `data` folders (`data` will be created by the `docker-compose up` command when creating the container) and the `.gitignore` and `.env` files.
I will ignore these with the following `.dockerignore` file:

```
.git
.gitignore
.docker
*/__pycache__/
**/__pycache__/
.env/
.venv/
venv/
data/
```
The final result is that only `.git` and `.env` have been ignored. The `data` folder hasn't been ignored, but it's not accessible from the container. And the `__pycache__` folders haven't been ignored either.
Here are the Docker files.
docker-compose.yml
version: "3.8" services: app: build: . volumes: - .:/django-app ports: - 8000:8000 command: /bin/bash -c "sleep 7; python manage.py migrate; python manage.py runserver 0.0.0.0:8000" container_name: app-container depends_on: - db db: image: postgres volumes: - ./data:/var/lib/postgresql/data environment: - POSTGRES_DB=${DB_NAME} - POSTGRES_USER=${DB_USER} - POSTGRES_PASSWORD=${DB_PASSWORD} container_name: postgres_db_container
Dockerfile
```dockerfile
FROM python:3.9-slim-buster

ENV PYTHONUNBUFFERED=1
WORKDIR /django-app
EXPOSE 8000

COPY requirements.txt requirements.txt

RUN apt-get update \
    && adduser --disabled-password --no-create-home userapp \
    && apt-get -y install libpq-dev \
    && apt-get -y install apt-file \
    && apt-get -y install python3-dev build-essential \
    && pip install -r requirements.txt

USER userapp
```
Answer
You're actually injecting your source code using `volumes:`, not during the image build, and this doesn't honor `.dockerignore`.
Running a Docker application like this happens in two phases:
- You build a reusable image that contains the application runtime, any OS and language-specific library dependencies, and the application code; then
- You run a container based on that image.
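As a concrete sketch of those two phases (the `django-app` image tag here is illustrative, not something your setup defines):

```sh
# Phase 1: build the image. .dockerignore is applied here, when the
# build context is sent to the Docker daemon.
docker build -t django-app .

# Phase 2: run a container from that image. A bind mount added at this
# point overlays the image's /django-app with the raw host directory,
# bypassing .dockerignore entirely.
docker run -v "$PWD:/django-app" django-app
```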
The `.dockerignore` file is only considered during the first build phase.
In your setup, you don't actually `COPY` anything into the image beyond the `requirements.txt` file. Instead, you use `volumes:` to inject parts of the host system into the container. This happens during the second phase, and ignores `.dockerignore`.
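You can see the difference by comparing what's in the image with what a container running the `volumes:` mount sees (the `root_app` image name below is a guess at what Compose names the image built from your project directory; substitute the actual name from `docker images`):

```sh
# What the build produced: only requirements.txt, per the Dockerfile
docker-compose build app
docker run --rm root_app ls -a /django-app

# What a container with the volumes: mount sees: the whole host tree,
# .env, __pycache__ and all
docker-compose run --rm app ls -a /django-app
```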
The approach I'd recommend for this is to skip the `volumes:`, and instead `COPY` the required source code in the Dockerfile. You should also generally indicate the default `CMD` the container will run in the Dockerfile, rather than requiring it in the `docker-compose.yml` or `docker run` command.
```dockerfile
FROM python:3.9-slim-buster

# Do the OS-level setup _first_ so that it's not repeated
# if Python dependencies change
RUN apt-get update && apt-get install -y ...

WORKDIR /django-app

# Then install Python dependencies
COPY requirements.txt .
RUN pip install -r requirements.txt

# Then copy in the rest of the application
# NOTE: this _does_ honor .dockerignore
COPY . .

# And explain how to run it
ENV PYTHONUNBUFFERED=1
EXPOSE 8000
USER userapp
# consider splitting this into an ENTRYPOINT that waits for
# the database, runs migrations, and then `exec "$@"` to run the CMD
CMD sleep 7; python manage.py migrate; python manage.py runserver 0.0.0.0:8000
```
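If you do take that `ENTRYPOINT` suggestion, a minimal sketch might look like this (the `docker-entrypoint.sh` name is hypothetical, and it assumes the PostgreSQL client tools are installed in the image so that `pg_isready` exists):

```sh
#!/bin/sh
# docker-entrypoint.sh (hypothetical): wait for the database instead
# of a fixed sleep, run migrations, then hand off to the CMD
until pg_isready -h "${PGHOST:-db}" -p "${PGPORT:-5432}"; do
  sleep 1
done
python manage.py migrate
exec "$@"
```

In the Dockerfile you'd then set `ENTRYPOINT ["./docker-entrypoint.sh"]` and shrink the `CMD` down to just the `runserver` invocation.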
This means, in the `docker-compose.yml` setup, you don't need `volumes:`; the application code is already inside the image you built.
version: "3.8" services: app: build: . ports: - 8000:8000 depends_on: - db # environment: [PGHOST=db] # no volumes: or container_name: db: image: postgres volumes: # do keep for persistent database data - ./data:/var/lib/postgresql/data environment: - POSTGRES_DB=${DB_NAME} - POSTGRES_USER=${DB_USER} - POSTGRES_PASSWORD=${DB_PASSWORD} # ports: ['5433:5432']
This approach also means you need to `docker-compose build` a new image when your application changes. This is normal in Docker.
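In day-to-day use this is a single step, since Compose can rebuild the image before starting the container:

```sh
# Rebuild the image (honoring .dockerignore) and recreate the container
docker-compose up --build
```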
For day-to-day development, a useful approach here can be to run all of the non-application dependencies in Docker, but the application itself outside a container.
```sh
# Start the database but not the application
docker-compose up -d db

# Create a virtual environment and set it up
python3 -m venv venv
. venv/bin/activate
pip install -r requirements.txt

# Set environment variables to point at the Docker database
export PGHOST=localhost PGPORT=5433

# Run the application locally
./manage.py runserver
```
Doing this requires making the database visible from outside Docker (via `ports:`), and making the database location configurable (probably via environment variables, set in Compose with `environment:`).
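To sanity-check that wiring, assuming you've uncommented the `ports: ['5433:5432']` line on the `db` service and have `psql` installed on the host (the credentials come from your `.env`):

```sh
# Connect to the Compose-managed database from outside Docker,
# using the same credentials the containers get
PGPASSWORD="$DB_PASSWORD" psql -h localhost -p 5433 \
  -U "$DB_USER" -d "$DB_NAME" -c 'SELECT 1;'
```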