testdrivenio/django-on-docker

postgres db is deleted every time I run

basitiyidir opened this issue · 5 comments

postgres db is deleted every time I run

docker-compose -f docker-compose.prod.yml up -d --build

My goal is just to update the contents of /home/app/web/.
Where am I making a mistake?

Did you make any changes to the Dockerfile, Docker Compose file, or entrypoint script?

Same as yours.

I made a change to a Django template file.

Why does the build process redo everything from the beginning?

The DB is deleted on each build.
Very bad. :)

Dockerfile.prod

###########
# BUILDER #
###########

# pull official base image
FROM python:3.8.0-alpine as builder

# set work directory
WORKDIR /usr/src/myapp

# set environment variables
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1

# install psycopg2 dependencies
RUN apk update \
    && apk add postgresql-dev gcc python3-dev musl-dev

# upgrade pip and copy project
RUN pip install --upgrade pip
COPY . /usr/src/myapp/

# install dependencies
COPY ./req.txt /usr/src/myapp/req.txt
RUN pip wheel --no-cache-dir --no-deps --wheel-dir /usr/src/myapp/wheels -r req.txt


#########
# FINAL #
#########

# pull official base image
FROM python:3.8.0-alpine

# create directory for the app user
RUN mkdir -p /home/myapp

# create the app user
RUN addgroup -S myapp && adduser -S myapp -G myapp

# create the appropriate directories
ENV HOME=/home/myapp
ENV APP_HOME=/home/myapp/web
RUN mkdir $APP_HOME
WORKDIR $APP_HOME

# install dependencies
RUN apk update && apk add libpq
COPY --from=builder /usr/src/myapp/wheels /wheels
COPY --from=builder /usr/src/myapp/req.txt .
RUN pip install --upgrade pip
RUN pip install --no-cache /wheels/*

# copy entrypoint-prod.sh
COPY ./entrypoint.prod.sh $APP_HOME

# copy project
COPY . $APP_HOME

# chown all the files to the app user
RUN chown -R myapp:myapp $APP_HOME

# change to the app user
USER myapp

# run entrypoint.prod.sh
ENTRYPOINT ["/home/myapp/web/entrypoint.prod.sh"]

docker-compose.prod.yml

version: '3.7'

services:
    web:
        build:
            context: .
            dockerfile: Dockerfile.prod
        command: gunicorn myapp.wsgi:application --bind 0.0.0.0:8000
        ports:
            - 8000:8000
        env_file:
            - ./.env.prod
        depends_on:
            - db
    db:
        image: postgres:12.0-alpine
        volumes:
            - postgres_data:/var/lib/postgresql/data/
        env_file:
          - ./.env.prod.db

volumes:
    postgres_data:

Why do you think the DB is being deleted?

If you didn't change any of the code, and as long as the volume is being created properly, then the DB data should persist. Can you verify that the Postgres volume is being created?
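
For what it's worth, something like the following should show whether the named volume exists and survives a rebuild. The exact volume name depends on the Compose project name (by default the project directory name), so the name used here is only an example:

docker volume ls
docker volume inspect django-on-docker_postgres_data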

Thank you for your answer.

No problem there, everything is OK. :)

But I need help with something else.

I built the code with

docker-compose -f docker-compose.prod.yml build

and ran it with

docker-compose -f docker-compose.prod.yml up -d

Everything was OK.

Then there was a change in my code.

I need to rebuild for the change to take effect on the server.
When I build, it re-installs the pip packages. Is there a way to deploy just the code changes?

Yes, you'll have to reorganize the Dockerfile though.

I mention caching in the blog post. See https://mherman.org/presentations/dockercon-2018/#46 as well. If you're in development you shouldn't be using the prod Docker Compose file.
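
As a rough sketch (based on the builder stage posted above, not the exact Dockerfile from the post), the idea is to copy and install the requirements before copying the rest of the project, so the dependency layers stay cached when only application code changes:

# install dependencies first so this layer is cached between builds
COPY ./req.txt /usr/src/myapp/req.txt
RUN pip wheel --no-cache-dir --no-deps --wheel-dir /usr/src/myapp/wheels -r req.txt

# copy the project last; code changes only invalidate layers from here on
COPY . /usr/src/myapp/

With that ordering, a code-only change re-runs just the final COPY (and the layers after it), not the pip wheel step.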