Docker
RuslanSer, 2021-07-31 21:46:10

How do I set up the Docker configs so that the files in the container get replaced with new ones on docker-compose up?

There is a Django project with a Dockerfile and docker-compose.yml (the configs are below).
When I want to change something, I do it on my dev machine (python manage.py runserver), and when I'm done, I SSH into the server and copy the files from the host into the container like this:
docker cp ./. <container_id>:/home/app/web/.

Then I connect to the container and run the migrations and collect the static files:
docker exec -it <container_name> sh

python manage.py migrate
python manage.py collectstatic

And that's all.
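For reference, the same two steps can also be run without opening an interactive shell; a small sketch, assuming the same <container_name> placeholder:

docker exec <container_name> python manage.py migrate --noinput
docker exec <container_name> python manage.py collectstatic --noinput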

Is there a way to copy the files into the container automatically when it is brought up (docker-compose up)?
That is, I pull the necessary files with git, run docker-compose down / docker-compose up, and all that remains is to go in and apply the migrations.
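(One option worth noting, as a hedged suggestion rather than the guide's prescribed workflow: the Dockerfile below already does COPY . $APP_HOME, so rebuilding the image when bringing the stack up bakes the freshly pulled code into the container, with no docker cp needed:

docker-compose up -d --build

The --build flag rebuilds the images and recreates the containers with the new files.)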

The configs (I don't know Docker well and followed this guide: https://testdriven.io/blog/dockerizing-django-with... ):

Dockerfile
###########
# BUILDER #
###########

# pull official base image
FROM python:3.8.3-alpine as builder

# set work directory
WORKDIR /usr/src/app

# set environment variables
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1

# install psycopg2 dependencies
RUN apk add --update --no-cache g++ gcc libxslt-dev
RUN apk update \
    && apk add postgresql-dev gcc python3-dev musl-dev g++ libxslt libxslt-dev libffi-dev openssl-dev make


#RUN apk build-dep python3-lxml
#libxslt-dev libxslt py3-lxml
# lint
RUN pip install --upgrade pip
#RUN pip install flake8
COPY . .
#RUN flake8 --ignore=E501,F401 .

# install dependencies
COPY ./requirements.txt .
RUN pip wheel --no-cache-dir --no-deps --wheel-dir /usr/src/app/wheels -r requirements.txt


#########
# FINAL #
#########

# pull official base image
FROM python:3.8.3-alpine

# create directory for the app user
RUN mkdir -p /home/app

# create the app user
RUN addgroup -S app && adduser -S app -G app

# create the appropriate directories
ENV HOME=/home/app
ENV APP_HOME=/home/app/web
RUN mkdir $APP_HOME
RUN mkdir $APP_HOME/static
RUN mkdir $APP_HOME/media
WORKDIR $APP_HOME

# install dependencies
RUN apk update && apk add libpq
# For python-docx and lxml
RUN apk add --update --no-cache --virtual .build-deps \
        g++ \
        libxml2 \
        libxml2-dev && \
    apk add libxslt-dev && \
    apk del .build-deps
COPY --from=builder /usr/src/app/wheels /wheels
COPY --from=builder /usr/src/app/requirements.txt .
RUN pip install --no-cache /wheels/*

# copy entrypoint-prod.sh
COPY ./entrypoint.prod.sh $APP_HOME

# copy project
COPY . $APP_HOME

# chown all the files to the app user
RUN chown -R app:app $APP_HOME

# change to the app user
USER app

RUN chmod +x /home/app/web/entrypoint.prod.sh
# run entrypoint.prod.sh
ENTRYPOINT ["/home/app/web/entrypoint.prod.sh"]
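
The entrypoint.prod.sh referenced above is not shown in the question. As a rough sketch, assuming it follows the linked testdriven.io guide (DATABASE, SQL_HOST and SQL_PORT coming from .env.prod), it waits for Postgres and then execs the command passed in by docker-compose; the commented lines mark where the manual migrate/collectstatic steps could be automated:

#!/bin/sh

# wait for Postgres to accept connections
if [ "$DATABASE" = "postgres" ]
then
    echo "Waiting for postgres..."
    while ! nc -z $SQL_HOST $SQL_PORT; do
      sleep 0.1
    done
    echo "PostgreSQL started"
fi

# optional: automate the manual steps from the question
# python manage.py migrate --noinput
# python manage.py collectstatic --noinput

exec "$@"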


docker-compose.yml:
version: '3.7'

services:
  db:
    restart: always
    image: postgres:12.0-alpine
    volumes:
      - postgres_data:/var/lib/postgresql/data/
    environment:
      - POSTGRES_USER=xxxx
      - POSTGRES_PASSWORD=xxxx
      - POSTGRES_DB=xxxxx
    env_file:
      - ./.env.prod
  web:
    restart: always
    build:
      context: ./
      dockerfile: Dockerfile
    command: >
      sh -c "gunicorn -w 2 -b 0.0.0.0:8000 project.wsgi:application"
    volumes:
      #python manage.py migrate
      #python manage.py collectstatic --noinput &&
      #python manage.py runserver 0.0.0.0:8000
      #- .:/home/app/web
      - static_volume:/home/app/web/static
      - media_volume:/home/app/web/media
    #ports:
    #- '8000:8000'
    expose:
      - 8000
    env_file:
      - ./.env.prod
    depends_on:
      - db
  nginx:
    restart: always
    build: ./nginx
    volumes:
      - static_volume:/home/app/web/static
      - media_volume:/home/app/web/media
    ports:
      - 80:80
    depends_on:
      - web

volumes:
  postgres_data:
  static_volume:
  media_volume:
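
An aside on the compose file: the commented-out line - .:/home/app/web would bind-mount the project directory from the host into the container, so files pulled with git on the host show up inside the container without any copying (the named static/media volumes are still mounted on top of it). A minimal sketch of that variant, assuming the compose directory is the project root:

  web:
    volumes:
      - .:/home/app/web
      - static_volume:/home/app/web/static
      - media_volume:/home/app/web/media

Gunicorn still has to be restarted (docker-compose restart web) to pick up new code, and the mount shadows whatever was copied into the image at build time.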

1 answer(s)
AlexandrBirukov, 2021-08-01

So you want to pull the data from git and update it inside Docker, i.e. pull in the code changes and restart the project? Then you're going about it the wrong way; Docker isn't meant for that. The simplest option for automating this is to write a bash script, but you don't even have to log in to the server: use Fabric, it saves a lot of time.
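A minimal sketch of such a bash script (the file name deploy.sh and the service name web are taken from the compose file above as assumptions; adjust paths to your project):

#!/bin/bash
# deploy.sh -- run on the server from the project root
set -e

git pull                                  # fetch the new code
docker-compose up -d --build web          # rebuild the image (COPY . picks up the code) and recreate the container
docker-compose exec -T web python manage.py migrate --noinput
docker-compose exec -T web python manage.py collectstatic --noinput

With Fabric the same commands can be issued from your local machine over SSH, so you never have to log in to the server manually.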
