How do you use Docker and npm/Composer when developing locally (node_modules/vendor folder)?
The task: install only Git, Docker, and the editor you like; run `git clone` and `docker-compose up`, and start working on the project. A project here is any set of services and databases, at least one of which is written in Node.js.
I added the Composer tag because the PHP community seems to have the same problem.
FROM node:10-alpine
ARG NODE_ENV=production
ENV NODE_ENV=${NODE_ENV}
# Install the latest npm
RUN npm i npm@latest -g
WORKDIR /usr/src/app
# Install dependencies
COPY package*.json ./
RUN npm install
# Copy app code
COPY ./src ./src
# https://github.com/nodejs/docker-node/blob/master/docs/BestPractices.md#non-root-user
USER node
CMD ["node", "src/index.js"]
version: '3'
services:
  api:
    environment:
      - NODE_ENV=development
    build:
      context: ./api
      args:
        - NODE_ENV=development
    volumes:
      - ./api:/usr/src/app
      - /usr/src/app/node_modules # anonymous volume so the host bind mount does not shadow the image's node_modules
    ports:
      - 3000:3000
    command: node_modules/.bin/nodemon src/index.js
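One caveat with that anonymous node_modules volume: it survives container recreation, so after changing package.json the image has to be rebuilt and the stale volume discarded. A possible invocation (the --renew-anon-volumes / -V flag is available in docker-compose 1.22+):

# Rebuild so RUN npm install sees the new package.json, then recreate
# the container and discard the old anonymous node_modules volume:
docker-compose up --build --renew-anon-volumes api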
For a dev environment, it makes sense to mount all of the code, including the node_modules folder, as a volume, and run npm i / composer install at the appropriate moments (see the sketch after the list below).
The same command is executed when building the image, so all the necessary packages are installed for testing and in production.
Yes, this approach leaves room for discrepancies between the dependencies in dev and in prod. In my situation that probability is small enough to neglect; otherwise, use your option 2.5.
Options 1 and 2 are not recommended, for simple reasons:
1. They violate environment parity: if the developer, stage, test, and prod environments run different versions of Node.js, that can turn into big problems. Therefore, all operations should be performed with the tools installed in the container.
2. You can run into a situation where the remote dependency servers do not respond (overloaded, DDoSed, down, etc.) or one of the dependencies has been removed from them (this has already happened to me). In other words, the entire contents of the node_modules/vendor folder must be stored in the container. Alternatively, have your admin run a mirror of, say, Packagist and fuss over keeping its packages current (which makes little sense when you can store everything in the container image).
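A minimal sketch of the "mount everything" dev setup described above, assuming the same ./api layout as in the question (service name and paths are illustrative):

services:
  api:
    build: ./api
    volumes:
      - ./api:/usr/src/app # the whole tree, node_modules/vendor included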
Apart from Docker and docker-compose, the dev machine really needs nothing else, because npm / composer already live inside the container (well, usually), and you can install dependencies straight into the mounted volume using `docker-compose run CONTAINER_NAME COMMAND`, as shown below.
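For example (the service name api is taken from the compose file above; --rm removes the one-off container afterwards):

docker-compose run --rm api npm install
docker-compose run --rm api composer install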
If there are developers on Windows, keep in mind that Docker bind mounts there don't deliver filesystem events, so node.js watch, golang fresh, etc. break: fs.notify will not fire, and nodemon will not react to changes. Therefore, in the Node.js case I only put databases, Redis, and the like into Docker.

It is better to put database data into external volumes; if it goes straight into a bind-mounted folder, things break, and Postgres is especially sensitive to this.

Bothering with building an image and pushing it to a registry makes sense when there are a lot of machines; otherwise it is a complete mess: builds take a long time and the images weigh a ton. It is easier to connect the code through volumes and run composer update, npm i, and migrate up inside the container. And to keep versions in sync, lock files were invented long ago, though in PHP that won't save you, since Packagist doesn't even require a version when a package is published.
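If nodemon inside Docker on Windows is still wanted, its polling mode is a common workaround for the missing filesystem events (this is nodemon's own --legacy-watch flag, at the cost of some CPU):

command: node_modules/.bin/nodemon --legacy-watch src/index.js

And a named volume for Postgres data, per the advice above (service and volume names are illustrative; the data path is the image's default):

services:
  db:
    image: postgres
    volumes:
      - db-data:/var/lib/postgresql/data
volumes:
  db-data: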