Versioning a Node.js project and dockerizing it with minimal image diff - docker

The npm package version is located in package.json.
I have a Dockerfile, simplified as follows:
FROM node:carbon
COPY ./package.json ${DIR}/
RUN npm install
COPY . ${DIR}
RUN npm build
Correct my understanding:
If ./package.json changes, is it true that the image layers for steps 2 to 5 are rebuilt?
Assuming that I do not have any changes to npm package dependencies,
how can I change the project version without Docker rebuilding the image layer for RUN npm install?

To sum up, the behavior you describe is standard Docker behavior: as soon as package.json changes and has a different hash, the COPY package.json step is triggered again, as well as each subsequent Dockerfile command.
The Docker setup outlined in the official Node.js documentation does not alleviate this either; it proposes the following Dockerfile:
FROM node:carbon
# Create app directory
WORKDIR /usr/src/app
# Install app dependencies
# A wildcard is used to ensure both package.json AND package-lock.json are copied
# where available (npm@5+)
COPY package*.json ./
RUN npm install
# If you are building your code for production
# RUN npm install --only=production
# Bundle app source
COPY . .
EXPOSE 8080
CMD [ "npm", "start" ]
But if you really want to avoid rerunning npm install from scratch most of the time, you could maintain two files, package.json and, say, package-deps.json: copy in (and rename) package-deps.json, run npm install, then use the proper package.json afterwards.
And if you want more checks to be sure that the dependencies of both files are not out of sync, the package-lock.json file may be of some help, provided you use the new npm ci command that comes with npm 5.8 (cf. the corresponding changelog) instead of npm install.
In this case, as the latest version of npm available in the Docker Hub node image is 5.6, you'll need to upgrade it beforehand.
All things put together, here is a possible Dockerfile for this use case:
FROM node:carbon
# Create app directory
WORKDIR /usr/src/app
# Upgrade npm to have "npm ci" available
RUN npm install -g npm@5.8.0
# Import conf files for dependencies
COPY package-lock.json package-deps.json ./
# Note that this REQUIRES running "npm install --package-lock-only"
# before running "docker build …" (see the host-side sketch after this Dockerfile),
# and also REQUIRES a "package-deps.json" file that is in sync with package.json's dependencies
# Install app dependencies
RUN mv package-deps.json package.json && npm ci
# Note that "npm ci" checks that package.json and package-lock.json are in sync
# COPY package.json ./ # subsumed by the following command
COPY . .
EXPOSE 8080
CMD [ "npm", "start" ]
Disclaimer: I did not try the solution above on a realistic example as I'm not a regular node user, but you may view this as a useful workaround for the dev phase…

Related

How to use docker build cache when version bumping a React app?

Docker doesn't use the build cache when something in package.json or package-lock.json changes, even if it is only the version number and no dependencies have changed.
How can I get Docker to use the old build cache and skip npm install (npm ci) every time?
I know that Docker detects that the files have changed, but package.json hasn't really changed, only the version number.
Below is my Dockerfile
FROM node:10 as builder
ARG REACT_APP_BUILD_NUMBER=X
ENV REACT_APP_BUILD_NUMBER="${REACT_APP_BUILD_NUMBER}"
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
COPY .npmrc ./
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build
FROM nginx:alpine
COPY nginx/nginx.conf /etc/nginx/nginx.conf
COPY --from=builder /usr/src/app/build /usr/share/nginx/html
EXPOSE 80
CMD ["nginx", "-g", "daemon off;"]
Here are some solutions that should help mitigate this problem. There are trade-offs with each, but they are not necessarily mutually exclusive - they can be mixed together for better overall build performance.
Solution I: Docker BuildKit cache mounts
Docker BuildKit enables partial mitigation of this problem using the experimental RUN --mount=type=cache flag. It supports a reusable cache mount during the image build process.
An important caveat here is that support for Docker BuildKit may vary significantly between CI/development environments. Check the documentation and the build environment to ensure it will have proper support (otherwise, it will error). Here are some requirements (but not necessarily an exhaustive list):
The Docker daemon needs to support BuildKit (requires Docker 18.09+).
Docker BuildKit needs to be explicitly enabled with DOCKER_BUILDKIT=1 or by default from a daemon/cli configuration.
A comment is needed at the start of the Dockerfile to enable experimental support: # syntax=docker/dockerfile:experimental
Here is a sample Dockerfile that makes use of this feature, caching npm dependencies locally to /usr/src/app/.npm for reuse in subsequent builds:
# syntax=docker/dockerfile:experimental
FROM node
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
COPY package.json package-lock.json /usr/src/app/
RUN --mount=type=cache,target=/usr/src/app/.npm \
    npm set cache /usr/src/app/.npm && \
    npm ci
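To actually use the cache mount, BuildKit has to be enabled for the build invocation, for example (the image tag is arbitrary):
DOCKER_BUILDKIT=1 docker build -t my-app .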
Notes:
This will cache fetched dependencies locally, but npm will still need to install these into the node_modules directory. Testing with a medium-sized project indicates that this does shave off some build time, but building node_modules can still be non-negligible.
/usr/src/app/.npm will not be included in the final build, and is only available during build time (however, a lingering .npm directory will exist).
The build cache can be cleared if needed; see this Docker forums post.
Caching node_modules itself is not recommended, since removals of dependencies from package.json might not be properly propagated. Your mileage may vary if you attempt it.
Solution II: Install dependencies prior to copying package.json
On the host machine, a script extracts only the dependencies and devDependencies fields from package.json and copies them into a new file, such as package-dependencies.json.
E.g. package-dependencies.json:
{
  "dependencies": {
    "react": "^16.13.1"
  },
  "devDependencies": {
    "gulp": "^4.0.2"
  }
}
In the Dockerfile, COPY package-dependencies.json and package-lock.json and install the dependencies. Then, copy the original package.json. Unless package-lock.json or package.json's dependencies/devDependencies fields change, the layers will be cached and reused from a previous build, meaning minor changes to package.json (such as a version bump) will not trigger npm ci/npm install.
Here is an example:
FROM node
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
# copy dependency list and locked dependencies
COPY package-dependencies.json package-lock.json /usr/src/app/
# install dependencies (npm ci validates the lock file against package.json, so temporarily give it the trimmed file under that name)
RUN mv package-dependencies.json package.json && npm ci
# copy over the full package configuration
COPY package.json /usr/src/app/
# ...
RUN npm run build
# ...
Notes:
Used on its own, this solution will be faster than the first one for small changes (such as a version bump), as it will not need to rerun npm ci at all.
package-dependencies.json will be in the layer history. While this file would be negligible/insignificant in size, it is still "wasted space" since it is not needed in the final image.
A quick script will be needed to generate package-dependencies.json. Depending on the build environment, this may be annoying to implement. Here is an example using the CLI utility jq:
cat package.json | jq -S '. | with_entries(select (.key as $k | ["dependencies", "devDependencies"] | index($k)))' > package-dependencies.json
Solution III: All of the above
Solution I will cache npm dependencies locally for faster dependency fetching. Solution II will only trigger npm ci/npm install when a dependency or development dependency is actually updated. These solutions can be used together to further accelerate build times, as in the sketch below.
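A hedged, untested sketch of what the combination could look like, reusing the cache mount from Solution I and the trimmed package-dependencies.json from Solution II (the rename before npm ci mirrors the install step above):
# syntax=docker/dockerfile:experimental
FROM node
WORKDIR /usr/src/app
# copy only the dependency list and the lock file
COPY package-dependencies.json package-lock.json /usr/src/app/
# keep npm's download cache between builds and install from the trimmed file
RUN --mount=type=cache,target=/usr/src/app/.npm \
    npm set cache /usr/src/app/.npm && \
    mv package-dependencies.json package.json && \
    npm ci
# bring in the real package.json and the rest of the source, then build
COPY . .
RUN npm run build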

run npm ci hangs on docker build ubuntu

I'm trying to build a Docker image for an Ionic project on an Ubuntu VirtualBox VM. Here's the Dockerfile:
# Build
FROM beevelop/ionic AS ionic
# Create the application directory
WORKDIR /usr/src/app
# Install the application dependencies
# We can use a wildcard to ensure both package.json AND package-lock.json are considered
# where available (npm@5+)
COPY package*.json ./
RUN npm --verbose ci
# Bundle app source
COPY . .
RUN ionic build
## Run
FROM nginx:alpine
#COPY www /usr/share/nginx/html
COPY --from=ionic /usr/src/app/www /usr/share/nginx/html
My problem is that the build gets stuck on step 4 (RUN npm --verbose ci). It starts downloading some packages, but then it hangs at some point.
I tried different solutions:
npm clean cache
npm config set registry http://registry.npmjs.org/
removing package-lock.json
But nothing works; any help will be greatly appreciated. Thanks in advance.
To whoever experienced this problem: it was due to the internet connection. Keep trying until it downloads all the packages.
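If the connection is merely flaky rather than completely down, it may also help to raise npm's retry settings before the install; a small sketch (these are standard npm config keys, the values are arbitrary):
npm config set fetch-retries 5
npm config set fetch-retry-mintimeout 20000
npm config set fetch-retry-maxtimeout 120000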

What is the RUN command in a Dockerfile to install Vuetify?

I expected and tried to include it in the Dockerfile directly. Here is my whole Dockerfile:
FROM node
# make the 'app' folder the current working directory
WORKDIR /app
# copy both 'package.json' and 'package-lock.json' (if available)
COPY package*.json ./
# install project dependencies
RUN npm install
RUN npm i --save @koumoul/vuetify-jsonschema-form
RUN npm install --save axios vue-axios
RUN npm install vuetify@1.5.8
# copy project files and folders to the current working directory (i.e. 'app' folder)
COPY . .
But got
Module not found: Error: Can't resolve 'vuetify' in '/app/src/views'
It is not good practice to install packages separately from package.json; you should just include them in your package.json. But I am going to teach you a technique for testing cases like this.
You can first run the image on its own with docker run -it node bash and then try what you want to run there. You can also bind-mount your files into the container so the files you need are included, like docker run -it -v "$(pwd)":/usr/src/app node bash. With this you can experiment with everything you are trying to run in your Dockerfile much more directly, as sketched below.
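For reference, a sketch of that interactive workflow (the /usr/src/app mount target simply matches the command above; the package names come from the question):
# start a throwaway container with the project bind-mounted in
docker run -it -v "$(pwd)":/usr/src/app -w /usr/src/app node bash
# then, inside the container, try the exact commands from the Dockerfile
npm install
npm install --save @koumoul/vuetify-jsonschema-form axios vue-axios vuetify@1.5.8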

Could not find a required file

I'm trying to run a Docker container with a create-react-app app. The app works fine, and here's what my Dockerfile looks like.
# base image
FROM node:12.2.0-alpine
# set working directory
WORKDIR ./
# add `./node_modules/.bin` to $PATH
ENV PATH ./node_modules/.bin:$PATH
# install and cache dependencies
COPY package.json ./package.json
COPY ./build/* ./public/
RUN npm install --silent
RUN npm install react-scripts@3.0.1 -g
# start
CMD ["npm", "start"]
When I run Docker I'm getting this error:
> my-app@0.1.0 start /
> react-scripts start
Could not find a required file.
Name: index.js
Searched in: /src
npm ERR! code ELIFECYCLE
npm ERR! errno 1
npm ERR! my-app@0.1.0 start: `react-scripts start`
npm ERR! Exit status 1
npm ERR!
npm ERR! Failed at the my-app@0.1.0 start script.
npm ERR! This is probably not a problem with npm. There is likely additional logging output above.
npm ERR! A complete log of this run can be found in:
npm ERR! /root/.npm/_logs/2019-07-14T08_29_30_761Z-debug.log
Does anybody have any idea?
npm start is for webpack, which serves as the dev server. You are still using the src files directly, not the minified build (dist), which will only be used in production.
#Dockerfile.dev:
# base image
FROM node:12.2.0-alpine
# set working directory
WORKDIR ./
# add `./node_modules/.bin` to $PATH
ENV PATH ./node_modules/.bin:$PATH
COPY package.json ./package.json
#use the minified build file for production, not now - npm start is for development.
#COPY ./build/* ./public/
#install dependencies:
RUN npm install --silent
RUN npm install react-scripts@3.0.1 -g
#copy your project files (also bad for development; use a volume (https://docs.docker.com/storage/volumes/) instead, see the run command after this Dockerfile)
COPY . .
# start
CMD ["npm", "start"]
(This builds on @EfratLevitan's answer, but is a little more production-oriented. Their answer will be better if you want to use Docker as a core part of your development flow.)
If you have a working Webpack setup already, its output is static files that can be served up by any Web server. Once you've successfully run npm run build, you can use anything to serve the resulting build directory – serve it as static content from something like a Flask application, put it in a cloud service like Amazon S3 that can serve it for you, directly host it yourself. Any of the techniques described on the CRA Deployment page will work just fine in conjunction with a Docker-based backend.
If you'd like to serve this yourself via Docker, you don't need Node to serve the build directory, so a plain Web server like nginx will work fine. The two examples from the image description work for you here:
# Just use the image and inject the content as data
docker run -v $PWD/build:/usr/share/nginx/html -p 80:80 nginx
# Build an image with the content "baked in"
cat >Dockerfile <<EOF
FROM nginx
COPY ./build /usr/share/nginx/html
EOF
# Run it
docker build -t me/nginx .
docker run -p 80:80 me/nginx
The all-Docker equivalent to this is to use a multi-stage build to run the Webpack build inside Docker, then copy it out to a production Web server image.
FROM node:12.2.0-alpine AS build
WORKDIR /app
COPY package.json yarn.lock ./
RUN npm install --silent
RUN npm install react-scripts@3.0.1 -g
COPY . .
RUN npm run build
FROM nginx
COPY --from=build /app/build /usr/share/nginx/html
In this model you'd develop your front-end code locally. (The Webpack/CRA stack has pretty minimal host dependencies, and since the application runs in the user's browser, it can't depend on Docker-specific networking features.) You'd only build this Dockerfile once you wanted to run an end-to-end test with all of the parts running together before you actually pushed out to production.

Docker COPY . . for bundling

I am working on creating a Docker image with the following Dockerfile:
FROM node:lts-alpine
# Create app directory
WORKDIR /usr/src/app
# Install app dependencies
# A wildcard is used to ensure both package.json AND package-lock.json are copied
# where available (npm@5+)
COPY package*.json ./
RUN npm install --only=production
# Bundle app source
COPY . .
EXPOSE 8080
CMD [ "npm", "start"]
I am confused about the following line
# Bundle app source
COPY . .
What exactly is meant here by bundling? Copy everything? If that is the case, why is it copying the package.json file beforehand?
I was confused about the exact same thing.
The solution is the distinction between your application and its dependencies: Running npm install after copying package.json only installs the dependencies (creating the node_modules folder), but your application code is still not inside the container. That's what COPY . . does. I don't think the word "bundle" has any special meaning here, since it is just a normal file copy operation.
Separating those steps means the result of npm install (i.e. the image layer produced by that command) can be cached by Docker and thus doesn't have to be rebuilt every time a part of the application code changes. This makes builds and deploys faster.
PS: When talking about making deploys faster, have a look at npm ci: https://blog.npmjs.org/post/171556855892/introducing-npm-ci-for-faster-more-reliable
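For instance, a minimal sketch of that npm ci variant, keeping the same layer split (assumes package-lock.json is committed alongside package.json):
FROM node:lts-alpine
WORKDIR /usr/src/app
# copy both manifests so npm ci can validate the lock file
COPY package*.json ./
# clean, reproducible install straight from package-lock.json
RUN npm ci
# bundle app source
COPY . .
EXPOSE 8080
CMD [ "npm", "start" ]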
To bundle your app's source code inside the Docker image, use the COPY instruction.
