Run webpack build during docker-compose build process - docker

I am trying to build a production docker container with a minified version of the js files.
In my Dockerfile, after installing the npm packages, I am trying to build the webpack compilation.
RUN npm install -g n # upgrading the npm version
RUN n stable
ADD ./webpack/package.json /package.json
RUN npm install --production
RUN npm run build-production # <<< Fails here
The docker build process will fail during the last command RUN npm run build-production with npm complaining that it can't find the installed packages (NODE_PATH is set).
However, when I add the call npm run build-production to my ENTRYPOINT script, it works fine and compiles everything as expected. The downside is that it runs the webpack build every time I start the container, which isn't desired.
Why can't the last docker build step find the packages installed in the previous steps? But why does it work through the entrypoint script?
What is the best way to add the webpack build to the docker build in my Dockerfile?

Please use
RUN bash -l -c 'npm run build-production'
instead of
RUN npm run build-production # <<< Fails here
Running the command through a login shell loads the profile environment (including the PATH and NODE_PATH you set up), which should help.

The problem could be that build-production requires devDependencies, which are not installed when you run npm install --production.
A great way to keep your production images small is to use a tool like dobi. It makes it easy to run the build tasks in a dev container, then pack everything into a production image, all with a single command.
Below is an example dobi.yaml that might work for you.
meta:
  project: some-project-name

image=builder:
  image: myproject-dev
  context: dockerfiles/
  dockerfile: Dockerfile.build

image=production:
  image: user/prod-image-name
  context: .
  depends: [build]

mount=source:
  bind: .
  path: /code

run=build:
  use: builder
  mounts: [source]
  command: "npm run build-production"
  artifact: path/to/minified/assets
Running dobi production will run any of the tasks that are stale. If none of the source files have changed the tasks are skipped. The depends: [build] ensures that the build step always runs first.
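The production image's Dockerfile is not shown above; under this model it only needs to copy the built artifact in, since the dev toolchain lives entirely in the builder image. A minimal sketch (the base image is illustrative, the asset path matches the artifact above):

```dockerfile
# Hypothetical Dockerfile for the image=production resource:
# no node, no node_modules, just a web server and the built assets
FROM nginx:alpine
COPY path/to/minified/assets /usr/share/nginx/html/
```

Because `depends: [build]` runs the build task first, the artifact is guaranteed to exist before this image is built.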

Related

How to get rid of "The Angular CLI requires a minimum Node.js" error during the creation of a docker image

Whenever I run the following command to create a Docker image,
docker build -t ehi-member-portal:v1.0.0 -f ./Dockerfile .
I get the following results
I'm not sure why it is complaining about Node version because I am currently running
And I am not sure why it is detecting v12.14.1 when you can see I am running v14.20.0. I installed Node and npm using NVM. I used this site as a reference for how to create the Node and nginx image for a container.
Here is the contents of my Dockerfile:
FROM node:12.14-alpine AS builder
WORKDIR /dist/src/app
RUN npm cache clean --force
COPY . .
RUN npm install
RUN npm run build --prod
FROM nginx:latest AS ngi
COPY --from=builder /dist/ehi-member-portal /usr/share/nginx/html
COPY /nginx.conf /etc/nginx/conf.d/default.conf
EXPOSE 80
Here is more version information:
Any help would be HIGHLY appreciated. I need to figure this out.
RUN npm run build --prod is executed INSIDE the Docker container, and Node inside is not the required version.
Also, you clearly state that you want to use Node v12 with
FROM node:12.14-alpine AS builder
so this is why it is "detected" as v12.14.1: that is the Node version inside the container. Bump the version. You can use one of the images listed here
https://hub.docker.com/_/node
e.g.
FROM node:14.20.0-alpine AS builder
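Putting it together, a corrected builder stage might look like this (a sketch: the 14.20.0 tag and the use of npm ci are assumptions, adjust to whatever version your Angular CLI actually requires):

```dockerfile
# Bump the base image so the Node version inside the container
# matches what the project was developed against
FROM node:14.20.0-alpine AS builder
WORKDIR /dist/src/app
# Copying the manifests first lets Docker cache the install layer
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build --prod
```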

Install all dependencies on CI once and use them for all next pipelines (Not doing npm i every time)

Is it possible to install node_modules with all dependencies once and save these files, instead of running npm ci, npx playwright install, etc. every time?
I'm doing test automation with Playwright and started facing a problem on GitLab: when installing its components there are connection problems (according to the error code) on Playwright's side, as I understand it, and the full pipeline fails. I'd like to solve this somehow.
I also know another way to solve the problem might be to use a Docker image with Playwright and the other dependencies installed from the start.
What can you suggest?
An ideal way could be to use Docker multi-stage builds. Below is a sample Dockerfile that runs npm i and then copies the static files to an nginx container.
Dockerfile
ARG DistArtifact=teapot
FROM krravindra/angularbuilder:15 as builderstep
WORKDIR /app
ADD . /app
RUN npm i
RUN ng build
FROM nginx:1.17-alpine
ARG DistArtifact # an ARG declared before the first FROM must be re-declared inside a stage to be usable there
COPY --from=builderstep /app/dist/$DistArtifact /usr/share/nginx/html/
Your CI might use
docker build -t myimagename .
You can always change your builder stage to a different version of Angular or npm as required.
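If you would rather avoid a full npm ci on every pipeline without Docker, GitLab CI's cache can also do it. A sketch of a .gitlab-ci.yml fragment (the PLAYWRIGHT_BROWSERS_PATH redirection is an assumption about where you want the browsers stored — GitLab only caches paths inside the project directory):

```yaml
variables:
  # Keep Playwright's browser downloads inside the project dir so they are cacheable
  PLAYWRIGHT_BROWSERS_PATH: "$CI_PROJECT_DIR/.cache/ms-playwright"

cache:
  key:
    files:
      - package-lock.json   # reuse the cache until the lockfile changes
  paths:
    - node_modules/
    - .cache/ms-playwright/
```

With this in place, npm ci and the Playwright browser download only re-run when package-lock.json changes or the cache is evicted.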

How to install @swc/core-linux-musl on windows, to make it work in docker container?

I'm working on Windows, dockerizing a Next.js with TypeScript app.
Here is my dockerfile:
FROM node:alpine
# create directory where our application will be run
RUN mkdir -p /usr/src
WORKDIR /usr/src
# copy our files into directory
COPY . /usr/src
# install dependences
RUN npm install
EXPOSE 3000
ENTRYPOINT ["npm", "run" ,"dev"]
During development I bind the host directory to the container with --mount type=bind,source=d:/apps/library.next.js/,target=/usr/src. When I start the container I get the error: error - Failed to load SWC binary for linux/x64, see more info here: https://nextjs.org/docs/messages/failed-loading-swc.
That's fine, I understand the error and know what to do. To fix this I need to install @swc/cli @swc/core @swc/core-linux-musl, but I can't do it because npm complains:
npm ERR! code EBADPLATFORM
npm ERR! notsup Unsupported platform for @swc/core-linux-musl@1.2.42: wanted {"os":"linux","arch":"x64"} (current: {"os":"win32","arch":"x64"})
How can I install it on Windows, or how can I change the Docker setup to make it work? I would have to install it locally so that it gets linked (by the bind mount!) into the container.
My workaround for now is to get into the container with docker exec -it <id> /bin/sh and then manually run npm install --save-dev @swc/cli @swc/core @swc/core-linux-musl. But doing that every time I recreate the container is annoying.
The docs state: The -f or --force will force npm to fetch remote resources even if a local copy exists on disk. This appears both in the v6 (legacy) docs you posted and in the v8 version (see the section after --package-lock-only; it comes with the example npm install sax --force). So you shouldn't have issues with that every time your container is recreated.
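For the recreate-the-container annoyance itself, one common pattern (an assumption about the setup above, not something from the npm docs) is to overlay an anonymous volume on node_modules so the Windows bind mount does not shadow the Linux binaries installed in the image:

```yaml
# Hypothetical docker-compose fragment: the bind mount brings in the source,
# while the anonymous volume keeps the container's own node_modules
services:
  web:
    build: .
    ports:
      - "3000:3000"
    volumes:
      - d:/apps/library.next.js/:/usr/src
      - /usr/src/node_modules
```

That way npm install runs once at image build time, and the musl SWC binary never needs to exist on the Windows host at all.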

How to resolve "The cypress npm package is installed, but the Cypress binary is missing."

I'm trying to download and install Cypress within GitLab CI runner and getting this error output:
The cypress npm package is installed, but the Cypress binary is missing.
We expected the binary to be installed here: /root/.cache/Cypress/4.8.0/Cypress/Cypress
Reasons it may be missing:
- You're caching 'node_modules' but are not caching this path: /root/.cache/Cypress
- You ran 'npm install' at an earlier build step but did not persist: /root/.cache/Cypress
Properly caching the binary will fix this error and avoid downloading and unzipping Cypress.
Alternatively, you can run 'cypress install' to download the binary again.
I ran the suggested command cypress install but it didn't help.
Next it says You're caching 'node_modules' but are not caching this path: /root/.cache/Cypress
I don't understand how you can cache the modules and leave out the path to it.
Next is You ran 'npm install' at an earlier build step but did not persist I did have npm install in earlier builds so I replaced it with npm ci as it's recommended in Cypress official docs in such cases.
No resolution though.
Here are relevant lines where the error occurs:
inside of Dockerfile:
COPY package.json /usr/src/app/package.json
COPY package-lock.json /usr/src/app/package-lock.json
RUN npm ci
inside the test runner:
docker-compose -f docker-compose-prod.yml up -d --build
./node_modules/.bin/cypress run --config baseUrl=http://localhost
inside the package.json:
{
"name": "flask-on-docker",
"dependencies": {
"cypress": "^4.8.0"
}
}
Can anyone point me in a right direction ?
You are probably running npm install and cypress run in two different stages. In this case the Cypress cache is not persisted, so it is recommended to set the CYPRESS_CACHE_FOLDER option both for the install and for cypress run/open. The commands will look like this:
CYPRESS_CACHE_FOLDER=./tmp/Cypress yarn install
CYPRESS_CACHE_FOLDER=./tmp/Cypress npx cypress run [--params]
This helped me (Windows):
.\node_modules\.bin\cypress.cmd install --force
Or if you're using a UNIX system:
./node_modules/.bin/cypress install --force
https://newbedev.com/the-cypress-npm-package-is-installed-but-the-cypress-binary-is-missing-591-code-example
yarn cypress install --force before running of tests worked for me
I had the same problem.
I ran this command to grant the jenkins user ownership of my Cypress project folder, and after that everything was OK:
sudo chown -R jenkins: /your cypress project path/
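For the GitLab CI case in the question, the error message's own advice can be followed directly: pin the cache folder and persist it between jobs. A sketch of a .gitlab-ci.yml fragment (paths are illustrative):

```yaml
variables:
  # Keep the Cypress binary inside the project dir so GitLab can cache it
  CYPRESS_CACHE_FOLDER: "$CI_PROJECT_DIR/cache/Cypress"

cache:
  paths:
    - node_modules/
    - cache/Cypress/
```

This caches the binary alongside node_modules, which is exactly the missing piece the error output points at.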

Compile webpack on docker production server

I am setting up Docker for my React/Redux app, and I was wondering how to set it up in such a way that, in production, on container setup, webpack compiles my whole code with the production configuration and then removes itself, or something like that. The only thing I will need for my project is the production code and a simple Node server that will serve it.
I'm not sure if I explained it well, since docker and webpack are still new things for me.
EDIT:
Alternatively, I could even serve everything with an Apache server, but I want everything to compile and set itself up just when I run docker-compose.
If I understand correctly, you want to remove your Node dev dependencies from your image after npm run build during the docker build.
You can do it, but there is a little trick you must be aware of.
Each RUN line in your Dockerfile results in a new layer in the image and is pushed with the image.
So, if you execute in your Dockerfile :
RUN npm install # Install dev and prod deps
RUN npm run build # Execute your webpack build
RUN npm prune --production # Trash all devDependencies from your node_modules folder
Your image will contain:
The first npm install
The npm run build
The result of the npm prune
Your image will be bigger than just :
RUN npm install # Install dev and prod deps
RUN npm run build # Execute your webpack build
Which contains:
The first npm install
The npm run build
To avoid this problem, you must combine the steps in your Dockerfile:
RUN npm install && npm run build && npm prune --production
That way you will get a minimal image, whose single layer contains:
The npm run build
The result of the npm prune
Your final Dockerfile will be some sort of :
FROM node:7.4.0
ADD . /src
RUN cd /src && npm install && npm run build && npm prune --production # You can even use npm prune without the --production flag
ENV NODE_ENV production
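On a Docker version with multi-stage build support (17.05+), the prune trick can be skipped entirely: build in one stage and copy only the output into a clean final image. A sketch, assuming the webpack build emits to /src/dist and a static server is enough:

```dockerfile
FROM node:7.4.0 AS build
WORKDIR /src
COPY package.json ./
RUN npm install          # dev and prod deps, needed for webpack
COPY . .
RUN npm run build        # webpack production build

# The final image never sees node_modules at all
FROM nginx:alpine
COPY --from=build /src/dist /usr/share/nginx/html
```

Intermediate stages are discarded, so the final image contains only nginx and the compiled assets.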
