Yarn fails when running yarn install twice - ruby-on-rails

When I run yarn + rails in a Docker container I cannot execute yarn twice during my Jenkins build process. Does anyone know a good solution for using yarn in a Jenkins pipeline?
I currently create a docker-compose file and execute my commands on the container with rails + yarn installed. The first yarn install command always passes, but the subsequent yarn installs, which Rails runs automatically to check integrity, keep failing.

Try clearing the Yarn cache, or test it with npm -g.
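A minimal sketch of the cache-clearing approach, assuming Yarn 1.x (classic) inside the container:
yarn cache clean              # drop the local Yarn cache
yarn install --check-files    # reinstall and re-verify the files already in node_modules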

Related

Install all dependencies on CI once and use them for all subsequent pipelines (not running npm i every time)

Is it possible to install node_modules with all dependencies once and save these files, so that npm ci, npx playwright i, etc. don't have to be run again?
I'm doing test automation with Playwright and have started facing a problem on GitLab: when its components are being installed, there are some connection problems (according to the error code) on Playwright's side, as I understand it, and the whole pipeline fails. I'd like to solve this somehow.
I also know there is another way to solve the problem, such as using a Docker image with Playwright and the other tools installed from the start.
What can you suggest?
An ideal way could be to use Docker multi-stage builds to get this done. Attaching a sample Dockerfile that runs npm i and then copies the static files into an nginx container.
Dockerfile
ARG DistArtifact=teapot
FROM krravindra/angularbuilder:15 as builderstep
WORKDIR /app
ADD . /app
RUN npm i
RUN ng build
FROM nginx:1.17-alpine
# re-declare the build arg so it is visible inside this stage (it keeps the global default)
ARG DistArtifact
COPY --from=builderstep /app/dist/$DistArtifact /usr/share/nginx/html/
Your CI might use
docker build -t myimagename .
You can always switch your builder stage to a different version of Angular or npm as required.
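If your dist folder name differs from the default, the build argument can be overridden at build time; the artifact name below is only a placeholder:
docker build -t myimagename --build-arg DistArtifact=my-angular-app .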

How do I install an npm package into a Cypress container?

I'm trying to install randomstring package into Cypress container because my tests are using it.
Dockerfile:
FROM cypress/base:17.8.0
RUN npm install randomstring#1.2.2
RUN npm install cypress#7.0.0
COPY . /e2e
RUN npm cypress run
error output I'm getting:
1 error occurred:
* Status: The command '/bin/sh -c npm install randomstring#1.2.2 cypress#7.0.0' returned a non-zero code: 1, Code: 1
I'm aware that the cypress/base image comes with the operating system and dependencies required to run Cypress, but I'm not sure if npm is included.
What is the right way to install packages into a Cypress container?
Any particular reason to use the base image?
Have you tried cypress/included:7.0.0, which already has Cypress + browsers installed?
Furthermore, it's better to keep all your dependencies (randomstring included) in the package.json file where they belong, copy it over, and call npm install or npm ci.
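A rough sketch of that approach (the tag and paths are assumptions, not tested):
FROM cypress/included:7.0.0
WORKDIR /e2e
# keep test dependencies such as randomstring in package.json and install them here
COPY package.json package-lock.json ./
RUN npm ci
COPY . .
# cypress/included already ships the Cypress binary and browsers and runs `cypress run`
# as its default entrypoint, so no separate install or run step is needed in the image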

How to resolve "The cypress npm package is installed, but the Cypress binary is missing."

I'm trying to download and install Cypress within GitLab CI runner and getting this error output:
The cypress npm package is installed, but the Cypress binary is missing.
We expected the binary to be installed here: /root/.cache/Cypress/4.8.0/Cypress/Cypress
Reasons it may be missing:
- You're caching 'node_modules' but are not caching this path: /root/.cache/Cypress
- You ran 'npm install' at an earlier build step but did not persist: /root/.cache/Cypress
Properly caching the binary will fix this error and avoid downloading and unzipping Cypress.
Alternatively, you can run 'cypress install' to download the binary again.
I ran the suggested command cypress install but it didn't help.
Next it says "You're caching 'node_modules' but are not caching this path: /root/.cache/Cypress".
I don't understand how you can cache the modules and leave out the path to them.
Next is "You ran 'npm install' at an earlier build step but did not persist". I did have npm install in earlier builds, so I replaced it with npm ci, as recommended in the official Cypress docs for such cases.
No resolution though.
Here are relevant lines where the error occurs:
inside of Dockerfile:
COPY package.json /usr/src/app/package.json
COPY package-lock.json /usr/src/app/package-lock.json
RUN npm ci
inside the test runner:
docker-compose -f docker-compose-prod.yml up -d --build
./node_modules/.bin/cypress run --config baseUrl=http://localhost
inside the package.json:
{
  "name": "flask-on-docker",
  "dependencies": {
    "cypress": "^4.8.0"
  }
}
Can anyone point me in the right direction?
You are probably running npm install and cypress run in two different stages. In that case the Cypress cache cannot be persisted, so it is recommended to set CYPRESS_CACHE_FOLDER both when installing and when running cypress run/open. The commands will look like this:
CYPRESS_CACHE_FOLDER=./tmp/Cypress yarn install
CYPRESS_CACHE_FOLDER=./tmp/Cypress npx cypress run [--params]
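If the install and the tests run as separate GitLab CI jobs, the same idea can be written into .gitlab-ci.yml by caching the Cypress cache folder alongside node_modules; a rough sketch, with the job and path names as assumptions:
variables:
  # keep the Cypress binary inside the project so it can be cached between jobs
  CYPRESS_CACHE_FOLDER: "$CI_PROJECT_DIR/cache/Cypress"

cache:
  paths:
    - node_modules/
    - cache/Cypress/

test:
  script:
    - npm ci
    - npx cypress run --config baseUrl=http://localhost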
This helped me (Windows):
.\node_modules\.bin\cypress.cmd install --force
Or if you're using a UNIX system:
./node_modules/.bin/cypress install --force
https://newbedev.com/the-cypress-npm-package-is-installed-but-the-cypress-binary-is-missing-591-code-example
Running yarn cypress install --force before the tests worked for me.
I had the same problem.
I ran this command to make the jenkins user the owner of my Cypress project folder, and after that everything was OK.
sudo chown -R jenkins: /your cypress project path/

Compile webpack on docker production server

I am setting up Docker for my React/Redux app, and I was wondering how to set it up in such a way that, in production, when the container is set up, webpack compiles my whole codebase with the production configuration and then removes itself, or something like that. The only things I need for the project are the production build and a simple Node server to serve it.
I'm not sure if I explained it well, since Docker and webpack are still new things for me.
EDIT:
Alternatively I can even serve everything with an Apache server, but I want everything to compile and be set up just when I run docker-compose.
If I understand correctly, you want to trash your node dev dependencies from your image after your npm run build during the docker build.
You can do it but there is a little trick you must be aware of.
Each line in your Dockerfile results in a new layer in the image and is pushed with the image.
So, if you execute this in your Dockerfile:
RUN npm install # Install dev and prod deps
RUN npm run build # Execute your webpack build
RUN npm prune --production # Trash all devDependencies from your node_modules folder
Your image will contain:
The first npm install
The npm run build
The result of the npm prune
Your image will be bigger than with just:
RUN npm install # Install dev and prod deps
RUN npm run build # Execute your webpack build
Which contains:
The first npm install
The npm run build
To avoid this problem you must combine them into a single RUN in your Dockerfile:
RUN npm install && npm run build && npm prune --production
That way you will get a minimal image, containing only:
The npm run build
The result of the npm prune
Your final Dockerfile will look something like this:
FROM node:7.4.0
ADD . /src
RUN cd /src && npm install && npm run build && npm prune --production # You can even use npm prune without the --production flag
ENV NODE_ENV production
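If you are on a Docker version that supports multi-stage builds (17.05+), a variation of the same idea is to build in one stage and copy only the compiled assets into the final image; a sketch, where the dist/ path and server.js are assumptions about your project layout:
FROM node:7.4.0 as build
ADD . /src
WORKDIR /src
# install everything needed for the webpack build and run it
RUN npm install && npm run build

FROM node:7.4.0
ENV NODE_ENV production
WORKDIR /app
# copy only the manifest, the compiled bundle, and the small server that serves it
COPY --from=build /src/package.json ./package.json
COPY --from=build /src/dist ./dist
COPY --from=build /src/server.js ./server.js
RUN npm install --production
CMD ["node", "server.js"]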

Run webpack build during docker-compose build process

I am trying to build a production docker container with a minified version of the js files.
In my Dockerfile, after installing the npm packages, I am trying to run the webpack build.
RUN npm install -g n # upgrading the node version via n
RUN n stable
ADD ./webpack/package.json /package.json
RUN npm install --production
RUN npm run build-production # <<< Fails here
The docker build process will fail during the last command RUN npm run build-production with npm complaining that it can't find the installed packages (NODE_PATH is set).
However, when I add the call npm run build-production to my ENTRYPOINT script, it works fine and compiles everything as expected. But then it runs the webpack build every time I start the container, which isn't desired.
Why can't the last docker build step find the packages installed in the previous steps? But why does it work through the entrypoint script?
What is the best way to add the webpack build to the docker build in my Dockerfile?
Please use
RUN bash -l -c 'npm run build-production'
instead of your
RUN npm run build-production # <<< Fails here
This should help.
The problem could be that build-production requires devDependencies, which are not being installed because you run npm install --production.
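If that is the cause, one possible fix, in the spirit of the earlier answer about npm prune, is to install everything for the build step and drop the devDependencies again afterwards:
# install dev + prod deps so the build tooling is available, build, then prune devDependencies
RUN npm install && npm run build-production && npm prune --production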
A great way to keep your production images small is to use a tool like dobi. It makes it easy to run the build tasks in a dev container, then pack everything into a production image, all with a single command.
Below is an example dobi.yaml that might work for you.
meta:
  project: some-project-name

image=builder:
  image: myproject-dev
  context: dockerfiles/
  dockerfile: Dockerfile.build

image=production:
  image: user/prod-image-name
  context: .
  depends: [build]

mount=source:
  bind: .
  path: /code

run=build:
  use: builder
  mounts: [source]
  command: "npm run build-production"
  artifact: path/to/minified/assets
Running dobi production will run any of the tasks that are stale. If none of the source files have changed, the tasks are skipped. The depends: [build] ensures that the build step always runs first.
