Hey, I'd like to run my Cypress tests using GitLab pipelines. I've got the following Docker image:
FROM cypress/browsers:latest
ARG DIR="/usr/tests/e2e"
ENV NODE_MODULES_PATH="$DIR/node_modules"
WORKDIR $DIR
COPY ./tests/e2e/package*.json ./
RUN npm ci
which is built at the beginning of the pipeline as the first job. My .gitlab-ci.yml file looks as follows:
image-e2e:
  # build and push a Docker image
  ...

test-e2e-staging:
  stage: test-staging
  image: registry.gitlab.com/.../e2e:latest
  script:
    - cd tests/e2e
    - npm run e2e:ci
  environment:
    name: staging
  needs: ["deploy-frontend-staging", "deploy-backend-staging"]
  dependencies: []
  allow_failure: false
The e2e:ci command simply runs cypress
"e2e:ci": "cypress run --headless --browser chrome --config-file cypress/config/cypress.json",
But the job output on GitLab gives me the following error:
> cypress run --headless --browser chrome --config-file cypress/config/cypress.json
sh: 1: cypress: not found
npm ERR! code ELIFECYCLE
npm ERR! syscall spawn
npm ERR! file sh
npm ERR! errno ENOENT
npm ERR! e2e@1.0.0 e2e:ci: `cypress run --headless --browser chrome --config-file cypress/config/cypress.json`
npm ERR! spawn ENOENT
npm ERR!
npm ERR! Failed at the e2e@1.0.0 e2e:ci script.
npm ERR! This is probably not a problem with npm. There is likely additional logging output above.
npm WARN Local package.json exists, but node_modules missing, did you mean to install?
npm ERR! A complete log of this run can be found in:
npm ERR! /root/.npm/_logs/2022-05-01T12_59_51_436Z-debug.log
Can anyone tell me what I'm doing wrong here? Thanks a lot in advance. Also, I have a cypress dependency in the devDependencies section of package.json. The image-e2e job gives me the following output:
Step 6/6 : RUN npm ci
---> Running in ed278e712827
> cypress@9.5.4 postinstall /usr/tests/e2e/node_modules/cypress
> node index.js --exec install
So it looks like Cypress has been successfully installed here.
Since it complains about cypress not being found, why don't you simply switch to the cypress/included Docker image? Then there is no need to install Cypress on the fly.
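For illustration, a minimal sketch of the job using cypress/included; the tag is an assumption based on the cypress@9.5.4 shown in your build log, and since cypress/included images set cypress run as their entrypoint, GitLab jobs usually override it with an empty one:

test-e2e-staging:
  stage: test-staging
  image:
    name: cypress/included:9.5.4   # assumed tag; match the Cypress version in your package.json
    entrypoint: [""]               # the image's default entrypoint is `cypress run`, so clear it
  script:
    - cd tests/e2e
    - npm run e2e:ci               # the globally installed cypress binary is on the PATH
  environment:
    name: staging
  needs: ["deploy-frontend-staging", "deploy-backend-staging"]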
Related
I created a Docker image which stores an npm project. The npm project has an npm script which runs tests. I use GitLab for CI/CD, where I want to define a job that will pull my image and run the npm script. This is the .gitlab-ci.yml:
stages:
  - test

.test-api:
  image: $CI_REGISTRY_IMAGE
  stage: test
  script:
    - cd packages/mypackage && npm run test:ci
  artifacts:
    paths:
      - packages/mypackage/test-report.html
    expire_in: 1 week

test-api-beta:
  extends: .test-api
  environment:
    name: some-env
  variables:
    CI_REGISTRY_IMAGE: my_image_name
The gitlab job fails with the error:
> mypackage@1.0.0 test:ci /builds/my-organization/my-project/packages/mypackage
> DEBUG=jest-mongodb:* NODE_ENV=test ts-node --transpile-only --log-error node_modules/.bin/jest --watchAll=false --detectOpenHandles --bail
sh: 1: ts-node: not found
npm ERR! code ELIFECYCLE
npm ERR! syscall spawn
npm ERR! file sh
npm ERR! errno ENOENT
npm ERR! mypackage@1.0.0 test:ci: `DEBUG=jest-mongodb:* NODE_ENV=test ts-node --transpile-only --log-error node_modules/.bin/jest --watchAll=false --detectOpenHandles --bail`
npm ERR! spawn ENOENT
npm ERR!
npm ERR! Failed at the mypackage@1.0.0 test:ci script.
npm ERR! This is probably not a problem with npm. There is likely additional logging output above.
npm WARN Local package.json exists, but node_modules missing, did you mean to install?
npm ERR! A complete log of this run can be found in:
npm ERR! /root/.npm/_logs/2021-04-28T09_05_39_023Z-debug.log
The main issue is the warning npm WARN Local package.json exists, but node_modules missing, did you mean to install?. This means that the GitLab script is executed against the actual Git repository of my project instead of against the contents of the Docker image. Indeed, my repository doesn't contain node_modules, so the job fails. But why doesn't GitLab execute the script on the actual image?
The docker image has a CMD directive:
CMD ["npm", "run", "start"]
Maybe the CMD somehow interferes with the gitlab script?
P.S. pulling the docker image manually and executing the npm script locally works.
This is my Dockerfile:
FROM node:14.15.1
COPY ./package.json /src/package.json
WORKDIR /src
RUN npm install
COPY ./lerna.json /src/lerna.json
COPY ./packages/mypackage/package.json /src/packages/mypackage/package.json
RUN npm run clean
COPY . /src
EXPOSE 8082
CMD ["npm" , "run", "start"]
EDIT: As per M. Iduoad's answer, if the script is changed as follows:
.test-api:
  image: $CI_REGISTRY_IMAGE
  stage: test
  script:
    - cd /src/packages/mypackage && npm run test:ci
  artifacts:
    paths:
      - packages/mypackage/test-report.html
    expire_in: 1 week
the npm script works. We need to cd /src/packages/mypackage because that is where the package lives inside the image, as defined in the Dockerfile.
GitLab always clones your repo, checks out the branch the pipeline is triggered against, and runs your commands on that code (in the folder CI_PROJECT_DIR).
So in order to use the version of the code baked into your image, you should either move to the folder where it is located inside your Docker image:
.test-api:
  image: $CI_REGISTRY_IMAGE
  stage: test
  script:
    - cd /the/absolute/path/of/the/project/ && npm run test:ci
Doing this, however, defies GitLab CI's way of doing things: your job will always run on the same code (the one in the image) every time it is run, whereas GitLab CI is intended to run your jobs against the code in your Git repo.
So, to summarize, I suggest you instead add a stage where you install your dependencies (node_modules):
stages:
  - install
  - test

install-deps:
  image: node:latest # or the version you are using
  stage: install
  script:
    - npm install
  cache:
    key: some-key
    paths:
      - $CI_PROJECT_DIR/node_modules

.test-api:
  image: $CI_REGISTRY_IMAGE
  stage: test
  script:
    - npm run test:ci
  cache:
    key: some-key
    paths:
      - $CI_PROJECT_DIR/node_modules
  artifacts:
    paths:
      - packages/mypackage/test-report.html
    expire_in: 1 week
This uses GitLab CI's cache feature to share node_modules across your jobs and across your pipelines.
You can control how the cache is used and shared across pipelines and jobs by changing the key (read more about caching in GitLab's docs).
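For example, a sketch of keying the cache on the lock file so dependencies are reinstalled only when they change; key:files is standard GitLab CI syntax, while the paths are assumptions based on the snippet above:

cache:
  key:
    files:
      - package-lock.json          # cache is invalidated whenever the lock file changes
  paths:
    - $CI_PROJECT_DIR/node_modules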
I'm trying to dockerize my create-react-app development environment while preserving hot reload. According to most guides (and this guy), the most direct way is docker run -p 3000:3000 -v "$(pwd):/var/www" -w "/var/www" node npm start in the project folder.
However, I'm getting this error instead:
$ docker run -p 3000:3000 -v "$(pwd):/var/www" -w "/var/www" node npm start
> my-app@0.1.0 start /var/www
> react-scripts start
sh: 1: react-scripts: Input/output error
npm ERR! code ELIFECYCLE
npm ERR! syscall spawn
npm ERR! file sh
npm ERR! errno ENOENT
npm ERR! my-app@0.1.0 start: `react-scripts start`
npm ERR! spawn ENOENT
npm ERR!
npm ERR! Failed at the my-app@0.1.0 start script.
npm ERR! This is probably not a problem with npm. There is likely additional logging output above.
npm ERR! A complete log of this run can be found in:
npm ERR! /root/.npm/_logs/2020-04-02T06_55_22_257Z-debug.log
I'm running on Windows. I believe mounting the volume might have some permission issues leading to the input/output error, but testing various settings didn't work out. I'm honestly stumped. All I want is to run my app in Docker with hot reload for development.
As it turns out, setting up create-react-app in Docker takes a little more work.
The primary issue is that mounted volumes are not available during the build step, so when npm start first runs, the mounted project files technically don't exist yet.
As such, you need to copy the project in and install it first, so it can run once before the volume is mounted. Hot reloading works normally afterwards.
Here's my final working setup:
docker-compose.yml:
create-react-app:
  build:
    context: create-react-app
  ports:
    - 3000:3000
  environment:
    - NODE_PATH=/node_modules
    - CHOKIDAR_USEPOLLING=true
  volumes:
    - ./create-react-app:/create-react-app
Dockerfile:
FROM node:alpine
# Extend PATH
ENV PATH=$PATH:/node_modules/.bin
# Set working directory
WORKDIR /client
# Copy project files for build
ADD . .
# Install dependencies
RUN npm install
# Run create-react-app server
CMD ["npm", "run", "start"]
I'm doing the "try it out" section on the LoopBack 4 official site:
https://loopback.io/doc/en/lb4/Authentication-Tutorial.html
But when I try to execute the following command:
npm run docker:start
I get the following error:
> loopback4-example-shopping-monorepo@1.1.1 docker:start C:\Users\jmlascasas\Documents\Laboratorio Hanuman\loopback4-example-shopping
> ./bin/start-dbs.sh
'.' is not recognized as an internal or external command,
operable program or batch file.
npm ERR! code ELIFECYCLE
npm ERR! errno 1
npm ERR! loopback4-example-shopping-monorepo@1.1.1 docker:start: `./bin/start-dbs.sh`
npm ERR! Exit status 1
npm ERR!
npm ERR! Failed at the loopback4-example-shopping-monorepo@1.1.1 docker:start script.
npm ERR! This is probably not a problem with npm. There is likely additional logging output above.
npm ERR! A complete log of this run can be found in:
npm ERR! C:\Users\jmlascasas\AppData\Roaming\npm-cache\_logs\2019-11-27T10_43_56_262Z-debug.log
I tried to change the path and searched for answers on Google, but nothing solved my problem.
npm run docker:start runs a bash script behind the scenes; if you look into package.json,
"docker:start": "./bin/start-dbs.sh",
Since you are on Windows, it may be that the shell script doesn't run there. You can try two options:
Run the command in Git Bash
Run the commands manually without npm
For the second option, you can run the commands that npm run docker:start executes (see package.json):
docker run --name mongo -p 27017:27017 -d mongo:latest
docker run --name redis -p 6379:6379 -d redis:latest
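For the first option, a quick sketch, assuming Git for Windows (which ships Git Bash) is installed:

# open a Git Bash prompt in the repository root, then run the script through npm as usual
npm run docker:start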
I'm fairly new to Docker and I'm experimenting with an Angular CLI app. I managed to run it locally through my Docker container. It works great, but when I try running it from my server it fails.
Server is hosted on DigitalOcean:
512 MB Memory / 20 GB Disk / FRA1 - Ubuntu Docker 17.03.0-ce on 14.04
I used dockerhub to transfer my container to the server.
When logging the container it gives me this:
** NG Live Development Server is running on http://0.0.0.0:4200. **
63% building modules 469/527 modules 58 active ...s/@angular/compiler/src/assertions.jsKilled
npm info lifecycle angular-test@0.0.0~start: Failed to exec start script
npm ERR! Linux 4.4.0-64-generic
npm ERR! argv "/usr/local/bin/node" "/usr/local/bin/npm" "start"
npm ERR! node v6.10.3
npm ERR! npm v3.10.10
npm ERR! code ELIFECYCLE
npm ERR! angular-test@0.0.0 start: `ng serve --host 0.0.0.0`
npm ERR! Exit status 137
npm ERR!
npm ERR! Failed at the angular-test@0.0.0 start script 'ng serve --host 0.0.0.0'.
npm ERR! Make sure you have the latest version of node.js and npm installed.
npm ERR! If you do, this is most likely a problem with the angular-test package,
npm ERR! not with npm itself.
npm ERR! Tell the author that this fails on your system:
npm ERR! ng serve --host 0.0.0.0
npm ERR! You can get information on how to open an issue for this project with:
npm ERR! npm bugs angular-test
npm ERR! Or if that isn't available, you can get their info via:
npm ERR! npm owner ls angular-test
npm ERR! There is likely additional logging output above.
npm ERR! Please include the following file with any support request:
npm ERR! /usr/src/app/npm-debug.log
Here is my Dockerfile:
# Create image based on the official Node 6 image from dockerhub
FROM node:6
# Create a directory where our app will be placed
RUN mkdir -p /usr/src/app
# Change directory so that our commands run inside this new directory
WORKDIR /usr/src/app
# Copy dependency definitions
COPY package.json /usr/src/app
# Install dependencies
RUN npm install
# Get all the code needed to run the app
COPY . /usr/src/app
# Expose the port the app runs in
EXPOSE 4200
# Serve the app
CMD ["npm", "start"]
How come it runs locally but fails on the server? Am I missing some dependencies?
ng serve is an Angular CLI command. I'm guessing you need to install it globally in your Dockerfile if you want to start your server like that on DigitalOcean:
RUN npm i -g angular-cli
I think it would be more typical to simply run the app using the naked node server in production. So your CMD would look more like this:
CMD ["node", "app.js"]