npm install fails with ChromeDriver installation failed error [duplicate] - docker

This question already has answers here:
Is there a way to make npm install (the command) to work behind proxy?
(31 answers)
Closed 4 days ago.
I'm building a Vue.js application using Docker as follows:
# build stage
FROM node:10.15.0 as build-stage
WORKDIR /app
COPY . .
RUN npm install
RUN npm run build
# production stage
FROM nginx:1.13.12-alpine as production-stage
.
.
.
Since there is no direct internet access in this environment, I have added the proxy settings below after the COPY command.
ENV HTTP_PROXY http://<proxy-ip>:3128
ENV HTTPS_PROXY http://<proxy-ip>:3128
ENV http_proxy http://<proxy-ip>:3128
ENV https_proxy http://<proxy-ip>:3128
But it still seems unable to connect to the internet fully. Below is the error I get during the npm install step.
I have tried changing the proxy configuration as follows, but the issue is still the same:
RUN npm config set proxy http://<proxy-ip>:3128
RUN npm config set http-proxy http://<proxy-ip>:3128
RUN npm config set https-proxy http://<proxy-ip>:3128

You can try pointing npm at the HTTP version of the registry. The proxy config is still needed, but add the following command before "npm install":
RUN npm config set registry http://registry.npmjs.org/
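Putting the pieces together, here is a sketch of how the proxy and registry settings might combine in the build stage; the proxy address is a placeholder you would substitute with your own:

```dockerfile
# build stage -- proxy and registry settings must come before npm install
FROM node:10.15.0 as build-stage
WORKDIR /app
COPY . .
# placeholder proxy address
ENV HTTP_PROXY=http://<proxy-ip>:3128 \
    HTTPS_PROXY=http://<proxy-ip>:3128
RUN npm config set proxy http://<proxy-ip>:3128 \
 && npm config set https-proxy http://<proxy-ip>:3128 \
 && npm config set registry http://registry.npmjs.org/
RUN npm install
RUN npm run build
```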

Related

Trying to install npm packages in local docker environment + content filtering by internet provider

I want to run Docker locally, but my internet service provider does content filtering, so it doesn't work correctly.
My Dockerfile:
FROM node:15.14
COPY . .
ENV NODE_ENV=development
RUN npm install --unsafe-perm
RUN npm run build
RUN npm i -g pm2
I tried docker run x and got this error:
certificate signed by unknown authority.
Can somebody please tell me how to solve it?
I found a solution.
I added these lines to the Dockerfile:
ADD your_provider_certificate_sign_path local_path
RUN cat local_path | sh
ENV NODE_EXTRA_CA_CERTS=/etc/ca-bundle.crt
ENV REQUESTS_CA_BUNDLE=/etc/ca-bundle.crt
ENV SSL_CERT_FILE=/etc/ca-bundle.crt
and it ran successfully.
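As an alternative sketch: on Debian-based Node images, the more conventional route is to install the provider's CA certificate into the system trust store and then point Node at the resulting bundle. The certificate filename here is a placeholder:

```dockerfile
FROM node:15.14
# placeholder filename for your provider's CA certificate
COPY your_provider_ca.crt /usr/local/share/ca-certificates/your_provider_ca.crt
# regenerate the system bundle so the new CA is trusted
RUN update-ca-certificates
# Node keeps its own CA list, so point it at the system bundle as well
ENV NODE_EXTRA_CA_CERTS=/etc/ssl/certs/ca-certificates.crt
```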

Docker build failing with error at array-slice

I'm running a docker build on Windows and npm failed at a step involving array-slice. I installed the dependency manually and re-ran the build; it then failed on buffer-from.
The error says it cannot access the dependency URL, but the URL works when I test it in a browser, which suggests my Docker build process is having trouble reaching some URLs for some reason.
npm WARN server@1.0.0 No description
npm WARN server@1.0.0 No repository field.
npm ERR! network timeout at: https://registry.npmjs.org/array-unique/-/array-unique-0.3.2.tgz
Is there any necessary additional config to make sure npm can access those dependencies?
This is my Dockerfile content:
FROM node:12-alpine
WORKDIR /usr/app
COPY package.json package-lock.json ./
RUN npm install
COPY . .
EXPOSE 3333
CMD ["npm", "run", "dev"]
It was just an installation issue on Windows. I removed everything, including VirtualBox, and installed it all again; it seems to work fine now.
The only problem is that my app is still not exposed to the port I've requested, but that's another question.
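On that last point, it may be worth noting that EXPOSE in a Dockerfile is only documentation; it does not actually publish the port. The port still has to be mapped at run time (the image name here is a placeholder):

```shell
# map container port 3333 to host port 3333
docker run -p 3333:3333 my-image
# or publish all EXPOSEd ports to random host ports
docker run -P my-image
```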

Could not find a required file

I'm trying to run a Docker container with a create-react-app app. The app works fine on its own, and here's what my Dockerfile looks like.
# base image
FROM node:12.2.0-alpine
# set working directory
WORKDIR ./
# add `./node_modules/.bin` to $PATH
ENV PATH ./node_modules/.bin:$PATH
# install and cache dependencies
COPY package.json ./package.json
COPY ./build/* ./public/
RUN npm install --silent
RUN npm install react-scripts@3.0.1 -g
# start
CMD ["npm", "start"]
When I run Docker I'm getting this error:
> my-app@0.1.0 start /
> react-scripts start
Could not find a required file.
Name: index.js
Searched in: /src
npm ERR! code ELIFECYCLE
npm ERR! errno 1
npm ERR! my-app@0.1.0 start: `react-scripts start`
npm ERR! Exit status 1
npm ERR!
npm ERR! Failed at the my-app@0.1.0 start script.
npm ERR! This is probably not a problem with npm. There is likely additional logging output above.
npm ERR! A complete log of this run can be found in:
npm ERR! /root/.npm/_logs/2019-07-14T08_29_30_761Z-debug.log
Does anybody have any idea?
npm start is for webpack, which serves as the dev server. You are still using the src files directly, not the minified build (dist), which will only be used in production.
#Dockerfile.dev:
# base image
FROM node:12.2.0-alpine
# set working directory
WORKDIR ./
# add `./node_modules/.bin` to $PATH
ENV PATH ./node_modules/.bin:$PATH
COPY package.json ./package.json
#use the minified build file for production, not now - npm start is for development.
#COPY ./build/* ./public/
#install dependencies:
RUN npm install --silent
RUN npm install react-scripts@3.0.1 -g
#copy your project files: (also bad for development, use volume(https://docs.docker.com/storage/volumes/) instead)
COPY . .
# start
CMD ["npm", "start"]
(This builds on @EfratLevitan's answer, but is a little more production-oriented. Their answer will be better if you want to use Docker as a core part of your development flow.)
If you have a working Webpack setup already, its output is static files that can be served up by any Web server. Once you've successfully run npm run build, you can use anything to serve the resulting build directory: serve it as static content from something like a Flask application, put it in a cloud service like Amazon S3 that can serve it for you, or host it directly yourself. Any of the techniques described on the CRA Deployment page will work just fine in conjunction with a Docker-based backend.
If you'd like to serve this yourself via Docker, you don't need Node to serve the build directory, so a plain Web server like nginx will work fine. The two examples from the image description work for you here:
# Just use the image and inject the content as data
docker run -v $PWD/build:/usr/share/nginx/html -p 80:80 nginx
# Build an image with the content "baked in"
cat >Dockerfile <<EOF
FROM nginx
COPY ./build /usr/share/nginx/html
EOF
# Run it
docker build -t me/nginx .
docker run -p 80:80 me/nginx
The all-Docker equivalent to this is to use a multi-stage build to run the Webpack build inside Docker, then copy it out to a production Web server image.
FROM node:12.2.0-alpine AS build
WORKDIR /app
COPY package.json yarn.lock ./
RUN npm install --silent
RUN npm install react-scripts@3.0.1 -g
COPY . .
RUN npm run build
FROM nginx
COPY --from=build /app/build /usr/share/nginx/html
In this model you'd develop your front-end code locally. (The Webpack/CRA stack has pretty minimal host dependencies, and since the application runs in the user's browser, it can't depend on Docker-specific networking features.) You'd only build this Dockerfile once you wanted to run an end-to-end test with all of the parts running together before you actually pushed out to production.

Webpack app in docker needs environment variables before it can be built

New to docker so maybe I'm missing something obvious...
I have an app split into a web client and a back end server. The back end is pretty easy to create an image for via a Dockerfile:
COPY . .
RUN npm install && npm run build
CMD npm run start
The already-built back end app will then access the environment variables at runtime.
With the web client it's not as simple, because webpack needs to have the environment variables before the application is built. As far as I'm aware, this leaves only two options:
Require the user to build their own image from the application source
Build the web client when the container runs, by running npm run build in CMD
Currently I'm doing #2, but both options seem wrong to me. What's the best solution?
FROM node:latest
COPY ./server /app/server
COPY ./web /app/web
WORKDIR /app/web
CMD ["sh", "-c", "npm install && npm run build && cd ../server && npm install && npm run build && npm run start"]
First, it would be a good idea for both the backend server and web client to each have their own Dockerfile/image. Then it would be easy to run them together using something like docker-compose.
The way you are going to want to provide environment variables to the web Dockerfile is by using build arguments. Docker build arguments are available when building the Docker image. You use these by specifying the ARG key in the Dockerfile, or by passing the --build-arg flag to docker build.
Here is an example Dockerfile for your web client based on what you provided:
FROM node:latest
ARG NODE_ENV=dev
COPY ./web /app/web
WORKDIR /app/web
RUN npm install \
&& npm run build
CMD ["npm", "run", "start"]
This Dockerfile uses the ARG directive to create a variable with a default value of dev.
The value of NODE_ENV can then be overridden when running docker build, like so:
docker build -t <myimage> --build-arg NODE_ENV=production .
Whether you override it or not NODE_ENV will be available to webpack before it is built. This allows you to build a single image, and distribute it to many people without them having to build the web client.
Hopefully this helps you out.

vue pwa in docker - cannot find module 'chalk'

This is my first question on Stack Overflow. Thank you all for this absolutely fantastic forum!
I'm trying to get a Vue PWA running in Docker. I used the vue-cli to set up the PWA application; installing and running locally is no problem.
Then I tried to dockerize the project with the following Dockerfile:
# Start with a Node.js image.
FROM node:10
# Make directory to install npm packages
RUN mkdir /install
ADD ["./code/package.json", "/install"]
WORKDIR /install
RUN npm install --verbose
ENV NODE_PATH=/install
# Copy all our files into the image.
RUN mkdir /code
WORKDIR /code
COPY . /code/
EXPOSE 8080
CMD npm run dev
The problem is that when I start up the composition I get this error:
web_1 | internal/modules/cjs/loader.js:573
web_1 | throw err;
web_1 | ^
web_1 |
web_1 | Error: Cannot find module 'chalk'
...
I have tried different approaches for a few days now, but I can't see any solution. Am I missing something? Is there an incompatibility?
I also tried switching completely to yarn, but the effect is the same, so I don't think there is a problem with installing the packages. Could there be a problem with the NODE_PATH variable?
Thanks for your support in advance!
I was facing the same issue.
Normally you wouldn't install any devDependencies for production, so when NODE_ENV=production, npm/Yarn will not install devDependencies.
For the Docker use case, when we build a static site within the Docker container, we might need NODE_ENV=production to substitute some production variables, so we need NODE_ENV=production but also need the dev dependencies installed.
Some possible solutions:
1 - Move everything from devDependencies to dependencies.
2 - Do not set NODE_ENV=production for yarn install / npm install; only set it after module installation.
3 - For Yarn: NODE_ENV=production yarn install --production=false; there should be an npm equivalent.
4 - (Not tested) Use some other name, e.g. NODE_ENV=prod, instead of the full name production, but you might need to adjust other configs that rely on NODE_ENV=production.
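For what it's worth, npm's `--production=false` flag appears to be the equivalent of the Yarn flag in option 3. A minimal sketch of option 2, assuming a standard package.json with a build script:

```dockerfile
FROM node:14
WORKDIR /app
COPY package*.json ./
# NODE_ENV is deliberately not set yet, so devDependencies are installed too
RUN npm install
COPY . .
# only the build step itself runs with NODE_ENV=production
RUN NODE_ENV=production npm run build
```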
