I am unable to build my Docker images through Azure Pipelines; the build fails with an authentication error.
When I copy the SSH key locally and run the build, it succeeds. However, when I run it through the pipeline, it fails with the error below. Can someone please help me with this?
Building the Docker image locally also works. Please help me get it building through the Azure pipeline.
Dockerfile:
FROM node:16.13.2 as build
WORKDIR /usr/local/app
COPY ./ /usr/local/app/
RUN npm i -g npm@8.1.2
RUN npm cache clear --force
RUN npm install --legacy-peer-deps
RUN npm run build
FROM nginx:latest
COPY --from=build /usr/local/app/dist/angular-app /usr/share/nginx/html
EXPOSE 80
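One pattern worth noting here (assuming the npm install step is what needs SSH, e.g. to pull private Git dependencies) is Docker BuildKit's SSH forwarding, which lends the agent's key to a single RUN step without ever writing it into a layer. A sketch only; the image tag and the github.com host are placeholders:
# syntax=docker/dockerfile:1
FROM node:16.13.2 as build
WORKDIR /usr/local/app
COPY ./ /usr/local/app/
# Trust the Git host so SSH can verify it (github.com is an assumption)
RUN mkdir -p -m 0700 ~/.ssh && ssh-keyscan github.com >> ~/.ssh/known_hosts
# The key is only mounted for this step and is not stored in the image
RUN --mount=type=ssh npm install --legacy-peer-deps
RUN npm run build
Build it with the key loaded into an ssh-agent:
DOCKER_BUILDKIT=1 docker build --ssh default -t my-app -f ./Dockerfile .
In Azure Pipelines, the InstallSSHKey@0 task can load the same key into the agent's ssh-agent before the docker build step runs.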
Related
Whenever I run the following command to create a Docker image,
docker build -t ehi-member-portal:v1.0.0 -f ./Dockerfile .
I get the following results
I'm not sure why it is complaining about the Node version, because I am currently running:
And I am not sure why it is detecting v12.14.1 when, as you can see, I am running v14.20.0. I installed Node and npm using nvm. I used this site as a reference for how to create the Node and nginx image for a container.
Here is the contents of my Dockerfile:
FROM node:12.14-alpine AS builder
WORKDIR /dist/src/app
RUN npm cache clean --force
COPY . .
RUN npm install
RUN npm run build --prod
FROM nginx:latest AS ngi
COPY --from=builder /dist/ehi-member-portal /usr/share/nginx/html
COPY /nginx.conf /etc/nginx/conf.d/default.conf
EXPOSE 80
Here is more version information:
Any help would be HIGHLY appreciated. I need to figure this out.
RUN npm run build --prod is executed INSIDE the Docker container, and the Node inside it is not the required version.
Also, you clearly state that you want to use Node v12 with
FROM node:12.14-alpine AS builder
so it is "detected" as 12 because that is the Node version inside the container. Bump the version. You can use any of the images listed here:
https://hub.docker.com/_/node
eg
FROM node:14.20.0-alpine AS builder
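If you want to double-check which Node and npm versions a base image actually ships before using it in your Dockerfile, you can ask the image directly (a quick sanity check, not part of the original answer):
docker run --rm node:14.20.0-alpine node --version
docker run --rm node:14.20.0-alpine npm --version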
I am trying to create a Docker image of my Strapi project, which uses a cloud-hosted MongoDB Atlas database. Below is my Dockerfile:
FROM strapi/base
COPY ./ ./
RUN npm install
RUN npm run build
RUN npm run start:develop
CMD ["npm","start"]
I am running the command below to build the image:
docker build .
I am not receiving any error, but the image build never completes; it gets stuck at http://localhost:1337. How can I resolve this? I have attached the screenshot. TIA :)
Your RUN npm run start:develop step never finishes because it starts the server.
You can either move that command into your CMD and remove your existing CMD ["npm","start"], or simply remove the step. It depends on your case.
Try the following Dockerfile:
FROM strapi/base
COPY ./ ./
RUN npm install
RUN npm run build
CMD ["npm","start"]
or
FROM strapi/base
COPY ./ ./
RUN npm install
RUN npm run build
CMD ["npm"," run", "start:develop]
I'm trying to build a Docker image for an Ionic project on an Ubuntu VirtualBox VM. Here's the Dockerfile:
# Build
FROM beevelop/ionic AS ionic
# Create the application directory
WORKDIR /usr/src/app
# Install the application dependencies
# We can use wildcard to ensure both package.json AND package-lock.json are considered
# where available (npm@5+)
COPY package*.json ./
RUN npm --verbose ci
# Bundle app source
COPY . .
RUN ionic build
## Run
FROM nginx:alpine
#COPY www /usr/share/nginx/html
COPY --from=ionic /usr/src/app/www /usr/share/nginx/html
My problem is that the build gets stuck on step 4 (RUN npm --verbose ci). It starts downloading some packages, but then it hangs at some point.
I tried different solutions:
npm clean cache
npm config set registry http://registry.npmjs.org/
removing package-lock.json
But nothing works; any help will be greatly appreciated. Thanks in advance.
To whoever experiences this problem: it was due to the internet connection. Keep retrying until all the packages download.
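If the connection is flaky rather than completely down, another option (an assumption, not something tried in this thread) is to make npm retry harder before giving up by raising its retry settings just before the install step:
# Retry failed downloads more often and wait longer between attempts
RUN npm config set fetch-retries 5 \
 && npm config set fetch-retry-maxtimeout 120000
RUN npm --verbose ci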
I am struggling to get my build deployed to AWS on Docker. I have no idea where the solution lies, as this is my first time with Docker. I have it all working fine locally, but when I deploy I get the following error in Elastic Beanstalk:
2020/04/30 05:35:02.330900 [ERROR] An error occurred during execution of command [app-deploy] - [Docker Specific Build Application]. Stop running the command. Error: failed to pull docker image: Command /bin/sh -c docker pull node:13.3.0 AS compile-image failed with error exit status 1. Stderr:"docker pull" requires exactly 1 argument.
See 'docker pull --help'.
This is what my Dockerfile looks like:
FROM node:13-alpine as builder
WORKDIR /opt/ng
COPY package.json package-lock.json ./
RUN npm install
ENV PATH="./node_modules/.bin:$PATH"
COPY . ./
RUN ng build --prod
FROM nginx:1.18-alpine
COPY nginx.conf /etc/nginx/conf.d/default.conf
COPY --from=builder /opt/ng/dist/angular-universal-app/browser /usr/share/nginx/html
Can someone please point me in the right direction? Or are multi-stage builds not supported by Elastic Beanstalk's Docker version?
I had the same problem.
Actually, I checked the following lines in my log file:
2020/05/26 17:26:30.327310 [INFO] Running command /bin/sh -c docker pull node:alpine as builder
2020/05/26 17:26:30.369280 [ERROR] "docker pull" requires exactly 1 argument.
As you can see, it tries to run 'docker pull' with 3 arguments:
node:alpine
as
builder
and of course that is not possible, because it requires exactly 1 argument. Apparently, AWS Elastic Beanstalk doesn't support stage naming here, so I solved it by using an unnamed builder:
FROM node:13-alpine
and at the end:
COPY --from=0 /opt/ng/dist/angular-universal-app/browser /usr/share/nginx/html
Final Dockerfile:
FROM node:13-alpine
WORKDIR /opt/ng
COPY package.json package-lock.json ./
RUN npm install
ENV PATH="./node_modules/.bin:$PATH"
COPY . ./
RUN ng build --prod
FROM nginx:1.18-alpine
COPY nginx.conf /etc/nginx/conf.d/default.conf
COPY --from=0 /opt/ng/dist/angular-universal-app/browser /usr/share/nginx/html
That solution works for me. If someone still has a problem, please share the last 100 lines of the log.
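If it helps, the same image can be built and smoke-tested locally before uploading it to Elastic Beanstalk (the tag and host port are arbitrary):
docker build -t angular-app .
docker run --rm -p 8080:80 angular-app
# then open http://localhost:8080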
I have seen this error when using a solution stack that uses 'Amazon Linux 2'. These platforms are new and have some ongoing issues.
Amazon Linux 2 support for AWS Elastic Beanstalk is in beta release and is subject to change.
https://docs.aws.amazon.com/elasticbeanstalk/latest/platforms/platforms-beta.html
Please use a solution stack that has 'Amazon Linux' in the name. You should not face the issue there.
I'm trying to run a Docker container with a create-react-app app. The app works fine, and here's what my Dockerfile looks like.
# base image
FROM node:12.2.0-alpine
# set working directory
WORKDIR ./
# add `./node_modules/.bin` to $PATH
ENV PATH ./node_modules/.bin:$PATH
# install and cache dependencies
COPY package.json ./package.json
COPY ./build/* ./public/
RUN npm install --silent
RUN npm install react-scripts@3.0.1 -g
# start
CMD ["npm", "start"]
When I run the container I'm getting this error:
> my-app@0.1.0 start /
> react-scripts start
Could not find a required file.
Name: index.js
Searched in: /src
npm ERR! code ELIFECYCLE
npm ERR! errno 1
npm ERR! my-app@0.1.0 start: `react-scripts start`
npm ERR! Exit status 1
npm ERR!
npm ERR! Failed at the my-app@0.1.0 start script.
npm ERR! This is probably not a problem with npm. There is likely additional logging output above.
npm ERR! A complete log of this run can be found in:
npm ERR! /root/.npm/_logs/2019-07-14T08_29_30_761Z-debug.log
Does anybody have any idea?
npm start is for webpack, which serves as the dev server. You are still using the src files directly, not the minified build (dist), which will only be used in production.
#Dockerfile.dev:
# base image
FROM node:12.2.0-alpine
# set working directory
WORKDIR ./
# add `./node_modules/.bin` to $PATH
ENV PATH ./node_modules/.bin:$PATH
COPY package.json ./package.json
#use the minified build file for production, not now - npm start is for development.
#COPY ./build/* ./public/
#install dependencies:
RUN npm install --silent
RUN npm install react-scripts@3.0.1 -g
# copy your project files: (also bad for development, use a volume (https://docs.docker.com/storage/volumes/) instead; see the run sketch after this Dockerfile)
COPY . .
# start
CMD ["npm", "start"]
(This builds on @EfratLevitan's answer, but is a little more production-oriented. Their answer will be better if you want to use Docker as a core part of your development flow.)
If you have a working Webpack setup already, its output is static files that can be served up by any Web server. Once you've successfully run npm run build, you can use anything to serve the resulting build directory – serve it as static content from something like a Flask application, put it in a cloud service like Amazon S3 that can serve it for you, directly host it yourself. Any of the techniques described on the CRA Deployment page will work just fine in conjunction with a Docker-based backend.
If you'd like to serve this yourself via Docker, you don't need Node to serve the build directory, so a plain Web server like nginx will work fine. The two examples from the nginx image description both work here:
# Just use the image and inject the content as data
docker run -v $PWD/build:/usr/share/nginx/html -p 80:80 nginx
# Build an image with the content "baked in"
cat >Dockerfile <<EOF
FROM nginx
COPY ./build /usr/share/nginx/html
EOF
# Run it
docker build -t me/nginx .
docker run -p 80:80 me/nginx
The all-Docker equivalent to this is to use a multi-stage build to run the Webpack build inside Docker, then copy it out to a production Web server image.
FROM node:12.2.0-alpine AS build
WORKDIR /app
COPY package.json yarn.lock ./
RUN npm install --silent
RUN npm install react-scripts@3.0.1 -g
COPY . .
RUN npm run build
FROM nginx
COPY --from=build /app/build /usr/share/nginx/html
In this model you'd develop your front-end code locally. (The Webpack/CRA stack has pretty minimal host dependencies, and since the application runs in the user's browser, it can't depend on Docker-specific networking features.) You'd only build this Dockerfile once you wanted to run an end-to-end test with all of the parts running together before you actually pushed out to production.
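For completeness, a build-and-run sketch for that multi-stage image (the tag is arbitrary); nginx serves the compiled bundle on container port 80:
docker build -t my-frontend .
docker run --rm -p 8080:80 my-frontend
# the production build is now served at http://localhost:8080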