Node.js: copy packages to a Docker image

I have the following file structure
...
- public/
- app/
- docker/
  - node-js/Dockerfile
- docker-compose.yml
- package.json
In my Dockerfile I have logic to copy package.json and run npm install:
FROM node:12.0.0-alpine
MAINTAINER Bogdan Dubyk <bogdan.dubyk@gmail.com>
COPY package.json /var/www/frontend/
RUN npm install
CMD [ "npm", "start" ]
but while building the image I'm getting this error: ERROR: Service 'node-js' failed to build: COPY failed: stat /var/lib/docker/tmp/docker-builder184577258/package.json: no such file or directory. It looks like the Dockerfile only sees files inside its own folder. Is it possible to copy files from outside that folder?
I also tried COPY ../../package.json /var/www/frontend/, but got another error: ERROR: Service 'node-js' failed to build: COPY failed: Forbidden path outside the build context: ../../package.json ()
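A likely fix, following the same build-context rule explained in the answers below: keep the Dockerfile in docker/node-js/ but make the project root the build context in docker-compose.yml, so package.json is inside the context (a sketch; the service name is taken from the error message, adjust paths to your layout):
version: "3.7"
services:
  node-js:
    build:
      context: .
      dockerfile: docker/node-js/Dockerfile
With the context at the project root, COPY package.json /var/www/frontend/ can find the file; you will probably also want a WORKDIR /var/www/frontend in the Dockerfile so RUN npm install runs next to the copied package.json.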

Related

Missing file in my Dockerfile: what's the path it's running from?

My git repo structure:
- app
  - app.server
    - server files
  - app.client
    - node_modules
    - public
    - src
    - .dockerignore
    - Dockerfile
    - package.json
    - package-lock.json
I've set up CI/CD with GitHub Actions, but something is wrong in my Docker image for my client application (React).
Error message: COPY failed: file not found in build context or excluded by .dockerignore: stat package.json: file does not exist
My .dockerignore file:
node_modules
build
.dockerignore
Dockerfile
Dockerfile.prod
My Dockerfile:
# pull official base image
FROM node:13.12.0-alpine
# set working directory
WORKDIR /app.client
# add `/app.client/node_modules/.bin` to $PATH
ENV PATH /app.client/node_modules/.bin:$PATH
# install app dependencies
COPY package.json ./
COPY package-lock.json ./
RUN npm install --silent
RUN npm install react-scripts@3.4.1 -g --silent
# add app
COPY ./ ./
# start app
CMD ["npm", "start"]
My GitHub action command for invoking the Dockerfile:
docker build app.client/ -t mycontainerregistry.azurecr.io/appdb:${{ github.sha }}
This is part of a publish to Azure Container Registry that I'm trying to learn. I guess the Dockerfile works, because before failing at step 4/9 it goes through the Dockerfile:
Step 1/9 : FROM node:13.12.0-alpine
13.12.0-alpine: Pulling from library/node
aad63a933944: Pulling fs layer
... (and so on)
Step 3/6 : COPY package.json ./
COPY failed: file not found in build context or excluded by .dockerignore: stat package.json: file does not exist
Error: Process completed with exit code 1.
WORKDIR refers to the working directory in the container. The working directory on the host is still the root of your repo.
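A quick way to narrow this down is to list the context directory on the runner before building, and make the path match where app.client really sits relative to the repo root. A sketch, assuming app/ is a folder inside the repo as the tree above suggests (if app is the repo root itself, the original app.client/ path is fine and the usual culprit is a missing actions/checkout step):
ls app/app.client        # confirm package.json is actually here on the runner
docker build app/app.client/ -t mycontainerregistry.azurecr.io/appdb:${{ github.sha }}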

Webpack inside a docker container "can't resolve './src'"

I'm trying to run a one-off webpack from within a docker container to generate a single bundle file. Unfortunately, webpack won't run inside the container based on the image I've configured.
Dockerfile ("DockerfileBuild"):
FROM node:10-alpine
COPY package.json ./
RUN npm install
RUN ["npm", "run", "build"]
docker-compose.yml:
version: "3.7"
services:
  dist:
    build:
      context: .
      dockerfile: DockerfileBuild
If I run docker-compose up dist I get ERROR in Entry module not found: Error: Can't resolve './src' in '/'.
I assume I haven't set up my image properly, but at this point I don't know what to do.
Notes:
The npm install seems to run ok beforehand.
The bundling runs ok outside the container.
You need to have your actual source files copied into the image before running npm run build. I assume you have some reference to ./src in package.json, which would explain such an error.
Try copying everything you need before the RUN command (you can start by copying everything with COPY . ., but you may want to copy only ./src; that's up to you).
FROM node:10-alpine
COPY package.json ./
RUN npm install
COPY . .
RUN ["npm", "run", "build"]
Since version 4.0.0, webpack doesn't need a configuration file, and the default entry point is './src/index.js'. Based on the error you are getting, it looks like you are not copying webpack.config.js into the Docker image, so webpack falls back to looking for the default index.js in the './src' folder, which is why you are getting this error. If you copy webpack.config.js (along with your sources) it should work.
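A narrower variant of the fix above, sketched on the assumption that the sources live in ./src with a webpack.config.js at the project root (adjust the copied paths to whatever your build actually references):
FROM node:10-alpine
# give the build a working directory instead of running in /
WORKDIR /app
COPY package.json ./
RUN npm install
# copy only what the build needs rather than the whole context
COPY webpack.config.js ./
COPY src ./src
RUN ["npm", "run", "build"]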

How do I properly include an npm run test command in docker run?

I am sure I have run this command before, but I tested the following command in my terminal and got this error:
✗ docker run aa1112d76852 npm run test -- --coverage
ERRO[0001] error waiting for container: context canceled
docker: Error response from daemon: OCI runtime create failed: container_linux.go:344: starting container process caused "exec: \"npm\": executable file not found in $PATH": unknown.
I am concerned because this is the command (apart from the image ID) that I will be placing in my .travis.yml file. Where is the error in how I put it together this time?
This is my Dockerfile configuration:
FROM node:alpine as builder
WORKDIR '/app'
COPY package.json .
RUN npm install
COPY . .
RUN npm run build
FROM nginx
COPY --from=builder /app/build /usr/share/nginx/html
This is my docker-compose.yml file:
version: "3"
services:
  web:
    build: .
    ports:
      - "3000:3000"
    volumes:
      - /app/node_modules
      - .:/app
So this worked previously because I was building from Dockerfile.dev, which has this crucial last command:
FROM node:alpine
WORKDIR '/app'
COPY package.json .
RUN npm install
COPY . .
CMD ["npm", "run", "start"]
Whereas the new container I was using was built from Dockerfile which has this configuration:
FROM node:alpine as builder
WORKDIR '/app'
COPY package.json .
RUN npm install
COPY . .
RUN npm run build
FROM nginx
COPY --from=builder /app/build /usr/share/nginx/html
Notice the missing CMD ["npm", "run", "start"].
So the command should work in my .travis.yml file because I build it with my Dockerfile.dev like so:
before_install:
- docker build -t danale/docker-react -f Dockerfile.dev .
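The matching script section would then be roughly (a sketch; -e CI=true is an assumption so that react-scripts test exits after one run instead of watching for changes):
script:
  - docker run -e CI=true danale/docker-react npm run test -- --coverage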
Just for context, this question is related to the "Docker and Kubernetes: The Complete Guide" course on Udemy which is fairly well-recommended on Reddit etc.
This error occurs when you pass the image ID of the production build (created from the Dockerfile) rather than the ID of the image built from Dockerfile.dev:
root@ubuntu-docker:/home/paul/frontend# docker run USERNAME/docker-react npm run test
docker: Error response from daemon: OCI runtime create failed: container_linux.go:345: starting container process caused "exec: \"npm\": executable file not found in $PATH": unknown.
root@ubuntu-docker:/home/paul/frontend# docker run USERNAME/docker-react-dev npm run test
> frontend@0.1.0 test /app
> react-scripts test
PASS src/App.test.js
✓ renders without crashing (92ms)
Test Suites: 1 passed, 1 total
Tests: 1 passed, 1 total
Snapshots: 0 total
Time: 3.794s
Ran all test suites.
The reason is that the production Dockerfile is a multi-stage build whose final stage is FROM nginx: the finished image contains only nginx and the static build output, so node and npm simply aren't installed in it, and overriding the command with npm run test fails with "executable file not found in $PATH". The image built from Dockerfile.dev is based on node:alpine, so npm is available and the test command works.
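If you do want to run tests against the production Dockerfile, one option (not from the original answers, just a sketch) is to build only its node-based builder stage with --target and run the tests in that image:
docker build --target builder -t danale/docker-react-test .
docker run -e CI=true danale/docker-react-test npm run test -- --coverage
The tag name and the CI=true flag here are placeholders/assumptions; the point is that the builder stage still contains node and npm.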
However if anyone else (like the OP, and me) was doing the course and struggling with this error, I hope this helped.

no such file or directory in COPY dockerfile

I have made the following Dockerfile to containerize my Node.js application. The problem is that an error appears when building it:
Sending build context to Docker daemon 2.048kB
Step 1/7 : FROM node:10
---> 0d5ae56139bd
Step 2/7 : WORKDIR /usr/src/app
---> Using cache
---> 5bfc0405d8fa
Step 3/7 : COPY package.json ./
COPY failed: stat /var/lib/docker/tmp/docker-builder803334317/package.json: no such file or directory
this is my dockerfile:
FROM node:10
WORKDIR /usr/src/app
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 8080
CMD [ "npm", "start" ]
I executed the command:
sudo docker build - < Dockerfile
in my project's root folder.
My project folder is simple, like this:
- Project
  - app.js
  - Dockerfile
  - package.json
  - package-lock.json
  - README.md
Am I doing something wrong?
When you use the Dockerfile-on-stdin syntax
sudo docker build - < Dockerfile
the build sequence runs in a context that only has the Dockerfile, and no other files on disk.
The directory layout you show is pretty typical, and pointing docker build at that directory should work better
sudo docker build .
(This is the same rule as the "Dockerfiles can't access files in parent directories" rule, but instead of giving the current directory as the base directory to Docker, you're giving no directory at all, so it can't even access files in the current directory.)
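For completeness, a typical invocation from the project folder, with a tag so the image is easy to run afterwards (the name my-node-app is just a placeholder):
sudo docker build -t my-node-app .
sudo docker run -p 8080:8080 my-node-app   # the Dockerfile EXPOSEs 8080; adjust if your app listens elsewhere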

Docker + node_modules: receiving error for local dependency while trying to run Dockerfile

I am working on creating a docker container for a node.js microservice and am running into an issue with a local dependency from another folder.
I added the dependency to the node_modules folder using:
npm install -S ../dependency1 (module name).
This also added an entry in the package.json as follows:
"dependency1": "file:../dependency1".
When I run the docker-compose up -d command, I receive an error indicating the following:
npm ERR! Could not install from "../dependency1" as it does not contain a package.json file.
Dockerfile:
FROM node:latest
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
COPY . /usr/src/app
RUN npm install
CMD [ "npm", "start" ]
EXPOSE 3000
docker-compose.yml:
customer:
  container_name: "app_customer"
  build:
    context: .
    dockerfile: Dockerfile
  volumes:
    - .:/usr/src/app/
    - /usr/src/app/node_modules
  ports:
    - "3000:3000"
  depends_on:
    - mongo
    - rabbitmq
I found articles outlining an issue with symlinks in a node_modules folder and Docker, and a few describing this exact issue, but none seem to provide a solution. I am looking for a solution or a really good workaround.
A Docker build can't reference files outside of the build context, which is the . defined in the docker-compose.yml file.
docker build creates a tar bundle of all the files in a build context and sends that to the Docker daemon for the build. Anything outside the context directory doesn't exist to the build.
You could move your build context with context: ../ to the parent directory and shuffle all the paths you reference in the Dockerfile to match. Just be careful not to make the build context too large, as it can slow down the build process.
The other option is to publish the private npm modules to a scope, possibly on a private npm registry that you and the build server have access to, and install the dependencies normally.
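A sketch of the first option, assuming the service lives in a folder called customer-service next to dependency1 (both folder names are placeholders; match them to your real layout):
# docker-compose.yml, with the context moved up one level
customer:
  container_name: "app_customer"
  build:
    context: ..
    dockerfile: customer-service/Dockerfile
# Dockerfile, with COPY paths adjusted to the new context
FROM node:latest
WORKDIR /usr/src/app
# put the local module where "file:../dependency1" expects it, relative to /usr/src/app
COPY dependency1 /usr/src/dependency1
COPY customer-service /usr/src/app
RUN npm install
EXPOSE 3000
CMD [ "npm", "start" ]
The .:/usr/src/app/ bind mount would also need revisiting, since at runtime it overlays whatever the image copied into that directory.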
