Dockerfile client and server build - docker

Quick question regarding Dockerfile.
I've got a folder structure like so:
docker-compose.yml
client
src
package.json
Dockerfile
...etc
The client folder contains a ReactJS application, and the root is a Node.js server with TypeScript. I've created a Dockerfile like so:
FROM node
RUN mkdir -p /server/node_modules && chown -R node:node /server
WORKDIR /server
USER node
COPY package*.json ./
RUN npm install
COPY --chown=node:node . ./dist
RUN npm run build-application
COPY /src/views ./dist/src/views
COPY /src/public ./dist/src/public
EXPOSE 4000
CMD node dist/src/index.js
The npm run build-application command builds the client (npm run build --prefix ./client) and the server (rimraf dist && mkdir dist && tsc -p .). The problem is that Docker cannot find the client folder; the error is:
npm ERR! enoent ENOENT: no such file or directory, open '/server/client/package.json'
npm ERR! enoent This is related to npm not being able to find a file.
Can someone explain why? And how to fix this?
Docker compose file:
...
server:
  build:
    context: ./server
    dockerfile: Dockerfile
  image: mazosios-pedutes-server
  container_name: mazosios-pedutes-server
  restart: unless-stopped
  networks:
    - app-network
  env_file:
    - ./server/.env
  ports:
    - "4000:4000"

Since the error says there is no client/package.json in /server, my question is the following one:
Is your ./client directory actually located within /server?
The Dockerfile WORKDIR instruction causes all commands that follow it to be executed within the directory passed to it as a parameter.
I suggest adding RUN tree -d (which lists only nested directories) after the last COPY instruction; then you will be able to see where your client directory ended up and fix the path to it.
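If it turns out that COPY --chown=node:node . ./dist has placed the client folder at /server/dist/client, one possible fix is to copy the build context into the working directory itself, so that npm run build --prefix ./client can resolve /server/client. A sketch, assuming client sits at the root of the build context:

```dockerfile
FROM node
RUN mkdir -p /server/node_modules && chown -R node:node /server
WORKDIR /server
USER node
COPY package*.json ./
RUN npm install
# copy into /server itself, not /server/dist, so ./client exists here
COPY --chown=node:node . .
RUN npm run build-application
EXPOSE 4000
CMD node dist/src/index.js
```

The views and public folders can then be moved into dist with a step like RUN cp -a src/views src/public dist/src/ if the build script doesn't already do that.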

Related

Docker error can't copy a file after build it

I'm trying to copy my ./dist directory after building my Angular app.
Here is my Dockerfile:
# Create image based off of the official Node 10 image
FROM node:12-alpine
RUN apk update && apk add --no-cache make git
RUN mkdir -p /home/project/frontend
# Change directory so that our commands run inside this new directory
WORKDIR /home/project/frontend
# Copy dependency definitions
COPY package*.json ./
RUN npm cache verify
## installing packages
RUN npm install
COPY ./ ./
RUN npm run build --output-path=./dist
COPY /dist /var/www/front
but when I run docker-compose build dashboard I get this error:
Service 'dashboard' failed to build: COPY failed: stat /var/lib/docker/tmp/docker-builderxxx/dist: no such file or directory
I don't know why. Is there something wrong?
If you need it, here is the docker-compose file as well:
...
dashboard:
  build: ./frontend
  image: dashboard
  container_name: dashboard
  restart: unless-stopped
  networks:
    - app-network
...
The Dockerfile COPY directive copies content from the build context (the host-system directory in the build: line) into the image. If you're just trying to move around content within the image, you can RUN cp or RUN mv to use the ordinary Linux shell commands instead.
RUN npm run build --output-path=./dist \
&& cp -a dist /var/www/front

How to avoid node_modules folder being deleted

I'm trying to create a Docker container to act as a test environment for my application. I am using the following Dockerfile:
FROM node:14.4.0-alpine
WORKDIR /test
COPY package*.json ./
RUN npm install .
CMD [ "npm", "test" ]
As you can see, it's pretty simple. I only want to install all dependencies but NOT copy the code, because I will run that container with the following command:
docker run -v `pwd`:/test -t <image-name>
But the problem is that the node_modules directory is deleted when I mount the volume with -v. Is there any workaround to fix this?
When you bind mount the test directory with $PWD, the container's test directory is overridden/mounted with $PWD, so you will no longer have node_modules in the test directory.
To fix this you have two options.
You can run npm install in a separate directory such as /node, mount your code into the test directory, and set the NODE_PATH environment variable to /node/node_modules so Node resolves modules from there.
The Dockerfile will then look like:
FROM node:14.4.0-alpine
WORKDIR /node
COPY package*.json ./
RUN npm install
ENV NODE_PATH=/node/node_modules
WORKDIR /test
CMD [ "npm", "test" ]
Or you can write an Entrypoint.sh script that copies the node_modules folder into the test directory at container runtime.
FROM node:14.4.0-alpine
WORKDIR /node
COPY package*.json ./
RUN npm install
WORKDIR /test
COPY Entrypoint.sh ./
RUN chmod +x ./Entrypoint.sh
ENTRYPOINT ["./Entrypoint.sh"]
and Entrypoint.sh is something like this (note the sh shebang, since the alpine image does not ship bash):
#!/bin/sh
cp -r /node/node_modules /test/.
npm test
Approach 1
A workaround is to run the install when the container starts:
CMD npm install && npm run dev
Approach 2
Have Docker install node_modules during docker-compose build and run the app on docker-compose up.
Folder Structure
docker-compose.yml
version: '3.5'
services:
  api:
    container_name: /$CONTAINER_FOLDER
    build: ./$LOCAL_FOLDER
    hostname: api
    volumes:
      # map local to remote folder, exclude node_modules
      - ./$LOCAL_FOLDER:/$CONTAINER_FOLDER
      - /$CONTAINER_FOLDER/node_modules
    expose:
      - 88
Dockerfile
FROM node:14.4.0-alpine
WORKDIR /test
COPY ./package.json .
RUN npm install
# run command
CMD npm run dev

Docker + node_modules: receiving error for local dependency while trying to run Dockerfile

I am working on creating a Docker container for a Node.js microservice and am running into an issue with a local dependency from another folder.
I added the dependency to the node_modules folder using:
npm install -S ../dependency1 (where dependency1 is the module name).
This also added an entry in the package.json as follows:
"dependency1": "file:../dependency1".
When I run the docker-compose up -d command, I receive an error indicating the following:
npm ERR! Could not install from "../dependency1" as it does not contain a package.json file.
Dockerfile:
FROM node:latest
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
COPY . /usr/src/app
RUN npm install
CMD [ "npm", "start" ]
EXPOSE 3000
docker-compose.yml:
customer:
  container_name: "app_customer"
  build:
    context: .
    dockerfile: Dockerfile
  volumes:
    - .:/usr/src/app/
    - /usr/src/app/node_modules
  ports:
    - "3000:3000"
  depends_on:
    - mongo
    - rabbitmq
I found articles outlining issues with symlinks in a node_modules folder under Docker, and a few describing this exact problem, but none seem to provide a solution. I am looking for a solution or a really good workaround.
A Docker build can't reference files outside of the build context, which is the . defined in the docker-compose.yml file.
docker build creates a tar bundle of all the files in the build context and sends that to the Docker daemon for the build. Anything outside the context directory doesn't exist to the build.
You could move your build context to the parent directory with context: ../ and adjust all the paths you reference in the Dockerfile to match. Just be careful not to make the build context too large, as that can slow down the build process.
The other option is to publish the private npm modules to a scope, possibly on a private npm registry that you and the build server have access to, and install the dependencies normally.
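A sketch of the first option, assuming the compose file stays where it is, the service code lives in ./customer, and the local dependency sits one level up (the directory names here are illustrative):

```yaml
customer:
  build:
    # the parent directory becomes the build context, so it now
    # contains both the service code and ../dependency1
    context: ..
    dockerfile: customer/Dockerfile
```

With this layout, the COPY paths inside the Dockerfile must be written relative to the parent directory, e.g. COPY customer/ /usr/src/app.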

Babelrc file in Docker builds

I'm running into the errors:
ERROR in ../~/babel-polyfill/lib/index.js
Couldn't find preset "es2015-loose" relative to directory "/app"
amongst a few other preset-not-found errors when building a ReactJS project. It runs fine on webpack-dev-server in dev.
COPY in Docker doesn't copy over dot files by default. Should I be copying .babelrc over to avoid this breaking? If so, how? If not, what am I missing or mis-ordering in this build?
Dockerfile
FROM alpine:3.5
RUN apk update && apk add nodejs
RUN npm i -g webpack \
babel-cli \
node-gyp
ADD package.json /tmp/package.json
RUN cd /tmp && npm install
RUN mkdir -p /app && cp -a /tmp/node_modules /app/
WORKDIR /app
COPY . /app
docker-compose
version: '2.1'
services:
  webpack:
    build:
      context: .
      dockerfile: Docker.doc
    volumes:
      - .:/app
      - /app/node_modules
"COPY in Docker doesn't copy over dot files by default."
This is not true: COPY in the Dockerfile copies dot files by default. I came across this question because I had faced this issue earlier. For anyone else who encounters it, troubleshoot with the following:
Check your host/local directory to see whether the dotfiles actually exist. If you copied the files over through your OS's GUI, there's a chance the dotfiles were not carried over simply because they are hidden.
Check whether you have a .dockerignore file that may be excluding these dotfiles. More info in the .dockerignore docs.
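On the second point, an overly broad .dockerignore pattern is a common way to lose dotfiles. For example, a file like this (illustrative, not the asker's actual one) would silently exclude .babelrc from the build context:

```
# this wildcard excludes .babelrc, .eslintrc, etc. from the build context
.*
node_modules
```

If .babelrc must reach the image, make sure no such pattern matches it, or re-include it with a !.babelrc line after the exclusion.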

Lift Sails inside Docker container

I know there are multiple examples (actually only a few) out there, and I've looked into some and tried to apply them to my case but then when I try to lift the container (docker-compose up) I end up with more or less the same error every time.
My folder structure is:
sails-project
--app
----api
----config
----node_modules
----.sailsrc
----app.js
----package.json
--docker-compose.yml
--Dockerfile
The docker-compose.yml file:
sails:
  build: .
  ports:
    - "8001:80"
  links:
    - postgres
  volumes:
    - ./app:/app
  environment:
    - NODE_ENV=development
  command: node app
postgres:
  image: postgres:latest
  ports:
    - "8002:5432"
And the Dockerfile:
FROM node:0.12.3
RUN mkdir /app
WORKDIR /app
# the dependencies are already installed in the local copy of the project, so
# they will be copied to the container
ADD app /app
CMD ["/app/app.js", "--no-daemon"]
RUN cd /app; npm i
I also tried RUN npm i -g sails instead (in the Dockerfile) together with command: sails lift, but I'm getting:
Naturally, I tried different configurations of the Dockerfile with different commands (node app, sails lift, npm start, etc.), but I constantly end up with the same error. Any ideas?
By using command: node app you are overriding the CMD ["/app/app.js", "--no-daemon"] instruction, which as a consequence has no effect. WORKDIR /app creates the app folder, so you don't need RUN mkdir /app. And most important, move RUN cd /app; npm i before CMD ["/app/app.js", "--no-daemon"]: npm dependencies have to be installed before you start your app.
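Putting those points together, a reworked Dockerfile might look like this (a sketch; it keeps the question's base image and starts app.js via node, since the original CMD ["/app/app.js", "--no-daemon"] would only work if app.js were an executable script with a shebang):

```dockerfile
FROM node:0.12.3
# WORKDIR creates /app if it does not exist
WORKDIR /app
ADD app /app
# install dependencies at build time, before the container ever starts
RUN npm i
CMD ["node", "app.js"]
```

Note that with command: node app in docker-compose.yml this CMD is overridden anyway, so the moved npm i step is the part that matters.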
