So the situation is that I have coupled a bash script with Docker: I use the bash script to pull my code from the remote repo and then run docker-compose.
Here's my Dockerfile:
FROM node:6
WORKDIR /home/ayush/project-folder
RUN pwd
RUN npm run build
CMD ["forever", "server/app.js"]
Here's the section of my docker-compose.yml that has the above service listed:
web:
  build: ./client
  environment:
    - NODE_VERSION=$NODE_WEB_VERSION
  ports:
    - "4200:4200"
api:
And here's my simple bash script
#!/usr/bin/env bash
frontend=$1
backend=$2
git clone //remote_url --branch $frontend --single-branch
mv project-folder ~/
docker-compose up
But the RUN npm run build step fails with this error:
enoent ENOENT: no such file or directory, open '/home/ayush/project-folder/package.json'
What could be the issue?
You forgot to copy your project into the image at build time:
...
COPY . ./
RUN npm run build
...
This copies the whole build context (the contents of ./client) into the working directory, which you set to /home/ayush/project-folder with the earlier WORKDIR instruction.
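For reference, a minimal sketch of the corrected Dockerfile, assuming the build context ./client contains package.json, the sources, and server/app.js, and that forever is available once dependencies are installed:

FROM node:6
WORKDIR /home/ayush/project-folder
# bring the build context (./client) into the image before building
COPY . ./
# install dependencies before building; assumes forever is listed in package.json
RUN npm install
RUN npm run build
CMD ["forever", "server/app.js"]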
Related
Totally new to GitLab and CI in general, so apologies for the lack of understanding. I have a repo, which is NuxtJS based, with a Dockerfile. The end goal of the pipeline is to build the image from this repo and push it to my Docker account. The Dockerfile is relatively straightforward, containing an npm install and an npm run build. I'm using a custom Docker image as my runner, based on docker:20.10.17-dind-alpine3.16 with ansible, terraform and kubectl installed.
Building the project's Docker image on my local machine works without issues; however, in GitLab the npm run build command fails with the following error:
Module not found: Error: Can't resolve '../node_modules/vue-confirm-dialog' in '/usr/src/nuxt-app/plugins'
Here is my yml file:
stages:
  - docker

docker:
  stage: docker
  image: <my-runner-image>
  services:
    - "docker:dind"
  before_script:
    - docker login -u $DOCKER_REGISTRY_USER -p $DOCKER_REGISTRY_PASSWORD
  script:
    - docker build -t <my-repo> .
    - docker push <my-repo>
Any suggestions are greatly appreciated
--EDIT--
As requested, here is the project's Dockerfile:
FROM node:lts-alpine3.15
# create destination directory
RUN mkdir -p /usr/src/nuxt-app
WORKDIR /usr/src/nuxt-app
# update and install dependencies
RUN apk update && apk upgrade
RUN apk add git
# copy the app, note .dockerignore
COPY . /usr/src/nuxt-app/
RUN npm install
RUN npm run build
EXPOSE 3000
ENV NUXT_HOST=0.0.0.0
ENV NUXT_PORT=3000
CMD [ "npm", "start" ]
I'm trying to copy my ./dist folder after building my Angular app.
Here is my Dockerfile:
# Create image based off of the official Node 12 Alpine image
FROM node:12-alpine
RUN apk update && apk add --no-cache make git
RUN mkdir -p /home/project/frontend
# Change directory so that our commands run inside this new directory
WORKDIR /home/project/frontend
# Copy dependency definitions
COPY package*.json ./
RUN npm cache verify
## installing packages
RUN npm install
COPY ./ ./
RUN npm run build --output-path=./dist
COPY /dist /var/www/front
But when I run docker-compose build dashboard, I get this error:
Service 'dashboard' failed to build: COPY failed: stat /var/lib/docker/tmp/docker-builderxxx/dist: no such file or directory
I don't know why; is there something wrong?
In case you need it, here is the docker-compose file as well:
...
dashboard:
  container_name: dashboard
  build: ./frontend
  image: dashboard
  container_name: dashboard
  restart: unless-stopped
  networks:
    - app-network
...
The Dockerfile COPY directive copies content from the build context (the host-system directory in the build: line) into the image. If you're just trying to move around content within the image, you can RUN cp or RUN mv to use the ordinary Linux shell commands instead.
RUN npm run build --output-path=./dist \
&& cp -a dist /var/www/front
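For context, a minimal sketch of the tail of that Dockerfile with the fix folded in; the mkdir -p is an assumption, in case /var/www does not already exist in the node:12-alpine base image:

COPY ./ ./
# build inside the image, then move the output with an ordinary shell copy instead of COPY
RUN npm run build --output-path=./dist \
 && mkdir -p /var/www \
 && cp -a dist /var/www/front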
Quick question regarding Dockerfile.
I've got a folder structure like so:
docker-compose.yml
client
src
package.json
Dockerfile
...etc
The client folder contains a React application, and the root is a Node.js server written in TypeScript. I've created the Dockerfile like so:
FROM node
RUN mkdir -p /server/node_modules && chown -R node:node /server
WORKDIR /server
USER node
COPY package*.json ./
RUN npm install
COPY --chown=node:node . ./dist
RUN npm run build-application
COPY /src/views ./dist/src/views
COPY /src/public ./dist/src/public
EXPOSE 4000
CMD node dist/src/index.js
The npm run build-application command builds the client (npm run build --prefix ./client) and the server (rimraf dist && mkdir dist && tsc -p .). The problem is that Docker cannot find the client folder; the error is:
npm ERR! enoent ENOENT: no such file or directory, open '/server/client/package.json'
npm ERR! enoent This is related to npm not being able to find a file.
Can someone explain why? And how to fix this?
Docker compose file:
...
server:
  build:
    context: ./server
    dockerfile: Dockerfile
  image: mazosios-pedutes-server
  container_name: mazosios-pedutes-server
  restart: unless-stopped
  networks:
    - app-network
  env_file:
    - ./server/.env
  ports:
    - "4000:4000"
Since the error says that there is no client/package.json in /server, my question is the following:
Is your ./client directory located within /server?
The Dockerfile WORKDIR instruction makes all the commands that follow it execute inside the directory you pass to WORKDIR as a parameter.
I guess that if you add RUN tree -d (it lists only nested directories) after the last COPY instruction, you will be able to see where your client directory ended up and fix the path to it.
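A minimal debugging sketch along those lines; note that tree is not preinstalled in the node base image and installing it needs root (which this Dockerfile drops with USER node), so an alternative that needs nothing extra is:

# temporary debug step after the last COPY, remove it once the layout is confirmed
RUN find /server -maxdepth 2 -type d -not -path '*/node_modules*'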
I am working on creating a docker container for a node.js microservice and am running into an issue with a local dependency from another folder.
I added the dependency to the node_modules folder using:
npm install -S ../dependency1 (where dependency1 is the module name).
This also added an entry in the package.json as follows:
"dependency1": "file:../dependency1".
When I run the docker-compose up -d command, I receive an error indicating the following:
npm ERR! Could not install from "../dependency1" as it does not contain a package.json file.
Dockerfile:
FROM node:latest
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
COPY . /usr/src/app
RUN npm install
CMD [ "npm", "start" ]
EXPOSE 3000
docker-compose.yml:
customer:
  container_name: "app_customer"
  build:
    context: .
    dockerfile: Dockerfile
  volumes:
    - .:/usr/src/app/
    - /usr/src/app/node_modules
  ports:
    - "3000:3000"
  depends_on:
    - mongo
    - rabbitmq
I found articles describing an issue with symlinks in a node_modules folder under Docker, and a few describing this exact issue, but none seem to provide a solution. I am looking for a solution to this problem or a really good workaround.
A Docker build can't reference files outside of the build context, which is the . defined in the docker-compose.yml file.
docker build creates a tar bundle of all the files in a build context and sends that to the Docker daemon for the build. Anything outside the context directory doesn't exist to the build.
You could move your build context to the parent directory with context: ../ and adjust all the paths you reference in the Dockerfile to match. Just be careful not to make the build context too large, as that can slow down the build process.
The other option is to publish the private npm modules to a scope, possibly on a private npm registry that both you and the build server have access to, and install the dependencies normally.
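A minimal sketch of the first option, assuming the service's code lives in a sibling directory of dependency1 (here called customer/ as a placeholder for the real folder name), with the Dockerfile paths adjusted to the new context.

In docker-compose.yml:

customer:
  build:
    context: ..
    dockerfile: customer/Dockerfile

And in the Dockerfile:

FROM node:latest
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
# paths are now relative to the parent-directory build context
COPY dependency1/ /usr/src/dependency1/
COPY customer/ /usr/src/app/
# "file:../dependency1" in package.json now resolves to /usr/src/dependency1 inside the image
RUN npm install
EXPOSE 3000
CMD [ "npm", "start" ]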
I have a mocha.opts file that contains
--require babelhook
It seems that babelhook needs to be in the same directory as the node_modules directory, so:
--myApp
  --node_modules
  --babelhook.js
  --tests
    --mocha.opts
    --mytest
This works fine. However, what I need (I'll explain at the end, if you care why) is this structure:
--node_modules
--myApp
  --babelhook.js
  --tests
    --mocha.opts
    --mytest
This does not work. I get the mocha error "cannot find module babelhook"
I've tried making the require path in mocha.opts relative, e.g.
--require ../babelhook
and various other attempts, but then I just get "cannot find module ../babelhook".
If I move node_modules back down a level, it works fine. Is there any way to do this?
The reason for this folder structure is that I have a Docker container that uses a volume from my hard drive. The build does the npm install, and then when the container runs, the volume copies everything from my local path over the container's directory (the volume action), which overwrites the node_modules. So I do the npm install and then move node_modules up a folder so the volume copy doesn't overwrite it.
Any help would be appreciated
Edit
Dockerfile
FROM node:latest
MAINTAINER reharik#gmail.com
ENV PLUGIN_HOME /home/app/current
RUN npm install mocha -g
RUN npm install babel -g
RUN mkdir -p $PLUGIN_HOME
WORKDIR $PLUGIN_HOME
ADD ./package.json $PLUGIN_HOME/package.json
RUN npm install
RUN mv node_modules ../
ADD . $PLUGIN_HOME
docker-compose.yml
plugin:
  build: .
  volumes:
    - .:/home/app/current
  command: tail -f /dev/null
  environment:
    DEBUG: true
    NODE_ENV: development
From package.json:
"scripts": {
"test": "mocha --opts ./tests/mocha.opts ./tests/unitTests/*.js"
},