This is my docker-compose file. I have already set the folder to share in the VirtualBox VM, but it is still not working.
version: '3'
services:
  postgres:
    image: 'postgres:latest'
    deploy:
      restart_policy:
        condition: on-failure
        window: 15m
  redis:
    image: 'redis:latest'
  nginx:
    restart: always
    build:
      dockerfile: Dockerfile.dev
      context: ./nginx
    ports:
      - '3050:80'
  api:
    build:
      dockerfile: Dockerfile.dev
      context: ./server
    volumes:
      - /usr/src/app/node_modules
      - ./server:/usr/src/app
    environment:
      - REDIS_HOST=redis
      - REDIS_PORT=6379
      - PGUSER=postgres
      - PGHOST=postgres
      - PGDATABASE=postgres
      - PGPASSWORD=postgres_password
      - PGPORT=5432
  client:
    build:
      dockerfile: Dockerfile.dev
      context: ./client
    volumes:
      - /usr/src/app/node_modules
      - ./client:/usr/src/app
  worker:
    build:
      dockerfile: Dockerfile.dev
      context: ./worker
    volumes:
      - /usr/src/app/node_modules
      - ./worker:/usr/src/app
I am running it on Windows 7 SP1. Whenever I run docker-compose up, I get an error:
api_1 | npm ERR! code ENOENT
api_1 | npm ERR! syscall open
api_1 | npm ERR! path /usr/src/app/package.json
api_1 | npm ERR! errno -2
api_1 | npm ERR! enoent ENOENT: no such file or directory, open '/usr/src/app/package.json'
api_1 | npm ERR! enoent This is related to npm not being able to find a file.
api_1 | npm ERR! enoent
api_1 |
api_1 | npm ERR! A complete log of this run can be found in:
api_1 | npm ERR! /root/.npm/_logs/2020-05-28T04_06_56_121Z-debug.log
complex_api_1 exited with code 254
Thanks in advance, please help.
I am trying to run the Fibonacci project from the Docker and Kubernetes: The Complete Guide course on Udemy.
Each service has its own package.json and other files.
Server Dockerfile:
FROM node:alpine
WORKDIR /usr/src/app
COPY package.json .
RUN npm install
COPY . .
CMD ["npm", "run", "dev"]
Worker Dockerfile:
FROM node:alpine
WORKDIR /usr/src/app
COPY package.json .
RUN npm install
COPY . .
CMD ["npm", "run", "dev"]
Client Dockerfile:
FROM node:alpine
WORKDIR /usr/src/app
COPY package.json .
RUN npm install
COPY . .
CMD ["npm", "run", "start"]
If you want to share data between containers:
services:
  client:
    build:
      dockerfile: Dockerfile.dev
      context: ./client
    volumes:
      - datavolume:/usr/src/app/node_modules
      - ./client:/usr/src/app
  worker:
    build:
      dockerfile: Dockerfile.dev
      context: ./worker
    volumes:
      - datavolume:/usr/src/app/node_modules
      - ./worker:/usr/src/app
volumes:
  datavolume: {}
Since this looks like a dev setup, I would suggest mounting your workspace folder into the container:
services:
  client:
    build:
      dockerfile: Dockerfile.dev
      context: ./client
    volumes:
      - ./node_modules:/usr/src/app/node_modules
      - ./client:/usr/src/app
  worker:
    build:
      dockerfile: Dockerfile.dev
      context: ./worker
    volumes:
      - ./node_modules:/usr/src/app/node_modules
      - ./worker:/usr/src/app
An even better way is to treat every service as a standalone project. Each one should own its own package.json and node_modules:
services:
  client:
    build:
      dockerfile: Dockerfile.dev
      context: ./client
    volumes:
      - ./client:/usr/src/app
  worker:
    build:
      dockerfile: Dockerfile.dev
      context: ./worker
    volumes:
      - ./worker:/usr/src/app
In my opinion, it doesn't make sense to share the same libraries between projects that serve different purposes.
I had the same error! I solved it by moving my project from /c/Program Files/Docker Toolbox to /c/Users/currentUser. Maybe your project folder is inside the Program Files directory rather than under Users — is that right? By default, Docker Toolbox's VirtualBox VM only shares the C:\Users folder, so bind mounts from anywhere else come up empty. Try copying your project folder into Users and running docker-compose from there. Let me know!
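For reference, the move can be done from the Docker Quickstart terminal. This is a sketch only — the folder names `complex` and `currentUser` are placeholders for your own project and Windows user name:

```shell
# Copy the project out of Program Files into the C:\Users tree,
# which is the only folder Docker Toolbox shares with the VM by default
cp -r "/c/Program Files/Docker Toolbox/complex" /c/Users/currentUser/complex

# Run compose from the new, shared location
cd /c/Users/currentUser/complex
docker-compose up --build
```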
To whom it may concern,
I'm trying to run my app with a Docker Compose file and I get an error.
This is my docker-compose file:
version: '3.8'
services:
  backend:
    container_name: backend
    build:
      context: .
      dockerfile: ./docker/Dockerfile.backend-dev
    restart: always
    ports:
      - '5000:5000'
    env_file:
      - ./apps/backend/envs/.env.development
    volumes:
      - type: bind
        source: ./apps/backend/src
        target: /app/apps/backend/src
      - /app/apps/backend/node_modules
  frontend:
    container_name: frontend
    build:
      context: .
      dockerfile: ./docker/Dockerfile.frontend-dev
    restart: always
    ports:
      - '3000:3000'
    volumes:
      - type: bind
        source: ./apps/frontend/src
        target: /app/apps/frontend/src
      - /app/apps/frontend/node_modules
And this is the Dockerfile for the frontend:
FROM node:18.14.0
RUN curl -f https://get.pnpm.io/v6.16.js | node - add --global pnpm
WORKDIR /app
COPY ./package.json ./pnpm-workspace.yaml ./.npmrc ./
COPY ./apps/frontend/package.json ./apps/frontend/
RUN pnpm i -w
RUN pnpm --filter frontend i
COPY ./tsconfig.base.json ./
COPY ./apps/frontend/ ./apps/frontend/
CMD ["pnpm", "-F", "frontend", "start" ]
And this is the error:
apps/frontend start$ vite -c ./vite.config.ts
apps/frontend start: failed to load config from /app/apps/frontend/vite.config.ts
apps/frontend start: error when starting dev server:
apps/frontend start: Error: Cannot find module '@vitejs/plugin-react'
Here is my pnpm-workspace.yaml file:
packages:
- 'apps/*'
The error appears when running docker-compose up -d. Please help!
I have a Docker setup with AdonisJS 5. I'm currently trying to get it started (and it builds just fine), but as discussed a bit further down, the ace command cannot be found. ace is the CLI package for AdonisJS (think ng for Angular).
This is my Dockerfile:
ARG NODE_IMAGE=node:16.13.1-alpine
FROM $NODE_IMAGE AS base
RUN apk --no-cache add dumb-init
RUN mkdir -p /home/node/app && chown node:node /home/node/app
WORKDIR /home/node/app
USER node
RUN mkdir tmp
FROM base AS dependencies
COPY --chown=node:node ./package*.json ./
RUN npm ci
RUN npm i @adonisjs/cli
COPY --chown=node:node . .
FROM dependencies AS build
RUN node ace build --production
FROM base AS production
ENV NODE_ENV=production
ENV PORT=$PORT
ENV HOST=0.0.0.0
COPY --chown=node:node ./package*.json ./
RUN npm ci --production
COPY --chown=node:node --from=build /home/node/app/build .
EXPOSE $PORT
CMD [ "dumb-init", "node", "service/build/server.js" ]
And this is my docker-compose.yml:
version: '3.9'
services:
  postgres:
    container_name: postgres
    image: postgres
    restart: always
    environment:
      - POSTGRES_PASSWORD=${POSTGRES_PASSWORD:-password}
      - POSTGRES_USER=${POSTGRES_USER:-user}
    networks:
      - family-service-network
    volumes:
      - fn-db_volume:/var/lib/postgresql/data
  adminer:
    container_name: adminer
    image: adminer
    restart: always
    networks:
      - family-service-network
    ports:
      - 8080:8080
  minio:
    container_name: storage
    image: 'bitnami/minio:latest'
    ports:
      - '9000:9000'
      - '9001:9001'
    environment:
      - MINIO_ROOT_USER=user
      - MINIO_ROOT_PASSWORD=password
      - MINIO_SERVER_ACCESS_KEY=access-key
      - MINIO_SERVER_SECRET_KEY=secret-key
    networks:
      - family-service-network
    volumes:
      - fn-s3_volume:/var/lib/postgresql/data
  fn_service:
    container_name: fn_service
    restart: always
    build:
      context: ./service
      target: dependencies
    ports:
      - ${PORT:-3333}:${PORT:-3333}
      - 9229:9229
    networks:
      - family-service-network
    env_file:
      - ./service/.env
    volumes:
      - ./:/home/node/app
      - /home/node/app/node_modules
    depends_on:
      - postgres
    command: dumb-init node ace serve --watch --node-args="--inspect=0.0.0.0"
volumes:
  fn-db_volume:
  fn-s3_volume:
networks:
  family-service-network:
When I run this with docker-compose up, everything works except for fn_service.
I get the error:
node:internal/modules/cjs/loader:936
  throw err;
  ^

Error: Cannot find module '/home/node/app/ace'
    at Function.Module._resolveFilename (node:internal/modules/cjs/loader:933:15)
    at Function.Module._load (node:internal/modules/cjs/loader:778:27)
    at Function.executeUserEntryPoint [as runMain] (node:internal/modules/run_main:77:12)
    at node:internal/main/run_main_module:17:47 {
  code: 'MODULE_NOT_FOUND',
  requireStack: []
}
I followed this tutorial, and I can't seem to find anything by googling. I'm sure it's something minuscule.
Any help would be appreciated.
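One thing worth double-checking (an observation based on the files above, not a verified fix): the bind mount uses the compose file's own directory (./), while the build context is ./service, so the folder mounted over /home/node/app may not contain the ace file the command expects. A sketch of an aligned mount, assuming the AdonisJS project lives in ./service:

```yaml
fn_service:
  build:
    context: ./service
    target: dependencies
  volumes:
    - ./service:/home/node/app     # mount the actual project folder
    - /home/node/app/node_modules  # keep the node_modules installed in the image
  command: dumb-init node ace serve --watch --node-args="--inspect=0.0.0.0"
```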
I keep getting the following issue when trying to start the npm container within Docker, and I can't figure out what is happening. It attempts to start npm, then exits without running.
npm-aws | sh: 1: mix: not found
npm-aws | npm ERR! code ELIFECYCLE
npm-aws | npm ERR! syscall spawn
npm-aws | npm ERR! file sh
npm-aws | npm ERR! errno ENOENT
npm-aws | npm ERR! @ watch-poll: `mix watch -- --watch-options-poll=3000`
npm-aws | npm ERR! spawn ENOENT
npm-aws | npm ERR!
npm-aws | npm ERR! Failed at the @ watch-poll script.
npm-aws | npm ERR! This is probably not a problem with npm. There is likely additional logging output above.
npm-aws |
npm-aws | npm ERR! A complete log of this run can be found in:
npm-aws | npm ERR! /root/.npm/_logs/2021-10-05T22_43_50_263Z-debug.log
docker-compose.yml:
version: '3'
networks:
  laravel:
services:
  testing-aws:
    build:
      context: .
      dockerfile: nginx.dockerfile
    container_name: nginx-aws
    ports:
      - 5001:5001
    volumes:
      - ./src:/var/www/html:delegated
    depends_on:
      - php
      - mysql
    links:
      - mysql
    networks:
      - laravel
  mysql:
    image: mysql:5.6
    container_name: mysql-aws
    restart: unless-stopped
    tty: true
    ports:
      - 3306:3306
    environment:
      MYSQL_HOST: mysql
      MYSQL_DATABASE: heatable
      MYSQL_USER: heatable
      MYSQL_ROOT_PASSWORD: password
    networks:
      - laravel
    volumes:
      - ./mysql:/var/lib/mysql
  php:
    build:
      context: .
      dockerfile: php.dockerfile
    container_name: php-aws
    volumes:
      - ./src:/var/www/html:delegated
    networks:
      - laravel
    links:
      - mysql
    depends_on:
      - mysql
  composer:
    build:
      context: .
      dockerfile: composer.dockerfile
    container_name: composer-aws
    volumes:
      - ./src:/var/www/html
    working_dir: /var/www/html
    depends_on:
      - php
    user: laravel
    entrypoint: ['composer', '--ignore-platform-reqs']
    networks:
      - laravel
  npm:
    build:
      context: .
      dockerfile: npm.dockerfile
    container_name: npm-aws
    volumes:
      - ./src:/var/www/html
    working_dir: /var/www/html
    command: npm run watch-poll
    networks:
      - laravel
  artisan:
    build:
      context: .
      dockerfile: php.dockerfile
    container_name: artisan-aws
    volumes:
      - ./src:/var/www/html:delegated
    depends_on:
      - mysql
    working_dir: /var/www/html
    user: laravel
    entrypoint: ['php', '/var/www/html/artisan']
    links:
      - mysql
    networks:
      - laravel
npm.dockerfile:
FROM node:14.17.1
WORKDIR /var/www/html
COPY ./src/package.json .
RUN npm install
RUN npm clean-install
CMD npm run watch-poll
UPDATE
I've managed to resolve it by adding tty: true; see the updated config:
npm:
  tty: true
  build:
    context: .
    dockerfile: npm.dockerfile
  container_name: npm-aws
  working_dir: /var/www/html
  networks:
    - laravel
I had to manually run npm install within the container's terminal to get the node modules. If anyone knows a way to fix this without the manual command, please let me know :)
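For what it's worth, a common pattern for avoiding the manual npm install (a sketch, untested against this project) is to bind-mount the source but keep node_modules in an anonymous volume, so the install baked into the image is not hidden by the mount:

```yaml
npm:
  tty: true
  build:
    context: .
    dockerfile: npm.dockerfile
  container_name: npm-aws
  working_dir: /var/www/html
  volumes:
    - ./src:/var/www/html
    - /var/www/html/node_modules  # keeps the image's npm install visible
  command: npm run watch-poll
  networks:
    - laravel
```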
This is one of my first Docker setups, with several services running.
I've spent a while on this but cannot pinpoint the problem.
Below is what I think is the cause.
Why does the Node app not start?
web_1 | npm ERR! code ENOENT
web_1 | npm ERR! syscall open
web_1 | npm ERR! path /app/http/app/package.json
web_1 | npm ERR! errno -2
web_1 | npm ERR! enoent ENOENT: no such file or directory, open '/app/http/app/package.json'
web_1 | npm ERR! enoent This is related to npm not being able to find a file.
web_1 | npm ERR! enoent
web_1 |
web_1 | npm ERR! A complete log of this run can be found in:
web_1 | npm ERR! /root/.npm/_logs/2020-12-27T23_32_03_845Z-debug.log
Why doesn't it see it?
docker-compose.yml
version: '3'
services:
  mongo:
    image: mongo
    restart: always
    ports:
      - "27017:27017"
    environment:
      MONGO_INITDB_ROOT_USERNAME: mongo_user
      MONGO_INITDB_ROOT_PASSWORD: mongo_secret
  api:
    build:
      context: .
      dockerfile: Dockerfile
    restart: always
    ports:
      - "4433:4433"
    depends_on:
      - rabbit
    volumes:
      - .:/app
  web:
    build:
      context: .
      dockerfile: Dockerfile1
    restart: always
    ports:
      - "8080:8080"
    depends_on:
      - api
    volumes:
      - .:/app
  rabbit:
    hostname: rabbit
    image: rabbitmq:management
    environment:
      - RABBITMQ_DEFAULT_USER=rabbitmq
      - RABBITMQ_DEFAULT_PASS=rabbitmq
    ports:
      - "5673:5672"
      - "15672:15672"
  worker_1:
    build:
      context: .
    hostname: worker_1
    entrypoint: celery
    command: -A workerA worker --loglevel=info -Q workerA
    volumes:
      - .:/app
    links:
      - rabbit
    depends_on:
      - rabbit
Dockerfile
FROM python:3.8
ADD Pipfile.lock /app/Pipfile.lock
ADD Pipfile /app/Pipfile
WORKDIR /app
COPY . /app
RUN pip install pipenv
RUN pipenv install --system --deploy --ignore-pipfile
ENV FLASK_APP=app/http/api/endpoints.py
ENV FLASK_RUN_PORT=4433
ENV FLASK_ENV=development
ENTRYPOINT ["python"]
#CMD ["app/http/api/endpoints.py","--host=0.0.0.0","--port 4433"]
CMD ["-m", "flask", "run"]
Dockerfile1
FROM node:10
WORKDIR /app/http/app
ADD app/http/app/package.json /app/http/app/package.json
ADD app/http/app/package-lock.json /app/http/app/package-lock.json
RUN npm i
CMD ["npm","start"]
How do I make such a setup (Flask, RabbitMQ, React) properly?
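One observation on the ENOENT, based only on the files above (not a tested fix): the web service bind-mounts the repository root over /app, but Dockerfile1 installed everything into /app/http/app inside the image. After the mount, the container's /app/http/app maps to the host's ./http/app, which doesn't exist — the host copy lives at ./app/http/app. A sketch that mounts the React app's folder where the image expects it:

```yaml
web:
  build:
    context: .
    dockerfile: Dockerfile1
  restart: always
  ports:
    - "8080:8080"
  depends_on:
    - api
  volumes:
    - ./app/http/app:/app/http/app  # mount the React app at the image's WORKDIR
    - /app/http/app/node_modules    # keep the modules installed at build time
```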
The Dockerfile being used:
FROM node:8-alpine
WORKDIR /usr/src/app
COPY . .
RUN npm install
CMD ["npm", "run", "serve"]
EXPOSE 8080
And the docker-compose.yml file:
version: '3'
services:
  app:
    container_name: app
    restart: always
    build:
      context: ./app
      dockerfile: Dockerfile
    ports:
      - "8080:8080"
    volumes:
      - ./app:/usr/src/app
      - ./logs:/logs
The folder structure is the following:
project/
|-- docker-compose.yml
|-- logs/
|-- app/
    |-- Dockerfile
    |-- package.json
When running docker-compose up --build from project/, the npm install step outputs the following after about one minute:
added 1684 packages from 1297 contributors and audited 36429 packages in 56.23s
found 0 vulnerabilities
However, at the npm run serve step, the output basically says that no npm module can be found, and among other things includes this line:
npm WARN Local package.json exists, but node_modules missing, did you mean to install?
How come npm install is actually and definitely executed, yet npm complains that node_modules cannot be found?
I had the same problem and solved it by following this instruction: add one line — /usr/src/app/node_modules — to the volumes section of the docker-compose.yml file:
volumes:
  - ${PWD-.}/name_of_your_app:/usr/src/app
  - /usr/src/app/node_modules
Update: I just ended up mounting only the ./app/src folder as a volume, instead of ./app.
This way, the container's /usr/src/app/node_modules is not overridden by the host bind mount.
version: '3'
services:
  app:
    container_name: app
    restart: always
    build:
      context: ./app
      dockerfile: Dockerfile-dev
    ports:
      - "8080:8080"
    volumes:
      - ./app/src:/usr/src/app/src # <---- this
      - ./logs:/logs