docker-compose complains about missing ']' - docker

I'm trying to get nodemon and docker-compose to work with each other, and I'm having a problem: every time I run docker-compose up I get the following error: sh: missing ]. Here are the relevant files:
Dockerfile
FROM node:11.1.0-alpine
MAINTAINER TileHalo
WORKDIR /usr/src/app
RUN npm install -g nodemon
COPY package*.json ./
RUN npm install --save
COPY . .
EXPOSE 3000
CMD [ "nodemon", "./bin/www"]
and docker-compose.yml
version: "2"
services:
web:
build: .
ports:
- "3000:3000"
depends_on:
- postgres
volumes:
- .:/usr/src/app
postgres:
image: "postgres:alpine"
environment:
POSTGRES_PASSWORD: supersalainen
POSTGRES_USER: kipa
EDIT: Works on pure docker after the comma fix, but doesn't work on docker-compose
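One difference between running the image with plain docker and running it under compose is the .:/usr/src/app bind mount, which replaces the image's /usr/src/app (including the node_modules installed during the build) with the host directory. A minimal sketch of the usual workaround, assuming that hidden node_modules is what trips the container up under compose; whether it explains the missing ']' message itself is an open question:
version: "2"
services:
  web:
    build: .
    ports:
      - "3000:3000"
    depends_on:
      - postgres
    volumes:
      - .:/usr/src/app
      # anonymous volume so the bind mount above does not hide
      # the node_modules installed in the image at build time
      - /usr/src/app/node_modules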

Related

How to run db-migrate up on Dockerfile?

When I build the following Dockerfile, the process stops in the middle, right after installing Prisma, at the RUN db-migrate up step. But when I ran it with docker exec and a shell inside the running container, it worked without any problem. I don't think I can run the app before serving it, but there is a workaround: putting the migration commands as a service in docker-compose.yml. How can I achieve that, or is there any other way to run those migration RUN commands from this Dockerfile? (A sketch of the compose-service approach follows the docker-compose.yml below.)
Dockerfile
FROM node:16.15.0-alpine
WORKDIR /app
COPY package*.json ./
# generated prisma files
COPY prisma ./prisma/
# COPY ENV variable
COPY .env ./
# COPY
COPY . .
RUN npm install
RUN npm install -g db-migrate
RUN npm install -g prisma
RUN db-migrate up
RUN prisma db pull
RUN prisma generate
EXPOSE 3000
CMD ["npm", "run", "dev"]
docker-compose.yml
version: '3.8'
services:
  mysqldb:
    image: mysql:5.7
    restart: unless-stopped
    env_file: ./.env
    environment:
      - MYSQL_ROOT_PASSWORD=$MYSQLDB_ROOT_PASSWORD
      - MYSQL_DATABASE=$MYSQLDB_DATABASE
    ports:
      - $MYSQLDB_LOCAL_PORT:$MYSQLDB_DOCKER_PORT
    volumes:
      - db:/var/lib/mysql
  auth:
    depends_on:
      - mysqldb
    build: ./auth
    restart: unless-stopped
    env_file: ./.env
    ports:
      - $NODE_LOCAL_PORT:$NODE_DOCKER_PORT
    environment:
      - DB_HOST=mysqldb
      - DB_USER=$MYSQLDB_USER
      - DB_PASSWORD=$MYSQLDB_ROOT_PASSWORD
      - DB_NAME=$MYSQLDB_DATABASE
      - DB_PORT=$MYSQLDB_DOCKER_PORT
    stdin_open: true
    tty: true
volumes:
  db:
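A minimal sketch of the workaround mentioned in the question: drop the RUN db-migrate up / prisma lines from the Dockerfile and run them as a one-shot compose service instead, so they execute against the running database rather than at image-build time, when no database is reachable. The service name migrate and the reuse of the ./auth build context are assumptions for illustration; also note that depends_on only orders startup and does not wait for MySQL to be ready.
  # add this under services: in docker-compose.yml
  migrate:
    build: ./auth                # assumed: same image as auth, which already has db-migrate and prisma installed globally
    env_file: ./.env
    depends_on:
      - mysqldb
    command: sh -c "db-migrate up && prisma db pull && prisma generate"
    restart: "no"                # run once, do not keep restarting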

AdonisJS 5 Dockerfile - Cannot find ace

I have a docker setup with AdonisJS 5. I'm currently trying to get it started (and it builds just fine). But as discussed a bit further down, the ace command cannot be found. ace is the CLI package for AdonisJS (think ng for Angular)
This is my Dockerfile:
ARG NODE_IMAGE=node:16.13.1-alpine
FROM $NODE_IMAGE AS base
RUN apk --no-cache add dumb-init
RUN mkdir -p /home/node/app && chown node:node /home/node/app
WORKDIR /home/node/app
USER node
RUN mkdir tmp
FROM base AS dependencies
COPY --chown=node:node ./package*.json ./
RUN npm ci
RUN npm i @adonisjs/cli
COPY --chown=node:node . .
FROM dependencies AS build
RUN node ace build --production
FROM base AS production
ENV NODE_ENV=production
ENV PORT=$PORT
ENV HOST=0.0.0.0
COPY --chown=node:node ./package*.json ./
RUN npm ci --production
COPY --chown=node:node --from=build /home/node/app/build .
EXPOSE $PORT
CMD [ "dumb-init", "node", "service/build/server.js" ]
And this is my docker-compose.yml:
version: '3.9'
services:
  postgres:
    container_name: postgres
    image: postgres
    restart: always
    environment:
      - POSTGRES_PASSWORD=${POSTGRES_PASSWORD:-password}
      - POSTGRES_USER=${POSTGRES_USER:-user}
    networks:
      - family-service-network
    volumes:
      - fn-db_volume:/var/lib/postgresql/data
  adminer:
    container_name: adminer
    image: adminer
    restart: always
    networks:
      - family-service-network
    ports:
      - 8080:8080
  minio:
    container_name: storage
    image: 'bitnami/minio:latest'
    ports:
      - '9000:9000'
      - '9001:9001'
    environment:
      - MINIO_ROOT_USER=user
      - MINIO_ROOT_PASSWORD=password
      - MINIO_SERVER_ACCESS_KEY=access-key
      - MINIO_SERVER_SECRET_KEY=secret-key
    networks:
      - family-service-network
    volumes:
      - fn-s3_volume:/var/lib/postgresql/data
  fn_service:
    container_name: fn_service
    restart: always
    build:
      context: ./service
      target: dependencies
    ports:
      - ${PORT:-3333}:${PORT:-3333}
      - 9229:9229
    networks:
      - family-service-network
    env_file:
      - ./service/.env
    volumes:
      - ./:/home/node/app
      - /home/node/app/node_modules
    depends_on:
      - postgres
    command: dumb-init node ace serve --watch --node-args="--inspect=0.0.0.0"
volumes:
  fn-db_volume:
  fn-s3_volume:
networks:
  family-service-network:
When I run this with docker-compose up everything works, except for the fn_service.
I get the error:
node:internal/modules/cjs/loader:936
  throw err;
  ^

Error: Cannot find module '/home/node/app/ace'
    at Function.Module._resolveFilename (node:internal/modules/cjs/loader:933:15)
    at Function.Module._load (node:internal/modules/cjs/loader:778:27)
    at Function.executeUserEntryPoint [as runMain] (node:internal/modules/run_main:77:12)
    at node:internal/main/run_main_module:17:47 {
  code: 'MODULE_NOT_FOUND',
  requireStack: []
}
I followed this tutorial, and I can't seem to find anything by googling. I'm sure it's something minuscule.
Any help would be appreciated.
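One thing that stands out in the compose file: fn_service is built from ./service, but the bind mount points the repository root (./) at /home/node/app, so whatever sits next to docker-compose.yml, not the AdonisJS project, ends up in the container, and node ace cannot find the ace file there. A hedged sketch of the volume change, assuming the AdonisJS project (including its ace file) actually lives in ./service, matching the build context:
  fn_service:
    build:
      context: ./service
      target: dependencies
    volumes:
      # assumed: mount the AdonisJS project folder itself, matching the build context,
      # so /home/node/app/ace exists inside the container
      - ./service:/home/node/app
      - /home/node/app/node_modules
    command: dumb-init node ace serve --watch --node-args="--inspect=0.0.0.0"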

Run commands on docker container and sync automatically with host

I Dockerized a MENN (Next.js) stack app, and now everything works fine. I run into issues when I need to install npm packages. Let me first show you the structure:
src/server/Dockerfile
FROM node:14-alpine
WORKDIR /usr/app
COPY package*.json ./
RUN npm install -qyg nodemon@2.0.7
RUN npm install -qy
COPY . .
CMD ["npm", "run", "dev"]
src/client/Dockerfile
FROM node:14-alpine
WORKDIR /usr/app
COPY package*.json ./
RUN npm install -qy
COPY . .
CMD ["npm", "run", "dev"]
src/docker-compose.yml
version: "3"
services:
client:
build:
context: ./client
dockerfile: Dockerfile
ports:
- 3000:3000
networks:
- mern-network
volumes:
- ./client/src:/usr/app/src
- ./client/public:/usr/app/public
depends_on:
- server
environment:
- REACT_APP_SERVER=http://localhost:5000
- CHOKIDAR_USEPOLLING=true
command: npm run dev
stdin_open: true
tty: true
server:
build:
context: ./server
dockerfile: Dockerfile
ports:
- 5000:5000
networks:
- mern-network
volumes:
- ./server/src:/usr/app/src
depends_on:
- db
environment:
- MONGO_URL=mongodb://db:27017
- CLIENT=http://localhost:3000
command: /usr/app/node_modules/.bin/nodemon -L src/index.js
db:
image: mongo:latest
ports:
- 27017:27017
networks:
- mern-network
volumes:
- mongo-data:/data/db
networks:
mern-network:
driver: bridge
volumes:
mongo-data:
driver: local
Now, if I install any packages on the host machine, package.json is updated as expected, and if I run
docker-compose build
package.json is also updated inside the container, which is fine. But I feel like this kind of breaks the whole point of having the app Dockerized: if multiple developers need to work on this app and they all have to install Node/npm on their machines, what's the point of using Docker other than for deployments? So what I do right now is
sudo docker exec -it cebc4bcd9af6 sh   # log into the server container
and run a command, e.g.
npm i express
It installs the package and updates package.json inside the container, but the host's package.json is not updated, and if I run the build command again all changes are lost because the Dockerfile copies the host's source code into the container. Is there a way to synchronize the container and the host, so that installing a package inside the container also updates the host files? That way I wouldn't need Node/npm installed locally, which fulfills the purpose of having the app Dockerized.
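One way to get container-side installs back onto the host is to widen the server's bind mount so package.json itself lives on the mount, while keeping node_modules in an anonymous volume so the mount doesn't hide it. A sketch under those assumptions (paths are illustrative, matching the WORKDIR /usr/app from the Dockerfile):
  server:
    build:
      context: ./server
      dockerfile: Dockerfile
    volumes:
      # mount the whole server folder so package.json / package-lock.json written
      # inside the container also land on the host
      - ./server:/usr/app
      # keep node_modules container-only so the bind mount does not hide it
      - /usr/app/node_modules
    command: /usr/app/node_modules/.bin/nodemon -L src/index.js
    # then install packages from inside the running container, e.g.:
    #   docker-compose exec server npm install express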

Docker Error: Cannot create container for service <container name>: status code not OK but 500

I used docker-compose to create Docker containers for my React, Node.js, and Postgres project.
After I created the Dockerfile and docker-compose.yml, I ran docker-compose up --build.
Then I wasn't able to create the containers and got errors (screenshots: Error 1, Error 2, Error 3).
How can I fix this and successfully build the containers?
Here is a docker-compose.yml file in './'
version: '3'
services:
  server:
    container_name: mylivingcity_server
    build: ./server
    expose:
      - 3001
    ports:
      - 3001:3001
    volumes:
      - ./server/config:/usr/src/app/server/config
      - ./server/controllers:/usr/src/app/server/controllers
      - ./server/db:/usr/src/app/server/db
    command: npm run start
  postgres:
    image: postgres:12
    container_name: mylivingcity_postgres
    ports:
      - 5432:5432
    volumes:
      - ./postgres/data:/var/lib/postgresql/data
    environment:
      - POSTGRES_DB=mylivingcity
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
  frontend:
    container_name: mylivingcity_frontend
    build: ./frontend
    expose:
      - 3000
    ports:
      - 3000:3000
    volumes:
      - ./frontend/src:/usr/src/app/frontend/src
      - ./frontend/public:/usr/src/app/frontend/public
    command: npm start
    stdin_open: true
Here is a Dockerfile in './frontend'
FROM node:12
# Create frontend directory
RUN mkdir -p /usr/src/app/frontend/
WORKDIR /usr/src/app/frontend/
# Install dependencies
COPY package*.json /usr/src/app/frontend/
RUN npm install
COPY . /usr/src/app/frontend/
CMD [ "npm" , "start" ]
Here is a Dockerfile in './server'
FROM node:12
# Create server directory
RUN mkdir -p /usr/src/app/server/
WORKDIR /usr/src/app/server/
# Install dependencies
COPY package*.json /usr/src/app/server/
RUN npm install
COPY . /usr/src/app/server/
CMD [ "npm" , "run" , "start" ]

How to run gulp after "docker-compose up"?

Can you help with running gulp tasks inside the Docker container with docker-compose, so it would compile SCSS files?
My file structure at host machine:
/application
--/config
--/models
--/public
----/scss
----/css
----index.html
----gulpfile.js
--/routes
.dockerignore
.gitignore
Dockerfile
docker-compose.yml
package.json
server.js
/application/public/gulpfile.js:
var gulp = require('gulp');
var sass = require('gulp-sass');

gulp.task('sass', function() {
  gulp.src('./scss/styles.scss')
    .pipe(sass().on('error', sass.logError))
    .pipe(gulp.dest('./css'));
});

gulp.task('sass:watch', function() {
  gulp.watch('./scss/styles.scss', ['sass']);
});
Dockerfile:
FROM node:6
RUN mkdir -p /app
WORKDIR /app
COPY . /app
RUN npm install nodemon -g && npm install bower -g && npm install gulp -g
RUN cd /app
RUN npm install
RUN cd /app/public && bower install --allow-root
RUN cd /app
COPY . /app
docker-compose.yml:
version: '2'
services:
  web:
    build: .
    command: nodemon /app/server.js
    volumes:
      - .:/app/
      - /app/node_modules
      - /app/public
      - /app/config
      - /app/models
      - /app/routes
    ports:
      - "3000:3000"
    depends_on:
      - mongo
  mongo:
    image: mongo:latest
    ports:
      - "27018:27017"
    restart: always
Ideally, it would be good to run the command from the docker-compose.yml file to start watching the SCSS files inside the /application/public folder, but I've wasted a couple of days trying to solve this problem.
Also, I tried running gulp inside the container. It actually works OK, but the changes were not reflected on the host machine.
Please don't suggest ready-made Docker Hub images; I have used them and they did not solve my issue.
I'll be thankful for any help, links, info, or ready solutions.
You can add a separate service in your compose file to run a separate command in a separate container (see the web: &web line and the gulp service below; a sketch of a concrete gulp command follows the compose file).
version: '2'
services:
  web: &web
    build: .
    command: nodemon /app/server.js
    volumes:
      - .:/app/
      - /app/node_modules
      - /app/public
      - /app/config
      - /app/models
      - /app/routes
    ports:
      - "3000:3000"
    depends_on:
      - mongo
  gulp:
    <<: *web
    command: <gulp command to watch files>
  mongo:
    image: mongo:latest
    ports:
      - "27018:27017"
    restart: always
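Based on the gulpfile shown in the question, the gulp service could look something like the following; the working_dir value and the sass:watch task name come from the question's layout and gulpfile, but treat this as an illustrative sketch rather than a drop-in answer. Note also that the anonymous /app/public volume in the original compose file keeps that folder container-only, which would explain why the compiled CSS never shows up on the host.
  gulp:
    <<: *web
    working_dir: /app/public     # assumed: gulpfile.js lives in /app/public
    command: gulp sass:watch     # the watch task defined in the question's gulpfile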
