The container ran fine just before I rebooted my machine, and now I suddenly get Could not locate Gemfile or .bundle/ directory when starting it.
The build completes without issue, and during it I can see that the contents of /app are correct, but on startup /app contains only a .bundle directory and nothing else.
UPDATE: It turns out the volume ./documents_api:/app is what isn't working. The environment is Docker for Windows 17.09.1, running as administrator.
Here is my folder structure:
./
  .env
  docker-compose.yml
  documents_api/
    <typical Rails directory contents>
    Dockerfile
.env just contains RAILS_ENV=development.
The Dockerfile contains:
FROM ruby:2.3.3
RUN apt-get update -qq && apt-get install -y build-essential libpq-dev nodejs
RUN mkdir /app
WORKDIR /app
ADD Gemfile /app/Gemfile
ADD Gemfile.lock /app/Gemfile.lock
RUN bundle install
ADD . /app
docker-compose.yml contains:
version: '3'
services:
  database:
    image: mongo
    volumes:
      - mongo:/var/lib/mongo
    env_file:
      - .env
    ports:
      - "27017:27017"
  documents:
    build: ./documents_api
    command: bundle exec rails s -p 3000 -b '0.0.0.0'
    volumes:
      - ./documents_api:/app
    env_file:
      - .env
    expose:
      - "3000"
    depends_on:
      - database
  frontend:
    image: nginx
    build: ./web
    depends_on:
      - documents
    ports:
      - "80:80"
      - "144:144"

# Persistence
volumes:
  mongo:
It turns out I needed to remap my shared drives in the Docker settings. Docker had lost the credentials after the reboot and was silently failing to map the volume.
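As a sanity check after re-sharing the drive (the documents service name comes from the compose file above), a one-off container shows whether the bind mount is being populated:

# Docker for Windows: Settings > Shared Drives, re-enter the credentials, Apply
# Then run the service with an override command and inspect the mount point
docker-compose run --rm documents ls /app    # should list the full Rails app, not just .bundle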
Related
I am dockerizing my Rails app, which uses a mountable engine, but I keep getting one error while running the image: The path `/app/include/engine` does not exist.
The image builds successfully, but running docker-compose up throws the path error.
Below I am attaching my Dockerfile and docker-compose.yml.
Dockerfile
FROM ruby:2.4.1
RUN apt-get update -qq && apt-get install -y build-essential libpq-dev nodejs
RUN mkdir /app
WORKDIR /app
ADD Gemfile /app/Gemfile
ADD Gemfile.lock /app/Gemfile.lock
COPY . .
RUN mkdir -p /app/include/engine
RUN git clone git@github.com:engine/engine.git /app/include/engine
RUN ls
RUN ls /app/include/engine
RUN DISABLE_SSL=true gem install puma -v 3.6.0
RUN bundle check || bundle install
CMD ["rails", "server", "-b", "0.0.0.0"]
docker-compose.yml
version: '2'
services:
  db:
    image: mysql:8.0.21
    restart: always
    environment:
      MYSQL_ROOT_PASSWORD: root#123
      MYSQL_DATABASE: prod
      MYSQL_USER: root
      MYSQL_PASSWORD: root#123
    ports:
      - "3307:3306"
  app:
    build:
      context: .
      dockerfile: Dockerfile
      args:
        SSH_PRIVATE_KEY: ${SSH_PRIVATE_KEY}
    volumes:
      - ".:/app"
    ports:
      - "3000:3000"
    depends_on:
      - db
    environment:
      key: value
I have also included my engine path in the Gemfile and followed all the steps for mounting the engine:
# engine
gem 'api', path: 'include/engine'
It works fine in the local environment, but it gives me an error in Docker.
Can someone please tell me what I am missing?
It's because of these lines:
volumes:
  - ".:/app"
This mounts your local directory inside the container at startup, overwriting the data already in the image. Everything in /app is replaced with data from your local machine, including your engine at /app/include/engine.
To fix this, you need to have the engine cloned into your local folder so it is available when the container starts. Another option is to clone the engine outside /app, for example into /tmp or wherever you like.
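As a rough sketch of the first option (the repository URL and the include/engine path are taken from the question; adjust them for your setup), clone the engine into the project directory on the host so the bind mount contains it:

# On the host, inside the Rails project that gets mounted to /app
mkdir -p include
git clone git@github.com:engine/engine.git include/engine
# Rebuild and start; the mounted directory now provides include/engine
docker-compose up --build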
I'm having problems building the container with MongoDB. When running docker-compose up I get the following error:
ERROR: Service 'app' failed to build: COPY failed: stat /var/lib/docker/tmp/docker-builder367230859/entrypoint.sh: no such file or directory
I tried changing Mongo to PostgreSQL, but the error continues.
My files are below; thanks in advance.
Here is my docker-compose.yml:
version: '3'
services:
  web:
    image: nginx
    restart: always
    # volumes:
    #   - ${APPLICATION}:/var/www/html
    #   - ${NGINX_HOST_LOG_PATH}:/var/log/nginx
    #   - ${NGINX_SITES_PATH}:/etc/nginx/conf.d
    ports:
      - "80:80"
      - "443:443"
    networks:
      - web
  mongo:
    image: mongo
    environment:
      MONGO_INITDB_ROOT_USERNAME: admin
      MONGO_INITDB_ROOT_PASSWORD: password
    ports:
      - "27017:27017"
    # volumes:
    #   - data:/data/db
    networks:
      - mongo
  app:
    build: .
    volumes:
      - .:/mm_api
    ports:
      - 3000:3000
    depends_on:
      - mongo

networks:
  web:
    driver: bridge
  mongo:
    driver: bridge
And here is my Dockerfile:
FROM ruby:2.7.0
RUN apt-get update -qq && apt-get install -y nodejs
RUN mkdir /mm_api
WORKDIR /mm_api
COPY Gemfile /mm_api/Gemfile
COPY Gemfile.lock /mm_api/Gemfile.lock
RUN bundle install
COPY . /mm_api
COPY entrypoint.sh /usr/bin/
RUN chmod +x /usr/bin/entrypoint.sh
ENTRYPOINT ["entrypoint.sh"]
EXPOSE 3000
CMD ["bundle", "exec", "puma", "-C", "config/puma,rb"]
#CMD ["rails", "server", "-b", "0.0.0.0"]
And this is my entrypoint.sh:
#!/bin/bash
set -e
rm -f /mm_api/tmp/pids/server.pid
exec "$#"
I had a similar issue when working on a Rails 6 application using Docker.
When I ran docker-compose build, I got the error:
Step 10/16 : COPY Gemfile Gemfile.lock ./
ERROR: Service 'app' failed to build : COPY failed: stat /var/lib/docker/tmp/docker-builder408411426/Gemfile.lock: no such file or directory
Here's how I fixed it:
The issue was that the Gemfile.lock was missing from my project directory. I had deleted it when I was having some issues with my gem dependencies.
All I had to do was run the command below to install the necessary gems and re-create the Gemfile.lock:
bundle install
This time when I ran docker-compose build, everything worked fine again.
So whenever you encounter this issue, check that the file is present in your directory and, most importantly, that the path you specified to the file is correct.
That's all.
I hope this helps
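In the same spirit, a quick sanity check from the project root before rebuilding (the .dockerignore check is an extra assumption on my part, since a file excluded there fails the COPY with the same stat error):

ls Gemfile Gemfile.lock entrypoint.sh    # confirm the files the Dockerfile COPYs actually exist
cat .dockerignore                        # if present, make sure none of them are excluded from the build context
bundle install                           # re-creates Gemfile.lock if it is missing
docker-compose build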
I am having trouble with a Rails container in a docker-compose network. I have not touched it in a few months, and when I attempted to start it this week, it failed.
When I attempt to start the container with docker-compose up service, the startup fails with:
service_1 | Could not locate Gemfile or .bundle/ directory
support_portal_service_1 exited with code 10
Both files are present; the host is a Windows 10 machine.
Bundle install completes successfully:
Bundle complete! 19 Gemfile dependencies, 83 gems now installed.
Bundled gems are installed into `/usr/local/bundle`
What I have tried:
Added ruby and ruby-all-dev to the apt-get install in case missing requirements were the issue
Changed ADD Gemfile /app/Gemfile to COPY Gemfile /app/Gemfile
Tried commenting out COPY Gemfile.lock /app/Gemfile.lock
Ran bundle install on the Windows 10 host
Rebuilt the container without cache
Ensured Docker has access to the drive/directory
Here is my Dockerfile:
FROM ruby:2.5.3
RUN apt-get update -qq && apt-get install -y build-essential libpq-dev nodejs default-libmysqlclient-dev ruby ruby-all-dev
RUN mkdir /app
WORKDIR /app
COPY Gemfile /app/Gemfile
#COPY Gemfile.lock /app/Gemfile.lock
WORKDIR /app
RUN bundle install
ADD . /app
And my docker-compose.yml:
version: '2'
services:
  # Structured database
  sqldb:
    image: mysql:5.7
    volumes:
      - sql:/var/lib/mysql
    env_file:
      - .env
    environment:
      - MYSQL_USER=web
      - MYSQL_ROOT_PASSWORD=${PORTAL_DATABASE_PASSWORD}
      - MYSQL_PASSWORD=${PORTAL_DATABASE_PASSWORD}
    ports:
      - "3306:3306"
  # Application server
  service:
    build: .
    command: bundle exec rails s -p 3000 -b '0.0.0.0'
    volumes:
      - .:/app
    env_file:
      - .env
    expose:
      - "3000"
    depends_on:
      - sqldb
  # Front end proxy
  web:
    image: nginx
    build:
      context: .
      dockerfile: Dockerfile-web
    depends_on:
      - service
    ports:
      - "80:80"
      - "144:144"

# Persistence
volumes:
  sql:
I am setting up docker-compose for an existing Ruby on Rails project. I am using docker-compose version 1.23.1, build b02f1306 and Docker version 18.09.0, build 4d60db4.
When I try to start my containers for development using docker-compose up --build, my web and worker containers exit with code 10. When I /bin/bash into them, the /web_gen folder contains only /tmp/db, with the Postgres files inside of that.
I can get the containers working by changing the volume to - /web_gen, but then the volume will not hot reload.
My docker-compose.yml:
version: '3'
services:
  web:
    build: .
    command: bundle exec rails s -p 3000 -b '0.0.0.0'
    volumes:
      - .:/web_gen
    ports:
      - "3000:3000"
    depends_on:
      - db
      - redis
  db:
    image: 'postgres:9.4.5'
    volumes:
      - ./tmp/db:/var/lib/postgresql/data
  redis:
    image: 'bitnami/redis:latest'
    environment:
      - ALLOW_EMPTY_PASSWORD=yes
  worker:
    build: .
    command: bundle exec sidekiq -c 1
    volumes:
      - .:/web_gen
    depends_on:
      - redis
Dockerfile
FROM ruby:2.3.3
RUN apt-get update -qq && apt-get install -y build-essential libpq-dev nodejs
RUN mkdir /web_gen
WORKDIR /web_gen
COPY Gemfile /web_gen/Gemfile
COPY Gemfile.lock /web_gen/Gemfile.lock
RUN bundle install
COPY . /web_gen
I am following this tutorial from Docker (Docker Rails); I have created a folder and added the code below to my Dockerfile.
FROM ruby:2.5
RUN apt-get update -qq && apt-get install -y build-essential libpq-dev nodejs
RUN mkdir /myapp
WORKDIR /myapp
COPY Gemfile /myapp/Gemfile
COPY Gemfile.lock /myapp/Gemfile.lock
RUN bundle install
COPY . /myapp
And my docker-compose file is:
version: '3'
services:
  db:
    image: postgres
    volumes:
      - .data:/var/lib/postgresql/data
  web:
    build: .
    command: bundle exec rails s -p 3000 -b '0.0.0.0'
    volumes:
      - .:/myapp
    ports:
      - "3000:3000"
    depends_on:
      - db
I am following the tutorial, but when I run docker-compose up I just see this error:
Could not connect to server: No such file or directory Is the server running locally and accepting connections on Unix domain socket "/var/run/postgresql/.s.PGSQL.5432"?
What is wrong here? I don't know how to inspect the error or how to fix this.
You need environment variables within your web container so that it knows how to connect to the db container.
version: '3'
services:
  db:
    image: postgres
    environment:
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=
    volumes:
      - ./data:/var/lib/postgresql/data
  web:
    build: .
    command: bundle exec rails s -p 3000 -b '0.0.0.0'
    environment:
      - PGHOST=db
      - PGUSER=postgres
    volumes:
      - .:/myapp
    ports:
      - "3000:3000"
    depends_on:
      - db
Also, go to your database.yml, set host to db, add the username and password, and then run the command again.
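A minimal sketch of what that database.yml change could look like, assuming the postgres user from the compose file above and a hypothetical database name of myapp_development:

# config/database.yml (development section)
development:
  adapter: postgresql
  encoding: unicode
  host: db                        # the compose service name, not localhost
  username: postgres
  password: ""
  database: myapp_development     # assumed name; use your app's actual database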