I am trying to build a container for a Ruby on Rails application, following the official Docker guide.
The problem is that in the Gemfile I specify gem 'rails', '~> 6.0', but when building the Docker image it overrides the Rails version and installs the latest version.
I have a few libraries that are not compatible with the latest version. How can I stop Docker from installing the latest version while building the container?
Dockerfile:
FROM ruby:3
RUN apt-get update -qq && apt-get install -y nodejs
WORKDIR /backend
COPY Gemfile /backend/Gemfile
COPY Gemfile.lock /backend/Gemfile.lock
RUN bundle install
COPY . /backend
# Add a script to be executed every time the container starts.
COPY entrypoint.sh /usr/bin/
RUN chmod +x /usr/bin/entrypoint.sh
ENTRYPOINT ["entrypoint.sh"]
EXPOSE 3000
# Start the main process.
CMD ["rails", "server", "-b", "0.0.0.0"]
docker-compose.yml
version: "3.9"
services:
  mongodb:
    image: "mongo"
    volumes:
      - "mongodb:/var/lib/mongodb/data"
    environment:
      MONGO_INITDB_ROOT_USERNAME: "root"
      MONGO_INITDB_ROOT_PASSWORD: "root"
    ports:
      - 2717:27017
  web:
    build: .
    volumes:
      - .:/backend
    ports:
      - "3000:3000"
    depends_on:
      - mongodb
volumes:
  mongodb:
~> 6.0 is a shortcut for ">= 6.0 and < 7.0". You need to be more specific if you don't want Rails 6.1: use ~> 6.0.0, which is equivalent to ">= 6.0.0 and < 6.1".
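For example, a Gemfile pinned to the 6.0.x series would look like this (a minimal sketch; the rest of the Gemfile is omitted):

source 'https://rubygems.org'

# ~> 6.0.0 allows 6.0.x patch releases but never 6.1 or 7
gem 'rails', '~> 6.0.0'

After tightening the constraint, run bundle update rails locally so that Gemfile.lock agrees with the Gemfile before rebuilding the image, since the Dockerfile copies Gemfile.lock into the build.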
I'm trying to create an environment for developing some Ruby on Rails applications using Docker.
I'm following the official guide for a Ruby on Rails application on the Docker website.
The following are my Gemfile, Dockerfile and docker-compose.yml.
source 'https://rubygems.org'
gem 'rails', '~>6'
# syntax=docker/dockerfile:1
FROM ruby:latest
RUN apt-get update -qq && apt-get install -y nodejs
WORKDIR /app
COPY Gemfile /app/Gemfile
COPY Gemfile.lock /app/Gemfile.lock
RUN bundle install
# Add a script to be executed every time the container starts.
COPY entrypoint.sh /usr/bin/
RUN chmod +x /usr/bin/entrypoint.sh
ENTRYPOINT ["entrypoint.sh"]
EXPOSE 3000
# Configure the main process to run when running the image
CMD ["rails", "server", "-b", "0.0.0.0"]
version: "3.9"
services:
  db:
    image: mysql:8
    restart: always
    ports:
      - 3306:3306
    volumes:
      - dbdata:/var/lib/mysql
    environment:
      MYSQL_ROOT_PASSWORD: password
      MYSQL_DATABASE: app_db
      MYSQL_USER: db_user
      MYSQL_PASSWORD: db_user_pass
  web:
    build: .
    command: bash -c "rm -f tmp/pids/server.pid && bundle exec rails s -p 3000 -b '0.0.0.0'"
    volumes:
      - .:/app
    ports:
      - "3000:3000"
    depends_on:
      - db
volumes:
  dbdata:
The only changes I have made to the guide are to use Rails 6, the latest version of Ruby, and MySQL instead of Postgres. When I try to run docker-compose build I get the following error:
ERROR [7/9] RUN bundle install
#17 0.505 /usr/local/lib/ruby/3.0.0/rubygems.rb:281:in `find_spec_for_exe': Could not find 'bundler' (1.17.3) required by your /app/Gemfile.lock. (Gem::GemNotFoundException)
#17 0.505 To update to the latest version installed on your system, run `bundle update --bundler`.
The error message is clear: I don't have the correct version of Bundler installed. At the bottom of my Gemfile.lock I have the following:
RUBY VERSION
ruby 2.5.9p229
BUNDLED WITH
1.17.3
I had assumed that my Ruby version would be the latest and that my Gemfile would have been bundled with Bundler > 2.0. I tried adding RUN gem bundle install to my Dockerfile, but that did not fix the issue. Is there a correct way to tell Docker to use the latest versions of Ruby, Rails and Bundler that are compatible with each other?
You can add these lines before RUN bundle install in your Dockerfile:
ENV BUNDLE_VERSION 1.17.3
RUN gem install bundler --version "$BUNDLE_VERSION"
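This pins Bundler inside the image to the version recorded in the BUNDLED WITH section of Gemfile.lock. Alternatively (a sketch that is not part of the original answer), you can regenerate the lockfile locally with a current Bundler and leave the image on the latest version:

# run locally, then rebuild the image
gem install bundler
bundle update --bundler   # rewrites the BUNDLED WITH section of Gemfile.lock
docker-compose build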
I am dockerizing my Rails app with a mountable engine, but I keep getting an error while running the image: The path `/app/include/engine` does not exist.
The image builds successfully, but docker-compose up fails with that path error.
Below I am attaching my Dockerfile and docker-compose.yml.
Dockerfile
FROM ruby:2.4.1
RUN apt-get update -qq && apt-get install -y build-essential libpq-dev nodejs
RUN mkdir /app
WORKDIR /app
ADD Gemfile /app/Gemfile
ADD Gemfile.lock /app/Gemfile.lock
COPY . .
RUN mkdir -p /app/include/engine
RUN git clone git@github.com:engine/engine.git /app/include/engine
RUN ls
RUN ls /app/include/engine
RUN DISABLE_SSL=true gem install puma -v 3.6.0
RUN bundle check || bundle install
CMD ["rails", "server", "-b", "0.0.0.0"]
docker-compose.yml
version: '2'
services:
  db:
    image: mysql:8.0.21
    restart: always
    environment:
      MYSQL_ROOT_PASSWORD: root#123
      MYSQL_DATABASE: prod
      MYSQL_USER: root
      MYSQL_PASSWORD: root#123
    ports:
      - "3307:3306"
  app:
    build:
      context: .
      dockerfile: Dockerfile
      args:
        SSH_PRIVATE_KEY: ${SSH_PRIVATE_KEY}
    volumes:
      - ".:/app"
    ports:
      - "3000:3000"
    depends_on:
      - db
    environment:
      key: value
I have also included the engine path in my Gemfile and followed all the steps for mounting an engine:
# engine
gem 'api', path: 'include/engine'
It works fine in the local environment, but it gives me this error in Docker.
Can someone please help me figure out what I am missing?
It is because of these lines:
volumes:
  - ".:/app"
This mounts your local directory inside the container at startup, overwriting the data that is already in the image. That means everything in /app is replaced with data from your local machine, including your engine at /app/include/engine.
To fix this, you need to have the engine cloned in your local folder so it is available when the container starts. Another option is to clone the engine outside /app, for example into /tmp or wherever you like.
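A minimal sketch of the second option, assuming the engine is cloned to /opt/engine (the path is illustrative) and the Gemfile path is updated to match:

# In the Dockerfile: clone the engine outside the bind-mounted /app directory
RUN git clone git@github.com:engine/engine.git /opt/engine

# In the Gemfile: point the engine gem at the new location
gem 'api', path: '/opt/engine'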
I am trying to create my Rails application in a Docker environment. I have used volumes to mount source directories from the host at a targeted path inside the container. The application is in the development phase and I need to continuously add new gems to it. I install a gem from the bash of my running container, and it installs the gem and the required dependencies. But when I removed the running containers (docker-compose down) and then instantiated them again (docker-compose up), my Rails web image shows errors about missing gems. I know re-building the image will add the gems, but is there any way to add gems without rebuilding the image?
I followed the docker-compose docs for setting up the Rails app:
https://docs.docker.com/compose/rails/#define-the-project
Dockerfile
FROM ruby:2.7.1-slim-buster
LABEL MAINTAINER "Prayas Arora" "<prayasa@mindfiresolutions.com>"
# Install apt based dependencies required to run Rails as
# well as RubyGems. As the Ruby image itself is based on a
# Debian image, we use apt-get to install those.
RUN apt-get update \
&& apt-get install -qq -y --no-install-recommends \
build-essential \
libpq-dev \
netcat \
postgresql-client \
nodejs \
&& rm -rf /var/lib/apt/lists/*
ENV APP_HOME /var/www/repository/repository_api
# Configure the main working directory. This is the base
# directory used in any further RUN, COPY, and ENTRYPOINT
# commands.
RUN mkdir -p $APP_HOME
WORKDIR $APP_HOME
# Copy the Gemfile as well as the Gemfile.lock and install
# the RubyGems. This is a separate step so the dependencies
# will be cached unless changes to one of those two files
# are made.
COPY ./repository_api/Gemfile $APP_HOME/Gemfile
COPY ./repository_api/Gemfile.lock $APP_HOME/Gemfile.lock
RUN bundle install
# Copy the main application.
COPY ./repository_api $APP_HOME
# Add a script to be executed every time the container starts.
COPY ./repository_docker/development/repository_api/entrypoint.sh /usr/bin/
RUN chmod +x /usr/bin/entrypoint.sh
ENTRYPOINT ["entrypoint.sh"]
# Expose port 3000 to the Docker host, so we can access it
# from the outside.
EXPOSE 3000
# The main command to run when the container starts. Also
# tell the Rails dev server to bind to all interfaces by
# default.
CMD ["rails","server","-b","0.0.0.0"]
docker-compose.yml
container_name: repository_api
build:
  context: ../..
  dockerfile: repository_docker/development/repository_api/Dockerfile
user: $UID
env_file: .env
stdin_open: true
environment:
  DB_NAME: ${POSTGRES_DB}
  DB_PASSWORD: ${POSTGRES_PASSWORD}
  DB_USER: ${POSTGRES_USER}
  DB_HOST: ${POSTGRES_DB}
volumes:
  - ../../repository_api:/var/www/repository/repository_api
networks:
  - proxy
  - internal
depends_on:
  - repository_db
A simple solution is to cache the gems in a Docker volume. You can create a volume and attach it to the path where Bundler installs gems. This maintains shared state, so you will not have to reinstall the gems in every container you spin up.
container_name: repository_api
build:
  context: ../..
  dockerfile: repository_docker/development/repository_api/Dockerfile
user: $UID
env_file: .env
stdin_open: true
environment:
  DB_NAME: ${POSTGRES_DB}
  DB_PASSWORD: ${POSTGRES_PASSWORD}
  DB_USER: ${POSTGRES_USER}
  DB_HOST: ${POSTGRES_DB}
volumes:
  - ../../repository_api:/var/www/repository/repository_api
  - bundle_cache:/usr/local/bundle
networks:
  - proxy
  - internal
.
.
volumes:
  bundle_cache:
Also, according to bundler.io, the official Docker images for Ruby assume that you will use only one application, with one Gemfile, and that no other gems or Ruby applications will be installed or run in your container. So once you have added all the gems your application needs, you can remove this bundle_cache volume and rebuild your image with your final Gemfile.
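With the cache volume in place, a typical workflow might look like the following (a sketch; it assumes the compose service is named repository_api):

# add the new gem to the Gemfile, then install it inside the running container;
# the gems land in the bundle_cache volume and survive docker-compose down/up
docker-compose exec repository_api bundle install

# or, if the containers are not running:
docker-compose run --rm repository_api bundle install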
I'm having problems building the container with MongoDB; when I run docker-compose up I get the following error:
ERROR: Service 'app' failed to build: COPY failed: stat /var/lib/docker/tmp/docker-builder367230859/entrypoint.sh: no such file or directory
I tried changing Mongo to PostgreSQL, but the error continues.
My files are below, thanks in advance.
This is the docker-compose.yml:
version: '3'
services:
  web:
    image: nginx
    restart: always
    # volumes:
    #   - ${APPLICATION}:/var/www/html
    #   - ${NGINX_HOST_LOG_PATH}:/var/log/nginx
    #   - ${NGINX_SITES_PATH}:/etc/nginx/conf.d
    ports:
      - "80:80"
      - "443:443"
    networks:
      - web
  mongo:
    image: mongo
    environment:
      MONGO_INITDB_ROOT_USERNAME: admin
      MONGO_INITDB_ROOT_PASSWORD: password
    ports:
      - "27017:27017"
    # volumes:
    #   - data:/data/db
    networks:
      - mongo
  app:
    build: .
    volumes:
      - .:/mm_api
    ports:
      - 3000:3000
    depends_on:
      - mongo
networks:
  web:
    driver: bridge
  mongo:
    driver: bridge
This is the Dockerfile:
FROM ruby:2.7.0
RUN apt-get update -qq && apt-get install -y nodejs
RUN mkdir /mm_api
WORKDIR /mm_api
COPY Gemfile /mm_api/Gemfile
COPY Gemfile.lock /mm_api/Gemfile.lock
RUN bundle install
COPY . /mm_api
COPY entrypoint.sh /usr/bin/
RUN chmod +x /usr/bin/entrypoint.sh
ENTRYPOINT ["entrypoint.sh"]
EXPOSE 3000
CMD ["bundle", "exec", "puma", "-C", "config/puma.rb"]
#CMD ["rails", "server", "-b", "0.0.0.0"]
This is the entrypoint.sh:
#!/bin/bash
set -e
rm -f /mm_api/tmp/pids/server.pid
exec "$@"
I had a similar issue when working on a Rails 6 application using Docker.
When I run docker-compose build, I get the error:
Step 10/16 : COPY Gemfile Gemfile.lock ./
ERROR: Service 'app' failed to build : COPY failed: stat /var/lib/docker/tmp/docker-builder408411426/Gemfile.lock: no such file or directory
Here's how I fixed it:
The issue was that the Gemfile.lock was missing from my project directory. I had deleted it when I was having some issues with my gem dependencies.
All I had to do was to run the command below to install the necessary gems and then re-create the Gemfile.lock:
bundle install
And then when I ran docker-compose build again, everything worked fine.
So whenever you encounter this issue, check that the file is present in your directory and, most importantly, that the path you specified to the file is correct.
That's all.
I hope this helps
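As a quick sanity check before rebuilding (the commands are illustrative), confirm that the files the Dockerfile copies really exist in the build context:

# run from the build context root (the directory containing the Dockerfile)
ls entrypoint.sh Gemfile Gemfile.lock
bundle install        # re-creates Gemfile.lock if it is missing
docker-compose build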
I am having trouble with a Rails container in a docker-compose network. I have not touched it in a few months, and when I attempted to start it this week it failed.
When I attempt to start the container with docker-compose up service the startup fails with:
service_1 | Could not locate Gemfile or .bundle/ directory
support_portal_service_1 exited with code 10
Both files are present; the host is a Windows 10 machine.
bundle install completes successfully:
Bundle complete! 19 Gemfile dependencies, 83 gems now installed.
Bundled gems are installed into `/usr/local/bundle`
What I have tried:
Added ruby and ruby-all-dev to apt-get install in case missing requirements were the issue
Changed ADD Gemfile /app/Gemfile to COPY Gemfile /app/Gemfile
Tried commenting out COPY Gemfile.lock /app/Gemfile.lock
Ran bundle install on the Windows 10 host
Rebuilt the container without cache
Ensured Docker has access to the drive/directory
Here is my Dockerfile:
FROM ruby:2.5.3
RUN apt-get update -qq && apt-get install -y build-essential libpq-dev nodejs default-libmysqlclient-dev ruby ruby-all-dev
RUN mkdir /app
WORKDIR /app
COPY Gemfile /app/Gemfile
#COPY Gemfile.lock /app/Gemfile.lock
WORKDIR /app
RUN bundle install
ADD . /app
And my docker-compose.yml:
version: '2'
services:
  # Structured database
  sqldb:
    image: mysql:5.7
    volumes:
      - sql:/var/lib/mysql
    env_file:
      - .env
    environment:
      - MYSQL_USER=web
      - MYSQL_ROOT_PASSWORD=${PORTAL_DATABASE_PASSWORD}
      - MYSQL_PASSWORD=${PORTAL_DATABASE_PASSWORD}
    ports:
      - "3306:3306"
  # Application server
  service:
    build: .
    command: bundle exec rails s -p 3000 -b '0.0.0.0'
    volumes:
      - .:/app
    env_file:
      - .env
    expose:
      - "3000"
    depends_on:
      - sqldb
  # Front end proxy
  web:
    image: nginx
    build:
      context: .
      dockerfile: Dockerfile-web
    depends_on:
      - service
    ports:
      - "80:80"
      - "144:144"
# Persistence
volumes:
  sql: