Separate Dockerfiles for dev and prod - docker

I am new to Docker, so please bear with me :)
Is there a way to create two different Dockerfiles that inherit from one base?
For example, we have two environments, develop and production, and their base is the same:
FROM gcc
# just an example: the same base packages are needed in both environments
RUN apt-get update && apt-get install -y libboost-all-dev
For "develop" I have to install some utilities like gdb, valgrind etc.
For "production" I have to build an application. It thought to use "multi stage builds", but it runs steps in Dockerfile consistently. How I should do if I do not want to build an application in "develop"?
The first build the base image:
build -t base_image .
And then for each Dockerfile use it?
# for develop
FROM base_image
RUN apt-get install -y gdb
# for prod
FROM base_image
RUN make
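Multi-stage builds can actually do what you want here: docker build --target <stage> builds only up to the named stage, so a develop build never runs the production steps (with BuildKit enabled, only the stages the target depends on are built at all). A minimal single-Dockerfile sketch; the stage names and the make step are illustrative, not taken from your project:

FROM gcc AS base
RUN apt-get update && apt-get install -y libboost-all-dev

FROM base AS develop
RUN apt-get install -y gdb valgrind

FROM base AS prod
COPY . /src
WORKDIR /src
RUN make

Each flavor is then built from the same file:

# build only the develop image
docker build --target develop -t myapp:dev .
# build only the production image
docker build --target prod -t myapp:prod .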

Alternatively, here is the two-Dockerfile example I'm currently using.
Base image Dockerfile:
FROM python:3.6-slim as base
RUN apt-get update \
 && apt-get install --no-install-recommends -y git-core build-essential \
 && apt-get autoclean
# ...
Prod image Dockerfile:
FROM your-registry/base:0.0.0 as prod
# your code
# ...
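To wire the two files together, build and push the base image first, then build the production image on top of it. The registry name, tags, and -f file names below are placeholders:

docker build -t your-registry/base:0.0.0 -f Dockerfile.base .
docker push your-registry/base:0.0.0
docker build -t your-registry/app:0.0.0 -f Dockerfile.prod .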
Hope this helps.

Related

Failed to load platform plugin xcb while launching pyqt5 app on ubuntu on docker container

I am trying to run a GUI using PyQt5 in a docker container. Everything works fine, but when I actually run the container using the docker-compose up command I get an error that says:
qt.qpa.xcb: could not connect to display
qt.qpa.plugin: Could not load the Qt platform plugin "xcb" in "" even though it was found.
This application failed to start because no Qt platform plugin could be initialized. Reinstalling the application may fix this problem.
Available platform plugins are: eglfs, linuxfb, minimal, minimalegl, offscreen, vnc, xcb.
Can someone help me fix this?
Note: I have already tried these solutions and none of them worked for me:
Solution 1
Solution 2
Solution 3
This is my Dockerfile:
FROM ubuntu:latest
# Preparing work environment
ADD server.py .
ADD test.py .
RUN apt-get update
RUN apt-get -y upgrade
RUN apt-get -y install python3
RUN apt-get -y install python3-pip
# Install the Qt bindings
RUN apt-get -y install python3-pyqt5
This is the docker-compose part:
test:
  container_name: test
  image: image
  command: python3 test.py
  ports:
    - 4000:4000/tcp
  networks:
    pNetwork1:
      ipv4_address: 10.1.0.3
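For reference, the usual way to let a containerized Qt app reach the host's X server is to pass DISPLAY into the container and mount the X11 socket; this is a sketch of that common approach, not a verified fix for this exact setup:

test:
  container_name: test
  image: image
  command: python3 test.py
  environment:
    - DISPLAY=${DISPLAY}              # hand the host's X display to Qt
  volumes:
    - /tmp/.X11-unix:/tmp/.X11-unix   # share the host's X server socket

On the host, xhost +local: (or a narrower access rule) is typically also needed so the container is allowed to connect.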

Docker build on top of an existing docker image

I have a nodejs app wrapped in a docker image. Whenever I change the code, even adding a few console.logs, I need to rebuild the whole image during the CI process, a long process of over 10 minutes.
Is there a way to build one image on top of another, say adding only the delta?
Thanks
EDIT in response to @The Fool's comment:
I'm running the CI using AWS tools, mainly CodeBuild. I can use the latest docker image I created (all are stored in AWS ECR) and build based on it, but would it know to take only the delta, even if it conflicts with the new code?
Following is my Dockerfile:
FROM node:15.3.0
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
COPY package.json /usr/src/app
RUN apt-get update
RUN apt-get install -y build-essential libcairo2-dev libpango1.0-dev libjpeg-dev libgif-dev librsvg2-dev
RUN npm install
COPY . /usr/src/app
EXPOSE 3000
CMD bash -c "npm run sequelize db:migrate&&npm run sequelize db:seed:all&&npm run prod"
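For reference, the usual approach on a CI runner that starts with an empty layer cache is to pull the previous image from ECR and pass it to docker build --cache-from: every layer above COPY . /usr/src/app (the apt-get and npm install steps) is then reused, and only the code copy onward is rebuilt. A sketch with placeholder registry names:

# seed the local layer cache with the last pushed image
docker pull 123456789012.dkr.ecr.us-east-1.amazonaws.com/myapp:latest || true
docker build \
  --cache-from 123456789012.dkr.ecr.us-east-1.amazonaws.com/myapp:latest \
  -t 123456789012.dkr.ecr.us-east-1.amazonaws.com/myapp:latest .
docker push 123456789012.dkr.ecr.us-east-1.amazonaws.com/myapp:latest

Conflicting new code is not a problem: Docker compares each instruction (and the checksums of copied files) and rebuilds from the first layer that differs.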

How to cache deployment image in gitlab-ci?

I'm creating a gitlab-ci deployment stage that requires more libraries than exist in my image. In this example I'm adding ssh (in the real world, I want to add many more libs):
image: adoptopenjdk/maven-openjdk11
...
deploy:
  stage: deploy
  script:
    - which ssh || (apt-get update -y && apt-get install -y ssh)
    - chmod 600 ${SSH_PRIVATE_KEY}
...
Question: how can I tell gitlab runner to cache the image that I'm building in the deploy stage, and reuse it for all deployment runs in future? Because as written, the library installation takes place for each and every deployment, even if nothing changed between runs.
GitLab can only cache files/directories, but because of the way apt works, there is no easy way to tell it to cache installs you've done this way. You also cannot "cache" the image.
There are two options I see:
Create or use a docker image that already includes your dependencies.
FROM adoptopenjdk/maven-openjdk11
RUN apt update && apt install -y foo bar baz
Then build/push the image to Docker Hub, and change the image: in the yaml:
image: membersound/maven-openjdk11-with-deps:latest
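The build-and-push step mentioned above would look something like this, using the image name from the example:

docker build -t membersound/maven-openjdk11-with-deps:latest .
docker push membersound/maven-openjdk11-with-deps:latest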
OR simply choose an image that already has all the dependencies you want! There are many useful docker images out there with useful tools installed. For example octopusdeploy/worker-tools comes with many runtimes and tools installed (java, python, AWS CLI, kubectl, and much more).
Attempt to cache the deb packages and install from them (beware, this is ugly).
Commit a bash script like the following to a file such as install-deps.sh:
#!/usr/bin/env bash
PACKAGES="wget jq foo bar baz"
if [ ! -d "./.deb_packages" ]; then
  mkdir -p ./.deb_packages
  apt update && apt install --download-only -y ${PACKAGES}
  cp /var/cache/apt/archives/*.deb ./.deb_packages
fi
apt install -y ./.deb_packages/*.deb
This should cause the debian files to be cached in the directory ./.deb_packages. You can then configure gitlab to cache them so you can use them later.
my_job:
  before_script:
    - ./install-deps.sh
  script:
    - ...
  cache:
    paths:
      - ./.deb_packages

Docker usage with Odoo 10.0

I need to know how to set up a Docker container that runs an Odoo 10.0 ERP environment.
I'm looking for references or setup guides, and I don't mind if you paste the CLI below. I'm currently developing on Ubuntu.
Thanks in advance!
@NaNDuzIRa This is quite simple. When you want to learn how to do something, even if you need it fast, I suggest looking into the documentation of the tool you are trying to use to package your application. In this case, that is Docker.
Create a file named Dockerfile.
Now that you know the OS flavor you want to use, include that at the beginning of the Dockerfile.
Then add whatever you need to install your application in that OS.
Finally, include the installation steps for Odoo, for which I have added a link at the bottom of this post.
# OS of the image: latest Ubuntu
FROM ubuntu:latest
# Run as root to install the application and packages
USER root
# Packages used during the installation (gnupg is needed by apt-key below)
RUN apt update && apt -y install wget gnupg
# Install Odoo
RUN wget -O - https://nightly.odoo.com/odoo.key | apt-key add -
RUN echo "deb http://nightly.odoo.com/10.0/nightly/deb/ ./" >> /etc/apt/sources.list.d/odoo.list
RUN apt-get -y update && apt-get -y install odoo
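A sketch of building and running the result (the image tag is arbitrary, 8069 is Odoo's default HTTP port, and the odoo command name is an assumption about what the deb package installs; a reachable PostgreSQL server is still required):

docker build -t odoo10 .
docker run -p 8069:8069 odoo10 odoo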
References
Docker
Dockerfile
Odoo

Docker-compose cannot find Java

I am trying to use a Python wrapper for a Java library called Tabula. I need both Python and Java images within my Docker container. I am using the openjdk:8 and python:3.5.3 images. I am trying to build it using docker-compose, but it returns the following message:
/bin/sh: 1: java: not found
when it reaches the line RUN java -version in the Dockerfile. The line RUN find / -name "java" also doesn't return anything, so I can't even find where Java is installed in the Docker environment.
Here is my Dockerfile:
FROM python:3.5.3
FROM openjdk:8
FROM tailordev/pandas
RUN apt-get update && apt-get install -y \
python3-pip
# Create code directory
ENV APP_HOME /usr/src/app
RUN mkdir -p $APP_HOME/temp
WORKDIR /$APP_HOME
# Install app dependencies
ADD requirements.txt $APP_HOME
RUN pip3 install -r requirements.txt
# Copy source code
COPY *.py $APP_HOME/
RUN find / -name "java"
RUN java -version
ENTRYPOINT [ "python3", "runner.py" ]
How do I install Java within the Docker container so that the Python wrapper class can invoke Java methods?
This Dockerfile cannot work, because the multiple FROM statements at the beginning don't mean what you think they mean. They don't mean that all the contents of the images you refer to will somehow end up in the image you're building. FROM has actually meant two different things throughout the history of Docker:
In newer versions of Docker, multi-stage builds, which are a very different thing from what you're trying to achieve (but very interesting nonetheless).
In earlier versions of Docker, the ability to simply build multiple images from one Dockerfile.
The behavior you are describing makes me assume you are using such an earlier version. Let me explain what actually happens when you run docker build on this Dockerfile:
FROM python:3.5.3
# Docker: "The user wants me to build an image based on python:3.5.3. No problem!"
# Docker: "Ah, the next FROM statement is coming up, which means the user is done with this image."
FROM openjdk:8
# Docker: "The user wants me to build an image based on openjdk:8. No problem!"
# Docker: "Ah, the next FROM statement is coming up, which means the user is done with this image."
FROM tailordev/pandas
# Docker: "The user wants me to build an image based on tailordev/pandas. No problem!"
# Docker: "A RUN statement is coming up. I'll put it as a layer in the image I'm building."
RUN apt-get update && apt-get install -y \
    python3-pip
...
# Docker: "EOF reached, nothing more to do!"
As you can see, this is not what you want.
What you should do instead is build a single image in which you first install your runtimes (Python, Java, ...) and then your application-specific dependencies. The last two parts you're already doing; here's how you could go about installing the general dependencies:
# Let's start from the Alpine Java Image
FROM openjdk:8-jre-alpine
# Install the Python runtime
RUN apk add --update \
    python \
    python-dev \
    py-pip \
    build-base \
 && pip install virtualenv \
 && rm -rf /var/cache/apk/*
# Install your framework dependencies
RUN pip install numpy scipy pandas
... do the rest ...
Note that I haven't tested the above snippet, you may have to adapt a few things.
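Once the rest is filled in, a quick sanity check that both runtimes ended up in the single image (the tag is arbitrary):

docker build -t py-java-app .
docker run --rm --entrypoint sh py-java-app -c "java -version && python --version"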
