Docker doesn't write poetry.lock and requirements.txt when building the image

I'm trying to create an image with Python + Poetry.
However, after running poetry install, I want the Dockerfile to generate a poetry.lock and then a requirements.txt so that these files appear in my project root; however, this does not happen.
Dockerfile:
# Base image and Poetry version are assumed; the original snippet omitted them
FROM python:3.10-slim
ARG POETRY_VERSION=1.8.3

WORKDIR /code
# Install Poetry and keep packages in the system site-packages (no virtualenv)
RUN pip install "poetry==$POETRY_VERSION"
RUN poetry config virtualenvs.create false
# Copy the dependency manifests first so this layer is cached
COPY pyproject.toml poetry.lock* /code/
RUN poetry install --no-root
RUN poetry lock --no-update
RUN poetry export --format=requirements.txt > requirements.txt
COPY . /code/
If I run these same commands directly inside the container after the build, they work normally and I can then access the generated files:
$ poetry lock --no-update
Skipping virtualenv creation, as specified in config file.
Updating dependencies
Resolving dependencies... (0.2s)
Writing lock file

The problem was in my docker-compose.yml, where a volume bind mount replaced everything inside the container, removing the files generated during the build.
web:
  volumes:
    - .:/code
In other words, the files were generated during the build, but when the container was brought up via Compose, the local directory (which did not contain poetry.lock or requirements.txt) was mounted over /code inside the container, hiding the generated files and producing this apparent problem.
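If you still want these files on the host, one option is to copy them out of the built image before bringing the stack up with Compose; a minimal sketch, assuming the image is tagged mypoetry-app (the tag is not part of the original setup):
# Build the image (the tag "mypoetry-app" is an assumed name)
docker build -t mypoetry-app .
# Create a stopped container from the image and copy the generated files to the host
docker create --name poetry-export mypoetry-app
docker cp poetry-export:/code/poetry.lock ./poetry.lock
docker cp poetry-export:/code/requirements.txt ./requirements.txt
docker rm poetry-export
Once the files exist locally, the bind mount no longer hides anything, because the same files are present on both sides.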

Related

Dockerfile for Flask app, WORKDIR path should be absolute

So I am learning Docker for the first time and was wondering if I have set this up in the correct format for my Flask app. A lot of the documentation online for the WORKDIR command changes the directory to "/app", but my main file to run the app is run.py, which sits in the same directory as the Dockerfile itself. However, WORKDIR doesn't let me write "WORKDIR ." to use the current directory.
Can someone clarify if I have my docker file set up correctly?
(I also plan to host this on Heroku if that matters)
Dockerfile
# start by pulling the python image
FROM python:3.8-alpine
# copy the requirements file into the image
COPY ./requirements.txt /requirements.txt
# Don't need to switch working directory
# WORKDIR
# install the dependencies and packages in the requirements file
RUN pip install -r requirements.txt
# copy every content from the local file to the image
COPY . /app
# configure the container to run in an executed manner
ENTRYPOINT [ "python" ]
CMD ["run.py" ]

What is the RUN command in a Dockerfile to install Vuetify?

I expected this to work and tried to include it in the Dockerfile directly. Here is my whole Dockerfile:
FROM node
# make the 'app' folder the current working directory
WORKDIR /app
# copy both 'package.json' and 'package-lock.json' (if available)
COPY package*.json ./
# install project dependencies
RUN npm install
RUN npm i --save @koumoul/vuetify-jsonschema-form
RUN npm install --save axios vue-axios
RUN npm install vuetify@1.5.8
# copy project files and folders to the current working directory (i.e. 'app' folder)
COPY . .
But got
Module not found: Error: Can't resolve 'vuetify' in '/app/src/views'
It is not good practice to install packages separately from package.json; you should just include them in your package.json. But here is a technique for testing cases like this.
You can first run the image interactively with docker run -it node bash and try the commands there. You can also use a bind mount so the files you need are available, e.g. docker run -it -v "$(pwd)":/usr/src/app node bash. This lets you experiment with everything you are trying to run in your Dockerfile more directly.
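Concretely, the extra RUN npm install lines could be replaced by listing the packages in package.json; a sketch (vuetify 1.5.8 comes from the question, the other entries use the "latest" dist-tag as placeholders):
{
  "dependencies": {
    "@koumoul/vuetify-jsonschema-form": "latest",
    "axios": "latest",
    "vue-axios": "latest",
    "vuetify": "1.5.8"
  }
}
With that in place, the single RUN npm install already in the Dockerfile installs everything, and package-lock.json keeps the versions reproducible.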

How to create a Dockerfile which has a standard Python library for use

I need your help: I have a standard Python library that comes as a .tar.gz file, and at the moment I have to manually copy the file into the Git repo to use it all the time.
I need to create a Docker image that contains this file and installs the libraries from that archive.
I'm looking for a Dockerfile for this.
This is the Dockerfile I tried:
FROM python:3.6
COPY . /app
WORKDIR /app
RUN ls -ltr
EXPOSE 8080
RUN pip install pipenv
RUN pipenv install --system --deploy --skip-lock
I have a .tar.gz file which I need to copy into the image, install the packages from it, and then use them in containers.
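pip can install a package directly from a local sdist archive, so one approach is to COPY the tarball into the image and point pip at it. A sketch, where the file name mylib-1.0.0.tar.gz is a placeholder for your actual archive:
FROM python:3.6
WORKDIR /app
# copy the packaged library into the image (placeholder file name)
COPY mylib-1.0.0.tar.gz /tmp/
# pip installs straight from the local .tar.gz archive
RUN pip install /tmp/mylib-1.0.0.tar.gz
# copy the rest of the application
COPY . /app
EXPOSE 8080
If the remaining dependencies stay in a Pipfile, the pipenv lines from the attempt above can follow the pip install step unchanged.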

How do I restrict which directories and files are copied by Docker?

I have a Dockerfile that explicitly defines which directories and files from the context directory are copied to the app directory. But regardless of this, Docker tries to copy all files in the context directory.
The Dockerfile is in the context directory.
My test code and data files are in directories directly below the context directory. The build attempts to copy everything in the context directory, not just the directories and files specified by my COPY commands, so I get a few hundred of the following ERROR messages, seemingly one for every file in every directory and subdirectory:
ERRO[0043] Can't add file /home/david/gitlab/etl/testdata/test_s3_fetched.csv to tar: archive/tar: missed writing 12029507 bytes
...
ERRO[0043] Can't close tar writer: archive/tar: missed writing 12029507 bytes
Sending build context to Docker daemon 1.164GB
Error response from daemon: Error processing tar file(exit status 1): unexpected EOF
My reading of the reference is that it only copies all files and directories if there are no ADD or COPY directives.
I have tried with the following COPY patterns
COPY ./name/ /app/name
COPY name/ /app/name
COPY name /app/name
WORKDIR /app
COPY ./name/ /name
WORKDIR /app
COPY name/ /name
WORKDIR /app
COPY name /name
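For what it's worth, the COPY source is always resolved relative to the build context and the destination relative to the current WORKDIR (or absolute), so these variants all read the same source files; only the destination changes, and none of them affects how much context is sent to the daemon. A small illustration (assumes a name/ directory in the context):
FROM alpine:3.19
WORKDIR /app
# copies <context>/name into /app/name (absolute destination)
COPY name /app/name
# copies <context>/name into /app/name as well (destination relative to WORKDIR)
COPY ./name/ name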
My Dockerfile:
FROM python:3.7.3-alpine3.9
RUN apk update && apk upgrade && apk add bash
# Copy app
WORKDIR /app
COPY app /app
COPY configfiles /configfiles
COPY logs /logs/
COPY errorfiles /errorfiles
COPY shell /shell
COPY ./*.py .
WORKDIR ../
COPY requirements.txt /tmp/
RUN pip install -U pip && pip install -U sphinx && pip install -r /tmp/requirements.txt
EXPOSE 22 80 8887
I expect it to copy only my specified files, without the errors that come from trying to copy files I have not named in any COPY command. Because the Docker output scrolls off my terminal window due to all the error messages, I cannot see whether my COPY commands succeeded.
All files at and below the build directory are copied into the build context (they are tarred up and sent to the Docker daemon), regardless of which of them your COPY commands actually reference.
Consider using a .dockerignore file to exclude files and directories from the build.
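For example, a .dockerignore next to the Dockerfile can exclude the large test data that shows up in the error messages; the entries below are guesses based on the paths above, so adjust them to the real layout and only list things no COPY instruction needs:
# .dockerignore
.git
testdata/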
Try copying the files in the following manner:
# set working directory
WORKDIR /usr/src/app
# add and install requirements
COPY ./requirements.txt /usr/src/app/requirements.txt
RUN pip install -r requirements.txt
# add app
COPY ./errorfiles /usr/src/app
Also, you will have to make sure that your docker-compose.yml file is set up correctly:
version: "3.6"
services:
users:
build:
context: ./app
dockerfile: Dockerfile
volumes:
- "./app:/usr/src/app"
Here, I'm assuming that your docker-compose.yml file is inside the parent directory of your app.
See if this works. :)

Babelrc file in Docker builds

I'm running into the errors:
ERROR in ../~/babel-polyfill/lib/index.js
Couldn't find preset "es2015-loose" relative to directory "/app"
amongst a few other "preset not found" errors when building a ReactJS project. It runs fine with webpack-dev-server in development.
COPY in Docker doesn't copy over dot files by default. Should I be copying .babelrc over to avoid this breaking? If so, how do I do that? If not, what am I missing or ordering wrong in this build?
Dockerfile
FROM alpine:3.5
RUN apk update && apk add nodejs
RUN npm i -g webpack \
babel-cli \
node-gyp
ADD package.json /tmp/package.json
RUN cd /tmp && npm install
RUN mkdir -p /app && cp -a /tmp/node_modules /app/
WORKDIR /app
COPY . /app
docker-compose
version: '2.1'
services:
  webpack:
    build:
      context: .
      dockerfile: Docker.doc
    volumes:
      - .:/app
      - /app/node_modules
COPY in Docker doesn't copy over dot files by default.
This is not true: COPY in a Dockerfile copies dotfiles by default. I came across this question because I had faced this issue earlier. For anyone else who runs into it, troubleshoot with the following:
Check your host/local directory to see whether the dotfiles actually exist. If you copied the files over using your OS's GUI, there is a chance the dotfiles were not carried over simply because they are hidden.
Check whether you have a .dockerignore file that may be excluding these dotfiles. More info in the .dockerignore docs.
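A quick way to check is to build the image on its own and list what COPY actually put into /app, before the bind mount from docker-compose overlays it (the tag webpack-app is just an example name):
docker build -f Docker.doc -t webpack-app .
docker run --rm webpack-app ls -la /app
If .babelrc shows up in that listing, the preset errors come from something else, such as a missing preset package in package.json, rather than from COPY.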
