I'm new to Docker and am creating a simple test app to try out my Docker container, but Docker is unable to locate the server.py file.
The directory structure of my project is:
<project>
|
|-- Dockerfile
|-- app
    |
    |-- requirements.txt
    |-- server.py
Below is the Dockerfile content:
FROM ubuntu:latest
MAINTAINER name <mail#domain.com>
COPY . /app # do I need this ?
COPY ./app/requirements.txt /tmp/requirements.txt
RUN apt-get -y update && \
apt-get install -y python-pip python-dev build-essential
RUN pip install -r /tmp/requirements.txt
WORKDIR /app
RUN chmod +x server.py # ERROR: No such file or directory
EXPOSE 5000
ENTRYPOINT ["python"]
CMD ["server.py"] # ERROR: No such file or directory
I'm using boot2docker on windows.
What am I missing here?
You're copying your local app/ folder into the /app/ folder of the image (as mentioned in the comments), which creates /app/app/server.py inside the container.
How to resolve
A simple fix will be to change
COPY . /app
to
COPY ./app/server.py /app/server.py
Explanation
The command COPY works as follows:
COPY <LOCAL_FROM> <DOCKER_TO>
By using . in your first COPY you select everything in the folder where the Dockerfile resides, including the local app folder. Since the destination you give it in the container is also /app, the path inside the running container becomes /app/app/..., which explains why the file can't be found.
Have a look at the Docker docs.
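Putting it together, a minimal corrected Dockerfile in the spirit of the one in the question could look like the sketch below (an assumption here is that everything the app needs lives in the local app folder):
FROM ubuntu:latest
RUN apt-get -y update && \
    apt-get install -y python-pip python-dev build-essential
# copy the contents of the local app/ folder, so server.py lands at /app/server.py
COPY ./app /app
WORKDIR /app
RUN pip install -r requirements.txt
EXPOSE 5000
ENTRYPOINT ["python"]
CMD ["server.py"]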
Related
I am running Docker containers with WSL2. When I make changes to my files in the /client directory, the changes are not reflected, and I have to run docker compose stop client, docker compose build client and docker compose start client. If I cat a file after changing something, I can see the change.
Here is my Dockerfile:
FROM node:16.17.0-alpine
RUN mkdir -p /client/node_modules
RUN chown -R node:node /client/node_modules
RUN chown -R node:node /root
WORKDIR /client
# Copy Files
COPY . .
# Install Dependencies
COPY package.json ./
RUN npm install --force
USER root
I also have a /server directory with the following Dockerfile, and there the automatic image rebuild on file change works just fine:
FROM node:16.17.0-alpine
RUN mkdir -p /server/node_modules
RUN chown -R node:node /server/node_modules
WORKDIR /server
COPY . .
# Install Dependencies
COPY package.json ./
RUN npm install --force --verbose
USER root
Any help is appreciated.
Solved by adding the following to my docker-compose.yml:
environment:
  WATCHPACK_POLLING: "true"
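If you would rather not touch docker-compose.yml, the same setting can also be baked into the client image; a minimal sketch, assuming the client is a webpack-based dev server that reads WATCHPACK_POLLING from the environment:
FROM node:16.17.0-alpine
WORKDIR /client
COPY package.json ./
RUN npm install --force
COPY . .
# make webpack's file watcher poll for changes; bind mounts under WSL2 often miss inotify events
ENV WATCHPACK_POLLING=true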
Docker does not take care of hot reloading for you.
You should look into the hot-reload documentation of the tools you are building with.
I've tried all the solutions to this issue I could find, with no luck.
My directory looks like this:
-automation
--app
---requirements.txt
--archive
--Dockerfile
How can I get the Dockerfile to recognize the requirements.txt?
FROM *secret*/python:3.8
WORKDIR /automation
RUN pwd
COPY requirements.txt ./app/requirements.txt
RUN pip install -r requirements.txt
ADD automation automation/
RUN python3 ./app/main.py
Your problem is related to your working directory.
Your Dockerfile should look like this:
FROM *secret*/python:3.8
WORKDIR /app # change this from automation to app
RUN pwd
COPY app/requirements.txt ./app/requirements.txt # this line change because your requirement.txt is inside the folder app
RUN pip install -r requirements.txt
Copy . . # This mean copy all you files to /app
# ADD automation automation/ # delete this line
RUN python3 ./app/app/main.py # update also this line main.py will be inside the folder app inside the working directory app
I hope this helps you resolve your issue.
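For comparison, here is a sketch that keeps the same build context but spells out every path and starts the program with CMD instead of RUN (assuming main.py sits next to requirements.txt inside the app folder):
FROM *secret*/python:3.8
WORKDIR /app
# requirements.txt lives inside app/ in the build context
COPY app/requirements.txt ./requirements.txt
RUN pip install -r requirements.txt
# copy the rest of the project into the working directory /app
COPY . .
# start the app when the container runs; main.py is now at ./app/main.py
CMD ["python3", "./app/main.py"]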
So, I am trying to dockerize a golang application with different directories containing supplementary code for my main file.
I am using gorilla/mux. The directory structure looks like this.
$GOPATH/src/github.com/user/server
|--- Dockerfile
|--- main.go
|--- routes/
|       handlers.go
|--- public/
|       index.gohtml
It works on my host machine with no problem. The problem is that when I try to deploy the Docker image, it does not run and exits shortly after creation. I have tried changing the WORKDIR command in my Dockerfile to /go/src and dumping all my files there, but still no luck. I have also tried the official documentation on Docker Hub. That doesn't work either.
My Dockerfile.
FROM golang:latest
WORKDIR /go/src/github.com/user/server
COPY . .
RUN go get -d github.com/gorilla/mux
EXPOSE 8000
CMD ["go","run","main.go"]
My golang main.go
package main
import (
"github.com/gorilla/mux"
"github.com/user/server/routes"
"log"
"net/http"
"time"
)
func main(){
//...
}
I get this error message when I check the logs of my docker image.
Error Message
main.go:5:2: cannot find package "github.com/user/server/routes" in any of:
/usr/local/go/src/github.com/user/server/routes (from $GOROOT)
/go/src/github.com/user/server/routes (from $GOPATH)
Try the following Dockerfile:
# GO Repo base repo
FROM golang:1.12.0-alpine3.9 as builder
RUN apk add git
# Add Maintainer Info
LABEL maintainer="<>"
RUN mkdir /app
ADD . /app
WORKDIR /app
COPY go.mod go.sum ./
# Download all the dependencies
RUN go mod download
COPY . .
# Build the Go app
RUN CGO_ENABLED=0 GOOS=linux go build -a -installsuffix cgo -o main .
# GO Repo base repo
FROM alpine:latest
RUN apk --no-cache add ca-certificates curl
RUN mkdir /app
WORKDIR /app/
# Copy the Pre-built binary file from the previous stage
COPY --from=builder /app/main .
# Expose port 8000
EXPOSE 8000
# Run Executable
CMD ["./main"]
Here, we create an intermediate builder container, copy the code into it, build the code inside that builder container, and then copy the resulting binary into the final image.
This way the final container has everything the binary needs to run, and the size of the final image stays very small.
I need your help: I have a standard Python library that comes as a .tar.gz file, and I have to manually copy that file into the git repo to use it all the time.
I need to create a Docker container which will have this file and install the libraries from that package.
I'm looking for a Dockerfile. I tried the Dockerfile below:
FROM python:3.6
COPY . /app
WORKDIR /app
RUN ls -ltr
EXPOSE 8080
RUN pip install pipenv
RUN pipenv install --system --deploy --skip-lock
I have a .tar.gz file which I need to copy into Docker, install the packages from it, and use them in containers.
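For what it's worth, pip can install a local source distribution directly, so a sketch of the kind of Dockerfile being asked for might look like this (mylib.tar.gz is just a placeholder for the actual archive name):
FROM python:3.6
WORKDIR /app
# copy the packaged library that is checked into the repo
COPY mylib.tar.gz /tmp/mylib.tar.gz
# pip can install straight from a local .tar.gz source distribution
RUN pip install /tmp/mylib.tar.gz
# copy the rest of the application
COPY . /app
EXPOSE 8080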
I have a Dockerfile that explicitly defines which directories and files from the context directory are copied to the app directory. But regardless of this, Docker tries to copy all files in the context directory.
The Dockerfile is in the context directory.
My test code and data files are in directories directly below the context directory. Docker attempts to copy everything in the context directory, not just the directories and files specified by my COPY commands. So I get a few hundred of the following ERROR messages, each one naming a different file in the various directories and subdirectories:
ERRO[0043] Can't add file /home/david/gitlab/etl/testdata/test_s3_fetched.csv to tar: archive/tar: missed writing 12029507 bytes
...
ERRO[0043] Can't close tar writer: archive/tar: missed writing 12029507 bytes
Sending build context to Docker daemon 1.164GB
Error response from daemon: Error processing tar file(exit status 1): unexpected EOF
My reading of the reference is that it only copies all files and directories if there are no ADD or COPY directives.
I have tried with the following COPY patterns
COPY ./name/ /app/name
COPY name/ /app/name
COPY name /app/name
WORKDIR /app
COPY ./name/ /name
WORKDIR /app
COPY name/ /name
WORKDIR /app
COPY name /name
My Dockerfile:
FROM python:3.7.3-alpine3.9
RUN apk update && apk upgrade && apk add bash
# Copy app
WORKDIR /app
COPY app /app
COPY configfiles /configfiles
COPY logs /logs/
COPY errorfiles /errorfiles
COPY shell /shell
COPY ./*.py .
WORKDIR ../
COPY requirements.txt /tmp/
RUN pip install -U pip && pip install -U sphinx && pip install -r /tmp/requirements.txt
EXPOSE 22 80 8887
I expect it to only copy my files, without the errors associated with trying to copy files I have not specified in COPY commands. Because the Docker output scrolls off my terminal window due to all the error messages, I cannot see whether my COPY commands succeeded.
All files at and below the build directory are copied into the build context that is sent to the Docker daemon, regardless of your COPY instructions.
Consider using a .dockerignore file to exclude files and directories from the build.
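For example, a .dockerignore file placed next to the Dockerfile could look like the sketch below; the entries are only illustrative, based on the paths that appear in the question:
# .dockerignore - anything listed here is left out of the build context
.git
testdata/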
Try copying the files in the following manner:
# set working directory
WORKDIR /usr/src/app
# add and install requirements
COPY ./requirements.txt /usr/src/app/requirements.txt
RUN pip install -r requirements.txt
# add app
COPY ./errorfiles /usr/src/app
Also, you will have to make sure that your docker-compose.yml file is set up correctly:
version: "3.6"
services:
users:
build:
context: ./app
dockerfile: Dockerfile
volumes:
- "./app:/usr/src/app"
Here, I'm assuming that your docker-compose.yml file is inside the parent directory of your app.
See if this works. :)