How do I create a temporary directory using mktemp in my current working directory?

I created a pipeline that strings several programs together. Unfortunately, these programs create a huge number of temporary files in the /tmp folder, and when I use large datasets my pipeline crashes because /tmp fills up.
How do I make the temporary files get created in the current working directory, where the pipeline is being run, instead of in the /tmp folder?
So far I have tried exporting the TMPDIR environment variable to point at an already created directory, /work, in my current working directory, but the temporary files are still being created in /tmp:
export TMPDIR=$(mktemp -d --tmpdir=/work)
<script>
rm -rf $TMPDIR
The programs do not provide an option to set a different output folder for the temporary files they create.

Just change /work to work if the directory work is in your current directory. /work means a top-level directory named work at the root of the filesystem; without the leading slash, the path is interpreted relative to your current directory.
I just tested this code on my computer. No files were written to /tmp that I noticed:
mkdir work
export TMPDIR=$(mktemp -d --tmpdir=work)
ls work
# tmp.AWA4dTERha
rm -rf $TMPDIR
ls work
# --no output--
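For completeness, here is a minimal sketch of how the whole pipeline could be wrapped so the temporary directory is always removed, even if the pipeline fails. The pipeline command itself is left as a placeholder; the relative work directory is the same as in the example above:

#!/usr/bin/env bash
set -euo pipefail

mkdir -p work                               # relative path: created inside the current working directory
export TMPDIR=$(mktemp -d --tmpdir=work)    # programs that honour TMPDIR will write here instead of /tmp
trap 'rm -rf "$TMPDIR"' EXIT                # remove the temp dir on exit, whether the pipeline succeeds or not

# ... run your pipeline here; its temporary files end up under $TMPDIR ...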

Related

Does docker auto delete files inside /tmp directory?

I have a Node.js application running with Docker, and the application has added some files to the /tmp folder inside the Docker container.
Now, after 3-4 days, those files are missing from the /tmp folder.
So does Docker automatically remove files from the /tmp folder?

dockerfile COPY not copying file

I'm creating a Dockerfile and trying to copy an environment configuration file that I keep at the same path where the Dockerfile is found. The idea is to copy it during the Docker build process so that it can be used during the Gradle build a few steps later.
RUN \
set -ex && \
cd /app/src && \
git clone URL.git dest_folder
COPY .env_dev /app/src/dest_folder
After that, to check whether the file is there, I run pwd to make sure I'm in the right folder and ls -la to see if the file exists, but it never does. I can only find the files downloaded from the repository; of course, the .env_dev file with credentials is not committed to the repository.
RUN \
cd /app/src/dest-folder && \
pwd && \
ls -la
I'm sure it's something tricky I'm not using correctly, but I've checked with both ADD and COPY with no results. I've even tried using a wrong filename to see whether COPY complains about it, and it does, so... it seems that COPY does find the file.
If you have a .dockerignore file, make sure that you do not ignore hidden files like .git or .venv.
More info here:
https://docs.docker.com/engine/reference/builder/#dockerignore-file
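For example, a .dockerignore like the following (a hypothetical illustration, not the asker's actual file) would silently drop .env_dev from the build context, and COPY would then fail to find it:

# a hidden-file pattern like this excludes .env_dev (and .git, .venv) from the build context
.*
# re-including it explicitly makes it visible to COPY again
!.env_dev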
Thanks everyone, I finally managed to spot the problem.
It's quite weird... I'm cloning my repository into dest-folder, but copying the file into dest_folder.
That's why the file wasn't being detected: it was in a different folder.
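For reference, the fixed version just uses the same spelling, dest_folder, everywhere (a sketch based on the snippets above):

RUN \
set -ex && \
cd /app/src && \
git clone URL.git dest_folder
# same spelling as the clone target, so the file ends up where it is checked
COPY .env_dev /app/src/dest_folder/
RUN cd /app/src/dest_folder && pwd && ls -la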

gsutil cp in dockerfile does not copy file

I am trying to deploy my FastAPI app on Cloud Run, and in the Dockerfile I'd like to copy a file from my GCS bucket and read it in the API. Somehow it does not copy the file.
These are the copy lines in my Dockerfile:
FROM google/cloud-sdk AS gcloud
RUN mkdir ./models
RUN gsutil cp gs://rim-models/model_1.pkl ./models/model_1.pkl
And when I created an image with gcloud builds submit --tag gcr.io/project-id/api-name --timeout=3600, it shows that the file was copied:
Copying gs://rim-models/model_1.pkl ...
- [1 files][399.8 KiB/399.8 KiB]
Operation completed over 1 objects/399.8 KiB.
However, I got this error in the API:
FileNotFoundError: [Errno 2] No such file or directory: 'models/model_1.pkl'
When I run the command gsutil cp gs://rim-models/model_1.pkl ./models/model_1.pkl locally, it does copy the file model_1.pkl to my local directory models. So why did it not work when I deployed the app to Cloud Run?
EDIT: After changing all models paths to either all relative or all absolute, the error remains.
Your container build contains an inconsistency in its directory definitions:
RUN mkdir /models
RUN gsutil cp gs://rim-models/model_1.pkl ./models/model_1.pkl
On the first line you create the absolute path /models; on the second, you use the relative path ./models.
You can use either the full path or the relative path, but there is a lack of consistency.
The path mentioned in your API error is models/model_1.pkl, which is clearly a relative path. I don't know the full path where your API looks for the data, but you have to fix all of these small, misaligned directory definitions.
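As a sketch, a consistent version could use the absolute path /models everywhere; the API code would then also need to open /models/model_1.pkl rather than the relative models/model_1.pkl. The stage name and bucket path are taken from the question:

FROM google/cloud-sdk AS gcloud
RUN mkdir /models
# absolute destination path, matching the mkdir above
RUN gsutil cp gs://rim-models/model_1.pkl /models/model_1.pkl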

Copying files in a directory to a docker container not at same level as Dockerfile

I am trying to copy the contents of a directory while creating the Docker image in the Dockerfile, but the copy gives me this error:
COPY failed: stat /var/lib/docker/tmp/docker-builder108255131/Users/user_1/media: no such file or directory
I have the following line in the Dockerfile:
COPY /Users/user_1/media/. /code/project_media/
The media directory is at a different directory level than the Dockerfile.
Sorry, per the Dockerfile documentation:
Multiple resources may be specified but the paths of files and directories will be interpreted as relative to the source of the context of the build
https://docs.docker.com/engine/reference/builder/#copy
I usually have a script for building the Docker image, and in that script I copy the files needed for the COPY command into either the same directory as, or a sub-directory of, the "context of the build" (a confusing way of saying the directory the Dockerfile is in):
DOCKER_BUILD_DIR="./build/docker"     # directory used as the docker build context
DOCKERFILE="./docker/Dockerfile"
MEDIA_FILES="/Users/user_1/media"

# recreate the media path inside the build context, then copy the Dockerfile and media into it
mkdir -p "${DOCKER_BUILD_DIR}/${MEDIA_FILES}"
cp -f "${DOCKERFILE}" "${DOCKER_BUILD_DIR}/"
cp -rf "${MEDIA_FILES}" "${DOCKER_BUILD_DIR}/${MEDIA_FILES}/.."

# build using the prepared directory as the context
docker build "${DOCKER_BUILD_DIR}"
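With that layout, the COPY instruction in the Dockerfile refers to the path inside the build context rather than the absolute host path. A sketch, assuming the script above has been run and the image keeps the files under /code/project_media/ as in the question:

# resolved relative to the build context passed to `docker build`, i.e. ./build/docker
COPY Users/user_1/media/ /code/project_media/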

External SpringBoot properties file on Docker

I'm currently trying to automatically externalize my application.yml file from my Spring Boot app's default location, /src/main/resources/application.yml. I know Spring Cloud Config Server is the preferred way to do this, but that may not be an option in my case at this time.
I'm currently trying to extract the .yml file from its .jar and then copy it to my desired folder.
Unfortunately, I can't seem to get it to work at all! At some point I run RUN ls -lrt /tmp/config and, even though the copy command reports success, the directory is always empty.
This is my current setup:
Dockerfile:
FROM openjdk:8-jdk-alpine
VOLUME ["/tmp", "/tmp/config", "/tmp/logs"]
ADD /target/*.jar app.jar
RUN apk add --update unzip && unzip app.jar "*application.yml" && ls -lrt
RUN ls -lrt /BOOT-INF/classes
RUN cp /BOOT-INF/classes/application.yml tmp/config
RUN ls -lrt tmp/config
# ----> Total 0
ENTRYPOINT ["java","-Djava.security.egd=file:/dev/./urandom","-jar","/app.jar", "--spring.config.location=file:/tmp/config/application.yml"]
And in my docker-compose.yml I have a mapping for all three volumes I'm defining above.
Do you guys have any idea how to solve this issue without making the user drop the .yml file into the directory on first deploy?
Best regards,
Enrico Bergamo
In the end I decided to keep it simple and mount a volume just for an additional application.yml file. Before running the container, I create the directory and place my new .yml file in it, and that did the trick :)
FROM openjdk:8-jre-alpine
VOLUME ["/tmp", "/tmp/config", "/tmp/logs"]
ADD /target/*.jar app.jar
ENTRYPOINT ["java","-Djava.security.egd=file:/dev/./urandom","-jar","/app.jar", "--spring.config.location=classpath:/application.yml,file:/tmp/config/application.yml"]
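A minimal sketch of how the external file can then be supplied when starting the container; the local ./config directory and the image name my-app are placeholders, not from the original setup:

# place the external application.yml in a local directory...
mkdir -p ./config
cp application.yml ./config/

# ...and mount it where spring.config.location expects it
docker run -v "$(pwd)/config:/tmp/config" my-app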
