Visual Studio docker-compose build context - docker

When adding container orchestrator support (docker-compose) to a .NET Core Web API project that depends on a class library project, the following folder structure is created:
├── Solution
│   ├── API.Project
│   │   ├── API.Project.csproj
│   │   └── Dockerfile
│   ├── Library.project
│   │   └── Library.project.csproj
│   └── docker-compose.yaml
As you can see, the library project is outside the Dockerfile's build context. If I build an image in my GitHub Actions pipeline with docker/build-push-action@v2 (https://github.com/marketplace/actions/build-and-push-docker-images), it can't find the library project. If I move the Dockerfile to the Solution folder, build the image and run a container, the Visual Studio debugger won't attach, but the container does run. However, when I make an HTTP request to the container, a null pointer exception is logged in the container logs (also in a container created from the GitHub Actions image). How do I build a Docker image with a folder structure like this example? I would prefer to keep the Dockerfile inside the API.Project folder.

With docker/build-push-action@v2 you can specify the build context and the location of the Dockerfile like so:
name: Build and push
uses: docker/build-push-action@v2
with:
  context: .
  file: API.Project/Dockerfile
  push: true
  tags: user/app:latest
This allows you to include files from the parent folder of the Dockerfile in the build context.
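With context: . the whole solution folder is sent as the build context, so the Dockerfile inside API.Project can reference both projects by solution-relative paths. For illustration, a minimal sketch (the image tags and publish layout are assumptions, not taken from the original post):
FROM mcr.microsoft.com/dotnet/core/sdk:3.1 AS build
WORKDIR /src
# restore with both project files so the library reference resolves
COPY API.Project/API.Project.csproj API.Project/
COPY Library.project/Library.project.csproj Library.project/
RUN dotnet restore API.Project/API.Project.csproj
# copy the rest of the solution and publish the API project
COPY . .
RUN dotnet publish API.Project/API.Project.csproj -c Release -o /app/publish

FROM mcr.microsoft.com/dotnet/core/aspnet:3.1
WORKDIR /app
COPY --from=build /app/publish .
ENTRYPOINT ["dotnet", "API.Project.dll"]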
The null pointer exception I got when moving my Dockerfile to the parent folder was related to a dependency on System.Security.Cryptography, but I didn't have to solve it, because specifying the Docker build context and keeping the Dockerfile inside the API.Project folder fixed my issues.

Related

code-server docker installation cannot see git in linked volume

I have code-server installed with docker.
The docker compose file and config folder are under
/home/al3xis/containers/code-server/
├── compose.yaml
└── config/
    ├── workspace
    ├── data
    └── ...
In my compose.yaml I have linked the volumes:
- ./config:/config
- /home/al3xis/containers:/config/workspace/containers
- /home/al3xis/projects:/config/workspace/projects
I also have this environment variable set:
- DEFAULT_WORKSPACE=/config/workspace
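Put together, the relevant part of my compose.yaml looks roughly like this (the service name and image line are placeholders; only the volumes and environment are as described above):
services:
  code-server:
    image: lscr.io/linuxserver/code-server:latest   # placeholder image
    environment:
      - DEFAULT_WORKSPACE=/config/workspace
    volumes:
      - ./config:/config
      - /home/al3xis/containers:/config/workspace/containers
      - /home/al3xis/projects:/config/workspace/projects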
Everything works great in code-server: I can see my two folders in the workspace and work in them as expected.
But when I went into one of them and cloned my GitHub repository, VS Code wouldn't show me any git information.
For example, the .git folder is inside /home/al3xis/projects/sites/mysite and everything works as expected from my terminal, but not from inside VS Code.
I tried initializing git from the VS Code interface, but that created a .git inside the /config/workspace folder, which meant that files from all folders were added to git.
I only want git to be present inside the one folder that contains my website files.
Have I made a simple mistake with linking the volumes?

Gitlab CI/CD to Digital Ocean for multiple repos using docker-compose

Currently I have a project (repo) in GitLab which is an Angular app. I'm using GitLab CI/CD to build, test, release and deploy. Releasing builds a new Docker image, pushes it to the GitLab registry, and then deploys it on Nginx in a Docker container on my Digital Ocean droplet. This works fine.
Let's say I want to add a backend to it like the MEAN stack so I would have 2 containers running using a docker-compose file.
container 1 - Angular
container 2 - Node.js, Express.js and MongoDB
The two GitLab projects (repos) will have to be built separately when a change occurs (each with its own Dockerfile and gitlab-ci.yml file), but deployed together using the docker-compose file.
Where do I manage/put the docker-compose file?
I hope my explanation is clear and that my assumptions are correct.
Thanks in advance.
According to your comment I understand you'd be interested in adopting a monorepo configuration.
In this case, for the question
Where do I manage/put the docker-compose file?
you could just put the docker-compose.yml file at the root of your GitLab CI project, which would lead to a directory structure like this:
monorepo-project/
├── backend/
│   ├── Dockerfile
│   ├── .dockerignore
│   └── src/
├── frontend/
│   ├── Dockerfile
│   ├── .dockerignore
│   └── src/
├── docker-compose.yml
├── .git/
├── .gitignore
└── .gitlab-ci.yml
As pointed out in https://docs.gitlab.com/ee/user/packages/workflows/monorepo.html (the original version of this page, deleted by this commit, is still available at this URL), you can tweak your configuration using the changes: key, so that if just one part of the project changes (e.g., the frontend), then the CI behaves accordingly.
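For illustration, a minimal sketch of such a changes: rule in .gitlab-ci.yml (job names, stages and image tags are placeholders, not part of the original answer):
build-frontend:
  stage: build
  script:
    - docker build -t "$CI_REGISTRY_IMAGE/frontend:latest" frontend/
    - docker push "$CI_REGISTRY_IMAGE/frontend:latest"
  rules:
    - changes:
        - frontend/**/*

build-backend:
  stage: build
  script:
    - docker build -t "$CI_REGISTRY_IMAGE/backend:latest" backend/
    - docker push "$CI_REGISTRY_IMAGE/backend:latest"
  rules:
    - changes:
        - backend/**/*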
Further pointers
For more examples, see e.g. this Medium article, which specifically relies on Docker, or that blog article, which takes advantage of the needs: key.
Finally, the semantics of the GitLab CI YAML configuration file are well documented in https://docs.gitlab.com/ee/ci/yaml/ (to be bookmarked!).

Docker unable to prepare context within directory

So! I'm setting up a CI/CD system which has its own folder and YAML file. Within that folder, I need to call docker build. However, I keep getting the following error:
unable to prepare context: path "./server" not found.
Here's the structure of my app:
├── CI Folder
│   └── deployment-file.yaml
└── server
    ├── Dockerfile.prod
    └── All the other files for building the server
Within the deployment-file.yaml, I'm calling:
docker build -t dizzy/dizzy-server:latest -f ./server/Dockerfile.prod ./server
I've tried every variation of this with relative paths like ../server, etc., but Docker won't take it. It then gives me a new error: unable to prepare context: unable to evaluate symlinks in Dockerfile path: lstat.
What's the proper way to do this or am I required to move that deployment-file.yaml to the root directory...

Docker: copy folder into multiple images

For example, I have the following project structure:
.
├── docker-compose.yml
├── library
└── _services
    ├── _service1
    │   └── Dockerfile
    ├── _service2
    │   └── Dockerfile
    └── _service3
        └── Dockerfile
How can I copy library into each service? Or is there a better way to build the service images with the library package?
You can't copy files that are in a parent directory of where your Dockerfile is.
Of course you could copy your library content into each service directory, but you don't want to.
Create a distinct Dockerfile for each service at the top level.
Eg:
docker-compose.yml
library
Dockerfile.service1
Dockerfile.service2
Dockerfile.service3
Then each Dockerfile can COPY the library in.
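For illustration, a minimal sketch of what Dockerfile.service1 could look like when built from the project root with docker build -f Dockerfile.service1 -t service1 . (the target paths inside the image are placeholders):
FROM alpine:3.7
# the build context is the project root, so both directories are visible to COPY
COPY library/ /app/library/
COPY _services/_service1/ /app/service1/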
If your library is a fundamental part of your services, you can simply create an image for it and make it the base image for your services.
Eg:
base
library
Dockerfile
services
Dockerfile.service1
Dockerfile.service2
Dockerfile.service3
with Dockerfile
FROM alpine:3.7
COPY library/...
docker build -t base .
and a Dockerfile.serviceN
FROM base
Generally, I find it better not to include building of the Dockerfiles in the docker-compose file. Build your services when needed, push them to an image registry (e.g. quay.io, docker.io), and have your docker-compose file pull them in at deploy time.
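A sketch of what that looks like in the compose file, where services reference prebuilt images instead of a build: section (registry and image names are placeholders):
version: "3"
services:
  service1:
    image: quay.io/myorg/service1:latest   # pulled at deploy time, not built here
  service2:
    image: quay.io/myorg/service2:latest
  service3:
    image: quay.io/myorg/service3:latest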
One alternative is to maintain a single image and use Docker volumes. You can also specify the volume in docker-compose.yml and use the shared data.
Volumes in compose : https://docs.docker.com/compose/compose-file/#volumes.
Volumes in Docker : https://docs.docker.com/storage/volumes/.
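A rough sketch of that shared-volume idea in docker-compose.yml (service names, images and mount paths are placeholders):
version: "3"
services:
  service1:
    image: myorg/service1:latest
    volumes:
      - library-data:/app/library
  service2:
    image: myorg/service2:latest
    volumes:
      - library-data:/app/library
volumes:
  library-data: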

Build docker image using different directory contexts

My current project consists of a mongo server, a rabbitmq server and a dotnet core service. It is structured as follows:
.
├── project1.docker-compose.yml    # multiple docker-compose files for all projects
├── .dockerignore
├── Util/
│   └── some common code across all projects
└── Project1/                      # there are multiple projects at the same level with the same structure
    ├── .docker/
    │   ├── mongodb/
    │   │   └── Dockerfile
    │   └── rabbitmq/
    │       └── Dockerfile
    ├── BusinessLogicClasses/
    │   └── some classes that contain my business logic
    └── DotNetCoreService/
        ├── my service code
        └── .docker/
            └── Dockerfile
Right now I am able to use the docker-compose command to build the images for mongodb, rabbitmq and the dotnet core service successfully. The docker-compose.yml sits at the home directory level because my different projects (in this case Project1) reference code found under the Util directory. Therefore I need to be able to provide a context that is above both directories so that I can use COPY operations in the Dockerfile.
My basic project1.docker-compose.yml is as follows (unimportant parts excluded):
version: '3'
services:
  rabbitmq:
    build:
      context: Project1/.docker/rabbitmq/
  mongodb:
    build:
      context: Project1/.docker/mongodb/
  dotnetcoreservice:
    build:
      context: ./
      dockerfile: Project1/DotNetCoreService/.docker/Dockerfile
As can be seen, the context for the dotnetcoreservice is at the home directory level. Therefore my Dockerfile for that specific image needs to target the full paths from the context as follows:
#escape=`
FROM microsoft/dotnet:2.0-sdk AS build
WORKDIR /app
COPY Project1/ ./Project1/
COPY Util/ ./Util/
RUN dotnet build Project1/DotNetCoreService/
This Dockerfile works successfully when invoked via the docker-compose command at the home directory level. However, when invoked via the docker build .\Project1\DotNetCoreService\.docker\ command, it fails with the following message:
COPY failed: stat /var/lib/docker/tmp/docker-builder241915396/Project1: no such file or directory
I think this is a matter of the actual context because the docker build instruction automatically sets the context to where the Dockerfile is. I would like to be able to use this same directory structure to create images both with the docker-compose build as well as with the docker build instructions.
Is this somehow possible?
Use the -f flag to set a custom Dockerfile path.
Example: docker build --rm -t my-app -f path/to/dockerfile .
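Applied to the layout in the question, running from the home directory so that the context still contains Util/ (the image tag is a placeholder):
docker build --rm -t dotnetcoreservice -f Project1/DotNetCoreService/.docker/Dockerfile .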
May 2022: The new releases of Dockerfile 1.4 and Buildx v0.8+ come with the ability to define multiple build contexts.
This means you can use files from different local directories as part of your build.
Dockerfiles now Support Multiple Build Contexts
Tõnis Tiigi
Multiple Projects
Probably the most requested use case for named contexts capability is the possibility to use multiple local source directories.
If your project contains multiple components that need to be built together, it’s sometimes tricky to load them with a single build context where everything needs to be contained in one directory.
There’s a variety of issues:
every component needs to be accessed by its full path,
you can only have one .dockerignore file,
or maybe you’d like each component to have its own Dockerfile.
If your project has the following layout:
project
├── app1
│ ├── .dockerignore
│ ├── src
├── app2
│ ├── .dockerignore
│ ├── src
├── Dockerfile
…with this Dockerfile:
#syntax=docker/dockerfile:1.4
FROM … AS build1
COPY --from=app1 . /src
FROM … AS build2
COPY --from=app2 . /src
FROM …
COPY --from=build1 /out/app1 /bin/
COPY --from=build2 /out/app2 /bin/
…you can invoke your build with:
docker buildx build --build-context app1=app1/src --build-context app2=app2/src .
Both of the source directories are exposed separately to the Dockerfile and can be accessed by their respective names.
This also allows you to access files that are outside of your main project’s source code.
Normally when you’re inside the Dockerfile, you’re not allowed to access files outside of your build context by using the ../ parent selector for security reasons.
But as all build contexts are passed directly from the client, you’re now able to use --build-context othersource=../../path/to/other/project to avoid this limitation.
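As a rough sketch of how this could apply to the layout in the question (assuming Buildx v0.8+ and the dockerfile:1.4 frontend; the stage layout and tag are illustrative, not a tested setup), Util/ could be passed as a named context so the main context can stay at Project1/:
# syntax=docker/dockerfile:1.4
FROM microsoft/dotnet:2.0-sdk AS build
WORKDIR /app
# "util" is the named context supplied via --build-context below
COPY --from=util . ./Util/
COPY . ./Project1/
RUN dotnet build Project1/DotNetCoreService/
Then invoke it with:
docker buildx build --build-context util=Util -f Project1/DotNetCoreService/.docker/Dockerfile -t dotnetcoreservice Project1/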
