When building Azure Functions using custom containers, I am unable to restore NuGet packages from a private Azure Artifacts NuGet feed. I get the error NU1301: Unable to load the service index for source https://pkgs.dev.azure.com/<project>/_packaging/<feed>/nuget/v3/index.json in the docker-compose logs, because Docker has no way of authenticating with the Azure Artifacts feed. How do I authenticate with Azure Artifacts in a way that works both locally when debugging with Visual Studio Container Tools, and also when building the Docker images in Azure Pipelines?
Please do not suggest using a PAT, because this is just a security nightmare.
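For reference, the restore step in my Dockerfile looks roughly like this (project name and SDK tag are placeholders; nuget.config is the file that lists the private feed as a package source):

    FROM mcr.microsoft.com/dotnet/sdk:6.0 AS build
    WORKDIR /src
    # nuget.config contains the Azure Artifacts feed as a package source
    COPY MyFunctionApp.csproj nuget.config ./
    # This is the step that fails with NU1301, because nothing inside the
    # build container can authenticate against the private feed
    RUN dotnet restore MyFunctionApp.csproj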
I have a specific requirement to share common .DLL files across multiple applications running on Azure Container Instances / App Services. What I want to do is package all the .DLL files into one Docker image, put it in a container registry or Docker Hub, and use them at runtime as common libraries for the other applications. This is required to migrate the following architecture, currently on an on-prem IIS server, to the cloud. Is this possible to do in Azure using a container registry, and how should I approach it?
I've done this using Azure Container Instances. I pushed the base DLL library and the projects that reference it as separate images to an Azure Container Registry, and used relative paths in the Dockerfile inside each project to reference the DLL image, as sketched below. It worked smoothly.
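A rough sketch of what such a Dockerfile can look like (the registry name, image tags and paths are placeholders, not the exact ones from my setup):

    # Image in Azure Container Registry that contains only the shared DLLs
    FROM myregistry.azurecr.io/common-libs:1.0 AS libs

    # Runtime image for the application itself
    FROM mcr.microsoft.com/dotnet/aspnet:6.0
    WORKDIR /app
    COPY ./publish/ ./
    # Copy the common DLLs out of the library image next to the app binaries
    COPY --from=libs /libs/ ./
    ENTRYPOINT ["dotnet", "MyApp.dll"]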
I am new to this. I am working on a project that is developing an application. From what I know, people normally store the image on Docker Hub, then create a Docker container from it and pass it to Kubernetes. But what we are trying to do is store the repositories without using Docker Hub. Is there a way to build something like that from scratch? Which direction should I look into?
You can install a JFrog or Nexus artifact management server to create a repository and store your images in that artifact management repo.
Please check this link for more details.
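As a rough sketch, once the JFrog or Nexus instance exposes a Docker registry endpoint, pushing an image to it is the usual tag-and-push flow (hostname, port and image name below are placeholders):

    # Log in to the self-hosted registry
    docker login nexus.example.com:8082
    # Tag the locally built image for that registry
    docker tag myapp:1.0 nexus.example.com:8082/myapp:1.0
    # Push it so your Kubernetes nodes can pull it from there
    docker push nexus.example.com:8082/myapp:1.0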
If I have an automated build set up on Docker Hub, for instance based on the ubuntu:yy_mm image, and in its Dockerfile I install some package foo-bar-ng through apt-get, how can I set up the image to be automatically rebuilt when the package is updated in the Ubuntu repository?
Right now the only approach I see is to develop and spin up a separate private service of my own which will monitor the package version in the official Ubuntu repository and trigger the rebuild via the "Build triggers" Docker Hub feature that is available in the automated build settings:
Trigger your Automated Build by sending a POST to a specific endpoint.
For instance, here is a question about how new packages can be monitored in a specific Ubuntu repo.
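A rough sketch of what such a monitoring job could look like (the package name, the stored-version file and $TRIGGER_URL are placeholders; the trigger URL would be the one shown in the automated build settings):

    #!/bin/sh
    # Check the candidate version of the package in the Ubuntu repositories
    apt-get update -qq
    current=$(apt-cache policy foo-bar-ng | awk '/Candidate:/ {print $2}')
    last=$(cat /var/lib/foo-bar-ng.last-built 2>/dev/null)
    # If the version changed since the last build, fire the Docker Hub build trigger
    if [ "$current" != "$last" ]; then
        curl -sf -X POST "$TRIGGER_URL"
        echo "$current" > /var/lib/foo-bar-ng.last-built
    fi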
(Made this as an answer - let the community vote on it and, especially, provide better answer if there is any)
I know that Docker Hub is there, but it only allows 1 private repository. Can I put these images on GitHub/Bitbucket?
In general you don't want to use version control on large binary images (like videos or compiled files), as git and the like were intended for 'source control', emphasis on the source. Technically, there's nothing preventing you from doing so and putting the Docker image files into git (outside the limits of the service you're using).
One major issue you'll have is that GitHub/Bitbucket have no integration with Docker, as neither provides the Docker Registry API needed for a Docker host to pull down the images as needed. This means you'll need to manually pull the image files out of the version control system holding them if you want to use them.
If you're going to do that, why not just use S3 or something like that?
If you really want 'version control' on your images (which docker hub does not do...) you'd need to look at something like: https://about.gitlab.com/2015/02/17/gitlab-annex-solves-the-problem-of-versioning-large-binaries-with-git/
Finally, docker hub only allows one FREE private repo. You can pay for more.
So the way to go is:
Create a repository on Github or Bitbucket
Commit and push your Dockerfile (with config files if necessary)
Create an automated build on Docker Hub which uses the Github / Bitbucket repo as source.
In case you need it all to be private, you can self-host a git service like GitLab or Gogs, and of course you can also self-host a Docker registry service for the images.
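For the self-hosted registry part, a minimal sketch using the open-source registry image (the port and image names are just examples) looks like this:

    # Run a private registry locally
    docker run -d -p 5000:5000 --name registry registry:2
    # Tag and push an image to it
    docker tag myimage:latest localhost:5000/myimage:latest
    docker push localhost:5000/myimage:latest
    # Any host that can reach the registry can now pull it
    docker pull localhost:5000/myimage:latest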
Yes, since Sept. 2020.
See "Introducing GitHub Container Registry" from Kayla Ngan:
Since releasing GitHub Packages last year (May 2019), hundreds of millions of packages have been downloaded from GitHub, with Docker as the second most popular ecosystem in Packages behind npm.
Available today as a public beta, GitHub Container Registry improves how we handle containers within GitHub Packages.
With the new capabilities introduced today, you can better enforce access policies, encourage usage of a standard base image, and promote innersourcing through easier sharing across the organization.
Our users have asked for anonymous access for public container images, similar to how we enable anonymous access to public repositories of source code today.
Anonymous access is available with GitHub Container Registry today, and we’ve gotten things started today by publishing a public image of our own super-linter.
GitHub Container Registry is free for public images.
With GitHub Actions, publishing to GitHub Container Registry is easy. Actions automatically suggests workflows for you based on your work, and we’ve updated the “Publish Docker Container” workflow template to make publishing straightforward.
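In practice, pushing an image to GitHub Container Registry from the command line looks roughly like this (username, owner and token are placeholders; the token needs the appropriate packages scope):

    # Authenticate against ghcr.io
    echo "$GHCR_TOKEN" | docker login ghcr.io -u <github-username> --password-stdin
    # Tag the image under your account or organization namespace and push it
    docker tag myimage:latest ghcr.io/<owner>/myimage:latest
    docker push ghcr.io/<owner>/myimage:latest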
GitHub is in the process of releasing something similar to ECR or Docker Hub. At the time of writing this, it's in Alpha phase and you can request access.
From GitHub:
"GitHub Package Registry is a software package hosting service, similar to npmjs.org, rubygems.org, or hub.docker.com, that allows you to host your packages and code in one place. You can host software packages privately or publicly and use them as dependencies in your projects."
https://help.github.com/en/articles/about-github-package-registry
I guess you are asking about Docker images. You can set up your own private registry to hold the Docker images. If you are not pushing only Dockerfiles but are interested in pushing whole images, then pushing the images as a whole to GitHub is a very bad idea. Consider a case where you have a 600 MB Docker image: pushing it to GitHub is like putting 600 MB of data into a GitHub repo, and if you keep pushing more images there, it will get terribly bad.
Also, a Docker registry does the intelligent job of storing only a single copy of each layer (a layer can be referenced by multiple images). If you use GitHub, you don't benefit from this: you will end up storing multiple copies of large files, which is really bad.
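As a rough illustration (the registry and image names are placeholders): when two images share a base image, pushing the second one to a registry skips the layers that are already stored there, whereas a git repository would store every byte of both again.

    # Both app-a and app-b are built FROM the same base image
    docker push registry.example.com/app-a:1.0
    # For app-b the registry reports "Layer already exists" for the shared base layers
    docker push registry.example.com/app-b:1.0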
I would definitely suggest going with a private Docker registry rather than going with GitHub.
If there is a real need to put a Docker image on GitHub/Bitbucket, you can try saving it into an archive (by using https://docs.docker.com/engine/reference/commandline/save/) and committing/pushing it to your repository.
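A minimal sketch of that flow (image name and file names are examples); note that hosted git services typically reject very large files unless you use something like Git LFS:

    # Export the image to a tar archive
    docker save -o myimage.tar myimage:latest
    # Commit the archive to the repository
    git add myimage.tar
    git commit -m "Add exported Docker image"
    git push
    # Later, restore the image from the archive on another machine
    docker load -i myimage.tar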