I run a Docker image for data processing on a Windows Server 2016 machine (single VM, on premises). My image is stored in an Azure Container Registry. The code does not change often. To get security updates, I would like to rebuild and release after the microsoft/windowsservercore base image is updated.
Is there a best-practice way to do this?
I thought about 3 ways of solving this:
Run a scheduled build every 24 hours: pull microsoft/windowsservercore, pull my custom image, run PowerShell to get the build dates and compare them (or use some of the history IDs). If a rebuild is needed, build the new image and tag the build. Configure the release to run only on this tag.
Run a job to check the update time of the Docker image and trigger the build with a REST request.
Put a basic Dockerfile on GitHub. Set up an automated build with a trigger on microsoft/windowsservercore and configure the webhook to a web service, which starts the build via REST.
But I really don't like any of these ideas. Is there a better option?
You can use Azure Container Registry webhooks directly. The simple workflow:
Build a Web API project that queues a build for each incoming webhook request through the Queue a Build REST API.
Create an Azure Container Registry webhook that calls the Web API from step 1.
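For illustration, the Web API would end up issuing a request roughly like the sketch below. The organization, project, build definition id (42) and the PAT variable are placeholders, not values from the question:

    # Queue a build via the Azure DevOps "Builds - Queue" REST API.
    # ORG, PROJECT and the definition id are placeholders; $AZURE_DEVOPS_PAT
    # is assumed to hold a personal access token with build permissions.
    curl -s -X POST \
      -u ":$AZURE_DEVOPS_PAT" \
      -H "Content-Type: application/json" \
      -d '{"definition": {"id": 42}}' \
      "https://dev.azure.com/ORG/PROJECT/_apis/build/builds?api-version=5.1"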
I chose option three. Therefore I set up a GitHub repository with a one-line Dockerfile:
FROM alpine
I used the alpine image and not windowsservercore because automated builds currently do not support Windows images. I configured an automated build on Docker Hub and added microsoft/windowsservercore as a linked repository.
Then I set up a Microsoft Flow with an HTTP request trigger to start the build, and added the Flow URL as a new webhook on the automated build.
For me these are too many moving parts that have to be configured and work together, but I know of no better way.
Related
Currently I trigger a Cloud Build each time a pull request is completed.
The image is built correctly, but we have to manually go to Edit and Deploy New Revision and select the most recent docker image to deploy.
How can we automate this process and have a container deployed from the image automatically?
You can do it with a pretty simple GitHub Action. I have followed this article:
https://towardsdatascience.com/deploy-to-google-cloud-run-using-github-actions-590ecf957af0
Cloud Run also natively integrates with Cloud Build. You can import a GitHub repository and it sets up a GCB trigger for your repository (on the specified branch or tag filter).
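If you go the Cloud Build route, a minimal cloudbuild.yaml could look like the sketch below. The service name my-service and the region are placeholders; note also that the Cloud Build service account needs permission to deploy to Cloud Run:

    # cloudbuild.yaml: build and push the image, then deploy it to Cloud Run.
    steps:
      - name: 'gcr.io/cloud-builders/docker'
        args: ['build', '-t', 'gcr.io/$PROJECT_ID/my-service:$COMMIT_SHA', '.']
      - name: 'gcr.io/cloud-builders/docker'
        args: ['push', 'gcr.io/$PROJECT_ID/my-service:$COMMIT_SHA']
      - name: 'gcr.io/google.com/cloudsdktool/cloud-sdk'
        entrypoint: gcloud
        args: ['run', 'deploy', 'my-service',
               '--image', 'gcr.io/$PROJECT_ID/my-service:$COMMIT_SHA',
               '--region', 'us-central1', '--platform', 'managed']
    images:
      - 'gcr.io/$PROJECT_ID/my-service:$COMMIT_SHA'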
You can use GitLab CI to automate your Cloud Run deployment.
Here is a tutorial if you want to automate your deployment with GitLab CI: Link
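In outline, a deploy job in .gitlab-ci.yml could look like the hedged sketch below (not the tutorial's exact file; the project id, service name and the GCP_SERVICE_KEY CI variable are assumptions):

    # .gitlab-ci.yml: authenticate with a service-account key stored in a CI
    # variable, build the image with Cloud Build, then deploy to Cloud Run.
    deploy:
      image: google/cloud-sdk:slim
      script:
        - echo "$GCP_SERVICE_KEY" > key.json
        - gcloud auth activate-service-account --key-file key.json
        - gcloud config set project my-gcp-project
        - gcloud builds submit --tag gcr.io/my-gcp-project/my-service
        - gcloud run deploy my-service --image gcr.io/my-gcp-project/my-service
          --region us-central1 --platform managed
      only:
        - master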
Docker introduced RUN --mount=type=cache, which works well for me locally, but I want to be able to leverage it in CI, specifically Azure DevOps.
But I can't find a way of saving and loading the cache between builds. Is there an option to do this?
Please refer to this doc:
In the current design of Microsoft-hosted agents, every job is dispatched to a newly provisioned virtual machine (based on the image generated from azure-pipelines-image-generation repository templates). These virtual machines are cleaned up after the job reaches completion, not persisted and thus not reusable for subsequent jobs. The ephemeral nature of virtual machines prevents the reuse of cached Docker layers.
Therefore, the local Docker cache on the VM cannot be used by another build when you use Microsoft-hosted agents.
Here are some alternative methods:
You could use a self-hosted agent to execute the docker build process. Multiple builds can then share the local cache.
You can also use the Cache task and the docker save/load commands to upload the saved Docker layers to the Azure DevOps server and restore them on a future run (see the YAML sketch after this list).
Use docker pull to pull the image from the remote repository, then use --cache-from to point at that image. You could push the built image to the remote repository for the next build.
You could refer to this blog and this ticket for more detailed info.
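For the Cache task option, a hedged sketch of what azure-pipelines.yml could look like (the image name my-image is a placeholder; Cache@2 keys the cache off the Dockerfile contents):

    variables:
      CACHE_DIR: $(Pipeline.Workspace)/docker-cache

    steps:
      - task: Cache@2
        displayName: Cache Docker layers
        inputs:
          key: 'docker | "$(Agent.OS)" | Dockerfile'
          path: $(CACHE_DIR)

      - script: |
          # Restore previously saved layers, if any, then build using them.
          CACHE_ARGS=""
          if [ -f "$(CACHE_DIR)/image.tar" ]; then
            docker load -i "$(CACHE_DIR)/image.tar"
            CACHE_ARGS="--cache-from my-image:latest"
          fi
          docker build $CACHE_ARGS -t my-image:latest .
          mkdir -p "$(CACHE_DIR)"
          docker save -o "$(CACHE_DIR)/image.tar" my-image:latest
        displayName: Build with restored cache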
How can I create a pipeline to build and release a Docker Compose setup with Azure DevOps through the graphical interface (GUI)? I am not an expert in DevOps, but I have this challenge at work.
I would point you toward a great guide by Microsoft; it's for Java applications, but you can get what you need out of it.
Solution in general:
Open the Azure portal. Select + Create a resource and search for Container Registry. Select Create. In the Create Container Registry dialog, enter a name for the service, select the resource group and location, and click Review + Create. Once the validation succeeds, click Create.
In your CI build you need two tasks: one for the build/compose step and another to publish the image to your Azure Container Registry. You will use the same task type for both (see the sketch below).
This container registry is where you store the outputs of your builds, similar to artifacts in traditional CI builds. It is also what you publish your application from during a release, whether to on-prem or cloud targets.
You can read more about the parameters you need to provide and the settings in detail in the guide.
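For reference, the YAML equivalent of those two GUI tasks looks roughly like this (a sketch only; the registry name and the service connection name are placeholders):

    steps:
      - task: DockerCompose@0
        displayName: Build services
        inputs:
          action: Build services
          azureSubscriptionEndpoint: my-azure-service-connection
          azureContainerRegistry: myregistry.azurecr.io
          dockerComposeFile: docker-compose.yml
          additionalImageTags: $(Build.BuildId)

      - task: DockerCompose@0
        displayName: Push services
        inputs:
          action: Push services
          azureSubscriptionEndpoint: my-azure-service-connection
          azureContainerRegistry: myregistry.azurecr.io
          dockerComposeFile: docker-compose.yml
          additionalImageTags: $(Build.BuildId)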
P.S. Here is an example of how to dockerize an existing .NET Core application.
How do you build and release your Docker Compose setup locally?
Normally, you can copy the docker-compose and Docker CLI commands that you execute locally into the shell script tasks (such as Bash, PowerShell, etc.) in the pipeline you set up on Azure DevOps.
Of course, there are also the built-in Docker Compose task and Docker task.
I am exploring the automated build features of the "New Docker Hub".
I see that it is possible to trigger automated tests for each new pull request that is submitted against a specific branch.
Is it possible to push/publish the images created from a specific pull request?
If this was possible, an open source project could invite folks to test pull requests without needing to re-build the code for a pull request.
I am trying to trigger a Jenkins job (via "trigger builds remotely") on a Docker image build, but all I am getting on Docker Hub is the following:
HISTORY
ID Status Date & Time
7345... ! ERROR 10/12/17 10:03
Reason (I assume): Docker is not authenticated to post to the Jenkins URL.
Question: How can I trigger the job automatically when an image gets pushed to docker hub?
Pull and run the Watchtower Docker image to poll any third-party public Docker image on Docker Hub or Quay that you need (typically as a base image of your own containers). Here's how. "Polling" here does not imply crudely pulling the whole image every 5 minutes or so; we are periodically checking for changes in the image, downloading only the checksum (SHA digest) most of the time (when there are no changes against the locally cached image).
Install the Build Token Root plugin in your Jenkins server and set it up to receive Slack-formatted notifications secured with a token, to trigger builds remotely or, safer, locally (those triggers will be coming from the Watchtower container, not Slack). Here's how.
Set up Watchtower to post Slack messages to your Jenkins endpoint upon every change in the image(s) (tags) that you want (see the sketch after this list). Here's how.
Optionally, if your scale is so large that you could end up overloading and bringing down the entire Docker Hub with a flood of HTTP GET requests (should the time triggers go wrong and turn into a tight loop), make sure to build some safety checks on top of Watchtower to "watch the watchman".
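A hedged sketch of the Watchtower step: the Jenkins URL, job name, interval and token are placeholders, with the token coming from the Build Token Root plugin set up in step 2:

    # Run Watchtower, checking every 5 minutes and sending its Slack-formatted
    # notifications to the Jenkins endpoint instead of a real Slack channel.
    docker run -d --name watchtower \
      -v /var/run/docker.sock:/var/run/docker.sock \
      -e WATCHTOWER_NOTIFICATIONS=slack \
      -e WATCHTOWER_NOTIFICATION_SLACK_HOOK_URL="https://jenkins.example.com/buildByToken/build?job=my-job&token=MYTOKEN" \
      containrrr/watchtower --interval 300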
You can try the following plugin, which claims to do what you're looking for: https://wiki.jenkins.io/display/JENKINS/CloudBees+Docker+Hub+Notification
You can configure a webhook in Docker Hub which will trigger the Jenkins build.
Docker Hub webhooks targeting your Jenkins server endpoint require making periodic copies of the image to another repo that you own (see my other answer on Docker Hub -> Watchtower -> Jenkins integration through Slack notifications).
More details
You need to set up a cron job with periodic polling (docker pull) of the source repo to pull its latest tag and, if a change is detected, re-tag it as your own and docker push it to a repo you own (e.g. a "clone" of the source Docker Hub repo) where you have set up a webhook targeting your Jenkins build endpoint.
Then, and only then (in a repo you own), will Jenkins plugins such as Docker Hub Notification Trigger work for you.
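A minimal sketch of such a cron-driven mirror script (the source and destination repos are placeholders; it compares digests so nothing is pushed when the upstream image is unchanged):

    #!/bin/sh
    # Pull the upstream image; if its digest changed, re-tag and push it to a
    # repo you own, whose Docker Hub webhook then triggers Jenkins.
    SRC=library/ubuntu:latest
    DEST=myuser/ubuntu-mirror:latest

    BEFORE=$(docker image inspect --format '{{index .RepoDigests 0}}' "$SRC" 2>/dev/null)
    docker pull "$SRC" >/dev/null
    AFTER=$(docker image inspect --format '{{index .RepoDigests 0}}' "$SRC")

    if [ "$BEFORE" != "$AFTER" ]; then
      docker tag "$SRC" "$DEST"
      docker push "$DEST"    # fires the webhook on the mirrored repo
    fi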
Polling for Dockerfile / release changes
As a substitute for polling the registry for image changes (which need not generate much network traffic, thanks to the local cache of Docker images), you can also poll the source Dockerfile on GitHub using wget. For instance, the Dockerfiles of the official Docker Hub images are here. When the GitHub repo makes releases, you can get push notifications of them using GitHub's Watch > Releases Only feature, provided they have CI Docker builds. Docker images will usually be available with a delay after code releases, even with complete automation, so image polling is more reliable.
Other projects
There was also a proposal for a 2019 Google Summer of Code project called Polling Docker Registries for Image Changes that tried to solve this problem for Jenkins users (incl. apparently Google), but sadly it was not taken up by participants.
Run a cron job with a periodic docker search to list all tags of the Docker image of interest (here's the script). Note that this script requires substituting the jannis/jq image with an existing one (e.g. docker run --rm -i imega/jq).
Save resulting tags list to a file, and monitor it for changes (e.g. with inotifywait).
Fire a POST request using curl to your Jenkins server's endpoint using the Generic Webhook Trigger plugin (a combined sketch follows after the cautions below).
Cautions:
for efficiency reasons, this tag-listing script should be limited to a few (say, 3) top pages or to simple repos with a few tags,
image tag monitoring relies on tags being updated correctly (automatically) after each image change, rather than being stuck in the past, like, say, Ubuntu tags (e.g. trusty-20190515 was updated a few days ago, in late November, without a change to its mid-May tag).
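Putting the pieces together, a hedged sketch (the repo, Jenkins URL and token are placeholders; it uses the Docker Hub v2 tags API and the imega/jq image mentioned above, with a simple cmp in place of inotifywait):

    #!/bin/sh
    # List the first page of tags for an image; if the list changed since the
    # last run, fire the Jenkins Generic Webhook Trigger endpoint.
    REPO=library/alpine
    TAGS_FILE=/var/tmp/alpine.tags

    curl -s "https://registry.hub.docker.com/v2/repositories/$REPO/tags?page_size=100" \
      | docker run --rm -i imega/jq -r '.results[].name' | sort > "$TAGS_FILE.new"

    if ! cmp -s "$TAGS_FILE" "$TAGS_FILE.new"; then
      mv "$TAGS_FILE.new" "$TAGS_FILE"
      curl -s -X POST "https://jenkins.example.com/generic-webhook-trigger/invoke?token=MYTOKEN"
    else
      rm "$TAGS_FILE.new"
    fi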