Travis CI - Docker image not pushed to Docker Hub

I am new to CI with Travis and I am trying to learn it by following a course. I have created a public repository on Docker Hub and pushed the project to GitHub with a .travis.yml file. I had previously connected and authorised Travis on GitHub. The GitHub project can be found here.
The Travis build gets triggered and appears to be successful, but nothing is pushed to Docker Hub. On expanding the build log I can see this message being logged:
Must provide --username with --password-stdin
But I have already set up the environment variables in Travis as you can see below, so I don't understand why I am getting this message.
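For context, the failing step in a setup like this typically looks something like the following in .travis.yml. This is only a sketch: example-app is a placeholder image name, and DOCKER_USERNAME / DOCKER_PASSWORD must exactly match the variable names defined in the Travis settings, otherwise the login command receives an empty username and fails as above.

```yaml
# Sketch of a .travis.yml build-and-push step. example-app is a
# placeholder; DOCKER_USERNAME and DOCKER_PASSWORD must be defined as
# environment variables in the Travis repository settings.
language: generic

services:
  - docker

script:
  - docker build -t "$DOCKER_USERNAME/example-app" .

deploy:
  provider: script
  script: echo "$DOCKER_PASSWORD" | docker login -u "$DOCKER_USERNAME" --password-stdin && docker push "$DOCKER_USERNAME/example-app"
  on:
    branch: master
```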

I didn't realise that environment variables need to be set per project. Also, I wasn't even seeing the settings option for that repository until I synced GitHub with Travis again. Once I had done the sync one more time, the option appeared and I was able to set the variables for that repository. After that, the build went through and the image was pushed to Docker Hub.

Related

Unable to create Build on Docker Hub from GitHub

I just created a new account on hub.docker.com. I successfully linked my GitHub account in settings. After I clicked on Create Repository, I was presented with the familiar screen where I can create a new Docker repository. However, the first strange thing I noticed is that I don't have the additional build options that should appear on this screen.
I went on and created a blank repository. After that I went to the Builds tab and clicked on the GitHub button, which said I was connected.
In the following screen, I can select the desired repository. After filling out the remainder of the form, nothing happens when I click Save or Save and Build.
I do not see any error messages popping up either.
Any ideas what I am doing wrong with this newly created docker account?
Thank you very much for your time and help!
For GitHub, I'd recommend this guide for setting up a connection with GitHub Actions and Docker.
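In case that guide moves, the core of the GitHub Actions approach is a workflow roughly like the sketch below. The secret names DOCKERHUB_USERNAME / DOCKERHUB_TOKEN and the image name my-user/my-image are placeholder assumptions, not something from the original question:

```yaml
# .github/workflows/docker.yml sketch: build and push on pushes to master.
# DOCKERHUB_USERNAME / DOCKERHUB_TOKEN are repository secrets;
# my-user/my-image is a placeholder image name.
name: Build and push Docker image
on:
  push:
    branches: [master]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: docker/login-action@v3
        with:
          username: ${{ secrets.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}
      - uses: docker/build-push-action@v6
        with:
          context: .
          push: true
          tags: my-user/my-image:latest
```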
I've only set up a connection between Bitbucket and Docker, but I'll share the Bitbucket process for anyone else searching, or in case it provides any additional insight that might help with GitHub.
In Bitbucket, you need to set up a pipeline to push the build to the repo. The docs here outline the process.
A quick summary of the steps in Bitbucket:
Go to your repo in Bitbucket
Click on Pipelines and add the Docker template
Update the image name in the yml template (see the sketch after this list)
Commit the yml to the repo
Set up the variables for your credentials and repo slug in the repository settings in Bitbucket
Check the status of your build in the Pipelines tab; it should automatically rebuild on each update.
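For reference, the committed bitbucket-pipelines.yml ends up looking roughly like this sketch. The variable names DOCKER_USERNAME / DOCKER_PASSWORD and the image name myorg/myapp are placeholders for whatever you configure in the repository settings:

```yaml
# bitbucket-pipelines.yml sketch: build the image and push it to Docker Hub.
# DOCKER_USERNAME and DOCKER_PASSWORD are repository variables set in
# Bitbucket; myorg/myapp is a placeholder image name.
image: atlassian/default-image:3

pipelines:
  default:
    - step:
        name: Build and push Docker image
        services:
          - docker
        script:
          - docker build -t "myorg/myapp:latest" .
          - echo "$DOCKER_PASSWORD" | docker login -u "$DOCKER_USERNAME" --password-stdin
          - docker push "myorg/myapp:latest"
```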

CI/CD integration problem when using google-cloud-build with github push as trigger for Cloud Run

I am trying to set up a CI/CD pipeline using one of my public GitHub repositories as the source for a Cloud Run (fully managed) service, using Cloud Build. I am using a Dockerfile in the root folder of the repository, with the source configuration parameter set to /Dockerfile when setting up the Cloud Build trigger (to continuously deploy new revisions from the source repository).
When I initialize the Cloud Run instance, I face the following error:
Moreover, when I try to run my Cloud Build trigger manually, it shows the following error:
I also tried editing the continuous deployment settings, setting them to automatically detect the Dockerfile/cloudbuild.yaml. After that, the build process succeeds, but the revisions are not getting updated. I've also tried deploying a new revision and then running the Cloud Build trigger, but it still isn't able to pick up the latest build from Container Registry.
I am positive that my Dockerfile and application code are working properly, since I've previously submitted the build to Container Registry using Google Cloud Shell and tested it manually after deploying it to Cloud Run.
Need help to fix the issue.
Uppercase letters in the image path aren't allowed. Change Toxicity-Detector to toxicity-detector.
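To illustrate, a minimal cloudbuild.yaml with an all-lowercase image path might look like the sketch below (my-project is a placeholder project ID; the same rule applies if the image path is configured in the trigger UI instead of a file):

```yaml
# Minimal cloudbuild.yaml sketch; note the all-lowercase image path.
# my-project is a placeholder GCP project ID.
steps:
  - name: 'gcr.io/cloud-builders/docker'
    args: ['build', '-t', 'gcr.io/my-project/toxicity-detector', '.']
images:
  - 'gcr.io/my-project/toxicity-detector'
```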

DockerHub builds too many images

I set up Docker Hub to build an image every time a commit is pushed to either dev or master on GitHub.
But every time a commit is pushed, two builds are scheduled for the same tag.
Why?
Maybe you already found the solution, but in case other users run into the same issue:
Docker Hub automated builds rely on a GitHub webhook to trigger the corresponding Docker build on each push. However, in the past few months Docker Hub has changed the corresponding URL entrypoint, so a GitHub repository may contain several versions of the webhook, which leads to multiple, spurious builds. (See e.g. this GitHub issue for details.)
To fix this, you just need to browse the webhook settings of your GitHub repo (they should be at the private URL https://github.com/cadoman/mapisto-api/settings/hooks) and keep only the Docker Hub item that starts with https://hub.docker.com/api/...

Pipeline GitHub -> Travis CI -> Docker

I have a GitHub repository that is linked to an automated build on Docker Hub. Consequently, on each commit to the master branch, Docker Hub triggers a build of the Docker image.
Also, each commit is tested by Travis CI automatically.
My question is: is there any way to trigger Docker Hub only if Travis finishes successfully? Do I need some sort of webhook or something like that for my goal?
You could trigger the Travis CI tests after the repository is pushed. Then, in the deploy step, you could trigger a build on Docker Hub. Or even do the build inside Travis and just push the image to the registry you are using.
Travis has a nice overview of how to make this flow happen here.
The gist is that you're going to need sudo: required, so you'll be running in a VM instead of inside Docker, as is the standard way in Travis. You also need to add docker as a service, much like you'd add redis or postgres for an integration test. The "Pushing a Docker Image to a Registry" section has a lot of info on setting things up for the actual deployment. I'd use an actual deploy step with the script provider rather than after_success, but that's up to you; a sketch follows below.
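Putting those pieces together, a .travis.yml for this flow might look roughly like the following. The image name your-user/your-app, the test command, and the DOCKER_USER / DOCKER_PASS variable names are all placeholders:

```yaml
# Sketch of a .travis.yml that tests first and pushes to Docker Hub only
# when the master build succeeds. your-user/your-app, the test command,
# and DOCKER_USER/DOCKER_PASS are placeholders.
sudo: required

services:
  - docker

script:
  - docker build -t "your-user/your-app:$TRAVIS_COMMIT" .
  # Placeholder test command; a failing test fails the build, so the
  # deploy step below never runs.
  - docker run --rm "your-user/your-app:$TRAVIS_COMMIT" ./run-tests.sh

deploy:
  provider: script
  script: echo "$DOCKER_PASS" | docker login -u "$DOCKER_USER" --password-stdin && docker push "your-user/your-app:$TRAVIS_COMMIT"
  on:
    branch: master
```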

Jenkins - Docker integration - Use Jenkins to build Docker images and push to the registry

I am currently working on integrating Docker with Jenkins, and I am trying to figure out the following pipeline:
Whenever a Dockerfile is updated in Git, trigger a Jenkins job to do the following:
Build the Docker image
Test and verify the Docker image
Version the image (prod, testing, etc.)
Push the image to the registry
If the image is not built, have a proper mechanism to get the logs
From my research, I found that there are two different Jenkins plugins for Docker integration: a build step plugin and a Docker build-and-publish plugin. As far as I could see, there are no plugins or workflows to test the image before pushing it to the repository. Since we are doing this from scratch, I would like to know the best tried-and-tested workflow.
Any help appreciated.
We applied the same mindset as "git flow" to the creation of Docker images. In our solution, there was no need to test the image itself. We solved that by splitting the build into a "Source-Build", which produces artifacts, and a downstream job, e.g. a "Runtime-Build", which only packages the artifacts into the runtime and pushes to the registry. At that point the whole stack is delivered to a "Release-Stage" for automated testing.
To test the image there's a tool called Anchore.
Then, if you want to integrate other types of tests before building the Docker image, you can, for example, integrate SonarQube with Jenkins and do a static analysis of the source code. Full example at: https://pillsfromtheweb.blogspot.com/2020/05/integrate-sonarqube-static-analysis-in.html
