CI/CD integration problem when using google-cloud-build with github push as trigger for Cloud Run - docker

I am trying to set up a CI/CD pipeline that uses one of my public GitHub repositories as the source for a Cloud Run (fully managed) service via Cloud Build. The Dockerfile sits in the root folder of the repository, and I set the source location parameter to /Dockerfile when creating the Cloud Build trigger (to continuously deploy new revisions from the source repository).
When I initialize the Cloud Run instance, I get the following error:
Moreover, when I run my Cloud Build trigger manually, it shows the following error:
I also tried editing the continuous deployment settings to automatically detect the Dockerfile/cloudbuild.yaml. After that the build succeeds, but the revisions are not updated. I've also tried deploying a new revision and then running the Cloud Build trigger, but it still isn't able to pick up the latest build from Container Registry.
I am positive that my Dockerfile and application code work, since I've previously submitted the build to Container Registry using Google Cloud Shell and tested it manually after deploying it to Cloud Run.
Need help to fix the issue.

Uppercase letters aren't allowed in the image path. Change Toxicity-Detector to toxicity-detector.
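For example, if the trigger tags the image after the repository name, lowercasing it first avoids the invalid path (the project ID here is a placeholder):

```shell
# Registry image paths must be all lowercase; derive a valid tag
# from a mixed-case repo name before building/pushing
REPO_NAME="Toxicity-Detector"
IMAGE="gcr.io/my-project/$(echo "$REPO_NAME" | tr '[:upper:]' '[:lower:]')"
echo "$IMAGE"   # gcr.io/my-project/toxicity-detector
```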

Related

Is there any way in google cloud to automatically erase a past docker image the moment you create a new one?

I made a Google Cloud Build trigger that creates an image whenever I make a git push to the repo. The image is stored in Container Registry, but in the long run the registry will fill up with Docker images. I'm looking for a way to delete old ones automatically; I know it can be done manually, but that is not what I want. Anything is helpful at this moment.
You can integrate your code in Cloud Source Repositories. Cloud Build will then create the desired Docker image and push it to Google Cloud Container Registry. As soon as you commit code, the whole pipeline runs and updates the image.
Click the Cloud Run instance, choose Set up continuous deployment, enter the GitHub repo, and fill in the build configuration.
Then attach the existing Cloud Build trigger to the Cloud Run service.
GCP Documentation:
https://cloud.google.com/run/docs/continuous-deployment-with-cloud-build
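The cleanup part of the question isn't handled by the pipeline itself. A sketch that deletes untagged Container Registry images with gcloud (project and image names are placeholders), which could run as an extra build step or a scheduled job:

```shell
# List digests of images that no longer have any tag, then delete them.
# Untagged digests are usually the old layers left behind by new pushes.
gcloud container images list-tags gcr.io/my-project/my-image \
  --filter='-tags:*' --format='get(digest)' |
while read -r digest; do
  gcloud container images delete "gcr.io/my-project/my-image@${digest}" --quiet
done
```

Note this assumes each new push re-tags the same image name (e.g. `latest`), so superseded digests become untagged.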

Where to find deployment and run logs GCP

So, I've deployed a container using Google Cloud Run with continuous deployment. Google Cloud Build says that the build was successful, but when I go to the app URL I see an error page: "Continuous deployment has been set up, but your repository has failed to build and deploy.".
There are no container logs on the Google Cloud Run logs page; the only one I can see is: "Hello from Cloud Run! The container started successfully and is listening for HTTP requests on $PORT"
How do I find deployment and run logs in order to figure out what went wrong?
Make sure your Build, Push and Deploy steps are set up correctly in Cloud Build (either in your repository's cloudbuild.yaml or inline in the trigger). Since you are likely using a trigger, you can find the configuration there (Cloud Build > Triggers > {your-trigger}).
The error can mean that the image you are trying to deploy to Cloud Run did not build correctly.
To verify the image has been created, go to the Cloud Run YAML tab and check that the image it is trying to pull actually exists (Cloud Run > {your-service} > YAML tab).
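To dig further from the command line, a few gcloud commands help locate both kinds of logs (the service name below is a placeholder):

```shell
# List recent builds and pull the full log of a failed one
gcloud builds list --limit=5
gcloud builds log BUILD_ID

# Read the Cloud Run container/request logs for a service
gcloud logging read \
  'resource.type="cloud_run_revision" AND resource.labels.service_name="my-service"' \
  --limit=50
```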

How to deploy the built docker image built by Cloud Build on Cloud Run automatically

Currently I trigger a Cloud Build each time a pull request is completed.
The image is built correctly, but we have to manually go to Edit and Deploy New Revision and select the most recent docker image to deploy.
How can we automate this process and have a container deployed from the image automatically?
You can do it with a pretty simple GitHub Action. I have followed this article:
https://towardsdatascience.com/deploy-to-google-cloud-run-using-github-actions-590ecf957af0
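The workflow from that article boils down to something like the sketch below (the secret name, service name, and region are assumptions; check the action's docs for current inputs):

```yaml
# .github/workflows/deploy.yml — deploy to Cloud Run on every push to main
name: deploy-to-cloud-run
on:
  push:
    branches: [main]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Authenticate with a service-account key stored as a repo secret
      - uses: google-github-actions/auth@v2
        with:
          credentials_json: ${{ secrets.GCP_SA_KEY }}
      # Build from source and roll out a new revision
      - uses: google-github-actions/deploy-cloudrun@v2
        with:
          service: my-service
          region: us-central1
          source: .
```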
Cloud Run also natively integrates with Cloud Build. You can import a GitHub repository and it sets up a GCB trigger for your repository (on the specified branch or tag filter).
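With the native integration, a cloudbuild.yaml that builds, pushes, and then deploys in one run looks roughly like this (the service name and region are placeholders):

```yaml
# cloudbuild.yaml — build the image, push it, then deploy that exact tag
steps:
- name: 'gcr.io/cloud-builders/docker'
  args: ['build', '-t', 'gcr.io/$PROJECT_ID/my-service:$COMMIT_SHA', '.']
- name: 'gcr.io/cloud-builders/docker'
  args: ['push', 'gcr.io/$PROJECT_ID/my-service:$COMMIT_SHA']
- name: 'gcr.io/google.com/cloudsdktool/cloud-sdk'
  entrypoint: 'gcloud'
  args: ['run', 'deploy', 'my-service',
         '--image', 'gcr.io/$PROJECT_ID/my-service:$COMMIT_SHA',
         '--region', 'us-central1', '--platform', 'managed']
images:
- 'gcr.io/$PROJECT_ID/my-service:$COMMIT_SHA'
```

The Cloud Build service account needs the Cloud Run Admin and Service Account User roles for the deploy step to succeed.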
You can use GitLab CI to automate your Cloud Run deployment.
Here is a tutorial if you want to automate your deployment with GitLab CI: Link
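A minimal .gitlab-ci.yml sketch of that approach (the GCP_SA_KEY CI variable and the project/service names are assumptions):

```yaml
# Deploy to Cloud Run from GitLab CI on pushes to main
deploy:
  image: google/cloud-sdk:slim
  script:
    - echo "$GCP_SA_KEY" > /tmp/key.json
    - gcloud auth activate-service-account --key-file /tmp/key.json
    - >
      gcloud run deploy my-service
      --image "gcr.io/my-project/my-service:$CI_COMMIT_SHA"
      --region us-central1 --platform managed
  only:
    - main
```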

Trigger a new build via Codeship API from Jenkins

I have a CI/CD setup with a Jenkins server to manage our internal CI/CD. We have Codeship performing our CI/CD for our AWS work.
I'm looking to set up jobs on our Jenkins server to manage when new builds are triggered on Codeship.
The aim being, we will have our Jira dashboard integrated with Jenkins in such a way that as an issue's status changes, specific jobs are executed.
So I'm trying to create a job that uses Codeship's API to trigger a new build, but it appears that you can only rerun an old build. How do you trigger a fresh build?
From the docs, you can only retrieve information and restart previous builds.
You want to run specific jobs, but those must be associated with some specific commit on your repository. You can identify the build for that specific commit and restart it.
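For example, restarting a known build through the v2 API looks roughly like this (the UUIDs are placeholders and the endpoint shapes are my reading of the docs, so verify them against the current API reference):

```shell
# Exchange basic-auth credentials for a short-lived token, then restart a build
TOKEN=$(curl -s -u "$CODESHIP_USER:$CODESHIP_PASSWORD" \
  -X POST https://api.codeship.com/v2/auth | jq -r '.access_token')

curl -s -X POST -H "Authorization: Bearer $TOKEN" \
  https://api.codeship.com/v2/organizations/ORG_UUID/projects/PROJECT_UUID/builds/BUILD_UUID/restart
```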
Builds are always triggered from your Git repository (GitHub or Bitbucket), and Codeship depends heavily on that to keep the flow as simple as possible. You don't need to upload anything anywhere and then command Codeship to run a build on it. All you need to do is specify a repository and push something.
You could create an internal Git server that your developers push to, and with Jenkins push changes from there to a repository connected to Codeship. That way you can indirectly control what gets tested and what does not.
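As a sketch of that indirection (hostnames, org, and branch names are placeholders), a Jenkins job could simply forward vetted commits:

```shell
# Run inside a Jenkins job: mirror approved changes from the internal
# repo to the repository Codeship watches
git clone git@internal.example.com:team/app.git app
cd app
git remote add codeship git@github.com:example-org/app.git
git push codeship HEAD:main   # this push is what triggers the Codeship build
```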

Jenkins - Docker integration - Use Jenkins to build Docker images and push to the registry

I am currently working on integrating Docker with Jenkins and I am currently trying to figure out the following pipeline:
Whenever a Dockerfile is updated in GIT, trigger a Jenkins Job to do the following
Build the Docker image
Test, Verify the Docker image
Version the image - Prod, testing etc.
Push the image to the registry
If the image is not built, have a proper mechanism to get the logs
From my research, I found that there are two different Jenkins plugins for Docker integration: the Docker Build Step plugin and the Docker Build and Publish plugin. As far as I can see, there are no plugins or workflows to test the image before pushing it to the registry. Since we are doing this from scratch, I would like to know the best tried-and-tested workflow.
Any help appreciated.
We applied the same mindset as "git flow" to the creation of Docker images. In our solution there was no need to test the image itself. We solved that by splitting the build into a "source build" producing artifacts and a downstream "runtime build" that only packages the artifacts into the runtime image and pushes it to the registry. At that point the whole stack is delivered to a release stage for automated testing.
To test the image there's a tool called Anchore.
Then, if you want to run other types of tests before building the Docker image, you can integrate, for example, SonarQube with Jenkins and do a static analysis of the source code. Full example at: https://pillsfromtheweb.blogspot.com/2020/05/integrate-sonarqube-static-analysis-in.html
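The build → test → push flow from the question can be sketched as a declarative Jenkinsfile (the registry, credentials ID, and test command are assumptions):

```groovy
// Jenkinsfile sketch: build the image, smoke-test it, then push on success
pipeline {
  agent any
  environment {
    IMAGE = "registry.example.com/myapp:${env.BUILD_NUMBER}"
  }
  stages {
    stage('Build') {
      steps { sh 'docker build -t $IMAGE .' }
    }
    stage('Test') {
      // Run the image's own test entrypoint before it gets near the registry
      steps { sh 'docker run --rm $IMAGE ./run-tests.sh' }
    }
    stage('Push') {
      steps {
        withDockerRegistry(credentialsId: 'registry-creds',
                           url: 'https://registry.example.com') {
          sh 'docker push $IMAGE'
        }
      }
    }
  }
  post {
    // If any stage fails, keep the console output as the log artifact
    failure { archiveArtifacts artifacts: '**/build.log', allowEmptyArchive: true }
  }
}
```

Tagging with the build number (or a commit SHA) gives you the versioning step; promotion to prod/testing tags can be a later stage that re-tags the tested digest.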
