Trigger Jenkins job when an S3 file is updated

I'm looking for a way to trigger my Jenkins job whenever a file is created or updated in S3.
I can't seem to find anything by the usual means of searching. Everything covers uploading artifacts to S3, rarely downloading them, and even then I can't find a way to trigger off the actual update.
The only way I can currently figure out how to do this at all would be to sync the file periodically and compare its hash to previous versions, but that is a really terrible solution.
The idea behind this would be to have an agency (which does not have access to our Jenkins) upload their build artifacts and to trigger a deployment from that.

You can use a combination of SNS notifications for new objects in the S3 bucket (https://docs.aws.amazon.com/AmazonS3/latest/dev/NotificationHowTo.html) and the Jenkins AWS SQS plugin (https://github.com/jenkinsci/aws-sqs-plugin) to trigger a build.
A little manual configuration is required on the AWS SQS plugin side, but it should work.
S3 Upload > SNS Notification > Publish to SQS > Trigger Jenkins Build
Ideally it would go straight to Jenkins, like so: S3 Upload > SNS Notification > Publish to Jenkins HTTP Endpoint > Trigger Jenkins Build
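If you'd rather script the AWS side than click through the console, a minimal boto3 sketch of the S3 > SNS > SQS part could look like this (the bucket, topic and queue ARNs are hypothetical placeholders, and the SNS topic/SQS queue policies must already allow S3 and SNS to publish):

```python
# Sketch: wire S3 "object created" events to SNS, fan out to the SQS queue
# that the Jenkins AWS SQS plugin polls. All names/ARNs are hypothetical.
import boto3

s3 = boto3.client("s3")
sns = boto3.client("sns")

BUCKET = "agency-artifacts-bucket"
TOPIC_ARN = "arn:aws:sns:eu-west-1:123456789012:artifact-uploads"
QUEUE_ARN = "arn:aws:sqs:eu-west-1:123456789012:jenkins-trigger"

# 1. Have S3 publish "object created" events to the SNS topic.
s3.put_bucket_notification_configuration(
    Bucket=BUCKET,
    NotificationConfiguration={
        "TopicConfigurations": [
            {"TopicArn": TOPIC_ARN, "Events": ["s3:ObjectCreated:*"]}
        ]
    },
)

# 2. Subscribe the SQS queue so every upload notification ends up there.
sns.subscribe(TopicArn=TOPIC_ARN, Protocol="sqs", Endpoint=QUEUE_ARN)
```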
Hope this helps

We can write a cron job (on Linux) or a PowerShell script (on Windows) that queries the S3 bucket for the given file; if it finds a new version, it can trigger the Jenkins job.
For this, the Jenkins instance must be in AWS itself if we want to rely on an IAM role; if not, we need to add AWS credentials.
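For instance, a polling script along these lines (run from cron) could compare the object's LastModified timestamp instead of hashing the file and kick off the job through Jenkins' remote build trigger; the bucket, key, Jenkins URL and token are hypothetical placeholders:

```python
# Sketch: cron-driven check for a new/updated S3 object, triggering Jenkins remotely.
# Bucket, key, Jenkins job URL and token are hypothetical placeholders.
import json
import pathlib
import urllib.request

import boto3

BUCKET, KEY = "agency-artifacts-bucket", "builds/latest.zip"
JENKINS_TRIGGER = "https://jenkins.example.com/job/deploy/build?token=SECRET"
STATE_FILE = pathlib.Path("/var/tmp/s3_last_modified.json")

head = boto3.client("s3").head_object(Bucket=BUCKET, Key=KEY)
last_modified = head["LastModified"].isoformat()

previous = json.loads(STATE_FILE.read_text())["last_modified"] if STATE_FILE.exists() else None
if last_modified != previous:
    # Object is new or changed since the last run: trigger the Jenkins job.
    urllib.request.urlopen(urllib.request.Request(JENKINS_TRIGGER, method="POST"))
    STATE_FILE.write_text(json.dumps({"last_modified": last_modified}))
```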

To implement S3 Upload > Publish to SQS > Trigger Jenkins Build (assuming you have the appropriate AWS users, roles and policies attached):
Create an AWS SQS Queue
After creating the AWS SQS queue, we need to configure the S3 bucket:
In the bucket's "Events" section, register an "Object Create" event
Provide the SQS queue name. Detailed documentation.
On Jenkins, we need to:
Install the AWS SQS plugin from the Jenkins plugin installation page
Configure the AWS SQS plugin to point to the SQS queue in the Jenkins system configuration
Configure the Jenkins job to "Trigger build when a message is published to an Amazon SQS queue"
Note that the Jenkins user MUST have read access to SQS (all read functions) in addition to S3 access.
Now, whenever someone adds or updates anything in the bucket, S3 sends an event notification to SQS, which is polled by the Jenkins AWS SQS plugin, and the respective job build is triggered!
This article explains the process in detail (AWS to GitHub to Jenkins). If you are just using S3, you can skip the GitHub part.
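To automate the AWS side of these steps, a hedged boto3 sketch might look like the following (the account ID, bucket and queue names are placeholders; the queue policy must allow the bucket to send messages before the notification can be registered):

```python
# Sketch: create an SQS queue, let the S3 bucket publish to it, register the event.
# Account ID, bucket and queue names are hypothetical.
import json

import boto3

sqs = boto3.client("sqs")
s3 = boto3.client("s3")

BUCKET = "agency-artifacts-bucket"
queue_url = sqs.create_queue(QueueName="jenkins-s3-events")["QueueUrl"]
queue_arn = sqs.get_queue_attributes(
    QueueUrl=queue_url, AttributeNames=["QueueArn"]
)["Attributes"]["QueueArn"]

# Allow the bucket to send messages to the queue.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "s3.amazonaws.com"},
        "Action": "sqs:SendMessage",
        "Resource": queue_arn,
        "Condition": {"ArnLike": {"aws:SourceArn": f"arn:aws:s3:::{BUCKET}"}},
    }],
}
sqs.set_queue_attributes(QueueUrl=queue_url, Attributes={"Policy": json.dumps(policy)})

# Register the "object created" event so every upload/update lands in the queue.
s3.put_bucket_notification_configuration(
    Bucket=BUCKET,
    NotificationConfiguration={
        "QueueConfigurations": [
            {"QueueArn": queue_arn, "Events": ["s3:ObjectCreated:*"]}
        ]
    },
)
```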

Related

Remove auto trigger from ECR as Source in AWS CodePipeline

I have a pipeline with a Source stage that reads from ECR. For every image pushed to ECR, my pipeline is triggered automatically. I don't want this behavior and would like to start my pipeline manually with the Release change button. How can I achieve this?
I managed to achieve the same thing for a GitHub Source stage by removing the webhook from the GitHub repo itself, but I am unable to find a similar webhook for ECR.
CodePipeline uses CloudWatch Events rules, when the source is configured as ECR, to start its execution on an image push.
To verify this, check the associated CloudWatch rule [1] and see whether the event pattern currently set on your rule matches an ECR image-push event. You can also refer to this sample event for a completed image push [2] to see the attributes available for filtering.
As general guidance, you might want to check the following link, which walks through the process of using ECR as a source in CodePipeline [3].
[1] Use CloudWatch Events to Start a Pipeline (Amazon ECR Source) - https://docs.aws.amazon.com/codepipeline/latest/userguide/create-cwe-ecr-source.html
[2] Amazon ECR Events and EventBridge - Sample Events from Amazon ECR - https://docs.aws.amazon.com/AmazonECR/latest/userguide/ecr-eventbridge.html#ecr-eventbridge-bus
[3] AWS DevOps Blog - https://aws.amazon.com/pt/blogs/devops/build-a-continuous-delivery-pipeline-for-your-container-images-with-amazon-ecr-as-source/
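If you do want manual-only releases, one approach, sketched below with boto3, is to find the CloudWatch Events/EventBridge rule that targets your pipeline and disable it. This assumes the rule lives on the default event bus; the pipeline ARN is a hypothetical placeholder:

```python
# Sketch: find the EventBridge/CloudWatch Events rule that starts the pipeline
# on ECR image pushes and disable it. Pipeline ARN/region/account are hypothetical.
import boto3

events = boto3.client("events")
PIPELINE_ARN = "arn:aws:codepipeline:eu-west-1:123456789012:my-ecr-pipeline"

for rule_name in events.list_rule_names_by_target(TargetArn=PIPELINE_ARN)["RuleNames"]:
    rule = events.describe_rule(Name=rule_name)
    # Only touch rules whose event pattern reacts to ECR events.
    if "aws.ecr" in (rule.get("EventPattern") or ""):
        events.disable_rule(Name=rule_name)
        print(f"Disabled {rule_name}; start the pipeline manually with Release change.")
```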

How to frequently check an Artifactory repo and start a Bamboo plan when new artifacts are found

I have a Jenkins job outside my workplace network that creates artifacts, and these are copied to my company's Artifactory. From there a Bamboo plan will run to deploy them.
1) How can I search for new artifacts in Artifactory to automatically start the Bamboo plan?
2) How can we automate this process?
I think the answer here is a user plugin (examples).
You should probably write a user plugin that listens for an afterCreate event and then makes a REST call to the Bamboo server to trigger the build (probably via the Build Queue Service).
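If writing a user plugin isn't an option, a lighter-weight alternative is a scheduled script that searches Artifactory for recently created artifacts and queues the Bamboo plan over REST. A hedged Python sketch, with hypothetical URLs, repo name, plan key and credentials:

```python
# Sketch: poll Artifactory (AQL) for artifacts created in the last few minutes
# and queue a Bamboo plan. URLs, repo, plan key and credentials are hypothetical.
from datetime import datetime, timedelta, timezone

import requests

ARTIFACTORY = "https://artifactory.example.com/artifactory"
BAMBOO = "https://bamboo.example.com"
REPO, PLAN_KEY = "libs-release-local", "PROJ-DEPLOY"
AUTH = ("svc-user", "api-token")

since = datetime.now(timezone.utc) - timedelta(minutes=5)
aql = f'items.find({{"repo":"{REPO}","created":{{"$gt":"{since.isoformat()}"}}}})'

resp = requests.post(f"{ARTIFACTORY}/api/search/aql", data=aql,
                     auth=AUTH, headers={"Content-Type": "text/plain"})
resp.raise_for_status()

if resp.json().get("results"):
    # New artifacts found: queue the Bamboo plan via its REST API.
    requests.post(f"{BAMBOO}/rest/api/latest/queue/{PLAN_KEY}", auth=AUTH).raise_for_status()
```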
Hope this helps.

How can I access my Jenkins dashboard on my remote droplet server?

I'm a little confused about Jenkins and was hoping someone could clarify some matters for me.
After reading up on Jenkins, both from official docs and various tutorials I get this:
If I want to set up auto deployment or anything Jenkins-related, I could just install the Jenkins Docker image, launch it and access it via localhost. That is clear to me.
Then I just put a Jenkinsfile into my repository, so that Jenkins knows what and how to build my repo and so on.
The questions that I have are:
It seems to me that Jenkins needs to be running all the time so that it can watch for repo changes and trigger building, testing and deploying the code. If that is the case, I'd have to install Jenkins on my droplet server. But how do I then access my dashboard if all I have is SSH access?
If Jenkins doesn't need to be up and running 24/7, then how does it watch for any changes?
I'll deploy my backend and frontend apps with a docker-compose file on my server. I'm not sure where Jenkins fits into all that.
How can Jenkins watch for all the repository changes and trigger building, testing and deploying the code?
If Jenkins doesn't need to be up and running 24/7, then how does it watch for any changes?
Jenkins and other automation servers offer two options to watch source code changes:
Poll SCM: download and compare the source code at predefined intervals. This is simple, but resource consumption is higher and the approach is a little outdated.
Webhooks: the preferred mechanism, offered by GitHub, Bitbucket, GitLab, etc., in which the provider (GitHub, for example) makes an HTTP request to your automation server on every git event, sending all the information such as branch name and commit author. Here is more info about webhooks and Jenkins.
If you don't want a 24/7 dedicated server, you can use:
Some serverless platform, or just a simple application able to receive HTTP POSTs, combined with the webhook strategy. For instance, GitHub will perform a POST request to your app/serverless function, and at that point you just execute your build, test or deploy commands (see the sketch after this list).
https://buddy.works/ — it is like a mini-Jenkins.
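As a rough illustration of the first option, here is a minimal stdlib-only Python receiver; the Jenkins URL, job name and trigger token are hypothetical placeholders, and it assumes the job has "Trigger builds remotely" enabled:

```python
# Sketch: a tiny webhook receiver that triggers a Jenkins job on every POST.
# Jenkins URL, job name and token are hypothetical placeholders.
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

JENKINS_TRIGGER = "http://jenkins.example.com:8080/job/deploy/build?token=SECRET"

class WebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read (and here, ignore) the provider's payload, then kick off the build.
        self.rfile.read(int(self.headers.get("Content-Length", 0)))
        urllib.request.urlopen(urllib.request.Request(JENKINS_TRIGGER, method="POST"))
        self.send_response(202)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8000), WebhookHandler).serve_forever()
```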
If I have to install Jenkins on my droplet server, how do I then access my dashboard if all I have is SSH access?
Yes. Jenkins is an automation server, so it needs its own dedicated server.
You can install Jenkins manually or use Docker on your droplet. Configure port 8080 for Jenkins. If everything is OK, just access your droplet's public IP provided by DigitalOcean, like http://197.154.458.456:8080. This URL should load the Jenkins dashboard.

Cloud Dataflow to trigger email upon failure

When running a Dataflow job in Google Cloud, how can the Dataflow pipeline be configured to send an email upon failure (or successful completion)? Is there an easy option that can be configured inside the pipeline program, or any other options?
One possible option from my own practice is to use Apache Airflow (on GCP, it's Cloud Composer).
Cloud Composer/Apache Airflow can send you failure emails when a DAG (job) fails, so I host my jobs in Airflow and it sends emails upon failure.
You can check [1] to see if it satisfies your needs.
https://cloud.google.com/composer/docs/quickstart
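A minimal sketch of what that DAG might look like, assuming Airflow 2.x with the Google provider installed and Composer's SMTP/email configuration in place; the project, template, bucket paths and recipient address are hypothetical:

```python
# Sketch: Airflow DAG that runs a Dataflow template and emails on failure.
# Project, template, bucket paths and recipients are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataflow import DataflowTemplatedJobStartOperator

default_args = {
    "email": ["team@example.com"],   # who gets notified
    "email_on_failure": True,        # Composer/Airflow sends mail when the task fails
    "retries": 0,
}

with DAG(
    dag_id="dataflow_with_failure_email",
    start_date=datetime(2024, 1, 1),
    schedule_interval=None,          # trigger manually or set your own schedule
    default_args=default_args,
    catchup=False,
) as dag:
    run_dataflow = DataflowTemplatedJobStartOperator(
        task_id="run_dataflow_job",
        project_id="my-gcp-project",
        location="us-central1",
        template="gs://dataflow-templates/latest/Word_Count",
        parameters={
            "inputFile": "gs://my-bucket/input.txt",
            "output": "gs://my-bucket/output",
        },
    )
```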

Accessing the BitBucket webhook's payload in Jenkins jobs

I'm using webhooks on Bitbucket to trigger builds on Jenkins when a push event occurs; for this purpose I'm using the Bitbucket plugin.
My Jenkins pipeline consists of multiple interdependent tasks, e.g.:
Main pipeline (triggered task)
1) build docker images
2) run tests
3) do something
The build is triggered when expected, but the tasks fail because they rely on a specific branch that I need to provide. Unfortunately, I don't know how to access the webhook's payload that has all the information I need.
The alternative would be using the Poll SCM option in Jenkins, but I prefer to build on demand and not periodically.
From:
https://wiki.jenkins-ci.org/display/JENKINS/BitBucket+Plugin
they say:
Since 1.1.5 Bitbucket automatically injects the payload received by Bitbucket into the build. You can catch the payload to process it accordingly through the environmental variable $BITBUCKET_PAYLOAD.
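For example, a small helper script invoked from a build step could read $BITBUCKET_PAYLOAD and pull out the pushed branch before running the downstream tasks. This is a hedged sketch: the JSON keys shown follow the usual shape of a Bitbucket push event, but the exact payload depends on the event type:

```python
# Sketch: read the webhook payload injected by the Bitbucket plugin and extract
# the branch name. The JSON structure below is illustrative of a push event.
import json
import os

payload = json.loads(os.environ.get("BITBUCKET_PAYLOAD", "{}"))

# For a push event, the new branch name typically lives under push.changes[].new.name.
changes = payload.get("push", {}).get("changes", [])
branch = changes[0]["new"]["name"] if changes else "master"

print(f"Building branch: {branch}")
# ...pass `branch` on to the docker build / test / deploy steps...
```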
Regards
