What is the best way to generate events from Jenkins?

I have a series of Jenkins pipeline jobs to move apps to Cloud Foundry. My client application needs to be able to listen to all the updates of a push, i.e. apart from getting text logs, I need other events such as Git repo cloned, Cloud Foundry logged in, and app pushed.
One crude way of doing this is to submit POST requests to an event server from a shell script (curl). However, I think it is unlikely that such functionality does not already exist on Jenkins (either through a plugin or something like that).
I need advice from a best-practices point of view.
Thanks.

1. The approach commented by mdabdullah. But this needs a person to set up Kibana or Splunk (I did not try this).
2. Statistics Gatherer plugin: https://plugins.jenkins.io/statistics-gatherer/
3. Notification plugin: https://plugins.jenkins.io/notification/
Options 2 and 3 are plugins available in the Jenkins community. Both need to be configured with server endpoints before use.
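For reference, below is a minimal sketch of the "crude" curl approach described in the question: a declarative pipeline that POSTs a small JSON event to a hypothetical event server after each milestone. The endpoint URL, event names, and Cloud Foundry steps are placeholders; the plugins above achieve the same result centrally, without per-job curl calls.

    // Hedged sketch: event server URL, repo URL and CF steps are assumptions.
    pipeline {
        agent any
        environment {
            EVENT_URL = 'https://events.example.com/jenkins'   // placeholder event server
        }
        stages {
            stage('Clone') {
                steps {
                    git url: 'https://gitlab.example.com/team/app.git'
                    sh "curl -s -X POST -H 'Content-Type: application/json' " +
                       "-d '{\"job\":\"${env.JOB_NAME}\",\"event\":\"git_repo_cloned\"}' ${env.EVENT_URL}"
                }
            }
            stage('Push to Cloud Foundry') {
                steps {
                    sh 'cf push my-app'                        // real CF login/push details elided
                    sh "curl -s -X POST -H 'Content-Type: application/json' " +
                       "-d '{\"job\":\"${env.JOB_NAME}\",\"event\":\"app_pushed\"}' ${env.EVENT_URL}"
                }
            }
        }
        post {
            failure {
                sh "curl -s -X POST -H 'Content-Type: application/json' " +
                   "-d '{\"job\":\"${env.JOB_NAME}\",\"event\":\"build_failed\"}' ${env.EVENT_URL}"
            }
        }
    }

The drawback, as noted in the question, is that every job has to repeat these calls, which is what the Notification or Statistics Gatherer plugins avoid.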

Related

How can I trigger jenkins jobs with a slack bot?

I have Jenkins running on one of the VMs on GCP. I have a bunch of jobs and I want to give my team access to run them from Slack. They haven't used Jenkins before and are not at all familiar with the UI, so they want to run the jobs from Slack. We already have a Slack bot. So, here are my questions:
Can I integrate our existing Slack bot with Jenkins so that it can trigger the jobs? If yes, how can I do it? (Any tutorial would be greatly appreciated.)
I know there is a way to do this with slash commands, but I don't want to run a different command for each job; it's not really clean. If I have 20 jobs, I have to create tokens for all of them and configure 20 slash commands.
What are the other ways of triggering Jenkins jobs from Slack?
PS: I'm looking for something like #bot run "job" "parameter" or #bot run "job". And it would be great if the bot could tag the user and respond to the request.
You have to create a Slack app for this and enable events and interactivity:
https://api.slack.com/apis/connections/events-api#the-events-api__subscribing-to-event-types__events-api-request-urls__request-url-configuration--verification__url-verification-handshake
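To make the moving parts concrete, here is a rough Groovy sketch of what such an event receiver could look like, assuming a command shaped like @bot run <job>. The port, path, environment variable names, and Jenkins job layout are all assumptions, and a real deployment must also verify Slack's request signatures and handle parameters.

    import com.sun.net.httpserver.HttpServer
    import groovy.json.JsonSlurper
    import groovy.json.JsonOutput

    // Placeholders: point these at your Jenkins instance and an API token.
    def jenkinsUrl  = System.getenv('JENKINS_URL')             // e.g. https://jenkins.example.com
    def jenkinsAuth = System.getenv('JENKINS_USER') + ':' + System.getenv('JENKINS_API_TOKEN')

    def server = HttpServer.create(new InetSocketAddress(3000), 0)
    server.createContext('/slack/events') { exchange ->
        def payload = new JsonSlurper().parse(exchange.requestBody)

        // 1. URL verification handshake: echo Slack's challenge back (see the link above).
        if (payload.type == 'url_verification') {
            byte[] answer = JsonOutput.toJson([challenge: payload.challenge]).bytes
            exchange.responseHeaders.add('Content-Type', 'application/json')
            exchange.sendResponseHeaders(200, answer.length)
            exchange.responseBody.withStream { it.write(answer) }
            return
        }

        // 2. app_mention events: parse "run <job>" and call the Jenkins remote API.
        if (payload.type == 'event_callback' && payload.event?.type == 'app_mention') {
            def matcher = (payload.event.text =~ /run\s+(\S+)/)
            if (matcher.find()) {
                def job  = matcher.group(1)
                def conn = new URL("${jenkinsUrl}/job/${job}/build").openConnection()
                conn.requestMethod = 'POST'
                conn.setRequestProperty('Authorization',
                        'Basic ' + jenkinsAuth.bytes.encodeBase64())
                conn.responseCode                              // fire the request
            }
        }
        exchange.sendResponseHeaders(200, -1)                  // ack so Slack does not retry
        exchange.close()
    }
    server.start()

Because the bot parses the job name out of the message text, one handler covers all 20 jobs instead of 20 slash commands.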

Trigger Jenkins Pipeline when new file gets added to blob storage

I have an Azure blob storage bucket with some video files. I need to trigger a Jenkins pipeline whenever a file gets added to the bucket. I was thinking I could have a microservice in Azure Functions monitor the bucket and trigger Jenkins, but it would be great if I could do this directly, without an additional microservice.
Is there a way I can get Jenkins to trigger a pipeline based on my bucket? A plugin or a script or something?
PS: I found this question, but I'm looking for something different.
You could trigger a build without parameters by setting up an event subscription on your storage account to call your Jenkins build endpoint. Since your build won't have parameters, your script would have to keep track of the blobs processed (assuming they are not deleted once processed).
But if you need build parameters, then you would have to transform the payload coming from the blob event before calling the Jenkins API (a sketch of the receiving parameterized pipeline follows this answer).
Though you mentioned that you wouldn't want to include another service for this, here are some options just in case, in increasing order of complexity:
1. If you have your Jenkins API behind an API gateway, like Azure APIM, you could transform the body before forwarding the request to Jenkins.
2. Use a simple Logic App to trigger on the event and then call the Jenkins API, passing the parameters extracted from the event as required.
3. Similar to what is mentioned in the other question you linked, Azure Functions.
If you don't have APIM (or something similar), Logic Apps are a great fit for this use case, with almost no code to write.
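For the build-parameters option, a minimal sketch of the receiving Jenkins side could look like the pipeline below, assuming the caller (event subscription, Logic App, or Function) invokes .../job/process-new-blob/buildWithParameters with a BLOB_URL parameter. The job name, parameter name, and processing step are illustrative only.

    pipeline {
        agent any
        parameters {
            // The caller passes the URL of the blob that raised the event, e.g.
            // POST <jenkins>/job/process-new-blob/buildWithParameters?BLOB_URL=<blob url>
            string(name: 'BLOB_URL', defaultValue: '', description: 'URL of the blob that triggered this build')
        }
        stages {
            stage('Process new blob') {
                steps {
                    // Replace with the real download/processing steps for the video file.
                    sh "echo 'Processing ${params.BLOB_URL}'"
                }
            }
        }
    }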

Jenkins and GitLab -- Gitlab Hook plugin is the right choice?

There are so many posts about this, and being inexperienced in Git doesn't help to get a good grip on this.
I just joined a new company that doesn't have CI at all, so I jumped on the opportunity to create a proof of concept (using Jenkins locally on my Windows box for now, until I get a dedicated server for it). I've used and semi-configured Jenkins in the past, using SVN, and it was so simple and fast to get it working. In this company, they don't use SVN, only GitLab (I believe it's private: we have our own site, not gitlab.com), and nothing works for me.
I followed a few tutorials, but mainly this one seemed like it meets my needs. It didn't work (the reasons and symptoms are probably worth a post of their own).
When I look at Gitlab Hook plugin in Jenkins, I see a big red warning saying it is not safe ("Gitlab API token stored and displayed in plain text").
So my question: for this POC that I am working on, how serious is this warning? Should I avoid this plugin, and therefore this method altogether, because of it?
And while I'm at it, I might also throw in an additional general question to open up my options here: if I want Jenkins to work with GitLab (meaning I check in something and it triggers a build), do I absolutely need to use the SSH method, or could it work with HTTPS as well?
Thank you.
This is indeed SECURITY-263 / CVE-2018-1000196
Gitlab Hook Plugin does not encrypt the Gitlab API token used to access Gitlab. This can be used by users with master file system access to obtain GitHub credentials.
Additionally, the Gitlab API token round-trips in its plaintext form, and is displayed in a regular text field to users with Overall/Administer permission. This exposes the API token to people viewing a Jenkins administrator’s screen, browser extensions, cross-site scripting vulnerabilities, etc.
As of publication of this advisory, there is no fix.
So:
how serious is this warning?
Serious, but it does require access to the Jenkins server filesystem, or Jenkins administration level access. So that risk can be documented, acknowledged and, for now, set aside, provided mitigation steps are in place, i.e.:
the access to the Jenkins server is properly monitored
the list of Jenkins admin accounts is properly and regularly reviewed.
do I absolutely need to use the SSH method, or it could work with HTTPS as well?
You can use HTTPS for accessing GitLab repositories in a Jenkins job.
But for the GitLab Hook plugin, SSH remains the recommended way, considering you would use a token (instead of a user account name/password) that you can revoke at any time.
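As an illustration of the HTTPS option, here is a minimal pipeline sketch that checks out a private GitLab repository over HTTPS, assuming a username/password or personal-access-token credential with the id gitlab-https already exists in Jenkins; the URL, branch, and credential id are examples.

    pipeline {
        agent any
        stages {
            stage('Checkout over HTTPS') {
                steps {
                    // 'gitlab-https' is a placeholder for credentials stored in Jenkins.
                    git url: 'https://gitlab.example.com/team/project.git',
                        branch: 'master',
                        credentialsId: 'gitlab-https'
                }
            }
        }
    }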

How to share Jenkins BUILD Monitor

We're running a Jenkins build server that is responsible for composing dozens of jobs for our team. The Build Monitor Plugin is being used to create a dashboard of various builds, and we then project this on a TV in the office; the TV is connected to a dedicated computer (a Chromebox) that is logged into the Jenkins server and shows the plugin dashboard.
How could I see the dashboard without sitting next to the TV? Ideally, anyone on the team, from anywhere, should be able to see the Build Monitor dashboard without logging into the Jenkins server (we'd have to share a login).
Any ideas on how to achieve this? The best I can think of is to turn the Chromebox into a remote access point and access it remotely, but this feels sloppy.
You can use the Role Strategy Plugin to get finer-grained authorization control.
Then you can limit anonymous read access to just the dashboard view you want to make public, and require authentication for everything else.
In summary, you require:
A Project Role for anonymous users
That Project Role will only have the permission to read views matching the dashboard pattern (.*view/your-build-monitor-name/)
Assign this role to the Anonymous special user

JIRA Integration with external systems

I'm working on a POC to automate downstream processes in external systems based on JIRA processes, and I have hit a wall with the API. It appears to have great integration for pulling data about tickets out of JIRA and for externally generating tickets into JIRA.
However, I don't see how to trigger external calls as part of my workflows. For example, if a ticket should be prevented from being routed to the next stage of a workflow without first accessing a database to ensure availability of inventory, how could I do that in JIRA?
Upon final completion of the workflow, based on attributes in the JIRA ticket, we'd like to send a JMS or REST message, or possibly update an external database. Is this possible?
Thanks all in advance for the help!
If you want to do a "before" check, use a Validator on the Workflow Transition.
I strongly suggest deploying the (free) Script Runner add-on. There you can implement a ton of things. For example, you'll get a new validator option, "Script Validator", where you can specify a Groovy script that decides whether to let the transition through or abort it.
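As an illustration (not an official example), such a validator could look roughly like the sketch below, assuming ScriptRunner's usual issue binding and a made-up inventory REST service; the custom field name and URL are placeholders.

    import com.atlassian.jira.component.ComponentAccessor
    import groovy.json.JsonSlurper

    // ScriptRunner exposes the current issue to validators via the 'issue' binding.
    // "Inventory SKU" and the inventory service URL are made-up examples.
    def skuField = ComponentAccessor.customFieldManager.getCustomFieldObjectByName('Inventory SKU')
    def sku = issue.getCustomFieldValue(skuField)

    def conn = new URL("https://inventory.example.com/api/stock/${sku}").openConnection()
    conn.setRequestProperty('Accept', 'application/json')
    def stock = new JsonSlurper().parse(conn.inputStream)

    // In a simple scripted validator the transition only proceeds when this
    // evaluates to true; custom validators can instead throw InvalidInputException.
    return (stock.available as Integer) > 0

For the "after completion" case (sending a JMS/REST message or updating a database), the same add-on lets you attach a Groovy script as a post function on the final transition.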
