Jenkins Terraform GitOps - how to stop Jenkinsfile exploitation?

This model looks good: https://cloud.google.com/architecture/managing-infrastructure-as-code-with-terraform-jenkins-and-gitops#infrastructure_proposal
However, it's possible for anyone with repo access to create a feature branch, put any old content into a Jenkinsfile, and create a PR - at which point Jenkins will run whatever's in the Jenkinsfile. To me this largely negates the controls that enforce peer review of code, or that allow only certain individuals to deploy changes to dev, etc.
I'm not aware that you can "protect" the Jenkinsfile and stop this happening (at least not in GitHub).
The best solution may be to enforce controls at the cloud access key credential level (I'm an AWS user, so I think in terms of secret access keys) - so only certain Jenkins (or GitHub) users can pull the creds needed to make infrastructure changes?
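One way to get partway there inside Jenkins itself is folder-scoped credentials (via the Folders plugin): keep the AWS keys in a folder that only trusted jobs live in, so a Jenkinsfile running anywhere else simply cannot resolve the credential ID. A minimal sketch, where aws-terraform-deploy is a hypothetical folder-scoped credential:

    // Jenkinsfile sketch: 'aws-terraform-deploy' is defined as a folder-scoped
    // credential, so only jobs inside that restricted folder can bind it.
    pipeline {
        agent any
        stages {
            stage('Apply') {
                steps {
                    withCredentials([usernamePassword(
                            credentialsId: 'aws-terraform-deploy',
                            usernameVariable: 'AWS_ACCESS_KEY_ID',
                            passwordVariable: 'AWS_SECRET_ACCESS_KEY')]) {
                        sh 'terraform init && terraform apply -auto-approve'
                    }
                }
            }
        }
    }

A rogue Jenkinsfile on a feature branch outside that folder would fail at the withCredentials step, since the credential ID doesn't resolve there.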

Related

What is the best way to generate events from Jenkins?

I have a series of Jenkins pipeline jobs to move apps to Cloud Foundry. My client application needs to be able to listen to all the updates of a push, i.e. apart from getting text logs, I need other events like Git repo cloned, Cloud Foundry logged in, app pushed.
One crude way of doing this is to submit POST requests to an event server from a shell script (curl). However, I think it is unlikely that such functionality does not already exist on Jenkins (either through a plugin or something like that).
I need advice from a best-practices point of view.
Thanks.
1. The approach commented by mdabdullah. But this needs a person to set up Kibana or Splunk. (I did not try this.)
2. The Statistics Gatherer plugin: https://plugins.jenkins.io/statistics-gatherer/
3. The Jenkins Notification plugin: https://plugins.jenkins.io/notification/
Options 2 and 3 are plugins available in the Jenkins community. They need to be configured with server endpoints before use.
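If none of those fit, the "crude" curl approach from the question can at least be kept tidy by wrapping the POST in a small helper inside the Jenkinsfile. A minimal sketch, where EVENT_SERVER_URL, the repo URL, and the event names are all hypothetical:

    // Helper that POSTs one JSON event per pipeline milestone.
    def emitEvent(String name) {
        sh """
            curl -s -X POST -H 'Content-Type: application/json' \\
                 -d '{"job":"${env.JOB_NAME}","build":"${env.BUILD_NUMBER}","event":"${name}"}' \\
                 \$EVENT_SERVER_URL
        """
    }

    pipeline {
        agent any
        environment { EVENT_SERVER_URL = 'https://events.example.com/jenkins' }  // hypothetical endpoint
        stages {
            stage('Clone') {
                steps {
                    git url: 'https://example.com/app.git'   // hypothetical repo
                    script { emitEvent('git-repo-cloned') }
                }
            }
            stage('Push') {
                steps {
                    sh 'cf push my-app'                      // assumes a prior cf login
                    script { emitEvent('app-pushed') }
                }
            }
        }
    }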

Trigger Jenkins Pipeline when new file gets added to blob storage

I have an Azure blob storage bucket with some video files. I need to trigger a Jenkins pipeline whenever a file gets added to the bucket. I was thinking I could have a microservice in Azure Functions to monitor the bucket and trigger Jenkins, but it would be great if I could do this directly without an additional microservice.
Is there a way I can get Jenkins to trigger a pipeline based on my bucket? A plugin or a script or something?
PS: I found this question, but I'm looking for something different.
You could trigger a build without parameters by setting up an event subscription on your storage account to call your Jenkins build endpoint. Since your build won't have parameters, your script would have to keep track of the blobs processed (assuming they are not deleted once processed).
But if you need build parameters then you would have to transform the payload coming from the blob event before calling the Jenkins API.
Though you mentioned that you wouldn't want to include another service for this, sharing options just in case, in increasing order of complexity:
1. If you have your Jenkins API behind an API gateway, like Azure APIM, you could transform the body before forwarding the request to Jenkins.
2. Use a simple Logic App to trigger on the event and then call the Jenkins API, passing the parameters extracted from the event as required.
3. Similar to what is mentioned in the other question you linked, Azure Functions.
If you don't have APIM (or something similar), Logic Apps are a great solution for this use case, with almost no code to write.
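Another option, not mentioned above and an assumption on my part: the Generic Webhook Trigger plugin can extract values straight from the Event Grid payload, so the event subscription could call Jenkins directly with parameters. A sketch, assuming the plugin is installed; the trigger token blob-upload is hypothetical:

    // Event Grid delivers events as a JSON array; a BlobCreated event
    // carries the blob URL at data.url of the first element.
    pipeline {
        agent any
        triggers {
            GenericTrigger(
                genericVariables: [[key: 'BLOB_URL', value: '$[0].data.url']],
                token: 'blob-upload',
                causeString: 'Blob created: $BLOB_URL',
                printContributedVariables: true
            )
        }
        stages {
            stage('Process') {
                steps {
                    echo "New blob: ${env.BLOB_URL}"
                }
            }
        }
    }

One caveat: Event Grid performs a validation handshake when the subscription is created, which this endpoint won't answer on its own, so one of the intermediary options above may still be needed in practice.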

Any strategies for locking down Jenkins credentials to a shared-library?

I like the use of Jenkinsfiles and shared libraries for their purported benefits, but I have some governance concerns about execution of arbitrary Jenkinsfiles (and potential use of broadly scoped credentials).
I'm thinking it would be convenient to lock down credentials for use by a specific shared library to enforce usage patterns (at the same time, I think it's entirely possible that there are better ways to approach the problem space), so I'm just reaching out for any ideas/guidance in this space.
Not sure if our solution would work for you or not.
We have a shared library; reviewed and rather locked down so that not just anyone can make a change to it. We have that library attached to two separate folders on our Jenkins.
In one folder, our users have their own personal or team based folders and can create jobs and credentials in that space. There are no shared credentials on this big folder and teams do not share their credentials with each other.
There is a second folder, which none of the users/teams have edit access to, but in which they can execute existing jobs. There are no Jenkinsfile/SCM-based jobs in that folder, so nobody can modify a job via SCM. This folder has the "locked down" credentials on it that are not meant to be shared or accessible. The users/teams can edit their jobs (in their folders) to call these jobs - but they can't edit these protected jobs to gain access to the credentials.
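A minimal sketch of that calling pattern, with all job, parameter, and credential names hypothetical:

    // In a user-editable Jenkinsfile: no credentials are bound here,
    // the pipeline just hands off to the protected job.
    pipeline {
        agent any
        stages {
            stage('Deploy via protected job') {
                steps {
                    build job: '/Protected/terraform-apply',   // job in the locked-down folder
                          parameters: [string(name: 'ENV', value: 'dev')]
                    // The protected job binds its credentials internally;
                    // this pipeline never sees the secrets.
                }
            }
        }
    }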
It's awkward in a way. But it has preserved the divide between users and credentials they shouldn't have access to.

Unable to add SSH key in GitHub

In our GitHub we have around 20 repositories. For the CI build we have enabled the Git polling option.
Our Jenkins master has multiple nodes attached. For Git polling we usually add our Jenkins master's SSH key to the respective user's GitHub account, under the SSH keys section in settings. While adding the key we get the error: "Key already in use". Let me know how to add the same.
As per the error message, for another repository's build we have already added our Jenkins master key to a different user's account.
An SSH key can only be attached to a single user on GitHub, since it is used to authenticate and authorize that user. There is no way to add it to multiple accounts.
GitHub provides a guide about dealing with SSH keys for automated scripts here: Managing deploy keys. The two interesting options are deploy keys and machine users.
Typically, you would use deploy keys to gain access to a repository from a server. Deploy keys have a similar restriction as a user's SSH key though, and can only be attached to a single repository. This reduces the potential damage that can be done if the key is compromised. For build servers they are often not well suited, because it is usually not possible to configure authentication per repository.
For your use case, a machine user seems to be the best option. This is a dedicated user account that is only used by your build server. Make sure to use a strong password and two factor authentication for this account, and add Jenkins' master key to it. You can then add the machine user as a collaborator on the repositories you need in Jenkins.
With regards to security, be as restrictive as possible: only the repositories that are required, and only with read permissions. This is also the reason why you should use a machine user instead of an actual user account. For Jenkins, you (usually) don't need write access to a repository. By limiting the access rights for the server key, the impact of a compromised key is reduced.
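Once the machine user is set up and added as a read-only collaborator, the Jenkins side is just a credential reference. A minimal sketch, where the credential ID machine-user-ssh and the repository URL are hypothetical:

    // Pipeline checkout using the machine user's SSH key, stored in
    // Jenkins as an "SSH Username with private key" credential.
    pipeline {
        agent any
        triggers { pollSCM('H/5 * * * *') }   // the Git polling mentioned in the question
        stages {
            stage('Checkout') {
                steps {
                    git url: 'git@github.com:example-org/example-repo.git',
                        credentialsId: 'machine-user-ssh',
                        branch: 'main'
                }
            }
        }
    }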

Jenkins and GitLab -- Gitlab Hook plugin is the right choice?

There are so many posts about this, and being inexperienced in Git doesn't help to get a good grip on this.
I just joined a new company that doesn't have CI at all, so I jumped on the opportunity to create a proof of concept (using Jenkins locally on my Windows box for now, until I get a dedicated server for it). I've used and semi-configured Jenkins in the past, using SVN, and it was so simple and fast to get it working. In this company, they don't use SVN, only GitLab (I believe it's private - we have our own site, not gitlab.com), and nothing works for me.
I followed a few tutorials, but mainly this seemed like the one that meets my needs. It didn't work (the reasons and symptoms are probably worth a post of their own).
When I look at the Gitlab Hook plugin in Jenkins, I see a big red warning saying it is not safe ("Gitlab API token stored and displayed in plain text").
So my question, for this POC that I am working on: how serious is this warning? Should I avoid this plugin, and therefore this method altogether, because of it?
And while I'm at it, I might also throw in an additional general question to open up my options here... If I want Jenkins to work with GitLab (meaning, I check in something and it triggers a build), do I absolutely need to use the SSH method, or could it work with HTTPS as well?
Thank you.
This is indeed SECURITY-263 / CVE-2018-1000196
Gitlab Hook Plugin does not encrypt the Gitlab API token used to access Gitlab. This can be used by users with master file system access to obtain GitLab credentials.
Additionally, the Gitlab API token round-trips in its plaintext form, and is displayed in a regular text field to users with Overall/Administer permission. This exposes the API token to people viewing a Jenkins administrator’s screen, browser extensions, cross-site scripting vulnerabilities, etc.
As of publication of this advisory, there is no fix.
So:
how serious is this warning?
Serious, but it does require access to the Jenkins server filesystem, or Jenkins administration level. So that risk can be documented, acknowledged and, for now, set aside, provided mitigation steps are in place, i.e.:
access to the Jenkins server is properly monitored;
the list of Jenkins admin accounts is properly and regularly reviewed.
do I absolutely need to use the SSH method, or it could work with HTTPS as well?
You can use HTTPS for accessing GitLab repositories in a Jenkins job.
But for the GitLab hook plugin, SSH remains the recommended way, considering you would use a token (instead of a user account name/password) that you can revoke at any time.
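To illustrate the HTTPS part, a minimal sketch of a pipeline checkout over HTTPS against a self-hosted GitLab; the URL and the credential ID gitlab-https are hypothetical:

    // HTTPS checkout; the credential can be a username/password pair
    // or a username plus a personal access token.
    pipeline {
        agent any
        stages {
            stage('Checkout') {
                steps {
                    git url: 'https://gitlab.example.com/group/project.git',
                        credentialsId: 'gitlab-https',
                        branch: 'master'
                }
            }
        }
    }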
