Access Jenkins instance with shared libraries for pipelines

We were heavily dependent on a few plugins that are no longer supported in Pipeline jobs, and we would like to reimplement their functionality in the shared libraries used by our pipelines. One of the main things required for that is getting hold of the Jenkins instance. Can someone share a way to do that?
Are there any restrictions on, or a proper way of, calling Jenkins.getActiveInstance() from code under the "src" or "vars" folder?
I have tried calling Jenkins.getActiveInstance() from both src and vars code, but it returns null. Am I missing something here? Any help will be appreciated.
Thanks

Try 'Hudson.instance'. The pipeline below works for me on Jenkins 2.32.x. You may have to do some script approvals or turn off the sandbox.
pipeline {
    agent none
    stages {
        stage('Instance Info') {
            steps {
                script {
                    // Hudson.instance returns the running Jenkins singleton (Jenkins.instance works too)
                    def jenkinsInstance = Hudson.instance
                    for (slave in jenkinsInstance.slaves) {
                        echo "Slave: ${slave.computer.name}"
                    }
                }
            }
        }
    }
}
Bill

This ticket can be closed; there were a few issues:
1. Fixed the access via Manage Jenkins -> In-process Script Approval.
2. Some scripts contained non-CPS code, which had to be handled accordingly.
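For reference, a minimal sketch of what this can look like inside a shared library, assuming a hypothetical vars/listAgents.groovy step: the part that walks the Jenkins model lives in a @NonCPS method, and the relevant signatures still need to be approved under In-process Script Approval (or the code must run outside the sandbox).
// vars/listAgents.groovy (hypothetical step name)
import jenkins.model.Jenkins

// Non-CPS helper: walks non-serializable Jenkins model objects and returns plain strings
@NonCPS
def nodeNames() {
    return Jenkins.instance.nodes.collect { it.nodeName }
}

def call() {
    for (name in nodeNames()) {
        echo "Agent: ${name}"
    }
}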

Related

What to do about the Jenkins pipeline error "Process working directory '/var/lib/jenkins/workspace/<yourpipelinenamehere>' doesn't exist"

I have found this question asked in various places and in various forms, and I have fought it myself.
I believe I have found the solution for the scenario in which I encountered it, and I am curious whether it helps others who run into the same error.
The short answer of what I found to do is to set the first stage of my pipeline to a known step that has logic to create the workspace, i.e. the following:
pipeline {
    agent any
    stages {
        stage('Opening Workspace') {
            steps {
                script {
                    // Writing any file forces Jenkins to create the workspace directory first
                    def date = new Date()
                    def data = "I am arbitrary text\nSecond line\n" + date
                    writeFile(file: 'workspacecreated.txt', text: data)
                    sh "ls -l"
                }
            }
        }
        stage('alltherest') {
            << the rest of your steps and end of your pipeline to paste here >>
In my case, the first stage was ansiblePlaybook(), which, as it turns out, does not seem to try to create this workspace. I have filed this against the Ansible plugin as a Jenkins bug.
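To illustrate, a rough sketch of how the two stages fit together (the playbook and inventory paths below are placeholders, not the real ones from my job):
pipeline {
    agent any
    stages {
        stage('Opening Workspace') {
            steps {
                // Any step that writes to the workspace forces Jenkins to create it
                writeFile(file: 'workspacecreated.txt', text: 'workspace bootstrap')
            }
        }
        stage('Run Playbook') {
            steps {
                // Placeholder playbook/inventory paths; substitute your own
                ansiblePlaybook(playbook: 'site.yml', inventory: 'inventory/hosts')
            }
        }
    }
}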
So the first question is:
If you hit this error message in Jenkins, does setting the first step to writeFile help you?
If so, what was your original first step? Perhaps you should report that first step's plugin failing to create a workspace for itself as a bug against Jenkins.
The second question is:
Does anyone have a more elegant solution than this workaround?

Disable or auto-approve Script Approval for scripts executed in Job DSL (Active Choice Parameters)?

Running Jenkins 2.289.1.
I have this pipelineJob Job DSL script setting up Active Choice parameters:
https://plugins.jenkins.io/uno-choice/
pipelineJob("test") {
parameters {
activeChoiceParam('CHOICE-1') {
description('Allows user choose from multiple choices')
filterable()
choiceType('SINGLE_SELECT')
groovyScript {
script('return ["choice1", "choice2", "choice3"];')
fallbackScript('"fallback choice"')
}
}
}
definition {
cpsScm {
scm {
git {
remote {
credentials("${creds}")
url("${gitUrl}")
}
branch("${gitBranch}")
}
}
scriptPath("${pathToFile}")
}
}
}
To make sure I can run the Job DSL in the first place without having to manually approve it, I have added the following to JCasC:
jenkins:
  security:
    globalJobDslSecurityConfiguration:
      useScriptSecurity: false
But that is not enough. Before I can run the generated pipeline based on the above Job DSL, I still need to manually approve the scripts.
How do I configure Job DSL, JCasC, or something else to either disable script approval for anything that goes on in a Job DSL script, or automatically approve any script that might be created inside one?
Hopefully I don't have to hack my way around it like suggested here:
https://stackoverflow.com/a/64364086/363603
I am aware that there is a reason for this feature, but this is a local-only Jenkins that I am using for experimenting, and it is currently killing my productivity. Related:
https://issues.jenkins.io/browse/JENKINS-28178?focusedCommentId=376405&page=com.atlassian.jira.plugin.system.issuetabpanels%3Acomment-tabpanel#comment-376405
What worked for me:
Manage Jenkins > Configure Global Security > CSRF Protection (section header -- not sure why) > Enable script security for Job DSL scripts (the name of the option that I disabled).
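For a throwaway local instance, one blunt workaround (my own assumption, not something confirmed in this thread) is a startup Groovy hook that approves whatever is sitting in the pending queue; the exact ScriptApproval calls may vary between script-security plugin versions, so treat this as a sketch only:
// $JENKINS_HOME/init.groovy.d/approve-pending.groovy (assumed location; runs at startup)
import org.jenkinsci.plugins.scriptsecurity.scripts.ScriptApproval

def approval = ScriptApproval.get()
// Copy the hashes first so we do not mutate the pending set while iterating over it
def hashes = approval.pendingScripts.collect { it.hash }
hashes.each { hash ->
    println "Auto-approving pending script: ${hash}"
    approval.approveScript(hash)
}
Note that this only catches scripts that are already pending when Jenkins starts, so it may need to be re-run after the seed job has generated new scripts.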

Multiple Jenkins pipelines for a single repo

At the moment I have two MultiJob projects for a single repo:
The first runs on the develop branch
The second runs on all opened pull requests
Each has a lot of nested Freestyle jobs, and they are quite different.
I'm looking at switching to Pipeline-as-Code by using a Jenkinsfile. So my question is: is there a way to switch the Jenkinsfile path/name based on, say, the branch name? I tried the Multibranch Pipeline job type, but it only allows setting a single Jenkinsfile path, which it uses across all branches, including pull requests.
Maybe there is a better way to achieve this? I'm open to discussion. Thank you.
You can do it in one Jenkinsfile by using a when expression; I assume your pipeline is not too big:
pipeline {
    agent any
    stages {
        stage("Set variables from external input") {
            when {
                branch "develop"
            }
            steps {
                // add the things you want to execute when the branch is develop
            }
        }
        stage("2 for Pull request") {
            when {
                // run when the branch is neither master nor develop (e.g. a pull request)
                expression { return !(env.GIT_BRANCH ==~ /.*(master|develop)/) }
            }
            steps {
                // add the things you want to execute when the branch is a pull request
            }
        }
    }
}
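If the job ends up as a Multibranch Pipeline, a possibly cleaner check for the second stage (an assumption on my side, not part of the original answer) is the built-in changeRequest() condition, since pull request builds are exposed as change requests rather than ordinary branches:
stage("2 for Pull request") {
    when {
        // In a Multibranch Pipeline, pull request builds are "change requests"
        changeRequest()
    }
    steps {
        // add the things you want to execute when the build is for a pull request
    }
}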

Jenkins declarative pipeline ignoring the changelog of Jenkinsfiles

I have apps with their code in Git repositories, and the Jenkinsfiles for building those apps live in another repository. The problem is the build changelog: Jenkins adds the Jenkinsfile changes to the build changesets, and I don't want that, because those changes concern the infrastructure and are not relevant to the apps. How can I prevent this? I haven't found any workaround or solution.
If I understood your question correctly: I don't think you can remove the Jenkinsfile changes from the change set that Jenkins reads from Git, but you can skip the build if the only changes are to the Jenkinsfile.
If it helps...
First, you need to read the change set:
def changedFiles = []
for (changeLogSet in currentBuild.changeSets) {
    for (entry in changeLogSet.getItems()) {
        for (file in entry.getAffectedFiles()) {
            changedFiles.add(file.getPath())
        }
    }
}
Then, you can check whether the only change is the Jenkinsfile:
if (changedFiles.size() == 1 && changedFiles[0] == "Jenkinsfile") {
    println "Only Jenkinsfile has changed... Skipping build..."
    // Stops the build prematurely and leaves the remaining stages green
    currentBuild.getRawBuild().getExecutor().interrupt(Result.SUCCESS)
    sleep(1) // give the interrupt time to take effect
} else {
    println "There are changes other than the Jenkinsfile; the build will proceed."
}
P.S. There are several ways to terminate a job early without failing the build, but this one is the cleanest in my experience, even though you need to approve some admin signatures in Jenkins (I saw it in another thread here some time ago; I can't find it now).
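As a rough sketch of how the two snippets can be wired into a scripted pipeline (the helper name and the build stage are mine, purely for illustration; getRawBuild and getExecutor still need script approval):
// Collect the changed paths for this build; @NonCPS because it walks
// non-serializable changelog objects
@NonCPS
def collectChangedFiles() {
    def changedFiles = []
    for (changeLogSet in currentBuild.changeSets) {
        for (entry in changeLogSet.getItems()) {
            for (file in entry.getAffectedFiles()) {
                changedFiles.add(file.getPath())
            }
        }
    }
    return changedFiles
}

node {
    checkout scm
    def changedFiles = collectChangedFiles()
    if (changedFiles.size() == 1 && changedFiles[0] == 'Jenkinsfile') {
        echo 'Only Jenkinsfile has changed... Skipping build...'
        currentBuild.getRawBuild().getExecutor().interrupt(Result.SUCCESS)
        sleep(1) // give the interrupt time to take effect
        return
    }
    stage('Build') {
        // the real build steps go here
    }
}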

How to include multiple pipeline scripts into a Jenkinsfile

I have a Jenkins Job DSL file as below:
pipelineJob('My pipeline job') {
    displayName('display name')
    logRotator {
        numToKeep(10)
        daysToKeep(30)
        artifactDaysToKeep(7)
        artifactNumToKeep(1)
    }
    definition {
        cps {
            script(readFileFromWorkspace('./cicd/pipelines/clone_git_code.groovy'))
            script(readFileFromWorkspace('./cicd/pipelines/install_dependencies_run_quality_checks.groovy'))
        }
    }
}
With the above, the last script file replaces the other scripts.
Basically, I have split the tasks into multiple Groovy files so that I won't repeat the same code in every Jenkinsfile and can reuse it for other jobs as well; for example, I can now use the clone_git_code.groovy script in dev builds as well as QA builds.
You have to use shared libraries (https://jenkins.io/doc/book/pipeline/shared-libraries/). You can define multiple Groovy files, either with classes that return a processed object, or simply with global variables whose methods define a step; the execution will be sequential.
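A minimal sketch of the second variant (the library name, step name, and repository URL are made up for illustration): a vars/cloneGitCode.groovy file in the shared library defines a reusable step, which any Jenkinsfile can then call.
// vars/cloneGitCode.groovy in the shared library (hypothetical step name)
def call(String repoUrl, String branch = 'master') {
    // Reusable step: check out the given repository into the current workspace
    git(url: repoUrl, branch: branch)
}
And in a Jenkinsfile, assuming the library is configured under the name my-shared-lib:
@Library('my-shared-lib') _
node {
    cloneGitCode('https://example.com/app.git')
}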
I had this same issue when trying to include multiple scripts into a Jenkins job. After doing some research, I found the below solution to be the simplest:
definition {
    cps {
        script(
            ScriptsLibrary.pipelineTest('did it work?') +
            ScriptsLibrary.scmConf('repoURL_input', 'accessCredentials', 'activeBranch')
        )
    }
}
Add the "+" to concatenate the Strings. Got the job done for me :)
