Jenkins Job DSL plugin issue where none of the internal jobs have access to the external jobs

I am using Jenkins jobDsl as follows:
#!groovy
node('master') {
    stage('Prepare') {
        deleteDir()
        checkout scm
    }
    stage('Provision Jobs') {
        jobDsl(targets: ['jenkins/commons.groovy', 'folderA/jenkins/jobA.groovy'].join('\n'),
               removedJobAction: 'DELETE',
               removedViewAction: 'DELETE',
               sandbox: false)
    }
}
From jobA.groovy I want to use a function that is defined in commons.groovy.
Currently, jobA.groovy doesn't have access to the function defined in commons.groovy. How can I allow this behavior?
Attached:
jobA.groovy:
test_job("param1", "param2")
commons.groovy:
def test_job(String team, String submodule) {
    pipelineJob("${team}/${submodule}/test_job") {
        displayName("Test Job")
        description("This is a Continuous Integration job for testing")
        properties {
            githubProjectUrl("githubUrl")
        }
        definition {
            cpsScm {
                scm {
                    git {
                        remote {
                            url('githubUrl')
                            credentials('credentials')
                            refspec('+refs/pull/*:refs/remotes/origin/pr/*')
                        }
                        branch('${sha1}')
                        scriptPath("scriptPath")
                    }
                }
            }
        }
    }
}
The idea is to be able to call test_job("param1", "param2") from jobA.groovy with no issues, but I am currently getting:
ERROR: (jobA.groovy, line 9) No signature of method: test_job() is applicable for argument types: (java.lang.String, java.lang.String)

Job DSL creates the jobs; then, at runtime, you want your job to call your function. The function must be imported through a shared library.
Create a shared lib
here is a sample: https://github.com/sap-archive/jenkins-pipelayer
The most important piece there is the vars/ folder, which defines the functions you can call from your pipelines. Host the lib in its own repo or an orphan branch.
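As a minimal sketch of such a vars/ file (the file name and function body here are illustrative, not from the question): a helper in the library's vars/ folder becomes a step callable from any pipeline that imports the library.

```groovy
// vars/testJob.groovy in the shared library repo (name is illustrative)
// Called from a Jenkinsfile simply as: testJob('param1', 'param2')
def call(String team, String submodule) {
    echo "Running shared logic for ${team}/${submodule}"
    // ...whatever common steps the jobs share...
}
```

The file name (minus the .groovy extension) becomes the step name available in pipelines.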
Import a shared lib
To import a shared library in Jenkins: from the Manage Jenkins page, go to Configure System. Under the section Global Pipeline Libraries, add a new library with a name of your choice, e.g. name-of-your-lib, default version master, and a Modern SCM of type Git pointing to https://urlofyoursharedlib.git
Run the Job DSL job a first time, then go to the In-process Script Approval page and approve everything.
Use a shared lib
To import the library inside your job you must include, at the top of the file after the shebang, the statement @Library('name-of-your-lib') _
There is also a similar statement, library 'name-of-your-lib'. This one is useful to debug and fix a shared library, because when you hit the replay button you will see the shared library files used in the pipeline.
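A minimal Jenkinsfile sketch showing both forms (the library name must match the one registered in Global Pipeline Libraries; the testJob step is a hypothetical function assumed to live in the library's vars/ folder):

```groovy
#!groovy
// Annotation form: resolved before the pipeline starts
@Library('name-of-your-lib') _

node('master') {
    // Step form: loaded on the fly; useful with the replay button
    library 'name-of-your-lib'

    // Call a function defined in vars/testJob.groovy of the library
    testJob('param1', 'param2')
}
```

In practice you would use one form or the other, not both in the same file.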
Finally, if all you are trying to do is create job templates, I would recommend looking at what the shared library I linked is doing: it helps with creating declarative templates and solves issues and limitations you will encounter with Job DSL and shared pipelines.

Related

Disable or auto approve Script Approval for scripts executed in Job Dsl (Active Choice Parameters)?

Running Jenkins 2.289.1.
I have this pipelineJob Job Dsl setting up Active Choice parameters:
https://plugins.jenkins.io/uno-choice/
pipelineJob("test") {
    parameters {
        activeChoiceParam('CHOICE-1') {
            description('Allows user choose from multiple choices')
            filterable()
            choiceType('SINGLE_SELECT')
            groovyScript {
                script('return ["choice1", "choice2", "choice3"];')
                fallbackScript('"fallback choice"')
            }
        }
    }
    definition {
        cpsScm {
            scm {
                git {
                    remote {
                        credentials("${creds}")
                        url("${gitUrl}")
                    }
                    branch("${gitBranch}")
                }
            }
            scriptPath("${pathToFile}")
        }
    }
}
To make sure I can run Job DSL in the first place without having to manually approve it, I have added the following to JCasC:
jenkins:
  security:
    globalJobDslSecurityConfiguration:
      useScriptSecurity: false
But that is not enough. Before I can run the generated pipeline based on the above Job DSL, I still need to manually approve the scripts.
How do I configure Job Dsl, jcasc or something else to either disable script approval for anything that goes on in a Job Dsl or automatically approve any script that might be created inside a job dsl?
Hopefully I don't have to hack my way around that like suggested here:
https://stackoverflow.com/a/64364086/363603
I am aware that there is a reason for this feature, but it's a local-only Jenkins that I am using for experimenting, and this is currently killing my productivity. Related:
https://issues.jenkins.io/browse/JENKINS-28178?focusedCommentId=376405&page=com.atlassian.jira.plugin.system.issuetabpanels%3Acomment-tabpanel#comment-376405
What worked for me:
Manage Jenkins > Configure Global Security > CSRF Protection (the section header, oddly) > "Enable script security for Job DSL scripts" (the name of the option that I disabled).

How to implement shared library in jenkins, without configuring "Global Pipeline Libraries" in "Manage Jenkins"?

Since I don't have access to the "Manage Jenkins" menu in my organization, I'm unable to configure my shared library under "Global Pipeline Libraries" in "Manage Jenkins".
Is there any other way to implement this, without configuring in Manage Jenkins?
(or)
Is it possible to configure "Global Pipeline Libraries" part through pipeline script, irrespective of access privileges?
If possible, requesting you to share some code snippets in the answer.
Without configuring anything in "Manage Jenkins", we can use a library identifier in the pipeline script to load the library during the build.
Load the library
you can load the library from source control (like Git) like this:
def myLib = library(
    identifier: 'myLib#master',
    retriever: modernSCM(
        [
            $class: 'GitSCMSource',
            remote: 'https://bitbucket.org/shaybc/commonlib.git',
            credentialsId: 'bitbucketCreds'
        ]
    )
)
Groovy class from the library
assuming this is the groovy class:
package my.domain

class Tester {
    public static String staticTest() {
        return "this is from a static method";
    }

    public String test() {
        return "this is from an instance method";
    }
}
Call the methods from scripted pipeline
then you call a static method like this:
myLib.my.domain.Tester.staticTest();
and an instance method like this:
// call the constructor (you can also call a constructor with parameters)
def tester = myLib.my.domain.Tester.new();
// call your instance method
tester.test();
read more:
loading libraries dynamically
the difference between the @Library annotation and the library step
private shared library by example
As mentioned in some of the answers above, you can load the library at runtime using a library identifier, or you can configure the library at the folder level of the Jenkins job you're trying to run.
In most cases developers don't get admin access to Jenkins, but they are allowed to access and update configurations at folder level. You can check whether you have those privileges; that would be more convenient than loading libraries at runtime in all your pipeline scripts.
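If you do have folder-level access, the library can even be declared from a Job DSL script via folder properties. A rough sketch, assuming the Folders plugin is installed; every name and URL below is a placeholder, and the exact nested method names should be checked against the dynamically generated Job DSL API viewer on your own instance:

```groovy
folder('my-team') {
    properties {
        folderLibraries {
            libraries {
                libraryConfiguration {
                    name('my-shared-lib')      // later imported via @Library('my-shared-lib') _
                    defaultVersion('master')
                    retriever {
                        modernSCM {
                            scm {
                                git {
                                    remote('https://example.org/my-shared-lib.git')
                                    credentialsId('my-creds')
                                }
                            }
                        }
                    }
                }
            }
        }
    }
}
```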

Can we use a single Jenkinsfile for a multibranch pipeline in Jenkins using shared libraries?

I am trying to write a Jenkinsfile which will take the data from shared libraries in Jenkins for a multibranch pipeline, something like below:
@Library('Template') _
if (env.BRANCH_NAME == 'master') {
    jenkins1(PROJECTNAME: 'test', GITURL: 'http://test/test.git')
} else {
    jenkins2(PROJECTNAME: 'test1', GITURL: 'http://test/test.git')
}
so that the pipeline picks the shared library call based on the if condition: if the branch is master, the first call should run; otherwise, the second one should.
Yes, that's possible. Actually, we're using a multibranch project to test changes to our shared library that way.
You have to use the library step to load the library instead of the @Library annotation, like:
if (condition) {
    library("someLib#${env.BRANCH_NAME}")
} else {
    library('someOtherLib')
}
Note the double quotes in the first call: Groovy only interpolates ${env.BRANCH_NAME} inside double-quoted strings.
See https://jenkins.io/doc/pipeline/steps/workflow-cps-global-lib/#library-load-a-shared-library-on-the-fly for all details.
By the way: In case you’re planning to do Pull Requests the following Post might be useful to you as well: https://stackoverflow.com/a/51915362/4279361

Is it possible to include calls in Declarative Pipelines in the shared library (Jenkins)?

As you know, with shared libraries in Jenkins it is possible to call, from the Jenkinsfile, the content of a file in the vars folder.
For example, in the vars folder of the shared library we can have a file named build.groovy, and in the Jenkinsfile we can call it with:
build {
    parameter1 = "some param1"
    parameter2 = "some param2"
}
As described in this section.
I have no problem having groovy files and calling them with the call() method in the Jenkinsfile.
But I want to customize a pipeline and make it as generic as possible.
So I want to call a groovy file contained in the vars folder from within the same shared library: calling genericStage.groovy from another file contained in the same vars folder.
So what I have is a groovy file in the vars folder: genericStage.groovy. The pipeline looks like this:
pipeline {
    agent { label myNode }
    stages {
        stage("init") {
            //steps
        }
        genericStage {
            parameter1 = "some param"
        }
    }
}
And in genericStage.groovy:
def call(Closure body) {
    def config = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = config
    body()
    stage(config.parameter1) {
        steps {
            //steps
        }
    }
}
But I get the error :
Expected a stage # line 125, column 6.
genericStage{
^
1 error
at org.codehaus.groovy.control.ErrorCollector.failIfErrors(ErrorCollector.java:310)
at org.codehaus.groovy.control.CompilationUnit.applyToPrimaryClassNodes(CompilationUnit.java:1085)
So how do I make such calls within the shared library itself, as we do from the Jenkinsfile?
In a shared library method defining a DSL-style step like this, you can either define a block of steps and script that does some common processing you need to share between jobs (or between stages in a job), or define a whole complete pipeline. So your genericStage.groovy can only contain what comes after the "// steps" comment; it cannot contain the stage and steps definitions like this. I write this type of library custom step quite a lot, in the style you have here but without trying to define the stage/steps inside it, and it works fine. What is happening here is that the pipeline validator/parser fails the main pipeline syntax before it even gets to handling your custom step, because the custom step is not wrapped in a stage and steps definition.
If you read the documentation link you have at the end, there is a section on defining a complete declarative pipeline in a shared library. It says at the end: "Only entire pipelines can be defined in shared libraries as of this time. This can only be done in vars/*.groovy, and only in a call method. Only one Declarative Pipeline can be executed in a single build, and if you attempt to execute a second one, your build will fail as a result." This suggests that a partial pipeline like you have won't work; you should have just steps/script, or a complete pipeline.
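For completeness, a sketch of the whole-pipeline variant the documentation allows (all names here are illustrative): the entire declarative pipeline lives in the call method of a vars/*.groovy file, and the Jenkinsfile shrinks to a single step call.

```groovy
// vars/genericPipeline.groovy in the shared library (name is illustrative)
def call(Map config) {
    pipeline {
        agent any
        stages {
            stage('Build') {
                steps {
                    echo "Building ${config.projectName}"
                }
            }
        }
    }
}
```

The Jenkinsfile then contains only the library import followed by genericPipeline(projectName: 'test').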

How to include multiple pipeline scripts into jenkinsfile

I have a jenkins file as below
pipelineJob('My pipeline job') {
    displayName('display name')
    logRotator {
        numToKeep(10)
        daysToKeep(30)
        artifactDaysToKeep(7)
        artifactNumToKeep(1)
    }
    definition {
        cps {
            script(readFileFromWorkspace('./cicd/pipelines/clone_git_code.groovy'))
            script(readFileFromWorkspace('./cicd/pipelines/install_dependencies_run_quality_checks.groovy'))
        }
    }
}
With the above Jenkinsfile, the last script call replaces the other scripts.
Basically, I have split tasks into multiple groovy files so that I won't repeat the same code in every Jenkinsfile and can reuse it for other jobs as well; for instance, I can now use the clone_git_code.groovy script in dev builds as well as QA builds.
You have to use shared libraries (https://jenkins.io/doc/book/pipeline/shared-libraries/). You can define multiple groovy files with classes that return a processed object, or simply create calls with methods where you define a step, and the execution will be sequential.
I had this same issue when trying to include multiple scripts into a Jenkins job. After doing some research, I found the below solution to be the simplest:
definition {
    cps {
        script(
            ScriptsLibrary.pipelineTest('did it work?') +
            ScriptsLibrary.scmConf('repoURL_input', 'accessCredentials', 'activeBranch')
        )
    }
}
Add the "+" to concatenate the Strings. Got the job done for me :)
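Presumably the same concatenation works for the readFileFromWorkspace calls from the question (a sketch; the explicit newline guards against the second file starting on the last line of the first):

```groovy
definition {
    cps {
        script(
            readFileFromWorkspace('./cicd/pipelines/clone_git_code.groovy') + '\n' +
            readFileFromWorkspace('./cicd/pipelines/install_dependencies_run_quality_checks.groovy')
        )
    }
}
```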
