Jenkins Pipeline dynamically set parameter based on Artifactory trigger URL - jenkins

I have a pipeline job that triggers when changes are found in Artifactory at a specific path. I want to pass the Artifactory URL value to a parameter. My goal for this job is to allow users to build it manually and enter a path, and also to have it trigger automatically when changes are found at a specific path in Artifactory, passing that value to the parameter.
node {
    def server
    def rtTriggerUrl = currentBuild.getBuildCauses('org.jfrog.hudson.trigger.ArtifactoryCause')[0]?.url

    stage('Artifactory configuration') {
        server = Artifactory.server 'artifactory-1'
    }
    stage('Trigger build') {
        server.setBuildTrigger spec: "H/2 * * * *", paths: "maven-examplerepo-local/path/to/jar"
    }
}
pipeline {
    parameters {
        string(
            name: 'JAR_LOCATION',
            defaultValue: rtTriggerUrl,
            trim: true,
            description: 'Artifactory URL of jar'
        )
    }
    ....
}
I have tried setting it a couple of different ways:
defaultValue: rtTriggerUrl
defaultValue: "${currentBuild.getBuildCauses('org.jfrog.hudson.trigger.ArtifactoryCause')[0]?.url}"
However, these gave blank or null values. (I also tried defining rtTriggerUrl before the node block to see if that would make it available to the parameter, but that didn't work either.)
Has anyone figured out how to do this? As a workaround I created an upstream job that triggers when changes are found in Artifactory, which then triggers the downstream job and passes the URL value to the downstream job's parameter. I wanted to see if I could combine that logic into one job.
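One likely reason the parameter comes up blank is that the declarative parameters directive is evaluated when the job configuration is loaded, before rtTriggerUrl (a local variable in the scripted node block) has any value. A workaround that keeps everything in one job is to leave JAR_LOCATION as a plain parameter and resolve the effective value at runtime, preferring the Artifactory trigger URL when the build was started by the Artifactory cause. This is a sketch, not verified against your setup:

```groovy
// Sketch (untested): resolve the jar location at runtime instead of in the
// parameters block. If the build was triggered by Artifactory, use the
// trigger URL; otherwise fall back to what the user entered manually.
node {
    def rtTriggerUrl = currentBuild.getBuildCauses('org.jfrog.hudson.trigger.ArtifactoryCause')[0]?.url
    def jarLocation = rtTriggerUrl ?: params.JAR_LOCATION

    stage('Use jar location') {
        echo "Resolved jar location: ${jarLocation}"
    }
}
```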

Related

Jenkins role strategy plugin and restrict user to build if not in proper group -- Jenkins file

We have two environments, qa and dev, configured as parameters in the Jenkinsfile. The Role-based Authorization Strategy plugin is enabled, and there are two groups of users, qa and dev (matching the environments). The problem is that qa users can start builds with the dev environment. Is there any way to restrict this behavior?
Here is a simple example:
pipeline {
    agent any
    parameters {
        choice(name: 'environment', choices: ['dev', 'qa'])
    }
    stages {
        stage('test') {
            steps {
                script {
                    // BUILD_USER_ID / BUILD_USER_GROUPS come from the Build User Vars plugin
                    if (params.environment == 'dev' && !(env.BUILD_USER_ID in env.BUILD_USER_GROUPS)) {
                        echo "User ${env.BUILD_USER_ID} can not start a build on the DEV environment"
                    } else if (params.environment == 'qa' && !(env.BUILD_USER_ID in env.BUILD_USER_GROUPS)) {
                        echo "User ${env.BUILD_USER_ID} can not start a build on the QA environment"
                    } else {
                        echo "You can run the job, you are in the proper group for this environment"
                    }
                }
            }
        }
    }
}
The example is not real and may not work, but I hope it conveys what I want to accomplish.
P.S. The documentation for this is not great, and I can't find many examples on the web either.
Instead of blocking (or failing) the execution after it has started, you can take a different approach and prevent an unauthorized user from even starting the build with irrelevant parameters (the dev environment in this case).
To do so you can use the Extended Choice Parameter plugin, which lets you create a select list (multi- or single-select) based on the return value of a Groovy script.
Then you can use the following script:
def buildUserGroup = ["group1", "group 2", "group3"]
def environments = ['qa'] // This default value will be available to everyone

// All the groups that the currently logged-in user is a member of
def currentUserGroups = hudson.model.User.current().getAuthorities()
if (currentUserGroups.any { buildUserGroup.contains(it) }) {
    environments.add("dev") // Add relevant environments according to groups
}
return environments
This way you can define the logic that adds environments according to group membership, and adjust it to your needs. A user who builds the job won't even see the environments they are not allowed to build, which gives you the restriction you need.
In a Pipeline job, using your requirements, the configuration can be simplified and will look like:
pipeline {
    agent any
    parameters {
        extendedChoice(name: 'environment', type: 'PT_SINGLE_SELECT', description: 'Environment type', visibleItemCount: 10,
            groovyScript: "return hudson.model.User.current().getAuthorities().contains('dev') ? ['dev','qa'] : ['qa']")
    }
    stages {
        stage('test') {
            ....
        }
    }
}
Update:
If you are using the Role-based Authorization Strategy and want to use the above solution with roles instead of groups, you can use the following code (based on this script) in your parameter:
def environments = ['qa'] // This default value will be available to everyone
def userID = hudson.model.User.current().id // The current user id
def authStrategy = jenkins.model.Jenkins.instance.getAuthorizationStrategy()
def permissions = authStrategy.roleMaps.inject([:]) { map, it -> map + it.value.grantedRoles }

// Validate that the current user is in the 'dev' role
if (permissions.any { it.key.name == 'dev' && it.value.contains(userID) }) {
    environments.add("dev") // Add relevant environments according to roles
}
return environments
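For completeness, the role-based script could be wired into the same extendedChoice parameter shown earlier. This is a sketch only: the parameter names match the earlier example, and the inline script is just the snippet above collapsed into a string.

```groovy
pipeline {
    agent any
    parameters {
        extendedChoice(name: 'environment', type: 'PT_SINGLE_SELECT', description: 'Environment type', visibleItemCount: 10,
            // Same logic as the role-based snippet above, inlined as a string
            groovyScript: '''
                def environments = ['qa']
                def userID = hudson.model.User.current().id
                def authStrategy = jenkins.model.Jenkins.instance.getAuthorizationStrategy()
                def permissions = authStrategy.roleMaps.inject([:]) { map, it -> map + it.value.grantedRoles }
                if (permissions.any { it.key.name == 'dev' && it.value.contains(userID) }) {
                    environments.add('dev')
                }
                return environments
            ''')
    }
    stages {
        stage('test') {
            steps { echo "Environment: ${params.environment}" }
        }
    }
}
```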

GitHub Webhook not triggering the Jenkins Pipeline

I want to trigger a pipeline when a Pull Request is merged, i.e. "action": "closed", "merged": "true".
The webhook got a 200 response from Jenkins.
pipeline.groovy:
pipelineJob(job_name) {
    parameters {
        stringParam('APP_URL', app_url, 'URL of the Application')
        stringParam('APP_MERGE_STATUS', app_merge_status, 'Merge Status to Trigger Job')
        booleanParam('MERGED', true, 'Flag to Trigger the job')
        stringParam('APP_ARTIFACT_BUCKET', artifact_bucket, 'Bucket url to upload artifacts')
        stringParam('payload')
    }
    triggers {
        genericTrigger {
            genericVariables {
                genericVariable {
                    key("APP_MERGE_STATUS")
                    value("\$.action")
                    expressionType("JSONPath")
                }
                genericVariable {
                    key("MERGED")
                    value("\$.pull_request.merged")
                    expressionType("JSONPath")
                }
            }
            printPostContent(true)
            regexpFilterText("\$action")
            regexpFilterExpression("")
        }
    }
}
The generic variables I have mentioned are also used to trigger the job without GitHub (using parameters).
I am not sure how to write the generic trigger variables and the regex for the trigger.
Scenario: the PR is closed and merged.
If your Jenkins is exposed to the internet, you can subscribe to the webhook in GitHub itself,
or use a Jenkins declarative pipeline to make use of it.
Got the solution: I had missed adding "generic-webhook-trigger" in the payload URL.
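For reference, a trigger that only fires when the PR is closed and merged could be filtered roughly like this (a sketch, assuming the two generic variables defined in the question; note that the filter text should reference the variable keys you defined, not the raw JSON fields):

```groovy
// Inside the genericTrigger block: combine the resolved variables into one
// string and match only the "closed + merged" combination.
regexpFilterText("\$APP_MERGE_STATUS \$MERGED")
regexpFilterExpression("closed true")
```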

Jenkins Pipeline passing password parameter to downstream job

I want to pass a value, from the Password Parameter plugin, in a Jenkins Pipeline job to another freestyle job, to be used for login. I don't want it to be visible in the output or anywhere else. I can do it between two freestyle jobs, but it seems the pipeline is a bit different.
Even if I'm able to send it as a string, it would be visible in the Parameters tab or the Environment Variables tab.
Does anyone have any idea how this could be achieved?
I've spent hours trying different solutions for the same problem you've had, and here is the final solution, which worked for me.
In your pipeline script:
stages {
    stage("Do something with credentials and pass them to the downstream job") {
        steps {
            build job: 'your/jobname/path', parameters: [
                [$class: 'hudson.model.PasswordParameterValue', name: 'PASSWORD', value: env.PASSWORD],
                [$class: 'TextParameterValue', name: 'USERNAME', value: env.USERNAME]
            ]
        }
    }
}
The trick is to use the hudson.model.PasswordParameterValue class when passing a password parameter to the downstream (freestyle) job, but you must then use the same class for the parameter definition in your main pipeline (parent job) in order to make it work.
For example in your pipeline job you would configure password parameter:
configure {
    it / 'properties' / 'hudson.model.ParametersDefinitionProperty' / 'parameterDefinitions' << 'hudson.model.PasswordParameterDefinition' {
        name('PASSWORD')
        description('My password')
    }
}
You should make sure that both the parent and the child job use password parameters; then the Parameters tab will mask your password. Making build parameters password parameters will not mask passwords in the Environment Variables tab. For that you need to enable Mask Passwords in the child and parent job configurations, or use "Inject passwords to the build as environment variables" with Mask Passwords enabled.
You should use the Credentials plugin, which in a pipeline you use via a withCredentials block. For example:
withCredentials([usernamePassword(credentialsId: 'abcd1234-56ef-494f-a4d9-d5b5e8ac357d',
                                  usernameVariable: 'USERNAME',
                                  passwordVariable: 'PASSWORD')]) {
    echo 'username=' + USERNAME
    echo 'password=' + PASSWORD
}
where abcd1234-56ef-494f-a4d9-d5b5e8ac357d is the id of the credentials you have in Jenkins. As long as you don't echo the variables (as I did in the example, obviously for demonstration purposes), the username and password are not visible.
You can trigger your downstream job with the help of the Parameterized Trigger Plugin.

How to get the last build number from a Trigger/call builds on other projects if a multibranch pipeline Build is called?

I use the Parameterized Trigger Plugin for Jenkins to trigger a Multibranch Pipeline project (RED Outlook Addin). After the build has finished I want to copy the artifacts via the Copy Artifact Plugin.
I add a build step "Copy artifacts from another project" with project name "RED Outlook Addin/${CIOS_BRANCH_NAME}", because I get the branch name as a parameter. This works if I specify the build number, like "12". But if I set the build number to $TRIGGERED_BUILD_NUMBER_RED_Outlook_Addin_${CIOS_BRANCH_NAME}, I get this error: Unable to find project for artifact copy.
How can I use the $TRIGGERED_BUILD_NUMBER_ parameter with the specified branch?
Thx for help
Chris
You could query the JSON API of your Jenkins server, for example using the httpRequest plugin:
import groovy.json.JsonSlurper

@NonCPS
def parseJson(String text) {
    def sup = new JsonSlurper()
    def json = sup.parseText(text)
    sup = null
    return json
}

def getLastStableBuildNumber(String project, String branchName = 'master') {
    def response = httpRequest url: "http://jenkins/job/${project}/job/${branchName}/lastStableBuild/api/json", validResponseCodes: '200'
    def json = parseJson(response.content)
    return json.number
}
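The returned build number could then be fed to the Copy Artifact plugin's pipeline step. This is a sketch only; the project and branch names are placeholders taken from the question:

```groovy
// Sketch: copy artifacts from the last stable build of the given branch,
// using getLastStableBuildNumber() defined above.
def branch = params.CIOS_BRANCH_NAME
def buildNumber = getLastStableBuildNumber('RED Outlook Addin', branch)

copyArtifacts(
    projectName: "RED Outlook Addin/${branch}",
    selector: specific("${buildNumber}") // pick that exact build
)
```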

How can I add job description in Jenkins 2 multi-branch pipeline?

I have a multi-branch pipeline job in Jenkins 2, connected to a GitHub repository (available here). Each pull request in the GitHub repository creates a new "job" in Jenkins, but the job inherits its name from the pull request number (i.e. jobs are called PR-1, PR-2, and so on) which is meaningless in a Jenkins context.
Is it possible (and if so, how) to configure the job or Jenkinsfile to add a job description to each pull request?
Here is how I was able to set the job description from the content of the pull request:
if (env.BRANCH_NAME.startsWith('PR')) {
    def resp = httpRequest url: "https://api.github.com/repos/xxx/yyy/pulls/${env.BRANCH_NAME.substring(3)}"
    def ttl = getTitle(resp)
    def itm = getItem(env.BRANCH_NAME)
    itm.setDisplayName("PR '${ttl}'")
}

@NonCPS
def getItem(branchName) {
    Jenkins.instance.getItemByFullName("sonar-openedge/${branchName}")
}

@NonCPS
def getTitle(json) {
    def slurper = new groovy.json.JsonSlurper()
    def jsonObject = slurper.parseText(json.content)
    jsonObject.title
}
This makes the job description available directly from the job overview page (as in this example: https://ci.rssw.eu/job/sonar-openedge/).
The full commit and Jenkinsfile are available here:
https://github.com/Riverside-Software/sonar-openedge/commit/e2c76ca58b812e4ceac65c406f0b2aae9fbf3f5f
