Credential issues with curl in Job DSL - Jenkins

I'm trying to GET the configuration of different Jenkins pipelines so I can recreate them with the Job DSL plugin, but I'm running into credential issues. With the logic below, the job fails to recognize the Jenkins credentials bound through credentialsBinding; if I hard-code my own user and password instead, it works fine.
This is the logic I'm looking to implement:
job('seed') {
    wrappers {
        credentialsBinding {
            usernamePassword('USER', 'PASSWORD', 'credentials')
        }
    }
    label('centos')

    def confXml = "curl -s -XGET ${url} -u \$USER:\$PASSWORD".execute().text.replace("\n", "")
    // do something with the response
    // recreate the DSL after checking an attribute in the response
    pipelineJob("Sandbox_pipelines/pipelineName") {
        definition {
            cpsScm {
                scm {
                    git(repo_git, "master")
                }
                scriptPath("somepath")
            }
        }
    }
}
When I run this job, it should create the other pipelines. Please let me know if you can help me with this.
Thanks in advance.

The issue is that credentialsBinding loads the credentials during the build of the job being created. You want to use the credential to decide what to create and that's just not how it works.
You can use withCredentials though:
def confXml
withCredentials([usernameColonPassword(credentialsId: 'credentials', variable: 'USERPASS')]) {
    confXml = "curl -s -XGET ${url} -u \$USERPASS".execute().text.replace("\n", "")
}
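For completeness, here is a minimal sketch of how the fetched config could then drive what the seed script creates. The attribute being checked in the XML is a hypothetical placeholder (adjust it to whatever you actually need from the response), and a sandboxed seed job may need script approval for XmlSlurper:
// confXml was fetched inside withCredentials above
def config = new XmlSlurper().parseText(confXml)
// only recreate the pipeline when the (hypothetical) attribute is present in the response
if (config.definition.scriptPath.text()) {
    pipelineJob("Sandbox_pipelines/pipelineName") {
        definition {
            cpsScm {
                scm {
                    git(repo_git, "master")
                }
                scriptPath(config.definition.scriptPath.text())
            }
        }
    }
}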

Related

How to pass dynamic values to an environment block during the pipeline execution in Jenkins?

This is related to a question I asked before: Using groovy to parse JSON object in shell scripts for Jenkin.
Basically I need to pass a dynamic value, returned from an sh script, to the environment block so that the following stage can re-use it and pass the version as a label to the JIRA plugin called Xray. But I'm aware that I cannot pass dynamic values to the environment block during pipeline execution, so I think I need to try a different route. Could anyone give me some tips, please?
def setLatestAppVersionLabel() {
    def response = sh(script: "curl --silent ${APP_ARTIFACTORY_URL}/${XRAY_PLATFORM}/builds/latest.json", returnStdout: true).trim() as String
    def jsonResponse = readJSON text: response
    LATEST_VERSION = jsonResponse.id
    echo "LATEST_VERSION -> ${LATEST_VERSION}"
}
The JSON response looks like this:
{"id":"1.0.0-6",
"version":"1.0.0",
"build":6,
"tag":"android-v1.0.0-6",
"commitHash":"5a78c4665xxxxxxxxxxe1b62c682f84",
"dateCreated":"2020-03-02T08:11:29.912Z"}
and there is an environment block where I would like to pass the value to one of the variables defined there:
environment {
    AWS_DEFAULT_REGION = 'uk-xxx'
    XRAY_ENVIRONMENT = 'e2e'
    VERSION_KEY = 'id'
    XRAY_PLATFORM = 'Android'
    APP_ARTIFACTORY_URL = 'https://artifactory.example.com/mobile'
    LATEST_VERSION = ''
}
If this path doesn't work, what else could I use? I want to re-use the latest version taken from the JSON response in the next stage of the pipeline.
The next stage looks like this:
stage('Import Result to Xray') {
    when {
        expression { return fileExists('xxx-executor/target/AndroidxxxxE2EResults/cucumber-reports/Cucumber.json') }
    }
    steps {
        xrayResultsImport('xxx-executor/target/AndroidxxxxxE2EResults/cucumber-reports/Cucumber.json', 'xxx_ANDROID_E2E_xxxxxxx_Tests', XRAY_LABELS, ['E2E', 'Android', LATEST_VERSION], env.BUILD_URL)
    }
}
Sorry I have to put xxxx to make this question general due to project confidentiality.
To put it simply, you want to take the version you fetched from a JSON response and use it in all stages of your Jenkins pipeline.
Ensure you have the jq utility installed on your Jenkins agent.
pipeline {
    agent any
    environment {
        XRAY_LATEST_VERSION = ''
    }
    stages {
        stage('Get Version') {
            steps {
                script {
                    XRAY_LATEST_VERSION = sh(script: 'curl -s ${APP_ARTIFACTORY_URL}/${XRAY_PLATFORM}/builds/latest.json | jq .version | sed \'s/"//g\'', returnStdout: true).trim()
                }
            }
        }
        stage('Print') {
            steps {
                echo "${XRAY_LATEST_VERSION}"
            }
        }
    }
}
You can use the variable ${XRAY_LATEST_VERSION} in any stage you want and the value will carry across.
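If you would rather avoid the jq dependency, a variant of the same idea is to keep the question's readJSON call (from the Pipeline Utility Steps plugin) and write the result into env, which later stages can read. This is only a sketch under that assumption, reusing the URL variables from the question:
stage('Get Version') {
    steps {
        script {
            def response = sh(script: "curl -s ${APP_ARTIFACTORY_URL}/${XRAY_PLATFORM}/builds/latest.json", returnStdout: true).trim()
            def json = readJSON text: response
            // env.* assignments made inside a script block are visible to later stages
            env.LATEST_VERSION = json.id
        }
    }
}
A later stage can then reference env.LATEST_VERSION (for example in the xrayResultsImport call) without re-fetching the JSON.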

How to post a custom comment back to a GitHub PR from a Jenkins build

I am basically looking at how I can post a comment to a GitHub PR using a Jenkins multibranch pipeline job. Everything works for me: PRs are triggered, any commit to the source branch also triggers the PR build for that branch, and the variables are substituted just fine, but the script fails while posting the custom comment from the build. Here is my sample declarative Jenkinsfile.
def PULL_REQUEST = env.CHANGE_ID
pipeline {
    agent {
        label "pod-custom"
    }
    stages {
        stage('Checkout') {
            steps {
                checkout scm
            }
        }
        stage('Test Plan') {
            steps {
                withCredentials([string(credentialsId: 'github-api', variable: 'GITHUB_TOKEN')]) {
                    sh "curl -s -H \"Authorization: token ${GITHUB_TOKEN}\" -X POST -d '{\"body\": \"This is my first test comment from jenkins\",\"commit_id\": \"4d0f019b93c11f1fabc8313da4c461dbdbde1fd5\",\"path\": \"Jenkinsfile\",\"position\": 4}' \"https://github.***.com/api/v3/repos/***/${env.GIT_URL.tokenize("/")[-1].tokenize(".")[0]}/pulls/${PULL_REQUEST}/comments\""
                }
            }
        }
    }
}
Here is the error I see:
Running shell script
+ curl -s -H 'Authorization: token ****' -X POST -d '{"body": "This is my first test comment from jenkins","commit_id": "4d0f019b93c11f1fabc8313da4c461dbdbde1fd5","path": "Jenkinsfile","position": 4}' https://github.***.com/api/v3/repos/***/***/pulls/4/comments
{
  "message": "Validation Failed",
  "errors": [
    {
      "resource": "PullRequestReviewComment",
      "code": "invalid",
      "field": "path"
    }
  ],
  "documentation_url": "https://developer.github.com/enterprise/2.14/v3/pulls/comments/#create-a-comment"
}
I am wondering what the GitHub API is looking for as far as this error is concerned. My use case is just that I need to be able to post a comment to the PR I am building, as you can see, and this comment should be a straight comment on the PR, not on the issue in GitHub.
Any help/suggestions here will be greatly appreciated, as always.
I was able to figure this out by following this post:
Create comment on pull request. I wasn't quite understanding that GitHub treats every PR as an issue (while not vice versa), so what you can achieve with a POST to /repos/:owner/:repo/issues/:number/comments is exactly what I was looking for here. I could test this just fine using the below:
def PULL_REQUEST = env.CHANGE_ID
withCredentials([string(credentialsId: 'github-api', variable: 'GITHUB_TOKEN')]) {
    sh "curl -s -H \"Authorization: token ${GITHUB_TOKEN}\" -X POST -d '{\"body\": \"This is my first test comment from jenkins\"}' \"https://github.***.com/api/v3/repos/***/${env.GIT_URL.tokenize("/")[-1].tokenize(".")[0]}/issues/${PULL_REQUEST}/comments\""
}
This posted the comment "This is my first test comment from jenkins" just fine under the PR conversation tab, which is what I needed.
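If you do need the original diff-anchored review comment (the pulls/comments endpoint), the "invalid path" error usually means the commit_id/path pair does not match the PR's diff. Here is a rough, untested sketch of one way around that, using readJSON from the Pipeline Utility Steps plugin and assuming the standard pulls API fields (head.sha, files[].filename); the position value is a placeholder you would have to pick from the real diff:
withCredentials([string(credentialsId: 'github-api', variable: 'GITHUB_TOKEN')]) {
    def repo = env.GIT_URL.tokenize("/")[-1].tokenize(".")[0]
    def api = "https://github.***.com/api/v3/repos/***/${repo}"
    // head commit of the PR and a file that is actually part of the diff
    def pr = readJSON text: sh(script: "curl -s -H \"Authorization: token ${GITHUB_TOKEN}\" ${api}/pulls/${PULL_REQUEST}", returnStdout: true)
    def files = readJSON text: sh(script: "curl -s -H \"Authorization: token ${GITHUB_TOKEN}\" ${api}/pulls/${PULL_REQUEST}/files", returnStdout: true)
    def payload = groovy.json.JsonOutput.toJson([
        body     : 'Review comment from Jenkins',
        commit_id: pr.head.sha,
        path     : files[0].filename,
        position : 1
    ])
    sh "curl -s -H \"Authorization: token ${GITHUB_TOKEN}\" -X POST -d '${payload}' ${api}/pulls/${PULL_REQUEST}/comments"
}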

How to read log file from within pipeline?

I have a pipeline job that runs a maven build. In the "post" section of the pipeline, I want to get the log file so that I can perform some failure analysis on it using some regexes. I have tried the following:
def logContent = Jenkins.getInstance()
    .getItemByFullName(JOB_NAME)
    .getBuildByNumber(Integer.parseInt(BUILD_NUMBER))
    .logFile.text
Error for the above code:
Scripts not permitted to use staticMethod jenkins.model.Jenkins getInstance
currentBuild.rawBuild.getLogFile()
Error for the above code:
Scripts not permitted to use method hudson.model.Run getLogFile
From my research, when I encounter these, I should be able to go to the scriptApproval page and see a prompt to approve these scripts, but when I go to that page, there are no new prompts.
I've also tried loading the script in from a separate file and running it on a different node with no luck.
I'm not sure what else to try at this point, so that's why I'm here. Any help is greatly appreciated.
P.S. I'm aware of the BFA tool, and I've tried manually triggering the analysis early, but in order to do that, I need to be able to access the log file, so I run into the same issue.
You can use the pipeline step httpRequest from the HTTP Request plugin:
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo 'Test fetch build log'
            }
            post {
                always {
                    script {
                        def logUrl = env.BUILD_URL + 'consoleText'
                        def response = httpRequest(
                            url: logUrl,
                            authentication: '<credentialsId of jenkins user>',
                            ignoreSslErrors: true
                        )
                        def log = response.content
                        echo 'Build log: ' + log
                    }
                }
            }
        }
    }
}
If your Jenkins job can run on a Linux machine, you can use curl to achieve the same goal.
pipeline {
    agent any
    stages {
        stage('Build') {
            environment {
                JENKINS_AUTH = credentials('<credentialsId of jenkins user>')
            }
            steps {
                sh 'pwd'
            }
            post {
                always {
                    script {
                        def logUrl = env.BUILD_URL + 'consoleText'
                        def cmd = 'curl -u ${JENKINS_AUTH} -k ' + logUrl
                        def log = sh(returnStdout: true, script: cmd).trim()
                        echo 'Build log: ' + log
                    }
                }
            }
        }
    }
}
Both approaches above require the credentials to be in Username and password format. For more detail about what that is and how to add it in Jenkins, please look here.
Currently this is not possible via the RunWrapper object that is made available. See https://issues.jenkins.io/browse/JENKINS-46376 for a request to add this.
So the only options are:
- explicitly whitelisting the methods (a sketch of that route follows below), or
- reading the log via the URL as described in the other answer, which requires either anonymous read access or proper credentials.
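For the whitelisting route, a rough sketch of what it could look like once an administrator approves the flagged signatures (RunWrapper.getRawBuild and Run.getLog) on the In-process Script Approval page; the regex and line count are placeholders for your own failure analysis:
post {
    always {
        script {
            // requires script approval of RunWrapper.getRawBuild() and Run.getLog(int)
            def lines = currentBuild.rawBuild.getLog(2000)   // last 2000 log lines as a List<String>
            def logContent = lines.join('\n')
            if (logContent =~ /(?i)OutOfMemoryError/) {      // example failure pattern
                echo 'Build log matched the failure pattern'
            }
        }
    }
}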

Fetching GitHub payload in a Jenkinsfile

Need some help with fetching the GitHub payload into the Jenkinsfile without installing any plugin.
If anyone can provide a Jenkinsfile code snippet to access the GitHub payload from the webhook, it would be of great help.
I am able to call the Jenkins job from the GitHub webhook, but I need the payload as well to process further.
Any help would be appreciated. Thanks.
Please find the Groovy script below:
stage('Pull Request Info') {
    agent {
        docker {
            image 'maven'
            args '-v $HOME/.m2:/home/jenkins/.m2 -ti -u 496 -e MAVEN_CONFIG=/home/jenkins/.m2 -e MAVEN_OPTS=-Xmx2048m'
        }
    }
    steps {
        script {
            withCredentials([usernameColonPassword(credentialsId: "${env.STASH_CREDENTIAL_ID}",
                    variable: 'USERPASS')]) {
                def hasChanges = maven.updateDependencies()
                if (hasChanges != '') {
                    def pr1 = sh(
                        script: "curl -s -u ${"$USERPASS"} -H 'Content-Type: application/json' https://xx.example/rest/some/endpoint",
                        returnStdout: true
                    )
                    def pr = readJSON(text: pr1)
                    println pr.fromRef
                }
            }
        }
    }
}
The script above uses curl to fetch the details about the pull request. I have stored the credentials in Jenkins and created an environment variable for the credentialsId.
You can replace the URL with your endpoint. You can also modify the script to use jq if you have it installed on the machine.
In this case, I'm using readJSON to parse the JSON, which is part of the Pipeline Utility Steps plugin. My question would be: why not use the plugin, since it provides the needed functionality?
If you still don't want to use a plugin, take a look at JSON parsing with Groovy.
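For the plugin-free route, here is a minimal sketch using plain Groovy's JsonSlurper. The GITHUB_PAYLOAD parameter is hypothetical (however you choose to receive the webhook body), and sandboxed pipelines may require script approval for these calls; keeping the parsing in a @NonCPS helper avoids serialization issues with the lazy maps JsonSlurper returns:
import groovy.json.JsonSlurper

@NonCPS
def parsePayload(String text) {
    def json = new JsonSlurper().parseText(text)
    // extract only the serializable bits you need before returning
    return [action: json.action, repo: json.repository?.full_name]
}

def info = parsePayload(params.GITHUB_PAYLOAD)   // hypothetical string parameter holding the webhook body
echo "action=${info.action} repo=${info.repo}"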

Jenkinsfile variable scope

I'm trying to call a remote job on another Jenkins server; I have this working fine via a shell script. However, trying to translate it into a Jenkinsfile is causing me issues. The environment variable is always "null" when used inside a stage, even though this article says it should be globally available.
pipeline {
    agent any
    /* get crumb for CSRF */
    environment {
        def crumb = sh 'curl https://jenkins-remote/crumbIssuer/'
    }
    stages {
        /* call remote job */
        stage("remote") {
            steps {
                sh "curl -X POST -H ${env.crumb} https://jenkins-remote/buildWithParameters?foo"
            }
        }
    }
}
The trimmed output looks like:
[remote_pipeline] Running shell script
+ curl -X POST -H null
I am using Jenkins v2.89.4, new "Pipeline" job, "pipeline script".
Thanks to #TrafLaf for pointing out the variable is null because it does not get set to the output of the curl command. My hacky solution was this:
environment {
    def crumbRequest = sh 'curl https://jenkins-remote/crumbIssuer/ > crumbHeader'
    crumbHeader = readFile('crumbHeader').trim()
}
As per the official documentation, this is how you define environment variables:
pipeline {
    agent any
    environment {
        DISABLE_AUTH = 'true'
        DB_ENGINE = 'sqlite'
    }
    stages {
        stage('Build') {
            steps {
                echo "${env.DB_ENGINE}" // to access
            }
        }
    }
}
But you have coded it wrong:
environment {
    def crumb = sh 'curl https://jenkins-remote/crumbIssuer/'
}
So please adjust the rest accordingly.
The sh task can now return output, so in theory the following should work (untested):
environment {
    crumb = sh(script: 'curl https://jenkins-remote/crumbIssuer/',
               returnStdout: true).trim()
}
