In my Jenkins Pipeline, the first stage executes a freestyle job. I then need to get the output/logs from that job, because it returns a string of IPs that will be used in my second stage.
def getEc2ListResult = []
pipeline {
agent {
label 'master'
}
stages{
stage('Get EC2 List'){
steps {
script {
def getEc2List = build job: 'get-ec2-by-tag', parameters: [
string(name: 'envTagValue', value: "${envTagValue}"),
string(name: 'OS', value: "${OS}")
]
getEc2ListResult = getEc2List.getPreviousBuild().getLog(20)
}
}
}
}
}
This is the error that I'm getting:
hudson.remoting.ProxyException: groovy.lang.MissingMethodException: No signature of method: org.jenkinsci.plugins.workflow.support.steps.build.RunWrapper.getLog() is applicable for argument types: (java.lang.Integer) values: [20]
Possible solutions: getId(), getAt(java.lang.String), getClass()
getEc2List is of type RunWrapper, and so is getEc2List.getPreviousBuild().
RunWrapper doesn't supply a getLog() API; that method is supplied by the underlying raw build.
You can get getEc2List's raw build by calling getEc2List.rawBuild or getEc2List.getRawBuild().
But getRawBuild() is not @Whitelisted in RunWrapper, so you will get the following message in the Jenkins log:
Scripts not permitted to use method
org.jenkinsci.plugins.workflow.support.steps.build.RunWrapper
getRawBuild. Administrators can decide whether to approve or reject
this signature.
One option is to ask a Jenkins admin to approve the signature under In-process Script Approval.
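Once approved, the log can be read straight off the raw build. A minimal sketch (note that it reads the log of the build that was just triggered, not its previous build, and that getLog(20) returns the last 20 lines as a list of strings):
script {
    def getEc2List = build job: 'get-ec2-by-tag', parameters: [
        string(name: 'envTagValue', value: "${envTagValue}"),
        string(name: 'OS', value: "${OS}")
    ]
    // needs script approval for getRawBuild() and getLog()
    getEc2ListResult = getEc2List.getRawBuild().getLog(20)
}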
Another option is to fetch the console log over HTTP with the job's credentials, as follows:
stage('') {
environment {
JENKINS_AUTH = credentials('<credential id of jenkins auth>')
// add a credential with a Jenkins user:password before using it here
}
steps {
script {
def getEc2List = build job: 'get-ec2-by-tag', parameters: [
string(name: 'envTagValue', value: "${envTagValue}"),
string(name: 'OS', value: "${OS}")
]
logUrl = getEc2List.absoluteUrl + 'consoleText'
log = sh(script: 'curl -u "$JENKINS_AUTH" -k ' + logUrl,
returnStdout: true).trim()
// parse the log to extract the ip
...
}
}
}
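To then extract the IPs, a hypothetical parsing step could use Groovy's String.findAll with a regex (the pattern and variable names are only illustrative):
// collect every IPv4-looking token from the console log
getEc2ListResult = log.findAll(/\b(?:\d{1,3}\.){3}\d{1,3}\b/)
echo "Found IPs: ${getEc2ListResult.join(', ')}"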
Related
pipeline {
agent { label 'linux' }
stages{
stage("verify1"){
steps {
script {
build(job: "verfiy1", parameters: [string(name: 'verfiy1', value: "${params.verfiy1}")])
}
}
}
stage("verify2"){
steps {
script {
build(job: "verfiy2", parameters: [string(name: 'verfiy2', value: "${params.verfiy2}")])
}
}
}
stage("verify3"){
steps {
script {
build(job: "verify3", parameters: [string(name: 'verify3', value: "${params.verify3}")])
}
}
}
}
}
=================================================================
Hello
Can anyone help me? Right now the above pipeline builds all 3 jobs successfully, but every single job executes on a new EC2 agent instance instead of the instance where the pipeline started. What I expect is that once the pipeline starts, all the builds it triggers execute on the same node (EC2 instance).
Thanks in advance
You can pass the upstream job's agent node to the downstream job:
Add one more job parameter to the downstream job to accept a node label.
Pass the upstream job's agent node via env.NODE_NAME when calling the build job step.
// verify 1 job
pipeline {
agent { label "${params.agentNode}" }
parameters {
string(name: "agentNode",
defaultValue="<give default value in case run it directly>" )
}
}
// upstream job
build(job: "verify1", parameters: [
string(name: 'agentNode', value: "${env.NODE_NAME}"),
string(name: 'verify1', value: "${params.verify1}")
])
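Applied to the pipeline in the question, each stage passes the current node along; a sketch of one stage, assuming every downstream verify job declares the agentNode parameter as shown above:
stage("verify3"){
    steps {
        script {
            build(job: "verify3", parameters: [
                string(name: 'agentNode', value: "${env.NODE_NAME}"),
                string(name: 'verify3', value: "${params.verify3}")
            ])
        }
    }
}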
Using the declarative pipeline syntax, I want to be able to define parameters based on an array of repos, so that when starting the build, the user can uncheck any repos that should not be included when the job runs.
final String[] repos = [
'one',
'two',
'three',
]
pipeline {
parameters {
booleanParam(name: ...) // static param
// now include a booleanParam for each item in the `repos` array
// like this but it's not allowed
script {
repos.each {
booleanParam(name: it, defaultValue: true, description: "Include the ${it} repo in the release?")
}
}
}
// later on, I'll loop through each repo and do stuff only if its value in `params` is `true`
}
Of course, you can't have a script within the parameters block, so this won't work. How can I achieve this?
Using the Active Choices Parameter plugin is probably the best choice, but if for some reason you can't (or don't want to) use a plugin, you can still achieve dynamic parameters in a Declarative Pipeline.
Here is a sample Jenkinsfile:
def list_wrap() {
sh(script: 'echo choice1 choice2 choice3 choice4 | sed -e "s/ /\\n/g"', returnStdout: true)
}
pipeline {
agent any
stages {
stage ('Gather Parameters') {
steps {
timeout(time: 30, unit: 'SECONDS') {
script {
properties([
parameters([
choice(
description: 'List of arguments',
name: 'service_name',
choices: 'DEFAULT\n' + list_wrap()
),
booleanParam(
defaultValue: false,
description: 'Whether we should apply changes',
name: 'apply'
)
])
])
}
}
}
}
stage ('Run command') {
when { expression { params.apply == true } }
steps {
sh """
echo choice: ${params.service_name} ;
"""
}
}
}
}
This embeds a script {} in a stage, which calls a function, which runs a shell script on the agent/node of the Declarative Pipeline, and uses the script's output to set the choices for the parameters. The parameters are then available in the next stages.
The gotcha is that you must first run the job with no build parameters in order for Jenkins to populate the parameters, so they're always going to be one run out of date. That's why the Active Choices Parameter plugin is probably a better idea.
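Applied to the question's repos array, the same pattern can generate one booleanParam per repo (a sketch; the same one-run-behind gotcha applies):
def repos = ['one', 'two', 'three']

pipeline {
    agent any
    stages {
        stage('Gather Parameters') {
            steps {
                script {
                    properties([
                        parameters(
                            // one checkbox per repo, checked by default
                            repos.collect { repo ->
                                booleanParam(name: repo, defaultValue: true,
                                    description: "Include the ${repo} repo in the release?")
                            }
                        )
                    ])
                }
            }
        }
    }
}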
You could also combine this with an input command to cause the pipeline to prompt the user for a parameter:
script {
def INPUT_PARAMS = input message: 'Please Provide Parameters', ok: 'Next',
parameters: [
choice(name: 'ENVIRONMENT', choices: ['dev','qa'].join('\n'), description: 'Please select the Environment'),
choice(name: 'IMAGE_TAG', choices: getDockerImages(), description: 'Available Docker Images')]
env.ENVIRONMENT = INPUT_PARAMS.ENVIRONMENT
env.IMAGE_TAG = INPUT_PARAMS.IMAGE_TAG
}
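getDockerImages() above is a user-supplied helper, not a built-in; a hypothetical stand-in that returns the choices as a list might look like:
// hypothetical helper: return the available image tags for the choice list
def getDockerImages() {
    return ['myapp:1.0', 'myapp:1.1', 'myapp:latest']
}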
Credit goes to Alex Lashford (https://medium.com/disney-streaming/jenkins-pipeline-with-dynamic-user-input-9f340fb8d9e2) for this method.
You can use Jenkins's CHOICE parameter, with which the user can select a repository.
pipeline {
agent any
parameters
{
choice(name: "REPOS", choices: ['REPO1', 'REPO2', 'REPO3'])
}
stages {
stage ('stage 1') {
steps {
// the repository selected by user will be printed
println("$params.REPOS")
}
}
}
}
You can also use the Active Choices Parameter plugin if you want multiple selection: https://plugins.jenkins.io/uno-choice/#documentation
You can also visit the Pipeline Syntax page to generate the code snippet for the parameters block, then copy the snippet and paste it into the Jenkinsfile at the start.
I have a parameterized Jenkins Pipeline with a default value and I'm trying to pass that param as a script argument but it doesn't seem to pass anything. Here is the script :
pipeline {
agent any
stages {
stage('Building') {
steps {
build job: 'myProject', parameters: [string(name: 'configuration', value: '${configuration}')]
}
}
stage('Doing stuff') {
steps {
sh "~/scripts/myScript ${configuration}"
}
}
}
}
It seems to work for the build step but not for the script; it returns an error saying I have no argument.
I tried to get it with ${configuration}, ${params.configuration} and $configuration.
What is the right way to access a param and pass it correctly to a script?
Thanks.
Actually, you are using the build step to pass a parameter to the Jenkins job 'myProject':
build job: 'myProject', parameters: [string(name: 'configuration', value: '${configuration}')]
Note that within single quotes, '${configuration}' is passed as a literal string; Groovy only interpolates double-quoted strings, so use value: "${params.configuration}" if you mean to pass the current build's parameter value along.
If you want to declare a parameter in this job, you need to declare it in a "parameters" block.
pipeline {
agent any
parameters {
string(defaultValue: '', description: '', name: 'configuration')
}
stages {
stage('Doing stuff') {
steps {
sh "~/scripts/myScript ${configuration}"
}
}
}
}
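With the parameter declared, a double-quoted value then interpolates correctly when an upstream job passes it along (a sketch):
build job: 'myProject', parameters: [string(name: 'configuration', value: "${params.configuration}")]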
I have a multibranch pipeline in Jenkins.
I defined multiple checkboxes (over 20), one for each parameter to be passed to a script, which then starts my application and runs the corresponding test case (this might not be an optimal solution, but the framework was created before I started at my current company and I am not going to refactor it):
booleanParam(name: 'cluster_number', defaultValue: false, description: '')
booleanParam(name: 'post_cluster_wu', defaultValue: false, description: '')
etc.
I need to collect the user's selection for each checkbox (true/false). I would prefer to do it in a loop, like this:
sh """
for (element in params) {
// testing:
echo "${element.key} ${element.value}"
}
"""
but keep getting an error:
[Pipeline] End of Pipeline
groovy.lang.MissingPropertyException: No such property: element for class: groovy.lang.Binding
at groovy.lang.Binding.getVariable(Binding.java:63)
at org.jenkinsci.plugins.scriptsecurity.sandbox.groovy.SandboxInterceptor.onGetProperty(SandboxInterceptor.java:264)
at org.kohsuke.groovy.sandbox.impl.Checker$6.call(Checker.java:288)
I also tried putting the loop outside of the shell script. No luck so far.
steps {
echo "username: ${params.OWNER_USERNAME}"
for (element in params) {
echo "${element.key} ${element.value}"
}
...
I wonder if anyone has been able to loop through params?
Thanks in advance!
This works:
pipeline {
agent any
parameters {
booleanParam(name: 'alpha', defaultValue: true)
booleanParam(name: 'beta', defaultValue: true)
booleanParam(name: 'gamma', defaultValue: false)
}
stages {
stage('only') {
steps {
script {
params.keySet().each {
echo "The value of the ${it} parameter is: ${params[it]}"
}
}
}
}
}
}
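Building on this, a sketch of collecting just the checked boxes and handing them to a script (the script path is hypothetical; a plain for loop avoids CPS issues with closures):
script {
    def selected = []
    // gather the names of every boolean parameter the user left checked
    for (p in params) {
        if (p.value == true) {
            selected << p.key
        }
    }
    sh "~/scripts/run_tests ${selected.join(' ')}" // hypothetical script path
}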
I have a groovy script that calls other jobs to stop and start tasks (see below). I would like to re-use the code inside the steps block over and over again. Can I do this without having to repeat the code?
Basically I want to add a stage like this for each API that I can start or stop, then another, and so on. The jobs are built with parameters in Jenkins, where radio buttons decide whether to stop or start.
#!/usr/bin/env groovy
pipeline {
environment {
containerInstanceIdsToStartOn = "463b8b6f-9388-4fbd-8257-b056e28c0a43"
region = "eu-west-1"
cluster = "mis-core-dev"
}
agent any
stages {
stage('Authentication API (dev)') {
environment {
apiName = "authentication_API"
taskDefinitionFamily = "mis-core-dev-authentication-api"
taskDefinition = "mis-core-dev-authentication-api"
}
steps {
script {
if (params."${apiName}".contains('Stop Task')) {
build(job: 'Stop ECS Task (utility)',
parameters: [
string(name: 'region', value: params."${region}"),
string(name: 'cluster', value: params."${cluster}"),
string(name: 'family', value: params."${taskDefinitionFamily}")
])
}
else if (params."${apiName}".contains('Start Task')) {
build(job: 'Start ECS Task (utility)',
parameters: [
string(name: 'region', value: params."${region}"),
string(name: 'cluster', value: params."${cluster}"),
string(name: 'taskDefinition', value: params."${taskDefinition}"),
string(name: 'containerInstanceIds', value: params."${containerInstanceIdsToStartOn}")
])
}
else if (params."${apiName}" == null || params."${apiName}" == "") {
echo "Did you forget to check a box?"
}
}
}
}
}
post {
always {
cleanWs()
}
}
}
It is not possible to share parts of a declarative pipeline. The declarative pipeline DSL is processed in a special way at runtime, so you can't split out some of the pieces. You can share some of the logic used inside the blocks (like the code inside a script block), but otherwise the sharing capabilities are limited to basically the entire pipeline definition itself.
From the Shared Library documentation:
Only entire pipelines can be defined in shared libraries as of this time. This can only be done in vars/*.groovy, and only in a call method. Only one Declarative Pipeline can be executed in a single build, and if you attempt to execute a second one, your build will fail as a result.
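What you can share is the logic inside the script blocks, by moving it into a shared library step. A sketch, assuming a shared library exposing a hypothetical vars/ecsTaskControl.groovy (the step name and map keys are illustrative):
// vars/ecsTaskControl.groovy (hypothetical shared library step)
def call(Map cfg) {
    def choice = params."${cfg.apiName}"
    if (!choice) {
        echo "Did you forget to check a box?"
    } else if (choice.contains('Stop Task')) {
        build(job: 'Stop ECS Task (utility)', parameters: [
            string(name: 'region', value: cfg.region),
            string(name: 'cluster', value: cfg.cluster),
            string(name: 'family', value: cfg.family)
        ])
    } else if (choice.contains('Start Task')) {
        build(job: 'Start ECS Task (utility)', parameters: [
            string(name: 'region', value: cfg.region),
            string(name: 'cluster', value: cfg.cluster),
            string(name: 'taskDefinition', value: cfg.taskDefinition),
            string(name: 'containerInstanceIds', value: cfg.containerInstanceIds)
        ])
    }
}
Each stage in the pipeline then shrinks to a single call inside its script block:
script {
    ecsTaskControl(apiName: 'authentication_API',
        region: env.region,
        cluster: env.cluster,
        family: env.taskDefinitionFamily,
        taskDefinition: env.taskDefinition,
        containerInstanceIds: env.containerInstanceIdsToStartOn)
}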