We have one Pipeline job which is structured like the example below:
parallel(
    "Stage-First": {
        stage("First") {
            catchError(buildResult: 'UNSTABLE', stageResult: 'FAILURE') {
                build job: 'Trigger-Another-Job', parameters: [
                    string(name: 'NAME', value: "testname"),
                    string(name: 'IP', value: "1.1.1.1"),
                    string(name: 'GATEWAY', value: "10.12.12.1"),
                    string(name: 'TENANT', value: "test")
                ]
            }
        }
    },
    // ...stage2, stage3, etc.
)
We have around 50 stages now, and they all repeat this same pattern one after another.
But now we need to add 50 more, different stages, and that triggered this exception:
java.lang.RuntimeException: Method code too large!
I am thinking of storing the values somewhere else and doing this parallel operation in a loop so that I won't hit the above exception.
Any suggestions from Jenkins experts here?
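For reference, a minimal sketch of that idea (the job name and parameter names come from the snippet above; the deployments list and its fields are hypothetical stand-ins for wherever you store the values, e.g. a JSON/YAML file read with readJSON/readYaml):

def deployments = [
    [name: 'testname', ip: '1.1.1.1', gateway: '10.12.12.1', tenant: 'test'],
    // ...one entry per stage, kept here or loaded from an external file
]

def branches = [:]
deployments.each { d ->
    // Each closure captures its own 'd', so every branch gets its own values.
    branches["Stage-" + d.name] = {
        stage(d.name) {
            catchError(buildResult: 'UNSTABLE', stageResult: 'FAILURE') {
                build job: 'Trigger-Another-Job', parameters: [
                    string(name: 'NAME', value: d.name),
                    string(name: 'IP', value: d.ip),
                    string(name: 'GATEWAY', value: d.gateway),
                    string(name: 'TENANT', value: d.tenant)
                ]
            }
        }
    }
}
parallel branches

Because the branch bodies are built at runtime, the Jenkinsfile stays the same size no matter how many entries the list holds, which keeps you under the 64 KB bytecode-per-method limit behind that exception.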
I am able to run parallel Jenkins jobs using the parallel syntax and to get the result of each child job back using wait: true and propagate: false.
Example:
pipeline {
    agent any
    stages {
        stage('Parallel Jobs') {
            steps {
                script {
                    def result = parallel(
                        "JobAKey": {
                            build job: "job-A", wait: true, propagate: false, parameters: [string(name: 'param1', value: val1)]
                        },
                        "JobBKey": {
                            build job: "job-B", wait: true, propagate: false, parameters: [string(name: 'param1', value: val1)]
                        }
                    )
                    print(result['JobAKey'].result)
                    print(result['JobBKey'].result)
                    if (result['JobAKey'].result == 'SUCCESS' || result['JobBKey'].result == 'SUCCESS') {
                        build job: "job-C", wait: false, parameters: [string(name: 'param2', value: val2)]
                        build job: "job-D", wait: false, parameters: [string(name: 'param2', value: val2)]
                    }
                }
            }
        }
    }
}
I want to run job-C and job-D if any one of job-A or job-B returns SUCCESS.
If job-B returns SUCCESS quickly, then I don't want to wait for job-A to complete (SUCCESS / FAILURE), or vice versa. I don't know which job is going to finish quickly and return SUCCESS.
With wait: true, parallel waits for both jobs to finish and only then starts job-C and job-D.
Is there something like successFast (like failFast)?
The condition is: any one job from the list of parallel jobs should return SUCCESS on completion, so that we can start the next jobs (job-C and job-D).
Thanks in advance!
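There is no built-in successFast option as far as I know, but one workaround (a sketch only, to run inside a script block; the job names and the val1/val2 placeholders are taken from the question, and triggerNext is a hypothetical helper) is to let whichever branch succeeds first fire job-C and job-D itself, guarded by a flag so they run only once:

import java.util.concurrent.atomic.AtomicBoolean

def triggered = new AtomicBoolean(false)

// Fires job-C and job-D exactly once, from the first branch whose job succeeds.
def triggerNext = {
    if (triggered.compareAndSet(false, true)) {
        build job: "job-C", wait: false, parameters: [string(name: 'param2', value: val2)]
        build job: "job-D", wait: false, parameters: [string(name: 'param2', value: val2)]
    }
}

parallel(
    "JobAKey": {
        def r = build(job: "job-A", wait: true, propagate: false, parameters: [string(name: 'param1', value: val1)])
        if (r.result == 'SUCCESS') { triggerNext() }
    },
    "JobBKey": {
        def r = build(job: "job-B", wait: true, propagate: false, parameters: [string(name: 'param1', value: val1)])
        if (r.result == 'SUCCESS') { triggerNext() }
    }
)

parallel itself still waits for both branches to finish, but job-C and job-D start as soon as the first SUCCESS comes in rather than after both jobs complete.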
I have two declarative pipelines in Jenkins. I would like to trigger pipelineB with parameters from a stage that is running inside pipelineA, and check the build/stage result of pipelineB to decide whether pipelineA should continue or be aborted.
If pipelineB's build/stage result is success, then pipelineA should continue with stage C; otherwise it should be aborted.
stage('A') {
    steps {
        script {
            // Do something
        }
    }
}
stage('B') {
    steps {
        script {
            // Trigger another pipeline and check its result
            build job: 'pipelineB', parameters: [
                string(name: 'param1', value: "value1")
            ]
        }
    }
}
stage('C') {
    steps {
        script {
            // Do something
        }
    }
}
Get the downstream job's build result and assign it to the upstream job's build result:
script {
    // Trigger another pipeline and check its result
    ret = build(job: 'pipelineB',
                parameters: [
                    string(name: 'param1', value: "value1")
                ],
                propagate: true,
                wait: true)
    echo ret.result
    currentBuild.result = ret.result
}
Read here for details.
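Note that with propagate: true, a failing pipelineB fails the build step itself, so the two lines after it only run on success. If you want pipelineA to stay in control and abort explicitly before stage C, a variant sketch (same job and parameter names as above):

script {
    def ret = build(job: 'pipelineB',
                    parameters: [string(name: 'param1', value: "value1")],
                    propagate: false,
                    wait: true)
    if (ret.result != 'SUCCESS') {
        // Stop pipelineA here so stage C never runs.
        error("pipelineB finished with ${ret.result}; aborting pipelineA")
    }
}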
As the title states, I want to capture the logs for all the stages in my build, which looks like this:
pipeline {
    agent any
    stages {
        stage('Build First Repo') {
            steps {
                build job: 'jobOne', parameters: [string(name: 'branch', value: "${params.branch}")], quietPeriod: 1
            }
        }
        stage('Build Second Repo') {
            steps {
                build job: 'jobTwo', parameters: [string(name: 'branch', value: "${params.someOtherBranch}")], quietPeriod: 1
            }
        }
        stage('Deploy') {
            steps {
                build job: 'jobThree', parameters: [string(name: 'buildEnvironment', value: "${params.environment}")], quietPeriod: 1
            }
        }
        stage('Remote Build') {
            steps {
                build job: 'jobFour', parameters: [string(name: 'Hosts', value: "${params.hosts}")], quietPeriod: 1
            }
        }
    }
    post {
        always {
            mail to: 'me@mydomain.com', subject: "${currentBuild.currentResult} - ${currentBuild.fullDisplayName}", body: "...${currentBuild.rawBuild.getLog(100)}"
        }
    }
}
Currently, I can only get the pipeline build log (which I am e-mailing in the post/always section), which is helpful but not sufficient; I'd like to get the logs from each of the stages. I thought of maybe capturing them per stage and creating an environment variable or something but I'm not sure how to even access the logs for the build of those jobs. Can someone point me in the right direction on how to capture the logs for those jobs?
You can add a post section after any stage, and set up sending emails there.
To send a single email from the build, you can use the stash/unstash steps: stash the log in each section, then finally unstash them all and send one email.
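A sketch of what capturing a downstream job's log could look like (assumptions: this runs inside a script block, the jobOne.log file name and stash name are made up, and rawBuild on the returned object needs script-security approval on a sandboxed Jenkinsfile):

script {
    def b = build job: 'jobOne',
                  parameters: [string(name: 'branch', value: "${params.branch}")],
                  propagate: false, wait: true
    // build() returns a RunWrapper; getRawBuild().getLog(n) gives the last n lines.
    def downstreamLog = b.rawBuild.getLog(200).join('\n')
    writeFile file: 'jobOne.log', text: downstreamLog
    // Stash it so the post section can unstash every stage's log and mail them together.
    stash name: 'jobOne-log', includes: 'jobOne.log'
}

With propagate: false the log is captured even when jobOne fails, which matches the always condition in the post section.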
I'm trying to convert my freestyle job to a declarative pipeline job, since the pipeline provides more flexibility. However, I cannot figure out how to use the NodeLabel Parameter plugin (https://wiki.jenkins.io/display/JENKINS/NodeLabel+Parameter+Plugin) in a pipeline.
pipeline {
    agent any
    parameters {
        // Would like something like LabelParameter here
    }
    stages {
        stage('Dummy1') {
            steps {
                cleanWs()
                sh('ls')
                sh('pwd')
                sh('hostname')
            }
        }
        stage('Dummy2') {
            steps {
                node("comms-test02") {
                    sh('ls')
                    sh('pwd')
                    sh('hostname')
                }
            }
        }
    }
}
I basically just need a way to start the job with a parameter that specifies where to build it (using a slave label).
Jenkins requires an agent field to be present, which I set to 'any'. But there doesn't seem to be a label parameter available?
As an alternative, I tried using the node step (https://jenkins.io/doc/pipeline/steps/workflow-durable-task-step/#node-allocate-node). But that leaves me with two running jobs, which works but doesn't look that pretty.
Does anyone know if the NodeLabel Parameter plugin can be used? Or maybe someone has a cleaner approach?
Edit: Maybe I wasn't clear. I need to be able to run jobs on different nodes, and the node to run on should be decided when triggering the job, through a parameter. The NodeLabel plugin does this perfectly, but I have not been able to reproduce that behavior in a pipeline.
Here's a full example:
pipeline {
    parameters {
        choice(name: 'node', choices: nodesByLabel('label'), description: 'The node to run on') // example 1: just list all the nodes with the label
        choice(name: 'node2', choices: ['label'] + nodesByLabel('label'), description: 'The node to run on') // example 2: add the label itself as the first choice to make "any of the nodes" the default choice
    }
    agent none
    stages {
        stage('Test') {
            agent { label params.node }
            stages {
                stage('Print environment settings') {
                    steps {
                        echo "running on ${env.NODE_NAME}"
                        sh 'printenv | sort'
                    }
                }
            }
        }
    }
}
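If I recall correctly, the nodesByLabel step comes from the Pipeline Utility Steps plugin, so that plugin needs to be installed for the choices above to populate.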
Let's say you added a parameter (say, named slaveName) using the NodeLabel plugin on your pipeline. You now need to extract the value of slaveName and feed it into the agent -> node -> label field.
You can specify the node using the node property inside the agent, like this:
agent {
    node {
        label "${slaveName}"
    }
}
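Put together, a minimal sketch (assuming the job already has a NodeLabel "Node" parameter named slaveName defined in its configuration, since the declarative parameters block itself offers no node parameter type):

pipeline {
    agent {
        node {
            // Resolves to whichever node the NodeLabel parameter selected at trigger time.
            label "${slaveName}"
        }
    }
    stages {
        stage('Dummy1') {
            steps {
                sh 'hostname'
            }
        }
    }
}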
The following script worked for me to run multiple jobs in parallel on different nodes.
I took the reference from the build step plugin documentation:
https://www.jenkins.io/doc/pipeline/steps/pipeline-build-step/
def build_one() {
    parallel one: {
        stage('XYZ') {
            catchError(buildResult: 'SUCCESS', stageResult: 'FAILURE') {
                build job: 'yourDownStreamJob', parameters: [
                    [$class: 'NodeParameterValue', name: 'NodeToRun', labels: ['nodeName'], nodeEligibility: [$class: 'AllNodeEligibility']],
                    string(name: 'ParentBuildName', value: "XX"),
                    string(name: 'Browser', value: 'chrome'),
                    string(name: 'Environment', value: 'envName')
                ]
            }
        }
    },
    two: {
        stage('SecondArea') {
            catchError(buildResult: 'SUCCESS', stageResult: 'FAILURE') {
                build job: 'yourDownStreamJob', parameters: [
                    [$class: 'NodeParameterValue', name: 'NodeToRun', labels: ['Your'], nodeEligibility: [$class: 'AllNodeEligibility']],
                    string(name: 'ParentBuildName', value: "XYX"),
                    string(name: 'Browser', value: 'firefox'),
                    string(name: 'Environment', value: 'envName')
                ]
            }
        }
    }
}
build_one()
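For context: NodeParameterValue is the parameter type contributed by the NodeLabel Parameter plugin, so the downstream job has to define a Node parameter named NodeToRun for these calls to work; the labels list holds the node (or label) names to run on.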
I have project A and project B. I would like to pass parameters (like the BranchName and ArtifactoryID) from project A to project B. Both are multibranch pipelines using a declarative Jenkinsfile.
When I use the Snippet Generator, it tells me the project "is not parameterized". When looking at the config of the multibranch pipeline, I don't see a way to parameterize it. What am I missing?
A Google search turns up this issue, but I'm not sure how it's supposed to pass params between multibranch pipelines: https://issues.jenkins-ci.org/browse/JENKINS-32780
I figured this out. I leveraged an answer from a comment here: Pipeline pass parameters to downstream jobs
For a detailed explanation using my example shown above, my Project A jenkinsfile would have the following before the stages:
parameters {
    string(name: 'BRANCH_PASSED_OVER', defaultValue: "${env.BRANCH_NAME}", description: 'pass branch value')
    string(name: 'PERSON2', defaultValue: 'Mr Jenkins', description: 'Who should I say hello to?')
}
...and the following for the build step phase:
stage('Build downstream') {
    steps {
        build job: 'BUILD/CMTest2/' + env.BRANCH_NAME.replaceAll("/", "%2F"), wait: false, parameters: [string(name: 'PERSON2', value: params.PERSON2), string(name: 'PASS_BRANCH_NAME', value: env.BRANCH_NAME)]
    }
}
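One note on the replaceAll("/", "%2F"): multibranch projects expose each branch as a nested job, and slashes in branch names (for example feature/foo) are URL-encoded as %2F in the branch job's path, so the encoding is required for the build step to find the right branch job.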
In Project B's Jenkinsfile, I can then read the parameter like so:
stage('Collect Info') {
    steps {
        echo "Hello ${params.PERSON2}"
        echo "PASS_BRANCH_NAME: ${params.PASS_BRANCH_NAME}"
    }
}