In a Jenkins declarative pipeline we can define build parameters like this:
pipeline {
    …
    parameters {
        string(name: 'PARAMETER', defaultValue: 'INITIAL_DEFAULT')
        choice(name: 'CHOICE', choices: ['THIS', 'THAT'])
    }
    …
}
However, the job's parameter definitions are only updated when the job runs after the build parameters dialog has already been shown. That is, when I change INITIAL_DEFAULT to something else, the next build will still default to INITIAL_DEFAULT, and only the one after that will use the new value.
The same problem exists with the choices, and there it is even more serious: a string default can easily be overwritten when starting the build, but if a new option isn't in the list yet, it cannot be selected at all.
So is there a way to define functions or expressions that will be executed before the parameter dialog is shown, to calculate the current values (from files, variables in global settings, or any other suitable external configuration)?
I remember using some plugins for this in the past with free-style jobs, but searching the plugin repository I can't find any that mentions how to use it with pipelines.
I don't care too much that the same problem applies to adding and removing parameters, because that occurs rarely. But we have some parameters where the default changes often and we need the next nightly to pick up the updated value.
It turns out the Extended Choice Parameter plugin does work with pipelines, and the configuration can be generated by the Directive Generator. It looks something like this:
extendedChoice(
    name: 'PARAMETER',
    type: 'PT_TEXTBOX',
    defaultPropertyFile: '/var/lib/jenkins/something.properties',
    defaultPropertyKey: 'parameter'
)
(there are many more options available in the generator)
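To show how this slots into a declarative pipeline, here is a minimal sketch (the extendedChoice block is the one above; the agent and the echo stage are just illustrative filler, and the contents of something.properties would be a line like parameter=<current default>):

pipeline {
    agent any
    parameters {
        // The default shown in the "Build with Parameters" dialog is read from
        // the properties file, so editing the file changes the default without
        // needing an intermediate build to refresh the job configuration.
        extendedChoice(
            name: 'PARAMETER',
            type: 'PT_TEXTBOX',
            defaultPropertyFile: '/var/lib/jenkins/something.properties',
            defaultPropertyKey: 'parameter'
        )
    }
    stages {
        stage('Show value') {
            steps {
                echo "PARAMETER is ${params.PARAMETER}"
            }
        }
    }
}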
A Groovy script to read global environment variables can be found in this other answer.
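For reference, such a script is usually along these lines (a sketch only: SOME_GLOBAL_VAR is a made-up name, it needs to run with admin rights, e.g. from the Script Console or a system Groovy step, and the API calls should be verified against your Jenkins version):

import jenkins.model.Jenkins
import hudson.slaves.EnvironmentVariablesNodeProperty

// Look up the global environment variables configured under
// Manage Jenkins -> Configure System and read one of them.
def globalProps = Jenkins.get()
        .getGlobalNodeProperties()
        .getAll(EnvironmentVariablesNodeProperty)
def value = globalProps ? globalProps.first().envVars['SOME_GLOBAL_VAR'] : null
println "SOME_GLOBAL_VAR = ${value}"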
Related
Sometimes we want to create multiple jobs that use the same Jenkinsfile instead of a single one. This could happen, for example, because we want to keep the logs divided by parameters, instead of having a single job in which to look for the right log.
However, in this case we can't use the parameter definition in the Jenkinsfile, because whatever default value we define on the job instance is overwritten by the following execution with whatever is defined in the Jenkinsfile (and this also happens if we don't define a default value).
So, in this situation, the only way we have figured out is to remove the parameter definition from the Jenkinsfile and define the parameters directly on the jobs, which is not optimal.
I agree that this is the right behavior in most cases, as you don't want your parameters to be out of sync and unversioned, but is there a way to tell Jenkins to skip the parameter reconfiguration, or to override the default parameter written in the Jenkinsfile? Something that can be activated/deactivated job by job.
Had this problem myself, we solved it like this:
string(name: 'parameterName', defaultValue: params.parameterName ?: 'your default value')
Now the default values defined through Jenkins job configuration will not be overridden.
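In context, that looks something like this (a minimal sketch; the parameter name, the fallback value, and the stage are placeholders):

pipeline {
    agent any
    parameters {
        // On the very first run params.parameterName does not exist yet,
        // so the Elvis operator falls back to the literal default.
        // On later runs the value the job actually used is carried over,
        // so a default changed in the job configuration is not reset
        // to the value written in the Jenkinsfile.
        string(name: 'parameterName', defaultValue: params.parameterName ?: 'your default value')
    }
    stages {
        stage('Build') {
            steps {
                echo "parameterName is ${params.parameterName}"
            }
        }
    }
}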
I seem to have found a bug when trying to pass environment variables from one Jenkins job to another.
I have a Jenkins job which contains a PowerShell build step. My question is not about the PowerShell script, as that does exactly what I want (it goes to Artifactory, finds a list of all the builds and then gets the build number of the latest one). The script ends up with the Artifactory build number as a text string in '$LATEST_BUILD_NO_SLASH' (for clarity, this is not the Jenkins build number). This is eventually stored in an environment variable called 'LATEST_BUILD_NUM_VAL'.
This definitely creates an environment variable with my value stored in it, as can be seen in the 'Environment Variables' list.
This environment variable is passed in the standard way in the parameterized build step.
My issue is that when I use this environment variable in a downstream build, having passed it using 'LATEST_BUILD_NUM = ${LATEST_BUILD_NUM_VAL}', I get the literal string '${LATEST_BUILD_NUM_VAL}' as the value passed to the downstream job.
But if I pass a Jenkins-created environment variable, i.e. 'LATEST_BUILD_NUM = ${JOB_BASE_NAME}', I get the correct value in the downstream job.
I have spent all day banging my head against this and don't really know where to go from here. I seem to be creating the environment variable correctly, as it is in the environment variables list and it works if I use a standard environment variable. I have declared 'LATEST_BUILD_NUM' as a parameter in my downstream build.
Is there any other way of achieving what I am trying to do?
I have checked in the 'Jenkins Issues' log for issues with parameterised builds and I can't find anything similar to my issue.
In case it is of any relevance, the Jenkins Environment Injector plugin is v2.1.6 and the Parameterized Trigger plugin is v2.35.2.
This is easy to achieve in Jenkins Pipeline:
Your second job (JobB) is called from your first job (JobA) as a downstream job. Thus somewhere (probably at the end of your JobA pipeline) you will have:
build job: 'CloudbeeFolder1/Path/To/JobB', propagate: false, wait: false, parameters: [[$class: 'StringParameterValue', name: 'MY_PARAM', value: "${env.SOME_VALUE}"]]
Then in JobB on the "other side" you have:
environment {
    PARAM_FROM_PIPELINE = "${params.MY_PARAM}"
}
This gets the value of your parameter into an environment variable in JobB.
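Putting the JobB side together, a minimal sketch might look like this (assuming JobB is itself a declarative pipeline; the stage is illustrative):

pipeline {
    agent any
    parameters {
        // Receives the value sent by JobA's build step.
        string(name: 'MY_PARAM', defaultValue: '')
    }
    environment {
        PARAM_FROM_PIPELINE = "${params.MY_PARAM}"
    }
    stages {
        stage('Use parameter') {
            steps {
                echo "PARAM_FROM_PIPELINE is ${env.PARAM_FROM_PIPELINE}"
            }
        }
    }
}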
In the first job, under Post-build Actions, select "Trigger parameterized build on other projects".
In "Projects to build", enter the name of the downstream job.
Click "Add parameters" and choose "Predefined parameters", then supply the parameter in key=value format, e.g. Temp=${BUILD_ID}.
In the second job, tick "This project is parameterized", add a parameter of a suitable type (e.g. a string parameter) named Temp, and use it in your shell step or anywhere else as $Temp.
I am trying to create a Jenkins job that has dependent parameters.
Firstly I want to be able to choose a main parameter:
And then secondly to be able to choose from a set of options that are dependent parameters of the main parameter.
If I select a different main parameter:
I then want to have a different set of options as the dependencies to the second main parameter.
Please, can you help me with how I can achieve this?
I would suggest the Active Choices plugin (also known as "uno-choice"). (This question has references to both, though they're not the accepted answer.)
For your specific use case, you'll want to add two parameters to your job:
Active Choices Parameter
Name: MainOption
Script: Groovy Script
return ['A','B']
Active Choices Reactive Parameter
Name: DependentOption
Script: Groovy Script
def choices
switch (MainOption) {
    case 'A':
        choices = ['Blue','Green','Yellow']
        break
    case 'B':
        choices = ['Black','White','Grey']
        break
    default:
        choices = ['N/A']
        break
}
return choices
Fallback Script: Groovy Script
return ['Option error']
Referenced parameters:
MainOption
The "Referenced parameters" setting is the key - when that value is changed, the plugin will re-evaluate the Groovy script, giving you the dependent parameter effect.
For all of you who stumble upon the same kind of problem (like I did): there is a fairly new Jenkins plugin available that does exactly what is desired here: it displays a dependent set of drop-downs, updating the dependent boxes when the selection in the main box changes.
For your job you just need to follow the following steps:
Install "Multiselect Parameter Plugin"
The "multiselect parameter plugin" can be installed from Jenkins plugin management, the documentation is available on its Jenkins Plugin page.
Add new parameter
Use "Multiselect Parameter" type
Set name to a sensible value
Give a short description
Configure it as shown below:
H,Main option,Dependent option
V,SELECTED_MAIN,SELECTED_DEPENDENT
C,A,Blue
C,A,Green
C,A,Yellow
C,B,Black
C,B,White
C,B,Grey
Use selected values
When you run "build with parameters" in your job, the following boxes are displayed for selection:
In your build script you can simply use the configured environment variables SELECTED_MAIN and SELECTED_DEPENDENT, which contain the selected values from both select boxes.
I want to add a new mandatory job property to capture some custom fields in the Jenkins job. I searched the plugin list but couldn't find any relevant plugin that solves the issue. Is there any plugin to solve this? (Note: the Extra Columns plugin doesn't solve my use case.)
A freestyle job can be configured to build with parameters. See: https://wiki.jenkins.io/display/JENKINS/Parameterized+Build
You can configure the parameter type (string, boolean, drop down etc), give a description of the parameter and a default value. The string parameters can include validation rules:
https://wiki.jenkins.io/display/JENKINS/Validating+String+Parameter+Plugin
Note that this only warns when the current parameter value does not meet the regex validation rule; it doesn't prevent the build from being submitted. If submitted in this state, however, the build will fail.
From a quick Google search, it appears this doesn't work for pipeline jobs. See the last comment on the plugin page above, from Miguelángel Fernández:
If you look at the implementation of class ValidatingStringParameterValue you'll see that it overrides the implementation of public BuildWrapper createBuildWrapper(AbstractBuild build) in a way that aborts if the string is invalid. This will only work on Freestyle jobs and other job types extending AbstractBuild. I'm afraid this does not apply to pipeline jobs. Maybe in your prior project you used freestyle jobs.
An alternative for freestyle jobs is to do in-job validation before initiating any build steps, using the 'Prepare an environment for the run' option from:
https://wiki.jenkins.io/display/JENKINS/EnvInject+Plugin
You would need to write groovy to check the parameters submitted and abort the build at this point if the values aren't suitable. Something like:
// Read the submitted parameter and abort the build if it fails validation.
def validateString = binding.variables.get('testParam')
if (!validateString?.matches('\\d+')) {
    println "failure of parameter validation - does not match regex"
    throw new InterruptedException()
} else {
    println "Validation passed, carry on with build"
}
This doesn't work on pipeline builds, as the plugin page states:
'This plugin has some known limitations. For example, Pipeline Plugin is not fully supported.'
But if you are using scripted pipelines you can implement something similar:
stage('start up') {
    if (!env.testParam.matches('\\d+')) {
        error 'failure of parameter validation - does not match regex'
    }
}
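For a declarative pipeline, a roughly equivalent sketch would be the following (the parameter name is taken from the snippet above; the rest is illustrative):

pipeline {
    agent any
    parameters {
        string(name: 'testParam', defaultValue: '')
    }
    stages {
        stage('start up') {
            steps {
                script {
                    // Abort early if the parameter is not purely numeric.
                    if (!params.testParam.matches('\\d+')) {
                        error 'failure of parameter validation - does not match regex'
                    }
                }
            }
        }
    }
}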
I have a parameterized project with the variable VAR1.
I'm using the Xray for JIRA Jenkins plugin. There you can fill in four parameters:
JIRA Instance
Issues
Filter
File Path
I'm new to Jenkins, but from what I have learned so far you can't fill these fields with environment variables. Something like
Issues: ${VAR1} - doesn't work.
So I thought I could do this with a pipeline. When I click on Pipeline Syntax and choose step: General Build Step, I can choose Xray: Cucumber Features Export Task. Then I fill in the fields with my environment variable and click Generate Pipeline Script. The output is as follows:
step <object of type com.xpandit.plugins.xrayjenkins.task.XrayExportBuilder>
That doesn't work. What am I doing wrong?
Everything you're doing is OK, but what you want is not supported by Jenkins, whether it is a pipeline or not, since the parameters are loaded before the pipeline runs and therefore before ${VAR1} is defined.
You can try to overcome this by defining the 'Issues' value as a pipeline-internal value instead of a parameter, and basing it on the ${VAR1} value.
If it must be a parameter, use two jobs: one defines the value of 'Issues' based on ${VAR1} and passes it to the other job, which receives 'Issues' as a regular parameter.
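A minimal sketch of the two-job variant, as seen from the first job (the downstream job name, its ISSUES parameter, and the way the issue key is derived from ${VAR1} are all made up for illustration):

// First job (in a script block or scripted pipeline): derive the issue key
// from VAR1 and hand it to the job that runs the Xray export, which declares
// ISSUES as a normal string parameter and feeds it to the plugin's Issues field.
def issues = "PROJ-${params.VAR1}"   // hypothetical mapping from VAR1 to a JIRA issue key
build job: 'Path/To/XrayExportJob',
    parameters: [string(name: 'ISSUES', value: issues)]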