Please read the comments to understand the problem.
job('buildV2PerfTest') {
    displayName('Performance Test')
    steps {
        // I load a value into a properties file using a shell command; the curl call
        // asks the EC2 instance metadata service for the public hostname.
        // The name of the variable is callbackUrl.
        shell('echo "callbackUrl=http://`curl http://169.254.169.254/latest/meta-data/public-hostname`:8080" > env.properties')
        // Then I add the properties file to the Jenkins environment
        environmentVariables {
            propertiesFile('env.properties')
        }
        maven {
            goals('-P performance test')
            // I want to pass the loaded property to Maven here
            property('callbackUrl', "${callbackUrl}")
        }
    }
}
The problem is that when the seed job processes this script, it says that the property does not exist. Indeed, it will only exist once I trigger the job. I want to know how to reference dynamic properties.
P.S.
The documentation explains how to load the variables, but not how to access them.
The solution was:
property('callbackUrl', "\$callbackUrl")
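The escaping matters because the DSL script itself is Groovy: "${callbackUrl}" is interpolated while the seed job runs, when the variable does not exist yet, whereas "\$callbackUrl" writes the literal token $callbackUrl into the generated job config, so Jenkins expands it at build time from the injected environment. A minimal sketch of the corrected maven block from the job above:
maven {
    goals('-P performance test')
    // A single-quoted string behaves the same, since single-quoted
    // Groovy strings are never interpolated by the seed job:
    property('callbackUrl', '$callbackUrl')
}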
I'm new to Jenkins and inherited a bunch of declarative pipelines of unknown code quality. Each pipeline uses folder properties to set shared default parameter values. This puts essential variables outside of source control, which kills our PR process and our history for debugging. For example:
// pipelineA/Jenkinsfile
pipeline {
    parameters {
        string name: 'important_variable', defaultValue: folderProperty('important_variable')
    }
    // etc.
}
// pipelineB/Jenkinsfile
pipeline {
    parameters {
        string name: 'important_variable', defaultValue: folderProperty('important_variable')
    }
    // etc.
}
Then, in the root folder, a property important_variable is set to "Hello World".
Is there a way to get this into source control, either by setting the folder property to extract the variable from a YAML file, or by using shared libraries?
Thank you for any help!
In case anyone reads this, here is what we ended up doing:
Create a bootstrap.groovy file.
This file MUST go in a /vars directory at the absolute top of your repo.
Using the Jenkins UI, we went to the pipeline's parent directory > Configure and created a shared library called config-lib that points at our repo; the bootstrap.groovy methods are exposed automatically as long as the file is in the right place.
The bootstrap.groovy file has a call method that returns a map with key-value pairs for our default parameters. This method has to be named call (a sketch of the file follows the example below).
In the Jenkinsfile for the pipeline we include the following two lines:
@Library("config-lib") _
config = bootstrap()
The @Library annotation (note the trailing _) imports the config-lib methods defined in the Jenkins UI.
The bootstrap() call invokes the call method from the bootstrap.groovy file in the config-lib library.
In the Jenkinsfile, use the config map to populate the parameter default values:
pipeline {
    parameters {
        string name: 'foo', defaultValue: config.foo
    }
    // etc.
}
And it's done.
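For reference, here is a minimal sketch of what the bootstrap.groovy in /vars might contain; the keys and values are illustrative, not the ones we actually use:
// vars/bootstrap.groovy
// Shared-library global variable; Jenkins exposes it to pipelines as bootstrap()
def call() {
    // Return the shared default parameter values as a map so each
    // Jenkinsfile can do: config = bootstrap()
    return [
        important_variable: 'Hello World',
        foo: 'some-default'
    ]
}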
This video helped immensely: https://youtu.be/Wj-weFEsTb0
I'm creating a Build Monitor view with a DSL script, but there is no method in the API to set the job order. I can set the order manually in the configuration after the view is created, but I need to do it within the script.
I'm using https://jenkinsci.github.io/job-dsl-plugin/#path/buildMonitorView as a reference. The only way I suspect it could be possible is the configure(Closure) method, but then I would still have the same question of how to do it.
My current code:
buildMonitorView("name-of-the-view") {
    jobs {
        regex("some regex to include jobs")
        recurse()
    }
    // I would expect something like:
    view {
        orderByFullName()
    }
}
After some trial and error and println calls everywhere, I came to this solution:
buildMonitorView("name-of-the-view") {
    jobs { // This part is as before
        regex("some regex to include jobs")
        recurse()
    }
    // The solution:
    view.remove(view / order)
    view / order(class: "com.smartcodeltd.jenkinsci.plugins.buildmonitor.order.ByFullName")
}
The above solution sets the job order to "Full name" instead of the default "Name".
I found the remove idea in the Configure SVN section of the job-dsl-plugin docs; the fully qualified names of the job order options can be found in the source of jenkins-build-monitor-plugin.
I had the same question today and managed to get Aivaras's proposal to work in the following way:
buildMonitorView("name-of-the-view") {
    // Set properties like jobs
    jobs {
        regex("some regex to include jobs")
        recurse()
    }
    // Directly manipulate the config to set the ordering
    configure { view ->
        view.remove(view / order)
        view / order(class: "com.smartcodeltd.jenkinsci.plugins.buildmonitor.order.ByFullName")
    }
}
I am trying to implement an Active Choices reactive parameter.
In the reactive parameter I am basically hard-coding the different options based on the active parameter.
Below is the sample code:
if (Target_Environment.equals("Dev01")) {
    return ["test_DEV"]
} else if (Target_Environment.equals("Dev02")) {
    return ["test3_DEV02", "test2_DEV02"]
} else if (Target_Environment.equals("Dev03")) {
    return ["test3_DEV03"]
} else if (Target_Environment.equals("Sit03")) {
    return ["test3_SIT03"]
} else if (Target_Environment.equals("PPTE")) {
    return ["test3_PPTE"]
} else {
    return ["Please Select Target Environment"]
}
Instead of hard-coding the choices, I want to read the content from a file in the Jenkins workspace and show it as the choices. What would be an ideal way to do that?
readFile does not work inside the return block.
I am also trying the Extended Choice Parameter, where I can pass a property file by filename, but how can I choose the property file with an if/else condition?
I'm not 100% sure this is accurate, but I would expect that you can't read from the job workspace in a parameter like that, because workspaces are only created once a build runs (see the error message below).
So if you create the job with no previous builds, there will be no workspace file to read from.
Whenever I have seen that parameter type used in the past, it usually reads from a file on the server instead of the workspace; a sketch follows the error message.
Error message from Jenkins jobs that have no previous builds:
Error: no workspace
A project won't have any workspace until at least one build is performed.
Run a build to have Jenkins create a workspace.
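Building on that, here is a minimal sketch of a reactive parameter script that reads its choices from per-environment files on the Jenkins controller. The directory path and one-file-per-environment layout are assumptions; adjust them to wherever the files actually live, and note the script may need script approval if the Groovy sandbox is enabled:
// Active Choices reactive script; Target_Environment is the referenced parameter
def choicesFile = new File("/var/lib/jenkins/choices/${Target_Environment}.txt")
if (choicesFile.exists()) {
    // One choice per line in the file
    return choicesFile.readLines()
} else {
    return ["Please Select Target Environment"]
}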
I am creating a Jenkins pipeline, and I want a certain stage to be triggered only when a particular log file's last modified date is updated after the initiation of the pipeline job (the log file is located on the server node where all the stages run). I understand we need to use a when condition, but I'm not really sure how to implement it.
I tried referring to some pipeline-related portals but could not find an answer.
Can someone please help me through this?
Thanks in advance!
Getting data about a file is quite tricky in a Jenkins pipeline when using the Groovy sandbox, since you're not allowed to call new File(...).lastModified. However, there is the findFiles step, which returns a list of wrapped File objects with a getter for the last modified time in millis, so we can use findFiles(glob: "...")[0].lastModified.
The returned array may be empty, so we should check for that first (see the full example below).
The current build's start time in millis is accessible via currentBuild.startTimeInMillis.
Now that we have both, we can use them in an expression:
pipeline {
    agent any
    stages {
        stage("create file") {
            steps {
                touch "testfile.log"
            }
        }
        stage("when file") {
            when {
                expression {
                    def files = findFiles(glob: "testfile.log")
                    // Run only if the file exists and was modified after the build started
                    files && files[0].lastModified > currentBuild.startTimeInMillis
                }
            }
            steps {
                echo "i ran"
            }
        }
    }
}
I am trying to add a call to my Jenkins Job DSL that will configure the job to give another build permission to copy its artifacts. However, I am unable to find a command for it in the Jenkins Job DSL API:
https://jenkinsci.github.io/job-dsl-plugin/
The option I am trying to set is the "Permission to Copy Artifact" checkbox in the job configuration.
Does this command exist? Is there any way to set up my Groovy to do this if it doesn't?
There is no built-in DSL to set that permission, but you can use the Dynamic DSL.
The Job DSL API viewer can be opened at http://localhost:8080/plugin/job-dsl/api-viewer/index.html, where localhost:8080 is your Jenkins host. Search for copyArtifactPermission:
job('example') {
    properties {
        copyArtifactPermissionProperty {
            projectNames('one, two')
        }
    }
}
Is it this one?
job('example') {
    steps {
        copyArtifacts('upstream') {
            includePatterns('*.xml', '*.properties')
            excludePatterns('test.xml', 'test.properties')
            targetDirectory('files')
            flatten()
            optional()
            buildSelector {
                latestSuccessful(true)
            }
        }
    }
}
EDIT
It seems a workaround for this was posted in the job-dsl Google group:
configure { project ->
    project / 'properties' / 'hudson.plugins.copyartifact.CopyArtifactPermissionProperty' / 'projectNameList' {
        'string' "*-foo"
    }
}
I think they may have changed the interface, though, and you may need to provide explicit job names now, but I don't have the plugin installed so I can't check.
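If the interface does now require explicit job names, the same configure block would presumably just list them; this is an untested sketch with hypothetical job names:
configure { project ->
    project / 'properties' / 'hudson.plugins.copyartifact.CopyArtifactPermissionProperty' / 'projectNameList' {
        // One string element per job that is allowed to copy artifacts
        'string' 'downstream-job-one'
        'string' 'downstream-job-two'
    }
}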