Jenkins Job DSL Plugin - Reading in External File

I'm having trouble with the Jenkins Job DSL plugin. I can create the pipeline job via the plugin's casc (Configuration as Code) script, which is great. However, now I am trying to read in a CSV file to auto-populate some parameters for it. readFileFromWorkspace is failing, and readFile isn't working either. I'm calling it like this: def file = readFileFromWorkspace("${ENV_VAR}file.csv") (which resolves to /usr/share/jenkins/casc-configs/file.csv).
I verified the file is present on the Jenkins master at the above path (the master is a Docker container running on Kubernetes); I can shell into the container and see it there.
Does anyone have experience with something similar? The errors I get are: "No signature of method: script.readFile() is applicable for argument types: (org.codehaus.groovy.runtime.GStringImpl) values: [/file/location/here]" (and I verified the file is present at this path on the Jenkins master container).
The readFileFromWorkspace error is:
java.lang.NullPointerException at javaposse.jobdsl.plugin.JenkinsJobManagement.locateValidFileInWorkspace(JenkinsJobManagement.java:428)
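For reference, a minimal sketch of the kind of seed script I'm running (job, parameter and column names below are illustrative, not the real ones):

// Job DSL seed script - illustrative only
def csvText = readFileFromWorkspace("${ENV_VAR}file.csv")        // resolves to /usr/share/jenkins/casc-configs/file.csv
def choices = csvText.readLines().collect { it.split(',')[0] }   // first CSV column becomes the choice list

pipelineJob('example-pipeline') {
    parameters {
        choiceParam('TARGET', choices, 'Auto-populated from file.csv')
    }
    definition {
        cps {
            script(readFileFromWorkspace('Jenkinsfile'))
        }
    }
}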

Related

JMeter did not execute the proper # of users using Jenkins parameterization

I set up Jenkins to integrate the JMeter script. Using a Freestyle project in Jenkins, I enabled "This project is parameterized" to set String parameters for the Threads, Loop_Count and Think Time.
In the .jmx file, I used the Function Parameters function to define those variables, as shown in the image below:
[Image: User Defined Variables]
In Jenkins, I configured the parameterization as shown below:
[Image: Set Parameterization]
[Image: Command Line]
However, when running the test for 40 users via the build parameters in Jenkins, it looks like the number of threads/users being executed is not correct: only about 3 samples are generated for most of the pages. Only the homepage (the first page in the test) gets the correct number of samples; the rest of the URLs/pages do not. Below is the actual output:
[Image: Output]
Can you please help with what might be causing this issue? I already checked the JMeter script and the Jenkins config and they appear to be correct, but I'm still getting the issue. Thanks for the help.
Setting parameters in Jenkins itself is not sufficient to pass them to JMeter; you need to pass each parameterized value to the JMeter startup command via a -J command-line argument, for example:
-Jusers=%users%
More information: Apache JMeter Properties Customization Guide
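For illustration, assuming a Windows agent and Jenkins string parameters named users, Loop_Count and Think_Time (adjust the names to your setup), the JMeter invocation in an "Execute Windows batch command" build step could look like:

jmeter -n -t "%WORKSPACE%\test.jmx" -Jusers=%users% -Jloop_count=%Loop_Count% -Jthink_time=%Think_Time% -l results.jtl

Inside the .jmx, the Thread Group (or your variables) would then read the values back with the __P function, e.g. ${__P(users,1)}, where the second argument is the default used when the property is not set.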

Unable to determine what withEnv is doing in a Jenkinsfile

I have just started writing Jenkinsfiles. I was viewing the following two URLs to learn how to build a Java application, push it to Nexus and then invoke Ansible to deploy it:
Redhat Jenkinsfile description
Actual Jenkinsfile
In the second link, the following is mentioned several times, and I am unable to understand its function:
withEnv(["PATH+MAVEN=${tool 'm3'}/bin"])
What I can find on the net is that withEnv is used to create/override environment variables. But what is ${tool 'm3'}/bin doing? Normally the syntax of withEnv is VARIABLE_NAME=value/expression.
The ${} is substituting a command/variable into the GString. See the Groovy docs on String Interpolation.
From the looks of it, it is safe to assume that tool 'm3' returns the install path of the configured Maven installation, which then gets /bin appended.
So the end result would be
PATH+MAVEN=/my/path/to/m3/etc/bin
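For example, a minimal scripted-pipeline sketch (assuming a Maven installation named m3 is configured on the Jenkins controller):

node {
    // 'm3' must match the Maven tool name configured in Jenkins
    withEnv(["PATH+MAVEN=${tool 'm3'}/bin"]) {
        sh 'mvn -version'   // mvn now resolves via the prepended PATH entry
    }
}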
In addition to @metalisticpain's answer, there is some background configuration for the tool step on the Jenkins server itself that defines the installation paths to be used.
Let's say you have jdk-1.8.0 configured as a tool name on the Jenkins server; then it can be used in the Jenkinsfile, analogous to your example:
withEnv(["PATH+JDK=${tool 'jdk-1.8.0'}/bin"])
Taken from the documentation linked above:
The tool name must be pre-configured in Jenkins under Manage Jenkins → Global Tool Configuration.
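A minimal usage sketch, assuming a Unix agent and that jdk-1.8.0 is indeed the configured tool name:

node {
    withEnv(["PATH+JDK=${tool 'jdk-1.8.0'}/bin"]) {
        sh 'java -version'   // picks up the java binary from the configured JDK
    }
}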

Jenkins: Passing a variable between scripts, and accessing it in a post build action

I have a Jenkins job, with SCM from bitbucket, two shell scripts, and a post build action publishing the result to Slack.
Naively, I want to pass a variable computed in the first shell script to the second, add some information to that variable in the second shell script, and then append that variable to the Slack custom message.
I was expecting this to be a built-in feature, and have now spent a few days on and off on it. I've tried the EnvInject, Environment Injector, and Global Variable String Parameter plugins, but in every configuration I tried it didn't work.
In some cases I got this error:
21:01:08 [EnvInject] - [ERROR] - The given properties file path 'build.properties' doesn't exist.
I know this file does not exist; I expected the plugin to create it, so that I could add new content to it in the first shell script and have it loaded in every other step of the job.
Am I missing something or misusing these plugins?
As happens all too often, shortly after asking the question I was able to solve it, like this:
First we create a shell script to create the file, I've already added a value:
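(For illustration, with a hypothetical variable name, the Execute shell step could simply be:)

echo "SLACK_MSG=build started" > build.properties   # SLACK_MSG is an illustrative name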
Then we tell Jenkins to inject the variables from the build.properties file:
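(This is the EnvInject "Inject environment variables" build step, with its Properties File Path field set to build.properties.)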
Then we change the value of the variable in the file:
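(For example, a second Execute shell step that overwrites the value:)

echo "SLACK_MSG=build finished, tests passed" > build.properties   # same illustrative variable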
Then AGAIN we tell Jenkins to inject the variables from the same file:
Then we can observe the value changes in the next shell:
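(For example:)

echo "$SLACK_MSG"   # now prints the updated value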
Also in the post build action:
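(In the Slack notification's custom message field the variable can then be referenced, e.g. as $SLACK_MSG, assuming the plugin expands environment variables there.)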
And success:

Extending the Jenkins Groovy DSL

How can I add/edit new code to my Jenkins instance so that it is accessible in a DSL script? Context follows.
I've inherited a Jenkins instance. Part of this inheritance includes spending the night in a haunted house writing some new automation in Groovy via the Job DSL plugin. Since I'm fearful of ruining our Jenkins instance, my first step is setting up a local development instance.
I'm having trouble running one of our existing DSL Scripts on my local development instance -- my builds on the local server fail with the following in the Jenkins error console.
Processing DSL script jobs.groovy
ERROR: startup failed:
jobs.groovy: 1: unable to resolve class thecompanysname.jenkins.extensions
The script in question starts off like this.
import thecompanysname.jenkins.extensions
use(extensions) {
def org = 'project-name'
def project = 'test-jenkins-repo'
def _email = 'foo@example.com'
So, as near as I can tell, it seems like a predecessor has written some custom Groovy code that they're importing:
import thecompanysname.jenkins.extensions
What's not clear to me is
Where this code lives
How I can find it in our real Jenkins instance
How I can add it to my local instance
Specific answers are welcome, as are "here's how you can learn to fish" answers.
While there may be other ways to accomplish this, after a bit of poking around I discovered
The Jenkins instance I've installed has an older version of the Job DSL plugin installed.
This version of the Job DSL plugin allowed you to set an additional classpath in your Process DSL Builds job section that pointed to additional jar files.
These jar files can give you access to additional classes in your groovy scripts (i.e. thecompanysname.jenkins.extensions).
Unfortunately, more recent versions of the Job DSL plugin have removed this option, and it's not clear if it's possible to add it back. That, however, is another question.
Configure Global Security -> uncheck "Enable script security for Job DSL scripts".
Works for me.

How to re-use groovy script in Jenkins Groovy Post Build plugin?

I have some Groovy code which I am planning to re-use in the Jenkins Groovy Post Build plugin across multiple jobs. How can I achieve this? Is there a place I can store the script as a global variable and call it in the jobs wherever I need it?
You can load any Groovy file living on the Jenkins master from within the Groovy Postbuild step and execute it. For example, you could have a special directory on the C: drive where all the common scripts live. I'll update my answer later with some code that shows you how to load the script in.
Update
Assuming you have a test.groovy file on your C: drive, it should be as simple as the following in Groovy Postbuild:
evaluate(new File("C:\\test.groovy"))
Please view the comment section of the Groovy Postbuild for more examples and possibly other ways.
Here is the solution that worked for me:
Installed the Scriptler plugin for Jenkins and saved the Groovy script in it. Now the script is available in the JENKINS_HOME/scriptler/scripts directory. This way we can avoid the manual step of copying files to the Jenkins master.
Used the groovy file in Post build:
def env = manager.build.getEnvironment(manager.listener)
evaluate(new File(env['JENKINS_HOME'] + "\\scriptler\\scripts\\GroovyForPostBuild.groovy"))
This is a copy of my answer to this similar question on StackOverflow:
If you wish to have the Groovy script in your Code Repository, and loaded onto the Build / Test Slave in the workspace, then you need to be aware that Groovy Postbuild runs on the Master.
For us, the master is a Unix Server, while the Build/Test Slaves are Windows PCs on the local network. As a result, prior to using the script, we must open a channel from the master to the Slave, and use a FilePath to the file.
The following worked for us:
// Get an Instance of the Build object, and from there
// the channel from the Master to the Workspace
build = Thread.currentThread().executable
channel = build.workspace.channel;
// Open a FilePath to the script
fp = new FilePath(channel, build.workspace.toString() + "<relative path to the script in Unix notation>")
// Some have suggested that the "Not NULL" check is redundant
// I've kept it for completeness
if(fp != null)
{
// 'Evaluate' requires a string, so read the file contents to a String
script = fp.readToString();
// Execute the script
evaluate(script);
}
