How to execute a Groovy shared library with optional parameters - Jenkins

I'm developing a shared library to call from my Jenkinsfile. The library has a function with optional parameters, and I want to be able to call that function with any subset of those parameters, specifying my own values. I've been googling a lot but couldn't find a good answer, so maybe somebody here can help me.
Example:
The function looks this way:
def doRequest(def moduleName=env.MODULE_NAME, def environment=env.ENVIRONMENT, def repoName=env.REPO_NAME) {
    <some code goes here>
}
If I execute it from my Jenkinsfile this way:
script {
    sendDeploymentStatistics.doRequest service_name
}
the function assigns the "service_name" value to moduleName, but how do I specify the "repoName" parameter?
In Python you would do it something like:
function_name(moduleName=service_name, repoName=repo_name)
but in Groovy + Jenkinsfile I can't find the right way.
Can anybody please help me to find out the right syntax?
Thank you!

Groovy has the concept of Default Parameters. If you change the order of the parameters, such that the environment comes last:
def doRequest(def moduleName=env.MODULE_NAME, def repoName=env.REPO_NAME, def environment=env.ENVIRONMENT) {
    <some code goes here>
}
Then a call that passes only the first two arguments positionally will use the default value for environment:
doRequest(service_name, repo_name)
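As a plain-Groovy illustration of how positional calls interact with default parameters (hypothetical names, unrelated to the Jenkins step):

// trailing parameters keep their defaults when you omit them
def greet(String name = 'world', String punctuation = '!') {
    return "Hello, ${name}${punctuation}"
}

assert greet() == 'Hello, world!'
assert greet('Jenkins') == 'Hello, Jenkins!'        // punctuation keeps its default
assert greet('Jenkins', '?') == 'Hello, Jenkins?'   // both defaults overridden

Note that you can only omit arguments from the end of the list, which is why the parameters are reordered so that the one you rarely set comes last.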
Groovy, however, also has some support for Named Parameters. It is not as convenient as Python's, but you can get it to work as follows:
env = [MODULE_NAME: 'foo', ENVIRONMENT: 'bar', REPO_NAME: 'baz']

def doRequest(Map args = [:]) {
    defaultMap = [moduleName: env.MODULE_NAME, environment: env.ENVIRONMENT, repoName: env.REPO_NAME]
    args = defaultMap << args
    return "${args.moduleName} ${args.environment} ${args.repoName}"
}

assert 'foo bar baz' == doRequest()
assert 'foo bar qux' == doRequest(repoName: 'qux')
assert '1 2 3' == doRequest(repoName: '3', moduleName: '1', environment: '2')
For named parameters you need a parameter of type Map (with a default value of the empty map). Groovy then collects the named arguments at the call site into entries of that Map.
To get default values, create a map with the defaults and merge that defaultMap with the passed-in arguments.
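Applied to the shared library from the question, a minimal sketch could look like this (assuming the step lives in vars/sendDeploymentStatistics.groovy and echo stands in for the real request logic):

// vars/sendDeploymentStatistics.groovy
def doRequest(Map args = [:]) {
    // anything the caller does not pass falls back to the environment variables
    def defaults = [moduleName : env.MODULE_NAME,
                    environment: env.ENVIRONMENT,
                    repoName   : env.REPO_NAME]
    args = defaults << args
    echo "module=${args.moduleName} environment=${args.environment} repo=${args.repoName}"
}

The Jenkinsfile can then override only the parameters it cares about:

script {
    sendDeploymentStatistics.doRequest(moduleName: service_name, repoName: repo_name)
}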

Related

Kedro - how to pass nested parameters directly to node

kedro recommends storing parameters in conf/base/parameters.yml. Let's assume it looks like this:
step_size: 1
model_params:
  learning_rate: 0.01
  test_data_ratio: 0.2
  num_train_steps: 10000
And now imagine I have some data_engineering pipeline whose nodes.py has a function that looks something like this:
def some_pipeline_step(num_train_steps):
    """
    Takes the parameter `num_train_steps` as argument.
    """
    pass
How would I go about passing that nested parameter straight to this function in data_engineering/pipeline.py? I unsuccessfully tried:
from kedro.pipeline import Pipeline, node
from .nodes import some_pipeline_step

def create_pipeline(**kwargs):
    return Pipeline(
        [
            node(
                some_pipeline_step,
                ["params:model_params.num_train_steps"],
                dict(
                    train_x="train_x",
                    train_y="train_y",
                ),
            )
        ]
    )
I know that I could just pass all parameters into the function using ['parameters'], or pass all of model_params with ['params:model_params'], but that seems inelegant and I feel like there must be a better way. Would appreciate any input!
(Disclaimer: I'm part of the Kedro team)
Thank you for your question. The current version of Kedro, unfortunately, does not support nested parameters. The interim solution is to use top-level keys inside the node (as you already pointed out) or to decorate your node function with some sort of parameter filter, which is not elegant either.
Probably the most viable solution is to customise your ProjectContext class (in src/<package_name>/run.py) by overriding the _get_feed_dict method as follows:
class ProjectContext(KedroContext):
    # ...
    def _get_feed_dict(self) -> Dict[str, Any]:
        """Get parameters and return the feed dictionary."""
        params = self.params
        feed_dict = {"parameters": params}

        def _add_param_to_feed_dict(param_name, param_value):
            """This recursively adds parameter paths to the `feed_dict`,
            whenever `param_value` is a dictionary itself, so that users can
            specify specific nested parameters in their node inputs.

            Example:
                >>> param_name = "a"
                >>> param_value = {"b": 1}
                >>> _add_param_to_feed_dict(param_name, param_value)
                >>> assert feed_dict["params:a"] == {"b": 1}
                >>> assert feed_dict["params:a.b"] == 1
            """
            key = "params:{}".format(param_name)
            feed_dict[key] = param_value

            if isinstance(param_value, dict):
                for key, val in param_value.items():
                    _add_param_to_feed_dict("{}.{}".format(param_name, key), val)

        for param_name, param_value in params.items():
            _add_param_to_feed_dict(param_name, param_value)

        return feed_dict
Please also note that this issue has already been addressed on develop and will become available in the next release. The fix uses the approach from the snippet above.
As mentioned by Dmitry, kedro 0.16.0 introduced nested parameter values inside node inputs, which can be accessed via the . operator:
node(func, "params:a.b", None)
whereas kedro 0.17.6 enabled overriding nested parameters with --params in the CLI, e.g.
kedro run --params="model.model_tuning.booster:gbtree"

I have a Jenkins global variable in a string - how do I evaluate it?

I need to accept all kinds of global Jenkins variables as strings (basically as parameters to an Ansible-like system - a template stored in \vars).
def proof = "\"${params.REPOSITORY_NAME}\""
echo proof
def before = "\"\${params.REPOSITORY_NAME}\""
echo before
def after = Eval.me(before)
echo after
The result is:
[Pipeline] echo
"asfd"
[Pipeline] echo
"${params.REPOSITORY_NAME}"
groovy.lang.MissingPropertyException: No such property: params for class: Script1
The first echo proves that the param value actually exists.
The second echo shows what the input actually looks like.
The third echo should have emitted asfd; instead I get the exception.
Any ideas? I'm hours into this :-(
You may want to check:
groovy: Have a field name, need to set value and don't want to use switch
1st Variant
In case you have: xyz="REPOSITORY_NAME" and want the value of the parameter REPOSITORY_NAME you can simply use:
def xyz = "REPOSITORY_NAME"
echo params."$xyz" // will print the value of params.REPOSITORY_NAME
In case your variable xyz must hold the full string including the params. prefix, you could use the following solution:
@NonCPS
def split(string) {
    string.split(/\./)
}

def xyz = "params.REPOSITORY_NAME"
def splitString = split(xyz)
echo this."${splitString[0]}"."${splitString[1]}" // will print the value of params.REPOSITORY_NAME
2nd Variant
In case you want to pass an environment variable's name as a parameter, you can use:
env."${params.REPOSITORY_NAME}"
In plain Groovy, env[params.REPOSITORY_NAME] would work, but in a pipeline it will not work inside the sandbox.
That way you first retrieve the value of REPOSITORY_NAME and then use it as the key to an environment variable.
Using env.REPOSITORY_NAME directly would not be the same, as it would try to use REPOSITORY_NAME itself as the key.
E.g. say you have a job named MyJob with the following script:
assert(params.MyParameter == "JOB_NAME")
echo env."${params.MyParameter}"
assert(env."${params.MyParameter}" == 'MyJob')
This will print the name of the job (MyJob) to the console assuming you did set the MyParameter parameter to JOB_NAME. Both asserts will pass.
Please don’t forget to open a node{} block first in case you want to retrieve the environment of that very node.
After trying all those solutions, I found out that this works for my problem (which sounds VERY similar to the question asked - not exactly sure though):
${env[REPOSITORY_NAME]}

Jenkins pipeline script for passing variables from one function to another function

I am new to Jenkins pipeline scripting. I am developing a Jenkins pipeline in which the logic looks like this:
node {
    a = xyz
    b = abc
    // defined some global variables
    stage('verify') {
        verify("${a}", "${b}")
        abc("${a}", "${b}")
        echo "changed values of a and b are ${a} ${b}"
    }
}

def verify(String a, String b) {
    // Some logic where the initial values of a and b get changed at the end of this function
}

def abc(String a, String b) {
    // I need to get the changed values from the verify function and manipulate them in this function
}
I need to pass the initial a and b values (there are multiple) to the verify function, pass the changed values on to the other function, manipulate them there, and then have the pipeline stage echo the changed values. How can I accomplish all this?
Ok, here's what I meant:
def String verify_a(String a) { /* stuff */ }
def String verify_b(String b) { /* stuff */ }
node {
    String a = 'xyz'
    String b = 'abc'
    stage('verify') {
        a = verify_a(a)
        b = verify_b(b)
        echo "changed values of a and b are $a $b"
    }
    stage('next stage') {
        echo "a and b retain their changed values: $a $b"
    }
}
The easiest way I have found to pass variables between stages is to just use Environment Variables. The one - admittedly major - restriction is that they can only be Strings. But I haven't found that to be a huge issue, especially with liberal use of the toBoolean() and toInteger() functions. If you need to be passing maps or more complex objects between stages, you might need to build something with external scripts or writing things to temporary files (make sure to stash what you need if there's a chance you'll switch agents). But env vars have served me well for almost all cases.
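For illustration, a minimal sketch of the environment-variable approach (hypothetical variable names; env vars are Strings, so convert on the way back out):

node {
    stage('compute') {
        env.SHOULD_DEPLOY = 'true'   // stored as a String
        env.RETRY_COUNT = '3'
    }
    stage('use') {
        int retries = env.RETRY_COUNT.toInteger()
        if (env.SHOULD_DEPLOY.toBoolean()) {
            echo "deploying with ${retries} retries"
        }
    }
}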
This article is, as its title implies, the definitive guide on environment variables in Jenkins. You'll see a comment there from me that it's really helped me grok the intricacies of Jenkins env vars.

How to save a variable in an rspec expect?

I have something along the following lines in one of my spec files:
expect(my_instance).to receive(:my_function).with(arg: instance_of(String))
I want to be able to capture the actual value of arg in a variable I can use in the spec. Is there a way to do that? I checked the rspec docs but didn't find anything like that.
You could declare a variable, say captured_arg, before the expect (or allow, if you don't want the spec to fail when my_instance does not receive my_function). Then you can capture the argument in a block and set captured_arg within that block.
captured_arg = nil
expect(my_instance).to receive(:my_function) { |arg| captured_arg = arg }
Edit: (Keyword Arguments)
If you are using keyword arguments, just modify the script above slightly, using arg as the keyword argument you'd like to capture:
captured_arg = nil
expect(my_instance).to receive(:my_function) { |args| captured_arg = args[:arg] }

How to properly write an if statement for a boolean parameter in a Jenkins pipeline Jenkinsfile?

I'm setting up a Jenkins pipeline Jenkinsfile and I'd like to check whether a boolean parameter is set.
Here's the relevant portion of the file:
node ("master") {
stage 'Setup' (
[[$class: 'BooleanParameterValue', name: 'BUILD_SNAPSHOT', value: 'Boolean.valueOf(BUILD_SNAPSHOT)']],
As I understand it, that is the way to access the boolean parameter, but I'm not sure how to write the if statement itself.
I was thinking about doing something like:
if(BooleanParameterValue['BUILD_SNAPSHOT']){...
What is the correct way to write this statement please?
A boolean parameter is accessible to your pipeline script in 3 ways:
As a bare parameter, e.g: isFoo
From the env map, e.g: env.isFoo
From the params map, e.g: params.isFoo
If you access isFoo using 1) or 2) it will have a String value (of either "true" or "false").
If you access isFoo using 3) it will have a Boolean value.
So the least confusing way (IMO) to test the isFoo parameter in your script is like this:
if (params.isFoo) {
....
}
Alternatively you can test it like this:
if (isFoo.toBoolean()) {
....
}
or
if (env.isFoo.toBoolean()) {
....
}
The toBoolean() call is required to convert the "true" String into the boolean true and the "false" String into the boolean false.
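Putting the pieces together, a minimal scripted-pipeline sketch using the question's BUILD_SNAPSHOT parameter (the properties/parameters step is one common way to declare the parameter from the script itself; adjust to taste):

properties([
    parameters([
        booleanParam(name: 'BUILD_SNAPSHOT', defaultValue: false, description: 'Build a snapshot?')
    ])
])

node('master') {
    stage('Setup') {
        if (params.BUILD_SNAPSHOT) {   // params.* yields a real Boolean
            echo 'Building a snapshot'
        } else {
            echo 'Building a release'
        }
    }
}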
The answer is actually way simpler than that!
According to the pipeline documentation, if you define a boolean parameter isFoo you can access it in your Groovy script with just its name, so your script would actually look like:
node {
    stage 'Setup'
    echo "${isFoo}"  // usage inside a string
    if (isFoo) {     // very simple "if" usage
        echo "Param isFoo is true"
        ...
    }
}
And by the way, you probably shouldn't call your parameter BUILD_SNAPSHOT but rather buildSnapshot or isBuildSnapshot, because it is a parameter and not a constant.
Simply doing if(isFoo){...} will not guarantee it works :) To be safe, use if(isFoo.toString()=='true'){ ... }
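A plain-Groovy illustration of why the bare truthiness check can mislead when the value arrives as the String "false":

// Groovy truth: any non-empty String is truthy, including the String 'false'
String isFoo = 'false'
assert isFoo                          // passes, because the String is non-empty
assert !isFoo.toBoolean()             // toBoolean() parses the text correctly
assert !(isFoo.toString() == 'true')  // explicit comparison is also safe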
