How to use dockerhub-notification-plugin in Jenkins scripted pipeline?

I want to trigger a pipeline when a new image is pushed to docker hub.
I installed dockerhub-notification-plugin.
If I use the web UI, it is possible to specify the Docker Hub repo.
I tried to use the Pipeline Snippet Generator, but it does not work correctly: if I specify a repo, it is ignored in the generated code.
For example, the snippet generator produces this code:
properties([pipelineTriggers([[$class: 'DockerHubTrigger', options: []]])])
As you can see there is no docker hub repo specified in the generated code.

The correct way to do this is to write your properties as below:
properties([
    pipelineTriggers([[$class: 'DockerHubTrigger', options: [[$class: 'TriggerOnSpecifiedImageNames', repoNames: ["YOUR_REPO_NAME"].toSet()]]]])
])
First, notice the additional brackets around the options value. This is due to the way Groovy scripts are evaluated in Jenkins.
But why a Set?
According to the javadoc, the TriggerOnSpecifiedImageNames class has three constructors: one with no parameters, one taking a varargs of strings, and one taking a collection. But Groovy uses reflection to instantiate this class, which means the default constructor is called and the respective properties are set afterwards. And this brings us to toSet(), because as you can see in the javadoc, the setter for the repoNames property has the following signature: setRepoNames(Set<String> repoNames).
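To see the mechanism the answer describes, here is a minimal standalone Groovy sketch (the class and property names are illustrative, not from the plugin) of how Groovy's map-style instantiation calls the no-arg constructor and then the matching setter, which is why the value must already be a Set:

class Example {
    Set<String> repoNames   // Groovy generates setRepoNames(Set<String>)
}

// map-style instantiation: new Example() is called first,
// then setRepoNames(...) is applied with the given value
def e = new Example(repoNames: ['user/repo'].toSet())
assert e.repoNames == ['user/repo'] as Set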

Related

Dynamically evaluate default in Jenkins pipeline build parameter

In a Jenkins declarative pipeline, we can define build parameters like this:
pipeline {
    …
    parameters {
        string(name: 'PARAMETER', defaultValue: 'INITIAL_DEFAULT')
        choice(name: 'CHOICE', choices: ['THIS', 'THAT'])
    }
    …
}
However, the job's parameter definitions are only updated when the job runs, after the build-parameters dialog has already been shown. That is, when I change INITIAL_DEFAULT to something else, the next build will still default to INITIAL_DEFAULT, and only the one after that will use the new value.
The same problem exists with the choices, and there it is even more serious: a string default can easily be overwritten when starting the build, but if a new option isn't in the list, it cannot be selected at all.
So is there a way to define functions or expressions that are executed before the parameter dialog, to calculate the current values (from files, variables in global settings, or any other suitable external configuration)?
I remember using some plugins for this in the past with free-style jobs, but searching the plugin repository I can't find any that mention how to use them with pipelines.
I don't care too much that the same problem applies to adding and removing parameters, because that occurs rarely. But we have some parameters whose defaults change often, and we need the next nightly to pick up the updated value.
It turns out the extended-choice-parameter plugin does work with pipelines, and the configuration can be generated by the Directive Generator. It looks something like this:
extendedChoice(
    name: 'PARAMETER',
    type: 'PT_TEXTBOX',
    defaultPropertyFile: '/var/lib/jenkins/something.properties',
    defaultPropertyKey: 'parameter'
)
(there are many more options available in the generator)
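For context, here is a minimal declarative sketch of this parameter in a full pipeline (assuming the Extended Choice Parameter plugin is installed; the property file path and key are the placeholders from above):

pipeline {
    agent any
    parameters {
        extendedChoice(
            name: 'PARAMETER',
            type: 'PT_TEXTBOX',
            defaultPropertyFile: '/var/lib/jenkins/something.properties',
            defaultPropertyKey: 'parameter'
        )
    }
    stages {
        stage('Show') {
            steps {
                // the default is read from the properties file when the
                // build-parameters dialog is rendered, so edits to the
                // file take effect without an intermediate build
                echo "PARAMETER = ${params.PARAMETER}"
            }
        }
    }
}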
A Groovy script to get global environment variables can be found in this other answer.

How to add to existing properties in groovy script?

I have a DSL file in which properties such as the log rotator and parameters are defined for a Jenkins job. However, I want to add a property to the Jenkins job in a Groovy script. I do this by putting
properties([pipelineTriggers([githubPush()])])
in the corresponding Groovy script. However, this overwrites all other parameters and properties defined in the DSL script.
What I have right now is putting all the properties from the DSL script in the Groovy script as well, but that means there are two different places where developers need to change the properties. Is there a way in Groovy to simply add a new property instead of overwriting the old ones?
Something like
properties.add([pipelineTriggers([githubPush()])])
would be very helpful.
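A hedged sketch of one possible workaround (not from the original thread): since the properties() step always replaces the job's entire property set, build the full list in one place and append to it before a single call. The log rotator and parameter below are illustrative:

// shared list of job properties, defined once
def jobProps = [
    buildDiscarder(logRotator(numToKeepStr: '10')),
    parameters([string(name: 'FOO', defaultValue: 'bar')])
]
// append the new trigger instead of calling properties() a second time
jobProps << pipelineTriggers([githubPush()])
properties(jobProps)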

How can I override a Jenkinsfile's default parameters?

Sometimes we want to create multiple jobs that use the same Jenkinsfile instead of a single one. This could happen, for example, because we want to keep the logs divided by parameter, instead of having a single job in which to look for the right log.
However, in this case we can't use the parameter definition in the Jenkinsfile, because whatever default value we define on the job instance is overwritten by the next execution with whatever is defined in the Jenkinsfile (and this also happens if we don't define a default value).
So in this situation the only way we have figured out is to remove the parameter definition from the Jenkinsfile and define the parameters directly on the jobs, which is not optimal.
I mean, I agree that this is the right behavior in most cases, as you don't want your parameters to be out of sync and unversioned, but is there a way to tell Jenkins to skip the parameter reconfiguration or to override the default parameter written in the Jenkinsfile? Something that can be activated/deactivated job by job.
Had this problem myself, we solved it like this:
string(name: 'parameterName', defaultValue: params.parameterName ?:'your default value')
Now the default values defined through Jenkins job configuration will not be overridden.
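In context, a minimal declarative sketch of the pattern (the parameter name and fallback value are placeholders):

pipeline {
    agent any
    parameters {
        // on the very first run params.parameterName is null, so the
        // ?: fallback supplies the default; on later runs the value
        // already configured on the job is kept
        string(name: 'parameterName',
               defaultValue: params.parameterName ?: 'your default value')
    }
    stages {
        stage('Show') {
            steps {
                echo "parameterName = ${params.parameterName}"
            }
        }
    }
}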

Jenkins Addon in Jenkins Pipeline

I have a parameterized project with the variable VAR1.
I'm using the Xray for JIRA plugin for Jenkins. There you can fill in four parameters:
JIRA Instance
Issues
Filter
File Path
I'm new to Jenkins, but what I have learned so far is that you can't fill these fields with environment variables. Something like Issues: ${VAR1} doesn't work.
So I thought I could do this with a pipeline. When I click on Pipeline Syntax and choose step: General Build Step, I can choose Xray: Cucumber Features Export Task. Then I fill in the fields with my environment variable and click Generate Pipeline Script. The output is as follows:
step <object of type com.xpandit.plugins.xrayjenkins.task.XrayExportBuilder>
That doesn't work. What am I doing wrong?
Everything you're doing is fine, but what you want is not supported by Jenkins, pipeline or not, since the parameters are loaded before the pipeline flow runs and before ${VAR1} is defined.
You can try to overcome this by defining the 'Issues' value as a pipeline-internal value instead of a parameter and basing it on the ${VAR1} value.
If it must be a parameter, use two jobs: one derives the value of 'Issues' from ${VAR1} and passes it to the other job, which receives 'Issues' as a fixed value.
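A hedged sketch of that two-job setup (the job name, parameter name, and derivation are all illustrative):

// upstream job: derive the Issues value and pass it downstream
node {
    def issues = "${env.VAR1}"   // compute 'Issues' from VAR1 here
    build job: 'xray-export-job',
          parameters: [string(name: 'ISSUES', value: issues)]
}

The downstream job then receives ISSUES as an ordinary build parameter.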

Can a workflow step access environment variables provided by an EnvironmentContributingAction?

A custom plugin we wrote for an older version of Jenkins uses an EnvironmentContributingAction to provide environment variables to the execution, so they can be used in later build steps and passed as parameters to downstream jobs.
While attempting to convert our build to workflow, I'm having trouble accessing these variables:
node {
    // this step queries an API and puts the results in
    // environment variables called FE1|BE1_INTERNAL_ADDRESS
    step([$class: 'SomeClass', parameter: foo])
    // this ends up echoing 'null and null'
    echo "${env.FE1_INTERNAL_ADDRESS} and ${env.BE1_INTERNAL_ADDRESS}"
}
Is there a way to access the environment variable that was injected? Do I have to convert this functionality to a build wrapper instead?
EnvironmentContributingAction is currently limited to AbstractBuilds, which WorkflowRuns are not, so pending JENKINS-29537, which I just filed, your plugin would need to be modified somehow. Options include:
Have the builder add a plain Action instead, then register an EnvironmentContributor whose buildEnvironmentFor(Run, …) checks for its presence using Run.getAction(Class).
Switch to a SimpleBuildWrapper which defines the environment variables within a scope, then invoke it from Workflow using the wrap step (see the sketch after this list).
Depend on workflow-step-api and define a custom Workflow Step with comparable functionality but directly returning a List<String> or whatever makes sense in your context. (code sample)
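For the second option, a hedged sketch of invoking such a wrapper from a scripted pipeline via the generic wrap meta-step (MyWrapper is an illustrative class name for the converted wrapper):

node {
    // the wrapper contributes its environment variables inside this block
    wrap([$class: 'MyWrapper']) {
        echo "${env.FE1_INTERNAL_ADDRESS} and ${env.BE1_INTERNAL_ADDRESS}"
    }
}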
Since PR-2975 was merged, you can use the new interface:
void buildEnvVars(@Nonnull Run<?, ?> run, @Nonnull EnvVars env, @CheckForNull Node node)
It will be used by the old type of builds as well.
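A hedged sketch of an action overriding that method, written in Groovy (the imports, the placement of the method on EnvironmentContributingAction, and the injected values are our assumptions for illustration):

import hudson.EnvVars
import hudson.model.EnvironmentContributingAction
import hudson.model.InvisibleAction
import hudson.model.Node
import hudson.model.Run

class AddressAction extends InvisibleAction implements EnvironmentContributingAction {
    // called when the environment is built for any Run type,
    // not only for AbstractBuild as with the old buildEnvVars overload
    void buildEnvVars(Run<?, ?> run, EnvVars env, Node node) {
        env.put('FE1_INTERNAL_ADDRESS', '10.0.0.1')   // placeholder values
        env.put('BE1_INTERNAL_ADDRESS', '10.0.0.2')
    }
}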
