Jenkins pipeline: groovy.lang.MissingMethodException: No signature of method

I have created a pipeline that takes an array of JSON objects and calls a shared library that iterates over them.
When I run the Jenkins job to test that the objects are forwarded correctly, I see the following error:
hudson.remoting.ProxyException: groovy.lang.MissingMethodException: No signature of method: imageBuild.call() is applicable for argument types: (java.lang.String) values: [[{"dockerConfig":"home/pipelines/conf/journeys/dockerbuild.yaml","aquaConfig":"home/pipelines/conf/journeys/aquascan.yaml","gtag": "v1.0.1","preFixName": "journey1"},{"dockerConfig":"home/pipelines/conf/if.com/dockerbuild.yaml","aquaConfig":"home/pipelines/conf/if.com/aquascan.yaml","gtag": "v2.0.2","preFixName": "journey2"},{"dockerConfig":"home/pipelines/conf/colleague/dockerbuild.yaml","aquaConfig":"home/pipelines/conf/colleague/aquascan.yaml","gtag": "v3.0.3","preFixName": "journey2"}]]
Possible solutions: call(), call(java.util.Map), wait(), any(), wait(long), main([Ljava.lang.String;)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.unwrap(ScriptBytecodeAdapter.java:58)
Code:
library identifier: 'jenkins-sharedlib-cooking@BB-3611', retriever: modernSCM([$class: 'GitSCMSource',
    remote: 'https://github.com/lbg-gcp-foundation/jenkins-sharedlib-cooking-lifecycle.git',
    credentialsId: 'jenkinsPAT'])

def configList = '[{"dockerConfig":"home/pipelines/conf/journeys/dockerbuild.yaml","aquaConfig":"home/pipelines/conf/journeys/aquascan.yaml","gtag": "v1.0.1","preFixName": "journey1"},{"dockerConfig":"home/pipelines/conf/if.com/dockerbuild.yaml","aquaConfig":"home/pipelines/conf/if.com/aquascan.yaml","gtag": "v2.0.2","preFixName": "journey2"},{"dockerConfig":"home/pipelines/conf/colleague/dockerbuild.yaml","aquaConfig":"home/pipelines/conf/colleague/aquascan.yaml","gtag": "v3.0.3","preFixName": "journey2"}]'

pipeline {
    environment {
        def config =
        brand =
        environmentName =
        CLUSTER_NAME =
        CLUSTER_PROJECT =
        VERSION = '1.0.0'
    }
    options {
        ansiColor('xterm')
        timeout(time: 150, unit: 'MINUTES')
        disableConcurrentBuilds()
        buildDiscarder(logRotator(numToKeepStr: '100'))
    }
    agent {
        kubernetes {
            label "jrn-${UUID.randomUUID().toString()}"
            yamlFile "pipelines/conf/podTemplate.yaml"
        }
    }
    stages {
        stage('Stage 6 - Docker Image') {
            parallel {
                stage ('Docker Image - Journeys') {
                    steps {
                        echo "*********************** Docker Journeys ***********************************"
                        container('docker') {
                            echo "Building the docker image..."
                            imageBuild(configList)
                        }
                        archiveArtifacts artifacts: "*.html", allowEmptyArchive: true
                    }
                }
            }
        }
    }
}

It looks like the Groovy method or global variable imageBuild in the shared library jenkins-sharedlib-cooking@BB-3611 does not expect any parameter, but you are trying to pass a string; that is why you are getting the MissingMethodException
with the details
hudson.remoting.ProxyException: groovy.lang.MissingMethodException: No signature of method: imageBuild.call() is applicable for argument types: (java.lang.String) values
To fix the issue, change the shared library method or global variable imageBuild from imageBuild() to imageBuild(String param) or imageBuild(def param).
I will try to illustrate with an example similar to yours for reference.
Assume you have a shared library in a Git repository named jenkins-sharedlib-cooking-lifecycle and that you follow the folder structure suggested by the Jenkins shared library documentation.
// vars/imageBuild.groovy
import groovy.json.JsonSlurper

def call(def imagebuildParameter) {
    def jsonSlurper = new JsonSlurper()
    def object = jsonSlurper.parseText(imagebuildParameter)
    // loop over object to retrieve the values, for example:
    println object[0].name
}
// Jenkinsfile
node {
    stage('stageName') {
        def x = '[{"name": "foo"},{"name": "bar"}]'
        imageBuild(x)
    }
}
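Applied to the configList from the question, the loop inside call() could look roughly like this (a sketch; only the key names dockerConfig, aquaConfig, gtag and preFixName are taken from the question's JSON, the echo line is illustrative). Note that JsonSlurper results are not Serializable, so the readJSON step from the Pipeline Utility Steps plugin is a common alternative inside pipelines.
// vars/imageBuild.groovy -- sketch of looping over the parsed JSON array
import groovy.json.JsonSlurper

def call(def imagebuildParameter) {
    def configs = new JsonSlurper().parseText(imagebuildParameter)
    configs.each { cfg ->
        // key names taken from the question's configList
        echo "Building ${cfg.preFixName}:${cfg.gtag} with ${cfg.dockerConfig} and ${cfg.aquaConfig}"
        // per-image docker build / aqua scan steps would go here
    }
}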

Related

Error creating jobdsl parameters programmatically

I am creating jobs for use with Terraform. There are several environments and the number is growing all the time. Rather than update both the pipeline file and the jobdsl file as the parameters change, I started working from the standpoint of scanning the repo for environment files and updating the pipeline and jobdsl file as needed.
My jobdsl script:
@Library('mylib') _

params = [
    "serviceName": "infrastructure-${repo}",
    "repoUrl": "${repoUrl}",
    "sshCredentials": 'git-readonly',
    "environment": "${env.Environment}",
    "configParams": getTFConfigs(
        repoUrl,
        "env/${env.AccountName}/${env.AWSRegion}/${env.Environment}")
]

template = libraryResource('dslTemplates/infra.groovy')
jobDsl scriptText: helpers.renderTemplate(template, params)
shared library method: getTFConfigs
#!/usr/bin/env groovy
@NonCPS
import java.util.zip.ZipEntry
import java.util.zip.ZipInputStream

def call(String repoUrl, String filter="") {
    def gitProc = new ProcessBuilder(
        "git",
        "archive",
        "--format=zip",
        "--remote=${repoUrl}",
        "main").start()
    def zipIn = new ZipInputStream(gitProc.inputStream)
    def zipMembers = []
    while (true) {
        def ZipEntry entry = zipIn.getNextEntry()
        if (entry == null) break
        if ((entry.getName()).contains(filter)) {
            entryName = entry.getName()
            zipMembers.push("${entryName}")
        }
    }
    println zipMembers
    return zipMembers
}
dslTemplates/infra.groovy template
pipelineJob("${serviceName}") {
description("Apply TF for ${serviceName} to all environment configurations")
definition {
parameters {
<% configParams.each { %>
booleanParam(name: "<%= "${it}" %>", defaultValue: true, description: "<%= "${it}" %>" )
<% } %>
}
logRotator {
numToKeep(20)
}
cpsScm {
scm {
git{
remote{
url("${repoUrl}")
credentials("${sshCredentials}")
branch('*/main')
}
}
}
scriptPath('infra.groovy')
}
}
}
Template result
...
definition {
    parameters {
        booleanParam(name: env1.tfvars, defaultValue: true, description: env1.tfvars )
        booleanParam(name: env2.tfvars, defaultValue: true, description: env2.tfvars )
    }
...
When the seed job runs and executes the code, the parameters should be updated with a checkbox for each environment. However, the jobdsl fails with this:
ERROR: (script, line 6) No signature of method: javaposse.jobdsl.dsl.helpers.BuildParametersContext.booleanParam() is applicable for argument types: (java.util.LinkedHashMap) values: [[name:env1.tfvars, defaultValue:true, ...]]
Possible solutions: booleanParam(java.lang.String), booleanParam(java.lang.String, boolean), booleanParam(java.lang.String, boolean, java.lang.String)
Finished: FAILURE
I have tried applying toString() at various steps and cannot seem to find any solution to this.
I have tried to write the entire jobdsl script to a file and read it back in using "jobDsl targets: filename" and got the same result!
Banging my head! as it were!
Thanks
It looks like you used Pipeline syntax for the parameters in the DSL script. If you want to define a parameter in a DSL script, do not use name, defaultValue and description (see the Job DSL Plugin documentation):
booleanParam('BOOL_PARAM', true, 'This is a boolean param')
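Applied to the template in the question, the loop would then emit positional arguments (a sketch; it assumes each entry of configParams is simply the parameter name):
// dslTemplates/infra.groovy -- parameters block using positional arguments,
// matching booleanParam(String, boolean, String) from the Job DSL API
parameters {
    <% configParams.each { %>
    booleanParam("<%= it %>", true, "<%= it %>")
    <% } %>
}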

how to solve "hudson.remoting.ProxyException: groovy.lang.MissingMethodException: No signature of method:"

I got the following error while calling another Groovy file from my Jenkinsfile:
hudson.remoting.ProxyException: groovy.lang.MissingMethodException: No signature of method:
java.util.LinkedHashMap.test() is applicable for argument types:
(java.lang.String, java.lang.String) values: [${REPO_PATH},
${APP_NAME}]
Possible solutions: wait(), sort(), getAt(java.lang.String), keySet(), keySet(), keySet()
I am calling a function from another Groovy file. Following are my Jenkinsfile and the Groovy function file.
Jenkinsfile:
Map modules = [:]
pipeline {
    agent any
    environment {
        REPO_PATH='/home/manish/Desktop'
        APP_NAME='test'
    }
    stages {
        stage('calling function') {
            steps {
                script {
                    modules.test = load "testfun.groovy"
                    modules.test('${REPO_PATH}','${APP_NAME}')
                }
            }
        }
    }
}
testfun.groovy:
def call(def path, def app) {
    stages {
        stage('build and deploy') {
            steps {
                sh '''
                cd ${path}/${app}/
                mkdir testing
                '''
            }
        }
    }
}
Please suggest the right way to do this.
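The MissingMethodException here is raised because modules.test('...') is interpreted as a method call on the LinkedHashMap itself, not as a call to the loaded script stored under the test key. A minimal sketch of one way this is often wired up (assuming scripted-pipeline steps inside the loaded file, and double quotes so the variables are actually interpolated):
// testfun.groovy -- sketch: a plain method plus "return this" so the loaded
// script object can be called from the Jenkinsfile
def call(String path, String app) {
    sh """
        cd ${path}/${app}/
        mkdir testing
    """
}
return this

// Jenkinsfile (inside the script block) -- sketch
def testModule = load "testfun.groovy"
// invokes testfun.groovy's call(); single quotes would pass the literal
// strings '${REPO_PATH}' instead of the values
testModule.call(env.REPO_PATH, env.APP_NAME)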

How to Fix -- Jenkins Post-build Action Groovy script fails to evaluate

I'm new to Groovy and am trying to invoke a Groovy script as a Jenkins Post-build Action, but whenever I run it, I'm getting "ERROR: Failed to evaluate groovy script":
groovy.lang.MissingMethodException: No signature of method: Script1.stage() is applicable for argument types: (org.codehaus.groovy.runtime.GStringImpl, Script1$_run_closure1) values: [branch_1, Script1$_run_closure1@7e39737b]
Possible solutions: wait(), any(), isCase(java.lang.Object) at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.unwrap(ScriptBytecodeAdapter.java:58)
Here is my code :
def warList1 = ["one.war", "two.war", "three.war"]
def branches = [:]
for (int i = 0; i < 10; i++) {
    int index = i, branch = i + 1
    stage ("branch_${branch}") {
        branches["branch_${branch}"] = {
            node {
                sshagent(credentials: ['someuser-SSH']) {
                    sh "scp ${WORKSPACE}/${warList1[index]} someuser@<somefqdn>:/tmp/pscp/dev"
                }
            }
        }
    }
}
I think your issue comes from the fact that you cannot use the stage method in a Groovy Post-build Action. This method is only available in a pipeline script.
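For comparison, the same fan-out is straightforward in a scripted pipeline, where stage, node and parallel are all available (a sketch based on the code above; the loop bound is tied to the list size rather than the hard-coded 10, and <somefqdn> stays a placeholder):
// Scripted pipeline sketch: collect the branches first, then run them with parallel
def warList1 = ["one.war", "two.war", "three.war"]
def branches = [:]
for (int i = 0; i < warList1.size(); i++) {
    def index = i
    def branch = i + 1
    branches["branch_${branch}"] = {
        node {
            stage("branch_${branch}") {
                sshagent(credentials: ['someuser-SSH']) {
                    sh "scp ${WORKSPACE}/${warList1[index]} someuser@<somefqdn>:/tmp/pscp/dev"
                }
            }
        }
    }
}
parallel branches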

Is there a way to set environment variables based on branch in a Declarative Jenkinsfile?

I'm hoping to find a way to remove duplicated stages in a Declarative Jenkinsfile by loading environment variables based on the branch.
Currently I have something like:
#Library("MySharedLibrary#v1") _
String tagBaseDev = "repo.org/myspace/image:dev"
String tagBaseTest = "repo.org/myspace/image:test"
String tagBaseProd = "repo.org/myspace/image:prod"
pipeline {
agent none
stages {
// Always Run This
stage ('Maven Build and Unit Tests') {
agent {label 'docker-slave'}
steps {
sharedLibraryBuild mavenGoals:'clean package', additionalProps:['ci.env':'']
stash 'artifacts'
}
}
// Dev Only
stage ('Build Dev Docker Image and Push') {
when {
branch 'dev'
}
agent {label 'docker-slave'}
steps {
unstash 'artifacts'
sharedLibraryDockerImageBuildPush tag:"$tagBaseDev"
}
}
// Test Only
stage ('Build Test Docker Image and Push') {
when {
branch 'test'
}
agent {label 'docker-slave'}
steps {
unstash 'artifacts'
sharedLibraryDockerImageBuildPush tag:"$tagBaseTest"
}
}
// Prod Only
stage ('Build Prod Docker Image and Push') {
when {
branch 'prod'
}
agent {label 'docker-slave'}
steps {
unstash 'artifacts'
sharedLibraryDockerImageBuildPush tag:"$tagBaseProd"
}
}
}
}
I want to be able to reduce that into one stage block and dynamically load in the needed $tagBaseXXX based on branch. This is just an example but I'm planning to have four or five variables that will have different values for each environment.
My thought was to create EnvDev, EnvTest, and EnvProd maps with corresponding values and then create a EnvMap which is a Map that correlates the branch name to Environment Map. For instance:
def EnvDev = [
    url: "dev.com",
    tag: "dev",
    var: "Dev Var"
]
def EnvProd = [
    url: "prod.com",
    tag: "prod",
    var: "prod Var"
]
def EnvMap = [
    dev: EnvDev,
    prod: EnvProd
]
I then try to create a Shared Library call that looks something like this:
def call(String branch, Map envMapping) {
    Map use_me = envMapping.get("${branch}")
    String url = use_me.get("URL")
    echo ("${url}")
}
With the idea being to pass the Map and pull the corresponding environment map based on the branch and then use the variables as needed.
So I have something like this:
#Library("MySharedLibrary#v1") _
def EnvDev = [
url: "dev.com",
tag: "dev",
var: "Dev Var"
]
def EnvProd = [
url: "prod.com",
tag: "prod",
var: "prod Var"
]
def EnvMap = [
dev: EnvDev,
prod: EnvProd
]
pipeline {
agent {label 'docker-slave'}
stages {
stage ('Test Env Vars') {
steps {
echo "$env.GIT_BRANCH"
sharedLibrarySetupEnv branch: "$env.GIT_BRANCH", evnMapping: EnvMap
}
}
}
}
But I get the following error:
hudson.remoting.ProxyException: groovy.lang.MissingMethodException: No signature of method: setupEnv.call() is applicable for argument types: (java.util.LinkedHashMap) values: [[branch:dev, env_mapping:[dev:[url:dev.com, tag:dev, var:Dev Var], ...]]]
Possible solutions: call(java.lang.String, java.util.Map), wait(), any(), wait(long), main([Ljava.lang.String;), each(groovy.lang.Closure)
Is there an easier way to accomplish what I'm trying to do?
This is my first time trying to write a Shared Library function so I'm guessing it may just be some Groovy syntax/concept I'm not familiar with.
Thanks!
Your function signature is def call(String branch, Map envMapping), but you are calling it with branch: xxx, env_mapping: xxx.
Change it to sharedLibrarySetupEnv branch: "$env.GIT_BRANCH", envMapping: EnvMap
The problem was with how I was trying to invoke the Shared Library function. I thought I was able to reference the parameter names, which led to the Jenkinsfile/pipeline passing a single LinkedHashMap to the shared library instead of two separate variables.
There are two solutions to this:
1. Have the Shared Library call method take in a Map<String, Object> parms and, within the call, reference the variables with parms.varname.
Shared Library:
def call(Map<String, Object> parms) {
    echo "${parms.branch}"
    Map use_this_map = parms.envMapping.get(parms.branch)
}
Jenkinsfile:
setupEnv branch: "$env.GIT_BRANCH", envMapping: EnvMap
2. Don't pass the variable names in the Jenkinsfile and have the Shared Library call method take in corresponding variables.
Shared Library:
def call(String branch, Map<String, Map> envMapping) {
    echo "${branch}"
    Map use_this_map = envMapping.get(branch)
}
Jenkinsfile:
setupEnv $env.GIT_BRANCH, EnvMap
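A minimal sketch of what the corresponding vars/setupEnv.groovy could look like with the first approach, using the url/tag keys from the question's maps (TARGET_URL and IMAGE_TAG are made-up names for the exposed variables):
// vars/setupEnv.groovy -- sketch: pick the per-branch map and expose its values
def call(Map<String, Object> parms) {
    Map selected = parms.envMapping.get(parms.branch)
    if (selected == null) {
        error "No environment mapping defined for branch '${parms.branch}'"
    }
    env.TARGET_URL = selected.url   // hypothetical variable names
    env.IMAGE_TAG  = selected.tag
    echo "Configured ${parms.branch}: url=${env.TARGET_URL}, tag=${env.IMAGE_TAG}"
}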
You can use the variable BRANCH_NAME and put it in a condition like the one below:
if (env.BRANCH_NAME == 'master') {
    // set all the environment variables you need
} else {
    // variables required if the condition doesn't match
}
You can use a regex in the condition.

Jenkins Pipelines: How to use withCredentials() from a shared-variable script

I'd like to use a withCredentials() block in a shared-variable ("vars/") script rather than directly in the Jenkins pipeline because this is a lower-level semantic of a particular library, and also may or may not be required depending on the situation. However, withCredentials (or, at least, that signature of it) doesn't appear to be in scope.
script:
def credentials = [
    [$class: 'UsernamePasswordMultiBinding', credentialsId: '6a55c310-aaf9-4822-bf41-5500cd82af4e', passwordVariable: 'GERRIT_PASSWORD', usernameVariable: 'GERRIT_USERNAME'],
    [$class: 'StringBinding', credentialsId: 'SVC_SWREGISTRY_PASSWORD', variable: 'SVC_SWREGISTRY_PASSWORD']
]
withCredentials(credentials) {
    // ...
}
Console:
hudson.remoting.ProxyException: groovy.lang.MissingMethodException: No signature of method: BuildagentInstallAndRun.withCredentials() is applicable for argument types: (java.util.ArrayList, org.jenkinsci.plugins.workflow.cps.CpsClosure2) values: [[[$class:UsernamePasswordMultiBinding, credentialsId:6a55c310-aaf9-4822-bf41-5500cd82af4e, ...], ...], ...]
Has anyone had any success with this?
I'm using a shared library rather than a shared variable, but I guess it is a similar situation.
I'm not using the $class parameter; I'm directly calling one of the functions suggested by the pipeline snippet generator (you can find the list here). In the example below, I use the usernameColonPassword binding.
In the pipeline, I instantiate the Utilities class and pass this to the constructor. Then, in the library, I use the steps object to access the pipeline steps (such as withCredentials or usernameColonPassword).
class Utilities implements Serializable {
    def steps
    Utilities(steps) {
        this.steps = steps
    }
    def doArchiveToNexus(String credentials, String artifact, String artifact_registry_path) {
        try {
            this.steps.withCredentials([steps.usernameColonPassword(credentialsId: credentials, variable: 'JENKINS_USER')]) {
                this.steps.sh "curl --user " + '${JENKINS_USER}' + " --upload-file ${artifact} ${artifact_registry_path}"
            }
        } catch (error) {
            this.steps.echo error.getMessage()
            throw error
        }
    }
}
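For completeness, the pipeline side described above would look roughly like this (a sketch; the library name, package and argument values are placeholders):
// Jenkinsfile -- sketch: pass the pipeline script (this) into the class so it
// can reach withCredentials and the other steps
@Library('my-shared-library') _
import com.example.Utilities   // hypothetical package for the class shown above

node {
    def utils = new Utilities(this)
    utils.doArchiveToNexus('nexus-credentials-id',
                           'target/app.war',
                           'https://nexus.example.com/repository/releases/app.war')
}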
You can try the following:
import jenkins.model.*

credentialsId = '6a55c310-aaf9-4822-bf41-5500cd82af4e'
def creds = com.cloudbees.plugins.credentials.CredentialsProvider.lookupCredentials(
    com.cloudbees.plugins.credentials.common.StandardUsernameCredentials.class, Jenkins.instance, null, null).find {
        it.id == credentialsId
    }
println creds.username
println creds.password
But this is not secure; everything will be visible in the console log.
I was able to obtain credentials inside the shared library, with proper password masking, using code like this:
class Utilities implements Serializable {
    def steps
    def credentialsId   // assumed to be set elsewhere (e.g. via a constructor argument or setter)
    Utilities(steps) {
        this.steps = steps
    }
    def execute() {
        this.steps.withCredentials(
            bindings: [
                this.steps.usernameColonPassword(
                    credentialsId: this.credentialsId,
                    variable: "unameColonPwd")
            ]) {
            this.steps.sh "echo ${this.steps.env.unameColonPwd}"
        }
    }
}
