java.lang.NoSuchMethodError: No such DSL method 'steps' found among steps
Basically, I get this error on both the "triggers" block and the "post" block. I have been looking at other Stack Overflow posts and didn't find any answer that suits my case.
My Jenkins pipeline:
node {
    triggers {
        cron 'H */2 * * *'
    }
    def build_ok = true
    def itrList = ["Run1", "Run2", "Run3"]
    itrList.each { val ->
        stage('#1 SoftSync 4.5.1 CPU Usage Test') {
            build job: 'SoftSync_4.5.1_CPU_Usage_Test', parameters: [string(name: 'LOG_LEVEL', value: 'debug'), string(name: 'FILE_PATH', value: 'TLV_SoftSync/Management_Bundle/SoftSync_CPU_Usage_Test.robot')]
        }
        stage('#2 SoftSync 4.5.1 Improvments to system time management Test') {
            build job: 'SoftSync_4.5.1_Improvments_To_System_TimeManagement_Test', parameters: [string(name: 'LOG_LEVEL', value: 'debug'), string(name: 'FILE_PATH', value: 'TLV_SoftSync/Management_Bundle/SoftSync_Improvments_to_system_time_management.robot')]
        }
        stage('#3 SoftSync 4.5.1 Telematics and statistics Test') {
            build job: 'SoftSync_4.5.1_Telematics_and_statistics_Test', parameters: [string(name: 'LOG_LEVEL', value: 'debug'), string(name: 'FILE_PATH', value: 'TLV_SoftSync/Management_Bundle/SoftSync_Telementry_and_Statistics.robot')]
        }
        stage('#4 SoftSync 4.5.1 PTP Profiles Slave Lock Test') {
            build job: 'SoftSync_4.5.1_PTP_Profiles_SlaveLock_Test', parameters: [string(name: 'LOG_LEVEL', value: 'debug'), string(name: 'FILE_PATH', value: 'TLV_SoftSync/PTP_Bundle/SoftSync_PTP_Lock_validation.robot')]
        }
        stage('#5 SoftSync 4.5.1 Alarms Test') {
            build job: 'SoftSync_4.5.1_Alarms_Test', parameters: [string(name: 'LOG_LEVEL', value: 'debug'), string(name: 'FILE_PATH', value: 'TLV_SoftSync/Management_Bundle/SoftSync_Alarms_Test.robot')]
        }
        stage('#6 SoftSync_4.5.1_EP_BP_DelayResp_DealyReq_method Test') {
            build job: 'SoftSync_4.5.1_EP_BP_Default_Profiles_DelayResp_DelayReq_method_Test', parameters: [string(name: 'LOG_LEVEL', value: 'debug'), string(name: 'FILE_PATH', value: 'TLV_SoftSync/Management_Bundle/SoftSync_EP_BP_Default_Profiles_DelayResp_DealyReq_method.robot')]
        }
        stage('#7 SoftSync TimeTraceability Status Test') {
            build job: 'SoftSync_4.5.1_TimeTraceability_Status_Test', parameters: [string(name: 'LOG_LEVEL', value: 'debug'), string(name: 'FILE_PATH', value: 'TLV_SoftSync/Management_Bundle/Softsync_Time_Traceability_Status.robot')]
        }
        stage('#8 SoftSync 4.5.1 NTP Clock Test') {
            build job: 'SoftSync_4.5.1_NTP_Clock_Test', parameters: [string(name: 'LOG_LEVEL', value: 'debug'), string(name: 'FILE_PATH', value: 'TLV_SoftSync/Management_Bundle/Softsync_NTP_Clock.robot')]
        }
        if (build_ok) {
            currentBuild.result = "SUCCESS"
        } else {
            currentBuild.result = "FAILURE"
        }
        post {
            always {
                junit allowEmptyResults: true, testResults: '/var/lib/jenkins/output/*.xml'
            }
        }
    }
}
If you have a look at https://www.jenkins.io/doc/book/pipeline/syntax/ you can see that post {} (like triggers {}) is only available in Declarative Pipeline, not in Scripted Pipeline. When you start the pipeline with node instead of pipeline, you're writing a Scripted Pipeline. In general (though this is a topic that can be debated) it's best to use Declarative Pipeline, so my suggestion is to switch to a Declarative Pipeline; then you'll be able to use those sections.
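For illustration, a minimal Declarative skeleton of the same pipeline could look like the sketch below. Treat it as a starting point rather than a drop-in replacement: agent any is a placeholder, the loop over itrList is dropped for brevity, and only stage #1 is shown.

pipeline {
    agent any
    triggers {
        cron('H */2 * * *')
    }
    stages {
        stage('#1 SoftSync 4.5.1 CPU Usage Test') {
            steps {
                build job: 'SoftSync_4.5.1_CPU_Usage_Test', parameters: [
                    string(name: 'LOG_LEVEL', value: 'debug'),
                    string(name: 'FILE_PATH', value: 'TLV_SoftSync/Management_Bundle/SoftSync_CPU_Usage_Test.robot')
                ]
            }
        }
        // ... stages #2 through #8 follow the same pattern ...
    }
    post {
        always {
            // note: junit resolves this pattern relative to the workspace,
            // so an absolute path like the one below may need adjusting
            junit allowEmptyResults: true, testResults: '/var/lib/jenkins/output/*.xml'
        }
    }
}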
pipeline {
    agent { label 'linux' }
    stages {
        stage("verify1") {
            steps {
                script {
                    build(job: "verify1", parameters: [string(name: 'verify1', value: "${params.verify1}")])
                }
            }
        }
        stage("verify2") {
            steps {
                script {
                    build(job: "verify2", parameters: [string(name: 'verify2', value: "${params.verify2}")])
                }
            }
        }
        stage("verify3") {
            steps {
                script {
                    build(job: "verify3", parameters: [string(name: 'verify3', value: "${params.verify3}")])
                }
            }
        }
    }
}
=================================================================
Hello
Can anyone help me? Right now the above pipeline builds all 3 jobs successfully, but the problem is that every single job executes on a new EC2 slave instance instead of the instance where the pipeline started. What I expect is that once the above pipeline starts, all the builds in the pipeline run on the same node (EC2 instance).
Thanks in advance
You can pass the upstream job's agent node to the downstream job.
Add one more job parameter to accept the node
Pass the upstream job's agent node via env.NODE_NAME when calling the build job step
// verify 1 job
pipeline {
    agent { label "${params.agentNode}" }
    parameters {
        string(name: "agentNode",
               defaultValue: "<give default value in case you run it directly>")
    }
    // ... stages as before ...
}
// upstream job
build(job: "verify1", parameters: [
    string(name: 'agentNode', value: "${env.NODE_NAME}"),
    string(name: 'verify1', value: "${params.verify1}")
])
I have the following pipeline. I need this pipeline to run on 4 different nodes at the same time. I have read that using a matrix section within the declarative pipeline is key to making this work. How can I go about doing that with the pipeline below?
pipeline {
    stages {
        stage('Test') {
            steps {
                script {
                    def test_proj_choices = ['AD', 'CD', 'DC', 'DISP_A', 'DISP_PROC', 'EGI', 'FD', 'FLT', 'FMS_C', 'IFF', 'liblO', 'libNGC', 'libSC', 'MISCMP_MP', 'MISCMP_GP', 'NAV_MGR', 'RADALT', 'SYS', 'SYSIO15', 'SYSIO42', 'SYSRED', 'TACAN', 'VOR_ILS', 'VPA', 'WAAS', 'WCA']
                    for (choice in test_proj_choices) {
                        stage("${choice}") {
                            echo "Running ${choice}"
                            build job: "UH60Job", parameters: [string(name: "TEST_PROJECT", value: choice), string(name: "SCADE_SUITE_TEST_ACTION", value: "all"), string(name: "VIEW_ROOT", value: "myview")]
                        }
                    }
                }
            }
        }
    }
}
One helpful article can be found here: https://www.jenkins.io/blog/2019/11/22/welcome-to-the-matrix/
The official documentation is here: https://www.jenkins.io/doc/book/pipeline/syntax/#declarative-matrix
Accordingly, the syntax should be:
pipeline {
    agent none
    stages {
        stage('Tests') {
            matrix {
                agent any
                axes {
                    axis {
                        name 'CHOICE'
                        values 'AD', 'CD', 'DC', 'DISP_A', 'DISP_PROC', 'EGI', 'FD', 'FLT', 'FMS_C', 'IFF', 'liblO', 'libNGC', 'libSC', 'MISCMP_MP', 'MISCMP_GP', 'NAV_MGR', 'RADALT', 'SYS', 'SYSIO15', 'SYSIO42', 'SYSRED', 'TACAN', 'VOR_ILS', 'VPA', 'WAAS', 'WCA'
                    }
                }
                stages {
                    stage("Test") {
                        steps {
                            echo "Running ${CHOICE}"
                            build job: "UH60Job", parameters: [string(name: "TEST_PROJECT", value: CHOICE), string(name: "SCADE_SUITE_TEST_ACTION", value: "all"), string(name: "VIEW_ROOT", value: "myview")]
                        }
                    }
                }
            }
        }
    }
}
Note that your inner stage cannot be named dynamically; you'd get a syntax error trying to expand "${CHOICE}" in the stage name.
To explain the issue, consider that I have 2 Jenkins jobs.
Job1 : PARAM_TEST1
it accepts a parameterized value called 'MYPARAM'
Job2: PARAM_TEST2
it also accepts a parameterized value called 'MYPARAM'
Sometimes I need to run these 2 jobs in sequence, so I created a separate pipeline job as shown below. It works just fine.
It also accepts a parameterized value called 'MYPARAM', which it simply passes through to the build job steps.
pipeline {
    agent any
    stages {
        stage("PARAM 1") {
            steps {
                build job: 'PARAM_TEST1', parameters: [string(name: 'MYPARAM', value: "${params.MYPARAM}")]
            }
        }
        stage("PARAM 2") {
            steps {
                build job: 'PARAM_TEST2', parameters: [string(name: 'MYPARAM', value: "${params.MYPARAM}")]
            }
        }
    }
}
My question:
This example is simple. Actually I have 20 jobs. I do not want to repeat parameters: [string(name: 'MYPARAM', value: "${params.MYPARAM}")] in every single stage.
Is there any way to set the parameters for all the build job steps in one single place?
What you could do is place the common params at the pipeline level and add the stage-specific ones to those in the stages:
pipeline {
    agent any
    parameters {
        string(name: 'PARAM1', description: 'Param 1?')
        string(name: 'PARAM2', description: 'Param 2?')
    }
    stages {
        stage('Example') {
            steps {
                echo "${params}"
                script {
                    def myparams = params + string(name: 'MYPARAM', value: "${params.MYPARAM}")
                    build job: 'downstream-pipeline-with-params', parameters: myparams
                }
            }
        }
    }
}
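If the goal is mainly to avoid repeating the same parameters: [...] literal in 20 stages, another option is to factor the build call into a small Groovy helper defined in the same Jenkinsfile. A minimal sketch, reusing the job names from the question; buildWithCommonParams is a hypothetical helper name:

// Hypothetical helper: forwards the shared MYPARAM to any downstream job.
def buildWithCommonParams(String jobName) {
    build job: jobName, parameters: [
        string(name: 'MYPARAM', value: "${params.MYPARAM}")
    ]
}

pipeline {
    agent any
    stages {
        stage("PARAM 1") {
            steps { buildWithCommonParams('PARAM_TEST1') }
        }
        stage("PARAM 2") {
            steps { buildWithCommonParams('PARAM_TEST2') }
        }
    }
}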
I am new to Jenkins. I have spent the last few weeks creating jobs that execute chains of shell commands, but now that I am trying to find out how to chain jobs together, I have failed to find the answer I was looking for.
I have a CreateStack job, and if it fails somehow, I'd like to run DeleteStack to remove the stuff that CreateStack left behind while failing. If CreateStack does not fail, build the rest of the jobs.
Something like this:
b = build(job: "CreateStack", propagate: false, parameters: [string(name: 'TASVersion', value: "$TASVersion"), string(name: 'CloudID', value: "$CloudID"), string(name: 'StackName', value: "$StackName"), booleanParam(name: 'Swap partition required', value: true)]).result
if (b == 'FAILURE') {
    echo "CreateStack has failed. Running DeleteStack."
    build(job: "DeleteStack", parameters: [string(name: 'CloudID', value: "$CloudID"), string(name: 'StackName', value: "$StackName")])
} else {
    build job: 'TAS Deploy', parameters: [string(name: 'FT_NODE_IP', value: "$FT-NodeIP"), string(name: 'TASVersion', value: "RawTASVersion")]
}
Could somebody help me out with this, please?
Also, can I use variables in a pipeline script like this? I set the project to be parameterized and added the necessary choice parameters, e.g.: $StackName
You can try something like this in a scripted pipeline:
node {
    try {
        stage('CreateStack') {
            build(job: 'CreateStack', parameters: [<parameters>])
        }
        stage('OtherJobs') {
            // build the rest of the jobs
        }
    } catch (error) {
        build(job: 'DeleteStack', parameters: [<parameters>])
        currentBuild.result = "FAILURE"
        throw error
    } finally {
        build(job: 'LastJob', parameters: [<parameters>])
    }
}
Please note that the catch block is executed if any of the jobs fails; there you have to implement a little additional logic.
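For example, if DeleteStack should only run when CreateStack itself fails (and not when one of the later jobs fails), a variant built on the propagate: false idea from the question could look like this sketch (<parameters> remains a placeholder):

node {
    stage('CreateStack') {
        // propagate: false keeps the pipeline alive so the result can be inspected
        def result = build(job: 'CreateStack', propagate: false, parameters: [<parameters>]).result
        if (result == 'FAILURE') {
            echo 'CreateStack has failed. Running DeleteStack.'
            build(job: 'DeleteStack', parameters: [<parameters>])
            error 'Stopping pipeline because CreateStack failed.'
        }
    }
    stage('OtherJobs') {
        // build the rest of the jobs
    }
}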
I've tried using the following script, but all downstream jobs are running on different nodes.
Any idea how I can get a random node and run all downstream jobs on the same one?
#!/usr/bin/env groovy
pipeline {
    agent { label 'WindowsServer' }
    stages {
        stage("Get Dev Branch") {
            steps {
                script {
                    build(job: "GetDevBranchStep", parameters: [string(name: 'DevBranchName', value: "${params.CloudDevBranch}")])
                }
            }
        }
        stage("Get SA Branch") {
            steps {
                script {
                    build(job: "GetSABranchStep", parameters: [string(name: 'SABranchName', value: "${params.SABranch}")])
                }
            }
        }
        stage("Compile Models and Copy To Network Folder") {
            steps {
                script {
                    build(job: "CompileNewModelsAndCopyToNetwork", parameters: [string(name: 'DevBranchName', value: "${params.CloudDevBranch}"), string(name: 'SABranchName', value: "${params.SABranch}"), string(name: 'GetSAStepJobName', value: "GetSABranchStep"), string(name: 'GetDevRepoJobName', value: "GetDevBranchStep"), string(name: 'NetworkFoderToCopyTo', value: "NetworkFolderAddress")])
                }
            }
        }
    }
}
Provide the downstream job with ${NODE_NAME} as an additional parameter.
In the downstream job's agent section you can then use:
agent { label "${params.NODE_NAME}" }
(Meanwhile, I have not found a way to inject the upstream job's parameters into the downstream job without actually inserting them one by one as input parameters.)
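For completeness, the upstream side would forward its node when triggering each downstream job, along these lines (a sketch reusing the job and parameter names from the question; env.NODE_NAME is the built-in Jenkins variable for the current agent):

// Upstream: pass the current agent's name to the downstream job.
build(job: "GetDevBranchStep", parameters: [
    string(name: 'NODE_NAME', value: "${env.NODE_NAME}"),
    string(name: 'DevBranchName', value: "${params.CloudDevBranch}")
])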