I am having a hard time figuring out how to add envVars to the kubernetes agent inside a Jenkinsfile.
I am pretty sure the issue is my syntax, because I am getting the following error:
java.lang.ClassCastException: class org.csanchez.jenkins.plugins.kubernetes.ContainerTemplate.setEnvVars() expects java.util.List<org.csanchez.jenkins.plugins.kubernetes.model.TemplateEnvVar> but received class java.lang.String
when I have it coded this way:
stage("build") {
    agent {
        kubernetes {
            label 'kubernetes'
            containerTemplate {
                name 'jnlp'
                image 'ubuntu:latest'
                ttyEnabled true
                label 'label'
                envVars 'envVar(key: "filePath", value: "/home/abcde/abc")'
            }
        }
    }
}
Can you please point me in the right direction? How do I define a list variable in a Jenkinsfile?
My Jenkinsfile setup:
pipeline {
    agent any
    parameters {
        string(name: 'Abc', defaultValue: 'origin', description: 'test project')
    }
    options {
        timestamps()
        timeout(60)
    }
    stages {
        stage('Build') {
            parallel {
                stage("build") {
                    agent {
                        kubernetes {
                            label 'kubernetes'
                            containerTemplate {
                                name 'jnlp'
                                image 'ubuntu:latest'
                                ttyEnabled true
                                label 'label'
                                envVars 'envVar(key: "filePath", value: "/home/abcde/abc")'
                            }
                        }
                    }
                    steps {
                        container('jnlp') {
                            timeout(60) {
                                // build process
                            }
                        }
                    }
                }
            }
        }
    }
    post {
        success {
            sh "success"
        }
        failure {
            sh "failed"
        }
        unstable {
            sh "unstable"
        }
    }
}
With the above code, I get the following error:
java.lang.ClassCastException: class org.csanchez.jenkins.plugins.kubernetes.ContainerTemplate.setEnvVars() expects java.util.List<org.csanchez.jenkins.plugins.kubernetes.model.TemplateEnvVar> but received class java.lang.String
Look at their example: https://github.com/jenkinsci/kubernetes-plugin/blob/f6cff5d7e9ce9da3279660159e0cb064efac534f/examples/selenium.groovy#L18
It looks like, in your case, it should be:
stage("build") {
    agent {
        kubernetes {
            label 'kubernetes'
            containerTemplate(
                name: 'jnlp',
                image: 'ubuntu:latest',
                ttyEnabled: true,
                label: 'kub_catkin_node',
                envVars: [
                    containerEnvVar(key: "filePath", value: "/home/abcde/abc")
                ]
            )
        }
    }
}
This is something supported from the UI and also from pipelines, but it might not be well supported in declarative pipelines.
One solution could be to use scripted pipelines. Another could be to check whether it is better supported in a later version (if you are not already on the latest).
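For what it's worth, a minimal scripted-pipeline sketch of that first option (the image, key, and value are taken from your snippet; the overall shape is assumed from the kubernetes-plugin examples), where envVars is passed as a list as the plugin expects:

```groovy
// Scripted pipeline sketch: podTemplate/containerTemplate take envVars
// as a list of containerEnvVar entries rather than a string.
podTemplate(label: 'kubernetes', containers: [
    containerTemplate(
        name: 'jnlp',
        image: 'ubuntu:latest',
        ttyEnabled: true,
        envVars: [
            containerEnvVar(key: 'filePath', value: '/home/abcde/abc')
        ]
    )
]) {
    node('kubernetes') {
        stage('build') {
            container('jnlp') {
                sh 'echo "$filePath"'   // the env var is visible inside the container
            }
        }
    }
}
```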
This is how I got this to work. Be careful with the YAML syntax: YAML doesn't like tabs.
pipeline {
    agent any
    parameters {
        string(name: 'Abc', defaultValue: 'origin', description: 'The Gitlab project name')
    }
    options {
        timestamps()
        timeout(60)
    }
    stages {
        stage('Build') {
            parallel {
                stage("build") {
                    agent {
                        kubernetes {
                            label 'label'
                            defaultContainer 'jnlp'
                            yaml """
apiVersion: v1
kind: Pod
metadata:
  labels:
    some-label: label
spec:
  containers:
  - name: jnlp
    image: ubuntu:latest
    tty: true
    env:
    - name: 'filePATH'
      value: 'fileValue'
"""
                        }
                    }
                    steps {
                        container('jnlp') {
                            timeout(60) {
                                // build process
                            }
                        }
                    }
                }
            }
        }
    }
    post {
        success {
            sh "success"
        }
        failure {
            sh "failed"
        }
        unstable {
            sh "unstable"
        }
    }
}
Related
properties([gitLabConnection(gitLabConnection: 'GitLab Connection', jobCredentialId: ''), [$class: 'GitlabLogoProperty', repositoryName: ''], parameters([extendedChoice(multiSelectDelimiter: ',', name: 'choice', quoteValue: false, saveJSONParameterToFile: false, type: 'PT_CHECKBOX', value: 'mongo, mysql', visibleItemCount: 10)])])
pipeline {
    agent any
    stages {
        stage('mongo') {
            when {
                expression { choice == 'mongo' }
            }
            steps {
                echo "${params.choice}"
            }
        }
        stage('mysql') {
            when {
                expression { choice == 'mysql' }
            }
            steps {
                echo "${params.choice}"
            }
        }
    }
}
When I select both the mongo and mysql checkboxes, both stages should run, but both the mongo and mysql stages are skipped.
I couldn't get the extendedChoice parameter set up in the declarative pipeline. Instead, I used boolean parameters. Please refer to the following for your use case. Note that I have declared a global variable named choice.
def choice = ""
pipeline {
    agent any
    stages {
        stage("Get details") {
            steps {
                timeout(time: 300, unit: 'SECONDS') {
                    script {
                        // Select the product image
                        choice = input message: 'Please select the product', ok: 'Build',
                            parameters: [
                                booleanParam(defaultValue: false, name: 'mongo'),
                                booleanParam(defaultValue: false, name: 'mysql')]
                    }
                }
            }
        }
        stage('Echo') {
            steps {
                script {
                    echo "::: Product : ${choice}"
                }
            }
        }
        stage('mongo') {
            when {
                expression { return choice.get("mongo") }
            }
            steps {
                echo "MONGO"
            }
        }
        stage('mysql') {
            when {
                expression { return choice.get("mysql") }
            }
            steps {
                echo "MYSQL"
            }
        }
    }
    post {
        success {
            echo 'The process is successfully Completed....'
        }
    }
}
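For completeness: if the extendedChoice checkbox parameter is kept, a PT_CHECKBOX selection arrives as a single string joined by multiSelectDelimiter (e.g. "mongo,mysql" when both boxes are ticked), so a plain equality check can never match a multi-selection. A sketch of a membership test instead (assuming the parameter is still named choice):

```groovy
stage('mongo') {
    when {
        // "mongo,mysql" is the raw value when both boxes are ticked;
        // split on the delimiter and test membership instead of equality
        expression { params.choice.tokenize(',').contains('mongo') }
    }
    steps {
        echo "${params.choice}"
    }
}
```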
I have this configuration in my pipeline job
def k8sTestPodTemplate(docker_image) {
    return """
apiVersion: v1
kind: Pod
metadata:
  name: my-agent
  labels:
    name: my-agent
spec:
  serviceAccountName: jenkins
  containers:
  - name: python
    image: ${docker_image}
    command: ["/bin/bash", "-c", "cat"]
    tty: true
"""
}
pipeline {
    agent none
    stages {
        stage('Run tests') {
            parallel {
                stage('Tests Python 3.5') {
                    agent {
                        kubernetes {
                            defaultContainer 'jnlp'
                            yaml k8sTestPodTemplate('python:3.5')
                        }
                    }
                    steps {
                        container('python') {
                            sh "echo 'Hello from Python 3.5'"
                        }
                    }
                }
                stage('Tests Python 3.6') {
                    agent {
                        kubernetes {
                            defaultContainer 'jnlp'
                            yaml k8sTestPodTemplate('python:3.6')
                        }
                    }
                    steps {
                        container('python') {
                            sh "echo 'Hello from Python 3.6'"
                        }
                    }
                }
                stage('Tests Python 3.7') {
                    agent {
                        kubernetes {
                            defaultContainer 'jnlp'
                            yaml k8sTestPodTemplate('python:3.7')
                        }
                    }
                    steps {
                        container('python') {
                            sh "echo 'Hello from Python 3.7'"
                        }
                    }
                }
            }
        }
    }
}
But as you can see, I could easily improve this code to something like this:
def k8sTestPodTemplate(docker_image) {
    return """
apiVersion: v1
kind: Pod
metadata:
  name: my-agent
  labels:
    name: my-agent
spec:
  serviceAccountName: jenkins
  containers:
  - name: python
    image: ${docker_image}
    command: ["/bin/bash", "-c", "cat"]
    tty: true
"""
}

def generateStage(docker_image) {
    return {
        stage("Tests ${docker_image}") {
            agent {
                kubernetes {
                    defaultContainer 'jnlp'
                    yaml k8sTestPodTemplate("${docker_image}")
                }
            }
            steps {
                container('python') {
                    sh "echo ${docker_image}"
                }
            }
        }
    }
}

pipeline {
    agent none
    stages {
        stage('Run tests') {
            parallel {
                generateStage("python:3.5")
                generateStage("python:3.6")
                generateStage("python:3.7")
            }
        }
    }
}
But I cannot get this to work. The problem is that Jenkins raises the error:
No such DSL method 'agent' found among steps
I am using the "agent" directive inside the dynamically generated stages, and the agent is being created dynamically.
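One common workaround (a sketch, not the only option) is to drop to scripted syntax for the dynamic part: build a map of stage name to closure and hand it to the parallel step, requesting the Kubernetes pod with podTemplate inside each closure instead of the declarative agent directive:

```groovy
// Scripted-style sketch; k8sTestPodTemplate is the same YAML helper as above.
// POD_LABEL is the label the kubernetes plugin generates inside podTemplate
// (available in recent plugin versions).
def generateStage(docker_image) {
    return {
        podTemplate(yaml: k8sTestPodTemplate(docker_image)) {
            node(POD_LABEL) {
                stage("Tests ${docker_image}") {
                    container('python') {
                        sh "echo ${docker_image}"
                    }
                }
            }
        }
    }
}

def branches = [:]
['python:3.5', 'python:3.6', 'python:3.7'].each { img ->
    branches["Tests ${img}"] = generateStage(img)
}
parallel branches
```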
I have several microservices which use the same pipeline from a shared library named jenkins-shared-pipelines. The Jenkinsfile for a microservice is as follows:
@Library(['jenkins-shared-pipelines']) _
gradleProjectPrPipeline([buildAgent: 'oc-docker-jdk11', disableIntegrationTestStage: true])
In jenkins-shared-pipelines/vars, the gradleProjectPrPipeline has the following stages:
/**
 * gradleProjectPrPipeline is a generic pipeline
 * @param pipelineProperties map used to pass parameters
 * @return
 */
void call(Map pipelineProperties = [:]) {
    .
    .
    .
    pipeline {
        agent {
            node {
                label "${pipelineProperties.buildAgent}"
            }
        }
        options {
            skipDefaultCheckout true
            timeout(time: 1, unit: 'HOURS')
            buildDiscarder(logRotator(
                numToKeepStr: '5',
                daysToKeepStr: '7',
                artifactNumToKeepStr: '1',
                artifactDaysToKeepStr: '7'
            ))
        }
        stages {
            stage('Clone') {
                steps {
                    //clone step
                }
            }
            stage('Compile') {
                steps {
                    script {
                        /*Some custom logic*/
                    }
                    runGradleTask([task: 'assemble',
                        rawArgs: defaultGradleArgs + " -Pcurrent_version=${releaseTag}"
                    ])
                }
            }
            stage('Tests') {
                parallel {
                    stage('Unit tests') {
                        steps {
                            //Unit tests
                        }
                    }
                    stage('Integration tests') {
                        steps {
                            //Integration tests
                        }
                    }
                }
            }
            stage('Sonar scan') {
                steps {
                    //Sonar scanning
                }
            }
        }
        post {
            unsuccessful {
                script {
                    bitbucketHandler.notifyBuildFail([
                        displayName: pipelineName,
                        displayMessage: "Build ${env.BUILD_ID} failed at ${env.BUILD_TIMESTAMP}."
                    ])
                }
            }
            success {
                script {
                    bitbucketHandler.notifyBuildSuccess([
                        displayName: pipelineName,
                        displayMessage: "Build ${env.BUILD_ID} completed at ${env.BUILD_TIMESTAMP}."
                    ])
                }
            }
        }
    }
}
Now, in addition to the above pipeline, there will be several more pipelines in jenkins-shared-pipelines (under the same vars directory), e.g. awsPipeline, azurePipeline, and so on, which will also incorporate the deployment stages. These additional pipelines will require all the stages in the above gradleProjectBranchWrapper and will also add a few of their own stages. Currently, we simply copy-paste these stages into these additional pipelines:
void call(Map pipelineProperties = [:]) {
    .
    .
    .
    pipeline {
        agent {
            node {
                label "${pipelineProperties.buildAgent}"
            }
        }
        options {
            skipDefaultCheckout true
            timeout(time: 1, unit: 'HOURS')
            buildDiscarder(logRotator(
                numToKeepStr: '5',
                daysToKeepStr: '7',
                artifactNumToKeepStr: '1',
                artifactDaysToKeepStr: '7'
            ))
        }
        stages {
            stage('Clone') {
                steps {
                    //clone step
                }
            }
            stage('Compile') {
                steps {
                    script {
                        /*Some custom logic*/
                    }
                    runGradleTask([task: 'assemble',
                        rawArgs: defaultGradleArgs + " -Pcurrent_version=${releaseTag}"
                    ])
                }
            }
            stage('Tests') {
                parallel {
                    stage('Unit tests') {
                        steps {
                            //Unit tests
                        }
                    }
                    stage('Integration tests') {
                        steps {
                            //Integration tests
                        }
                    }
                }
            }
            stage('Sonar scan') {
                steps {
                    //Sonar scanning
                }
            }
            stage('AWS') {
            }
        }
        post {
            unsuccessful {
                script {
                    bitbucketHandler.notifyBuildFail([
                        displayName: pipelineName,
                        displayMessage: "Build ${env.BUILD_ID} failed at ${env.BUILD_TIMESTAMP}."
                    ])
                }
            }
            success {
                script {
                    bitbucketHandler.notifyBuildSuccess([
                        displayName: pipelineName,
                        displayMessage: "Build ${env.BUILD_ID} completed at ${env.BUILD_TIMESTAMP}."
                    ])
                }
            }
        }
    }
}
Then we invoke these new pipelines from the microservices; for example:
@Library(['jenkins-shared-pipelines']) _
awsPipeline([buildAgent: 'oc-docker-jdk11', disableIntegrationTestStage: true])
Obviously, there is code redundancy: the Clone to Sonar scan stages are common, but there is no 'base pipeline' or other way to include these common stages in all the pipelines. I was wondering if there is a way to 'include' the gradleProjectPrPipeline (which could serve as a 'base pipeline') in pipelines like awsPipeline, azurePipeline, and so on.
Note:
The workspace (where the Clone stage checks out the code and later stages operate) will be used by awsPipeline etc. In other words, the variables and results from the gradleProjectBranchWrapper should be accessible to the awsPipeline etc.
There is a post block in the gradleProjectBranchWrapper; the other pipelines may have their own post blocks.
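One way to sketch this (hypothetical names throughout; declarative stages cannot be textually included, but a base pipeline can accept the extra behaviour as a closure) is to give the shared pipeline an optional hook stage that cloud-specific vars fill in:

```groovy
// vars/gradleProjectBasePipeline.groovy (hypothetical): the common stages,
// plus a hook stage that runs caller-supplied steps in the same workspace.
void call(Map pipelineProperties = [:], Closure extraSteps = null) {
    pipeline {
        agent { node { label "${pipelineProperties.buildAgent}" } }
        stages {
            stage('Clone')      { steps { /* clone step */ } }
            stage('Compile')    { steps { /* compile step */ } }
            stage('Sonar scan') { steps { /* Sonar scanning */ } }
            stage('Deploy') {
                when { expression { extraSteps != null } }
                steps { script { extraSteps() } }   // e.g. AWS-specific steps
            }
        }
        // common post block stays here
    }
}

// vars/awsPipeline.groovy (hypothetical): reuses the base and contributes
// only the AWS stage body; it sees the same workspace and variables.
void call(Map pipelineProperties = [:]) {
    gradleProjectBasePipeline(pipelineProperties) {
        echo 'AWS-specific deployment steps'
    }
}
```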
I have declared a variable TENTATIVE_VERSION in my script, and I need to define/modify it with the value coming from executing a script (or from the script itself in another stage). How can I do this? My current script is something like this:
pipeline {
    agent {
        label 'machine1'
    }
    stages {
        stage('Non-Parallel Stage') {
            agent { label "machine2" }
            steps {
                script {
                    TENTATIVE_VERSION = "1.0" // working
                    // TENTATIVE_VERSION = "sh echo 123" // not working
                }
            }
        }
        stage('Parallel Stage') {
            parallel {
                stage('A') {
                    agent { label 'machine3' }
                    steps {
                        echo "On other machine"
                        echo "${TENTATIVE_VERSION}"
                        build job: 'otherJob', parameters: [[$class: 'StringParameterValue', name: 'VERSION', value: "${TENTATIVE_VERSION}"],
                            [$class: 'StringParameterValue', name: 'RELEASE', value: '1']]
                    }
                }
                stage('B') {
                    agent { label "machine4" }
                    steps {
                        script {
                            STATUS_S = "OK"
                        }
                        echo "On a machine"
                    }
                }
                stage('C') {
                    agent { label "machine5" }
                    steps {
                        script {
                            STATUS_R = "OK"
                        }
                        echo "On a machine"
                    }
                }
            }
        }
    }
}
Try the following:
pipeline {
    agent {
        label 'machine1'
    }
    stages {
        stage('Non-Parallel Stage') {
            agent { label "machine2" }
            steps {
                script {
                    TENTATIVE_VERSION = sh(returnStdout: true, script: "echo 123").trim()
                }
            }
        }
    }
}
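It may be worth noting (an observation beyond the snippet above) that such an undeclared script-level variable lives in the Groovy binding of the whole run, so the value captured on machine2 is also visible in later stages that run on other agents, e.g.:

```groovy
stage('Use version') {
    agent { label 'machine3' }
    steps {
        // TENTATIVE_VERSION was set in the earlier stage's script block;
        // the binding is shared across stages and agents
        echo "Building version ${TENTATIVE_VERSION}"
    }
}
```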
I am trying to use the post section with the Jenkins Kubernetes plugin, but I get the following error. Does anyone have an idea?
java.lang.NoSuchMethodError: No such DSL method 'post' found among steps
My pipeline:
podTemplate(
    label: 'jenkins-pipeline',
    cloud: 'minikube',
    volumes: [
        hostPathVolume(mountPath: '/var/run/docker.sock', hostPath: '/var/run/docker.sock'),
    ]) {
    node('jenkins-pipeline') {
        stage('test') {
            container('maven') {
                println 'do some testing stuff'
            }
        }
        post {
            always {
                println "test"
            }
        }
    }
}
As of this writing, post is only supported in declarative pipelines.
You could have a look at their declarative example if you absolutely must use post:
pipeline {
    agent {
        kubernetes {
            //cloud 'kubernetes'
            label 'mypod'
            containerTemplate {
                name 'maven'
                image 'maven:3.3.9-jdk-8-alpine'
                ttyEnabled true
                command 'cat'
            }
        }
    }
    stages {
        stage('Run maven') {
            steps {
                container('maven') {
                    sh 'mvn -version'
                }
            }
        }
    }
}
This example shows how to use the post step using the Kubernetes plugin:
pipeline {
    agent {
        kubernetes {
            label "my-test-pipeline-${BUILD_NUMBER}"
            containerTemplate {
                name "my-container"
                image "alpine:3.15.0"
                command "sleep"
                args "99d"
            }
        }
    }
    stages {
        stage('Stage 1') {
            steps {
                container('my-container') {
                    sh '''
                        set -e
                        echo "Hello world!"
                        sleep 10
                        echo "I waited"
                        echo "forcing a fail"
                        exit 1
                    '''
                }
            }
        }
    }
    post {
        unsuccessful {
            container('my-container') {
                sh '''
                    set +e
                    echo "Cleaning up stuff here"
                '''
            }
        }
    }
}