Triggering vSphere build via Jenkins pipeline agent

My goal is to set up a declarative pipeline job which automatically triggers the vSphere plugin to create a VM on which the build and test runs in a clean environment.
I've configured the vSphere Cloud Plugin in Jenkins' global settings to build slaves with label "appliance-slave", and this does trigger for freestyle jobs with "Restrict where this project can be run" set to that label. However, the following example pipeline never triggers the vSphere plugin (based on tailing the Jenkins log):
pipeline {
    agent {
        label 'appliance-slave'
    }
    stages {
        stage('Test') {
            steps {
                sh "hostname && hostname -i"
            }
        }
    }
}
I've searched the documentation without any luck. Is there some configuration option or alternate agent declaration that I'm missing that would allow this?

Finally resolved the problem; the issue was that I needed to go into the actual slave configuration and set up the slave there. The vSphere plugin modifies the slave configuration page to allow exactly what I was trying to do: shutting down and reverting the VM once the build is complete.

Related

Block Jenkins pipeline build when another particular job is running

I am converting a Jenkins Freestyle build to a pure pipeline based job.
Current configuration uses https://plugins.jenkins.io/build-blocker-plugin/ as shown in the image.
How can I use it in a Declarative pipeline?
pipeline {
    agent { label 'docker-u20' }
    // Don't run while "test-job" is running
    blockon("test-job")
    stage {}
}
I did try looking through various Jenkins docs but haven't found anything yet.
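The build-blocker-plugin does not document a declarative-pipeline syntax, so the snippet above won't work as written. One common substitute (an assumption on my part, not a confirmed port of the plugin) is the Lockable Resources plugin: if both this job and "test-job" request the same named lock, whichever starts second waits. The resource name below is made up for illustration:
pipeline {
    agent { label 'docker-u20' }
    options {
        // "test-job" must request this same lock for the blocking to work
        lock resource: 'test-job-lock'
    }
    stages {
        stage('Build') {
            steps {
                echo 'Runs only while no other holder of test-job-lock is running'
            }
        }
    }
}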

How to run a job from a Jenkins Pipeline on the same executor (declarative syntax)

I want to use the Jenkins "PRQA" plugin, which seems not to have the option to use it from a pipeline. The plugin would run static code analysis and publish the results.
In my case, it requires some preparations that are already done in a pipeline job. Because of that, I want to include the job into that pipeline, but on the same executor with the data prepared by the pipeline, as some kind of inlined job step.
I have tried to create a job for the PRQA-Plugin-Step and execute this with the build step from the pipeline. But this tries to start the job on a new executor (and stalls because I have only one executor).
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo 'Prepare'
            }
        }
        stage('SCA') {
            steps {
                // Run this without using a new executor, with the environment that exists now
                build 'PRQA_Job'
            }
        }
    }
}
What is the correct way to run the job on the same executor, with the current working directory?
With build 'PRQA_Job' it's not possible to run the second job on the same executor (1 job = 1 executor), since the main job just waits for the triggered job to finish. But you can run the other job on the same agent, if it has more than one executor, to reach the workspace of the main job.
For test purposes, specify the agent name in both jobs: agent 'agent_name_here'
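A minimal sketch of that setup, assuming a node named agent_name_here configured with at least two executors; WORKSPACE_PATH is a hypothetical string parameter of PRQA_Job used to hand over the prepared workspace location:
// Main pipeline: pin to one agent and pass our workspace path along
pipeline {
    agent { label 'agent_name_here' }   // node needs >= 2 executors
    stages {
        stage('Prepare and trigger') {
            steps {
                echo 'Prepare'
                build job: 'PRQA_Job', parameters: [
                    string(name: 'WORKSPACE_PATH', value: "${env.WORKSPACE}")
                ]
            }
        }
    }
}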
If you want to use the functionality of a plugin which has no native pipeline support, you could try the "step: General Build step" feature for Jenkins Pipelines. You can use the Pipeline Syntax wizard linked in the job configuration window to generate the needed pipeline description.
If the plugin does not show up in the "step: General Build step" part of Jenkins, you can use a separate job. To copy all the needed files/data into this second job, you will need to use the Archive Artifact/Copy Artifact functionality of Jenkins to save files from your pipeline build.
For more information on how to use Archive Artifact/Copy Artifact see https://plugins.jenkins.io/copyartifact/ and
https://www.jenkins.io/doc/pipeline/tour/tests-and-artifacts/
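A minimal sketch of that artifact hand-off, assuming the Copy Artifact plugin is installed; the job name and file pattern are placeholders:
// In the pipeline build: save the prepared files
archiveArtifacts artifacts: 'prepared-data/**', fingerprint: true

// In the second (plugin) job: pull them back in
copyArtifacts projectName: 'my-pipeline-job', selector: lastSuccessful()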

Jenkins Pipeline - Restrict where this project can be run

I am using Jenkins Pipeline (Jenkins version v2.73.2, Jenkins Pipeline 2.5) and want to force my job to use only two Windows slave machines, 'Jenkins-s1' or 'Jenkins-s2', whichever is available. To do this, I know that in legacy Jenkins we have the option "Restrict where this project can be run" on the configure page.
Upon googling, with Jenkins pipeline 2, we can mention in the pipeline script as
node('Jenkins-s1||Jenkins-s2') {
    stage ('Checkout') {
        ............ }
    stage ('Build') {
        ............ }
    stage ('Deploy') {
        ............ }
}
This seems to hard-code the node names, and the same has to be repeated in every child job. Therefore, I am looking for suggestions: is there any plugin available for Jenkins Pipeline 2 which gives me the "Restrict where this project can be run" option on the configure page, or is there any other way I can handle this scenario?
Please share your inputs.
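For what it's worth, the same label expression also works in declarative syntax via the agent directive, so the expression at least only appears once per Jenkinsfile; a minimal sketch:
pipeline {
    // Equivalent of node('Jenkins-s1||Jenkins-s2') in declarative form
    agent { label 'Jenkins-s1 || Jenkins-s2' }
    stages {
        stage('Checkout') {
            steps {
                echo 'checkout steps here'
            }
        }
    }
}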

Jenkins Pipeline: Are agents required to utilize Jenkinsfile?

I am investigating the use of Jenkins Pipeline (specifically using Jenkinsfile). The context of my implementation is that I'm deploying a Jenkins instance using Chef. Part of this deployment may include some seed jobs, which will pull job configurations from source control (Jenkinsfile), to automate creation of our build jobs via Chef.
I've investigated the Jenkins documentation for both Pipeline and Jenkinsfile, and it seems to me that, in order to use Jenkins Pipeline, agents are required to be configured and set up in addition to the Jenkins master.
Am I understanding this correctly? Must Jenkins agents exist in order to use Jenkins Pipeline's Jenkinsfile? This specific line in the Jenkinsfile documentation leads me to believe this to be true:
Jenkinsfile (Declarative Pipeline)
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo 'Building..'
            }
        }
        stage('Test') {
            steps {
                echo 'Testing..'
            }
        }
        stage('Deploy') {
            steps {
                echo 'Deploying....'
            }
        }
    }
}
The Declarative Pipeline example above contains the minimum necessary
structure to implement a continuous delivery pipeline. The agent
directive, which is required, instructs Jenkins to allocate an
executor and workspace for the Pipeline.
Thanks in advance for any Jenkins guidance!
The 'agent' part of the pipeline is required; however, this does not mean that you are required to have an external agent in addition to your master. If all you have is the master, this pipeline will execute on the master. If you have additional agents available, the pipeline will execute on whichever agent happens to be available when you run the pipeline.
If you go into Manage Jenkins -> Manage Nodes and Clouds, you can see that 'Master' itself is treated as one of the default nodes. With the declarative format, agent any indicates any available agent, which includes 'Master' as well (see the node configuration).
If you configure a new node, it can then be treated as a new agent in the pipeline, and agent any can be replaced by agent { label 'Node_Name' }.
You may refer to this link, which briefly explains the difference between an agent, a node, and a slave.
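A minimal sketch of that substitution, with Node_Name standing in for whatever label your new node carries:
pipeline {
    // Run only on the node (or nodes) carrying this label
    agent { label 'Node_Name' }
    stages {
        stage('Build') {
            steps {
                echo 'Runs on Node_Name'
            }
        }
    }
}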

Jenkins Pipeline as Code with Docker Error

For one of my projects that I have on GitHub, I wanted to build it as a Docker image and push it to my Docker Hub. The project is an sbt one with a Scala codebase.
Here is how my Jenkinsfile is defined:
#!groovy
node {
    // set this in Jenkins server under Manage Jenkins > Credentials > System > Global Credentials
    docker.withRegistry('https://hub.docker.com/', 'joesan-docker-hub-credentials') {
        git credentialsId: '630bd271-01e7-48c3-bc5f-5df059c1abb8', url: 'https://github.com/joesan/monix-samples.git'
        sh "git rev-parse HEAD > .git/commit-id"
        def commit_id = readFile('.git/commit-id').trim()
        println commit_id
        // declared outside the stages so the publish stage can see it
        def app
        stage('build') {
            app = docker.build "Monix-Sample"
        }
        stage('publish') {
            app.push 'master'
            app.push "${commit_id}"
        }
    }
}
When I tried to run this from my Jenkins server, I get the following error:
java.io.FileNotFoundException
at jenkins.plugins.git.GitSCMFile$3.invoke(GitSCMFile.java:167)
at jenkins.plugins.git.GitSCMFile$3.invoke(GitSCMFile.java:159)
at jenkins.plugins.git.GitSCMFileSystem$3.invoke(GitSCMFileSystem.java:161)
at org.jenkinsci.plugins.gitclient.AbstractGitAPIImpl.withRepository(AbstractGitAPIImpl.java:29)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.withRepository(CliGitAPIImpl.java:65)
at jenkins.plugins.git.GitSCMFileSystem.invoke(GitSCMFileSystem.java:157)
at jenkins.plugins.git.GitSCMFile.content(GitSCMFile.java:159)
at jenkins.scm.api.SCMFile.contentAsString(SCMFile.java:338)
at org.jenkinsci.plugins.workflow.cps.CpsScmFlowDefinition.create(CpsScmFlowDefinition.java:101)
at org.jenkinsci.plugins.workflow.cps.CpsScmFlowDefinition.create(CpsScmFlowDefinition.java:59)
at org.jenkinsci.plugins.workflow.job.WorkflowRun.run(WorkflowRun.java:232)
at hudson.model.ResourceController.execute(ResourceController.java:98)
at hudson.model.Executor.run(Executor.java:404)
Finished: FAILURE
Since this is running inside a VM on Azure, I thought the VM was not able to reach outside, but that seems not to be the case as I was able to ssh into the VM and git pull from the Git repo. So what is the problem here? How could I make this work?
For me, unchecking "Lightweight checkout" fixed the issue.
I experienced the exact same error. My setting:
Pipeline build inside a dockerized Jenkins (version 2.32.3)
In the configuration of the job, I specified a check out into a subdirectory: Open the configuration, e.g. https://myJenkins/job/my-job/configure. At the bottom, see section Pipeline -> Additional Behaviours -> Check out into a sub-directory with Local subdirectory for repo set to, e.g., my-sub-dir.
Expectation: Upon check out, the Jenkinsfile ends up in my-sub-dir/Jenkinsfile.
Via the option Script path, you configure the location of the Jenkinsfile so that Jenkins can start the build. I put my-sub-dir/Jenkinsfile as value.
I then received the exception you pasted in your question. I fixed it by setting Script Path to Jenkinsfile. If you don't specify a sub-directory for checkout, still try double-checking the value of Script Path.
Note: I have another Jenkins instance at work. There I have to specify Script Path including the custom check out sub-directory (as mentioned in Expectation above).
Go to Job --> Configure --> Pipeline and uncheck the "Lightweight checkout" checkbox.
Lightweight checkout: if selected, try to obtain the Pipeline script contents directly from the SCM without performing a full checkout. The advantage of this mode is its efficiency; however, you will not get any changelogs or polling based on the SCM. (If you use checkout scm during the build, this will populate the changelog and initialize polling.) Also, build parameters will not be substituted into the SCM configuration in this mode. Only selected SCM plugins support this mode.
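As that help text notes, adding a checkout scm step inside the build restores the changelog and polling even when lightweight checkout stays enabled; a minimal sketch (the build command is a placeholder):
node {
    // Re-check-out the SCM the Pipeline script came from,
    // populating the changelog and initializing polling
    checkout scm
    sh './build.sh'
}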
