Use code to build a Jenkins job with a new Jenkins pipeline Groovy script

I have a Jenkins pipeline named TEST-PIPELINE, and I changed its Jenkinsfile (the pipeline config) locally, like:
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo 'Building..'
            }
        }
        stage('Test') {
            steps {
                echo 'Testing..'
            }
        }
        stage('Deploy') {
            steps {
                echo 'Deploying....'
            }
        }
    }
}
I want to trigger a new build with the modified script above.
How can I trigger a job with new pipeline code using Python or some other code, instead of the web page? I'd like to automatically test the modified Jenkinsfile, so I need code that triggers the job with the new file. Thanks!

If you are using a multibranch pipeline, you can configure the build triggers for the job accordingly. Then, when a new change is pushed to the branch, a build will be triggered.
PS: webhooks should also be defined; see the documentation.

I use the Generic Webhook Trigger plugin to run my pipeline when there is a new commit in the repository.
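
For the "trigger from code" part of the question, here is a minimal sketch using the python-jenkins library; the Jenkins URL, the credentials, and the exact layout of the job's config.xml are assumptions you will need to adapt:

import jenkins
import re

# New pipeline script to push into the existing TEST-PIPELINE job.
NEW_PIPELINE_SCRIPT = '''
pipeline {
    agent any
    stages {
        stage('Build') { steps { echo 'Building..' } }
        stage('Test')  { steps { echo 'Testing..' } }
    }
}
'''

# Placeholder URL and credentials; prefer an API token over a password.
server = jenkins.Jenkins('http://localhost:8080',
                         username='admin', password='api-token')

# For a "Pipeline script" job, the script is embedded in the job's
# config.xml inside <definition>...<script>...</script>.
config = server.get_job_config('TEST-PIPELINE')

# Crude string replacement; a production script should use an XML
# parser and escape the script body properly.
config = re.sub(r'<script>.*?</script>',
                '<script>{}</script>'.format(NEW_PIPELINE_SCRIPT),
                config, flags=re.DOTALL)
server.reconfig_job('TEST-PIPELINE', config)

# Finally, trigger a build of the updated job.
server.build_job('TEST-PIPELINE')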

Question on Jenkins and Bitbucket using a pipeline with a pipeline script, but also running when new data is pushed to Bitbucket

I have created an item using Pipeline, and then in the pipeline selected the pipeline script. This allows me to run the build in stages, as below:
pipeline {
    agent any
    tools {
        terraform 'terraform-11'
    }
    stages {
        stage('Git Checkout terraform') {
            steps {
                git credentialsId: '********', url: 'https://******/********.git'
            }
        }
        stage('Terraform Init') {
            steps {
                sh 'terraform init'
            }
        }
        stage('Terraform A') {
            steps {
                dir('dev') {
                    sh 'terraform plan -var-file="terraform.tfvars"'
                    sh 'terraform apply -auto-approve'
                }
            }
        }
        stage('Terraform B') {
            steps {
                dir('env') {
                    sh 'terraform plan -var-file="terraform.tfvars"'
                    sh 'terraform apply -auto-approve'
                }
            }
        }
    }
}
This works very well: I check the code out and run a series of stages (there are more stages than shown here). What I would like is for the Jenkins build to run every time the Terraform scripts are updated. I have looked at examples, but none of them apply to a Pipeline job with a pipeline script.
There is the Freestyle project, but it does not allow me to build all the stages I need.
There is Pipeline with "Pipeline script from SCM", which again does not allow me to build all the stages I need.
What I want is to stick with my current pipeline but set it up so it runs when scripts are pushed to Bitbucket. All I need is a pointer to the right documentation, if this is possible. If it's not possible, then I will go back to the drawing board.
I worked out the solution. I set up an item that is a folder and set up the Git repo in it. Then I created a Jenkinsfile with all the stages and steps and uploaded it to the repo being built. The build then runs the main item, which pulls in the Jenkinsfile and runs it.

Checkout and run SCM pipeline only on master node

I coded a generic pipeline which accepts several parameters in order to deploy releases from a pre-defined GitHub repository to specific nodes. I wanted to host this pipeline in a Jenkinsfile on GitHub, so I configured the job to work with a "Pipeline script from SCM". The fact is, when I try to build the job, the Jenkinsfile gets checked out on every node. Is it possible to check out and execute the Jenkinsfile only on, say, the master node and run the pipeline as intended?
EDIT: As I stated before, the pipeline works just fine and as intended when the job is set to work with a pipeline script. The thing is, when I change it to a "Pipeline script from SCM", the Jenkinsfile gets checked out on every agent, which is a problem since I don't have Git installed on any agent other than master. I want the Jenkinsfile to be checked out only on the master agent and executed as intended. FYI, the pipeline is below:
def agents = "$AGENTS".toString()
def agentLabel = "${ println 'Agents: ' + agents; return agents; }"
// declared up front so reading it in the 'when' expressions cannot fail
def skipBuild = false

pipeline {
    agent none
    stages {
        stage('Prep') {
            steps {
                script {
                    if (agents == null || agents == "") {
                        println "Skipping build"
                        skipBuild = true
                    }
                    if (!skipBuild) {
                        println "Agents set for this build: " + agents
                    }
                }
            }
        }
        stage('Powershell deploy script checkout') {
            agent { label 'master' }
            when {
                expression {
                    !skipBuild
                }
            }
            steps {
                git url: 'https://github.com/owner/repo.git', credentialsId: 'git-credentials', branch: 'main'
                stash includes: 'deploy-script.ps1', name: 'deploy-script'
            }
        }
        stage('Deploy') {
            agent { label agentLabel }
            when {
                expression {
                    !skipBuild
                }
            }
            steps {
                unstash 'deploy-script'
                script {
                    println "Execute powershell deploy script on agents set for deploy"
                }
            }
        }
    }
}
I think that skipDefaultCheckout is what you are looking for:
pipeline {
    agent any
    options {
        skipDefaultCheckout true
    }
    stages {
        stage('Prep') {
            steps {
                script {
                    ........................
                }
            }
        }
    }
}
Take a look at the documentation (https://www.jenkins.io/doc/book/pipeline/syntax/):
skipDefaultCheckout
Skip checking out code from source control by default in the agent directive.
I think you are requesting the impossible.
Now: your Jenkinsfile is inside your Jenkins configuration and is sent as such to each of your agents, so there is no need for Git on your agents.
Pipeline script from SCM: since you use Git, SCM = Git. So you are saying: my pipeline needs to be fetched from a Git repository. You declare the Deploy stage to run on agent { label agentLabel }, so that stage is supposed to run on an agent other than master. How would you imagine that agent could get the content of the Jenkinsfile to know what to do without using Git?
What happens in Jenkins?
Your master agent gets triggered that it needs to build.
The master agent checks out the Jenkinsfile using Git (since it is a Pipeline script from SCM).
Jenkins reads the Jenkinsfile and sees what has to be done.
For the Prep stage, I'm not quite sure what happens without an agent; I guess it runs on the master agent.
The 'Powershell deploy script checkout' stage is marked to run on the master agent, so it runs there. Note that the Jenkinsfile will get checked out with Git two more times: once before starting the stage, because Jenkins needs to know what to execute, and once more because you specify git url: 'https://github.com/owner/repo.git'.
The Deploy stage is marked to run on agentLabel, so Jenkins tries to check out your Jenkinsfile on that agent (using Git).
You can use a scripted pipeline to do this; it should basically look like this:
node('master') {
    // check out the repository (and the deploy script) only on master
    checkout scm
    stash includes: 'deploy-script.ps1', name: 'deploy-script'
}

def stepsForParallel = [:]
env.AGENTS.split(' ').each { agent ->
    stepsForParallel["deploy ${agent}"] = { ->
        node(agent) {
            unstash 'deploy-script'
        }
    }
}
parallel stepsForParallel
You can find all the info about the Jenkins agent section in the pipeline syntax documentation (https://www.jenkins.io/doc/book/pipeline/syntax/).
In short: you can target any agent by name or label.
pipeline {
    agent {
        label 'master'
    }
}
If that does not work for you, set a label on the master node and target it by that label:
pipeline {
    agent {
        label 'master_label_here'
    }
}

Is it Possible to Run a Jenkinsfile from a Jenkinsfile?

Currently we are developing a centralized control system for our CI/CD projects. There are many projects with many branches, so we are using multibranch pipelines (this forces us to use the Jenkinsfile from the project branches, so we can't provide a custom Jenkinsfile the way Pipeline projects can). We want to control everything under one Git repo, where for every project there are Kubernetes YAMLs, a Dockerfile and a Jenkinsfile. When a developer presses the build button, the Jenkinsfile from their project repo is supposed to run our Jenkinsfile. Is it possible to do this?
E.g.:
pipeline {
    agent any
    stages {
        stage('Retrieve Jenkinsfile From Repo') { // RETRIEVE JENKINSFILE FROM REPO
            steps {
                git branch: "master",
                    credentialsId: 'gitlab_credentials',
                    url: "jenkinsfile_repo"
                script {
                    // RUN JENKINSFILE FROM THE REPO
                }
            }
        }
    }
}
The main reason we are doing this is that there is sensitive content in the Jenkinsfile, like production database connections. We don't want to store the Jenkinsfile in the developers' repos. You can also suggest a better way to achieve this besides using only one repo.
EDIT: https://plugins.jenkins.io/remote-file/
This plugin solved all my problems. I couldn't try the suggestions below.
As an option, you can use the pipeline build step:
pipeline {
    agent any
    stages {
        stage ('build another job') {
            steps {
                build 'second_job_name_here'
            }
        }
    }
}
Try the load step:
script {
    // rename the Jenkinsfile so it can be loaded as a Groovy script
    sh 'mv Jenkinsfile Jenkinsfile.groovy'
    // run the Jenkinsfile from the repo
    load 'Jenkinsfile.groovy'
}

Is there a way for a Jenkins Pipeline, in a Multibranch setup, to automatically check out the branch that is at the latest revision?

I'm trying to configure a job in a Jenkins multibranch pipeline. There are a lot of branches in SVN and I want the job to check out only the latest one and ignore the rest. This job triggers a pipeline that does multiple checks on the whole build, so I always need to trigger it on the latest branch, because that is where I will have the latest revision of the build.
The SVN structure is like this: V01_01_01 up to the latest one, V01_08_03. Currently I have it set up like below, and in the Jenkins pipeline I have "checkout scm", but if a new branch appears, e.g. V01_08_04, I need V01_08_03 to be replaced by V01_08_04. Is there any way to do this?
[Screenshot: my set-up in the Jenkins multibranch pipeline]
I found a workaround: I created a Python script that checks the whole repository for the folder that was updated last.
pipeline {
    agent any
    parameters {
        string(name: 'latest_folder', defaultValue: '')
    }
    stages {
        stage ('find latest folder') {
            steps {
                withPythonEnv('System-CPython-3.8') {
                    sh 'pip3 install svn'
                    script {
                        def folder_name = sh(script: 'python3 latest_folder_svn.py', returnStdout: true)
                        env.latest_folder = folder_name
                    }
                }
            }
        }
        stage ('Checkout Step') {
            steps {
                echo "${env.latest_folder}"
            }
        }
    }
}
I then use this variable in the checkout step so that I always have the latest branch.
The Python script is pretty straightforward: I use the svn library to parse the repository and extract what I need.
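For reference, a sketch of what such a latest_folder_svn.py could look like, using the svn package (pip3 install svn); the repository URL is an assumption, and since the branch names in the question are zero-padded (V01_01_01 up to V01_08_03), a plain name sort is enough to find the newest one:

import svn.remote

# Placeholder URL; point this at the directory that holds the branches.
REPO_URL = 'https://svn.example.com/repo/branches'

client = svn.remote.RemoteClient(REPO_URL)

# list() yields the entry names; directories come back with a trailing '/'.
branches = [name.rstrip('/') for name in client.list()]

# Zero-padded names like V01_08_03 sort correctly as plain strings.
# (To sort by last change instead, client.list(extended=True) exposes
# per-entry commit metadata; verify the exact keys for your version.)
latest = max(branches)

# Print without a trailing newline so the pipeline's returnStdout
# capture needs no extra trimming.
print(latest, end='')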

How to create a post-build script for all Jenkins jobs

Is there a way to create a post-build script for all Jenkins jobs? Some script that is shared across jobs? I would like to avoid manually creating a post-build script for each job if possible.
AFAIK there is no job that will always run after every other job. You can emulate that by creating a new job and then either configuring a post-build trigger on all your jobs to run the new one, or configuring a build trigger in the new job to run after all the jobs you specify (a sketch of automating the first option is at the end of this answer).
However, if all your jobs are pipelines and you have a shared library, you can create a step that is actually a pipeline with a built-in post section. For example, consider a step called postPipeline.groovy:
def call(Closure body) {
    pipeline {
        agent any
        stages {
            stage('Run pipeline') {
                steps {
                    script {
                        body()
                    }
                }
            }
        }
        post {
            always {
                << routine post actions go here >>
            }
        }
    }
}
By changing all the pipelines to use this step, you ensure they all run the post script:
postPipeline {
    stage('Pipeline stage') {
        << code >>
    }
    ...
}
Still, either way some manual work is involved.
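
As for automating the first option, here is a hedged sketch that uses the python-jenkins library to append a downstream trigger to every freestyle job, so that each one kicks off a shared post-actions job; the job name, URL, credentials and config.xml handling are assumptions, so try it on a copy of a job first:

import jenkins

# Downstream-trigger publisher to append to each job's config.xml.
# 'post-actions' is the hypothetical shared job holding the post script.
TRIGGER_XML = (
    '<hudson.tasks.BuildTrigger>'
    '<childProjects>post-actions</childProjects>'
    '<threshold><name>SUCCESS</name><ordinal>0</ordinal>'
    '<color>BLUE</color></threshold>'
    '</hudson.tasks.BuildTrigger>'
)

server = jenkins.Jenkins('http://localhost:8080',
                         username='admin', password='api-token')

for job in server.get_jobs():
    name = job['name']
    if name == 'post-actions':
        continue
    config = server.get_job_config(name)
    # Crude insertion before </publishers> (freestyle configs only);
    # skip jobs that already have a downstream trigger configured.
    if '</publishers>' in config and 'BuildTrigger' not in config:
        config = config.replace('</publishers>',
                                TRIGGER_XML + '</publishers>')
        server.reconfig_job(name, config)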
