I am trying to print/list all of our Jenkins jobs (Freestyle and Pipeline) along with their SCM details (Git URL and branch) using the Groovy below. So far I am able to list the freestyle and scripted pipeline job names separately.
import jenkins.model.*
import hudson.model.*
import hudson.triggers.*
import org.jenkinsci.plugins.workflow.job.*
println("--- Jenkins Pipeline jobs List ---")
Jenkins.getInstance().getAllItems(WorkflowJob.class).each() { println(it.fullName) };
println("\n--- Jenkins FreeStyle jobs List ---")
Jenkins.getInstance().getAllItems(FreeStyleProject.class).each() { println(it.fullName) };
println '\nDone.'
With the Groovy code below I am able to print the freestyle and pipeline Git URLs, but they are printed on their own, separately from the job names.
Jenkins.instance.getAllItems(hudson.model.AbstractProject.class).each { it ->
    scm = it.getScm()
    if (scm instanceof hudson.plugins.git.GitSCM) {
        println scm.getUserRemoteConfigs()[0].getUrl()
    }
}
println "Done"
I need help listing/printing both the job name and the Git URL together for each job.
If I run this code in the Jenkins script console, the output includes DSL/scripted pipeline (Jenkinsfile) jobs as well:
import org.jenkinsci.plugins.workflow.job.WorkflowJob;

def printScm(project, scm) {
    if (scm instanceof hudson.plugins.git.GitSCM) {
        scm.getRepositories().each {
            it.getURIs().each {
                println(project + "\t" + it.toString());
            }
        }
    }
}

Jenkins.instance.getAllItems(Job.class).each {
    project = it.getFullName()
    if (it instanceof AbstractProject) {
        printScm(project, it.getScm())
    } else if (it instanceof WorkflowJob) {
        it.getSCMs().each {
            printScm(project, it)
        }
    } else {
        println("project type unknown: " + it)
    }
}
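The original question also asked for branch details. A small variation of printScm (a sketch only, relying on GitSCM.getBranches(), which returns the configured BranchSpec entries) could print the branch specs next to each URL; it can be dropped into the same loop in place of printScm:
def printScmWithBranches(project, scm) {
    if (scm instanceof hudson.plugins.git.GitSCM) {
        // e.g. "*/master, */develop"
        def branches = scm.getBranches().collect { it.getName() }.join(', ')
        scm.getUserRemoteConfigs().each { remote ->
            println(project + "\t" + remote.getUrl() + "\t" + branches)
        }
    }
}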
I already wrote an example Jenkinsfile to check out, build and deploy a single project. Is there a way to do all of this for multiple projects in different Git repositories at the same time, using just one Jenkinsfile? I know I can set these projects up as independent jobs and use a Jenkinsfile to call them, but I'm wondering if I can do this without independent jobs.
Thanks.
You can make use of the Job DSL plugin to achieve this.
The Jenkins Job DSL API will help you write DSL scripts; it documents all the built-in DSL methods you need to construct jobs.
Example pipeline script:
pipeline {
    agent any
    stages {
        stage('Job1') {
            steps {
                // Pipeline job
                jobDsl scriptText: '''pipelineJob("$job1") {
                    definition {
                        cpsScm {
                            scm {
                                git {
                                    remote {
                                        name('origin')
                                        url('https://github.com/satta19/user-node.git')
                                        credentials('git2-cred')
                                    }
                                    branch('master')
                                }
                            }
                            scriptPath('Jenkinsfile')
                        }
                    }
                }'''
            }
        }
        stage('Job2') {
            steps {
                // Freestyle job
                jobDsl scriptText: '''job("$job2") {
                    steps {
                        shell('echo Hello World!')
                    }
                }'''
            }
        }
    }
}
Note: I have taken the job names as string parameters, i.e. $job1 and $job2, in the above example pipeline script.
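For completeness, here is a minimal sketch of how job1 and job2 could be declared as string parameters of that pipeline (the parameter names are the ones referenced above; the default values are placeholders). Parameters declared this way are available as params.job1 / params.job2 in the pipeline and also as build environment variables:
pipeline {
    agent any
    parameters {
        string(name: 'job1', defaultValue: 'generated-pipeline-job', description: 'Name of the pipeline job to create')
        string(name: 'job2', defaultValue: 'generated-freestyle-job', description: 'Name of the freestyle job to create')
    }
    stages {
        stage('Show parameters') {
            steps {
                echo "Jobs to create: ${params.job1} and ${params.job2}"
            }
        }
    }
}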
I'm a beginner with Jenkins. I have set up Jenkins on a Linux server and I am creating a pipeline. In the Pipeline tab it asks for a pipeline script or a pipeline script from SCM. I am not using any SCM. I have written this sample pipeline script:
pipeline {
    agent {
        node {
            label ''
        }
    }
    stages {
        stage('check script') {
            steps {
                script {
                    println " Checking the bounce scripts of application"
                }
            }
        }
        stage('Bounce Application') {
            steps {
                script {
                    println " Bouncing the application"
                }
            }
        }
    }
}
I want to put the above pipeline script in a file on the Jenkins server and use/call it from there. Is this feasible?
Thanks for your help in advance.
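One option worth noting (not from the original thread) is the built-in load step of scripted pipelines, which evaluates a Groovy file at runtime. A rough sketch, where the file path, file name and method names are all assumptions:
// /var/lib/jenkins/scripts/bounce.groovy (path is an assumption), stored on the Jenkins server
def check() {
    echo 'Checking the bounce scripts of the application'
}

def bounce() {
    echo 'Bouncing the application'
}

return this
And the pipeline script entered in the job configuration:
node {
    def bounceLib = load '/var/lib/jenkins/scripts/bounce.groovy'
    stage('check script') {
        bounceLib.check()
    }
    stage('Bounce Application') {
        bounceLib.bounce()
    }
}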
Is it possible to get the Git SCM URL for a Jenkins job with Groovy in the Jenkins script console?
Yes it's possible:
item = Jenkins.instance.getItemByFullName("JOB_NAME")
println item.getScm().getUserRemoteConfigs()[0].getUrl()
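For a Pipeline job configured with "Pipeline script from SCM", a comparable sketch (assuming the job's definition is a CpsScmFlowDefinition; the job name is a placeholder):
import org.jenkinsci.plugins.workflow.job.WorkflowJob
import org.jenkinsci.plugins.workflow.cps.CpsScmFlowDefinition

def job = Jenkins.instance.getItemByFullName("PIPELINE_JOB_NAME", WorkflowJob)
def definition = job.getDefinition()
if (definition instanceof CpsScmFlowDefinition) {
    def scm = definition.getScm()
    if (scm instanceof hudson.plugins.git.GitSCM) {
        println scm.getUserRemoteConfigs()[0].getUrl()
    }
}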
If you want to iterate over all jobs that use Git, you can use the following script:
Jenkins.instance.getAllItems(hudson.model.AbstractProject.class).each { it ->
    scm = it.getScm()
    if (scm instanceof hudson.plugins.git.GitSCM) {
        println scm.getUserRemoteConfigs()[0].getUrl()
    }
}
println "Done"
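A small variation of the same loop (a sketch) that prints each job's full name next to its Git URL:
Jenkins.instance.getAllItems(hudson.model.AbstractProject.class).each { job ->
    def scm = job.getScm()
    if (scm instanceof hudson.plugins.git.GitSCM) {
        println job.fullName + "\t" + scm.getUserRemoteConfigs()[0].getUrl()
    }
}
println "Done"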
I have a Jenkins pipeline job with parameters (name, group, taskNumber).
I need to write a pipeline script which will call a groovy script (this one?: https://github.com/peterjenkins1/jenkins-scripts/blob/master/add-job.groovy)
I want to create a new job (named name_group_taskNumber) every time I build the main pipeline job.
I don't understand:
Where do I need to put my groovy script?
What should the pipeline script look like? Something like this?:
node {
    stage('Build') {
        def pipeline = load "CreateJob.groovy"
        pipeline.run()
    }
}
You can use and configure a shared library like this one (a Git repo): https://github.com/lvthillo/shared-library . You need to configure it in your Jenkins global configuration.
It contains a vars/ folder where you can manage pipelines and Groovy scripts, like my slackNotifier.groovy. That script is just a Groovy script that posts the build result to Slack.
In the Jenkins pipeline job we import our shared library:
@Library('name-of-shared-pipeline-library') _
mavenPipeline {
//define parameters
}
In the case above the pipeline itself also lives in the shared library, but this isn't necessary.
You can also write the pipeline in the job itself and only call the function from the shared library, like this:
This is the script in the shared library:
// vars/sayHello.groovy
def call(String name = 'human') {
echo "Hello, ${name}."
}
And in your pipeline:
library 'my-shared-library'
...
stage('stage name') {
    echo "output"
    sayHello('Peter')   // global vars from vars/ are called directly by name
}
...
EDIT:
In new declarative pipelines you can use:
pipeline {
    agent { node { label 'xxx' } }
    options {
        buildDiscarder(logRotator(numToKeepStr: '3', artifactNumToKeepStr: '1'))
    }
    stages {
        stage('test') {
            steps {
                sh 'echo "execute say hello script:"'
                sayHello("Peter")
            }
        }
    }
    post {
        always {
            cleanWs()
        }
    }
}

def sayHello(String name = 'human') {
    echo "Hello, ${name}."
}
output:
[test] Running shell script
+ echo 'execute say hello script:'
execute say hello script:
[Pipeline] echo
Hello, Peter.
[Pipeline] }
[Pipeline] // stage
We do it using the Jobcopy Builder plugin (https://wiki.jenkins.io/display/JENKINS/Jobcopy+Builder+plugin): add another build step in the pipeline script and pass the parameters that need to be taken into account.
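To tie this back to the original requirement (a job named name_group_taskNumber created on every build of the main pipeline), here is a minimal sketch that uses the jobDsl step from the earlier Job DSL example instead of the Jobcopy Builder plugin, whose pipeline syntax is not shown in this thread; the parameter names are the ones from the question and the defaults are placeholders:
pipeline {
    agent any
    parameters {
        string(name: 'name', defaultValue: 'myapp')
        string(name: 'group', defaultValue: 'myteam')
        string(name: 'taskNumber', defaultValue: '1')
    }
    stages {
        stage('Create job') {
            steps {
                // Sketch only: requires the Job DSL plugin (and script approval, depending on your setup)
                jobDsl scriptText: """
                    job('${params.name}_${params.group}_${params.taskNumber}') {
                        steps {
                            shell('echo generated job')
                        }
                    }
                """
            }
        }
    }
}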
Since there's a limitation in Jenkins Pipeline that you cannot add a manual build step without hanging the build (see for example this Stack Overflow question), I'm experimenting with a combination of Jenkins Pipeline and the Build Pipeline Plugin using the Job DSL plugin.
My plan was to create a Job DSL script that first executes the Jenkins Pipeline (defined in a Jenkinsfile) and then creates a downstream job that deploys to production (this is the manual step). I've created this Job DSL script as a test:
pipelineJob("${REPO_NAME} jobs") {
    logRotator(-1, 10)
    def repo = "https://path-to-repo/${REPO_NAME}.git"

    triggers {
        scm('* * * * *')
    }
    description("Pipeline for $repo")
    definition {
        cpsScm {
            scm {
                git {
                    remote { url(repo) }
                    branches('master')
                    scriptPath('Jenkinsfile')
                    extensions { }  // required as otherwise it may try to tag the repo, which you may not want
                }
            }
        }
    }
    publishers {
        buildPipelineTrigger("${REPO_NAME} deploy to prod") {
            parameters {
                currentBuild()
            }
        }
    }
}

freeStyleJob("${REPO_NAME} deploy to prod") {
}

buildPipelineView("$REPO_NAME Build Pipeline") {
    selectedJob("${REPO_NAME} jobs")
}
where REPO_NAME is defined as an environment variable. The Jenkinsfile looks like this:
node {
    stage('build') {
        echo "building"
    }
    stage('run tests') {
        echo "running tests"
    }
    stage('package Docker') {
        echo "packaging"
    }
    stage('Deploy to Test') {
        echo "Deploying to Test"
    }
}
The problem is that the selectedJob points to "${REPO_NAME} jobs" which doesn't seem to be a valid option as "Initial Job" in the Build Pipeline Plugin view (you can't select it manually either).
Is there a workaround for this? I.e. how can I use a Jenkins Pipeline as the "Initial Job" for the Build Pipeline Plugin?
From the documentation at yourDomain.com/plugin/job-dsl/api-viewer/index.html#method/javaposse.jobdsl.dsl.views.NestedViewsContext.envDashboardView:
It shows that buildPipelineView can only be used within a View block that is itself inside a Folder block.
Folder {
    View {
        buildPipelineView {
        }
    }
}
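As a concrete sketch of that nesting (an assumption based on the NestedViewsContext referenced above; nestedView requires the Nested View plugin, and the names reuse those from the question):
nestedView("${REPO_NAME} views") {
    views {
        buildPipelineView("$REPO_NAME Build Pipeline") {
            selectedJob("${REPO_NAME} jobs")
        }
    }
}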