I want to trigger a Jenkins build from Artifactory:
https://www.jfrog.com/confluence/display/RTF/Jenkins+Artifactory+Plug-in#JenkinsArtifactoryPlug-in-TriggeringBuilds
So it looks like this only works with freestyle (non-multibranch) jobs? The build is going to be kicked off by Artifactory.
I already have a shared library repo with a lot of shared functions and stuff. How can I use that code in a freestyle job? I don't want to manage this job's code in a different place.
It would be nice if I could have a freestyle job execute code from the vars folder of my shared library.
You can create a freestyle job that triggers a pipeline job (or a number of jobs), and in that pipeline you can use shared library functions as well.
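A minimal sketch of the triggered pipeline job, assuming a globally configured shared library named 'my-shared-lib' and a hypothetical step myDeployStep defined in its vars folder:

    // Load the globally configured shared library ('my-shared-lib' is an assumed name).
    @Library('my-shared-lib') _

    node {
        stage('Run shared code') {
            // myDeployStep is a hypothetical step from the library's vars folder;
            // ARTIFACT_PATH is an assumed parameter the freestyle job passes along.
            myDeployStep(artifactPath: params.ARTIFACT_PATH)
        }
    }

The freestyle job then only needs a post-build step that triggers this pipeline job and hands over the parameters it got from Artifactory.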
Jenkins pipelines are stored in a repo and we use them in Jenkins.
Is there a way to save freestyle jobs in the repo in an automated way?
Unfortunately, there is no way to store the definition of a freestyle job in a file format that can be imported like a Jenkinsfile.
I have a multibranch job in Jenkins. Is there any way I can do the multibranch job configuration in a Jenkinsfile? I want to store things like the branch source, behaviours, etc. of the multibranch job as code.
The short answer is no.
But if you still want it, you can store the job's XML configuration in Git and then update the job through the Jenkins API, using a local script or even another Jenkins job (a sketch follows below). I don't recommend it, though.
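A minimal sketch (plain Groovy, run outside Jenkins) of pushing such a versioned config.xml back to the job's config endpoint; the URL, job name, and user:apiToken values are placeholders:

    // Placeholders: adjust the Jenkins URL, job name, and user:apiToken.
    def jenkinsUrl = 'https://jenkins.example.com'
    def jobName    = 'my-freestyle-job'
    def auth       = 'user:apiToken'.bytes.encodeBase64().toString()

    // POST the config.xml stored in Git to the job's config endpoint.
    def conn = new URL("${jenkinsUrl}/job/${jobName}/config.xml").openConnection()
    conn.requestMethod = 'POST'
    conn.doOutput = true
    conn.setRequestProperty('Authorization', "Basic ${auth}")
    conn.setRequestProperty('Content-Type', 'application/xml')
    conn.outputStream.withWriter('UTF-8') { it << new File('config.xml').getText('UTF-8') }
    assert conn.responseCode == 200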
Is it possible to create a multibranch pipeline job via Job DSL that defines the job by "Pipeline Script" instead of a Jenkinsfile contained in each Git repository?
We want to avoid generating and maintaining the same Jenkinsfile (except for some parameters) in each of our 100 Git repositories.
At the moment we are using pipeline jobs with Job DSL, seeded by a factory job, but this limits us with multi-branch builds (feature branches). So we want to switch to multibranch pipeline jobs, but there we are limited in seeding them.
I know we could use a Jenkinsfile (in each project's Git repo) that includes other common Jenkinsfiles from Jenkins, but that is just a workaround.
Only pipeline jobs can have the pipeline defined inline; multibranch jobs can't, and Job DSL can't change anything about that.
The probably better alternative is a shared library. You can configure Jenkins to load this library automatically, so that the individual Jenkinsfiles in all the repos only have to call a function from it.
As an example, have a look at the Jenkinsfile of a Jenkins plugin; it only calls a function from the shared library:
buildPlugin()
In your case (as you wrote "except some parameters"), this function could take parameters that differ between the jobs. The buildPlugin function is implemented at https://github.com/jenkins-infra/pipeline-library/blob/master/vars/buildPlugin.groovy.
While this would still require you to update all your repos, it is probably the better starting point for introducing pipelines in your organisation.
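A minimal sketch of such a parameterized library function; standardBuild, buildCommand, and deployTarget are hypothetical names:

    // vars/standardBuild.groovy in the shared library (all names are assumptions)
    def call(Map config = [:]) {
        node {
            stage('Checkout') {
                checkout scm
            }
            stage('Build') {
                // Fall back to a default build command if the repo passes none.
                sh(config.buildCommand ?: './gradlew build')
            }
            if (config.deployTarget) {
                stage('Deploy') {
                    sh "./deploy.sh ${config.deployTarget}"
                }
            }
        }
    }

Each repository's Jenkinsfile then shrinks to a single call such as standardBuild(deployTarget: 'staging').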
I had some standalone Jenkins jobs to build, package, and deploy. Now I am connecting them: the 'build' job triggers the 'package' job, the 'package' job triggers the 'deploy' job, and I am passing the required parameters between them. I can also see them neatly in the pipeline view.
My question is: can this technically be called a pipeline? Or can I call it a pipeline only if I use the pipeline plugin and write a Groovy script?
Thanks
P.S.: Please do not downvote this question. It is a sincere question to which I have not been able to find the right answer. I want to be technically correct.
In the Jenkins context, a pipeline is a job that defines a workflow using the pipeline DSL (based on Groovy). A pipeline aims to define a whole series of steps (e.g. build + package + deploy in your case) in a single place, and it allows you to define a complex workflow (e.g. parallel steps, input steps, try/catch instructions) that can be both replayed and versioned (because it can be saved to Git). For more information you should read the official Jenkins pipeline documentation, which explains in detail what a pipeline is.
The kind of jobs you are currently using are called freestyle jobs, and even though they do define a "flow" (by chaining jobs together), they cannot be called pipeline jobs.
In short, pipelines are jobs that use the pipeline plugin and Groovy script syntax to define the whole application lifecycle, while standard Jenkins 1.x jobs are called freestyle jobs.
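For comparison, your build -> package -> deploy flow expressed as one scripted pipeline could be a sketch like this (the Maven commands and deploy script are placeholders):

    node {
        stage('Build')   { sh 'mvn compile' }
        stage('Package') { sh 'mvn package' }
        stage('Deploy')  { sh './deploy.sh' }
    }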
The Jenkins pipeline plugin is awesome.
But is it also possible to aggregate the pipelines of dependent projects, e.g. micro-services?
If you have separate jobs that run pipelines, you can just call build [job name] in one pipeline to invoke the subsequent pipelines, as in the sketch below.
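A minimal example of the build step, with a hypothetical downstream job name and parameter:

    stage('Trigger downstream') {
        // 'deploy-pipeline' is an assumed name of another pipeline job.
        build job: 'deploy-pipeline',
              parameters: [string(name: 'VERSION', value: '1.2.3')]
    }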
You can use the approach that @ebnius describes, where you have small pipeline jobs and a parent pipeline that orchestrates the complete workflow and calls the different pipelines (a sketch follows below).
Or you can use shared libraries (https://jenkins.io/doc/book/pipeline/shared-libraries/), where you define, for example, one step per Groovy file and have the entire structure modularized.
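A minimal sketch of the orchestration approach, where a parent pipeline aggregates dependent micro-service pipelines (the downstream job names are assumptions):

    node {
        stage('Shared libraries') {
            build job: 'common-lib-pipeline'
        }
        stage('Services') {
            // Independent services can be built in parallel once the shared
            // dependency has been built.
            parallel(
                users:  { build job: 'users-service-pipeline' },
                orders: { build job: 'orders-service-pipeline' }
            )
        }
    }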