I have a shared library repository that contains all of my shared functions.
Is it possible to somehow use this within a freestyle job?
What I'm trying to accomplish is to create a freestyle job that calls and executes code from the vars folder of my shared library repository.
Is this possible at all?
Freestyle jobs have an Execute Groovy Script build step, and I wonder if it might be possible to write a Groovy script that calls a function within a global shared library.
I've tried to call the function using
GroovyShell shell = new GroovyShell()
def tools = shell.parse(new File('demoFree.groovy'))
tools.call()
demoFree.groovy being the name of the file that holds the function, but my Groovy knowledge is very limited, so I'm probably doing something very wrong.
I know this is easily done using a pipeline project but I'm having to deal with a legacy freestyle project which for a number of reasons isn't being moved to a declarative pipeline just yet.
No. According to the documentation, shared libraries are meant to be executed only from Jenkins Pipelines.
However, you can easily create a Pipeline job, choose Pipeline script (instead of Pipeline script from SCM), and write the code you need that uses your library.
You will find it much more convenient than using a Freestyle job.
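A minimal sketch of such an inline Pipeline script, assuming the library is registered globally under the name my-shared-library (a placeholder) and exposes the demoFree step from its vars folder:

// Job type "Pipeline", definition "Pipeline script" (inline, not from SCM).
// 'my-shared-library' stands for whatever name the library is registered
// under in the global Jenkins configuration.
@Library('my-shared-library') _

node {
    stage('Call shared function') {
        // Invokes vars/demoFree.groovy from the shared library.
        demoFree()
    }
}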
I have 11 jobs running on the Jenkins master node, all of which have a very similar pipeline setup. For now, each job has its own Jenkinsfile that specifies the stages within the job, and all of them build just fine. But wouldn't it be better to have a single repo with a few files (preferably a single Jenkinsfile and some libraries) that can run all the jobs sharing this pipeline structure, with the small differences handled by a workaround?
If there is a way to accomplish this, please let me know.
Use a Shared Library to define common functionality. Your 11 Jenkinsfiles can then be as small as only a single call to the function implementing the pipeline.
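As a rough illustration (the library name, step name, and parameters below are made up), each repository's Jenkinsfile could shrink to a single call into the library:

// Jenkinsfile in each of the 11 repositories
@Library('build-commons') _

// vars/standardBuild.groovy in the shared library implements the whole
// pipeline; only the job-specific differences are passed in here.
standardBuild(appName: 'service-a', runIntegrationTests: true)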
Besides using a Shared Library, you can create a Groovy file with common functionality and call its methods via load(); see the Pipeline documentation and examples. This is an easier approach, but as pipelines grow in complexity it may impose some limitations.
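A minimal sketch of the load() approach, assuming a common.groovy file checked out next to the Jenkinsfile (file and method names are placeholders):

// common.groovy -- must end with 'return this' so the loaded script
// object can be used to call its methods later.
def deployTo(String environment) {
    echo "Deploying to ${environment}"
}
return this

// Jenkinsfile (scripted)
node {
    checkout scm
    def common = load 'common.groovy'
    common.deployTo('staging')
}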
I am trying to upgrade my current regression infrastructure to use the Pipeline plugin, and I realize there are two flavors: scripted pipeline and declarative pipeline. Going through multiple articles, I gather that declarative pipeline is more future-proof and more powerful, so I am inclined to use it. But there seem to be the following restrictions, which I don't want in my setup:
The Jenkinsfile needs to be in the repository. I don't want to keep my Jenkinsfile in the code repository.
Since the Jenkinsfile needs to be in SCM, does that mean I cannot test any modification to it until I check it in to the repository?
Any details on the above will be very helpful.
Declarative pipelines are compiled to scripted ones, so scripted pipelines will definitely not go away. But declarative ones are a bit easier to handle, so you should be fine.
You don't have to check a Jenkinsfile into VCS. You can also set up a job of type Pipeline and define the script there, but this has the usual disadvantages, such as no history.
When using multibranch pipelines, i.e., every branch containing a Jenkinsfile gets its own job, you just push your changed pipeline to a new branch and execute it. Once it works, you merge it.
This approach certainly lengthens the feedback cycle a bit, but it applies the same principles as writing your software. For experimentation, just set up a Pipeline-type job and play around. Afterwards, commit it to a branch, test it, review it, merge it.
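For reference, a minimal declarative Jenkinsfile you could paste into such a Pipeline-type job to experiment with (stage contents are placeholders):

pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo 'Building...'
            }
        }
        stage('Test') {
            steps {
                echo 'Running tests...'
            }
        }
    }
}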
You can use the Pipeline Multibranch Defaults Plugin for that. It allows you to define the Jenkinsfile in the web UI (with the Config File Provider plugin) itself and then reference that file from a Multibranch Pipeline.
Is it possible to configure Jenkins to always run using a predefined Jenkinsfile for all projects, rather than pulling a Jenkinsfile from the project repo? The goal here is to make sure that a certain set of stages are always being run. If we allow projects to define their own Jenkinsfile, they could theoretically just skip some required stages in their project (like unit testing).
I want to make sure this never happens, but simply telling everyone "don't remove these stages from your Jenkinsfile" seems a bit brittle.
Use a Shared Library and define all of the stages and steps there. Then the Jenkinsfile in each application's SCM only needs to point to the shared library and run a method. You can also pre-define application-specific variables in the Jenkinsfile before you call the shared library.
This won't force the applications to use your code (they can still rewrite the entire Jenkinsfile), but at least they don't control the code you wrote, and they can't simply comment out stages or easily add new ones.
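A rough sketch of the library side, assuming a vars/standardPipeline.groovy step in the shared library (all names and scripts here are illustrative):

// vars/standardPipeline.groovy in the shared library
def call(Map config = [:]) {
    node {
        stage('Build') {
            sh "./build.sh ${config.appName}"
        }
        stage('Unit tests') {
            // The required test stage lives in the library,
            // not in each application's Jenkinsfile.
            sh './run-tests.sh'
        }
    }
}

// The Jenkinsfile in the application repository then reduces to:
// @Library('pipeline-standards') _
// standardPipeline(appName: 'my-app')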
I am using this plugin: https://developer.ibm.com/urbancode/docs/integrating-jenkins-ibm-urbancode-deploy/
Is there a way to push to multiple components in UCD from inside one Jenkins job? E.g., the same Jenkins project compiles Java and produces three components in UCD: app1_web, app1_ear, app1_db. I see the plugin's Post-build Action can only be added once in the same job. I guess this is really a Jenkins question.
If you use Pipeline rather than the old Freestyle job type, you can use post-build actions multiple times.
Though I don't know which plugin you're using, and whether it supports Pipeline.
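Purely as a hypothetical sketch of the pattern (the publishToUCD step name and its parameters are invented; check the plugin's documentation or the Snippet Generator for the real syntax, if it supports Pipeline at all):

node {
    sh './gradlew build'   // produces the app1_web, app1_ear, app1_db artifacts

    // In a Pipeline, any step can simply be called several times,
    // once per UCD component. 'publishToUCD' is a made-up placeholder
    // for whatever step the plugin actually exposes.
    for (component in ['app1_web', 'app1_ear', 'app1_db']) {
        publishToUCD component: component, version: env.BUILD_NUMBER
    }
}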
I am trying to combine the nice branch handling of Workflow Multibranch with the powerful job generation of the Job DSL plugin. Basically, I want each branch to regenerate its jobs from a script in the repository and then run the main one.
But I don't see a way to run the Process Job DSLs step from a Workflow script. Maybe there is a built-in way to execute custom build steps in Workflow, but I just can't find it.
You could create a separate job that processes the job-dsl, and then call it with the proper parameters from the workflow via a "build job: xxx" step.
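A small sketch of that approach, assuming a separate 'job-dsl-seed' freestyle job that runs the Process Job DSLs build step (job and parameter names are placeholders):

node {
    // Trigger the seed job that runs "Process Job DSLs", passing the
    // current branch so it can (re)generate branch-specific jobs.
    build job: 'job-dsl-seed',
          parameters: [string(name: 'BRANCH', value: env.BRANCH_NAME)]

    // Then run the freshly generated main job for this branch.
    build job: "main-${env.BRANCH_NAME}"
}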
Not quite sure where you are going with this, but perhaps what you really want is multibranch binding for Job DSL, or to manually iterate over branches.
Alternatively, with Workflow alone you can probably accomplish your goal, whatever that is.
It seems that the jobDsl step can be used in a Pipeline.
Have a look at the Snippet Generator to generate some code:
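For example, a sketch of running DSL scripts from the repository with the jobDsl step (the targets pattern is an assumption about where your DSL scripts live):

node {
    checkout scm
    // Process every Job DSL script found under the jobs/ directory.
    jobDsl targets: 'jobs/**/*.groovy',
           removedJobAction: 'IGNORE'
}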