I'm trying to figure out how to work with a specific version of a Shared Library.
The Jenkins documentation about this isn't quite clear, so I've been doing some experimenting, but with no success.
They basically say:
But how should I configure somelib in the 'Global Pipeline Libraries' section under Manage Jenkins > System Config so I can use any of the available stable versions?
The thing is:
Imagine that I have my somelib project under version control and, currently, I have released 2 stable versions of it: v0.1 and v0.2 (so I have 2 tags named v0.1 and v0.2).
In one Pipeline I want to use somelib's v0.1, and in another Pipeline I need to use v0.2.
How can I do this using the @Library annotation provided by Jenkins?
In the Global Pipeline Libraries section under Jenkins > System Config you only set the default library version, which is used if nothing else is specified inside the Jenkinsfile. It might look like this (ignore the "Failed to connect to repo" error here):
Inside the Jenkinsfile you can explicitly specify which version you want to use if you do not want the default:
@Library('somelib@<tag/branch/commitRef>')
That way you can freely choose at any time which library version to use for your project.
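For example, two Jenkinsfiles can pin different tags of the same library (a sketch, assuming a library configured under the name somelib with tags v0.1 and v0.2 as in the question; myLibStep is a hypothetical step exposed by the library):

```groovy
// Jenkinsfile of the first pipeline: pin somelib to tag v0.1
@Library('somelib@v0.1') _

pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                myLibStep()  // hypothetical step provided by somelib
            }
        }
    }
}
```

The second pipeline's Jenkinsfile would be identical except for `@Library('somelib@v0.2') _`, so each pipeline tracks exactly the tag it needs.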
Following @fishi's response, I just want to leave an important note.
During library configuration on Global Pipeline Libraries you must select Modern SCM option so things can work seamlessly.
If you select Legacy Mode instead you'll not be able to use the library as desired.
If for some reason Modern SCM does not appear in the Retrieval Mode option, it means that you need to upgrade the Global Pipeline Libraries plugin, or even Jenkins itself.
Basically, "Version" is the branch name of the repo which stores the shared library code. If you don't have any branch other than main or master, make sure to fill that in as the Default Version in your Global Pipeline Library configuration.
I would like to define some groovy code that is imported into one or more declarative pipelines that are all stored in the same git repo.
I do NOT want to create a global shared library that different pipelines from different repos share.
For example, in a repo I might have the following files:
shared-library.groovy
pr-pipeline-unit-tests.groovy
pr-pipeline-ui-tests.groovy
I want both pr-pipeline-unit-tests.groovy and pr-pipeline-ui-tests.groovy to be able to import shared-library.groovy.
These pipelines are executed on PRs, and updates to shared-library.groovy should only affect that PR - this is why I do not want a Jenkins globally stored shared library.
As a bonus, it would be cool if shared-library.groovy could be a standalone Gradle project that is "imported" into the pipelines, similar to how buildSrc is imported into Gradle project configuration files. Even better if the shared code could be Kotlin!
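One approach that seems to fit this requirement (a sketch, not from the question: it uses Jenkins' `load` step, and assumes shared-library.groovy ends with `return this` so the loaded script is usable as an object):

```groovy
// shared-library.groovy -- plain Groovy, versioned alongside the pipelines
def runUnitTests() {
    echo 'running unit tests...'
}
return this  // required so `load` hands back a usable object

// pr-pipeline-unit-tests.groovy -- loads the file from the PR's own checkout,
// so changes to shared-library.groovy only affect that PR
node {
    checkout scm
    def shared = load 'shared-library.groovy'
    shared.runUnitTests()
}
```

Note that `load` only works after the repo has been checked out on the agent, since it reads the file from the workspace.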
I am trying to create a Dataflow template and run it via the Dataflow Cloud UI. After executing the pipeline via the command line with the Dataflow runner, it works correctly (i.e. the right data appears in the right places), but no "pre-compiled" template/staging files appear in the Google Cloud Storage bucket.
I did see this, but the post never mentions a resolution and I did include the parameter mentioned therein.
My command to run is:
python apache_beam_test.py --runner DataflowRunner --project prototypes-project --staging_location gs://dataflow_templates/staging --temp_location gs://dataflow_templates/temp --template_location gs://dataflow_templates/
I do get a warning regarding the options however:
C:\Python38\lib\site-packages\apache_beam\io\gcp\bigquery.py:1677: BeamDeprecationWarning: options is deprecated since First stable release. References to .options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
C:\Python38\lib\site-packages\apache_beam\io\gcp\bigquery_file_loads.py:900: BeamDeprecationWarning: options is deprecated since First stable release. References to .options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Does that mean my command-line arguments are not being interpreted, and if so, how do I get the Dataflow/Beam templates into my GCS bucket so I can reference them from the Dataflow UI and run them again later on?
Help much appreciated!
The problem was indeed that the CLI flags needed to be explicitly passed into the pipeline options.
As I did not have any custom flags in my project, I wrongly assumed Beam would handle the standard flags automatically, but this was not the case.
Basically, you have to follow this even if you have no new parameters to add.
I assumed that step is optional (which it technically is if you only want to execute a pipeline without any runtime parameters once), but in order to reuse and monitor the pipelines in Dataflow UI you have to stage them first. That in turn requires passing a staging location into the pipeline.
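As a sketch of what "passing the flags into the pipeline options" means in practice (the flag values mirror the command above; `--input` is a hypothetical custom flag, and inside a real pipeline the remaining list would be handed to Beam via `PipelineOptions(pipeline_args)`):

```python
import argparse

def split_args(argv):
    """Split our own flags from the ones forwarded to Beam's PipelineOptions."""
    parser = argparse.ArgumentParser()
    parser.add_argument('--input', default=None)  # hypothetical custom flag
    known, pipeline_args = parser.parse_known_args(argv)
    return known, pipeline_args

known, pipeline_args = split_args([
    '--input', 'gs://bucket/data.csv',
    '--runner', 'DataflowRunner',
    '--template_location', 'gs://dataflow_templates/my_template',
])
# pipeline_args now holds the runner/template/staging flags, ready to be
# passed on explicitly instead of silently dropped.
print(pipeline_args)
```

The key point is `parse_known_args`: unknown flags such as `--template_location` are not errors but are collected and forwarded, which is what makes the staging/template locations reach the runner.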
Also, as far as I understand, executing the pipeline requires a service account, while uploading the staging files requires Google Cloud SDK authentication.
I'm setting up a new Jenkins master server and configuring it using the Jenkins Configuration as code (JCASC) plugin.
https://github.com/jenkinsci/configuration-as-code-plugin/blob/master/README.md
I've configured most plugins with JCasC, based on the documentation and examples inside the project, but I can't find the syntax for configuring the 'Fortify Jenkins Plugin', version 18.10.
I need to set these properties:
the URL of the remote Fortify server, the authentication token (generated on the Fortify server) and which template to use.
Can anyone assist with an example or the syntax for the YAML file used by the JCasC plugin to configure the Fortify plugin?
I don't know if fortify-plugin is compatible with JCasC, it might be or it might need some modifications. That said, if it is compatible, then the configuration export should work for it.
So, spin up a Jenkins instance, install the plugin, configure whatever you want in the Jenkins UI and then go to the CasC page and use the configuration export. That should give you a JCasC file containing your setup.
Alternatively, you can try the JCasC Schema experimental feature. It's a JSON schema generated by Jenkins that you can use in your YAML editor for autocompletion. More information here.
We have just released an update of the Fortify plugin with support for JCasC. Keep in mind that versions of the plugin prior to v21.1.36 were unable to support it; we had to make changes to make it happen.
You can find the official documentation on how to use our configuration elements here. There's one correction to the documentation, though: our top-level configuration element is called fortifyPlugin instead of the fortify mentioned in the documentation. It is going to be corrected in the next documentation update.
Here's a sample configuration for your quick reference:
unclassified:
  fortifyPlugin:
    url: "https://qa-plg-ssc3.prgqa.hpecorp.net:8443/ssc"
    token: "3ab8c774-0850-483b-8be6-2907722a81d8"
    proxyConfig:
      proxyUrl: "web-proxy.us.softwaregrp.net:8080"
    projectTemplate: "Prioritized High Risk Issue Template"
    connectTimeout: "10"
    readTimeout: "20"
    writeTimeout: "10"
    breakdownPageSize: "50"
    ctrlToken: "5176d380-26ac-430f-95d7-0a2272cf3297"
I am working with Jenkins, Gradle and our Ivy repository.
Our build scripts specify the exact version of dependencies to be used for the build. This is good practice for production.
For CI it would be interesting if the project build used the latest versions of our own libraries, that way we could not only see if library changes "broke the build" for the library but also if they broke the projects that use them. That seems to be the point of "integration"!
I understand that Gradle will accept "1.+" instead of "1.2.3", so I could hack the project's build.gradle on the CI server to achieve this. But perhaps there is a neater way to do it (e.g. the build script recognises it is in CI mode and uses latest rather than specific versions, perhaps by running a sed script on build.gradle to change it).
Am I missing something in Jenkins or gradle? Are there any gradle plugins that achieve this, or alternative approaches that you have used to achieve this?
Something like this might work with Jenkins:
if (System.getenv("BUILD_EXPERIMENTAL") == null) {
    // known to be stable versions
    apply from: "dependencies.gradle"
} else {
    // bleeding edge versions
    apply from: "experimental.gradle"
}
This would just need the same project to be set up twice, once with and once without the environment variable BUILD_EXPERIMENTAL, which controls which dependencies block is applied.
In case you want it applied generally whenever the project is built with Jenkins, just replace BUILD_EXPERIMENTAL with BUILD_NUMBER (which is set by default in that environment).
If you want the latest you can simply use latest, or, if it's easier, something like [1.0,), which matches all versions greater than or equal to 1.0 (assuming that 1.0 is your "smallest version ever"). Look here for other matching patterns, which you could also combine with statuses.
Another way would be to have a local filesystem Ivy repo only on the Jenkins slave, containing all the latest versions of your libraries; the point is that this repo is not accessible from developers' workstations/laptops/VMs. Then you simply use that repo in the Gradle settings in some way (for example via an environment variable defined only on the Jenkins slave). This means that you don't need to change build.gradle.
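A sketch of what that could look like in build.gradle (JENKINS_IVY_REPO is a made-up environment variable name pointing at the slave-local repo, and the repository URLs are placeholders):

```groovy
// build.gradle: prefer the Jenkins-only Ivy repo when the env variable exists
repositories {
    def localIvy = System.getenv('JENKINS_IVY_REPO')
    if (localIvy != null) {
        // only resolvable on the Jenkins slave, holds the newest library builds
        ivy { url localIvy }
    }
    // the normal, shared repository everyone else uses
    ivy { url 'https://repo.example.com/ivy' }
}
```

Since repositories are searched in order, the slave-local repo wins on Jenkins while developer machines fall through to the shared one.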
I would recommend leveraging Gradle dependency locking for achieving this.
In the build, you would use dynamic versions for your dependencies, locked to a good known state.
Developers and production build would then get these locked versions resolved.
On CI you could have a (set of) dedicated job(s) that runs and updates the lock state for one or more modules at a time. Based on that feedback, you could even commit this dependency upgrade or at least open a pull request for it.
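A minimal sketch of that setup (assuming Gradle 4.8+, where dependency locking is available; the coordinates are placeholders):

```groovy
// build.gradle: dynamic versions constrained by a committed lock file
dependencyLocking {
    lockAllConfigurations()
}

dependencies {
    implementation 'com.example:our-library:1.+'  // dynamic version
}
```

Developers and production builds then resolve to the versions recorded in the committed lock state, while a dedicated CI job refreshes it with `./gradlew dependencies --write-locks` and commits (or opens a pull request for) the resulting lock files.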
This is my own answer, inspired by @Martin Zeitler's answer.
We have a generic build script that gets applied to every project's build.gradle, setting up common options, settings and tasks. We want to add this logic to it, but make it optional so it does not break existing build scripts.
The logic will be activated and controlled by a property project.ext.buildJenkinsWithLatest which is true or false.
When the logic is active, dependencies from the project files dependencies-production.gradle or dependencies-jenkins.gradle will be used. The Jenkins dependencies will only be used if the property is true and the CI environment is detected through the presence of the BUILD_NUMBER environment variable.
The generic build script contains this:
if (project.ext.has('buildJenkinsWithLatest')) {
    println "Using conditional dependency management..."
    // BUILD_NUMBER is not null if this is a Jenkins build
    if (project.ext.buildJenkinsWithLatest == true && System.getenv("BUILD_NUMBER") != null) {
        println "--- Using alternative dependencies..."
        apply from: "dependencies-jenkins.gradle"
    } else {
        println "--- Using production dependencies..."
        apply from: "dependencies-production.gradle"
    }
} else {
    println "Conditional dependency management is not active"
}
Now any project's build.gradle that already applies this script will print this when run:
Conditional dependency management is not active
To use the feature we will need to do the following for our project:
Create a dependencies-jenkins.gradle that contains a dependencies {} clause for the libraries we want to select a version dynamically.
Create a dependencies-production.gradle that contains a dependencies {} clause for those libraries, but with a specific version given.
Remove the libraries from any dependencies {} that remains in the project build.gradle.
Set the property project.ext.buildJenkinsWithLatest to true or false.
Apply the generic build script (after setting the property!).
For example in dependencies-jenkins.gradle use the latest 2.x.x version:
dependencies {
    compile 'example:my-library:2.+'
}
As to how to specify the versions in a dynamic way, see @CantSleepNow's answer.
And in dependencies-production.gradle use a specific version:
dependencies {
    compile 'example:my-library:2.3.4'
}
Then within build.gradle set the property and apply the generic build script:
...
project.ext.buildJenkinsWithLatest = true;
apply from: '../bxgradle/bx-std.gradle'
...
Now when the build is run on Jenkins the alternative dependencies will be used. Should one wish to build it on Jenkins with the production dependencies, set project.ext.buildJenkinsWithLatest to false.
I wonder if it is possible to remove only one build (including its artifacts) from a job's workspace.
I tried to "Delete Build" in Build History but all it does is remove build reference from Build History table. I know I can ssh to a server and delete files from the command line but I am looking for a way to do it from Jenkins web interface.
After installing Workspace Cleanup Plugin I am able to wipe out current workspace but I want to keep my other builds in the workspace.
In your Jenkins instance, to have a folder per build, set the flag "Use custom workspace" in your job's settings. Here is a brief help text from the setting's description:
For each job on Jenkins, Jenkins allocates a unique "workspace directory."
This is the directory where the code is checked out and builds happen.
Normally you should let Jenkins allocate and clean up workspace directories,
but in several situations this is problematic, and in such case, this option
lets you specify the workspace location manually.
One such situation is where paths are hard-coded and the code needs to be
built on a specific location. While there's no doubt that such a build is
not ideal, this option allows you to get going in such a situation.
...
And your custom directory path would look like this:
workspace\$JOB_NAME\$BUILD_NUMBER ~> workspace\my-job-name\123
where $JOB_NAME will be "my-job-name" and $BUILD_NUMBER is the build number, e.g. "123".
There is one nasty problem with this approach, and this is why I wouldn't recommend using it: Jenkins will not be able to reclaim disk space for outdated builds. You would have to handle cleanup of outdated builds manually, and that is a lot of hassle.
An alternative approach, which gives you more control and tooling and keeps disk space usage under control without your supervision, is to use the default workspace settings and archive your build output (files, original source code, libraries, etc.) as a post-build action. Very handy, and it gives you access to a whole bunch of great tools, like the Copy Artifact Plugin or the ArtifactDeployer Plugin, in other jobs.
Hope that info helps you make a decision that fits your needs best.
I also use "General/Advanced/Use custom workspace" (as in @pabloduo's answer) on a Windows machine with something like:
C:\${JOB_NAME}\${BUILD_NUMBER}
Just wanted to add a solution for getting rid of the build job's workspaces.
I use Groovy Events Listener Plugin for this.
Using the plug-in's standard configuration I just use the following Groovy script:
if (event == Event.JOB_DELETED) {
    new File(env.WORKSPACE).deleteDir()
}
And now the custom workspace is deleted when the build job is deleted.
Just be aware that this would also delete non-custom workspaces (because the event is triggered for all jobs on your Jenkins server).