I have more than 100 jobs in Jenkins, and I have to change the Git URL in each and every job since we changed the Git server.
I must traverse each job and change the Git URL. Can anyone help me with a Groovy script?
I was able to traverse each job, but not able to get the Git URL or change it:
import hudson.plugins.emailext.*
import hudson.model.*
import hudson.maven.*
import hudson.maven.reporters.*
import hudson.tasks.*

// For each project
for (item in Hudson.instance.items) {
    println("JOB : " + item.name)
}
I badly need help with this; can someone please help me?
The script below will modify all Git URLs. You will need to fill in the modifyGitUrl method. The script is written for Git plugin version 2.3.2; check the Git plugin source code to adjust it to the version you need, e.g. the constructor parameters might have changed.
import hudson.plugins.git.*
import jenkins.*
import jenkins.model.*

def modifyGitUrl(url) {
    // Your script here
    return url + "modified"
}

Jenkins.instance.items.each {
    if (it.scm instanceof GitSCM) {
        def oldScm = it.scm
        // Rebuild each remote config with the rewritten URL
        def newUserRemoteConfigs = oldScm.userRemoteConfigs.collect {
            new UserRemoteConfig(modifyGitUrl(it.url), it.name, it.refspec, it.credentialsId)
        }
        def newScm = new GitSCM(newUserRemoteConfigs, oldScm.branches, oldScm.doGenerateSubmoduleConfigurations,
                oldScm.submoduleCfg, oldScm.browser, oldScm.gitTool, oldScm.extensions)
        it.scm = newScm
        it.save()
    }
}
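For example, a minimal modifyGitUrl implementation, assuming git.old.example.com and git.new.example.com stand in for your real old and new servers:
def modifyGitUrl(url) {
    // hypothetical hostnames; substitute your actual old and new Git servers
    return url.replace("git.old.example.com", "git.new.example.com")
}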
I would have shut the server down, edited all the config.xml files with a script (sed/awk/perl or something), and then restarted Jenkins to load the new configurations.
If shutting down Jenkins is not an option, it is possible to GET, edit, and POST every config.xml with something along these lines (a sketch using curl; substitute a real job name and add authentication as needed):
curl http://myserver/job/myjob/config.xml | sed s/oldurl/newurl/g | curl -X POST --data-binary @- http://myserver/job/myjob/config.xml
I am trying to write a Jenkins Shared Library for my CI process. I'd like to reference a class from the /src folder inside a global function defined in the /vars folder, since that would allow me to put most of the logic in classes instead of in the global functions. I am following the repository structure documented in the official Jenkins documentation:
Jenkins Shared Library structure
Here's a simplified example of what I have:
/src/com/example/SrcClass.groovy
package com.example
class SrcClass {
    def aFunction() {
        return "Hello from src folder!"
    }
}
/vars/classFromVars.groovy
import com.example.SrcClass
def call(args) {
    def sc = new SrcClass()
    return sc.aFunction()
}
Jenkinsfile
@Library('<lib-name>') _
pipeline {
    ...
    post {
        always {
            classFromVars()
        }
    }
}
My goal was for the global scripts in the /vars folder to act as a sort of public facade, letting me use them in my Jenkinsfile as custom steps without having to instantiate a class in a script block (making them compatible with declarative pipelines). It all seems pretty straightforward to me, but I am getting this error when running the classFromVars file:
<root>\vars\classFromVars.groovy: 1: unable to resolve class com.example.SrcClass
# line 1, column 1.
import com.example.SrcClass
^
1 error
I tried running the classFromVars class directly with the Groovy CLI, locally and on the Jenkins server, and I get the same error in both environments. I also tried specifying the classpath when running the /vars script, getting the same error, with the following command:
<root>>groovy -cp <root>\src\com\example vars\classFromVars.groovy
Is what I'm trying to achieve possible? Or should I simply put all of my logic in the /vars class and avoid using the /src folder?
I have found several repositories on GitHub that seem to indicate this is possible, for example this one: https://github.com/fabric8io/fabric8-pipeline-library, which uses the classes in the /src folder in many of the classes in the /vars folder.
As @Szymon Stepniak pointed out, the -cp parameter in my groovy command was incorrect. It now works locally and on the Jenkins server. I have yet to explain why it wasn't working on the Jenkins server, though.
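For reference, the classpath needs to point at the root of the source tree (the directory containing the package hierarchy), not at the package directory itself, so the working invocation looks like this:
groovy -cp <root>\src vars\classFromVars.groovy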
I found that when I wanted to import a class from my shared library into a script step in /vars, I needed to do it like this:
// Thanks to '_', the classes are imported automatically.
// MUST have the '@' at the beginning, otherwise it will not work.
// When not using "@BRANCH", it will use the default branch from the Git repo.
@Library('my-shared-library@BRANCH') _

// Only by calling them can you tell if they exist or not.
def exampleObject = new example.GlobalVars()
// Then call methods or attributes from the class.
exampleObject.runExample()
So I'm trying to define folder-level variables by putting them in a Groovy file in the /vars directory.
Alas, the documentation is so bad that it's impossible to figure out how to do that...
Assuming we have two globals G1 and G2, is this how we define them in the Groovy file?
#!Groovy
static string G1 = "G1"
static string G2 = "G2"
Assuming the Groovy file is called XYZ.groovy, how do I define it in the folder so it's available to the folder's scripts?
Assuming I get over that, and that LIBXYZ is the name the folder associates with the stuff in the /vars directory, is it correct to assume that when I call
@Library("LIBXYZ") _
it will make XYZ available?
In that case, is XYZ.G1 the way to access the globals?
thanks, a.
I have a working example here as I was recently curious about this. I agree that the documentation is wretched.
The following is similar to the info in README.md.
Prep: note that folder here refers to Jenkins Folders from the CloudBees Folder plugin. It is a way to organize jobs.
Code Layout
The first part to note is src/net/codetojoy/shared/Bar.groovy:
package net.codetojoy.shared
class Bar {
    static def G1 = "G1"
    static def G2 = "G2"

    def id

    def emitLog() {
        println "TRACER hello from Bar. id: ${id}"
    }
}
The second part is vars/folderFoo.groovy:
def emitLog(message) {
    println "TRACER folderFoo. message: ${message}"
    def bar = new net.codetojoy.shared.Bar(id: 5150)
    bar.emitLog()
    println "TRACER test : " + net.codetojoy.shared.Bar.G1
}
Edit: To use a static/"global" variable in the vars folder, consider the following vars/Keys.groovy:
class Keys {
    static def MY_GLOBAL_VAR3 = "beethoven"
}
The folderFoo.groovy script can use Keys.MY_GLOBAL_VAR3.
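For example, inside vars/folderFoo.groovy (with Keys.groovy sitting in the same /vars folder, as above), the value can be read by its simple class name:
println "TRACER global : " + Keys.MY_GLOBAL_VAR3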
And then usage (in my example: Basic.Folder.Jenkinsfile):
@Library('folderFoo') _
stage "use shared library"
node {
    script {
        folderFoo.emitLog 'pipeline test!'
    }
}
Jenkins Setup: Folder
Go to New Item and create a new Folder
configure the folder with a new Pipeline library:
Name is folderFoo
Default version is master
Retrieval Method is Modern SCM
Source Code Management in my example is this repo
Jenkins Setup: Pipeline Job
create a new Pipeline job in the folder created above
though a bit confusing (and self-referential), I create a pipeline job that uses this same repo
specify the Jenkinsfile Basic.Folder.Jenkinsfile
the job should run and use the library
I am trying to use Jenkins as an automation build tool.
I need to create a pipeline with a parameter that lets me select the directory from which a build batch file is started.
For the moment, I have found how to select a directory as a parameter by using the Extensible Choice plugin.
But it only lets me select a folder at one level, and I need to go deeper and be able to select across multiple directory levels.
For example, select a directory at level 1, then at level 2, and finally at level 3.
Could you please give me any advice on how to do that?
Use a Groovy script in the pipeline job to dynamically assign the directory.
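As a minimal sketch of that idea (the base path, the three levels, and the parameter name are assumptions, not part of the question): list the subdirectories at each level and let the user pick one with an input step:
node {
    // hypothetical starting point; adjust to your build tree
    def base = '/var/builds'
    // walk three directory levels, as in the question
    for (int level = 1; level <= 3; level++) {
        def dirs = sh(script: "ls -d ${base}/*/", returnStdout: true).trim().split('\n')
        base = input(message: "Select a level-${level} directory under ${base}",
                parameters: [choice(name: 'DIR', choices: dirs.join('\n'))])
    }
    echo "Selected directory: ${base}"
}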
Thanks. I have tried to find a similar example of code or a plugin, but haven't succeeded with this.
So I have decided to do it based on standard Groovy syntax. Here is the code:
node {
    stage "Directories list output"
    def dirname = getdirlist()
    echo dirname
}

import java.io.File
import java.io.IOException
import javax.swing.JFileChooser

@NonCPS
def getdirlist() {
    def initialPath = System.getProperty("user.dir")
    JFileChooser fc = new JFileChooser(initialPath)
    fc.setFileSelectionMode(JFileChooser.FILES_AND_DIRECTORIES)
    int result = fc.showOpenDialog(null)
    def outputpath // declared before the switch so it is in scope for the return
    switch (result) {
        case JFileChooser.APPROVE_OPTION:
            File file = fc.getSelectedFile()
            def path = fc.getCurrentDirectory().getAbsolutePath()
            outputpath = "path=" + path + "\nfile name=" + file.toString()
            break
        case JFileChooser.CANCEL_OPTION:
        case JFileChooser.ERROR_OPTION:
            break
    }
    return outputpath
}
I can't make it work. I have some suspicions that a Jenkins pipeline is not allowed to open a standard Java file dialog (there is no desktop session on the server for Swing to use). What could be another approach to my task?
I have a Jenkins job with a few parameters set up, and I have a JSON file in the workspace which has to be updated with the parameters that I pass through Jenkins.
Example:
I have the following parameters, which I'll take as input from the user who triggers the job:
Environment (Consider user selects "ENV2")
Filename (Consider user keeps the default value)
I have a JSON file in my workspace under run/job.json with the following contents:
{
    "environment": "ENV1",
    "filename": "abc.txt"
}
Now whatever value is given by the user before triggering the job has to be replaced in job.json.
So when the user triggers the job, the job.json file should be:
{
    "environment": "ENV2",
    "filename": "abc.txt"
}
Please note the environment value in the JSON, which has to be updated.
I've tried the Config File Provider Plugin (https://wiki.jenkins-ci.org/display/JENKINS/Config+File+Provider+Plugin), but I'm unable to find any help on parameterizing the values.
Kindly suggest how to configure this plugin, or suggest any other plugin which can serve my purpose.
The Config File Provider Plugin doesn't allow you to pass parameters to configuration files. You can solve your problem with any scripting language. My favorite approach is using the Groovy plugin. Tick the check-box "Execute system Groovy script" and paste the following script:
import groovy.json.*
// read build parameters
env = build.getEnvironment(listener)
environment = env.get('environment')
filename = env.get('filename')
// prepare json
def builder = new JsonBuilder()
builder environment: environment, filename: filename
json = builder.toPrettyString()
// print to console and write to a file
println json
new File(build.workspace.toString() + "\\job.json").write(json)
Output sample:
{
    "environment": "ENV2",
    "filename": "abc.txt"
}
With the Pipeline Utility Steps plugin this is very easy to achieve.
jsonfile = readJSON file: 'path/to/your.json'
jsonfile['environment'] = 'ENV2'
writeJSON file: 'path/to/your.json', json: jsonfile
I will keep it simple: a Windows batch file or a shell script (depending on the OS) that reads the environment values, opens the JSON file, and makes the changes.
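For a job like the one in the question, a minimal sketch of that approach (assuming a Unix agent, the run/job.json path from the question, and a build parameter named ENVIRONMENT, which is a hypothetical name):
node {
    // ENVIRONMENT is an assumed parameter name; adjust to your job's parameters
    sh "sed -i 's/\"ENV1\"/\"${params.ENVIRONMENT}\"/' run/job.json"
}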
I want to programmatically kick off an Anthill job from another system and set some build properties (the Git branch).
What API exists to help me do that?
An alternative (simpler but less flexible) approach...
Create a Trigger on the build workflow and use wget or curl to send an HTTP POST to Anthill passing the required parameters with the POST.
Here is a way to send an HTTP POST using an HTML FORM.
http://anthillizer.com/display/main/How+to+create+a+simple+tool+to+fire+an+AnthillPro+CI+Trigger
You can do the same thing with wget.
Hope this helps!
Eric
You'll need the Anthill SDK (click the 'tools' link at the top of the Anthill Pro screen).
Add remoting/lib and remoting/conf to your classpath. Using these imports:
import com.urbancode.anthill3.domain.buildrequest.BuildRequest;
import com.urbancode.anthill3.domain.buildrequest.RequestSourceEnum;
import com.urbancode.anthill3.domain.project.Project;
import com.urbancode.anthill3.domain.project.ProjectFactory;
import com.urbancode.anthill3.domain.security.User;
import com.urbancode.anthill3.domain.security.UserFactory;
import com.urbancode.anthill3.domain.trigger.remoterequest.repository.RepositoryRequestTrigger;
import com.urbancode.anthill3.domain.workflow.Workflow;
import com.urbancode.anthill3.main.client.AnthillClient;
import com.urbancode.anthill3.persistence.UnitOfWork;
import com.urbancode.anthill3.runtime.scripting.helpers.WorkflowLookup;
import com.urbancode.anthill3.services.build.BuildService;
This code will look up a project and workflow, then kick off a build.
AnthillClient anthill = AnthillClient.connect(hostStage, remotingPort, username, password);
UnitOfWork uow = anthill.createUnitOfWork();
Project prj = ProjectFactory.getInstance().restoreForName("My Project"); //'My Project' is the project name.
Workflow wflow = WorkflowLookup.getForProjectAndName(prj, "My Workflow"); //'My Workflow' is the workflows name/key
User usr = UserFactory.getInstance().restoreForName("username");
RepositoryRequestTrigger req1 = new RepositoryRequestTrigger();
req1.setWorkflow(wflow);
req1.setNew();
req1.setName("Git Repository Trigger");
uow.register(req1);
uow.commit();
BuildRequest br = BuildRequest.createOriginatingRequest(wflow.getBuildProfile(),usr, RequestSourceEnum.EVENT,req1);
br.setForcedFlag(true);
//Set any build properties here
br.setPropertyValue("gitBranch","develop",false);
BuildService.getInstance().runBuild(br);