Jenkins job with multiple dynamic parameters - jenkins

Here is my requirement.
I want to use Jenkins to package multiple zip files.
We have an Artifactory instance with repo A and repo B -- each of them holds multiple zip files, and I have the APIs to list the files of a repo.
In Jenkins, I want to create a parameterized job where the 1st parameter populates the list of zip files from repo A and the 2nd parameter populates the list of zip files from repo B. In the 2nd parameter I should be able to select multiple of the zip files populated from repo B.
Can you please suggest a good way to do this?

Try something like this:
List<String> files = populate()
doSomething(files)

List<String> populate() {
    List<String> files = []  // must be an empty list, not an empty String
    if (JOB_PARAMETER == 'repoA') {
        files.addAll(yourApiCall())
    } // similarly for repo B
    return files
}
JOB_PARAMETER is a parameter in your Jenkins job.
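For populating the parameters themselves, a Groovy script in an Extended Choice (or Active Choices) parameter can call the repository API when the build page is rendered. A minimal sketch, assuming the Artifactory storage REST endpoint and a placeholder server URL -- adjust both to your setup:

```groovy
import groovy.json.JsonSlurper

// Placeholder base URL -- replace with your Artifactory server.
def listZips(String repo) {
    def url = "https://artifactory.example.com/artifactory/api/storage/${repo}"
    def json = new JsonSlurper().parse(new URL(url))
    // The storage endpoint returns the repo's children; keep only the zips.
    json.children*.uri
        .findAll { it.endsWith('.zip') }
        .collect { it.replaceFirst('/', '') }
}

// 1st parameter: return the zips of repo A.
return listZips('repo-A')
```

For the 2nd parameter, point the same script at repo B and set the parameter type to multi-select (or checkboxes) so several zip files can be picked at once.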

Related

Jenkins FindFiles - Multiple files in multiple folders

So I have a case where I need to populate findFiles with files from more than one directory:
FILES = steps.findFiles(glob: "${FILE}/*.zip")
Then I need to go to another folder and update it:
FILES = steps.findFiles(glob: "${AnotherFilePath}/*.zip")
The end goal is to iterate over the files and do something for each file, e.g.:
for (file in FILES) { ... }
I really want to get away from bash, but is it possible to do this the Jenkins Groovy way? Can you populate the FILES variable?
You could use the collectMany Groovy method, which executes a closure for every item in the initial list and joins the results into one list:
def FILES = [FILE, AnotherFilePath].collectMany{ steps.findFiles(glob: "${it}/*.zip") }
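A short usage sketch for the iteration the question asks about (assuming FILE and AnotherFilePath are defined earlier in the pipeline):

```groovy
// Collect zips from both directories, then process each one.
def FILES = [FILE, AnotherFilePath].collectMany { steps.findFiles(glob: "${it}/*.zip") }
FILES.each { f ->
    // findFiles returns FileWrapper objects; f.path is the relative path.
    echo "processing ${f.path}"
}
```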

Get all folders in a given directory with Groovy script (using Extended choice parameter in Jenkins)

I'm using the Extended Choice Parameter in Jenkins.
I want to add a drop-down parameter to the job that displays all folder names within a given directory, using a Groovy script.
How can I do that?
You can use the following Groovy script in the extended choice parameter to list all folders under a given folder (you will probably need admin approval to run this script):
import groovy.io.FileType

def list = []
def dir = new File("/var/lib/jenkins/workspace")
dir.eachFileRecurse(FileType.DIRECTORIES) { file ->
    list << file
}
return list
However, an easier option would be the Filesystem List Parameter Plugin.
The plugin lists the file, directory, or symlink names of a defined directory and exposes them as selection options for the parameter.
It also supports include/exclude patterns and execution on different nodes.

Write key/value data through Jenkins API

I already use the Jenkins API for some tasks in my build pipeline. Now there is a task where I want to persist some simple dynamic data, say "50.24", for each build, and then be able to retrieve this data back in a different job.
More concretely, I am looking for something along these lines:
POST to http://localhost:8080/job/myjob//api/json/store
{"code-coverage":"50.24"}
Then, in a different job:
GET http://localhost:8080/job/myjob//api/json?code-coverage
One idea is to use archiveArtifacts to save the value into a file and then read it back using the API/file. But I am wondering if there is a plugin or a simpler way to write some data for this job.
If you need to send a variable from one build to another:
A parameterized build is the easiest way to do this:
https://wiki.jenkins.io/display/JENKINS/Parameterized+Build
The URL would look like:
http://server/job/myjob/buildWithParameters?PARAMETER=Value
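From within a pipeline, the same hand-off can be done with the build step (a sketch; 'myjob' and the parameter name are placeholders):

```groovy
// Trigger the downstream job and pass the value as a build parameter.
build job: 'myjob', parameters: [
    string(name: 'PARAMETER', value: '50.24')
]
```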
If you need to share complex data, you can save files in your workspace and use them (send the absolute path) from another build.
If you need to re-use a simple variable computed during your build:
I would go for an environment variable, updated during your flow:
Jenkinsfile (Declarative Pipeline)
pipeline {
    agent any
    environment {
        DISABLE_AUTH = 'true'
        DB_ENGINE = 'sqlite'
    }
    stages {
        stage('Build') {
            steps {
                sh 'printenv'
            }
        }
    }
}
All the details there:
https://jenkins.io/doc/pipeline/tour/environment/
If you need to re-use complex data between two builds:
There are two cases here: either your builds share the same workspace or they don't.
In the same workspace, it's totally fine to write your data to a text file that is re-used later by another job.
The archiveArtifacts step is convenient if your use case is about extracting test results from logs and re-using them later. Otherwise you will have to write the process yourself.
If your second job uses another workspace, you will need to provide the absolute path to your child job so that it can copy and process the file.
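The file-based hand-off can be sketched as follows (job and file names are placeholders; the consuming side assumes the Copy Artifact plugin is installed):

```groovy
// In the producing job: persist the value and archive it with the build.
writeFile file: 'coverage.txt', text: '50.24'
archiveArtifacts artifacts: 'coverage.txt'

// In the consuming job: copy the artifact back and read it.
copyArtifacts projectName: 'myjob', selector: lastSuccessful()
def coverage = readFile('coverage.txt').trim()
echo "code coverage: ${coverage}"
```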

Read files of directory using Job DSL (similar to readFilesFromDirectory)

In the Job DSL there is the method readFileFromWorkspace(), which makes it possible to read a file's content from the workspace.
Now I would like to have something like readFilesFromDirectory(), which gives me all files in some directory.
The goal is to make it possible to choose from different Ansible playbooks:
choiceParam('PLAYBOOK_FILE', ['playbook1.yml', 'playbook2.yml'])
and to populate this list with existing files from a directory. Is something like this possible?
Well, shortly after asking this question, I found the solution.
The Hudson API can be used:
hudson.FilePath workspace =
    hudson.model.Executor.currentExecutor().getCurrentWorkspace()
def resultList = workspace.list().findAll { it.name ==~ /deploy.*\.yml/ }
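The matches can then be fed into the choice parameter (a sketch, assuming it runs in the same Job DSL script as the snippet above):

```groovy
// workspace.list() returns FilePath objects, so map them to plain file names.
choiceParam('PLAYBOOK_FILE', resultList.collect { it.name })
```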

Include branch name in post build event on Team Build

I would like to perform the following steps in the TFS build process:
a post-build event that copies some files from my compiled projects to another predefined directory; I'd like that directory path to include the branch name.
I'd also like to be able to refer to the branch name inside my XAML workflow template.
The first one is rather simple. When you're using the new TFS 2013 build server and process template, you can simply add a post-build PowerShell script in the build definition configuration, check in the script, and run it during the build.
The second one depends on whether you're using TFVC or Git. In the first case, use the VersionControlServer class to query the BranchObjects, then check which one is the root of your working folder. Be aware, though, that in TFVC multiple branches can be referenced in one workspace, so there may be multiple answers to this query, depending on which file you use to find the branch root. A custom CodeActivity would do the trick, similar to this check in a custom check-in policy.
The code will be similar to:
IBuildDetail buildDetail = context.GetExtension<IBuildDetail>();
var workspace = buildDetail.BuildDefinition.Workspace;
var versionControlServer = buildDetail.BuildServer.TeamProjectCollection.GetService<VersionControlServer>();
var branches = versionControlServer.QueryRootBranchObjects(RecursionType.Full);
var referencedBranches = listOfFilePaths.GroupBy(
    file =>
        branches.SingleOrDefault(
            branch => file.ServerItem.StartsWith(branch.Properties.RootItem.Item)
        )
).Where(group => group.Key != null);
To get a list of all items in your workspace, you can use Workspace.GetItems.
In case you're using Git, you have a few options as well. The simplest is to invoke the command line:
git symbolic-ref --short HEAD
Or dive into LibGit2Sharp and use it to find the branch name based on the current working folder from a custom activity.
If you want to include this in an MsBuild task, that is possible as well. It goes a bit far for this answer to completely outline the steps required, but it's not that hard once you know what to do.
Create a custom MsBuild task that invokes the same snippet of code above, but instead of accessing the workspace through BuildDetail.BuildDefinition.Workspace, go through the Workstation class:
Workstation workstation = Workstation.Current;
WorkspaceInfo info = workstation.GetLocalWorkspaceInfo(path);
TfsTeamProjectCollection collection = new TfsTeamProjectCollection(info.ServerUri);
Workspace workspace = info.GetWorkspace(collection);
VersionControlServer versionControlServer = collection.GetService<VersionControlServer>();
Once the task has been created, you can create a custom .targets file that hooks into the MsBuild process by overriding certain variables or copying data when the build is finished. You can hook into multiple Targets and define whether you need to do something before or after them.
You can either <import> these into each of your projects, or you can place it in the ImportAfter or ImportBefore folder of your MsBuild version to make it load globally. You can find the folder here:
C:\Program Files (x86)\MSBuild\{MsBuild Version}\Microsoft.Common.Targets\ImportAfter
