Conditional execution of batch files in TFS release pipeline

I am working with TFS and facing an issue. There are different folders in the TFS repository,
Ex:
C#Project
Extreme
CCM
Basically they are folders for different technologies, and TFS users only check in to their corresponding folder.
In the release pipeline I have various batch tasks, each of which executes a batch script file on the agent.
There are multiple batch tasks that perform some actions, and my problem is that I want to execute the batch files conditionally.
For example, if changes occur in the C# application, some of the scripts should not execute; if changes occur in a specific folder, then a specific .bat file should execute and the rest should not.

There is nothing out-of-the-box that allows you to execute tasks conditionally based on the folder where the changes were made. However, you can do what you want in two ways:
1) Create a separate build definition for each of your technology areas. Use the Path filter in your trigger to control which build gets triggered based on the path of your changed files.
2) Create variables for each technology in your build definition. At the start of the build definition, add a PowerShell task (or something similar) that sets the appropriate variable(s) based on which files were changed. You can use these variables in the custom conditions for your task execution, as sketched below.
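A minimal sketch of such a script, assuming a Git-based repository, a single-commit diff, and hypothetical variable names (runCSharp, runExtreme):
# Hypothetical sketch: set build variables based on which top-level
# folders contain changes (assumes a Git repo; adjust the diff range as needed).
$changed = git -C "$env:BUILD_SOURCESDIRECTORY" diff --name-only HEAD~1 HEAD
if ($changed -match '^C#Project/') {
    # Logging command understood by the TFS/VSTS agent
    Write-Host "##vso[task.setvariable variable=runCSharp]true"
}
if ($changed -match '^Extreme/') {
    Write-Host "##vso[task.setvariable variable=runExtreme]true"
}
Each batch task can then use a custom condition such as and(succeeded(), eq(variables['runCSharp'], 'true')).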

Related

TFS Online - Don't Zip empty folder

We're using TFS-Online to One-Click-Deploy our Software.
From time to time we need to use some special scripts, which we store in a folder. This basically means that, most of the time, said folder stays empty.
If I now trigger a build, it contains the following tasks (an Archive Files task and a Delete Files task).
Now the question:
Is there any way to suppress these two tasks if the folder to be zipped/deleted is empty?
The tasks are the built-in ones.
Note: This is NOT on-premise TFS.
You can specify conditions for running a task in VSTS. Express the condition as a nested set of functions; the agent evaluates the innermost function and works its way out. The final result is a boolean value that determines whether the task runs.
In your case, a solution would be:
Add a PowerShell task prior to the Archive Files task.
Use the PowerShell task to judge whether the folder is empty or not.
If the folder is empty, fail the PowerShell task. (Remember to check the Continue on error or Always run option.)
Add a condition to both the Archive Files and Delete Files tasks such as Only when all previous tasks have succeeded.
After this, those two tasks will not run when the special folder is empty during the build.
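A minimal sketch of the PowerShell check, assuming a hypothetical folder path under the sources directory:
# Hypothetical sketch: fail this task when the scripts folder is empty so that
# tasks conditioned on "all previous tasks have succeeded" are skipped.
$folder = Join-Path $env:BUILD_SOURCESDIRECTORY "SpecialScripts"  # hypothetical path
if (-not (Test-Path $folder) -or -not (Get-ChildItem -Path $folder -Recurse -File)) {
    Write-Host "Folder is empty; failing so the Archive/Delete tasks are skipped."
    exit 1
}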
For more details, please refer to this thread: Specify conditions for running a task.

Call Build vNext task directly

Build vNext tasks are an awesome improvement over the previous build process. One downside though is that I can't make some tasks conditional. I can create an additional build for every combination, but this clearly scales badly and causes lots of additional work if we have to change some other part of the build.
Instead I'd prefer being able to write my own PowerShell tasks that can call existing build tasks. There is at least one downside to this (if no build asks specifically for the vso-task, the build agent won't download it), but considering we are using on-premise TFS and build agents, I can live with this.
I tried to do something like the following:
$path = get-item "$env:AGENT_HOMEDIRECTORY\Tasks\NuGetPackager\0.1.56\NuGetPackager.ps1"
& "$path" -searchPattern $searchPattern -outputDir "$packageFolder" -configurationToPackage $configurationToPackage -nugetAdditionalArgs "$nugetAdditionalArgs -version $nugetVersion"
Sadly this causes the following error:
2016-04-12T09:50:22.3652811Z ##[error]import-module : Could not load file or assembly 'Microsoft.TeamFoundation.DistributedTask.Agent.Interfaces,
2016-04-12T09:50:22.3652811Z ##[error]Version=14.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a' or one of its dependencies. The system cannot find
2016-04-12T09:50:22.3652811Z ##[error]the file specified.
2016-04-12T09:50:22.3652811Z ##[error]At C:\Agent1\Tasks\NuGetPackager\0.1.56\NuGetPackager.ps1:19 char:1
Now, one solution I found on the web indicates that I could add the DLLs it looks for to the GAC, but I really, really don't want to. Also, the tasks clearly work just fine when called from TFS directly, so what configuration am I missing?
I tried adding the folder containing the DLLs to the path, and even calling SetDllDirectory explicitly in PowerShell, but neither of those helps.
Environment: Windows Server 2012 R2 on both build agent and TFS server. TFS 2015 Update 1.
The PowerShell task host that the build agent uses from 2015 RTM up to Update 2 is a custom host which does creative things to resolve assemblies and handle input/output. These tasks can't be called from outside the agent.
Plus, quite a few build tasks are implemented using Node, so you'd have to detect which one is which and invoke them accordingly.
The build tasks are being migrated to the new vsts-task-lib, which will support out-of-agent invocation. That would allow exactly what you want.
In the meantime you could take the existing tasks (in most cases they're a simple manifest plus script) and add one string parameter in which you stick a variable, which you can then treat as the condition. You'd need to replace all the standard tasks and push them again. If you keep the extension ID and the task GUID the same, they'll act as in-place replacements. This is probably the easiest way to do what you want without resorting to hacks that take away the task's UI. Just set the version number to something ridiculously high, like 100.0.1.83, so that you'll always end up using your version.
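For example, the copied task script could start with a guard like this (a sketch; the runCondition parameter name is a made-up example):
# Hypothetical sketch: early-exit guard added at the top of the copied task script.
param(
    # Bound to the extra string parameter added to the task manifest (assumption)
    [string]$runCondition = "true"
)
if ($runCondition -ne "true") {
    Write-Host "Condition '$runCondition' not met; skipping this task."
    exit 0
}
# ... original task logic continues here ...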
Note: the new builds are meant to be repeatable, in that calling the same build multiple times always yields the same results. Conditional actions can be captured in custom PowerShell scripts that are stored in source control and executed as part of the workflow.

How to configure Jenkins to build a project using Ant and custom args

There's too much routine in building the next project version using Ant. The routine lies in several properties files that must be edited before running the Ant task. I took a look at Jenkins as a system for making builds (including nightly ones), but I have a problem with changing the properties.
Is it possible (and if so, how) to enter parameters in the Jenkins configuration before a build so that they are passed to Ant?
What I really mean is the following scheme (which I used in manual builds):
there are 2 properties files that contain data about the build version, src destination, emails to notify about the new build, and so on.
the corresponding property keys are used in Ant tasks, and these properties are changed manually before each build.
some properties are read by a Java util and used during the build.
there are also 3 or 4 Ant XMLs that are imported in build.xml, and these XMLs also read properties from the mentioned files.
What I want to do is:
change key properties in Jenkins
press build project
my data will overwrite the data in the properties files OR will be passed as Ant variable values straight to the Ant task(s).
as a result I receive a new build with the corresponding notifications (they're sent through Ant)
Are there mechanisms that allow one to make such a scheme work via Jenkins?
Thank you in advance.
In Jenkins, you can use the parameterised build feature to specify the parameters you need to substitute into your build.
For example, if you specify a parameter called server and, when clicking "Build Now", you enter test, the build will be executed with an environment variable called ${server} that you can access.
Then, in your "Invoke Ant" build step, if you press Advanced..., this reveals a "Properties" field. Here you can enter my.ant.property=${server}.
That's equivalent to calling ant -Dmy.ant.property=${server}, which will be expanded to ant -Dmy.ant.property=test.
Another option: set environment variables for the scope of the build using the Env plugin. If the properties you are using are environment variables, or can be set as them, then you may want to use this one. It might involve some effort in changing the build scripts, but it can be a good option:
Q: Why would I use this one when I already have the parameterized build plugin?
A: Because the parameterized build plugin requires human interaction if there is more than one choice, for example building for Release 1, Release 2 or the Test branch.
With the Env plugin, you can set the property once for each choice and then create a respective job for each. Then just schedule the job(s), thereby eliminating the human factor.

Including empty folders in TFS build

Our C# solution has a couple of folders that are populated via post-build events (i.e. they are initially empty).
The solution builds fine locally, however when using the TFS build agent, the folders don't show up in the published websites folder.
Any suggestions on how to force TFS to copy the folders over?
This is addressed here: publish empty directories to a web application in VS2010 web project
TFS does not execute the AfterBuild target of your proj file. I believe it will execute the AfterCompile target, but this still might not do what you want.
I have used the approach of including dummy files, which is simple enough even though it's lame.
I've also tried the approach of including a PowerShell script to do some post-publish tasks, which works.
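A sketch of such a post-publish script, assuming TFS 2013-style build variables and hypothetical folder names:
# Hypothetical sketch: recreate required (initially empty) folders in the
# published output, since empty folders are not copied by the build.
$publishDir = $env:TF_BUILD_BINARIESDIRECTORY   # TFS 2013-era build variable
foreach ($name in @('Logs', 'Uploads')) {       # hypothetical folder names
    $dir = Join-Path $publishDir $name
    if (-not (Test-Path $dir)) {
        New-Item -ItemType Directory -Path $dir | Out-Null
    }
}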
More recently I have adopted a convention of including a supplemental MSBuild file that ends in ".package.proj", and I added an additional MSBuild execution activity to my Team Build template that looks for it after the files are dropped to the drop location and then executes it. This provides a simple hook into the Team Build workflow without getting deep into changing the workflow for a particular build. It's just a single additional activity wrapped in a conditional that looks for the file; if found, it executes it.
You could also make a custom build template and use the Workflow activities to perform various cleanup tasks, but that's probably overkill and will increase maintenance of the build templates. Better to keep the customization simple if you can, and have it function in a way that doesn't require "opt-out" configuration on builds that don't need it. Existing vanilla builds should continue to work as expected after the customization to the template.

Jenkins - Running instances of single build concurrently

I'd like to be able to run several builds of the same Jenkins job simultaneously.
Example:
Build [jenkins_job_1]: calls an Ant script with parameter 'A'
Build [jenkins_job_1]: calls an Ant script with parameter 'B'
repeat as necessary
each instance of the job runs simultaneously, rather than through a queue.
The reason I'd like to do this is to avoid having to create several jobs that are nearly identical, all of which would need to be maintained.
Is there a way to do this, or is there maybe another solution (i.e. dynamically create a job from a base job and remove it after it's finished)?
Jenkins has a check box: "Execute concurrent builds if necessary"
If you check this, then it'll start multiple builds for a job.
This works with the "This build is parameterized" checkbox.
You would still trigger the builds, passing your A or B as parameters. You can use another job to trigger them or you could do it manually via a script.
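For example, you might trigger the parameterized runs through the Jenkins remote API (a sketch; the host, job name, parameter name and credentials are assumptions, and your security setup may additionally require a CSRF crumb):
# Hypothetical sketch: start two concurrent parameterized builds via the
# Jenkins remote API (host, job and parameter names are assumptions).
$cred = Get-Credential   # Jenkins user + API token
foreach ($p in @('A', 'B')) {
    Invoke-RestMethod -Method Post -Credential $cred `
        -Uri "http://jenkins.example.com/job/jenkins_job_1/buildWithParameters?PARAM=$p"
}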
You can select Build a Multi-configuration project (Matrix build) when you create the job. Then, under the job's configuration, you can define the Configuration Matrix which lets you specify one or more parameters (axes) for different builds. Regarding running simultaneously, you should be able to run as many simultaneous builds as you have executors (with the appropriate label).
Unfortunately, the Jenkins wiki lacks documentation about this setup. There are a couple of previous SO questions, here and here, that might provide a little guidance. There was a "recent" blog post about setting up a multi-configuration job to perform builds on various platforms.
A newer (and better) solution is the Jenkins Job DSL Plugin.
We've been using it with great success. Our job configurations are now disposable: we can set up a huge stack of complicated jobs from some Groovy files and a couple of template jobs. It's great.
I'm liking it a lot more than the matrix builds, which were complicated and harder to understand.
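As a rough sketch of the idea (the job name, Ant target and property are assumptions, and the exact step syntax may vary between plugin versions):
// Hypothetical sketch of a Job DSL seed script: generate one job per
// parameter value instead of maintaining near-identical jobs by hand.
['A', 'B'].each { suffix ->
    job("jenkins_job_1_${suffix}") {
        steps {
            ant('build') {               // Ant target to run (assumption)
                prop('my.param', suffix) // property passed to the Ant script
            }
        }
    }
}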
There's nothing stopping you from doing this using the Jenkins pipeline DSL.
We have the same pipeline running in parallel in order to model combined loads for an application that exposes web services, provides a database to several external applications, receives data via several work queues and has a GUI front end. The business gives us non-functional requirements (NFRs) which our application must meet that guarantees its responsiveness even at busy times.
The different instances of the pipeline are run with different parameters. The first instance might be WS_Load, the second GUI_Load and the third Daily_Update_Load, modelling a large data queue that needs processing within a certain time-frame. More can be added depending on which combination of loads we're wanting to test.
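Something like the following driver could kick off those concurrent runs (a sketch; the job and parameter names are assumptions):
// Hypothetical sketch: start several concurrent runs of the same pipeline
// job with different load profiles (job and parameter names are assumptions).
['WS_Load', 'GUI_Load', 'Daily_Update_Load'].each { profile ->
    build job: 'combined-load-pipeline', wait: false,
          parameters: [string(name: 'LOAD_PROFILE', value: profile)]
}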
Other answers have talked about the checkboxes for concurrent builds, but I wanted to mention another issue: resource contention.
If your pipeline uses temporary files or stashes files between pipeline stages, the instances can end up pulling the rug from under each other's feet. For example, you can end up overwriting a file in one concurrent instance while another instance expects to find the pre-overwritten version of the same stash. We use the following code to ensure stashes and temporary filenames are unique per concurrent instance:
def concurrentStash(stashName, String includes) {
    /* make a stash unique to this pipeline and build
       that can be unstashed using concurrentUnstash() */
    echo "Safe stashing $includes in ${concurrentSafeName(stashName)}..."
    stash name: concurrentSafeName(stashName), includes: includes
}
def concurrentSafeName(name) {
    /* make a name or name component unique to this pipeline and build
     * guards against contention caused by two or more builds from the same
     * Jenkinsfile trying to:
     * - read/write/delete the same file
     * - stash/unstash under the same name
     */
    "${name}-${BUILD_NUMBER}-${JOB_NAME}"
}
def concurrentUnstash(stashName) {
    echo "Safe unstashing ${concurrentSafeName(stashName)}..."
    unstash name: concurrentSafeName(stashName)
}
We can then use concurrentStash stashName and concurrentUnstash stashName and the concurrent instances will have no conflict.
If, say, the two pipelines both need to store stats, we can do something like this for filenames:
def statsDir = concurrentSafeName('stats')
and then the instances will each use a unique filename to store their output.
You can create a build and configure it with parameters. Click the This build is parameterized checkbox and add your desired param(s) in the Configuration of the build. You can then fire off simultaneous builds using different parameters.
Side note: The "Bulk Builder" in Jenkins might push it into a queue, but there's also a This bulk build is parameterized checkbox.
I had a pretty large build queue, and I performed the steps below to run jobs in parallel in Jenkins, reducing the number of jobs waiting in the queue:
For each job, navigate to Configure and select the checkbox "Execute concurrent builds if necessary".
Navigate to Manage -> Configure System, look for "# of executors", and set the number of parallel executors you want (in my case it was set to 0 and I updated it to 2).
