Here is an example of what I have today:
And it remains the same regardless of the value passed to the Verbosity parameter.
Is there a way to reduce this clutter?
We are using TFS 2015.
In a build process that is based on the Default Template or the Upgrade Template, you can use the Logging Verbosity build process parameter to manage the verbosity of the information that is logged and stored.
The available Logging Verbosity values and their corresponding effects are described in this link: Manage Build Information and Control Verbosity
If you are using a custom build template, it is suggested to use minimal verbosity when designing the custom build process template.
Users of your build process rely on verbosity filtering to reduce information overload. You can help make this filtering more effective by following the best practice of setting verbosity as low as possible.
These are basically the targets that MSBuild compiles, and in my experience they cannot really be controlled with the verbosity setting.
Our solution was to use a custom activity to run the build (we have a wrapper) in which we did not pass the TFS logger parameter to MSBuild.
However, in that case you should consider that the MSBuild task can run for a long time, so your custom activity should be cancellable. On this topic I can suggest reading the article below:
https://devtfs.wordpress.com/2013/09/24/dealing-with-long-running-activities/
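To illustrate the pattern, a rough sketch of such a cancellable wrapper activity could look like the following. This is not our actual wrapper: the msbuild.exe invocation, the argument handling and the exit-code plumbing are placeholders you would need to adapt.

using System;
using System.Activities;
using System.Diagnostics;
using Microsoft.TeamFoundation.Build.Client;

// Sketch of a cancellable wrapper activity that runs MSBuild without the TFS logger.
// The msbuild.exe path and arguments are placeholders; adapt them to your own wrapper.
[BuildActivity(HostEnvironmentOption.Agent)]
public sealed class RunMsBuildQuietly : AsyncCodeActivity<int>
{
    public InArgument<string> Arguments { get; set; }

    protected override IAsyncResult BeginExecute(
        AsyncCodeActivityContext context, AsyncCallback callback, object state)
    {
        var psi = new ProcessStartInfo("msbuild.exe", context.GetValue(Arguments))
        {
            UseShellExecute = false
        };
        var process = Process.Start(psi);

        Func<int> waitForExit = () => { process.WaitForExit(); return process.ExitCode; };

        // Keep the process (for Cancel) and the delegate (for EndExecute) around.
        context.UserState = Tuple.Create(process, waitForExit);
        return waitForExit.BeginInvoke(callback, state);
    }

    protected override int EndExecute(AsyncCodeActivityContext context, IAsyncResult result)
    {
        var userState = (Tuple<Process, Func<int>>)context.UserState;
        return userState.Item2.EndInvoke(result);   // MSBuild exit code
    }

    protected override void Cancel(AsyncCodeActivityContext context)
    {
        // Called when the build is stopped; kill the long-running MSBuild process.
        var userState = (Tuple<Process, Func<int>>)context.UserState;
        if (!userState.Item1.HasExited)
        {
            userState.Item1.Kill();
        }
        base.Cancel(context);
    }
}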
You may also want to show the compiled targets somehow, in which case you might need to write your own custom MSBuild logger and a UI to display the result.
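If you go down that road, a minimal custom logger only needs to subscribe to the events it cares about. Purely as an illustration (the class name and output format are made up), something like this records just the target names:

using System;
using Microsoft.Build.Framework;
using Microsoft.Build.Utilities;

// Minimal example logger: records only the targets as they are built and ignores
// everything else, which keeps the output far quieter than the default logger.
public class TargetOnlyLogger : Logger
{
    public override void Initialize(IEventSource eventSource)
    {
        eventSource.TargetStarted += (sender, e) =>
            Console.WriteLine("Building target: {0} ({1})", e.TargetName, e.ProjectFile);
    }
}

You would then pass it to MSBuild with something like /logger:TargetOnlyLogger,MyLoggers.dll (the assembly name here is hypothetical) and feed its output to whatever UI you build.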
I am trying to set up build automation for a project developed in a legacy language called Team Developer 6, where each file needs to be compiled into an exe. I also need to do some filtering before building the exes. There are 300 exes.
I could do this in a simple .NET utility that does the filtering and invokes the Team Developer compiler for the required files.
Is it possible to put this into the TFS build workflow? What is the best approach for this?
Write an MSBuild project that invokes the necessary commands for the tooling you require and check it in. In the TFS build definition, make use of the default template (at first) and set the MSBuild project file you created as the 'project to build'.
This way you can test your build process locally with MSBuild on the command line and determine which command-line switches you might need. You can set command-line switches in the build definition, or if you need further control you can modify the default template to inject the switches directly into the MSBuild activity.
I recommend this way, as then you won't have to create any customized workflow, and can avoid having to go down the road of using custom workflow activities in TFS (which is absolutely supported, but in my opinion a bit difficult to diagnose/debug/maintain/upgrade).
You would ideally want to use an InvokeProcess activity to call an executable which does the filtering and invoking. An alternative but more complex approach would be to create a custom activity, but that requires installation of binaries on the build servers.
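To give an idea, the executable that InvokeProcess calls could be as simple as the sketch below. The file pattern, the filter rule and the compiler name and switches are assumptions; substitute whatever your Team Developer 6 installation actually uses.

using System;
using System.Diagnostics;
using System.IO;
using System.Linq;

// Sketch of a small console utility that filters the sources and invokes the
// compiler for each required file. Compiler name and switches are assumptions.
class BuildAllExes
{
    static int Main(string[] args)
    {
        string sourceDir = args.Length > 0 ? args[0] : ".";

        // Filter step: pick only the files that actually need to become an exe.
        var appFiles = Directory.EnumerateFiles(sourceDir, "*.app", SearchOption.AllDirectories)
                                .Where(f => !Path.GetFileName(f).StartsWith("_"));

        foreach (var file in appFiles)
        {
            var psi = new ProcessStartInfo("tdcompiler.exe", "\"" + file + "\"")
            {
                UseShellExecute = false
            };
            using (var compiler = Process.Start(psi))
            {
                compiler.WaitForExit();
                if (compiler.ExitCode != 0)
                {
                    Console.Error.WriteLine("Compile failed: " + file);
                    return compiler.ExitCode;   // non-zero exit code fails the build step
                }
            }
        }
        return 0;
    }
}

Returning a non-zero exit code when a compile breaks lets the InvokeProcess step fail the build, which is usually what you want with 300 exes.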
I'm attempting to modify my build process file for TFS 2010. I have a flag that is set when queuing the build, and when said flag is set, I want to create a Label, and add all the source files in the compiled project to that label.
On subsequent builds, with the flag set, I then want to replace older source files in that label with anything new in the changeset being compiled.
I've been attempting to do this with LabelSources with no luck, and there is only very poor documentation on either LabelSources or LabelWorkspace (what's the difference?).
Here's what I currently have:
<mtbwa:LabelSources
Child="[LabelChildOption.Replace]"
Comment="Published to Container"
DisplayName="Create Container Label"
sap2010:WorkflowViewState.IdRef="LabelSources_1"
Items="[{&quot;$/Foo/LabelTest/Sandbox/&quot;}]"
Name="[String.Format(&quot;{0}-{1}&quot;, LabelName, Version_Container)]"
Recursion="[RecursionType.Full]"
Scope="$/Foo"
mva:VisualBasic.Settings="Assembly references and imported namespaces serialized as XML namespaces"
Version="T" />
It definitely hits the action, but no labels can be found after the fact.
Any help would be much appreciated, and any tangible documentation, other than class documentation with sparse definitions, would also be greatly appreciated.
Edit 1: Tried to clear up my goal.
What you are trying to do is built into the existing template. There should be an option in the process definition that refers to Clean Sources which will be set to True.
This option controls whether the build sources get cleaned (deleted and fetched afresh) or whether an incremental get is done.
If you have a lot of source code, you can set Clean Sources to false and save a bunch of time getting the code.
You can also speed up the build by placing a TFS proxy on the build box, which will cache the files and make a clean build quicker.
In my experience, most of the built-in activities are poorly documented for a reason: their only well-tested use case is their use inside TFS's built-in templates (DefaultTemplate.11.0.xaml, etc.). I'm afraid you're going to have to write some custom code, in the form of a custom activity, PowerShell script or something similar, to achieve other goals.
That said, I don't really understand the process you're trying to set up. Do you just want a label that marks your latest successfully built sources? Why not use the one created automatically by the build itself?
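If you do end up writing a custom activity for this, a rough sketch of what it would take, using the version control client API (the activity and argument names here are made up for illustration), is:

using System.Activities;
using Microsoft.TeamFoundation.Build.Client;
using Microsoft.TeamFoundation.VersionControl.Client;

// Sketch of a custom activity that creates/replaces a label via the version
// control client API. Argument names and defaults are illustrative only.
[BuildActivity(HostEnvironmentOption.All)]
public sealed class CreateContainerLabel : CodeActivity
{
    public InArgument<string> LabelName { get; set; }
    public InArgument<string> Scope { get; set; }        // e.g. "$/Foo"
    public InArgument<string> ItemPath { get; set; }     // e.g. "$/Foo/LabelTest/Sandbox"

    protected override void Execute(CodeActivityContext context)
    {
        var buildDetail = context.GetExtension<IBuildDetail>();
        var vcs = buildDetail.BuildServer.TeamProjectCollection
                             .GetService<VersionControlServer>();

        var label = new VersionControlLabel(
            vcs,
            context.GetValue(LabelName),
            null,                                  // owner: current user
            context.GetValue(Scope),
            "Published to Container");

        var items = new[]
        {
            new LabelItemSpec(
                new ItemSpec(context.GetValue(ItemPath), RecursionType.Full),
                VersionSpec.Latest,
                false)
        };

        // LabelChildOption.Replace swaps older file versions for the new ones,
        // which is the "replace on subsequent builds" behaviour described above.
        vcs.CreateLabel(label, items, LabelChildOption.Replace);
    }
}

You would bind the arguments in the template and wrap the activity in an If activity driven by your queue-time flag.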
The TFS build flow is defined in TFS 2010's build template (which is in fact a Windows Workflow Foundation file with a *.xaml extension).
That was pretty convenient for dealing with a single build definition in a simple project, but in the near future we'll have a more complicated project with many very different build definitions, while at the same time some of them will share significant common logic.
We don't want the common logic replicated in each build template, but on the other hand one super-smart, parameterizable build is not considered the best idea either.
Long story short, the question is:
Is there any way to put the common logic into another build template (or whatever else) and reuse it?
If not, do you have any approaches/recommendations for such a situation?
UPDATE
As K.Hoff mentioned, it is possible to create custom activities, but I want to go deeper and reuse not only activities but sequences as well (put simply, something like what Ant or NAnt do: include one file in another, call one sequence from another, etc.).
I would recommend checking whether it is possible to write a code activity that executes a workflow (.xaml file) containing the common build functionality. Such a code activity could then be placed in several "master" build templates, so the common flow is reused.
Here is an example of how to dynamically load and execute a workflow: http://msdn.microsoft.com/en-us/vs2010trainingcourse_introtowf_topic8.aspx
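To make that concrete, a code activity along these lines (the "Configuration" argument and the way the path is supplied are placeholders) can load and run a shared .xaml from inside a master template:

using System.Activities;
using System.Activities.XamlIntegration;
using System.Collections.Generic;

// Minimal sketch of a code activity that loads and runs a shared workflow file.
public sealed class RunSharedWorkflow : CodeActivity
{
    public InArgument<string> WorkflowPath { get; set; }

    protected override void Execute(CodeActivityContext context)
    {
        Activity shared = ActivityXamlServices.Load(context.GetValue(WorkflowPath));

        var inputs = new Dictionary<string, object>
        {
            // The key must match an InArgument declared by the shared workflow;
            // "Configuration" is just an assumed example.
            { "Configuration", "Release" }
        };

        // Runs the shared sequence synchronously inside the master template.
        IDictionary<string, object> outputs = WorkflowInvoker.Invoke(shared, inputs);
    }
}

WorkflowInvoker runs the shared sequence synchronously; inputs and outputs are passed as dictionaries keyed by the shared workflow's argument names.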
We have a similar situation, but since most of our build scenarios are similar (i.e. get->build->test->deploy) we have mostly solved it with one big definition and custom activities. But we also make use of the ExecuteWorkflow activity available from Community TFS Build Extensions.
This works well for "simple" scenarios; the reason we don't use it more extensively is that it's quite complicated to pass parameters between workflow executions. Here's a link to a problem I had with this (and, further down, the solution I found).
You can create custom code activities as explained here and reuse them in other build templates.
Another way is to implement good old MSBuild scripts and call them from the MSBuild execution activities, so they can be reused in many build process templates.
I can't find a quick way to reuse complete sequences; the only way we found is to write the activities as generically as possible and inject parameters to get them to run.
But I don't think it's a TFS problem; it's a Workflow problem.
We've written a framework to test the performance of our Java application (none of the existing frameworks, e.g. JMeter, were appropriate). The framework produces various metrics, e.g. mean/min/max transactions per second.
We'd like each Jenkins build to display these metrics so that we can keep track of whether a commit has improved performance or not.
I can't figure out how to do this.
One idea is to modify our performance test framework to output an HTML file, then somehow make Jenkins display/link to it on the build results page.
Any advice gratefully received.
The Performance Plugin can show the results of JMeter and JUnit tests in a nice, graphical fashion, and the plugin page has a description of how to use it.
This is an open-source plugin hosted on GitHub. The JUnit and JMeter parsers are already there, but you can implement your own just by subclassing PerformanceReportParser. It's pretty easy; you can just fork the repo and start your implementation.
I agree that it is hard (if not impossible) to squeeze all the information into standard formats like JUnit. They are good for quick identification of problems. Once you know there is a problem, you need more information, which is usually free-form or custom-formatted to fit your particular needs. So we use both: JUnit, which can be immediately processed by Jenkins to decide whether the build is stable or not, draw the nice trend graph, etc. We also produce an HTML report that is much more detailed.
Now to your immediate question: you can simply archive your HTML file as an artifact (there is a standard post-build step to do that). Then a link to it will be displayed among the artifacts for the build. There are permalinks to the latest artifacts and latest successful build artifacts:
http://[server]/job/[job_name]/lastCompletedBuild/artifact/foo.html
http://[server]/job/[job_name]/lastSuccessfulBuild/artifact/foo.html
You may bookmark those links and have quick and easy one-click access to your results.
You could use the HTML Publisher Plugin to publish the resulting HTML page. That would be pretty straightforward.
If you want better integration you could try to create output that follows the same format JMeter produces, and use the Performance Plugin.
For the best result you could take Łukasz's advice and modify the Performance Plugin to your needs. That requires the most effort on your part, of course.
I am playing with TFS 2010, and am trying to set up a build process that will have some custom steps.
These include things like stopping/starting IIS, searching and replacing in files, etc., across environments.
I have tried to look for examples online and have not found anything clear and meaningful on how to just run a script or something over the source files. Looking at the default build process template (DefaultTemplate.xaml), I can't make much sense of it.
How do I go about doing this?
For info on customising the TFS 2010 workflow build templates, have a look at Ewald Hoffman's series. Start with Part 1 (archived here).
I should also mention that since it looks like you're doing deployment, you may want to break deployment automation away from build automation.
This is almost exactly what I'd say for this question (Split build and deployment stages, investigate TFSDeployer). One additional, more generic point: for deployment tasks where you can't find an easy integrated tool, you should create a custom deployment script. You can call any script by adding an "InvokeProcess" step in your build workflow. TFSDeployer also has locations where you can insert custom PowerShell scripts. (If you don't like PowerShell, you can have PowerShell or "InvokeProcess" call a different script engine.)