Calculating Completed Work automatically in the TFS 2013 Agile template - tfs

How can I automatically accumulate hours in the Completed Work field? I tried to use the TFS Aggregator plugin, but it is not working for me. I need to sum all changes in Remaining Work.
I am using the Agile template, VS 2013, and TFS 2013.4.
<!-- Sum all remaining work -->
<AggregatorItem operationType="Numeric" operation="Sum" linkType="Self" linkLevel="1" workItemType="Task">
    <TargetItem name="CompletedWork"/>
    <SourceItem name="RemainingWork"/>
</AggregatorItem>
Can you help me?

If you're using the Scrum Process Template I don't think you have enough data to calculate this. The Remaining Work field is not a good option to try to capture this. I may start the day with Remaining Work = 8, work 6 hours on the item, but at the end of the day recognize that there is still 4 hours of work left (it was bigger than originally estimated). In that case the remaining work would only decrease by 4 even though I worked 6 hours.
If you need to capture actual work completed, you should be using a separate field on the Work Items. Both the Agile and CMMI process templates have fields for this. If you stick with the Scrum template I'd suggest adding a Completed Work field in addition to the Remaining Work field.
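If you go that route, the field can be added to the Scrum template's Task type with witadmin (exportwitd / importwitd). A sketch of the field definition, reusing the standard reference name from the Agile/CMMI templates (the other attributes are illustrative and may need adjusting for your collection):

```xml
<!-- Sketch: add inside the FIELDS section of the Scrum Task work item type.
     The refname below matches the standard Agile/CMMI Completed Work field,
     so reports that understand that field keep working. -->
<FIELD name="Completed Work" refname="Microsoft.VSTS.Scheduling.CompletedWork"
       type="Double" reportable="measure" formula="sum">
    <HELPTEXT>The number of hours of work that have been completed on this task</HELPTEXT>
</FIELD>
```

You would then also add the field to the Task form layout so users can enter hours against it.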

Related

Jenkins plugin with viewing/aggregating possibilities depending on one of the parameters

I'm looking for a plugin that would give me an aggregated view of settings and results for many cases, the same way the multibranch pipeline does, but based on a single branch varying only in parameters. The picture below is from the multibranch pipeline I mentioned: instead of "Branches" I'm looking for "Cases", and instead of the "Name" column I need a configurable parameter.
In addition, I need several periodic build triggers along these lines:
H 22 * * 5 %param1=value1 %param2=value3
H 22 * * 5 %param1=value2 %param2=value3
The triggers themselves could be done in a standard job, but there will be many such cases, launched periodically every week, every two weeks, or every month, and the difference in param1 is crucial: it needs to be readable and easily visible so I can quickly distinguish which case has failed.
I have looked for such a plugin but couldn't find one. Maybe someone knows of such a plugin, or another way to solve this.
One alternative is to create a "super" job whose build steps launch my current job with specific parameters. But then the readability changes from many rows to many columns, and since there are over 20 cases, that would IMHO significantly decrease readability. Additionally, not all cases would be launched with the same periodicity, so I would need predefined sets selected by a parameter, and most super-build runs would consist mostly of skips. As a result, one might not see the last result for some of the cases.
Note that param2 always has the same value for periodic launches; other values are used only with a manual trigger. Param2 may, but doesn't have to, be visible in the multibranch-pipeline-like view.
I hope my explanation of the issue is clear. Looking forward to answers/suggestions, etc. :)

TFS aggregator not working as expected

I have tried to use TFS Aggregator to simply total up a field:
<?xml version="1.0" encoding="utf-8"?>
<AggregatorItems tfsServerUrl="[server Url]">
    <AggregatorItem operationType="Numeric" operation="Sum" linkType="Self" workItemType="Task">
        <TargetItem name="Total Work"/>
        <SourceItem name="Total Work"/>
        <SourceItem name="Completed Work"/>
    </AggregatorItem>
</AggregatorItems>
What I want is for Total Work to start at zero (I have a default rule for that), and when someone enters (logs time in) Completed Work, for Total Work to simply accumulate it (+=).
But it seems to go crazy: when I refresh the page, it totals many, many times.
Is it because I am using Total Work as a SourceItem as well as the TargetItem?
Every time I refresh the task, the number gets bigger and bigger. I really only want it to total when someone enters a value in Completed Work.
As I recall, the way I made TFS Aggregator distinguish a user change event from one fired by its own update was to re-run the aggregation and check whether the value comes out the same: if the aggregation sees that a change is needed, it updates the work item; otherwise it stops.
Since your target is also a source, that check breaks: each pass adds the current Total Work back into the new total, so the value is never stable and every aggregator-triggered update looks like a fresh change.
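A minimal fix along those lines, assuming the field names from the question, is to stop using the target as a source, so re-running the aggregation is idempotent:

```xml
<!-- Total Work is only a target now; it is recomputed from Completed Work
     alone on each pass, so a second pass produces the same value. -->
<AggregatorItem operationType="Numeric" operation="Sum" linkType="Self" workItemType="Task">
    <TargetItem name="Total Work"/>
    <SourceItem name="Completed Work"/>
</AggregatorItem>
```

With Completed Work as the only source, the aggregator's re-run sees no change and stops, instead of compounding the total on every refresh.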

Burndown chart for TFS with Ideal trend starting at max(remaining hours) and not first date(remaining hours)

We use the MSF Agile 5.0 process template in TFS. I wonder if anyone has tweaked the burndown chart so the Ideal Trend line gets its height from the highest point Remaining Work has reached, rather than from the Remaining Work value on day 1 of the iteration.
In the burndown below, the Ideal Trend would start at 130 hours and not 20.
Does anyone have an rdl-file to share?
Here is a question that shows the code in the report, but I don't know how to change it: http://social.msdn.microsoft.com/Forums/sv-SE/tfsreporting/thread/e901242b-3a82-41e6-8fa4-d0ab29bceb5e
I wonder if you could simply set the start date of the iteration to Nov 6th, for example?

TFS Cube - Total Lines of Code appears incorrect?

I'm using the TFS cube as documented here and am getting a curious result for 'total lines'. If I look at a file inside of visual studio, I see that a file is perhaps 42 lines long (total, comments, whitespace, and all). However, when I ask the TFS cube for that same information, it tells me that the file is almost - but not exactly - twice its size.
I have my pivot table set up as follows:
Report Filter includes a specific team project, and is filtered on file extension (.cs)
Row labels set to Filename.Parent_ID
Values set to 'Total Lines'
I've looked at the MSDN guidance here and can't see what I've done wrong, other than noting that I have not selected an individual build (if I do so, I get no results).
Edit: As it may be relevant, I'm using TFS 2008 SP1 with SQL 2005 standard. There is a note on the MSDN page which cautions me that SQL 2005 Standard does not support perspectives, and 'the cube elements from all perspectives reside in the team system data cube'. I'm not sure what that means for this problem, if anything.
Check the line breaks in the files: do the numbers change if you convert the files between Windows and Linux line endings?
Try adding lines with 60, 90, 150, and 200 characters and check how many added lines are reported; there might be some word-wrapping involved.

How to fix the endless printing loop bug in Nevrona Rave

Nevrona Designs' Rave Reports is a report engine for use with Embarcadero's Delphi IDE.
This is what I call the Rave Endless Loop bug. In Rave Reports
version 6.5.0 (VCL10) that comes bundled with Delphi 2006, there is a
notorious bug that plagues many Rave report developers. If you have a
non-empty dataset, and the data rows for this dataset fit exactly into a
page (that is to say there are zero widow rows), then upon PrintPreview,
Rave will get stuck in an infinite loop generating pages.
This problem has been previously reported in this newsgroup under the
following headings:
"error: generating infinite pages"; Hugo Hiram 20/9/2006 8:44PM
"Rave loop bug. Please help"; Tomas Lazar 11/07/2006 7:35PM
"Loop on full page of data?"; Tony Chistiansen 23/12/2004 3:41PM
reply to (3) by another complainant; Oliver Piche
"Endless lopp print bug"; Richso 9/11/2004 4:44PM
In each of these postings, there was no response from Nevrona, and no
solution was reported.
Possibly, the problem has also been reported on an allied newsgroup
(nevrona.public.rave.reports.general), to wit:
6. "Continuously generating report"; Jobard 20/11/2005
Although it is not clear to me if (6) is the Rave Endless loop bug or
another problem. This posting did get a reply from Nevrona, but it was
more in relation to multiple regions ("There is a problem when using
multiple regions that go over a page-break.") than the problem of zero
widows.
This is more of a work-around than a true solution. I first posted this work-around on the Nevrona newsgroup (Group=nevrona.public.rave.developer.delphi.rave; Subject="Are you suffering from the Rave Endless Loop bug?: Work-around announced."; Date=13/11/2006 7:06 PM)
So here is my solution. It is more of a work-around than a good
long-term solution, and I hope that Nevrona will give this issue some
serious attention in the near future.
Given your particular report layout, count the maximum number of rows
per page. Let us say that this is 40.
Set up a counter to count the rows within the page (as opposed to rows within the whole report). You could do this either by event script or by a CalcTotal component.
Define an OnBeforePrint scripted event handler for the main data band.
In this event handler set the FinishNewPage property of the main data band to be True when the row-per-page count is one or two below the max (in our example, this would be 38). And set it to False in all other cases. The effect of this is to give every page a non-zero number of widows (in this case 1..38), thus avoiding the condition that gives rise to the Rave Endless loop problem.
Thanks so much for this Sean - unfortunately this wouldn't work for me but I came up with another solution...
You see I have a memo at the top of the region that might expand or contract depending on how many notes the user has left in the database. This means that the number of rows that can fit on a page varies.
However, there is another solution: use the MaxHeightLeft property of a data band.
All you do is measure the height of your data band, multiply it by 2, and put the result in MaxHeightLeft. This forces one or two records onto the next page when the page fills up that much.
Thanks a lot, this thread helped me out of my endless-printing-loop problem in Nevrona Rave. I set MinHeightLeft to 0.500; this setting works, but I'm not sure it will work for other result sets of my query report.
Master,
The solution is setting MinHeightLeft to 0.500. I had the WasteFit property set to True and the loop occurred on the second print, but when I changed MinHeightLeft to 0.500 the error disappeared.
Thanks !
Atte
Fabiola Herrera.
Fabi_ucv#hotmail.com
