After upgrading from TFS 2017 RTM to TFS 2017 Update 3, the following job started throwing an ExtensionNotFound exception.
On the Service Hooks hub, no hooks are listed, and no credentials are required to open it.
Is there any reason for this behavior in TFS 2017 Update 3? Perhaps the account being used requires some special permissions in that update?
Another possibly related observation: the Event Viewer shows errors around the same time the mentioned job fails in TFS 2017 Update 3:
The subscriber
Microsoft.TeamFoundation.TestManagement.Server.TestRunEventListener
has been disabled.
Could not load file or assembly
'Microsoft.TeamFoundation.TestManagement.Server, Version=15.0.0.0,
Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a' or one of its
dependencies. The system cannot find the file specified. (type
FileNotFoundException)
I do have the assembly, so I am not sure what is going on.
After digging deeper into this behavior, I have been able to answer the part of the question about the missing assembly. The folder where it is currently installed really does not contain all of its dependencies. It is in the
Application Tier\TFSJobAgent folder
and some of the assemblies it needs are missing there. So I have moved it to what I believe are the proper locations, namely both:
Application Tier\Web Services\bin
Application Tier\TFSJobAgent\Plugins
because those folders contain all of the assemblies that
Microsoft.TeamFoundation.TestManagement.Server
is referencing. I have also installed TFS 2017 RTM on another machine, and comparing the two confirms that the current location of that particular assembly is wrong.
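For anyone hitting the same thing, the copy itself can be scripted. This is only a sketch in Python; the TFS install root is an assumption (adjust it for your server), and the folder names are the ones from the post above. Verify everything on your own machine before moving files around.

```python
import shutil
from pathlib import Path

# Hypothetical TFS install root; adjust for your server.
TFS_ROOT = Path(r"C:\Program Files\Microsoft Team Foundation Server 15.0")

ASSEMBLY = "Microsoft.TeamFoundation.TestManagement.Server.dll"
SOURCE = TFS_ROOT / "Application Tier" / "TFSJobAgent" / ASSEMBLY

# The two locations where the assembly's dependencies actually live.
TARGETS = [
    TFS_ROOT / "Application Tier" / "Web Services" / "bin",
    TFS_ROOT / "Application Tier" / "TFSJobAgent" / "Plugins",
]

def copy_assembly(source, targets):
    """Copy the assembly into each target folder; return the new paths."""
    copied = []
    for target in targets:
        target.mkdir(parents=True, exist_ok=True)
        dest = target / source.name
        shutil.copy2(source, dest)  # copy2 preserves timestamps
        copied.append(dest)
    return copied

# copy_assembly(SOURCE, TARGETS)  # uncomment on the TFS machine
```

Copying rather than moving leaves the original in place, so the job agent can be rolled back if the relocation turns out to cause other issues.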
Now it is still interesting why that had happened, and if there are other issues with moving that assemblies around, and why it has been moved at first place after the upgrade to TFS Update 3?
I would need some time to verify the initial question about ExtentionNotFound though, will keep you updated.
Related
We have just upgraded from TFS 2013 to TFS 2017. We had a custom plugin that ran when we changed the build quality. Since the upgrade it doesn't fire. We have tried rebuilding against the 2017 client DLLs, but the build quality handler does not trigger the plugin. It uses the Microsoft.TeamFoundation.Framework.Server.ISubscriber interface. We do not get any exceptions on the TFS server either.
The ISubscriber implementation needs to be recompiled against the TFS 2017 Server as well as Client object model.
And it's important to understand that the new build infrastructure (the non-XAML builds) likely triggers a different set of notifications. At least they're not queryable with the old client object model's IBuildServer; you need to use the new REST API.
Without knowing more about your setup (what type of builds, the exact versions of the object model you're binding against, what permissions the TFS service user has), it's hard to tell where this is going wrong. We have a troubleshooting guide for TFS Aggregator (https://github.com/tfsaggregator/tfsaggregator/wiki/Troubleshooting), which is also an ISubscriber plugin; it may help you debug your setup.
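To make the REST API point concrete, listing the new-style builds is an HTTP GET against the build endpoint. A small sketch of the URL shape (the server URL and project name are placeholders; api-version 2.0 is what the on-premises TFS 2015/2017 build API accepts):

```python
def builds_url(collection_url, project, api_version="2.0"):
    """REST endpoint for listing builds in the new (non-XAML) build system.

    Unlike the old IBuildServer client object model, these builds are
    queried over HTTP with any client plus NTLM or PAT credentials.
    """
    base = collection_url.rstrip("/")
    return f"{base}/{project}/_apis/build/builds?api-version={api_version}"

# Hypothetical collection and project names:
url = builds_url("http://tfs:8080/tfs/DefaultCollection", "MyProject")
```

The response is JSON with a `value` array of build objects, which replaces what you would previously have enumerated through IBuildServer.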
We have set up TFS (2015) with a build server and have several solutions that are built (some manually, others automatically).
Sometimes a build will inexplicably fail, with a lot of errors stating that assemblies are missing, like so:
The type 'Object' is defined in an assembly that is not referenced. You must add a reference to assembly 'System.Runtime, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a'.
To fix it I simply need to queue the build again. No edits, no check-ins, just "queue new build", and it builds successfully.
It can happen on manual builds, on automatic builds, on a single build, or when building multiple solutions. I can't see any recognizable pattern to why this happens. I don't think it is a NuGet issue, because we see different errors when the build server fails to download NuGet packages (also, is NuGet even used for downloading System.Runtime? I assume that library is readily available on the server).
As I said, it is easy to fix this, but we're using continuous integration and automatic deployment (Octopus) to streamline our release cycle, and these "fake" errors are damn annoying when the build error buzzer starts ringing.
Try setting 'MSBuild Arguments' to /m:1 to force MSBuild to use a single process for all projects, and see whether that helps.
The issue was, as might be expected, human error. All available build agents were added to the pool for this particular solution, and one of those agents didn't have .NET Framework 4.5.2 installed. The agent was removed from the pool, and if the feedback I got from the sysadmins is correct, the errors should stop occurring now.
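One way to catch this kind of agent misconfiguration early is to check the .NET Framework `Release` DWORD each agent reports under `HKLM\SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full`. A sketch of the mapping logic (the registry read itself is omitted so this stays platform-neutral; the threshold values are the minimum Release numbers Microsoft publishes for each version):

```python
# Minimum Release values for .NET Framework 4.x, per the NDP\v4\Full key.
RELEASE_TABLE = [
    (378389, "4.5"),
    (378675, "4.5.1"),
    (379893, "4.5.2"),
    (393295, "4.6"),
    (394254, "4.6.1"),
    (394802, "4.6.2"),
]

def framework_version(release):
    """Return the highest .NET Framework version implied by a Release value."""
    best = None
    for threshold, version in RELEASE_TABLE:
        if release >= threshold:
            best = version
    return best

def has_452(release):
    """True when the machine has at least .NET Framework 4.5.2 installed."""
    return release >= 379893
```

Running a check like this as a first build step on every agent in the pool would have flagged the offending machine before any solution failed to compile.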
I am getting the following error when running my build on Visual Studio Online (using the built-in Build Controller):
C:\Program Files
(x86)\MSBuild\14.0\bin\amd64\Microsoft.Common.CurrentVersion.targets
(3962): Could not copy
"d:\a\src\MySolution\MyProject\Trunk\packages\Microsoft.Data.Edm.5.6.4\lib\net40\Microsoft.Data.Edm.xml"
to "..\Build\bin\Release\Microsoft.Data.Edm.xml". Beginning retry 1 in
1000ms. The process cannot access the file
'..\Build\bin\Release\Microsoft.Data.Edm.xml' because it is being used
by another process.
It is never the same file either, but it always seems to be an XML or DLL file from the packages folder.
EDIT: I'm not sure if it is worth mentioning, but I do have multiple workspaces and multiple build definitions using this repository.
I found the problem. Completely unrelated to the error above.
I went into the msbuild log files and found this:
Failed to produce diagnostics extension's config for
MyRole\diagnostics.wadcfgx. Error : Could not find a part of the path
'd:\a\src...\MyRole\diagnostics.wadcfgx'. Done Building Project
"d:\a\src...\MyCloudProject.Cloud.ccproj" (Publish target(s)) --
FAILED.
I was missing a file in source control.
I do wonder why this error did not bubble up into my build summary. And where did that initial error come from?
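When the build summary hides the real failure like this, scanning the full MSBuild log for failure markers is a quick way to surface it. A small sketch (the marker list is my own choice; the sample in the test mirrors the log excerpt above):

```python
def find_failures(log_text):
    """Return log lines that look like real failures, even when the build
    summary only surfaced an unrelated file-copy retry."""
    markers = ("error", "failed", "could not find")
    return [
        line.strip()
        for line in log_text.splitlines()
        if any(m in line.lower() for m in markers)
    ]
```

Against the log above, this would have pulled out the `Failed to produce diagnostics extension's config` line immediately, instead of leaving only the misleading file-lock retries in view.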
I am using TFS with Visual Studio 2013 and have been able to work around this issue by closing all open documents that I want to check in (it seems VS locked itself out) and/or resolving conflicts. The error message is sufficiently vague as to be useless in identifying the actual cause of the check-in failure.
Update 02 November 2016:
I'm not sure why VS 2013 and TFS don't play nicely together via the Team Explorer 'Check in Pending Changes' button, but that path consistently fails to launch the conflict resolver, a key piece of the check-in process.
The following works for me on VS 2013 with TFS hosted on a SQL Server Express 2014 database:
1. Launch Source Control Explorer: Team Explorer tab -> Source Control Explorer
2. Navigate to your solution repository
3. Then proceed to do the following for each project that you want to check in:
a. Right click project
b. Check in pending changes
c. Resolve conflicts and repeat steps 3a and 3b until no pending changes remain for the project
I've been tasked with setting up a new Team Foundation/Build server at my company, with which we'll be starting a new project. Nobody here currently has experience with TFS, so I'm learning all of this on my own. Everything is working so far: the server has been set up, the repository and Team Project have been created, the build server has been created, and I've created a simple hello world application to verify that source control and the Continuous Integration builds (on the build server) run properly.
However, I'm having a problem setting up the automatic versioning. I've installed the TfsVersioning project, and it's working fine; I'm able to define a format for my assembly versions. I haven't yet decided what format I'll use; probably something like Major.Minor.Changeset.Revision (I'm aware of the potential problem regarding using the changeset number in the assembly version, so I may decide to switch to Major.Minor.Julian.Revision before we begin development).
The problem:
I don't want assemblies to get new file versions if their source code has NOT changed since the last build. With a Continuous Integration build this isn't a problem, as the build server only grabs the source files that have changed, triggering an incremental build that produces only the updated modules; the unchanged modules won't be rebuilt, so their versions remain unchanged.
If I set up a nightly build, I'll want to clean the workspace and perform a Build-All. However, this means that ALL assemblies will get a new version (assuming the Assembly File Version includes the build number).
A solution?
This has prompted me to consider using the latest changeset number in the Assembly File Version. That way, if nothing has been committed between two successive Build-Alls, the versions won't change. However, a change and commit to a single file would then force a version increment on ALL assemblies.
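To make that trade-off concrete, here is the changeset-based scheme as a tiny function (a sketch; the names and numbers are mine): two Build-Alls at the same changeset produce identical versions, while a single commit anywhere bumps every assembly built afterwards.

```python
def assembly_version(major, minor, changeset, revision=0):
    """Major.Minor.Changeset.Revision: deterministic for a given changeset."""
    return f"{major}.{minor}.{changeset}.{revision}"

# Two successive Build-Alls with no intervening commit: identical versions.
v1 = assembly_version(1, 0, 4211)
v2 = assembly_version(1, 0, 4211)

# One commit (changeset 4212) changes the version of every assembly built.
v3 = assembly_version(1, 0, 4212)
```

The upside is reproducibility (a version maps back to an exact changeset); the downside, as noted above, is that the changeset component is global rather than per-assembly.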
I'm looking for one of two things:
A way to only increment Assembly Version Numbers if their source/dependencies have changed since the last build. Successive Build-Alls should not cause changes in version numbers.
OR
A way for testers and non-developers to be able to tell version W.X.Y.Z and version W.X.Y.Z+1 of assembly 'Foo' are identical, even though they have differing file versions.
I've probably read about 20 articles on the subject, and nobody (except this guy) seems to address the issue. If what I'm asking for isn't common practice in Team Foundation ALM, how do I address the second bullet point above?
Thanks for your time!
This is something I did in the past. The solution has two critical points:
You must use an incremental build, i.e. Clean Workspace = None
The change to AssemblyInfo.cs must be computed per project
The latter is the more complex part, and I will only sketch the solution here.
In the custom MSBuild properties, use CustomAfterMicrosoftCommonTargets to inject a hook into the normal Visual Studio compile:
/property:CustomAfterMicrosoftCommonTargets=custom.proj
Also forward a value for the version
/property:BuildNumber=1.2.3.4
In custom.proj, redefine the BeforeCompile target to something similar to this:
<Target Name="BeforeCompile"
        Inputs="$(MSBuildAllProjects);
                @(Compile);
                @(_CoreCompileResourceInputs);
                $(ApplicationIcon);
                $(AssemblyOriginatorKeyFile);
                @(ReferencePath);
                @(CompiledLicenseFile);
                @(EmbeddedDocumentation);
                $(Win32Resource);
                $(Win32Manifest);
                @(CustomAdditionalCompileInputs)"
        Outputs="@(DocFileItem);
                 @(IntermediateAssembly);
                 @(_DebugSymbolsIntermediatePath);
                 $(NonExistentFile);
                 @(CustomAdditionalCompileOutputs)"
        Condition="'$(BuildNumber)'!=''">
  <Message Text="*TRACE* BuildNumber: $(BuildNumber)"/>
  <MyTasksThatReplaceAssemblyVersion
      BuildNumber="$(BuildNumber)"
      File="$(MSBuildProjectDirectory)\Properties\AssemblyInfo.cs"/>
</Target>
You need a task that replaces the AssemblyFileVersion in the AssemblyInfo.cs source; the MSBuild Extension Pack has an AssemblyInfo task for this purpose.
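If you would rather not take the MSBuild Extension Pack dependency, the substitution such a task performs is essentially one regex over the file. A sketch (the regex assumes the standard C# AssemblyInfo template; `MyTasksThatReplaceAssemblyVersion` above would do the equivalent in a compiled MSBuild task):

```python
import re

def stamp_file_version(assembly_info_text, build_number):
    """Rewrite the AssemblyFileVersion attribute to the forwarded BuildNumber."""
    return re.sub(
        r'(\[assembly:\s*AssemblyFileVersion\(")[^"]*("\)\])',
        lambda m: m.group(1) + build_number + m.group(2),
        assembly_info_text,
    )
```

Because the target only runs when `$(BuildNumber)` is set (see the Condition above), local developer builds leave AssemblyInfo.cs untouched.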
I posted the full details at my blog here, here and here.
Some time ago I asked a question about how to integrate an application with dependencies on a build server, and I got quite satisfying answers. Today I am facing a different case. For a project I have to use non-redistributable dependencies (the RDL object model for SSRS). That means these assemblies are not, out of the box, meant to be deployed for development purposes. But somehow, I need to...
My first guess was to publish them in the GAC. Fine, it worked, and the build server was able to compile the project smoothly. But then I realised that it broke some applications such as Report Server and Report Builder (it would probably also break BIDS). So publishing to the GAC is definitely not a decent solution.
My second guess was to check the assemblies into source control. Well, that could work if I had only 2 assemblies totalling about 1 MB. But here there are 23 assemblies weighing 29 MB that I would have to check in, so it is not really suitable either.
I don't know much about MSBuild targets; maybe they could be a solution, but I really have no idea how to use them. I have been scratching my head hard, and now I have to choose between breaking my builds or breaking my services!
As some people suggested in the comments, we finally decided to source control the assemblies.
But since we work in an environment where we sometimes need to move around a lot, which means not always being in the office and occasionally working remotely over a somewhat unreliable Internet connection, we set some strict conditions for deciding whether we source control the assemblies or deploy them to the build server and development machines.
Assemblies will be source controlled if all of these criteria are met:
The assemblies/framework are not deployable/redistributable
Deploying the assemblies/framework may interfere with the stability of local machine services
The total size of the assemblies checked into the project collection does not exceed 100 MB
You could try using a different repository just for these assemblies, and do a checkout/update during the build job.
Also, if you want to keep it in the main repo as well, you could use svn:externals (http://svnbook.red-bean.com/en/1.0/ch07s03.html) to automatically update the DLLs when you update your working copy.
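For example, an svn:externals definition set on the project's trunk might look like this (a sketch; the local directory name and repository URL are hypothetical, and the syntax follows the SVN book chapter linked above):

```
# svn:externals property on trunk/
# <local-dir>    <source URL>
libs/ssrs-rdl    http://svn.example.com/repos/vendor/ssrs-rdl/trunk
```

With that in place, every `svn update` of the working copy also pulls the pinned assembly folder, so neither developers nor the build server need a separate deployment step.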