TFS testing is not picking up new code

I have a TFS 2017 instance running on windows server 2012 R2 with a test box running windows 10.
I am running into a very odd issue. Most of my automated CodedUI tests are running pretty well. However, I have noticed that code changes are not always picked up by TFS when it performs a new build, at least not in the testing code area.
The builds themselves work well, and new code always gets incorporated for those. However, when the latest build triggers a release containing CodedUI tests, those tests do not always grab the latest build.
I have noticed this primarily in my App.config file, which contains connection strings that are not being updated. In one case I had three tests that apparently ran successfully, but then ran again using the values from the old App.config file.
I also have found that changes to the [TestCategory()] attribute are not always picked up either. I use that category to specify which tests I want run in a particular release build. I use variations on the same word for my categories: CodedUI, CodedUIExtended, CodedUIStage. At first I thought the system was doing some sort of StartsWith and picking up the other names, but when I tell it to run CodedUI it is running both the CodedUI and CodedUIStage categories.
[TestCategory("CodedUI"), TestMethod]
public void UI_Login_AdminAuthenticate()
{
...
}
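For reference, the VSTest test case filter syntax distinguishes exact matches from substring matches, and I am not sure which my release's test task is effectively using; a contains-style filter would explain the overlap:
TestCategory=CodedUI   - exact match; runs only tests tagged CodedUI
TestCategory~CodedUI   - "contains" match; also picks up CodedUIExtended and CodedUIStage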
Because the CodedUIStage categories were recently renamed (they used to be CodedUI), I have been led in the direction of suspecting some sort of caching in TFS.
Can anyone shed some light on why my category and app.config changes are not being picked up correctly? What is causing this, and could it be happening to the code itself as well when I attempt to deploy a fix/correction?
EDIT:
As suggested, I tried checking the clean option on my TFS build configuration; however, it had no effect.
The release, which is triggered immediately upon build does show it is using the newly finished build number.
And looking at the artifacts, the test dll has the current date modified, so it looks like it was just created.
The test.dll.config seems to be the issue. In the artifacts it has a very old date modified and is not the current version that is checked into TFS. Typically this would look like a clean issue; however, TFS always deletes and re-copies all destination files when doing a build or release. I have verified that by watching the files being deleted and re-created on the file system during the process.
C:\agent\_work\r6\a\[artifact_name]\bin
EDIT2:
With a little more exploration, I found that the build artifacts are correct. It is when the Release copies those artifacts into the release process that the problem happens.
A week ago I renamed the folder in TFS containing my test project inside the solution. The old folder name is showing up in the artifacts the Release copies into itself. The new folder name is also showing up, which means I now have two DLLs, and that is causing problems.
I am not sure where the Release is finding this copy of the old folder. I explicitly deleted it from the Release's copy of the build artifacts and re-ran the build and release and it showed back up.

Thanks to the suggestions from Daniel, I eventually figured out that after creating the artifacts, my build process was then publishing those to a separate place on the file system.
Unfortunately, the Copy and Publish Build Artifacts task does not have a clean feature like the basic Copy Files task does.
As such, whenever files are removed from the build, stale copies still exist in that location, so when the Release process goes to grab what it thinks are the artifacts, it ends up grabbing extra files.
Manually deleting the old files from that secondary artifact destination location solved the issue.
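For anyone wanting to automate that, a command-line step placed before the Copy and Publish Build Artifacts task could wipe the secondary destination first. A minimal sketch; the share path here is made up and would need to be replaced with your own publish location:
if exist \\buildserver\drops\TestArtifacts rmdir /s /q \\buildserver\drops\TestArtifacts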

Related

How does TFS choose which check-ins to associate with a build?

Our builds generally have a mish-mash of work items and commits associated with them, and I cannot tell how TFS determines what to add. We are using TFS 2015 Update 3 and TFVC.
When a build runs, it gets code from a location somewhere in the branching and folder structure of TFVC, typically something like "root\dev\src\component name". In this way we avoid getting all of the code in our repository, and we have CI set up so that any changes in this folder will result in a CI build running.
We also run daily builds which run more tests and create a release package that is used by TFS Release Management. I would expect any changes to code inside the folder defined in the build's repository settings to be included in the build's associated changesets. I would also expect that any changes checked in outside of those branches would not be associated. But this is not the case: we see commits from across the entire project.
Does anyone know how this is supposed to work?
I am not sure if this should go in the question or the answer but I have found some additional information, thanks to the hints provided in the answers below.
It appears that the source settings will take the common root of the mapped folders in the repository settings, so if I have two folders, $/Relo/Dev/B1/src/Claims.Services and $/Relo/Dev/B1/src/PSScripts, it will take the common root $/Relo/Dev/B1/src as the source setting and include any changes from that folder down in the build. Can anyone confirm this? Of course that's not what I want to happen. In the History tab of the build definition, if I look at the diff I can see a field "defaultBranch" in the JSON which seems to be the value that controls this; is there any way to update this field directly?
TFS determines what changesets should be mapped to a build based on the Source Repository Mappings (Build vNext) in the build definition and the last successful build.
So you will see a list of the changesets with files committed under the lowest common base of any of the mapped folders, including all their descendants, since the latest successful build. Whenever you get a successful build (I hope that happens more often than failing ones ;-)), the list resets and only shows the check-ins made since that build.
Example mappings below will result in any changeset made to anything below $/Relo/Dev/B1/src (because it is the lowest common base):
$/Relo/Dev/B1/src/Claims.Services
$/Relo/Dev/B1/src/PSScripts
Similarly, it will pick up all the work items related to the above changesets.
This is what should happen. If you see something else, I would have a closer look at the Repository Mappings or Source Settings of the build definition.
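As a hedged aside: if the lowest common base pulls in folders you don't actually want, TFVC mappings also support cloaking a subtree so it is excluded from the workspace the build downloads. An illustrative mapping (the cloaked folder name is made up):
Map    $/Relo/Dev/B1/src
Cloak  $/Relo/Dev/B1/src/SomethingUnrelated
That should also keep check-ins under the cloaked path out of the association list, though I would verify that on your own server.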
@Noel - I guess you are using vNext builds and not XAML builds. Or are you using a mix of XAML and vNext?
In general a scheduled TFS build will associate all changes which were not associated in the last successful run of the same build.
I suggest you check once again whether the source folder locations are the same for the CI build and the daily build.

Jenkins Project Artifacts and Workspace

I've used Jenkins for quite a few years but have never set it up myself, which I did at my new job. There are a couple of questions and issues that I ran into.
Default workspace location - It seems like the latest Jenkins has the default workspace in Jenkins\jobs\[projectName]\workspace, and it is overwritten (or wiped, if selected) on every build. I thought that it should instead be in Jenkins\jobs\[projectName]\builds\[build_id]\ so that it would store the workspace state of every build for future reference?
Displaying workspace on the project > Build_ID page - This goes along with the previous point, as I expected each 'workspace' for previous builds to show here. Currently in my setup this individual page gives you nothing except the Git revision, the repo changes that triggered the build, and the console output. Where are the artifacts? Where is the link to this build's workspace that was used?
Archiving artifacts in builds - When choosing artifacts, the filter doesn't seem to work. My build creates a file structure with the artifacts in it inside the workspace. I want to store this, and the artifacts filter says it starts at the workspace. So I put in 'artifacts' and nothing gets stored (also, where would this get stored?). I have also tried '/artifacts' and 'artifacts/*'.
Any help would be great! Thanks!
It does seem like you are confused about several aspects of Jenkins. I think your question basically boils down to the following.
What is a difference between a workspace and a build?
So, here are some thoughts on this topic:
1. Builds are historical data. They (usually) don't change like a workspace does during building/checkout.
2. Builds contain information about a run (e.g. its status, build number, change log, etc.) and any artifacts that you tell it to archive (logs, test results, etc.). They (usually) don't contain source code like a workspace.
3. Builds are stored in the Jenkins\jobs\[projectName]\builds\[build_id]\ directory. This is a directory managed by Jenkins, and you (usually) do not need to modify anything in it. Workspaces, however, are directories meant for the build; you can do pretty much anything with them and place them anywhere (a workspace does not need to be in the default Jenkins\jobs\[projectName]\workspace directory).
4. Workspaces should be able to be wiped at any given time. To restore one, just rebuild the job with the same parameters/revision. If you need to keep something after a build, tell Jenkins to archive it before the build is done.
5. In regard to saving the entire state, I don't think you need to do that. As mentioned in #4, you should be able to reproduce the same build by kicking off the same revision/parameters as the build in question. If you cannot get back to the original state from the same revision/parameters, then that is something to strive for, as debugging will otherwise be a nightmare. :)
6. A workspace is an aspect of the project, not of a build, and that is why there is no link to the workspace from a build's page. Again, a build is just saved data from a previous run. A project uses the workspace to build stuff, and that is why you can get to the workspace from the project page.
In regard to how to save artifacts, you must specify the names of the files you want to save. Unless you are trying to save a file literally called "artifacts", you should use a pattern instead. How about **/*.log for all log files? Or **/*.xml for all XML files?
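For your specific layout, assuming the folder really is named artifacts at the workspace root, the archive filter takes Ant-style globs, so:
artifacts/**
should pick up everything under it recursively. Archived files end up under Jenkins\jobs\[projectName]\builds\[build_id]\archive\, which also answers where they get stored.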
Hope this helps.

Team foundation server's automated build is not getting the latest code

I have set up a build controller etc. and the builds were failing. I have fixed those problems now, and the build then failed properly - as in, because of an error.
I have fixed the error and checked the code back in, but now the code is not being extracted, although sometimes one folder of many is.
I have deleted the code from the build machine and requeued a build but it keeps failing. It complains that it cannot find the solution that I specified as the build solution.
I have checked the check box to build even if nothing has changed.
Have I missed a setting somewhere for extracting the code?
TFS version is 2012 Express
Visual Studio version is 2010 Professional
I had this issue recently with TFS 2012. I think it boils down to this:
In the latest build definition files, it appears that a Clean task is performed before the workspace is updated. This means that if you do something that causes the Clean part of the build to fail, it will never download the new files that would fix it.
Recently, I was making big changes to my build file and inevitably made a lot of mistakes. I found that if one of those mistakes caused the Clean to break, I had to go onto the build server and change the file manually to get it working again.
Does this sound like it might be the same issue?
There are several properties in your build definition you can check. I would start by setting "Clean Workspace" to All to ensure the correct code is being pulled down and built.
There are other things you can look at as well, like the agent set for the build and the "GetVersion" property. Check out the link below; it should help you in more detail.
Define a Build Process that is Based on the Default Template

TFS 2012 Build "Access to Path Denied"

I’m using TFS 2012 Build and running into an error
Access to the path is denied
The solution being built contains about 15 projects, of which a number use the Castle.Components.Validator.2.5.0 assembly. I have seen other posts about TFS Build access-denied errors, but they generally refer to simultaneous builds running, and in this case only one build runs at a time. Also, the error occurs when the server is restarted or the build has not run for some time. Once a build runs and fails, the next one succeeds, and each one after that succeeds again until the build hasn't run for a while or the server is restarted. Although we can get around this, it is a manual headache. Here is the error:
C:\WINDOWS\Microsoft.NET\Framework64\v4.0.30319\Microsoft.Common.targets (3513): Unable to copy file "D:\Builds\12\Foo\Check-In Build\Sources\packages\Castle.Components.Validator.2.5.0\lib\NET40\Castle.Components.Validator.dll" to "D:\Builds\12\Foo\Check-In Build\Binaries\Castle.Components.Validator.dll". Access to the path 'D:\Builds\12\Foo\Check-In Build\Binaries\Castle.Components.Validator.dll' is denied.
When looking at the log file you can see that the build is trying to copy the file twice. Because the first one has a lock on the file, the second one fails and thus the build fails. Here is a snippet of the log file that shows what is happening:
2>_CopyFilesMarkedCopyLocal:
Copying file from "D:\Builds\12\Foo\Check-In Build\Sources\packages\Castle.Components.Validator.2.5.0\lib\NET40\Castle.Components.Validator.dll" to "D:\Builds\12\Foo\Check-In Build\Binaries\Castle.Components.Validator.dll".
5>_CopyFilesMarkedCopyLocal:
Copying file from "D:\Builds\12\Foo\Check-In Build\Sources\packages\Castle.Components.Validator.2.5.0\lib\NET40\Castle.Components.Validator.dll" to "D:\Builds\12\Foo\Check-In Build\Binaries\Castle.Components.Validator.dll".
2>_CopyFilesMarkedCopyLocal:
Copying file from "D:\Builds\12\Foo\Check-In Build\Sources\packages\MvcContrib.Mvc3.FluentHtml-ci.3.0.96.0\lib\MvcContrib.FluentHtml.dll" to "D:\Builds\12\Foo\Check-In Build\Binaries\MvcContrib.FluentHtml.dll".
Copying file from "D:\Builds\12\Foo\Check-In Build\Sources\packages\RhinoMocks.3.6\lib\Rhino.Mocks.dll" to "D:\Builds\12\Foo\Check-In Build\Binaries\Rhino.Mocks.dll".
Any help on how to fix this would be greatly appreciated.
As others mentioned, this happens when performing multithreaded builds with a common destination directory, where a file copy task happens to conflict with a copy task running simultaneously for a different project.
Normally this should result in a "file used by another process" exception (which is handled and retried by the file copy task), but sometimes the file operation results in an "Access is denied" exception instead. (I'm still not sure why.)
Some suggest that you should "solve the duplication", but I don't see that as being feasible for cases where all the projects need to directly reference a library like log4net.
Obviously one way to prevent the issue is to explicitly run msbuild with /p:BuildInParallel=false or /m:1 or /maxcpucount:1 (or omit the argument entirely) to force single-threaded mode.
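For example, a single-threaded invocation would look like this (solution name illustrative):
msbuild MySolution.sln /m:1 /p:BuildInParallel=false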
However, in TFS 2013, the default build template automatically always passes /m (use all cores) to msbuild, which silently overrides any single-thread setting you can manually pass in. (Determined by my own experimentation and examining diagnostic logs)
Another workaround I attempted was to manually pass /p:AllowedReferenceRelatedFileExtensions=none to msbuild, which prevents all pdb and xml files from being copied from referenced libraries. (Since for a while I only ever saw xml files having this issue.) But then I kept having problems with log4net.dll.
The ultimate workaround that I used was one I discovered by decompiling the source code for Microsoft.Build.Tasks.Copy:
if (hrForException == -2147024891)
{
    if (!Copy.alwaysRetryCopy)
        throw;
    else
        this.LogDiagnostic("Retrying on ERROR_ACCESS_DENIED because MSBUILDALWAYSRETRY = 1", new object[0]);
}
If error -2147024891 (0x80070005 access is denied) occurs, the Copy task will check a special variable to see if it should retry. That value is set via an environment variable:
Copy.alwaysRetryCopy = Environment.GetEnvironmentVariable("MSBUILDALWAYSRETRY") != null;
After setting the environment variable MSBUILDALWAYSRETRY = 1 (and restarting the build server), the problem went away. I also periodically started seeing "Retrying on ERROR_ACCESS_DENIED..." warnings in the build logs, proving that the setting was taking effect (instead of the builds merely coincidentally succeeding).
(Note that this environment variable is not well documented, use as appropriate.)
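For reference, a machine-wide environment variable can be set from an elevated command prompt; the build service then needs a restart to pick it up:
setx MSBUILDALWAYSRETRY 1 /M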
Update: Apparently TFS 2015 no longer overrides your /m:1 with /m (even on legacy/XAML build definitions), which should make /m:1 a valid fix again.
It looks like there are two projects copying the same file. Depending on the timing, the two copies sometimes happen at the same time, resulting in the failure. You have to trace the node id back to find the source project. See http://blogs.msdn.com/b/buckh/archive/2012/01/21/a-tool-to-find-duplicate-copies-in-a-build.aspx for more details and for code that may track it down for you.
As Buck Hodges and Nimblejoe have rightly said, this is mostly due to TFS running multiple MSBuild processes by default to build your projects.
You can override it in the build definition in Process -> 3. Advanced -> MSBuild Arguments by adding the MSBuild argument /p:BuildInParallel=false
This can also happen if you have a build agent's folder open.
I also had the same problem. I got error messages saying files could not be copied because access to the path was denied. In my case all my DLLs, XML files, and so on were placed in the
D:\TFS\Example\Bin\Debug folder.
I right-clicked the Bin folder, clicked Properties, and saw that the Read-only check box was checked under Attributes.
I unchecked the Read-only box, clicked Apply, and clicked OK on the popup that appeared.
I went back to Visual Studio and built the solution that had been giving me the error messages.
Voilà, this time it built successfully without errors.
I do not know whether this is the perfect fix, but it is what solved my issue.
To work around this problem I had to remove the "ReadOnly" flag on the source directory. Then, in the build definition, I set Clean Workspace to None.
Like Ziggler, I solved this problem by removing the 'read only' property of the bin folder in my project. It is only happening to XML files stored in a /packages/ directory that is common to the solution containing this project. The 'bin' folder is not checked into source control. I am still stumped as to the root cause of the problem.
I found the same problem, which occurred after the build tried to overwrite files in the "Working Directory" it had created in a previous attempt to build (this directory is set in the agent).
I resolved this by manually deleting the output folder it created (in my case [Working Directory]\Binaries) before attempting the build.
This can be done automatically by changing the build definition: under Process -> 2. Basic -> Clean Workspace, set it to the Outputs option.
Here's a variation of this problem which I had to deal with:
I couldn't figure out why my build kept failing with an "Access to the path is denied" error, even though I had added things like /p:BuildInParallel=false and /p:OverwriteReadOnlyFiles=true to the MSBuild arguments of my XAML build. The cause turned out to be a "Post-build event command line" in my project's properties.
After changing
%WinDir%\Microsoft.NET\Framework\v4.0.30319\MSBuild.exe[SNIP]
/P:Configuration=$(ConfigurationName);[SNIP]
;AutoParameterizationWebConfigConnectionStrings=false
to
%WinDir%\Microsoft.NET\Framework\v4.0.30319\MSBuild.exe[SNIP]
/P:Configuration=$(ConfigurationName);[SNIP]
;AutoParameterizationWebConfigConnectionStrings=false;OverwriteReadOnlyFiles=true
the error went away.
One possible cause is if you have the bin or obj folders for class libraries checked-in into TFS. Deleting the bin or obj folders of the projects from TFS will resolve this issue if that is the case.
I was having this problem and chose to ignore it, because I didn't want to sacrifice build performance for the sake of getting rid of some benign error messages from NuGet. However, I seem to have stumbled across a solution while trying to solve another problem, and I think the two are related. The order in which NuGet packages are fetched appears to be tied to the build order of the projects in the solution. So if that order has somehow become disjointed, NuGet may be the first casualty, before you run into build errors where you start getting "Metadata file 'XXX.dll' could not be found" errors, which annoyingly require you to build again until the build succeeds (as described here).
So, I believe the solution is to follow the steps described in the accepted answer to the aforementioned question. Or, follow the more comprehensive steps in one of the alternative answers. In other words, disable building of all projects, restart VS, then re-enable building of all projects. This will (normally) resolve build order. And that should hopefully resolve the NuGet issue. Please let me know if this fixes it for anyone.
I had this issue, with TFS 2015.
It turned out to be because the build Agent was running under the default (NETWORK SERVICE) credentials, which didn't have write permissions on the target folder.
Once I'd removed the Agent and reinstalled it with credentials it worked.
It did have me trawling through the logs for a while, checking and unchecking the multi-proc box and even restarting the build server in my hunt.
Check the obvious stuff first...
For me, it was that the build agent wasn't started in an administrator powershell.
MSBuild arguments: /tv:14.0 /t:Rebuild /m:1 /p:RunCodeAnalysis=false /p:TreatWarningsAsErrors=false /p:OverwriteReadOnlyFiles=true /p:BuildInParallel=false /p:AllowedReferenceRelatedFileExtensions=none
Set Clean workspace to false.
Go to the build agent and remove read-only from the mapped folder.
As a lot of people have already stated, this happens when building projects in parallel. Projects A and B both referencing third-party library C (Copy Local) will cause this when they are built at the same time, side by side.
The real problem is that TFS Build 2012 and below are configured so that, when building a solution, the whole output of the solution is copied to a single folder. That is where the pains of parallel builds originate.
Since TFS 2013 you can easily solve this by setting the "Output location" in the build definition to "PerProject". This forces the build service to behave like a local MSBuild run, where the settings regarding output locations are read from the corresponding project files, so the output is written to the bin folders under each project.
For TFS 2012 and below, this article (plus the linked articles) will help you get the same result as with TFS 2013:
http://blog.stangroome.com/2012/05/10/override-the-tfs-team-build-outdir-property-net-4-5/
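A minimal sketch of the idea from that article, assuming your projects use .NET 4.5-era build targets that support project-specific output folders: adding
/p:GenerateProjectSpecificOutputFolder=true
to the build definition's MSBuild arguments gives each project its own subfolder under the drop, much like the TFS 2013 PerProject option.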
I resolved a very similar issue by closing all open instances of Visual Studio, re-opening the solution and building it again.

TFS MSBuild: $(ProjectDir) blank or random

I have a vcproj file that includes a simple pre-build event along the lines of:
Helpertask.exe $(ProjectDir)
This works fine on developer PCs, but when the solution is built on our TFS 2008 build server under MSBuild, $(ProjectDir) is either blank or points to an unrelated folder on the server!
So far the best workaround I have managed is to hard code the developer and server paths instead:
if exist C:\DeveloperCode\MyProject HelperTask.exe C:\DeveloperCode\MyProject
if exist D:\BuildServerCode\MyProject HelperTask.exe D:\BuildServerCode\MyProject
This hack works in post-build steps but it doesn't work for a pre-build step (the Pre-build task now does nothing at all under MSBuild!)
Do you have any ideas for a fix or workaround? I have very little hair left!
$(MSBuildProjectDirectory) worked for me
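That is, the pre-build event from the question becomes:
Helpertask.exe $(MSBuildProjectDirectory)
$(MSBuildProjectDirectory) is a reserved MSBuild property holding the directory of the project file, which is why it behaves consistently under both Visual Studio and a bare MSBuild run.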
I think your problem may be related to how items are initialized. An item's Include attribute is evaluated at the beginning of a build. So if you depend on files that are created during the build process, you must declare these as dynamic items. Dynamic items are those defined inside of a target, or by using the CreateItem task. I've detailed this on my blog: MSBuild: Item and Property Evaluation.
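A minimal sketch of a dynamic item, with illustrative names; newer MSBuild versions also allow an ItemGroup directly inside a Target, which does the same job as CreateItem:
<Target Name="AfterBuild">
  <!-- Evaluated when the target runs, not at the beginning of the build -->
  <ItemGroup>
    <GeneratedFiles Include="$(OutDir)**\*.xml" />
  </ItemGroup>
  <Message Text="Generated: @(GeneratedFiles)" />
</Target>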
Sayed Ibrahim Hashimi
My Book: Inside the Microsoft Build Engine : Using MSBuild and Team Foundation Build
I think the problem is that the build server's workspace probably isn't initialized properly.
I just kept getting problems with this - I tried many different approaches but they all failed in mysterious ways.
Once $(ProjectDir) started behaving properly again, the pre-build step stopped executing the command (I added echo commands above and below it; both were executed, but the program in between them was not, and no errors or output of any kind were generated to indicate why it failed).
I don't know if this is a dodgy server or if MSBuild is having a laugh.
I've given up now. I gave the build server a big kick and have changed tack: we now run this tool offline (manually) and check in the results for the build server to use. So much for an automated build :-( If only MSBuild would run solutions the same way Visual Studio does - it's maddening that it sets up the environment completely differently (different paths coming out of the solution variables, outputs redirected into different folders so you can't find them where they're supposed to be, etc.).
I branched an existing project, and $(ProjectDir) kept the old directory in the newly branched code. But that was because I had some compile errors. Once every project in the solution compiled without errors, $(ProjectDir) changed to the correct path.
Carlos A Merighe
