Resolving dependencies in TFS 2015

I have two separate solutions in TFS 2015. We'll call them Solution1 and Solution2. The build for Solution1 creates an assembly which is required by Solution2. I'm not sure of the best way to handle this dependency in TFS.
Possible scenarios could include:
Each time Solution1 builds successfully, it copies the new assembly to Solution2, which in turn triggers a build of Solution2 (is this possible in TFS? And if so, how?)
Each build of Solution2 pulls the latest version of the assembly from Solution1.
How have other people handled dependencies between TFS projects?

You should package the output of Solution1 as a NuGet package and publish it to a NuGet repository. You can use a network share, MyGet, VSTS, or TFS 2017 as a package repository.
Your second solution can then take a dependency on that NuGet package, and you choose when to update.
If you want to update the packages automatically, you can call something prior to the Solution2 build, like the pre-build step mentioned in the comments.
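As a rough sketch of that automatic update, a pre-build MSBuild target in a Solution2 project could shell out to nuget.exe before compilation. The target name, $(NuGetExePath), the package ID, and the feed path below are placeholders, not anything defined by the actual solutions:

<Target Name="UpdateSolution1Package" BeforeTargets="Build">
  <!-- Pull the newest Solution1 package from the internal feed before Solution2 compiles -->
  <Exec Command="&quot;$(NuGetExePath)&quot; update &quot;$(SolutionPath)&quot; -Id Solution1.Output -Source \\fileserver\nuget-packages" />
</Target>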

As explained in the other answer, you can manage it with NuGet deployment. That is the really clear and clean way.
Another way might be to use the same output folder for both solutions and to always build Solution1 first.
The third way can be to always deploy Solution1 to a specific location which can be referenced by Solution2, as sketched below. It is logically similar to the NuGet version, except that you do not rely on NuGet (but this dependency in "normal" cases is IMHO acceptable).
Your actual choice can depend on the environment and on your constraints.
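A minimal sketch of that file reference from a Solution2 project, assuming Solution1 drops its output to a known share (the path and assembly name are made up):

<ItemGroup>
  <!-- Direct file reference to the assembly Solution1 deploys to the agreed drop location -->
  <Reference Include="Solution1.Core">
    <HintPath>\\buildserver\drops\Solution1\latest\Solution1.Core.dll</HintPath>
  </Reference>
</ItemGroup>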

Related

Promoting NuGet Packages to release versions

We have several assemblies that we share across all our projects. Since last year we have used NuSpec files to create packages and share them all in an internal feed. The packaging is done as part of the build process (TFS 2015). Versioning is set to automatic, using date and time. The build is a CI build and is triggered when merging from the Development branch to the CI branch.
When one wants to use the packages, one has to enable "Include prerelease" in the NuGet Package Manager to get these packages. This is fine for the period while a package is not yet completely tested but is ready to release.
Question
What I am looking for now is a straightforward way to promote such packages, once they have been created and tested, to a release version, keeping the original Major.Minor.Revision but removing the date portion of the prerelease version, and to share that new version in a feed, ideally the same one.
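For illustration, the promotion would essentially mean the version element in the NuSpec moves from a date-stamped prerelease to a plain release number; the exact suffix format below is an assumption about the scheme:

<!-- CI build output: automatic prerelease version with a date/time suffix (format is illustrative) -->
<version>2.3.1-ci201606141030</version>
<!-- Promoted release version: same Major.Minor.Revision, date portion removed -->
<version>2.3.1</version>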

Git filter in on-premises TFS 2015 Update 3

If the repository behind our TFS project is git, there is no way to filter the repository by source folder. The build always pulls the whole repository. We have multiple solutions in the TFS project that we want to build separately. We can do it, but it is slow because we cannot filter which source folders to download.
The other problem is that we cannot add a folder path to the CI trigger, so all the projects will be built after a push to any project.
I know that Team Services already supports path filters for git repositories. But does anybody know a workaround for this problem on on-premises TFS 2015 Update 3?
There is no workaround for this on TFS 2015 Update 3, unless you separate your components into different repositories. Each project must be in its own repository. Every dependency is a project by itself and can be handled as a NuGet package. Then your whole solution will not break if you change something in a dependency project while using a CI trigger.
This feature will ship in TFS 15 and is already available on VSTS
https://www.visualstudio.com/en-us/docs/build/news/2016#june-14

TFS 2015 build not performing NuGet package restore

I have a bit of a strange situation: two applications in the same TFS repository, both using near-identical build definitions, both using NuGet packages; one performs a package restore, one does not.
Both build definitions have the 'Restore NuGet Packages' option checked, and both have the same .sln and .vbproj file structures.
The build log of the 'good' one shows the NuGet restore, while the 'bad' one's doesn't.
The build controller/agent is the same, and I can't see any difference in the build definition or the solution configuration.
My question is: where do I start looking for why these are doing something different?
Turns out the packages.config files for each of the projects were not checked in to TFS; without these there was nothing to restore. Stupid error!
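For anyone hitting the same thing: the per-project packages.config is what NuGet restore reads, so it has to be checked in next to the project file. A minimal example (the package ID and version are just illustrative) looks like this:

<?xml version="1.0" encoding="utf-8"?>
<!-- packages.config must be checked in alongside the project file; without it there is nothing to restore -->
<packages>
  <package id="Newtonsoft.Json" version="9.0.1" targetFramework="net45" />
</packages>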

How does TFS know about NuGet?

From this article:
"However, there are cases where it’s not actually a person who’s doing the building and who therefore can’t provide consent this way. (And where Visual Studio isn’t even installed.) The prototypical example is a build server. In that case, NuGet will also look for an environment variable called EnableNuGetPackageRestore. To enable package restore for scenarios where the Visual Studio option is not practical, set this variable to true."
How does TFS even know how to call nuget? Do I install the nuget exe?
As you've stated in a comment, "Enable NuGet Package Restore" is no longer the recommended method for accomplishing this. The NuGet docs explain it best: because that method is integrated into MSBuild, packages that extend the build will be downloaded too late. While I'm not sure which packages do this currently, it appears that there are big plans for the future of NuGet (the new ASP.NET vNext custom CLR implementation is one example of where it's headed).
The good news is that the alternative is really easy to set up. If you're in VS, you literally do nothing (unless you've disabled automatic package restore in the past). If you have a build server, you just have to make some small changes to your .proj file. If you're not familiar with MSBuild it may seem challenging, but it's really quite simple. The secret is this chunk of code:
<Target Name="RestorePackages">
  <Exec Command="&quot;$(ToolsHome)NuGet\NuGet.exe&quot; restore &quot;%(Solution.Identity)&quot;" />
</Target>
Again, the details are in the docs (this one specifically). If you have any questions don't be afraid to reach out: the SO community is here to help!
How does TFS even know how to call nuget? Do I install the nuget exe?
No, you should enable package restore for the solution, which can be done from the context menu for the solution file in Solution Explorer. This will add a folder named .nuget to your solution, and in this folder you will find NuGet.exe.
If you add the solution to source control and build it on a build server, the NuGet.exe executable will be used by the NuGet MSBuild targets to restore packages from NuGet as part of the build.
So you do not need to install NuGet on the build server; it becomes part of your project and is placed under version control.
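Concretely, enabling package restore this way wires something like the following into each project file, so MSBuild calls the checked-in NuGet.exe during the build (the paths shown are the usual defaults, not taken from your solution):

<PropertyGroup>
  <!-- Set by the 'Enable NuGet Package Restore' context-menu action -->
  <RestorePackages>true</RestorePackages>
</PropertyGroup>
<!-- Imports the targets that invoke .nuget\NuGet.exe to restore packages before compilation -->
<Import Project="$(SolutionDir)\.nuget\NuGet.targets" Condition="Exists('$(SolutionDir)\.nuget\NuGet.targets')" />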
If you set your solutions to restore NuGet packages, then on the build server, log on with the build account, open Visual Studio, and set "Allow NuGet to download missing packages during build". Your builds should then restore packages OK.

How to use NuGet packages on build server/production server without internet?

Background
I have the following components:
My local solution (.NET 4.5) which makes use of NuGet packages.
A PowerShell build script in my solution that has targets to build, run unit tests, do Web.config transforms, etc.
A build server without an internet connection running CruiseControl.NET that calls my build script to build the files. It also serves as the (IIS7) environment for the dev build.
A production server with IIS7 that does not have internet access.
Goal
I would like to utilize NuGet packages from my solution and have them be stored locally as part of source control, without having to rely on an internet connection or a NuGet package server on my build and production servers.
Question
How can I tell MSBuild to properly deploy these packages, or is this the default behavior of NuGet?
Scott Hanselman has written an excellent article entitled How to access NuGet when NuGet.org is down (or you're on a plane). If you read through this article, you'll see at the end that the suggestions he makes are primarily temporary-type solutions and he goes out of his way to say that you should never need the offline cache except in those emergency situations.
If you read at the bottom of his article, however, he makes this suggestion:
If you're concerned about external dependencies on a company-wide scale, you might want to have a network share (perhaps on a shared builder server) within your organization that contains the NuGet packages that you rely on. This is a useful thing if you are in a low-bandwidth situation as an organization.
This is what I ended up doing in a similar situation. We have a share which we keep with the latest versions of various packages that we rely on (of course, I'm assuming you're on some type of network). It works great and requires just a little work to update the packages on a semi-regular basis (we have a quarterly update cycle).
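In practice, pointing a solution at such a share usually comes down to a nuget.config next to the solution; the share path and source name below are placeholders:

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- Use only the internal network share as the package source; the path is illustrative -->
    <clear />
    <add key="InternalPackages" value="\\fileserver\nuget-packages" />
  </packageSources>
</configuration>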
Another article that may also be of help to you (was to me) is: Using NuGet to Distribute Our Company Internal DLLs
By default, NuGet puts all your dependencies in a packages/ folder. You can simply add this folder to your source control system, and NuGet will not need to download anything from the internet when you do your builds. You'll also want to ensure that NuGet Package Restore isn't configured on your solution.
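If you want to be explicit about turning restore off, a nuget.config can disable both restore mechanisms; whether you actually need this depends on your setup:

<configuration>
  <packageRestore>
    <!-- Disable the MSBuild-integrated restore and Visual Studio's automatic restore -->
    <add key="enabled" value="False" />
    <add key="automatic" value="False" />
  </packageRestore>
</configuration>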
You'll have to make a decision; either you download/install the packages at build time (whether it be using package restore, your own script, or a build tool that does this for you), or you put the /packages assemblies in source control as if they were in a /lib directory.
We've had so many problems with using package restore and NuGet's Visual Studio extension internally that we almost scrapped NuGet completely because of its flaws, despite the fact that 1 of our company's 2 products is a private NuGet repository.
Basically, the way we manage the lifecycle is by using a combination of our products BuildMaster and ProGet such that:
ProGet caches all of our NuGet packages (both packages published by ourselves and ones from nuget.org)
BuildMaster performs both the CI and deployment aspect and handles all the NuGet package restoration so we never have to deal with massive checked-in libraries or the solution-munging nightmare that is package restore
If you were to adopt a similar procedure, it may be easiest to create a build artifact in your first environment which includes the installed NuGet package assemblies, then simply deploy that artifact to your production without having to repeat the process.
Hope this helps,
-Tod
I know this is an old discussion, but how in the world is it bad to store all the files required to build a project, just because of the size?
The idea that if a library is not available you should simply replace it is crazy. Code costs money, and since you don't control the libraries on git or on NuGet, a copy should be available.
One requirement that many companies have is an audit. What if a library was found to be stealing your data? How would you know for sure, if the library has been removed from NuGet and you can't even build the code to double-check?
The one-size-fits-all NuGet and git ways of the web are not OK.
I think the way NuGet worked in the past, where the files were stored locally and optionally placed in source control, is the way to go.
