NuGet Build Triggers - TFS

We are considering a move to Azure DevOps/TFS, and we have built a prototype workflow which seems to work well.
The only thing from our current CI process left to replicate is triggering builds based on NuGet package updates.
Our build pipeline is a tree: libraries at the top of the tree produce NuGet packages that are consumed as dependencies by libraries further downstream.
In TeamCity, one of our build steps inspects the dependencies of a solution, identifies the top-level dependencies, and adds them as NuGet build triggers, ensuring that the next successful build of a dependency also triggers the downstream library.
How can that be replicated in Azure DevOps?

I think you might be after something like NuKeeper:

NuKeeper automates the routine task of discovering and applying NuGet package updates.
NuKeeper will compare the NuGet packages used in your solution to the latest versions available on NuGet.org, and:

- List available NuGet package updates on .NET code on the local file system or on a GitHub server.
- Apply NuGet package updates to .NET code on the local file system.
- Make pull requests containing updates to code on a GitHub server.
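If you want to try it, NuKeeper ships as a .NET global tool. A minimal sketch, assuming the .NET SDK is on the path (the repository URL and token are placeholders):

```powershell
# Install NuKeeper as a .NET global tool
dotnet tool install --global NuKeeper

# List the available NuGet package updates for the solution in the current folder
nukeeper inspect

# Apply the updates to the local code
nukeeper update

# Or raise pull requests against a remote repository (URL and token are placeholders)
nukeeper repo https://github.com/yourorg/yourrepo YOUR_TOKEN
```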

Related

Promoting NuGet Packages to release versions

We have several assemblies that we share across all our projects. Since last year we have used NuSpec files to create packages and share them all in an internal feed. The packaging is done as part of the build process (TFS 2015). Versioning is set to automatic, using date and time. The build is a CI build, triggered when merging from the Development branch to the CI branch.
When one wants to use the packages, one has to enable "Include prerelease" in the NuGet Package Manager to get them. This is fine for as long as a package is not yet completely tested and ready to release.
Question
What I am looking for now is a straightforward way to promote such packages, once they have been created and tested, to a release version: keeping the original Major.Minor.Revision but removing the date portion of the prerelease version, and sharing that new version in - ideally - the same feed.
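For what it's worth, a .nupkg is just a zip archive, so one low-tech way to promote a package is to unpack it, strip the prerelease suffix from the version in the .nuspec, repack, and push the result to the same feed. The sketch below is purely illustrative; all names and paths are hypothetical, and it assumes the prerelease suffix is the date/time digits:

```powershell
# Hypothetical example: promote MyLib 1.2.3-20160321110506 to 1.2.3
$pkg = "MyLib.1.2.3-20160321110506.nupkg"
Copy-Item $pkg promote.zip                       # Expand-Archive insists on a .zip extension
Expand-Archive promote.zip -DestinationPath .\promote

# Drop the zip/package metadata so it is not repacked
Remove-Item -Recurse .\promote\_rels, .\promote\package
Remove-Item -LiteralPath '.\promote\[Content_Types].xml'

# Strip the date/time prerelease suffix from the version element
$nuspec = (Get-ChildItem .\promote\*.nuspec).FullName
(Get-Content $nuspec) -replace '-\d+</version>', '</version>' | Set-Content $nuspec

# Repack and push the release version to the same internal feed
nuget pack $nuspec -BasePath .\promote -OutputDirectory .\release
nuget push .\release\MyLib.1.2.3.nupkg -Source http://your-feed/nuget -ApiKey YOUR_KEY
```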

What is the difference between octo.exe's create-release and octopack as an argument to msbuild

I am having trouble understanding the fundamentals of octopus deployment. I am using octo.exe with the create-release and deploy-release commands. I am also using the octopack plugin.
I am getting an error but that's not really the point - I want to understand how these peices fit together. I have searched and searched on this topic but every article seems to assume the reader has a ton of background info on octopus and automated deployment already, which I do not.
My question is: what is the difference between using octopack by passing the octopack argument to msbuild and simply creating a release using octo.exe? Do I need to do both, or will one or the other suffice? If both are needed, what do each of them do exactly?
Release and deployment as defined in the Octopus Deploy Documentation:
Release and deployment as defined in the Octopus Deploy documentation:
...a project is like a recipe that describes the steps (instructions) and variables (ingredients) required to deploy your apps and services. A release captures all the project and package details so it can be deployed over and over in a safe and repeatable way. A deployment is the execution of the steps to deploy a release to an environment.
OctoPack is
...the easiest way to package .NET applications from your continuous integration/automated build process.
It is easy to use, but as Alex already mentioned, you could also use nuget.exe to create the package.
Octo.exe
is a command-line tool that builds on top of the Octopus Deploy REST API. It allows you to do many of the things you'd normally do through the Octopus Deploy web interface.
So, OctoPack and octo.exe serve different purposes. You can't create a release with OctoPack, and octo.exe is not for creating packages.
OctoPack is there to create the NuGet package for the project. It has some additional properties to help with pushing a package onto the NuGet feed, etc.
octo.exe is used to automate the creation of releases on the Octopus server, and optionally to deploy them.
Note: a release in Octopus is basically a set of instructions on how to make the deployment. It includes the snapshot of variables and steps, references to the versions of the NuGet packages, etc.
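To make the split concrete, here is a hedged sketch of how the two typically show up in a build; the server URL, API key, version, and project names are all placeholders:

```powershell
# 1) OctoPack: create (and optionally push) the NuGet package during the MSBuild run
msbuild MySolution.sln /t:Build /p:Configuration=Release `
    /p:RunOctoPack=true `
    /p:OctoPackPackageVersion=1.2.3 `
    "/p:OctoPackPublishPackageToHttp=http://octopus.example/nuget/packages" `
    /p:OctoPackPublishApiKey=API-XXXXXXXX

# 2) octo.exe: ask the Octopus server to create a release (and optionally deploy it)
octo create-release --project "MyProject" --version 1.2.3 `
    --server http://octopus.example --apikey API-XXXXXXXX `
    --deployto Staging
```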
OctoPack is a good starting point; however, I stopped using it some time ago for a few reasons:

- No support for .NET 2.0 projects (and I needed to move all legacy apps into Octopus)
- I didn't like it modifying the project files (personal preference)

Plain nuget.exe was not much more work for me.
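For reference, the plain nuget.exe route is roughly this (paths, version, and the feed URL are placeholders):

```powershell
# Pack the project; a .nuspec sitting next to the .csproj is picked up automatically
nuget pack MyApp.csproj -Properties Configuration=Release -OutputDirectory .\artifacts

# Push the package to the feed that Octopus reads from
nuget push .\artifacts\MyApp.1.2.3.nupkg -Source http://octopus.example/nuget/packages -ApiKey API-XXXXXXXX
```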

F# NuGet packages in Azure Functions

Using .csx scripts in Azure Functions, I can use the project.json file to install NuGet packages, but when I'm using .fsx scripts the packages aren't installed (the log console never shows the "Starting NuGet restore" message). The only way I have found is to install the dependencies locally and upload them. Am I missing something?
I think the current execution model for F# in Azure Functions does not support project.json. There is a work-in-progress PR to improve F# support that will enable this.
For now, I think there are two options:

- Install the packages locally and upload them to Azure (as you are doing).
- If you're deploying via git, the deployment lets you run a deployment script (in the same way Azure WebSites let you run one).
I have not tested the second approach with Azure Functions, but I think it could work. For example, see the F# Snippets deployment script, which calls a build script that starts by using Paket to restore dependencies. This way, you need just paket.bootstrapper.exe and paket.dependencies with paket.lock to specify your NuGet dependencies.
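As a sketch of that approach, the restore part of such a deployment script boils down to two calls (this assumes the conventional .paket folder layout, with paket.dependencies and paket.lock at the repository root):

```powershell
# Download the pinned paket.exe; only paket.bootstrapper.exe is kept in source control
.\.paket\paket.bootstrapper.exe

# Restore the exact package versions recorded in paket.lock
.\.paket\paket.exe restore
```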

Commit file back to repository from build server in Visual Studio Team Services

I'm currently setting up continuous integration using TFS/Visual Studio Team Services (formerly VS Online), and I'm using the Team Foundation Build 2015 tasks - so not the XAML builds.
I'm using it to build a Xamarin Android project, but that's pretty irrelevant, I guess.
The process should be like this, after a check-in:

1. TFS downloads the sources.
2. TFS increments the version number within AndroidManifest.xml (I've managed to do this with a PowerShell script).
3. The modified AndroidManifest.xml is committed back into the TFS repository.
4. The rest follows: build, deploy to HockeyApp, etc.

The first steps are all configured, but I'm struggling with the commit part. How do I get TFS to commit the file? I don't really see any suitable task for it. I've tried the Copy and Publish Build Artifacts utility, but that did not seem to work, and I'm not even sure it's the right utility.
I'm using the default hosted build agent btw. Any help would be appreciated
Warning
I do want to point out that checking in changes as part of the build can break some features of VSTS/TFS. Association of work items to the check-in, sources and symbol generation, traceability from changes through build to release, integration with Test Manager, and remote debugging will likely not yield the expected results, because the changeset/commit recorded in the build may not match the actual sources. This may lead to unexpected, funny behavior.
Also, if any new changes have been committed/checked in after the build started, the version number may be updated in source control for code that was not actually released under that version.
So: First of all, it's considered a bad practice to change the sources from the build process.
Alternatives
There are better ways of doing it. One is to use the build version (the Build_BuildNumber or Build_BuildId variables). Alternatively, you can use a task like GitVersion to generate a semantic version based on the branch and tags in your git repository. That way your build will generate the correct version number and will increment the revision in case the same sources are built multiple times.
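To illustrate the first alternative, a build step could stamp the manifest from the build variables instead of committing anything back; the manifest path here is an assumption, and error handling is omitted:

```powershell
# Stamp AndroidManifest.xml with the build number provided by the agent
$manifest = "$Env:BUILD_SOURCESDIRECTORY\MyApp\Properties\AndroidManifest.xml"
$ns = "http://schemas.android.com/apk/res/android"

$xml = [xml](Get-Content $manifest)
$xml.manifest.SetAttribute("versionName", $ns, $Env:BUILD_BUILDNUMBER) | Out-Null
$xml.Save($manifest)
```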
I understand, but I still want to check in my code as part of the build
If these options don't work for you and you still want to check in the changes as part of the build, you can either use the TFVC Build Tasks if you're using TFVC, or use the Git Build Tools to add the remote to the local repository and then use the git command-line tools to commit and push the changes back to the repository.
These extensions require TFS 2015 Update 2 to install, but you can push the individual build tasks using the tfx command-line tool. For the TFVC tasks the process is explained here.
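With the git route, the core of the commit-and-push script is small. A sketch, assuming the "Allow scripts to access OAuth token" option is enabled for the build and branch handling is simplified:

```powershell
# Commit the bumped manifest; ***NO_CI*** tells VSTS not to trigger another build
git config user.email "build@example.com"
git config user.name  "Build Agent"
git add AndroidManifest.xml
git commit -m "Bump version to $Env:BUILD_BUILDNUMBER ***NO_CI***"

# Authenticate with the build's OAuth token and push back to the source branch
git -c http.extraheader="AUTHORIZATION: bearer $Env:SYSTEM_ACCESSTOKEN" `
    push origin "HEAD:$Env:BUILD_SOURCEBRANCHNAME"
```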
On Mac
On a Mac it's going to be harder, since you're using TFVC. My TFVC tasks leverage the TFS Client Object Model and PowerShell to communicate with the TFS server. The tf.exe tool doesn't even work on Windows in the context of a build, which is why I need to call into the VersionControlServer object directly. Given that dependency on these technologies, the tasks won't run on a Mac or Linux agent.
You could try whether the Team Explorer Everywhere cross-platform command line works from the build agent (using a shell script). I have no way to test this on an actual Mac.
Given the cross-platform nature of your project, I'd recommend moving to Git; it integrates into Xcode and Android Studio, making it easier to do a native UI or build on top of native libraries.
Alternative 2
You could set up one build which makes the required changes to the code and checks in the modified code, and then have a (CI) build run the Android and Mac builds using the modified code.

How to use NuGet packages on build server/production server without internet?

Background
I have the following components:

- My local solution (.NET 4.5), which makes use of NuGet packages.
- A PowerShell build script in my solution that has targets to build, run unit tests, do Web.config transforms, etc.
- A build server without an internet connection running CruiseControl.NET, which calls my build script to build the files. It also serves as the (IIS7) environment for the dev build.
- A production server with IIS7 that does not have internet access.
Goal
I would like to use the NuGet packages from my solution and have them stored locally as part of the source, without having to rely on an internet connection or a NuGet package server on my build and production servers.
Question
How can I tell MSBuild to properly deploy these packages, or is this the default behavior of NuGet?
Scott Hanselman has written an excellent article entitled How to access NuGet when NuGet.org is down (or you're on a plane). If you read through this article, you'll see at the end that the suggestions he makes are primarily temporary-type solutions and he goes out of his way to say that you should never need the offline cache except in those emergency situations.
If you read at the bottom of his article, however, he makes this suggestion:
If you're concerned about external dependencies on a company-wide scale, you might want to have a network share (perhaps on a shared builder server) within your organization that contains the NuGet packages that you rely on. This is a useful thing if you are in a low-bandwidth situation as an organization.
This is what I ended up doing in a similar situation. We keep a share with the latest versions of the various packages that we rely on (of course, I'm assuming you're on some type of network). It works great and requires just a little work to update the packages on a semi-regular basis (we have a quarterly update cycle).
Another article that may also be of help to you (was to me) is: Using NuGet to Distribute Our Company Internal DLLs
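Registering such a share as a package source is a one-liner per machine; the UNC path below is a placeholder:

```powershell
# Point NuGet at the internal share instead of nuget.org
nuget sources add -Name "CompanyPackages" -Source "\\buildserver\NuGetPackages"

# Optionally disable the public feed on machines without internet access
nuget sources disable -Name "nuget.org"
```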
By default, NuGet puts all your dependencies in a packages/ folder. You can simply add this folder to your source control system, and NuGet will not need to download anything from the internet when you do your builds. You'll also want to ensure that NuGet Package Restore isn't configured on your solution.
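As a sketch of that approach with git (the force-add is only needed if packages/ is listed in .gitignore):

```powershell
# Restore once on a machine that does have internet access...
nuget restore MySolution.sln

# ...then commit the packages folder so offline builds find everything locally
git add -f packages/
git commit -m "Check in NuGet packages for offline builds"
```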
You'll have to make a decision; either you download/install the packages at build time (whether it be using package restore, your own script, or a build tool that does this for you), or you put the /packages assemblies in source control as if they were in a /lib directory.
We've had so many problems with using package restore and NuGet's Visual Studio extension internally that we almost scrapped NuGet completely because of its flaws, despite the fact that 1 of our company's 2 products is a private NuGet repository.
Basically, the way we manage the lifecycle is by using a combination of our products, BuildMaster and ProGet, such that:

- ProGet caches all of our NuGet packages (both packages published by ourselves and ones from nuget.org)
- BuildMaster performs both the CI and the deployment aspects and handles all the NuGet package restoration, so we never have to deal with massive checked-in libraries or the solution-munging nightmare that is package restore
If you were to adopt a similar procedure, it may be easiest to create a build artifact in your first environment which includes the installed NuGet package assemblies, then simply deploy that artifact to production without having to repeat the process.
Hope this helps,
-Tod
I know this is an old discussion, but how in the world is it bad to store all the files required to build a project just because of their size?
The idea that you should simply replace a library if it is no longer available is crazy. Code costs money, and since you don't control the libraries on git or in NuGet, a copy should be available.
One requirement that many companies have is auditing. What if a library were found to be stealing your data? How would you know for sure if the library has been removed from NuGet and you can't even build the code to double-check?
The one-size-fits-all NuGet and git ways of the web are not OK.
I think the way NuGet worked in the past, where the files were stored locally and optionally placed in source control, is the way to go.
