What is the difference between octo.exe's create-release and octopack as an argument to msbuild - jenkins

I am having trouble understanding the fundamentals of octopus deployment. I am using octo.exe with the create-release and deploy-release commands. I am also using the octopack plugin.
I am getting an error, but that's not really the point - I want to understand how these pieces fit together. I have searched and searched on this topic, but every article seems to assume the reader already has a ton of background info on Octopus and automated deployment, which I do not.
My question is: what is the difference between using octopack by passing the octopack argument to msbuild and simply creating a release using octo.exe? Do I need to do both, or will one or the other suffice? If both are needed, what do each of them do exactly?

Release and deployment as defined in the Octopus Deploy Documentation:
...a project is like a recipe that describes the steps (instructions) and variables (ingredients) required to deploy your apps and services. A release captures all the project and package details so it can be deployed over and over in a safe and repeatable way. A deployment is the execution of the steps to deploy a release to an environment.
OctoPack is
...the easiest way to package .NET applications from your continuous integration/automated build process is to use OctoPack.
It is easy to use, but as Alex already mentioned, you could also use nuget.exe to create the package.
Octo.exe
is a command line tool that builds on top of the Octopus Deploy REST API.
It allows you to do much of the things you'd normally do through the Octopus Deploy web interface.
So, OctoPack and octo.exe serve different purposes. You can't create a release with OctoPack, and octo.exe is not for creating packages.

OctoPack is there to package the project as a NuGet package. It has some additional properties to help with pushing a package onto the NuGet feed, etc.
octo.exe is used to automate the creation of releases on the Octopus server and optionally deploy.
Note: a release in Octopus is basically a set of instructions on how to make the deployment. It includes the snapshot of variables and steps, references to the versions of the NuGet packages, etc.
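To see how the two fit together in a build pipeline, here is a minimal sketch (the server URL, API key, and project name are placeholders): OctoPack packages during the MSBuild run, then octo.exe turns the pushed package into a release.
# 1. Build and package: OctoPack hooks into MSBuild, produces a .nupkg, and can push it to the Octopus feed
msbuild MySolution.sln /p:Configuration=Release /p:RunOctoPack=true /p:OctoPackPublishPackageToHttp=http://your-octopus-server/nuget/packages /p:OctoPackPublishApiKey=API-XXXXXXXX
# 2. Create a release from that package and (optionally) deploy it
octo.exe create-release --project "My Project" --server http://your-octopus-server --apiKey API-XXXXXXXX --deployto Dev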
OctoPack is a good starter; however, I stopped using it some time ago for a few reasons:
No support for .NET 2.0 projects (and I needed to move all legacy apps into Octopus)
I didn't like it modifying the project files (personal preference)
Pure nuget.exe was not much more work for me.
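For comparison, the plain nuget.exe route is just a pack and a push (a sketch; the .nuspec name and feed URL are placeholders):
# package the project from its .nuspec
nuget.exe pack MyWebApp.nuspec -Version 1.0.0 -OutputDirectory artifacts
# push the package to the Octopus built-in feed
nuget.exe push artifacts\MyWebApp.1.0.0.nupkg -Source http://your-octopus-server/nuget/packages -ApiKey API-XXXXXXXX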

Related

F# NuGet packages in Azure Functions

Using csx scripts in Azure Functions I can use the project.json file to install NuGet packages, but when I'm using fsx scripts the packages aren't installed (the log console never shows the "Starting NuGet restore" message). The only way I've found is installing locally and uploading the dependencies. Am I missing something?
I think the current execution model for F# in Azure Functions does not support project.json. There is a work-in-progress PR to improve F# support that will enable this.
For now, I think there are two options:
Install the packages locally and upload them to Azure (as you are doing)
If you're deploying via git, then I think the deployment lets you run a deployment script (in the same way Azure WebSites lets you run one).
I have not tested the second approach with Azure Functions, but I think it could work. For example, see the F# Snippets' deployment script, which calls a build script that starts by using Paket to restore dependencies. This way, you need just paket.bootstrapper.exe and paket.dependencies with paket.lock to specify your NuGet dependencies.
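The start of such a deployment script might look like this (a sketch, assuming paket.bootstrapper.exe is checked in under a .paket folder):
# download the matching paket.exe
.\.paket\paket.bootstrapper.exe
# restore the packages listed in paket.dependencies and pinned in paket.lock
.\.paket\paket.exe restore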

Packaging DSC configurations for TFS Release Management vNext

I am trying to grasp Release Management vNext and DSC configuration 'management' (how to manage DSC configuration files). In the 'Deploy Using PS/DSC' dialog box while editing a vNext Release Template:
Why is PSScriptPath relative?
Does it really mean that I somehow have to get the scripts I want to use relative to my current drop folder? What is the best way to achieve this? I want to be able to:
Have a separate git repository for configuration files
Reuse configuration files across different projects
I've read a promising article, Packaging DSC configurations for Visual Studio / TFS Release Management vNext, but it seems to be outdated and, from my point of view, some kind of hack.
How does Microsoft want us to use this? How to achieve reusable configurations in a separate repository?
Thank you
Use a submodule pointing to your separate configuration repository, then ensure the submodule is initialized during the build. You can then copy the configuration scripts to the build drop folder as part of your build script.
The reasoning is that your deployment scripts will evolve over time, and that evolution should be something that is captured. If you ever need to redeploy an old version of your software, that old version shouldn't be deployed using new scripts -- it should be deployed using the same version it used initially.
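A minimal version of that setup might look like this (a sketch; the repository URL is a placeholder, and the drop-location variable depends on your build type):
# one-time: link the shared configuration repository into this one
git submodule add https://example.com/your-org/dsc-configurations.git Configurations
# during the build: initialize the submodule, then stage the scripts into the drop folder
git submodule update --init
Copy-Item -Path .\Configurations -Destination $Env:TF_BUILD_DROPLOCATION -Recurse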

Build and Deploy a Web Application with TFS 2015 Build

We have just installed TFS 2015 (Update 1) on-premise and are trying to create a Continuous Integration/Build system using the new TFS Build system. The build works fine, and gives me a green light, but when I look at the default build it has only built the binaries from the bin directory, and there seems to be no easy way to deploy the app on-premise to a local server.
There are two deploy options, a filesystem copy and a PowerShell script, and it would certainly be easy enough to use them to copy files to a new server; but since the build only built the binaries, I don't see a tool to gather up the web artifacts (cshtml, images, scripts, css, etc.) for this.
After an exhaustive Google search, I've only found one article that talks about this:
http://www.deliveron.com/blog/building-websites-team-foundation-build-2015/
However, this uses WebDeploy and creates a rather messy deploy package.
How can I deploy the site (standard MVC web application, in fact my tests are using the default boilerplate site created by the create project wizard) complete with artifacts to a local server in the easiest possible way? I don't want to have to install WebDeploy on the servers, and would rather use PowerShell or something to deploy the final artifacts.
The build is just the standard Visual Studio build template, with 4 steps (Build, Test, Index & Publish, Publish Build Artifacts).
We use "Visual Studio Build" step and as Arguments for MSBuild we use following line:
/p:DeployOnBuild=True /p:PublishProfile=$(DeploymentConfiguration)
On the Variables tab, DeploymentConfiguration has to be configured. It must be the name of the publish profile (the filename of the .pubxml file). If the filename is Build.pubxml, the publish profile is Build.
for example:
/p:DeployOnBuild=True /p:PublishProfile=Build
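For reference, a minimal file-system publish profile along those lines might look like this (a sketch; the publishUrl is a placeholder). For a web project it would be saved as Properties\PublishProfiles\Build.pubxml:
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <WebPublishMethod>FileSystem</WebPublishMethod>
    <publishUrl>\\deploy-server\wwwroot\MySite</publishUrl>
    <DeleteExistingFiles>True</DeleteExistingFiles>
  </PropertyGroup>
</Project>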
I wanted to add that Ben Day has an excellent write-up that helped us package quickly and then release to multiple environments through Release Manager.
His msbuild arguments look like this:
/p:DeployOnBuild=True /p:DeployDefaultTarget=WebPublish /p:WebPublishMethod=FileSystem /p:DeleteExistingFiles=True /p:publishUrl=$(build.artifactstagingdirectory)\for-deploy\website
The difference between this and the accepted answer is that this parameter set stages everything in an artifacts folder, and then saves it as part of the build. We can then deploy exactly the same code repeatedly.
We capture the web.env.config files alongside the for-deploy folder and then use xdt transforms in the release process to ensure everything gets updated for whichever environment we're deploying to. It works well for all our web projects.
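One way to apply those xdt transforms from a release script is the Microsoft.Web.XmlTransform assembly that ships with Visual Studio (a sketch; the paths and file names are placeholders):
# load the transform engine (the DLL's location varies by Visual Studio version)
Add-Type -Path "Microsoft.Web.XmlTransform.dll"
$deployDir = "C:\deploy\for-deploy\website"  # placeholder
$doc = New-Object Microsoft.Web.XmlTransform.XmlTransformableDocument
$doc.PreserveWhitespace = $true
$doc.Load("$deployDir\web.config")
$xdt = New-Object Microsoft.Web.XmlTransform.XmlTransformation("$deployDir\web.Staging.config")
# Apply returns $false if the transform failed
if ($xdt.Apply($doc)) { $doc.Save("$deployDir\web.config") }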
We use WebDeploy/MSDeploy for 40+ applications and love it. We do install WebDeploy on all our servers so we can deploy more easily but you could also use the Web Deploy On Demand feature which doesn't require WebDeploy be pre-installed.

Generating nuget-packages for Octopus Deploy and Azure Cloud Services

I've been trying for a week to deploy a web role to Azure Cloud Services without quite getting there.
Here is my setup:
I've got a cloud solution with a cloud project and an MVC application (standard, no changes to the template yet). It's under source control in Visual Studio Online.
I'm using OctoPack to try to generate the NuGet package
I'm using the built-in NuGet repo from Octopus
The Octopus server and tentacle are hosted on a VM in Azure
I've created a step-template for my deployment step (see this article)
My plan:
I'd like to have a CI build to a dev-service and a separate build to push my project to the staging environment and roll it onto the production environment using Octopus.
My problem:
The packages that are produced by OctoPack seem not to contain what they should. I've tried to play around with the nuspec file included in my web role to get it just right, but something ends up missing either way.
Has anyone gotten this to work? I'd appreciate any tips pointing me in the right direction, as I've slowly been running out of ideas. So I turn to you, my fellow nerdlings, for some much-needed help.
Regards
ZiGGstern
Correct me if I'm wrong, but it looks like you need octo.exe to automate deployments to your target environments after the build within Visual Studio/TFS Online.
I'm trying to focus on this statement:
I'd like to have a CI build to a dev-service and a separate build to push my project to the staging environment and roll it onto the production environment using Octopus.
Within your build template, you can use the "Post-Deploy Script Path" to point at a PowerShell script that calls octo.exe (with an API key) and fires off a deployment for your desired environment(s). You can customize this per build if you so choose. I've used this method by creating a folder within the root of my solution (I call it 'Tools', but the name doesn't matter). Within that Tools folder, I add a PowerShell script AND octo.exe. The PS script fires octo.exe, which makes a call to my Octopus server and, with the "create release" option, automatically deploys to whatever environment AFTER my build finishes within TFS. Make sure to always include those files (right-click in VS and in file properties select 'always copy').
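Such a post-deploy script can be very small (a sketch; the project name, server URL, API key, and environment are placeholders):
# Tools\PostDeploy.ps1 -- invoked via the build template's "Post-Deploy Script Path"
$octo = Join-Path $PSScriptRoot "octo.exe"
& $octo create-release `
  --project "MyProject" `
  --server "http://your-octopus-server" `
  --apiKey "API-XXXXXXXX" `
  --deployto "Staging"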
I'm not quite sure why your NuGet packages would not be configured correctly, but that should be remedied first. Your question asks for two things, and it's not clear which is more important to you: the NuGet package or the deployment from the CI build. Having said that, I think you need to give more details on why you think your NuGet package is inadequate or not working correctly for your Azure services.
Please note, the site you supplied is using a custom PowerShell script in the form of a step template. It may be best to try the default Azure step within Octopus first before using a customized script. Just a thought.
Read more about the Octo.exe here: http://docs.octopusdeploy.com/pages/viewpage.action?pageId=360596

How to use NuGet packages on build server/production server without internet?

Background
I have the following components:
My local solution (.NET 4.5) which makes use of NuGet packages.
A PowerShell build script in my solution that has targets to build, run unit tests, do Web.config transforms, etc.
A build server without an internet connection running CruiseControl.NET that calls my build script to build the files. It also serves as the (IIS7) environment for the dev build.
A production server with IIS7 that does not have internet access.
Goal
I would like to utilize NuGet packages from my solution and have them stored locally as part of source control -- without having to rely on an internet connection or a NuGet package server on my build and production servers.
Question
How can I tell MSBuild to properly deploy these packages, or is this the default behavior of NuGet?
Scott Hanselman has written an excellent article entitled How to access NuGet when NuGet.org is down (or you're on a plane). If you read through this article, you'll see at the end that the suggestions he makes are primarily temporary-type solutions and he goes out of his way to say that you should never need the offline cache except in those emergency situations.
If you read at the bottom of his article, however, he makes this suggestion:
If you're concerned about external dependencies on a company-wide scale, you might want to have a network share (perhaps on a shared builder server) within your organization that contains the NuGet packages that you rely on. This is a useful thing if you are in a low-bandwidth situation as an organization.
This is what I ended up doing in a similar situation. We keep a share with the latest versions of the various packages that we rely on (of course, I'm assuming you're on some type of network). It works great and requires just a little work to update the packages on a semi-regular basis (we have a quarterly update cycle).
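On the machines that consume it, such a share can be registered as an ordinary package source (the share path is a placeholder):
nuget.exe sources add -Name "InternalPackages" -Source "\\buildserver\NuGetPackages"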
Another article that may also be of help to you (was to me) is: Using NuGet to Distribute Our Company Internal DLLs
By default, NuGet puts all your dependencies in a packages/ folder. You can simply add this folder to your source control system, and NuGet will not need to download anything from the internet when you do your builds. You'll also want to ensure that NuGet Package Restore isn't configured on your solution.
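If you take that approach with git, the only wrinkle is that packages/ is often listed in .gitignore, so it may need to be force-added (a sketch):
# check the restored packages into source control so builds never touch the network
git add -f packages/
git commit -m "Check in NuGet packages for offline builds"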
You'll have to make a decision: either you download/install the packages at build time (whether using package restore, your own script, or a build tool that does this for you), or you put the /packages assemblies in source control as if they were in a /lib directory.
We've had so many problems with using package restore and NuGet's Visual Studio extension internally that we almost scrapped NuGet completely because of its flaws, despite the fact that one of our company's two products is a private NuGet repository.
Basically, the way we manage the lifecycle is by using a combination of our products BuildMaster and ProGet such that:
ProGet caches all of our NuGet packages (both packages published by ourselves and ones from nuget.org)
BuildMaster performs both the CI and deployment aspect and handles all the NuGet package restoration so we never have to deal with massive checked-in libraries or the solution-munging nightmare that is package restore
If you were to adopt a similar procedure, it may be easiest to create a build artifact in your first environment that includes the installed NuGet package assemblies, then simply deploy that artifact to production without having to repeat the process.
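That staging step might look like this in PowerShell (a sketch; the paths are placeholders, and Compress-Archive assumes PowerShell 5 or later):
# stage the built site together with its restored package assemblies as one artifact
Copy-Item -Path .\MyApp\bin\Release\* -Destination .\artifact -Recurse -Force
Compress-Archive -Path .\artifact\* -DestinationPath .\MyApp-build.zip -Force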
Hope this helps,
-Tod
I know this is an old discussion, but how in the world is it bad to store all the files required to build a project because of their size?
The idea that if a library is not available you should just replace it is crazy. Code costs money, and since you don't control the libraries on git or on NuGet, a copy should be available.
One requirement that many companies have is an audit. What if a library was found to steal your data? How would you know for sure if the library has been removed from NuGet and you can't even build the code to double-check?
The one-size-fits-all NuGet and git ways of the web are not OK.
I think the way NuGet worked in the past, where the files were stored locally and optionally placed in source control, is the way to go.
