Application Insights 2.2.1 in .NET Core 2.0 - turn off output to debug - asp.net-mvc

This is the same question as this one, but for the recent version of Application Insights (2.2.1).
Since updating to version 2.2, the debug output is filled with AI data even when it is disabled the way it used to be.
Previously AI was enabled in Startup and I could do something like this:
services.AddApplicationInsightsTelemetry(options =>
{
    options.EnableDebugLogger = false;
    options.InstrumentationKey = new ConnectionStringGenerator().GetAITelemetryKey();
});
The new method of adding application insights, per the new VS templates, is to add it in Program.cs like this:
public static IWebHost BuildWebHost(string[] args) =>
    WebHost.CreateDefaultBuilder(args)
        .UseStartup<Startup>()
        .UseApplicationInsights(connectionStringGenerator.GetAITelemetryKey())
        .UseSerilog()
        .Build();
In this case there is no overload that takes any options, and if I remove UseApplicationInsights and revert to the original method it makes no difference. Either way the debug output window fills with AI logs.
In fact, even if there is no code to load AI at all (i.e. I remove both UseApplicationInsights and AddApplicationInsightsTelemetry) I still get the logs.
Thanks for any help.

You can opt out of telemetry (for debug, for example) by setting a DOTNET_CLI_TELEMETRY_OPTOUT environment variable to 1.

Visual Studio lights up Application Insights even if you have no code to enable it. You can create an environment variable, ASPNETCORE_PREVENTHOSTINGSTARTUP = True, to prevent Visual Studio from doing so.
How to do this?
Right-click the project in VS and select Properties. On the Debug tab, add the environment variable ASPNETCORE_PREVENTHOSTINGSTARTUP with the value True.
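If you prefer not to rely on the VS debug profile, here is a minimal sketch (my addition, not part of the original answer) that sets the equivalent host setting in code; it assumes ASP.NET Core 2.0's WebHostDefaults.PreventHostingStartupKey, which is the setting the environment variable maps to:
using Microsoft.AspNetCore;
using Microsoft.AspNetCore.Hosting;

public static IWebHost BuildWebHost(string[] args) =>
    WebHost.CreateDefaultBuilder(args)
        // Equivalent to ASPNETCORE_PREVENTHOSTINGSTARTUP=True: blocks hosting startup
        // assemblies, including the one Visual Studio injects for Application Insights.
        .UseSetting(WebHostDefaults.PreventHostingStartupKey, "true")
        .UseStartup<Startup>()
        .Build();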

Related

Diagnostics.Trace output for Power Query in Visual Studio 2019 fails to generate output

I am working on a custom data connector for Power Query using Visual Studio 2019. I include the following lines:
mytab = #table({"one","two"},{{1,2},{3,4}}),
tlog = Diagnostics.Trace(TraceLevel.Information,
"Hello",
()=>mytab,
true
)
and expect to see output in the Log tab in VS when it runs, but there is none.
I checked the project properties and have "Show Engine traces" and "Show user traces" set to true under the Log section.
Have I constructed the Diagnostics.Trace call correctly? Have I missed a setting to enable logging? I don't return tlog (i.e. in tlog) since I am returning the query output and just want to generate some trace information.
Thanks in advance for any help.
Cheers!

BuildHTTPClient not able to get Build Definition Steps?

We are using the BuildHTTPClient to programmatically create a copy of a build definition, update the variables in memory and then save the updated object as a new definition.
I'm using Microsoft.TeamFoundation.Build2.WebApi.BuildHTTPClient 16.141. The TFS version is 2017 Update 3 (REST API 3.x).
This is a similar question to https://serverfault.com/questions/799607/tfs-buildhttpclient-updatedefinition-c-example but I'm trying to stay within using the BuildHttpClient libraries and not go directly to the RestAPIs.
The problem is the Steps list is always null along with other properties even though we have them in the build definition.
UPDATE: Posted as an answer below.
After looking at @Daniel Frost's attempt below, we started looking at using older versions of the NuGet package. Surprisingly, the supported version 15.131.1 does not support this, but we found out that version="15.112.0-preview" does.
After rolling back all of our DLLs to match that version, the steps were cloned when saving the new copy of the build.
All of the code examples we used work when you are using this package. We were unable to get Daniel's example working, but the version of the DLL was the issue.
We need to create a GitHub issue and report it to MS.
First Attempt - GetDefinitionAsync:
VssConnection connection = new VssConnection(DefinitionTypesDTO.serverUrl, new VssCredentials());
BuildHttpClient bdClient = connection.GetClient<BuildHttpClient>();
Task <BuildDefinition> resultDef = bdClient.GetDefinitionAsync(DefinitionTypesDTO.teamProjectName, buildID);
resultDef.Wait();
BuildDefinition updatedDefinition = UpdateBuildDefinitionValues(resultDef.Result, dr, defName);
updatedTask = bdClient.CreateDefinitionAsync(updatedDefinition, DefinitionTypesDTO.teamProjectName);
The update works on the variables and we can save the updated definition back to TFS but there are not any tasks in the newly created build definition. When we look at the object that is returned from GetDefinitionAsync we see that the Steps list is empty. It looks like GetDefinitionAsync just doesn't get the full object.
Second Attempt - Specific Revision:
int rev = 9;
Task <BuildDefinition> resultDef = bdClient.GetDefinitionAsync(DefinitionTypesDTO.teamProjectName, buildID, revision: rev);
resultDef.Wait();
BuildDefinition updatedDefinition = UpdateBuildDefinitionValues(resultDef.Result, dr, defName);
Based on SteveSims' post, we were thinking we were not getting the correct revision, so we added the revision to the request. I see the same issue with the correct revision. Similarly to SteveSims' post, I can open the DefinitionURL in a browser and see that the tasks are in the JSON, but the BuildDefinition object is not populated with them.
Third Attempt - GetFullDefinition:
So then I thought to try GetFullDefinition; maybe that's what "Full" means. Of course, without any documentation on these libraries I have no idea.
var task2 = bdClient.GetFullDefinitionsAsync(DefinitionTypesDTO.teamProjectName, "MyBuildDefName","$/","TfsVersionControl");
task2.Wait();
Still no luck, the Steps list is always null even though we have steps in the build definition.
Fourth Attempt - Save As Template
var task2 = bdClient.GetTemplateAsync(DefinitionTypesDTO.teamProjectName, "1_Batch_Dev");
task2.Wait();
I tried saving the Build Definition off as a template. So in the Web UI I chose "Save as Template", still no steps.
Fifth Attempt - Using the URL as mentioned in SteveSims' post:
Finally I said OK, I'll try the solution SteveSims used: using the WebClient to get the object from the URL.
var client = new WebClient();
client.UseDefaultCredentials = true;
var json = client.DownloadString(LastDefinitionUrl);
//Convert the JSON to an actual builddefinition
BuildDefinition result = JsonConvert.DeserializeObject<BuildDefinition>(json);
This also didn't work; the build definition steps are null. Even when looking at the JSON (var json) I can see the steps, but the deserialized object is not loaded with them.
I've seen this post, which seems to add the Steps to the base definition. I've tried it, but honestly I'm having an issue understanding how he has modified the BuildDefinition object when referencing it via NuGet:
https://dennisdel.com/blog/getting-build-steps-with-visual-studio-team-services-.net-api/
After looking at @Daniel Frost's attempt below, we started looking at using older versions of the NuGet package. Surprisingly, the supported version 15.131.1 does not support this, but we found out that version="15.112.0-preview" does.
After rolling back all of our DLLs to match that version, the steps were cloned when saving the new copy of the build.
All of the code examples above work when you are using this package. We were unable to get Daniel's example working, but we didn't try hard as we had working code.
We need to create a GitHub issue for this.
Found this in my code, which works.
Use this package, not sure if it could have an impact (joke).
...packages\Microsoft.TeamFoundationServer.Client.15.112.1\lib\net45\Microsoft.TeamFoundation.Build2.WebApi.dll
private Microsoft.TeamFoundation.Build.WebApi.BuildDefinition GetBuildDefinition(string projectName, string buildDefinitionName)
{
var buildDefinitionReferences = _buildHttpClient.GetFullDefinitionsAsync(projectName, "*", null, null, DefinitionQueryOrder.DefinitionNameAscending, top: 1000).Result;
return buildDefinitionReferences.SingleOrDefault(x => x.Name == buildDefinitionName && x.DefinitionQuality != DefinitionQuality.Draft);
}
With the newer clients, Steps will always be empty. In newer API versions (which are used by the newer clients) the steps have moved to Phases. If you use GetDefinitions or GetFullDefinitions and look in
definition.Process.Phases[0].Steps
you'll find them. (GetDefinitions gets shallow references so the process won't be included.)
The Steps collection still exists for compatibility reasons (we don't want apps to crash with stuff like MethodNotFoundExceptions) but it won't be populated.
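For reference, a rough sketch of that lookup in C# (my addition; treat the strongly typed DesignerProcess cast as an assumption for your client version - the dynamic approach shown in the answers below works as well, and buildClient, projectName and definitionId are placeholders):
using Microsoft.TeamFoundation.Build.WebApi;

var definition = await buildClient.GetDefinitionAsync(projectName, definitionId);
// In newer API versions the steps live under the process phases, not definition.Steps.
if (definition.Process is DesignerProcess designerProcess)
{
    foreach (var phase in designerProcess.Phases)
    {
        foreach (var step in phase.Steps)
        {
            Console.WriteLine(step.DisplayName);
        }
    }
}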
I was having this problem too: although I was able to get the Phases[0] information at runtime, I could not get it at design time. I solved this by using the dynamic type.
dynamic process = buildDefTemplate.Process;
foreach (BuildDefinitionStep tempStep in process.Phases[0].Steps)
{
// do some work here
}
Now it is working!
With Microsoft.TeamFoundationServer.Client version 16.170.0, I can get build steps through process.Phases[0].Steps only with process and step being dynamic, as @whitecore stated above:
var definitions = buildClient.GetFullDefinitionsAsync(project: project.Name);
foreach (var definition in definitions.Result)
{
    Console.WriteLine(string.Format("\n {0} - {1}:", definition.Id, definition.Name));
    dynamic process = definition.Process;
    foreach (dynamic step in process.Phases[0].Steps)
    {
        Console.WriteLine(step.DisplayName);
    }
}

Choice for build tool: MSBuild, NANT or something else?

I am doing automation in my company. We are a C# shop.
Currently I am working on automated builds. NAnt is our flow-control tool. Since NAnt is not actively developed (the last binary was released in June 2012 and the GitHub repo is not active), MSBuild looks better. Therefore I prefer MSBuild, but retiring NAnt is still questionable - what is the cost?
I have come up with some pros and cons, but I know collective intelligence is better. Thanks for your help!
Update:
I have read the question, but the second answer raises a concern for me: there are multiple .NET Framework versions on the build machine; will that be troublesome?
MSBuild
Pros:
Commercial support
Community is growing
Integrated with VS and TFS
Keeps pace with .NET
Cons:
Requires rewriting the current scripts
Not familiar to the team
NANT
Pros:
Already in use
Familiar to the team
Cons:
Not updated for a long time (since 2012)
Community is not active
Lacks support for newer .NET versions
We wrote FlubuCore (rewrite of Flubu). It's an open source C# library for building projects and executing deployment scripts using C# code.
The main advantages of FlubuCore that I see are:
.NET Core support.
Easy to learn and use because you write the build script entirely in C#.
Fluent interface and IntelliSense.
Quite a lot of built-in tasks (compiling, running tests, managing IIS, creating deploy packages, publishing NuGet packages, executing PowerShell scripts...).
Write your own custom C# code in the script and execute it.
Run any external program or command in the script with RunProgramTask.
Reference any .NET library or C# source code file in the build script. There is now also an option to reference a NuGet package in the build script.
Write tests and debug your build script.
Use FlubuCore tasks in any other .NET application.
A web API is available for FlubuCore, which is useful for automated remote deployments.
Write your own FlubuCore tasks and extend the fluent interface with them.
You can find FlubuCore on NuGet:
Search for FlubuCore.Runner if you need it for a .NET project.
Search for dotnet-flubu if you need it for a .NET Core project.
Example of how FlubuCore is used in .NET:
protected override void ConfigureBuildProperties(IBuildPropertiesContext context) {
context.Properties.Set(BuildProps.NUnitConsolePath, @"packages\NUnit.ConsoleRunner.3.6.0\tools\nunit3-console.exe");
context.Properties.Set(BuildProps.ProductId, "FlubuExample");
context.Properties.Set(BuildProps.ProductName, "FlubuExample");
context.Properties.Set(BuildProps.SolutionFileName, "FlubuExample.sln");
context.Properties.Set(BuildProps.BuildConfiguration, "Release");
}
protected override void ConfigureTargets(ITaskContext session) {
var loadSolution = session.CreateTarget("load.solution")
.SetAsHidden()
.AddTask(x => x.LoadSolutionTask());
var updateVersion = session.CreateTarget("update.version")
.DependsOn(loadSolution)
.SetAsHidden()
.Do(TargetFetchBuildVersion);
session.CreateTarget("generate.commonassinfo")
.SetDescription("Generates common assembly info")
.DependsOn(updateVersion)
.TaskExtensions().GenerateCommonAssemblyInfo();
var compile = session.CreateTarget("compile")
.SetDescription("Compiles the solution.")
.AddTask(x => x.CompileSolutionTask())
.DependsOn("generate.commonassinfo");
var unitTest = session.CreateTarget("unit.tests")
.SetDescription("Runs unit tests")
.DependsOn(loadSolution)
.AddTask(x => x.NUnitTaskForNunitV3("FlubuExample.Tests"));
session.CreateTarget("abc").AddTask(x => x.RunProgramTask(@"packages\LibZ.Tool\1.2.0\tools\libz.exe"));
session.CreateTarget("Rebuild")
.SetDescription("Rebuilds the solution.")
.SetAsDefault()
.DependsOn(compile, unitTest);
}
//// Some custom code
public static void TargetFetchBuildVersion(ITaskContext context) {
var version = context.Tasks().FetchBuildVersionFromFileTask().Execute(context);
int svnRevisionNumber = 0; //in real scenario you would fetch revision number from subversion.
int buildNumber = 0; // in real scenario you would fetch build version from build server.
version = new Version(version.Major, version.Minor, buildNumber, svnRevisionNumber);
context.Properties.Set(BuildProps.BuildVersion, version);
}
Example of how FlubuCore is used in .NET Core:
public class MyBuildScript : DefaultBuildScript
{
protected override void ConfigureBuildProperties(IBuildPropertiesContext context)
{
context.Properties.Set(BuildProps.CompanyName, "Flubu");
context.Properties.Set(BuildProps.CompanyCopyright, "Copyright (C) 2010-2016 Flubu");
context.Properties.Set(BuildProps.ProductId, "FlubuExample");
context.Properties.Set(BuildProps.ProductName, "FlubuExample");
context.Properties.Set(BuildProps.SolutionFileName, "FlubuExample.sln");
context.Properties.Set(BuildProps.BuildConfiguration, "Release");
}
protected override void ConfigureTargets(ITaskContext context)
{
var buildVersion = context.CreateTarget("buildVersion")
.SetAsHidden()
.SetDescription("Fetches flubu version from FlubuExample.ProjectVersion.txt file.")
.AddTask(x => x.FetchBuildVersionFromFileTask());
var compile = context
.CreateTarget("compile")
.SetDescription("Compiles the VS solution and sets version to FlubuExample.csproj")
.AddCoreTask(x => x.UpdateNetCoreVersionTask("FlubuExample/FlubuExample.csproj"))
.AddCoreTask(x => x.Restore())
.AddCoreTask(x => x.Build())
.DependsOn(buildVersion);
var package = context
.CreateTarget("Package")
.CoreTaskExtensions()
.DotnetPublish("FlubuExample")
.CreateZipPackageFromProjects("FlubuExample", "netstandard2.0", "FlubuExample")
.BackToTarget();
//// Can be used instead of CreateZipPackageFromProject. See MVC_NET4.61 project for full example of PackageTask
//// context.CreateTarget("Package2").AddTask(x => x.PackageTask("FlubuExample"));
var test = context.CreateTarget("test")
.AddCoreTaskAsync(x => x.Test().Project("FlubuExample.Tests"))
.AddCoreTaskAsync(x => x.Test().Project("FlubuExample.Tests2"));
context.CreateTarget("Rebuild")
.SetAsDefault()
.DependsOn(compile, test, package);
}
}
Detailed presentation and documentation can be found here:
https://github.com/flubu-core/flubu.core
You can find full examples here:
https://github.com/flubu-core/examples
Thanks for all the answers. We have decided to use Cake since we are a C# shop.
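For anyone weighing the same options: Cake build scripts are themselves C#, which is what made it a natural fit for us. A minimal, illustrative build.cake (target and solution names here are placeholders, not from our setup) looks roughly like this:
var target = Argument("target", "Default");

Task("Build")
    .Does(() =>
{
    // Build the solution in Release configuration.
    MSBuild("./MySolution.sln", settings => settings.SetConfiguration("Release"));
});

Task("Default")
    .IsDependentOn("Build");

RunTarget(target);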
There is a property, nant.settings.currentframework, which is used to set the target framework in case you have multiple .NET Framework versions installed:
<property name="nant.settings.currentframework" value="net-2.0" />
As of the 0.92 build:
nant.settings.currentframework: The current target framework, e.g. 'net-1.0'.
nant.settings.currentframework.description: Deprecated. Description of the current target framework.
nant.settings.currentframework.frameworkdirectory: Deprecated. The framework directory of the current target framework.
nant.settings.currentframework.sdkdirectory: Deprecated. The framework SDK directory of the current target framework.
nant.settings.currentframework.frameworkassemblydirectory: Deprecated. The framework assembly directory of the current target framework.
nant.settings.currentframework.runtimeengine: Deprecated. The runtime engine of the current target framework if used, e.g. mono.exe.

Starting a Mono Process Programmatically

How can I start a process in mono using the Process.Start API? My best guess would be the following (in F#):
let start (path : string) =
System.Diagnostics.Process.Start("/usr/bin/env", sprintf "mono \"%s\"" path)
This seems to work on Linux, but it is obviously not correct for Mono on Windows. Is there any way I could obtain the location of the mono executable programmatically?
It turns out that you can basically just call Process.Start with the target executable path; there is no need to specify the mono executable.
You can find the location of Mono on Windows using the following registry keys:
$version = HKEY_LOCAL_MACHINE\Software\Novell\Mono\DefaultCLR
$monoprefix = HKEY_LOCAL_MACHINE\Software\Novell\Mono\$version\SdkInstallRoot
where you use the version you found to find the mono prefix.
Taken from this page
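A small C# sketch of reading those keys (my own illustration, not from the linked page; the key names are as listed above and the bin\mono.exe layout under the prefix is an assumption):
using Microsoft.Win32;

// Returns the Mono SDK install root from the registry, or null if Mono is not installed.
static string GetMonoPrefix()
{
    using (var monoKey = Registry.LocalMachine.OpenSubKey(@"Software\Novell\Mono"))
    {
        var version = monoKey?.GetValue("DefaultCLR") as string;
        if (version == null) return null;
        using (var versionKey = monoKey.OpenSubKey(version))
        {
            return versionKey?.GetValue("SdkInstallRoot") as string;
        }
    }
}
// The mono executable then typically lives at <prefix>\bin\mono.exe.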
Rather than starting a new instance of the CLR, you can start assemblies from within your existing instance. Microsoft documents the relevant functionality here: http://msdn.microsoft.com/en-us/library/yk22e11a%28v=vs.110%29.aspx. Mono implements this as well.
What you have to do is create a new AppDomain to provide you with an execution environment isolated from your current one, load an assembly in there, and execute it.
Example:
var domain = AppDomain.CreateDomain("Foo");
domain.ExecuteAssembly("Bar.exe");

MvcIntegrationTestFramework or an alternative updated for ASP.NET MVC 3

I'm interested in using Steve Sanderson’s MvcIntegrationTestFramework or a very similar alternative with ASP.NET MVC 3 Beta.
Currently when compiling MvcIntegrationTestFramework against MVC 3 Beta I get the following error due to changes in MVC:
Error 6
'System.Web.Mvc.ActionDescriptor.GetFilters()' is obsolete: '"Please call System.Web.Mvc.FilterProviders.Providers.GetFilters() now."' \MvcIntegrationTestFramework\Interception\InterceptionFilterActionDescriptor.cs Line 18
Questions
Can anybody provide the MvcIntegrationTestFramework working for ASP.NET MVC 3 Beta?
--- and / or ---
Are there similar alternatives you would recommend?
EDIT #1: Note: I have e-mailed Steve, the creator of MvcIntegrationTestFramework, also hoping for some feedback there.
EDIT #2 & #3: I have received a message from Steve. Quoted for your reference:
I haven't needed to use that project with MVC 3, so sorry, I don't have an updated version of it. As far as I'm aware it should be possible to update it to work on MVC 3, but you'd need to figure that out perhaps by inspecting the MVC 3 source code to notice any changes in how actions, filters, etc are invoked now. If you do update it, and if you decide to adopt it as an ongoing project (e.g., putting it on Github or similar), let me know and I'll post a link to it! (Thanks Steve!)
EDIT #4: Honestly, I had a quick stab at using System.Web.Mvc.FilterProviders.Providers.GetFilters() but didn't get anywhere fast, and simply adding the [Obsolete] attribute showed that there was an error in the internals of the MVC requests. Anybody else had a dabble?
EDIT #5: Please comment if you are using an alternative Integration Test Framework with MVC 3.
Have a look at my fork:
https://github.com/JonCanning/MvcIntegrationTestFramework/
I realize this is not the answer you're looking for but Selenium or Watin may be of use to you as an alternative to the Integration Test Framework.
Selenium will let you record tests as nUnit code so you can integrate with your existing test projects etc. Then your test can validate the DOM similarly to the Integration Test Framework. The advantage is Selenium tests can be executed with various different browsers.
Key caveat is that Selenium needs your app to be deployed on a web server, not sure if that's a show stopper for you.
I thought I would share my experiences with using MvcIntegrationTestFramework in an ASP.NET MVC 4 project. In particular, the ASP.NET MVC 4 project was a Web Role for a Windows Azure Cloud Service.
Although the example project from Jon Canning's fork worked for me (after I changed the System.Web.Mvc assembly from 3.0.0.0 to 4.0.0.0, which required a bunch of editing in the web.config file to get the tests to run and pass), I got an error whenever I tried to run the same tests against an Azure ASP.NET MVC 4 Web Role project. The error was:
System.Reflection.TargetInvocationException: Exception has been thrown by the target of an invocation.
The inner exception was:
System.InvalidOperationException: This method cannot be called during the application's pre-start initialization phase.
I started wondering how an Azure Web Role project based on ASP.NET MVC 4 was different to a normal ASP.NET MVC 4 project, and how such a difference would cause this error. I did a bit of searching on the web but didn't come across anybody trying to do the same thing that I was doing. Soon enough I managed to realise that it was to do with the Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener. Part of the role of this class seems to be to ensure that the web role is running in a hosted service or the Development Fabric (you'll see a message to this effect if you switch the startup project from the cloud service project to the web role project inside a cloud service solution, and then try to debug).
The solution? I removed the corresponding listener from the Web.config file of my Web Role project:
<configuration>
  ...
  <system.diagnostics>
    <trace>
      <listeners>
        <!-- Remove this next 'add' element -->
        <add type="Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener, Microsoft.WindowsAzure.Diagnostics, Version=1.8.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"
             name="AzureDiagnostics">
          <filter type="" />
        </add>
      </listeners>
    </trace>
  </system.diagnostics>
  ...
</configuration>
I was then able to run integration tests as normal against my Web Role project. I did, however, add the listener to the Web.Debug.config and Web.Release.config transformation files so that everything was still the same for normal deploying and debugging.
Maybe that will help somebody looking to use the MvcIntegrationTestFramework for Azure development.
EDIT
I just realised that this solution might be a bit of a 'hack' because it might not let you do integration testing on application code that relates to Azure components (e.g. the special Azure caching mechanisms perhaps). That said, I haven't come across any issues to do with this yet, although I also haven't really written that many integration tests yet either...
I used Jon Canning's updated version (https://github.com/JonCanning/MvcIntegrationTestFramework/) and it solved my problem very well for controller methods that only accept value types and strings, but did not work for those that accepted classes.
It turns out there was an issue with the code for the updated MvcIntegrationTestFramework.
I figured out how to fix it, but don't know where else to post the solution, so here it is:
A simple sample to show how it works is:
[TestMethod]
public void Account_LogOn_Post_Succeeds()
{
string loginUrl = "/Account/LogOn";
appHost.Start(browsingSession =>
{
var formData = new RouteValueDictionary
{
{ "UserName", "myusername" },
{ "Password", "mypassword" },
{ "RememberMe", "true"},
{ "returnUrl", "/myreturnurl"},
};
RequestResult loginResult = browsingSession.Post(loginUrl, formData);
// Add your test assertions here.
});
}
The call to browsingSession.Post would ultimately cause the NameValueCollectionConversions.ConvertFromRouteValueDictionary(object anonymous) method to be called, and the code for that was:
public static class NameValueCollectionConversions
{
    public static NameValueCollection ConvertFromObject(object anonymous)
    {
        var nvc = new NameValueCollection();
        var dict = new RouteValueDictionary(anonymous); // ** Problem 1
        foreach (var kvp in dict)
        {
            if (kvp.Value == null)
            {
                throw new NullReferenceException(kvp.Key);
            }
            if (kvp.Value.GetType().Name.Contains("Anonymous"))
            {
                var prefix = kvp.Key + ".";
                foreach (var innerkvp in new RouteValueDictionary(kvp.Value))
                {
                    nvc.Add(prefix + innerkvp.Key, innerkvp.Value.ToString());
                }
            }
            else
            {
                nvc.Add(kvp.Key, kvp.Value.ToString()); // ** Problem 2
            }
        }
        return nvc;
    }
}
There were two problems:
The call to new RouteValueDictionary(anonymous) created the "new" RouteValueDictionary, but instead of 4 keys there were only three, one of which was an array of 4 items.
When it hits the line nvc.Add(kvp.Key, kvp.Value.ToString()), kvp.Value is an array, and the ToString() gives:
"System.Collections.Generic.Dictionary`2+ValueCollection[System.String,System.Object]"
The fix (to my specific issue) was to change the code as follows:
var dict = anonymous as RouteValueDictionary; // creates it properly
if (null == dict)
{
dict = new RouteValueDictionary(anonymous);
}
After I made this change, then my model class would properly bind, and all would be well.
