In Visual Studio can I run the Performance Profiler on a specific unit test? - visual-studio-2019

I would like to see where I have bottlenecks in a specific method. I believe this is input-dependent (a specific set of inputs causes a much longer run time than others). I already have unit tests exploring the various input possibilities.
Is there a way in Visual Studio to get Debug > Performance Profiler to run on a specific unit test, the same way I would debug a specific test (Test Explorer > Right Click > Debug)?
Visual Studio Professional 2019 Version 16.11.7
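One possible workaround (not part of the question itself, just a sketch) is to wrap the slow scenario in a small throwaway console project and point Debug > Performance Profiler at that instead; the class and method names below are hypothetical stand-ins for the real test.

    // Minimal sketch of a throwaway console harness, assuming the slow scenario lives in a
    // test class called CalculationTests with a method SlowInputScenario() -- both names are
    // hypothetical. Profiling this console project isolates exactly that code path.
    using System;

    public class CalculationTests
    {
        public void SlowInputScenario()
        {
            // ... arrange the specific inputs and call the method you suspect is slow ...
        }
    }

    public static class Program
    {
        public static void Main()
        {
            var tests = new CalculationTests();
            tests.SlowInputScenario();   // run only the scenario you want to profile
            Console.WriteLine("Done.");
        }
    }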

Related

Usage of Fine Code Coverage in Visual Studio 2022 Community Edition

I've installed the extension Fine Code Coverage (Version 1.1.191) in Visual Studio 2022 Community Edition. I have a few xUnit test projects in my solution. I get the coverage statistics and the report by using the following settings:
run the tests from the Visual Studio Test Explorer (not the xUnit test runner)
Test => Options => Fine Code Coverage => RunMsCodeCoverage:
selecting "No" or "IfInRunSettings" produces the mentioned output
selecting "Yes" without providing runsettings fails to produce any output, and the Output window for FCC gives the message "No cobertura files for ms code coverage"
The project's README states the following about using MS Code Coverage with FCC:
Firstly you need to change the RunMsCodeCoverage option from No.
Ms code coverage requires a runsettings file that is configured appropriately for code coverage. This requires that you have the ms code coverage package and have pointed to it with the TestAdaptersPaths element as well as specifying the ms data collector.
...
FCC does not require you to do this. If you do not provide a runsettings and RunMsCodeCoverage is Yes then FCC will generate one. If RunMsCodeCoverage is IfInRunSettings then if the project has runsettings that includes the ms data collector element configured correctly then FCC will process the collected results.
As pointed out in my settings above, I do not get a result if "RunMsCodeCoverage" is set to "Yes" without providing runsettings. On the other hand, setting "IfInRunSettings" without providing runsettings gives me the desired output (are the runsettings generated automatically in this case?). So my results seem to contradict the README. Could it be that the documentation on this point refers to the Enterprise edition of VS? I'd like to understand the different setup options/requirements, so anyone who can shed some light on these is welcome.
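For reference, the kind of runsettings file the README seems to describe would look roughly like the sketch below; the TestAdaptersPaths value is an assumption and has to point at wherever the ms code coverage package is actually restored on your machine.

    <!-- Rough sketch of a runsettings file for MS code coverage (not taken from the FCC docs).
         The TestAdaptersPaths value is a placeholder; it must point at the restored
         Microsoft.CodeCoverage package on your machine. -->
    <RunSettings>
      <RunConfiguration>
        <TestAdaptersPaths>C:\path\to\microsoft.codecoverage\build\netstandard2.0</TestAdaptersPaths>
      </RunConfiguration>
      <DataCollectionRunSettings>
        <DataCollectors>
          <!-- "Code Coverage" is the friendly name of the Microsoft data collector -->
          <DataCollector friendlyName="Code Coverage" />
        </DataCollectors>
      </DataCollectionRunSettings>
    </RunSettings>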

Validation Errors when using multiple IsEmbedded references

I'm encountering a problem when trying to create a DSL for use in Visual Studio IDE.
I'm trying to use the Visual Studio Modeling SDK to generate a Visual Studio extension allowing for the drawing of Fault Trees and Reliability Block Diagrams, plus some Markov Models, for reliability analysis.
As part of this I'd like to have two 'things', a Gate and an Event, which both have an output port. As the OutPort would not be shared with other objects, and is really inherent in the object itself, I've tried to make this IsEmbedded.
However, the Validation Engine complains when I have this IsEmbedded with 1..1 multiplicity, raising a Validation Error that it must be 0..1 multiplicity. Now that I have configured it as 0..1, however, I'm still getting the same Validation Error.
I did post this on the MSDN Visual Studio Forum, but with no success (I was redirected to Stack Overflow).
Does anyone know of good resources for the Modeling SDK?
Whilst the DSL platform appears to be well suited to what I'm after, if there is another way of (easily) getting the Toolbox drag/drop and node layout capability within Visual Studio, then I'm happy to hear about it.

Built-in code analysers vs NuGet packages

Having just switched to VS2019 I’m exploring whether to use code analysis. In the project properties, “code analysis” tab, there are numerous built-in Microsoft rule sets, and I can see the editor squiggles when my code violates one of these rules. I can customise these rule sets and “save as” to create my own.
I have also seen code analyser NuGet packages such as “Roslynator” and “StyleCop.Analyzers”. What’s the difference between these and the built-in MS rules? Is it really just down to more comprehensive sets of rules/more choice?
If I wanted to stick with the built-in MS rules, are there any limitations? E.g. will they still get run and be reported on during a TFS/Azure DevOps build?
What's the difference between legacy FxCop and FxCop analyzers?
Legacy FxCop runs post-build analysis on a compiled assembly. It runs as a separate executable called FxCopCmd.exe. FxCopCmd.exe loads the compiled assembly, runs code analysis, and then reports the results (or diagnostics).
FxCop analyzers are based on the .NET Compiler Platform ("Roslyn"). You install them as a NuGet package that's referenced by the project or solution. FxCop analyzers run source-code based analysis during compiler execution. FxCop analyzers are hosted within the compiler process, either csc.exe or vbc.exe, and run analysis when the project is built. Analyzer results are reported along with compiler results.
Note
You can also install FxCop analyzers as a Visual Studio extension. In this case, the analyzers execute as you type in the code editor, but they don't execute at build time. If you want to run FxCop analyzers as part of continuous integration (CI), install them as a NuGet package instead.
https://learn.microsoft.com/en-us/visualstudio/code-quality/fxcop-analyzers-faq?view=vs-2019
So, the built-in legacy FxCop and the NuGet analyzers only run at build time, while the extension-installed analyzers run live in the editor as you type but not at build time. Also, you have to specifically say to run legacy code analysis on build, whereas the NuGet analyzers run on build just because they are installed. And analyzers installed as NuGet packages or extensions won't run when you go to the menu option "Run Code Analysis".
At least, that's what I get out of that page.
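To make the build-time difference concrete (my own sketch, not something from the linked page): legacy analysis is switched on per project with the RunCodeAnalysis MSBuild property, while the Roslyn-based analyzers come in as an ordinary PackageReference. The version number below is just an example.

    <!-- Hypothetical csproj fragments; the package version is an example only. -->

    <!-- Legacy FxCop: must be explicitly enabled to run on build -->
    <PropertyGroup>
      <RunCodeAnalysis>true</RunCodeAnalysis>
    </PropertyGroup>

    <!-- Roslyn-based FxCop analyzers: run on build simply because the package is referenced -->
    <ItemGroup>
      <PackageReference Include="Microsoft.CodeAnalysis.FxCopAnalyzers" Version="3.3.2" PrivateAssets="all" />
    </ItemGroup>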
There's a link near the bottom of that page showing which code analysis rules have been ported to the new analyzers, including rules that are now deprecated.
https://learn.microsoft.com/en-us/visualstudio/code-quality/fxcop-rule-port-status?view=vs-2019
The different analyzers attempt to cover different coding styles and things Microsoft didn't cover when they built FxCop. With the little research I just did on this, there's a whole rabbit hole to follow, Alice, that would take more time than I have right now to devote to it. And it seems to be filled with lots of arcane knowledge and OCD-style code nitpicks that make Wonderland seem normal. But that's just my opinion.
There's lots of personal and professional opinion about various rules in these and in the basic Microsoft rules, so there's plenty of room to use what you want and disable what you don't. For a beginner, I'd suggest turning on only a few rules at a time (see the sketch below); that way you aren't inundated with more warnings and errors than lines of code. OK, that might be a bit of an exaggeration, but there are so many rules that really are nitpicks, especially on legacy code, that they aren't worth having enabled, since you likely won't have time to fix it all. You will also want to do basic research and use "common sense" when you decide what to enable. ("Do I really need to worry about variable capitalization coding style consistency on an app that's been ported into 4 different languages over 15+ years and has 10k files?") This is both personal and professional opinion here, so follow it or not.
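A sketch of how "only a few rules at a time" can be done in practice, assuming a recent enough analyzer/SDK version that supports per-rule severities in .editorconfig; the rule IDs below are arbitrary examples.

    # Example .editorconfig entries (rule IDs chosen purely for illustration).
    # Quiet everything first, then opt in to a handful of rules you actually care about.
    [*.cs]
    dotnet_analyzer_diagnostic.severity = none      # baseline: suppress analyzer noise
    dotnet_diagnostic.CA2000.severity = warning     # dispose objects before losing scope
    dotnet_diagnostic.CA1062.severity = suggestion  # validate arguments of public methods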
And don't forget the rules that contradict each other. Those are fun to deal with.......

How to disable Specflow intellisense?

My project is very large and has a huge number of test steps. As a result, when I am writing 'feature' files I find my computer grinds to a halt. On very large feature files, even without typing anything, one of my CPU cores will max out, and performance will degrade to the point where typing is extremely laggy, forcing me to restart Visual Studio.
Even on smaller feature files, the performance when writing feature files is also extremely slow as the Specflow intellisense looks at all the test steps in the project.
Is there a way to disable the Specflow intellisense or even to stop Specflow from analysing step bindings? Is there anything at all I can do to improve performance here?
You can disable intellisense by going to Tools -> Options -> SpecFlow -> Editor settings -> Enable Intellisense.
But I am afraid that it will just stop showing suggestions in the IDE while the analysis continues, so it won't solve your CPU issue (according to the code).
Setting the option "Enable project-wide analysis" to False, however, should solve your problem.

Very slow debugging ASP.NET MVC 3 project

I have a solution which contains 11 projects. The main project is ASP.NET MVC. When I run the project in debug mode (F5), the main page of the site takes approximately 2 minutes to load. Note: the home page of the site is a login form, and there aren't a lot of SQL queries.
Interestingly, when I run the project without debugging (Ctrl+F5), the main page of the site loads in a few seconds.
I tried to look for solutions. For example, I found this solution. Unfortunately, I was unable to execute the instructions written by Zeb Kimmel.
I would be glad to any advice and suggestions.
P.S. I have an Intel Core 2 Duo E6300 processor, 3 GB RAM, Windows 7 (32-bit).
Visual Studio IDE settings
Go to Tools - Options and set the following:
Projects and Solutions - Build and Run. Check "Only build startup projects and dependencies on Run"
This will prevent it building all the projects all the time!
Environment – General
Uncheck "Automatically adjust visual experience based on client performance"
Uncheck "Enable rich client visual experience"
Uncheck "Use hardware graphics acceleration if available"
IntelliTrace – General
Uncheck "Enable IntelliTrace"
This disables a specific tracing technology most people don't use, that adds major overhead.
Environment - Startup
Set "At Startup" to "Show empty environment"
Disable "Download content every..."
PC setup
Get an SSD! We use an Intel SSD caching system, which improves our build times by about 50%. Specifically, it's motherboards with 20GB mSATA SSD drives, using Intel Smart Response Technology (or is it called Rapid Storage Technology collectively?). We have it set to "maximized" mode, which means it caches writes as well as reads. I suspect, but haven't yet tried, that a pure SSD would improve it even more.
If you can't get an SSD, defragment drives properly. The built-in XP one might not be good enough to defragment free space if the drive has become heavily fragmented at some point! I used Auslogics Disk Defrag.
Exclude the network, work and VS folders from virus scanning.
Project specific
Unload projects if you're not maintaining them or are unlikely to affect them in your work.
Refer to these links:
Ways to speedup Visual Studio 2010
http://social.msdn.microsoft.com/Forums/vstudio/en-US/09893b7e-8882-49e6-a1df-4b1e0ce82843/tips-for-speeding-up-debugging-stepping-through-code
