Very slow debugging ASP.NET MVC 3 project - asp.net-mvc

I have a solution which contains 11 projects. The main project is ASP.NET MVC. When I run the project in debug mode (F5), the main page of the site takes approximately two minutes to load. Note: the home page of the site is a login form, so there are not many SQL queries.
Interestingly, when I run the project without debugging (Ctrl+F5), the main page of the site loads in a few seconds.
I tried to look for solutions. For example, I found this solution. Unfortunately, I was unable to follow the instructions written by Zeb Kimmel.
I would be glad of any advice and suggestions.
P.S. I have an Intel Core 2 Duo E6300 processor, 3 GB RAM, and Windows 7 (32-bit).

Visual Studio IDE settings
Go to Tools - Options and set the following:
Projects and Solutions - Build and Run. Check "Only build startup
projects and dependencies on Run"
This prevents it from building all the projects every time!
Environment – General
Uncheck "Automatically adjust visual experience based on client
performance"
Uncheck "Enable rich client visual experience"
Uncheck "Use hardware graphics acceleration if available"
IntelliTrace – General
Uncheck "Enable IntelliTrace"
This disables a specific tracing technology most people don't use, that adds major overhead.
Environment - Startup
Set "At Startup" to "Show empty environment"
Disable "Download content every..."
PC setup
Get an SSD! We use an Intel SSD caching system, which improves
our build times by about 50%. Specifically, it's motherboards with
20 GB mSATA SSD drives using Intel Smart Response Technology (or is
it called Rapid Storage Technology collectively?). We have it set to
"maximized" mode, which means it caches writes as well as reads. I
suspect, but haven't yet tried, that a pure SSD would improve it
even more.
If you can't get an SSD, defragment drives properly. The built-in XP
defragmenter might not be good enough to defragment free space if the
drive has become heavily fragmented at some point! I used Auslogics
Disk Defrag.
Exclude the network, work, and Visual Studio folders from virus scanning.
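On newer versions of Windows, the Defender exclusions can be scripted rather than clicked through; a minimal sketch from an elevated PowerShell prompt (the paths are placeholders for your own solution and Visual Studio folders):

```shell
# Placeholder paths - substitute your own solution and IDE locations.
Add-MpPreference -ExclusionPath "C:\Work\MySolution"
Add-MpPreference -ExclusionPath "C:\Program Files (x86)\Microsoft Visual Studio 10.0"
Add-MpPreference -ExclusionProcess "devenv.exe"
```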
Project specific
Unload projects you're not maintaining or are unlikely to touch in
your work.
See these links:
Ways to speedup Visual Studio 2010
http://social.msdn.microsoft.com/Forums/vstudio/en-US/09893b7e-8882-49e6-a1df-4b1e0ce82843/tips-for-speeding-up-debugging-stepping-through-code

Related

Visual Studio 2019 Diagnostic Tools always say "There is no data in the current set of filters"

The diagnostic tools in Visual Studio 2019 Community have stopped working. It shows it's recording the CPU profile but whenever I pause the program to see the results, the tools say "There is no data in the current set of filters."
It was working at one point, and as far as I'm aware, I didn't change anything. If I go into the Filter drop-down menu, it shows everything except "Hide native code" selected.
How can I fix this?
A recent Windows update has apparently added a second problem to the original NVIDIA Display adapter issue.
See this ticket in Microsoft Developer Community: No Data in CPU Usage Tool, Windows Update related
Recently a Windows Update broke the CPU Usage tool in Visual Studio
where no data is collected. After analyzing the tool will report,
“There is no data in the current set of filters”. This is due to a bug
in the Windows ETW subsystem such that profiling events are not
emitted, we are working with the Windows ETW team to root cause the
issue and create a fix. As this affects ETW, the underlying system
which creates the profiling data, this will affect any ETW profiler:
Visual Studio, PerfView, WPA, XPerf, etc.
This is also discussed in the Microsoft Developer Community link in Andrey's answer, "Profiling CPU still states no user code was running". Scroll down to Mar 17, 2021.
It seems that there is now another underlying root cause besides the
original NVIDIA one which is breaking ETW profiling system wide. This
means any ETW profiler (Visual Studio, WPA, PerfView, etc) will be
affected since all of them rely on the same ETW system. Unfortunately
the EnableTraceEx2 system call returns success and we end up with no
profiling data in the resulting trace which is making debugging
difficult. I’m engaging with the Windows team that owns the ETW
subsystem and will most likely need additional diagnostics once we
figure out what our next steps are. In the meantime if anyone has paid
product support licenses feel free to engage with that as well,
hopefully working together we can uncover the root cause. Once I get
more information from the ETW team I’ll report back, until then stay
tuned.
Hi folks, just wanted to let everyone know that we have engaged with
the Windows ETW team on this and they are investigating. It seems like
a recent Windows Update may have caused this issue and they are
working with an internal customer in Xbox who has a repro. When I have
more info on the cause, workaround, and fix I will let everyone know.
In the meantime, stay tuned.
I've found an answer here. For now, the only solution seems to be disabling the NVIDIA display adapter in Device Manager and rebooting.
In my case the other sampling profilers, e.g. AMD μProf, didn't work well with that driver either.
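The same workaround can be applied without opening Device Manager; a sketch from an elevated PowerShell prompt (the "*NVIDIA*" match is an assumption, so check the adapter's FriendlyName on your machine first):

```shell
# Disable the NVIDIA display adapter, then reboot for the change to take effect.
Get-PnpDevice -Class Display |
    Where-Object FriendlyName -like "*NVIDIA*" |
    Disable-PnpDevice -Confirm:$false
Restart-Computer
```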
I had been having this issue for months and no fixes worked, but after I ran Repair on the IDE from the Visual Studio Installer, profiling now works properly.
Edit: This didn't permanently fix the issue. But going to "Virus & threat protection settings" and disabling "Real-time protection" allows it to collect data now too.
Edit 2: My first edit appears to describe the root cause of this, and there is an MSFT solution here

VFP 9 SP2 with MSSCCI: slow project loading

Is it possible that MSSCCI makes VFP project loading slow? The project has 1000+ files, and the workspace is on a server. The project takes about 120+ seconds to load. Network traffic is high during loading; CPU and memory show no significant change. How can I optimize project loading, please?
SOLUTION:
No. It seems that the slow loading is a consequence of using the MSSCCI provider for fairly large source-controlled projects in VFP.
We looked into moving from Visual SourceSafe to TFS a few years ago. When the VFP project was integrated with TFS, opening the project took longer than with VSS. There were also other oddities with the integration, such as not being able to see when a file was already checked out by someone else. We ended up abandoning the idea and stuck with VSS. That said, I wouldn't necessarily blame the MSSCCI provider. It probably has more to do with the way VFP queries source control data.
Note that you are not required to use the VFP project integration. You can use a separate source control client to check files in/out. You'll need a process for generating text versions of binary files (SCX, VCX, etc.).
FWIW, opening projects with VSS can also be slow. Upgrading our VSS server made a big difference. You may find the same if you are running TFS on an older/slower server.
I am not using it so I cannot directly comment on it.
A project is merely a table, and a project with 1000+ files would mean roughly 2 MB, which is nothing for today's networking (even if it meant bringing down all that data). Normally it should open instantly, or with a 1-2 second delay at most (assuming you are not on an extremely slow network).
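As a back-of-envelope check of that estimate (the ~2 KB-per-record figure is an assumption for illustration, not a measured .pjx record size):

```python
# Rough sizing: a VFP project is a DBF-style table with one record per file.
records = 1000
bytes_per_record = 2 * 1024  # assumed ~2 KB per record
project_size_mb = records * bytes_per_record / (1024 * 1024)

# Even a slow 10 Mbit/s link moves ~2 MB in under two seconds:
seconds_at_10_mbps = project_size_mb * 8 / 10

print(f"{project_size_mb:.2f} MB, ~{seconds_at_10_mbps:.2f} s at 10 Mbit/s")
```

So a two-minute load time points at the source-control round trips, not at the raw size of the project table.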
Please provide more details about your environment.
Make sure your TFS and MSSCCI provider are on the latest versions.
Try another client machine to see whether the issue reproduces.
Create a new workspace to see whether the performance problem persists.

Precompiling MVC ASP.NET application via publish, Are the resultant files IL Code or Native Images?

Just wanted to check whether the precompiled files from "publish" are IL or native. The file extension is .compiled for the views, and .dll for the others.
I have seen some comments implying that one of the advantages of doing this is eliminating startup lag due to JIT, which suggests they are native, but I am not sure.
I have chosen "no merging" of files for my first attempt.
Confirmation appreciated.
Thanks.
EDIT
Is there any potential difference if I select x86, "mixed platforms", or "any cpu"? The latter two might imply IL code, whereas x86 could be native. I am targeting a 32-bit Azure Websites instance and am trying to get rid of the warm-up period issue.
It is IL. You can confirm it by running CorFlags.exe. The CorFlags section of the header of a portable executable image will indicate whether it's 32-bit or AnyCPU, etc. Another utility that comes in handy is DUMPBIN.EXE.
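The flag values CorFlags.exe reports come from the CLI header defined in ECMA-335; as an illustration, here is how the ILONLY and 32BITREQUIRED bits combine (the describe_corflags helper is hypothetical, written to mimic the tool's summary):

```python
# CLI header flag bits from ECMA-335, Partition II.
COMIMAGE_FLAGS_ILONLY        = 0x00000001
COMIMAGE_FLAGS_32BITREQUIRED = 0x00000002

def describe_corflags(flags: int) -> str:
    """Hypothetical helper mimicking the summary CorFlags.exe prints."""
    il_only = bool(flags & COMIMAGE_FLAGS_ILONLY)
    requires_32bit = bool(flags & COMIMAGE_FLAGS_32BITREQUIRED)
    if il_only and not requires_32bit:
        return "AnyCPU (pure IL)"
    if il_only and requires_32bit:
        return "x86 (pure IL, 32-bit required)"
    return "contains native code (mixed-mode)"

print(describe_corflags(0x1))  # AnyCPU (pure IL)
print(describe_corflags(0x3))  # x86 (pure IL, 32-bit required)
```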
Even if you precompile your web applications, there's going to be an initial hit when you access the website. That process is described here. It's a tad dated, but much of it still applies. However, the hit with a precompiled website is going to be substantially less than with a non-precompiled one.
When compiling, select "Any CPU" and control whether it's a 32-bit or 64-bit application via IIS, Azure, or the hosting environment. Let the framework do what the framework does best.
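On IIS, for example, the bitness of an AnyCPU assembly is decided by the application pool rather than the compiler; a sketch with appcmd (the pool name is a placeholder):

```shell
# Host the AnyCPU assembly as a 32-bit process on 64-bit Windows.
%windir%\system32\inetsrv\appcmd set apppool "MyAppPool" /enable32BitAppOnWin64:true
```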

Why does my MVC4 project not have debug and release folders under the bin folder?

When I build my app, I just get a single bin folder, with all files in it, versus the usual bin\debug and bin\release folders. Why is this?
Because the website can be run by IIS (and its various flavours) from the location where you built it.
IIS expects the assemblies in the bin folder (it's hard-wired in the AppDomain setup), so the web project type compiles to this location.
I was asked this question in an interview. One link that answers it briefly is this.
The link gives the following statement:
Release Mode
When an assembly is built in release mode, the compiler performs all
available optimisations to ensure that the outputted executables and
libraries execute as efficiently as possible. This mode should be used
for completed and tested software that is to be released to end-users.
The drawback of release mode is that whilst the generated code is
usually faster and smaller, it is not accessible to debugging tools.
Debug Mode
Debug mode is used whilst developing software. When an assembly is
compiled in debug mode, additional symbolic information is embedded
and the code is not optimised. This means that the output of the
compiler is generally larger, slower and less efficient. However, a
debugger can be attached to the running program to allow the code to
be stepped through whilst monitoring the values of internal variables.
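The two modes quoted above boil down to compiler switches; roughly, as a sketch with the C# command-line compiler (not necessarily the exact flags Visual Studio passes):

```shell
# Debug configuration: full symbols, no optimisation, DEBUG defined.
csc /define:DEBUG;TRACE /debug:full /optimize- Program.cs

# Release configuration: optimised code, reduced symbol information.
csc /define:TRACE /debug:pdbonly /optimize+ Program.cs
```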
[Update] After a little googling, I came across a similar question, "Confused about Release/Debug folders in Visual Studio 2010", with the same answer I have quoted above.
Also, please look into why-have-separate-debug-and-release-folders-in-visual-studio. #riko and other members of Stack Overflow have answered it quite well.
This behavior is not specific to MVC4. In fact it is consistent with so-called "classic" ASP.Net, both Web Site projects and Web Applications.
The distinction between release and debug modes in ASP.Net is that Release builds need to be Published.
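Publishing a Release build of a Web Application project can be scripted with MSBuild; a minimal sketch, assuming a hypothetical project file MyMvcApp.csproj and a publish profile named FolderProfile:

```shell
msbuild MyMvcApp.csproj /p:Configuration=Release /p:DeployOnBuild=true /p:PublishProfile=FolderProfile
```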

Could I install Delphi and my libraries on a USB key in such a way as to allow debugging of my app on a customers PC?

Back in the days of Delphi 7, remote debugging was mostly OK. You set up a TCP/IP connection, tweaked a few things in the linker, and you could (just about) step through code running on another PC whilst keeping your Delphi IDE and its libraries on your development PC.
Today, with Delphi XE2/3/4, you have paserver which, at least at the moment, can be flaky and slow. It is essential for iOS (cross-platform) development, but here at Applied Relay Testing we often have to debug on embedded PCs that run recent Windows. To do this we have employed a number of strategies, but the hardest situation of all is to visit a customer site and wish that one could 'drop in' a Delphi IDE + libraries and roll up one's sleeves to step through and set breakpoints in source code.
It is quite likely (hopefully) that the paserver remote-debugging workflow and its incarnations will improve over time, but for now I got to wondering how it might be possible to install Delphi + libraries + our source code on a USB key so that, with only a minimal, perhaps automated, setup, one could plug that key into a PC and be compiling, running and debugging fairly quickly.
I can see that the registry is one of the possible issues; however, I do remember that Embarcadero once talked about being able to run their apps from a USB key. Knowing how much of a pain it is to install my 20-odd libraries into Delphi, though, it is not trivial and needs thinking about.
Has anyone done anything like this or have any ideas of how it might be done?
Delphi does not support what you are asking for. But what you could do is create a virtual machine with your OS, IDE, libraries, etc. installed in it, then copy the VM onto a USB drive, install the VM software on the customer's system, and run your VM as-is. There are plenty of VM systems to choose from.
First, I need to get this out of the way: embedded PCs running Windows?? Sob.
Ok, now for a suggestion: if a full virtual machine isn't an option for this task, application-level virtualization may be. This intercepts registry calls and other application-level information and maps them to a local copy, allowing essentially any application to be turned into a portable version. The good news is that there are free versions of several programs that can turn Windows programs into virtualized apps.
The only one I've personally used is MojoPac, and I found it delivered as promised, although it was very slow running off an (old, very slow) flash drive.
http://lifehacker.com/309233/make-any-application-portable-with-mojopac-freedom
I haven't used this newer "freedom" version though.
Two other programs I've seen that appear to be popular are Cameyo:
http://www.techsupportalert.com/content/create-your-own-portable-virtual-version-any-windows-program.htm
and P-Apps,
http://dottech.org/26404/create-a-portable-version-of-any-software-with-p-apps/
but I can't vouch for the quality of either of these two.
Hope this helps.