How to avoid using BuildManager.AddReferencedAssembly - asp.net-mvc

We are developing a web application that has a large number of plugin dll files.
We are able to load all the assemblies we need, but we do it with BuildManager.AddReferencedAssembly, which I suspect is the source of our problem.
The problem is that we need to update these dll files without restarting the application, yet the files are locked while the application is running.
We are not using the bin folder for these dll files. We use two separate folders: ~/plugins, where we copy the files ourselves, and ~/ptemp, into which the files from the plugins folder are copied and then loaded into the application, much like the ones in the bin folder.
As you know, you can overwrite the files in the bin folder while the application is running, so it is possible to overwrite project assemblies without stopping the application.
After some research I found that something similar should be possible using the private path settings on AppDomainSetup, but I couldn't figure out how. (We don't want to create another AppDomain; we want the files in the ptemp folder to be loaded into the main domain.)
Sorry about my poor English, but if you can, please help us.
By the way, some of these dll files contain MVC areas.
We are using .NET 4.5 and MVC 4.
If it is necessary, I can post more code; a rough sketch of what our loader does is below.
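(The class name and folder handling here are illustrative rather than our exact implementation.)
using System.IO;
using System.Reflection;
using System.Web;
using System.Web.Compilation;
using System.Web.Hosting;

[assembly: PreApplicationStartMethod(typeof(MyApp.PluginLoader), "Load")]

namespace MyApp
{
    public static class PluginLoader
    {
        public static void Load()
        {
            // Copy plugin dlls from ~/plugins to ~/ptemp, then register them with MVC via the BuildManager.
            var source = HostingEnvironment.MapPath("~/plugins");
            var target = HostingEnvironment.MapPath("~/ptemp");
            Directory.CreateDirectory(target);

            foreach (var dll in Directory.GetFiles(source, "*.dll"))
            {
                var copy = Path.Combine(target, Path.GetFileName(dll));
                File.Copy(dll, copy, true);                    // this copy is the file that ends up locked
                var assembly = Assembly.LoadFrom(copy);        // loaded straight into the main AppDomain
                BuildManager.AddReferencedAssembly(assembly);  // only allowed during pre-start initialization
            }
        }
    }
}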

Here is what we are doing:
// Append our dynamic-assembly folder to the AppDomain's shadow copy path so that
// ASP.NET copies those dlls to Temporary ASP.NET Files before loading them,
// leaving the originals on disk unlocked.
var shadowCopyDirectories = AppDomain.CurrentDomain.SetupInformation.ShadowCopyDirectories;
var newShadowCopyDirectories = shadowCopyDirectories
    + System.IO.Path.PathSeparator
    + HostingEnvironment.MapPath("~/App_Data/DynamicAssemblies/");
#pragma warning disable 618
// SetShadowCopyPath is marked obsolete, hence the disabled compiler warning
AppDomain.CurrentDomain.SetShadowCopyPath(newShadowCopyDirectories);
#pragma warning restore 618
This basically enables shadow copying of the assemblies in the given directory to Temporary ASP.NET Files, which means those assemblies will not be locked on disk. I tried to find a non-obsolete way of doing this, but after searching for about 30 minutes I came up empty and didn't have time to dig further.
Still, you cannot avoid an application restart, because:
1) You cannot unload the 'old' assemblies from the currently running application, which means you are left with junk in memory and can face a kind of dll hell by having two versions of the same assembly loaded at once (e.g. one type resolves against version 1 of the assembly while another resolves against version 2; see the sketch below).
2) You need a way to tell the BuildManager to recompile everything.
But in the end you are probably better off enabling shadow copying, because when recycling, a second worker process may be spawned while the first one keeps running until the second is ready; in that case you will not be able to update those 'dynamic' assemblies at all until you completely stop your application pool.
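To illustrate point 1, here is a small standalone sketch (the file paths and type name are hypothetical) showing that two side-by-side loads of the same assembly produce distinct runtime types, so casts between them fail:
using System;
using System.Reflection;

class TypeIdentityDemo
{
    static void Main()
    {
        // Hypothetical paths: two copies/builds of the same plugin, both loaded into this AppDomain.
        // Assembly.LoadFile always loads the exact file it is given, so both copies end up in memory.
        var first = Assembly.LoadFile(@"C:\site\ptemp\MyPlugin.dll");
        var second = Assembly.LoadFile(@"C:\site\ptemp\backup\MyPlugin.dll");

        Type a = first.GetType("MyPlugin.SomeService");
        Type b = second.GetType("MyPlugin.SomeService");

        Console.WriteLine(a == b);                        // False: same full name, different runtime types
        object instance = Activator.CreateInstance(a);    // assumes a public parameterless constructor
        Console.WriteLine(b.IsInstanceOfType(instance));  // False: a cast to the 'other' type would throw
    }
}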

It seems that the best way to do this is an application restart.

Related

What is the purpose of "building" a .NET MVC application if it runs fine without it?

I have a simple MVC Web application in the .NET Framework. To run it, I can click the green arrow ("play" button) in Visual Studio, which does a "build" and starts a Web browser pointing to the application.
Or, I can just start up IIS Express with the proper command line options, and navigate to localhost:8080 in a browser and run the application without a "build".
What is the purpose of "building" the application in Visual Studio if it runs fine without it?
The simple answer is that it doesn't run without the build step; your assumptions are wrong.
However, Visual Studio continuously monitors your source files and compiles them, e.g. to be able to show IntelliSense suggestions and compiler errors while you type. This means that there are in fact compiled binaries based on your source somewhere, maybe just not in the bin folder under your project root (that somewhere might be in memory, or in some cache location on disk, depending on circumstances out of scope for this question).
It's also very likely that you've previously built your application, resulting in binaries in your bin folder, even if you didn't do it with the purpose of running the application right after. In either case, if you get it working with IIS Express it's because it can find compiled binaries somewhere, and run those.
The main reason to have Visual Studio explicitly rebuild your app when you hit play, is to make sure that you're running the latest version of your code. Sure, it takes a few extra seconds every time you start the debugger, but it's nothing compared to the time you'd lose trying to track down a bug that you've already fixed in your code, but which still manifests in the running application, because the running application is an outdated version. (It also makes things like stepping through the code much less confusing, since, again, the source code on file will always be in sync with the running application.)

Trying to understand the finer details of application pool refresh and precompilation of Views in ASP.NET MVC

I am using ASP.NET MVC3, .NET4, Razor, C#, EF4.1, MS SQL2008(dev), SQL Azure(test,live).
I am deploying my web application to "Standard" Azure Websites. My process is:
1) Publish to local IIS folder, no precompilation options selected:
Project C# code is compiled into the project dll and put into the "bin" folder.
Views stay as source *.cshtml files.
2) FTP changed files to the deployment server using Beyond Compare, so:
The project DLL gets copied.
Source *.cs files (Controllers, helpers, Models) get copied.
Changed source *.cshtml Views get copied.
On the initial call to the deployment server the response is slow, due to JIT compilation. I suspect this is because of:
a) Views being compiled. Is this the major factor, as opposed to the project dll?
b) The project DLL is already compiled, so there is no issue here? Is this correct?
I try to keep the application pool in memory via ping services, either external monitoring sites (Uptime Robot) or MS's "Always On" feature, which amounts to the same thing. But one can still get app pool refreshes and thus slowdowns. It seems to me that everything should be precompiled for deployment, so that if the app is dropped from memory, rerunning it will be quick.
My question(s)
1) Is my understanding correct about what is precompiled and what is not?
2) What should I do now to maximise precompilation and minimise these app pool refresh penalties, and thus keep performance at its peak, i.e. no start-of-day warm-ups, etc.? My initial impression is that I should precompile the Views. I did try editing the project XML file, with this setting specifically:
<MvcBuildViews>true</MvcBuildViews>
However when I try to publish using the above setting I get an error:
obj\release\aspnetcompilemerge\source\web.config(45): error ASPCONFIG: It is an error to use a section registered as allowDefinition='MachineToApplication' beyond application level. This error can be caused by a virtual directory not being configured as an application in IIS.
EDIT
Having done a little more research, it seems that I need to focus on precompiling the Views, for which two main tools exist:
a) RazorSingleFile
b) RazorGenerator
Apparently
<MvcBuildViews>true</MvcBuildViews>
just compiles rather than precompiles. Not sure of the difference. So it is recommended to use one of the tools above.
EDIT2
My main MVC project dll is about 890 KB in size. Is this large? Would the size cause it to be dropped from memory more often?
There is a serious rabbit hole when building views. There is a known bug that causes this particular error. The only known fix is to use <BaseIntermediateOutputPath>. This path needs to exist, and it should not be within the project path (I think). I usually use:
<MvcBuildViews>true</MvcBuildViews>
<BaseIntermediateOutputPath>..\..\tempMVC</BaseIntermediateOutputPath>
The Aspnet Compiler Build Task in Visual Studio 2010 ASP.Net MVC 2 Projects
I am 95% sure that you can't pre-compile MVC views (cshtml, etc.) for deployment with any default version of Visual Studio (there may be add-ons). There are two separate issues here.
First, <MvcBuildViews> is only a build-time strongly-typed checker when set to true. It means that if you build a view against a specific @model, the compiler will check the view during the build to make sure you aren't using the model incorrectly (like trying to use a property on the model that doesn't exist).
Secondly, the notion of pre-compiling a view into an assembly is a process that exists for WebForms because there is a code-in-front and a code-behind that can be combined. MVC views aren't tied to logic in this way, so the compiler hasn't been designed to take them into account.
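As a concrete (made-up) example of what that build-time check catches: with <MvcBuildViews> set to true, a view like the following fails the build because Title is not a member of the model, instead of blowing up at runtime:
@model MyApp.Models.Order   @* hypothetical model with an OrderNumber property but no Title *@
<h2>@Model.Title</h2>       @* the build fails here with a 'missing member' compiler error *@
<p>@Model.OrderNumber</p>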
OK, my solution was to publish the application.
Then I selected:
Precompile during Publishing
I then clicked on "configure" and then unchecked:
Allow precompiled site to be updateable
I then selected:
Do not merge, create a separate assembly for each page or control.
This enabled the precompilation of the views, and put them in the bin folder, in the form of:
_orders.cshtml.4deb95a2.compiled
I understand this to be p-code, so it will still need a final compilation on first load on the deployment server. The project dll is worse since it is bigger.
By the way, I did encounter quite a few assembly reference issues that were caused by old excluded code files. I thought that excluding them was good enough, but no... In the end I just deleted them all and the precompilation completed successfully.
Doing this forced me to fix quite a few hidden Razor errors that would otherwise have surfaced at runtime rather than at compile time.

TFS Build custom activity requiring more assemblies than needed

I've just written the first version of a workflow activity that will run Resharper's Code Issues on the projects and parse the output to display the issues as build warnings and errors.
At first, I was going to just call ReSharper's command line and parse the resulting XML manually. After fiddling with the dlls in ReSharper's SDK (through disassembly mostly), I found a way to parse the results using its own public classes, which I figured was a much more elegant and safe way to do this.
The first problem I have is that the NuGet package is absolutely huge. There are 140 MB of files in there, which to me is absurd for a single, unpartitioned package. There is such heavy coupling between them that, by using just a few model classes and the parser class, I have to drag a dozen or so of those dlls along, some of which seemingly have nothing to do with the main dlls I need. This is not a show stopper though; I'm struggling with something else now.
In the end, I managed to trim the dependencies I needed down to 41 assemblies (which is, again, insane, but alas). Initially, I tried removing everything and adding the missing references one by one, but this turned out to be unreliable, still missing some indirect references even after compiling successfully. Then I decided to code a small console application to find all assemblies referenced by the main ReSharper assemblies I use, which gave me the 41 references I mentioned (a sketch of that approach is below).
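The original snippet is not reproduced here, but a dependency walker along these lines (names are illustrative, not the original tool) does the same job: start from the root assemblies and follow GetReferencedAssemblies recursively:
using System;
using System.Collections.Generic;
using System.Reflection;

class DependencyWalker
{
    static void Main(string[] args)
    {
        var seen = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
        var queue = new Queue<Assembly>();

        foreach (var path in args)                   // paths of the root ReSharper assemblies
            queue.Enqueue(Assembly.LoadFrom(path));

        while (queue.Count > 0)
        {
            var current = queue.Dequeue();
            if (!seen.Add(current.FullName))
                continue;                            // already reported this assembly

            Console.WriteLine(current.FullName);

            foreach (var reference in current.GetReferencedAssemblies())
            {
                try { queue.Enqueue(Assembly.Load(reference)); }
                catch (Exception)
                {
                    // reference could not be resolved locally; list it anyway
                    Console.WriteLine("  (unresolved) " + reference.FullName);
                }
            }
        }
    }
}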
Since these are custom activities we are talking about, I decided to create a unit test project to validate them. Using these 41 references only, everything works correctly.
When I added the activity to the build workflow though, and pointed the build controller to the source control folder containing the required assemblies, every time I schedule a build, the process fails stating that I need one extra dll from Resharper's SDK. For example, this is the first one it asks:
Could not load file or assembly 'AsyncBridge.Net35, PublicKeyToken=b3b1c0202c0d6a87' or one of its dependencies. The system cannot find the file specified. (type FileNotFoundException)
When I add this specific assembly to the TFS folder, I get another similar error for another dll, and this keeps going on and on.
What I want to know is: how can I tell exactly which assemblies a workflow XAML will need in order to run correctly? My custom activity dll has two specific CodeActivities and a XAML-only activity that uses these two. This XAML activity is what I'm using directly in the modified workflow template.
I see that besides the references in my project, the XAML activity also contains a TextExpression.ReferencesForImplementation section, with some assembly names. I've run my dependency finder program on those dependencies too, and the results are the same 41 assemblies already at the TFS folder.
Meanwhile I'll go with having the whole SDK in the custom assemblies folder, but I would really like to avoid this in the future, since it contains such an enormous amount of unneeded, large dlls.
First, we had a request for our command line tool to support a workflow activity, and we decided to implement a plain MSBuild task instead, which is universal and works in TFS too. Task and targets files are included in ReSharper CLT 8.2.
Second, if you still want to implement workflow activity it's pretty easy to do with new API in CLT, designed specially for custom processing of found issues - http://confluence.jetbrains.com/display/NETCOM/Custom+InspectCode+Issue+Logger.
And last, but not least, you do not need to put the binaries of the ReSharper SDK package in VCS.
Use NuGet's restore package functionality.
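For example (solution name hypothetical), restoring as a build step keeps the SDK binaries out of version control entirely:
nuget restore MyCustomActivities.sln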
If you have any other questions I'll be glad to answer them.
A custom activity is loaded and run by the .NET CLR like any other .NET program. If the stack trace reports a missing file, then it's required by the CLR and you can't change this fact without refactoring your code.
Having the references to an entire SDK in the custom assembly folder doesn't make sense. I would prefer GAC deployment over a huge binaries folder in source control. Or maybe consider running these activities as pre/post-build scripts in MSBuild or PowerShell.

ASP.Net MVC: How to dynamically load assemblies (controllers) without an AppPool restart/recycle

I'm trying to write a module/plugin system for a website I'm working on. The overall goals are:
That the main website does not have to be recompiled every time a new plugin is added.
To be able to dump DLL's and CSHTML files into a set of folders that would basically add a set of routes, controller(s), and any other assemblies that the module depends on (Entity Framework, etc).
To avoid marking plugin files as "embedded resources" -- especially views.
To be able to add and remove these modules/plugins WITHOUT having to restart/recycle the IIS app pool or unload the app domain.
I got #1-3 working by following the example of Umbraco. Basically, I marked a method with the PreApplicationStartMethod attribute, and in it I shadow copy DLLs around, and use a custom ViewEngine to locate the module's CSHTML files. When my website first starts up, my module's controllers and views are working, and assemblies are loaded: Hooray!
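For anyone attempting the same thing, here is a minimal sketch of that kind of view engine; the folder layout and class name are illustrative, not the exact implementation referred to above:
using System.Linq;
using System.Web.Mvc;

// Illustrative sketch: a RazorViewEngine that also probes a ~/plugins folder for module views.
public class ModuleViewEngine : RazorViewEngine
{
    public ModuleViewEngine()
    {
        var moduleViewLocations = new[]
        {
            "~/plugins/Views/{1}/{0}.cshtml",        // {1} = controller name, {0} = view name
            "~/plugins/Views/Shared/{0}.cshtml"
        };

        // Check the module locations first, then fall back to the standard ones.
        ViewLocationFormats = moduleViewLocations.Concat(ViewLocationFormats).ToArray();
        PartialViewLocationFormats = moduleViewLocations.Concat(PartialViewLocationFormats).ToArray();
    }
}

// Registered at startup, e.g. in Application_Start:
//   ViewEngines.Engines.Insert(0, new ModuleViewEngine());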
However, when it came time to try part #4, I am getting this error when calling BuildManager.AddReferencedAssembly():
This method can only be called during the application's pre-start
initialization phase. Use PreApplicationStartMethodAttribute to
declare a method that will be invoked in that phase
It's been a very frustrating process so far, and my gut tells me that this error signifies a dead end. Is this the case, or is there a clever workaround?
Editing the web.config file when you add the new modules should cause the site to recompile.
You could automate this in a script, forcing your new .dll's to be copied to the live ASP.NET temp files directory.
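A minimal sketch of that automation (the web.config path is the obvious default, not taken from the question): after copying the new module dlls, "touch" web.config so ASP.NET tears the app domain down and recompiles against the new binaries on the next request:
using System;
using System.IO;
using System.Web.Hosting;

public static class SiteRecycler
{
    // Bumping web.config's timestamp makes ASP.NET restart the app domain,
    // which forces a recompile that picks up the freshly copied module dlls.
    public static void Touch()
    {
        var configPath = HostingEnvironment.MapPath("~/web.config");
        File.SetLastWriteTimeUtc(configPath, DateTime.UtcNow);
    }
}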
Check out portable areas. Essentially a regular MVC area (including views, controllers, etc.) gets compiled into a single dll. These dll's (one for each area) can be dropped into a hosting MVC website and can be called like any other MVC area.
Some references to get started:
Portable Areas three years later – Part 5
MvcContrib Portable Areas
"To be able to add and remove these modules/plugins WITHOUT having to restart/recycle the IIS app pool or unload the app domain."
It turns out that you cannot unload an assembly from an app domain.
How to unload an assembly from the primary AppDomain?

IIS 6.0 app pool sharing DLLs

I have 10+ apps that are in a single app pool. All these apps have some common dlls that they all load. The issue right now is that these dlls are put in the '\bin' folder of each app, so each app, even though it uses dll_a, ends up loading its own 'copy' of dll_a.
I have a few questions
1) Is this ok?
2) Should i put dll_a in some common folder and have all apps reference 1 single copy?
3) Does each worker process serving these apps load multiple copies of dll_a from different paths even though they are basically the same dll?
1) Is this ok? -- yes this is ok.
2) Should I put dll_a in some common folder and have all apps reference one single copy? -- You can if you want. The problem you will run into is that if you ever need to have one app use a different version of the dll, all the others will have to upgrade (or downgrade) with it. If deploying the dlls isn't a problem to manage, I would tend to give each app its own separate copy. We have a pretty automated process where I work, though, so keeping things in sync when they need to be is pretty trivial here.
3) Does each worker process serving these apps load multiple copies of dll_a from different paths even though they are basically the same dll? -- Yes, each process will have its own copy of the dll. Each application runs in its own memory space, so although technically they are using the same dll, each one will have a copy of it in memory.
