IIS 6.0 app pool sharing DLLs - memory

I have 10+ apps that are on a single app pool. All these apps have some common DLLs that they all load. The issue right now is that these DLLs are put in the '\bin' folder for each app. So each app, even though it uses dll_a, will end up loading its own 'copy' of dll_a.
I have a few questions:
1) Is this OK?
2) Should I put dll_a in some common folder and have all apps reference 1 single copy?
3) Does each worker process serving these apps load multiple copies of dll_a from different paths even though they are basically the same DLL?

1) Is this OK? -- Yes, this is OK.
2) Should I put dll_a in some common folder and have all apps reference 1 single copy? -- You can if you want. The problem you will run into is that if you ever need to have one app use a different version of the DLL, all the others will have to upgrade (or downgrade) with it. If deploying the DLLs isn't a problem to manage, I would tend to give each app its own separate copy. We have a pretty automated process where I work, though, so keeping things in sync when they need to be is pretty trivial here.
3) Does each worker process serving these apps load multiple copies of dll_a from different paths even though they are basically the same DLL? -- Yes, each app will end up with its own copy of the DLL. Each application runs in its own AppDomain inside the worker process, so although they are technically the same DLL, each one will have its own copy of it in memory.
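If you want to verify this yourself, you can dump the assemblies loaded into each application's AppDomain from a throwaway handler in each app. A minimal sketch (the handler name is made up, and you would still need to register it in web.config):

using System;
using System.Web;

public class LoadedAssembliesHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        context.Response.ContentType = "text/plain";
        foreach (var asm in AppDomain.CurrentDomain.GetAssemblies())
        {
            try
            {
                // CodeBase reports the original path (e.g. this app's \bin folder),
                // even when ASP.NET has shadow-copied the file elsewhere.
                context.Response.Write(asm.GetName().Name + " -> " + asm.CodeBase + "\r\n");
            }
            catch (NotSupportedException)
            {
                // In-memory (dynamic) assemblies have no code base.
                context.Response.Write(asm.GetName().Name + " -> (dynamic)\r\n");
            }
        }
    }
}

Hitting this in each of the apps will show dll_a listed once per app, each pointing at that app's own \bin path.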

(Delphi 7) I can't run my program executable on non-Delphi PCs?

I was making a Delphi application and wanted to test it on another PC to see if everything was working properly. I compiled and built the executable file, of course, and transferred all of the files from the project folder to the other PC. When I launched the .exe file on that PC, nothing would happen. I then ticked the "Build with runtime packages" option in Project Options.
This made the .exe go from around 300 KB to around 30 KB, but now, instead of being able to launch the application on another (non-Delphi) PC, that PC got an error saying it was missing various files required to open the .exe.
I sent the same thing to various friends and all reported the same problem.
My application is a rather simple lottery prototype application, so I don't understand why I'm having trouble opening it on other PCs. Are there other special options I need to enable for this to work?
When you use runtime packages, you need to distribute those packages. These are the .bpl files that your program links to. It will be a subset of the packages listed in the runtime packages edit box in Project Options; you should list only the packages that you actually use.
The net result of doing this is that the total amount you have to distribute is much greater than a single monolithic executable, because in a monolithic executable the unused code can be stripped. If you want to minimize the size of your program, and make life simple, do not use runtime packages.
It would be worthwhile reading Embarcadero's documentation:
Working with Packages and Components
Solve the first problem.
Using runtime packages will not solve the problem of your EXE not running on certain PCs. All it does is increase the complexity of deploying your application (as you have found).
Unless you need runtime packages for other, specific reasons, you are far, far better off NOT using them, especially if you do not understand them (which, based on the way you describe having discovered them, does appear to be the case, if we're being honest).
Concentrate on finding out why your application does not run as a single, stand-alone EXE.
With all of the problems involving runtime packages, your EXE is currently not even reaching the point of running your application code, and this may be where your original problem lies. That means that once you have solved all the issues created by runtime packages, you may still be left with an EXE which does not run, i.e. your original problem.
What does your application do when it starts? Does it attempt to load files from any specific locations? What are those locations? What are the files? Are you using any third-party libraries which may expect DLLs or other external files to be present? Are you trying to read or write settings in the registry or any external files (INI files, etc.)?
What is the OS you are trying to run on? This can be a very significant question for applications compiled with older Delphi versions. Have you tried configuring the EXE to run in compatibility mode for older versions of Windows? (This is something you do in Windows itself, not when compiling the EXE.)
These are the questions you should be focusing on, not runtime packages.
G'day,
A small tool that's been around for a while to help you with this is Dependency Walker. You can find it at http://www.dependencywalker.com. It's helped me out on more than one occasion. It will tell you which files (usually BPLs, as stated in the other responses) need to be sent along with your EXE.
Also look at NSIS to create a simple installer, and put your EXE, supporting BPLs, and any other files in the same directory.

How to avoid using BuildManager.AddReferencedAssembly

We are developing a web application which has tons of plugin DLL files.
Now, we are able to load all the assemblies we need, but we are using BuildManager.AddReferencedAssembly while doing this, which, I guess, causes a little problem.
The problem is, we need to update these DLL files without restarting the application, but the files are locked while the application is running.
We are not using the bin folder for these DLL files. We are using 2 different folders (the first one, ~/plugins, is where we copy the files ourselves; the second, ~/ptemp, is where the files located under the plugins folder are copied to and then loaded into the application, like the ones in bin).
As you know, while the application is running you can overwrite the files in the bin folder, so it should be possible to overwrite project assemblies while the application is running.
After some research I found that it should be possible using the private-path settings in AppDomainSetup, but I couldn't figure out how. (We don't want to create another AppDomain; we want the files located in the ptemp folder to be attached to the main domain.)
Sorry about my poor English, but if you can, please help us.
By the way, some of these DLL files include some MVC areas.
We are using .NET 4.5 and MVC 4.
If it is necessary, I can post some code.
Here is what we are doing:
// Read the current shadow-copy directory list for this AppDomain.
var shadowCopyDirectories = AppDomain.CurrentDomain.SetupInformation.ShadowCopyDirectories;

// Append our own directory; entries are separated by Path.PathSeparator (';').
var newShadowCopyDirectories = shadowCopyDirectories
    + System.IO.Path.PathSeparator
    + HostingEnvironment.MapPath("~/App_Data/DynamicAssemblies/");

// SetShadowCopyPath is marked [Obsolete], but there is no non-obsolete way
// to do this on the current AppDomain, hence the disabled warning.
#pragma warning disable 618
AppDomain.CurrentDomain.SetShadowCopyPath(newShadowCopyDirectories);
#pragma warning restore 618
Basically, this enables shadow copying of the assemblies in the given directory to Temporary ASP.NET Files, which means that those assemblies will not be locked. I have tried to find the non-obsolete way of doing this, but after searching for about 30 minutes I came up empty and didn't have the time to search further.
Still, you cannot avoid an application restart, because:
1) You cannot unload the 'old' assemblies from the currently running application, which means you are going to be left with junk in memory and can face a kind of DLL hell by having two versions of the same assembly loaded (e.g. a Type A loaded from version 1 of the assembly is not the same type as the Type A loaded from version 2).
2) You need a way to tell the BuildManager to recompile everything.
But in the end you are probably better off enabling shadow copying, because when recycling, a second worker process may be spawned while the first one waits until the second is ready, in which case you will not be able to update those 'dynamic' assemblies until you completely stop your application pool.
It seems that the best way to do this is an application restart.
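For completeness, the registration step the question describes has to happen during ASP.NET's pre-application-start phase; BuildManager.AddReferencedAssembly throws if it is called after the application has started. Here is a minimal sketch of the ~/plugins to ~/ptemp arrangement from the question, under the assumption that shadow copying (as above) has been pointed at ~/ptemp; the MyApp.PluginLoader type name is made up:

using System.IO;
using System.Reflection;
using System.Web;
using System.Web.Compilation;
using System.Web.Hosting;

[assembly: PreApplicationStartMethod(typeof(MyApp.PluginLoader), "LoadPlugins")]

namespace MyApp
{
    public static class PluginLoader
    {
        public static void LoadPlugins()
        {
            string source = HostingEnvironment.MapPath("~/plugins");
            string target = HostingEnvironment.MapPath("~/ptemp");
            Directory.CreateDirectory(target);

            foreach (string dll in Directory.GetFiles(source, "*.dll"))
            {
                // Copy first so the originals under ~/plugins stay unlocked
                // and can be overwritten while the app runs.
                string copy = Path.Combine(target, Path.GetFileName(dll));
                File.Copy(dll, copy, true);

                // Register the copy with the build system so views and MVC
                // areas compiled by ASP.NET can see its types. This only
                // works here, during pre-start initialization.
                BuildManager.AddReferencedAssembly(Assembly.LoadFrom(copy));
            }
        }
    }
}

Note that Assembly.LoadFrom will lock the copies in ~/ptemp unless they fall under a shadow-copy directory, which is exactly what the SetShadowCopyPath call above works around; and, as explained, none of this removes the need for a restart when a plugin actually changes.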

Compiling an Xcode project from a network folder

I'm trying to compile an iOS project in Xcode from a network folder and I get all sorts of issues, most of them about files not being found on the first build command.
Most of the time it just says it hasn't found the headers included from other headers. If I hit Build again, the previous errors go away and new ones appear. Other times it just says "xxx.h" Resource temporarily unavailable. Hitting Build again, of course that file is available, but some other header is now unavailable, which makes things very frustrating.
PS: I am connected to a network folder on a Windows system using SMB. Sometimes if I disconnect and reconnect to the network folder, it briefly works flawlessly for about 3 seconds, and then starts spewing resource-unavailable errors. Unfortunately my build process lasts longer than 3 seconds. :|
UPDATE: It seems like the cause was disk access latency. I am running an OS X virtual machine and connecting to my real machine's HDD. I have now moved the entire VM to an SSD and everything works much more smoothly; it seems I don't have these issues anymore, so disk access times make all the difference. :)
UPDATE 2: For some reason I'm getting this error again. No idea what changed the situation.
It seems like it could be referenced files that aren't copied to the project folder. I have lots of projects that are located on a server share, and as long as all the resources used by the project are included in the project, it works fine.
That said, my personal choice these days is to house all of my projects in Dropbox. I still have to be careful not to forget to copy any added files into the project folder, but it is a lot easier for me to manage working on these apps from the various machines I work on.
In my experience, trying to work with an Xcode project that resides on a network share is completely hopeless. As you say, you get all sorts of random errors, some of the time but not all of the time. I simply have a "special" folder on the Mac that is instead shared to the OTHER machines on my network, and any time I want to work on an Xcode project, I copy it into that "local" folder (and back when done). Highly inefficient and error-prone, but in my experience it's the only thing that works reliably (pretty sad, really!).

Proper Way to Update Production Server with new Compiled MVC Application Files

Ok, please bear with my noob question here.
I'm doing the simple task of making an update to my MVC application, compiling it, and then moving it onto the production server.
I just want to know the best way to upload the compiled files. I have a single application pool, use FTP to upload the new application files, and the site points to a single directory.
If I update just one view, which files do I upload after compiling?
Is there a way to keep the site running while I upload new code/views?
Where can I go to find out this information?
Generally, you can update views without needing to recycle your web application. You would just replace the old version of the file with the new version, which can be done with a simple XCOPY command.
If there are code changes, then you will need to update the web project DLL, which requires the app to recycle. This may or may not be a huge disruption, but it does mean that users may have their sessions interrupted and lose some state.
Now, the question of how you could go about doing this is a little more complex. You can write a deployment process into your build scripts, which may be the easiest approach. The trick here, though, is that if you want to include only the files that have changed, this can be tricky using vanilla NAnt or MSBuild tasks (a rough sketch of the idea follows below). You may also want to look at the Web Deploy tool from the IIS team. I've not used it much myself, but it is designed specifically to deploy web projects.
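To illustrate the changed-files-only idea, here is a hedged sketch in C#: compare timestamps and copy only newer files from the local publish output to the site folder. The folder arguments are hypothetical, and a real deployment would need error handling and a dry-run mode:

using System;
using System.IO;

class IncrementalDeploy
{
    static void Main(string[] args)
    {
        // args[0]: local publish output; args[1]: the site's root folder
        // (e.g. a UNC path). Assumes no trailing separator on args[0].
        string source = args[0];
        string target = args[1];

        foreach (string file in Directory.GetFiles(source, "*", SearchOption.AllDirectories))
        {
            string relative = file.Substring(source.Length + 1);
            string dest = Path.Combine(target, relative);

            // Copy only files that are newer than what is already deployed,
            // so untouched views/DLLs are left alone and the app only
            // recycles when a \bin DLL actually changes.
            if (!File.Exists(dest) ||
                File.GetLastWriteTimeUtc(file) > File.GetLastWriteTimeUtc(dest))
            {
                Directory.CreateDirectory(Path.GetDirectoryName(dest));
                File.Copy(file, dest, true);
                Console.WriteLine("deployed " + relative);
            }
        }
    }
}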
You may also want to hit Google for some commercial deployment tools if none of the options so far seem to work for you.

Grouping DLLs for use in an Executable

Is there a way to group a bunch of DLLs and still use them at run time (not zipped up)? Sorry this question sounds terse and stupid, but I'm not sure what more to ask.
I'll explain the situation though:
We've had two standalone Windows applications, and now one of them has swelled to such ungainly proportions that the other application cannot run outside the scope of the first. We want to maintain some of the encapsulation we had while letting the smaller program in on some of the bigger program's features.
There is no problem in running the application, other than that we don't want to ship all 20-30 DLLs that the smaller project has.
It is possible to do this by adding startup code which checks whether the DLLs are present on the target system and, if not, extracts them from the resources section (or simply from data tacked onto the end of the exe). A good example of this being done is Process Explorer: it's distributed as a single binary, but when run it extracts and installs a driver.
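If the DLLs in question are .NET assemblies, a managed variant of the same extract-from-resources idea is to embed the DLLs as resources and resolve them at load time via AppDomain.AssemblyResolve, so nothing needs to be written to disk at all. A minimal sketch; the MyApp.EmbeddedDlls resource-name prefix is an assumption and must match how the DLLs were actually embedded in your project:

using System;
using System.IO;
using System.Reflection;

static class EmbeddedAssemblyLoader
{
    // Call once at startup, before any type from the embedded DLLs is used.
    public static void Install()
    {
        AppDomain.CurrentDomain.AssemblyResolve += (sender, args) =>
        {
            // Embedded resource names follow "<DefaultNamespace>.<Folder>.<FileName>".
            string name = new AssemblyName(args.Name).Name + ".dll";
            string resource = "MyApp.EmbeddedDlls." + name;

            using (Stream stream = Assembly.GetExecutingAssembly().GetManifestResourceStream(resource))
            {
                if (stream == null)
                    return null; // not one of our embedded DLLs

                byte[] raw = new byte[stream.Length];
                stream.Read(raw, 0, raw.Length);
                return Assembly.Load(raw);
            }
        };
    }
}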
If you have a situation where most, or all, of those assemblies have to be kept together, then I would highly recommend just merging the code files into the same project and recompiling. This would leave you with one assembly.
Of course there are other considerations, like compile time, overall size of the final DLL, how often the various pieces change, and whether each component is deployed without the others.
One example of a company that did this is Telerik. Their dev components are all compiled into the same assembly. This makes deployment an absolute breeze. Contrast that with DevExpress, which puts just about each control into its own assembly. Because of this, just maintaining, much less deploying, a DevExpress project is not something for the faint of heart.
(I don't work for either of those companies. However, I have a lot of experience with both toolkits.)
You could store the DLLs as resources and use BTMemoryModule, which essentially allows you to do a LoadLibrary on a stream.
That way you could compile the multiple DLLs straight into the EXE, or into a single resource DLL.
See http://www.jasontpenny.com/blog/2009/05/01/using-dlls-stored-as-resources-in-delphi-programs/
