ASP.NET MVC and EF Code First Memory Usage

I have an application built in ASP.NET MVC 3 that uses SQL CE for storage and EF CTP 5 for data access.
I've deployed this site to a shared host only to find that it is constantly being recycled as it's hitting the 100mb limit they set on their (dedicated) application pools.
The site, when running in release mode uses around 110mb RAM.
I've tried using SQL Server Express rather than CE and this made little difference.
The only significant difference came when I removed EF completely (using a fake repository): memory usage dropped to between 30mb and 40mb. A blank MVC template uses around 20mb, so I figured this isn't too bad?
Are there any benchmarks for "standard" ASP.NET MVC applications?
It would be good to know what memory utilisation other EF CTP users are getting as well as some suggestions for memory profiling tools (preferably free ones).
It's worth mentioning how I'm handling the lifetime of the EF ObjectContext. I am using session per request and instantiating the ObjectContext using StructureMap:
For<IDbContext>().HttpContextScoped().Use(ctx => new MyContext("MyConnStringName"));
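A minimal sketch of the matching end-of-request cleanup (this assumes StructureMap 2.x's static ObjectFactory; without it, HTTP-scoped objects can outlive the request):

// Global.asax.cs: releases and disposes everything registered with
// HttpContextScoped() when the request ends, so the ObjectContext and
// the entities it tracks don't linger in memory.
protected void Application_EndRequest(object sender, EventArgs e)
{
    StructureMap.ObjectFactory.ReleaseAndDisposeAllHttpScopedObjects();
}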
Many thanks
Ben

We did manage to reduce our memory footprint quite significantly. The IIS worker process now sits around 50mb compared to the 100+mb before.
Below are some of the things that helped us:
Check the basics. Make sure you compile in release mode and set compilation debug to false in web.config. It's easy to forget such things.
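For reference, this is the switch in question (the targetFramework value shown is just the .NET 4 / MVC 3 default):

<!-- web.config: debug="false" turns off debug compilation, which also
     enables batch compilation and reduces per-request overhead -->
<system.web>
  <compilation debug="false" targetFramework="4.0" />
</system.web>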
Use DEBUG symbols for diagnostic code. An example of this would be when using tools like NHProf (yes I've been caught out by this before). The easiest thing is to wrap such code in an #if DEBUG directive to ensure it's not compiled into the release of your application.
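A sketch of what that looks like (the initializer shown is NHProf's documented call; EFProf has an equivalent):

protected void Application_Start()
{
#if DEBUG
    // Profiler appenders buffer every statement they capture; wrapping the
    // initializer in #if DEBUG keeps that overhead out of release builds.
    HibernatingRhinos.Profiler.Appender.NHibernate.NHibernateProfiler.Initialize();
#endif
    // ... routes, areas, etc.
}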
Don't forget about SQL. ORMs make it too easy to ignore how your application is talking to your database. Using SQL Profiler or tools like EFProf/NHProf can show you exactly what is going on. In the case of EF you will probably feel a little ill afterwards, especially if you make significant use of lazy loading. Once you've got over this, you can begin to optimize (see point below).
Lazy loading is convenient but shouldn't be used in MVC views (IMO). This was one of the root causes of our high memory usage. The home page of our web site was creating 59 individual queries due to lazy loading (SELECT N+1). After creating a specific viewmodel for this page and eagerly loading the associations we needed we got down to 6 queries that executed in half the time.
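A sketch of the pattern (entity and viewmodel names here are illustrative, not from our app): one query with eager loads, projected into a page-specific viewmodel so the view never triggers lazy loads.

using System.Data.Entity; // lambda Include() in EF CTP5
using System.Linq;

public ActionResult Index()
{
    // db is the page's DbContext instance
    var posts = db.BlogPosts
        .Include(p => p.Author)
        .Include(p => p.Tags)
        .OrderByDescending(p => p.PublishedOn)
        .Take(10)
        .AsEnumerable() // switch to in-memory before shaping the viewmodel
        .Select(p => new BlogPostSummary
        {
            Title = p.Title,
            AuthorName = p.Author.Name,
            TagNames = p.Tags.Select(t => t.Name).ToList()
        })
        .ToList();

    return View(posts);
}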
Design patterns are there to guide you, not rule the development of your application. I tend to follow a DDD approach where possible. In this case I didn't really want to expose foreign keys on my domain model. However since EF does not handle many-to-one associations quite as well as NH (it will issue another query just to get the foreign key of an object we already have in memory), I ended up with an additional query (per object) displayed on my page. In this case I decided that I could put up with a bit of code smell (including the FK in my model) for the sake of improved performance.
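Illustratively (hypothetical entities), the compromise is just a scalar key sitting next to the navigation property:

public class Project
{
    public int Id { get; set; }
    public string Name { get; set; }

    public int CategoryId { get; set; }            // the FK "code smell"
    public virtual Category Category { get; set; } // navigation property
}

With CategoryId mapped, rendering a category link needs no extra query per Project.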
A common "solution" is to throw caching at performance issues. It's important to identify the real problem before you formulate your caching strategy. I could have just applied output caching to our home page (see note below) but this doesn't change the fact that I have 59 queries hitting my database when the cache expires.
A note on output caching:
When ASP.NET MVC was first released we were able to do donut caching, that is, caching a page apart from specific regions. The fact that this is no longer possible makes output caching pretty useless if you have user-specific information on the page. For example, we have a login status within the navigation menu of the site. This alone means I can't use output caching for the page, as it would also cache the login status.
Ultimately there is no hard and fast rule on how to optimize an application. The biggest improvement in our application's performance came when we stopped using the ORM for building our associations (for the public facing part of our site) and instead loaded them manually into our viewmodels. We couldn't use EF to eagerly load them as there were too many associations (resulting in a messy UNION query).
An example was our tagging mechanism. Entities like BlogPost and Project can be tagged. Tags and taggable entities have a many-to-many relationship. In our case it was better to retrieve all tags and cache them. We then created a LINQ projection to cache the association keys for our taggable entities (e.g. ProjectId / TagId). When creating the viewmodel for our page we could then build up the tags for each taggable entity without hitting the database. Again, this was specific to our application, but it yielded a massive improvement in performance and in lowering our memory usage.
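Very roughly, the mechanism looked like the following sketch (names and the cache policy are illustrative; uses System.Web.Caching):

// All tags, cached once
var tags = HttpRuntime.Cache["AllTags"] as List<Tag>;
if (tags == null)
{
    tags = db.Tags.ToList();
    HttpRuntime.Cache.Insert("AllTags", tags, null,
        DateTime.UtcNow.AddMinutes(30), Cache.NoSlidingExpiration);
}

// Association keys only; no Tag entities are materialized by this query
var keys = db.Projects
    .SelectMany(p => p.Tags, (p, t) => new { ProjectId = p.Id, TagId = t.Id })
    .ToList();

// Build viewmodels without hitting the database again
// (projects: the page's Project list, loaded elsewhere)
var viewModels = projects.Select(p => new ProjectViewModel
{
    Name = p.Name,
    Tags = keys.Where(k => k.ProjectId == p.Id)
               .Select(k => tags.First(t => t.Id == k.TagId).Name)
               .ToList()
}).ToList();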
Some of the resources / tools we used along the way:
EFProf - to monitor the queries generated by Entity Framework (free trial available)
ANTS Memory Profiler (free trial available)
Windows performance monitor (perfmon)
Tess Ferrandez's blog
Lots of coffee :)
Whilst we did make improvements that took us under the hosting company's (Arvixe) application pool limits, I do feel a sense of duty to advise people who are looking at their Windows Reseller plans that such restrictions are in place (since Arvixe do not mention this anywhere when advertising the plan). So when something looks too good to be true (unlimited x, y, z), it usually is.

The funny thing is, I think they got their estimate from this URL:
http://blog.whitesites.com/w3wp-exe-using-too-much-memory-and-resources__633900106668026886_blog.htm
P.S. It's a great article for checking whether you're doing anything the guy describes (for example, caching your pages).
P.P.S. Just checked our system and it's running at 50 megs currently. We're using MVC 2 and EF CTP 4.

Related

Reflection and performance in web applications

We know reflection is quite expensive, yet ASP.NET MVC is full of it. And there are so many ways to use and implement additional reflection-based practices, like ORMs, mappings between DTOs, entities and view models, DI frameworks, JSON parsing and many others.
So I wonder: do they all affect performance so much that it is strongly recommended to avoid reflection as much as possible and find other solutions, like scaffolding? And what tools are there for load testing a server?
There's nothing wrong with Reflection. Just use it wisely, i.e. cache the results so that you don't have to perform those expensive calls over and over again. Reflection is used extensively in ASP.NET MVC. For example, when the controller and action names are parsed from the route, Reflection is used to find the corresponding method to invoke. Except that once found, the result is cached, so that the next time someone requests the same controller and action name, the method to be invoked is fetched from the cache.
So if you are using a third party framework check the documentation/source code whether it uses reflection and whether it caches the results of those calls.
And if you have to use it in your code, same rule applies => cache it.
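A minimal sketch of that rule (an illustrative helper, not part of MVC; ConcurrentDictionary keeps the lookups thread-safe under web load):

using System.Collections.Concurrent;
using System.Reflection;

public static class PropertyCache
{
    private static readonly ConcurrentDictionary<string, PropertyInfo> Cache =
        new ConcurrentDictionary<string, PropertyInfo>();

    public static object GetValue(object instance, string propertyName)
    {
        var type = instance.GetType();
        var prop = Cache.GetOrAdd(type.FullName + "." + propertyName,
            _ => type.GetProperty(propertyName)); // reflected once per type/name
        return prop.GetValue(instance, null);
    }
}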
For stress testing, this SO post gives quite a few possibilities: Stress Testing ASP.Net application.
I have thought about this question myself, and come to the following conclusions:
Most people don't spend their days resubmitting pages over and over again. The time the server spends handling requests is minimal compared with the time users spend reading and consuming the pages, which at worst contain a few Ajax calls. So even if you have a million concurrent users of your application, you will generally not have to deal with a million requests at any given time.
The web is naturally based on string comparisons... there are no types in an HTTP response, so any web application is forced to deal with these kinds of tasks as a fact of everyday life. The fewer string comparisons and dynamic objects the better, but at their core they are unavoidable.
Although things like mapping by string comparison or dynamic type checking are slow, a site built with a non-compiled, weakly-typed language like PHP will contain far more of these actions. Despite the number of possible performance hits in MVC compared to a C# console application, it is still a superior solution to many others in the web domain.
The use of any framework will have a performance cost associated with it. An application built in C# with the .NET framework will for all intents and purposes not perform as well as an application written in C++. However, the benefits are better reliability, faster coding time and easier testing among others. Given how the speed of computers has exploded over the past decade or two, we have come to accept a few extra milliseconds here and there in exchange for these benefits (which are huge).
Given these points, in developing ASP.NET MVC applications I don't avoid things such as reflection like the plague, because it is clear that they can have quite a positive impact on how your application functions. They are tools, and when properly employed have great benefits for many applications.
As for performance, I like to build the best solution I can and then go back and run stress tests on it. Maybe the reflection I implemented in class X isn't a performance problem after all? In short, my first task is to build a great architecture, and my second is to optimise it to squeeze every last drop of performance from it.

Wicket: “large memory footprint!”, "Does Wicket scale?".. etc

Wicket uses the Session heavily, which could mean a “large memory footprint” (as stated by some developers) for larger apps with lots of pages. If you were to explain to a bunch of Fortune 500 CTOs that they have to adopt Apache Wicket for their large web application deployments and that their fears about Wicket's problems with scaling are just bad assumptions, what would you argue?
PS: The question concerns only scaling. Technical details and real-world examples are very welcome.
IMO, credibility for Apache Wicket in very-large-scale deployments is established by the following URL: http://mobile.walmart.com (view the source).
See also http://mexico.com, http://vegas.com and http://adscale.de, and look those domains up on Alexa to see their rankings.
So, yes it is quite possible to build internet scale applications using Wicket. But whether or not you are using Wicket, Struts, SpringMVC, or just plain old JSPs: internet scale software development is hard. No framework can make that easy for you. No framework can give you software with a next-next-finish wizard that services 5M users.
Well, first of all, explain where the footprint comes from: it is mainly the PageMap.
The next step would be to explain what a page map does, what it is for and what problems it solves (back buttons and popup dialogs, for example). These are problems which would otherwise have to be solved manually, at similar memory cost but at much greater development cost and risk.
And finally, tell them how you can affect what goes in the page map, the secondary page cache and thus how the size can be kept under control.
Obviously you can also show them benchmarks, but probably an even better bet is to drop a line to Martijn Dashorst (although I believe he's reading this post anyway :)).
In any case, I'd try to put two points across:
There's nothing Wicket stores in memory which you wouldn't have to store in memory anyway. It's just better organised, easier to develop, keep consistent, and test.
Java itself means that you're carrying some inevitable excess baggage around all the time. If they are so worried about footprint, maybe Java isn't the language they want to use at all. There are hundreds of large traffic websites written in other languages, so that's a perfectly workable solution. The worst thing they can do is to go with Java, take on the excess baggage and then not use the advantages that come with an advanced framework.
Wicket saves the last N pages in the session. This is done to be able to load a page faster when it is needed, which happens mostly in two cases: when using the browser back button, or in Ajax applications.
The back button is clear, no need to explain, I think.
About Ajax: each Ajax request needs the current page (the last page in the session cache) to find a component in it and call its callback method, update some model, etc.
From there on, the session size depends entirely on your application code. It will be the same for any web framework.
The number of pages to cache (N above) is configurable, i.e. depending on the type of your application you may tweak it as you find appropriate. Even when there is no in-memory cache (N=0), pages are stored on disk (again configurable) and will still be found, just a bit more slowly.
About some references:
http://fabulously40.com/ - social network with many users,
several education sites - I know of two in the USA and one in the Netherlands. They also have quite a lot of users,
currently I work on a project that expects to be used by several million users. Wicket 1.5 will be improved wherever we find hotspots.
Send this to your CTO ;-)

Is Entity Framework a bad choice for multiple websites and large applications?

Scenario: We currently have a website and are working on building a couple of websites, along with an admin website. We are using ASP.NET MVC, SQL Server 2005 and Entity Framework 4. So, currently we have a single solution that contains all the websites, and all the websites use the same Entity Framework model. The model currently has over 70 tables and will potentially have a lot more in the future... around 400?
Questions: Is the Entity Framework model going to be slower as it grows bigger? I have read quite a few articles saying it is pretty slow, due to the additional layers of mapping, compared to, say, ADO.NET. Also, we thought of having multiple models, but it seems that is bad practice too. And is LINQ useful when we are not using any ORM?
So, we are just curious how all the large websites using a similar technology to ours achieve good performance while using an ORM like EF, or do they never opt for an ORM? I have also worked on a LINQ to SQL application that had over 150 tables, and we incurred a huge startup penalty; the site took 15-20 seconds to respond when first loaded. I am pretty sure this was due to the large startup cost of the LINQ to SQL ORM. It would be great if someone could share their experience and thoughts regarding this. What are the best practices to follow? I know it depends on the application, but if performance is a concern, what are the best steps to take?
I don't have a definite answer for you, but I have found this SO post: ORM performance cost. It will probably be informative for you, especially the second-highest answer mentioning this site:
http://ormbattle.net/
My personal experience is that for any ORM mapper I have seen so far, Joel's Law of Leaky Abstractions applies heavily. So if you are going to use EF, make sure you have alternatives for optimization at hand.
I think you can certainly get EF4 to work in a performant way with a database with a large number of tables. That said, you will certainly have to overcome a number of hurdles that are specific to EF.
I don't think LinqToSql is a good alternative since Microsoft has stopped enhancing it for the most part.
What other alternatives have you considered? ADO.NET? NHibernate? Stored Procedures?
I know NHibernate may have trouble establishing the SessionFactory for 400 tables quickly, but that only happens once when the website application starts, which should be fairly rare if the application is used heavily. Each web request generally has a new Session and creating sessions from the session factory is very quick and inexpensive.
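A sketch of that shape (standard NHibernate API; configuration details omitted):

using NHibernate;
using NHibernate.Cfg;

public static class SessionFactoryHolder
{
    // Built once per AppDomain; this is the slow part with 400 mappings
    public static readonly ISessionFactory Factory =
        new Configuration().Configure().BuildSessionFactory();
}

// Per request: cheap and inexpensive
// using (var session = SessionFactoryHolder.Factory.OpenSession()) { ... }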
My biggest concern with EF is the management of the thing. If you have multiple models, then you suddenly have multiple maintenance jobs, making sure you never update the wrong model for the right database, or vice versa. This is a problem for us at the moment, and looks to only get worse.
Personally, I like to write SQL rather than rely on an abstraction on top of an abstraction. The DB knows SQL, so I keep it happy with hand-crafted stored procedures, or hand-crafted SQL in some cases. One huge benefit of this is that I can read the code to see what it was trying to do, and see the resulting data by copying and pasting the SQL from the log into the SQL query editor. That, in my opinion, makes support so much easier that it entirely invalidates any programmer benefit you might have from using an ORM in the first place (especially as EF generates absolutely unreadable SQL).
In fact, come to think of it, the only benefit an ORM gives you is that you can code a bit quicker (once you have everything set up and are not changing the schema, of course), and ultimately I don't think that benefit is worth the cost, not when you consider that I spend most of my coding time thinking about what I'm going to do, as the 'doing it' part is relatively small compared to the design, test, support and maintain parts.

Any MVC 3 tutorials on *Real World* development situations?

Total MVC noob here, but a long-time web developer: 10+ years. The tutorials for MVC 3 (and earlier versions) are great, but as usual they lack a ton of real, on-the-job scenarios.
For example, how often do you find yourself in a situation where you are going to create a new database from scratch, with no stored procs, so that you could actually use EF Code First? I don't know about you, but in my career it has been NEVER.
The usual story is that you are creating a new app or enhancing an existing app with new functionality that will connect to an existing very mature database with tons of stored procs, user defined functions and views and you are required either by management or time restrictions to use it all. And of course you may get to create some new tables but they usually will have joins to existing tables or in the least your app will have to query existing tables for some of the data.
To see a tutorial based on that scenario would be WORTH ITS WEIGHT IN GOLD. Especially the stored procedure scenario.
Thank you for any advice
Most of the earlier examples (e.g. the original NerdDinner) were based on either Linq to Sql or Entity Framework (without CodeFirst). Since CodeFirst is the 'new hotness' most of the latest examples use it.
The interesting part of the question, though, is that it highlights an important point: "it doesn't matter". Your data access strategy (EF, EF code first, NHibernate, L2S, raw SQL) is totally irrelevant to MVC. By that I don't mean it's unimportant, I mean that MVC doesn't place any constraints on you at all in that regard.
You will generally (in a well-designed MVC app) pass your controllers interfaces which let them call data access methods of various sorts (or perhaps another layer of indirection, with services that do other things before hitting the storage layer). The implementation of the data access, if it is using an ORM like EF or NHibernate, will then have mechanisms for you to either use a query syntax of some sort (e.g. LINQ), call stored procedures (possible in all major ORMs I've used), or push raw SQL if the situation calls for it.
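A sketch of that shape, with hypothetical names throughout: the controller sees only the interface, and the implementation happens to call a stored procedure (here via EF 4.1-style Database.SqlQuery, but NHibernate or raw ADO.NET would slot in just as well):

public interface IOrderRepository
{
    IEnumerable<OrderSummary> GetRecentOrders(int customerId);
}

public class EfOrderRepository : IOrderRepository
{
    private readonly MyContext _context;
    public EfOrderRepository(MyContext context) { _context = context; }

    public IEnumerable<OrderSummary> GetRecentOrders(int customerId)
    {
        // Calls an existing stored proc and maps the rows onto a POCO
        return _context.Database
            .SqlQuery<OrderSummary>("EXEC dbo.GetRecentOrders @p0", customerId)
            .ToList();
    }
}

public class OrdersController : Controller
{
    private readonly IOrderRepository _orders;
    public OrdersController(IOrderRepository orders) { _orders = orders; }

    public ActionResult Index(int customerId)
    {
        return View(_orders.GetRecentOrders(customerId));
    }
}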

Caching user data to avoid excess database trips

After creating a proof of concept for an ASP.NET MVC site and making sure the appropriate separation of concerns were in place, I noticed that I was making a lot of expensive redundant database calls for information about the current user.
Being historically a desktop and services person, my first thought was to cache the db results in some statics. It didn't take much searching to see that doing this would persist the current user's data across the whole AppDomain for all users.
Next I thought of using HttpContext.Current. However, if you put stuff here when a user is logged out, then when they log in your cached data will be out of date. I could update this every time login/logout occurs but I can't tell if this feels right. In the absence of other ideas, this is where I'm leaning.
What is a lightweight way to accurately cache user details and avoid having to make tons of database calls?
If the information you want to cache is per-user and only while they are active, then Session is the right place.
http://msdn.microsoft.com/en-us/library/system.web.sessionstate.httpsessionstate.aspx
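A minimal sketch inside a controller (key, type and repository names are illustrative): load once per login, keep it in Session, clear it on logout.

public UserProfile CurrentUserProfile()
{
    var profile = Session["UserProfile"] as UserProfile;
    if (profile == null)
    {
        profile = _repository.GetProfile(User.Identity.Name); // one DB hit
        Session["UserProfile"] = profile;
    }
    return profile;
}

// On logout: Session.Remove("UserProfile"); (or Session.Abandon())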
What you're looking for is System.Web.Caching.Cache
http://msdn.microsoft.com/en-us/library/system.web.caching.cache.aspx
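A sketch of the same lookup with the Cache, which is AppDomain-wide, hence the user name in the key (the helper and expiration policy are illustrative):

var key = "UserDetails:" + User.Identity.Name;
var details = HttpContext.Cache[key] as UserDetails;
if (details == null)
{
    details = LoadUserDetailsFromDb(User.Identity.Name); // hypothetical helper
    HttpContext.Cache.Insert(key, details, null,
        System.Web.Caching.Cache.NoAbsoluteExpiration,
        TimeSpan.FromMinutes(20)); // slides out 20 minutes after last use
}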
ASP.NET session-state management is good for some situations, but under heavy load it tends to create bottlenecks in ASP.NET performance. Read more about it here:
http://msdn.microsoft.com/en-us/magazine/dd942840.aspx
http://esj.com/articles/2009/03/17/optimize-scalability-asp-net.aspx
The solution to avoid bottlenecks is use of distributed caching. There are many free distributed caching solutions in the market like Memcached or NCache Express.
Don't know much about Memcached, but I've used NCache Express by Alachisoft; it lets you use ASP.NET caching without requiring any code changes.
