ASP.NET MVC - what's the equivalent of ViewState?

I have an existing web-forms website that runs on a web-farm (multiple web servers, each request is NON-sticky).
One of the tasks is to retrieve a lot of data from a 3rd-party web service. This is an expensive process (in terms of time taken to respond). The optimal solution has been to initially grab all the data and stick it in the page's ViewState (as a List<Product>). We then have a grid that allows us to page through this list of products. For each request for the next page we don't have to re-visit the slow web service, because we've already cached the data in the ViewState.
So, how would I accomplish this using MVC? If I were using classic ASP, I would serialize the list and hold it in a hidden field in the form.
But what is the preferred approach when using MVC? As mentioned, I'm using non-sticky sessions, so I can't rely upon caching on any one server.
If I am to hold it in a hidden field, is it sensible to compress the data (zip) first to reduce the size of the page? Again, what's "best practice" here?
Many thanks for any/all advice
Griff
PS - I know there are similar posts out there (e.g. ASP.NET MVC and ViewState), but they don't quite provide the detail I require.

Caching, in my humble opinion, is the best way to deal with this; hit the webservice, cache the data, and use that for each subsequent request.
It is possible to share a cache across many web servers. The default behaviour is to hold it in-process, but this is by no means fixed: it can be configured to store the data in an out-of-process cache on another machine, a database, or any other form of semi-persistent storage. Azure AppFabric caching comes to mind.
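A minimal sketch of that cache-aside idea, assuming a hypothetical IProductService wrapper around the slow 3rd-party call (MemoryCache here is per-server; on a non-sticky farm you would plug a distributed provider in behind the same pattern):

    using System;
    using System.Collections.Generic;
    using System.Runtime.Caching;

    public class Product { public int Id { get; set; } public string Name { get; set; } }

    // Hypothetical wrapper around the slow 3rd-party web service.
    public interface IProductService { IList<Product> GetAllProducts(); }

    public class ProductCatalog
    {
        // MemoryCache is per-server. For a non-sticky web farm, swap in a
        // distributed cache (AppFabric, Memcached, a database) behind this class.
        private static readonly MemoryCache Cache = MemoryCache.Default;
        private readonly IProductService _service;

        public ProductCatalog(IProductService service) { _service = service; }

        public IList<Product> GetProducts(string cacheKey)
        {
            var cached = Cache.Get(cacheKey) as IList<Product>;
            if (cached != null) return cached;

            var products = _service.GetAllProducts();   // the expensive call, done once
            Cache.Set(cacheKey, products,
                      new CacheItemPolicy { AbsoluteExpiration = DateTimeOffset.Now.AddMinutes(10) });
            return products;
        }
    }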
Second to that, as you mentioned, is to dump all the data into a hidden field, but I don't like this idea for a number of reasons:
Page bloat - you're submitting this data every time the page is changed
Lost data - you must submit a form for every navigation; forgetting to do this means losing your data
To name two.

Your strategy should depend on how volatile the data retrieved from the 3rd party service is going to be and how expensive it is to call it.
Volatile and expensive. I would go down the road of caching the data in a distributed cache such as AppFabric Velocity or Memcached.
Non-Volatile and expensive. Cache a copy of it in memory on each server node.
Cheap. Hit the call every time you need it. Don't bother caching.
If the data set returned from the service is large, I would not pass it up and down every time you page through your grid data. Only retrieve the data for the current page and render it in your grid. Telerik has a good implementation of this, or you could roll your own.
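A rough sketch of only fetching the current page server-side (the IProductService facade, action name, partial view name and page size are all assumptions, not part of the original answer):

    using System.Collections.Generic;
    using System.Linq;
    using System.Web.Mvc;

    public class ProductGridController : Controller
    {
        private readonly IProductService _service;   // hypothetical facade over the cache/web service
        private const int PageSize = 25;

        public ProductGridController(IProductService service) { _service = service; }

        // Returns only the rows needed for the requested grid page.
        public ActionResult Page(int page = 1)
        {
            IList<Product> all = _service.GetAllProducts();
            var rows = all.Skip((page - 1) * PageSize).Take(PageSize).ToList();
            return PartialView("_ProductRows", rows);
        }
    }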

Related

nHibernate w/ASP.NET Session Per Request And Active Work Item

I am building an ASP.NET application using NHibernate, and I have implemented the session-per-request architecture: on each request I open a session, use it, then close it. I am using one large object across several views, and I am storing that object in the user's session cache so it maintains state for them across several different pages. The users do not want to have to save their changes on each page; they want the ability to navigate between several pages making changes, then commit them to the DB. This works well except when the users hit a page that triggers lazy loading on the proxy object (which fails because the session-per-request design closed the NHibernate session in the previous request). I know that turning lazy loading off would fix it; however, that is not an option due to the performance issues it would cause in other areas. I have tried changing the session-per-request design but have had no luck, since I do not know when it is "safe" to close the NHibernate session.
Has anyone else done anything similar to this or have any advice?
Thanks in advance!
Keeping objects in the user session (a server-side resource) is not the best approach. Just imagine that your application accidentally becomes very, very successful... there will be many users... and therefore many sessions == many resources.
I would suggest (based on some experience) re-thinking the architecture to avoid session. The extreme would be to go to a single-page application... but even making some of the "...navigation among several pages..." asynchronous would be better.
All of that means that:
we pass data to the client (the standard ASP.NET MVC way, with rendered views or some Web API JSON)
if needed, we send data back to the server (form binding or JSON)
In these scenarios, the standard NHibernate session will work. Why? Because we "reload and reassign" objects using the standard NHibernate infrastructure. That would be the way I suggest to go...
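A minimal sketch of that "reload and reassign" idea, assuming a per-request ISession injected into the controller; the Order entity, its properties and the OrderEditModel view model are made up for illustration:

    using System.Web.Mvc;
    using NHibernate;

    public class OrdersController : Controller
    {
        private readonly ISession _session;   // the request-scoped NHibernate session

        public OrdersController(ISession session) { _session = session; }

        [HttpPost]
        public ActionResult Edit(int id, OrderEditModel posted)   // OrderEditModel: hypothetical view model
        {
            // Reload the entity inside *this* request's session, reassign the posted
            // values onto it, and let the session persist the changes on flush/commit.
            var order = _session.Get<Order>(id);
            order.ShippingAddress = posted.ShippingAddress;
            order.Notes = posted.Notes;
            _session.Flush();   // or commit the request-scoped transaction

            return RedirectToAction("Details", new { id });
        }
    }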
But if you want to follow your current approach, then definitely check the merge functionality of NHibernate:
9.4.2. Updating detached objects
9.4.3. Reattaching detached objects
19.1.4. Initializing collections and proxies
Some quotes from the documentation:
In an application with a separate business tier, the business logic must "prepare" all collections that will be needed by the web tier before returning. This means that the business tier should load all the data and return all the data already initialized to the presentation/web tier that is required for a particular use case. Usually, the application calls NHibernateUtil.Initialize() for each collection that will be needed in the web tier (this call must occur before the session is closed) or retrieves the collection eagerly using a NHibernate query with a FETCH clause or a FetchMode.Join in ICriteria. This is usually easier if you adopt the Command pattern instead of a Session Facade.
You may also attach a previously loaded object to a new ISession with Merge() or Lock() before accessing uninitialized collections (or other proxies). No, NHibernate does not, and certainly should not do this automatically, since it would introduce ad hoc transaction semantics!
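Putting those quotes together, a sketch of re-attaching a detached object before touching its lazy members; the Order entity and its Lines collection are invented names:

    using NHibernate;

    public static class DetachedObjectHelper
    {
        // "detachedOrder" was loaded by a session in a *previous* request and stashed somewhere.
        public static Order PrepareForThisRequest(ISession currentSession, Order detachedOrder)
        {
            // Merge copies the detached state onto a persistent instance tracked by
            // the current session and returns that instance.
            var attached = currentSession.Merge(detachedOrder);

            // Alternative: re-associate the same instance without a state check.
            // currentSession.Lock(detachedOrder, LockMode.None);

            // Initialize anything the views will need before the session is closed
            // at the end of the request.
            NHibernateUtil.Initialize(attached.Lines);

            return attached;
        }
    }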

asp.net mvc consuming asp.net web api end point

Looking at this question:
SO question
The accepted answer by Darin Dimitrov looks appealing (the .NET 4.5 version). I am just wondering how this compares performance-wise with client-side solutions (e.g. using knockout/angular/jQuery to assemble the HTML given some JSON from the Web API endpoint). Has anyone ever done performance tests on this? What are the pros and cons of the 'client side' solution vs the 'Razor server side' solution?
First, you would have to define "performance".
However there is a very big difference between the two options:
if you do it client-side (with ko/ng/jQuery), the server only executes the API controller action and returns the formatted data.
if you do it server-side, apart from executing the API action, the server also has to execute the MVC controller action, so, undoubtedly, the server does more work.
The only conclusion is that the server has less work to do in the first case. And, usually, the network traffic is reduced (a JSON object is usually lighter than a rendered partial view).
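For reference, a sketch of the server-side flavour being compared; the URL, Product model and view are made up, and the client-side flavour would instead be a $.getJSON call plus templating in the browser:

    using System.Net.Http;
    using System.Threading.Tasks;
    using System.Web.Mvc;

    public class ProductsController : Controller
    {
        // Server-side option: the MVC action calls the Web API endpoint itself and
        // renders a Razor view, so the server runs both the API action and this one.
        public async Task<ActionResult> Index()
        {
            using (var client = new HttpClient())
            {
                var response = await client.GetAsync("http://localhost/api/products");
                var products = await response.Content.ReadAsAsync<Product[]>();   // needs the Web API client libraries
                return View(products);
            }
        }
    }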
If we're speaking about the user experience, in general client-side technologies (jQuery, ko, ng) offer a much better user experience because the page is much more responsive: it's easy to show/hide elements, set the focus, make trivial calculations, do remote validations... And if we use modern libraries, we can go further in improving the interface responsiveness. For example, breeze.js allows you to cache data on the client side to avoid making extra ajax calls to the server, giving a much more responsive experience, especially if you anticipate what data might be needed and cache it beforehand. You could even persist data in HTML5 storage so that it's available for other sessions.
So, from the user's viewpoint, I think the client-side option is much better. And the server has less work to do, which can also make it more responsive on high-traffic sites.
Even so, I can't say which one is "more performant", or even what it means "to be performant" here.
Whatever it is, using client side technologies is a much better option. But it takes some time to master the associated technologies.

Session State between Pages in MVC

Besides TempData, which I believe isn't the best thing to use nowadays, what are some best practices for persisting user data from page to page?
Do you usually just go back to the DB every time... ajax or not... do you make a request every time, or do you store it in, let's say, the Request object or some other in-process object instance?
I'm just looking for a broad range of ideas as I am overwhelmed with looking this up on the net...there's a LOT out there and it would be easier for me to get some insight via stack as well.
There are several options. If we're talking about within a single request, then of course ViewBag is the best choice. For the same page across requests, the best choice is probably hidden fields, unless it's sensitive data.
Between pages, then there are several options. You can of course pass data as query string parameters. Session also makes a convenient option, so long as the data size is small, and it's ok if the session gets lost (ie, you can get it again or regenerate it). In certain other situations, you can post data to another page using hidden fields, but this should largely be discouraged since you should prefer to use PRG (Post/Redirect/Get) pattern.
Some other options are using the context cache, HttpContext.Cache (which I feel is misnamed, but oh well), or saving it in temporary tables in the database. Any "in-memory" option will have scalability issues if you decide to move to a web farm (Session can be made to be backed by a database, but that slows things down).
It also depends on what data you're talking about. If it's user data, then another option is to store it in a cookie, or to use the user data portion of the Forms Authentication cookie, and create a custom IIdentity object.
Finally, there's just rebuilding the data from its source on every request.
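A small sketch of the Session option with the "regenerate if lost" caveat; the CatalogFilter model, session key and action names are invented:

    using System.Web.Mvc;

    public class CatalogController : Controller
    {
        private const string FilterKey = "CatalogFilter";   // made-up session key

        [HttpPost]
        public ActionResult SetFilter(CatalogFilter filter)   // CatalogFilter: hypothetical small model
        {
            Session[FilterKey] = filter;
            return RedirectToAction("List");   // PRG: post, redirect, then get
        }

        public ActionResult List()
        {
            // If the session was lost, fall back to rebuilding the data from its source.
            var filter = Session[FilterKey] as CatalogFilter ?? new CatalogFilter();
            return View(filter);
        }
    }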

how to implement pagination correctly?

I am currently working on an MVC4 application that accesses a set of WCF services which deliver content.
I have a page that lists products. This page has a pagination feature called infinite scrolling, so as you scroll down the page, products are loaded.
I am wondering what is the best way to achieve such a pagination feature. The data source is SQL Server. The options as I see them are:
Paginate at the SQL Server 2012 layer - returning only the required recordset and feeding it back up the stack through WCF and into the MVC application to display
As option 1, but also include caching at the WCF layer so that the recordset is cached long term. This will mean, though, that a number of recordsets will be stored in cache instead of one large one
Cache all the data, and paginate the cached items, returning the subset from the WCF services cached data
Note: I am using ASP.NET for caching.
So I am looking for feedback as to the best practice for this.
How big is your possible dataset? That, I think, would be the concern with caching. If it's feasible to hold it all in memory, then do option 3. I don't see the point of option 2, because if you cache long term you will most likely end up caching everything anyway. If you do want to implement option 2, I would cache for a short period of time (the timeframe would depend on how busy the site is).
When I tried this for test purposes I followed this guide: http://www.gavindraper.co.uk/2012/05/10/infinite-scroll-with-asp-net-mvc-4/
I don't know if it's the best way of implementing infinite scroll, but it's at least a proof of concept.
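For what it's worth, a sketch of the endpoint behind that kind of infinite scroll, paging at the data layer (option 1); the IProductRepository, ordering column and partial view name are assumptions:

    using System.Linq;
    using System.Web.Mvc;

    public class ProductsController : Controller
    {
        private readonly IProductRepository _repository;   // hypothetical IQueryable-based data access
        private const int PageSize = 20;

        public ProductsController(IProductRepository repository) { _repository = repository; }

        // Called via ajax by the infinite-scroll script as the user nears the bottom of the page.
        public ActionResult GetPage(int page)
        {
            var products = _repository.Query()              // IQueryable<Product>, so Skip/Take
                                      .OrderBy(p => p.Id)   // become paging in the SQL Server query
                                      .Skip(page * PageSize)
                                      .Take(PageSize)
                                      .ToList();
            return PartialView("_ProductList", products);
        }
    }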

Persisting complex data between postbacks in ASP.NET MVC

I'm developing an ASP.NET MVC 2 application that connects to some services to do data retrieval and update. The services require that I provide the original entity along with the updated entity when updating data. This is so it can do change tracking and optimistic concurrency. The services cannot be changed.
My problem is that I need to somehow store the original entity between postbacks. In WebForms, I would have used ViewState, but from what I have read, that is out for MVC. The original values do not have to be tamper proof as the services treat them as untrusted. The entities would be (max) 1k and it is an intranet app.
The options I have come up are:
Session - Ruled out - Store the entity in the Session, but I don't like this idea as there are no plans to share session state between the web servers
URL - Ruled out - Data is too big
HiddenField - Store the serialized entity in a hidden field, perhaps with encryption/encoding
HiddenVersion - The entities have a (SQL) version field on them, which I could put into a hidden field. Then on a save I get the "original" entity from the services and compare the versions, doing my own optimistic concurrency (roughly sketched below)
Cookies - Like 3 or 4, but using a cookie instead of a hidden field
I'm leaning towards option 4, although 3 would be simpler. Are these valid options or am I going down the wrong track? Is there a better way of doing this?
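For what it's worth, a minimal sketch of what option 4 might look like; the ICustomerService wrapper, CustomerEditModel view model and Version property are assumptions:

    using System.Web.Mvc;

    public class CustomerController : Controller
    {
        private readonly ICustomerService _service;   // hypothetical wrapper around the unchangeable services

        public CustomerController(ICustomerService service) { _service = service; }

        [HttpPost]
        public ActionResult Edit(CustomerEditModel posted)
        {
            // posted.Version round-trips in a hidden field, e.g. Html.HiddenFor(m => m.Version).
            var original = _service.GetCustomer(posted.Id);

            if (original.Version != posted.Version)
            {
                // Someone else saved in the meantime: surface a concurrency error.
                ModelState.AddModelError("", "This record was changed by another user.");
                return View(posted);
            }

            // The services want original + updated entities for their own change tracking.
            _service.UpdateCustomer(original, posted.ToEntity());
            return RedirectToAction("Details", new { id = posted.Id });
        }
    }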
If you do store it in a session then you need to ensure that, if you implement a web farm, the session is loaded correctly.
We have (exactly) the same question here at the moment and what we've decided to do is to implement the Repository Pattern and link it to a cookie.
Then, if this becomes an issue, we can simply slot in either a session manager, db manager or whatever and our code need not even know because of the repository pattern.
We tinkered with the idea of hidden fields, but it felt too much like ViewState and we all hated that in WebForms, so the idea was scrapped. And not just because we hated ViewState: there were issues when you pressed Ctrl+F5 - the contents would be cleared, and then what do you do?
So at this point its a repository pattern with a cookie which may change but the implementation lends itself kindly to change.
EDIT
We also decided against hidden fields because it would be too easy to make changes to them, so you need to do some token work on the server to ensure the data wasn't tampered with.
The hidden fields just kept on adding complexity to what essentially should have been a very simple problem.
At least those were our thoughts on the matter.
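Reading "link it to a cookie" as keying the stored copy off a cookie value, a sketch of that repository shape might be the following (all names are invented; a session- or database-backed implementation could later be slotted in behind the same interface):

    using System;
    using System.Web;

    // The controllers only ever see this interface, so the storage mechanism can change.
    public interface IOriginalEntityStore
    {
        void Save(string key, object original);
        object Load(string key);
    }

    public class CookieKeyProvider
    {
        private const string CookieName = "OriginalEntityKey";   // made-up cookie name

        // Gives each browser a stable key that the store implementation can use.
        public string GetOrCreateKey(HttpContextBase context)
        {
            var cookie = context.Request.Cookies[CookieName];
            if (cookie == null)
            {
                cookie = new HttpCookie(CookieName, Guid.NewGuid().ToString());
                context.Response.Cookies.Add(cookie);
            }
            return cookie.Value;
        }
    }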
I am not quite sure why Session is a bad idea. If the client does not need that backup copy, keeping the whole thing in server memory sounds best, since the rest of the candidates all send the data from the server to be placed (temporarily) in the client's browser, and then get it back whenever the client does any action. So whenever the client pings back, the server has to unpack the encoded data (from a hidden field, cookie, URL, etc.) and possibly place it on the server again anyway! It also wastes bandwidth, IMO.
OK, if the client needs (to inspect) the backup, I would consider a hidden field (or fieldset), or simply serializing the data as XML and putting it somewhere in the HTML.
EDIT
I still vote for Session. If you need to take care of a server farm, consider implementing a cross-server session state provider:
http://msdn.microsoft.com/en-us/library/ms178587%28VS.80%29.aspx
and simply store the state in a database.
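In web.config terms that boils down to something like this (the connection string is a placeholder; the session-state database itself is set up with the aspnet_regsql.exe tooling described in the linked article):

    <!-- Any server in the farm can load the session because it lives in SQL Server. -->
    <system.web>
      <sessionState mode="SQLServer"
                    sqlConnectionString="Data Source=MySessionDbServer;Integrated Security=SSPI"
                    cookieless="false"
                    timeout="20" />
    </system.web>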
