How many users is too many when using session variables? - asp.net-mvc

I'm currently using session variables as a cache to cut down on calls to the database.
I'm wondering at roughly how many concurrent users this stops working: 1,000? 10,000? 100,000? Also, will IIS start flaking out at a certain load? And are there any alternatives?
I know it depends on how much data I'm storing per user, but I want to hear about other people's experiences.
I do have it set up so that when the code tries to access a timed-out session, it reloads the data from the database.
I'm currently using IIS 6, but I could easily move to IIS 7 if it handles sessions better.
Edit: Yes, I'm using application variables for non-user-specific data.

If this is a concern for you, use a State Server or the SQL Server storage option for session state. For almost all applications, though, it will not prove to be a problem.

You should probably be looking at Memcached if you're getting to this point.

If you have more than 124,889 users, your server will begin to be unresponsive.
Edit: if your data does not change and can be reused (i.e. reference data), try caching it in an application-scoped variable.
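For instance, a minimal sketch of caching reference data in application state (the GetCountriesFromDatabase helper is a hypothetical stand-in for your own data-access call):

using System.Collections.Generic;
using System.Web;

public static class ReferenceData
{
    // Loaded once and shared by every user of the application.
    public static IList<string> Countries
    {
        get
        {
            var app = HttpContext.Current.Application;
            if (app["Countries"] == null)
            {
                app.Lock();    // guard against two requests loading it at once
                try
                {
                    if (app["Countries"] == null)
                        app["Countries"] = GetCountriesFromDatabase();
                }
                finally
                {
                    app.UnLock();
                }
            }
            return (IList<string>)app["Countries"];
        }
    }

    // Hypothetical placeholder for the real database call.
    private static IList<string> GetCountriesFromDatabase()
    {
        return new List<string> { "Denmark", "Germany", "Sweden" };
    }
}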

It's very unlikely that session variable capacity will ever be the limiting resource for your server. Any particular reason you're asking?

How about using the Cache instead? It allows automatic cache invalidation.
Items can be invalidated based on a timeout, or evicted when the server comes under memory pressure.
You can use the Cache on a per-user basis by giving each item a user-specific key.
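A minimal sketch of that approach (UserProfile and LoadProfileFromDatabase are hypothetical placeholders for your own type and data access):

using System;
using System.Web;
using System.Web.Caching;

public static class ProfileCache
{
    public static UserProfile GetProfile(string userId)
    {
        string key = "profile:" + userId;               // user-specific key
        var profile = HttpRuntime.Cache[key] as UserProfile;
        if (profile == null)
        {
            profile = LoadProfileFromDatabase(userId);
            HttpRuntime.Cache.Insert(
                key,
                profile,
                null,                                    // no cache dependency
                Cache.NoAbsoluteExpiration,
                TimeSpan.FromMinutes(20));               // sliding expiration; tune to taste
        }
        return profile;
    }

    // Hypothetical placeholders for the real type and database call.
    public class UserProfile { public string UserId; public string DisplayName; }

    private static UserProfile LoadProfileFromDatabase(string userId)
    {
        return new UserProfile { UserId = userId, DisplayName = "Example" };
    }
}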

Related

Is it strange to let one user refresh the server cache when it affects all users?

I'm displaying a lot of data on a website that won't change often. Because of this, I'm caching the data in HttpRuntime.Cache, which I understand to cache data for all users of the website.
However, I also want to offer the ability to force a refresh in case the cached data becomes stale. Since the data is cached for all users, a forced refresh affects everyone using the site at that moment. Is this a common pattern? It seems like strange behavior for a site to exhibit, especially since one user could slow everyone down by constantly forcing cache refreshes. Client-side caching still doesn't make sense, since the data is the same for all users.
Caching data visible to all users is extremely common, and it is good practice. However, giving users the ability to refresh the cache is pretty rare. The better path would be to expire your cache when data is saved that would change the contents of a cached page.
Smart cache invalidation means that your users always see the freshest data, but they also get the benefits of your caching efforts. Ideally, you're only expiring the pages affected by a change - not your entire cache.
I think it would be careless to allow a normal user to invoke a "clear cache" operation. Your cacheable data should have some sort of dependency or expiration policy defined. See: Cache Expiration Policies.
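A minimal sketch of expiring a shared cache entry from the save path instead of exposing a refresh button to users (ReportData and LoadReportFromDatabase are hypothetical placeholders):

using System;
using System.Web;
using System.Web.Caching;

public static class ReportCache
{
    private const string Key = "report-data";

    public static ReportData Get()
    {
        var data = HttpRuntime.Cache[Key] as ReportData;
        if (data == null)
        {
            data = LoadReportFromDatabase();
            HttpRuntime.Cache.Insert(
                Key,
                data,
                null,                                    // could also be a SqlCacheDependency
                DateTime.UtcNow.AddMinutes(30),          // absolute expiration as a safety net
                Cache.NoSlidingExpiration);
        }
        return data;
    }

    // Call this from the code path that saves changes, so the next request
    // rebuilds the cache with fresh data.
    public static void Invalidate()
    {
        HttpRuntime.Cache.Remove(Key);
    }

    // Hypothetical placeholders.
    public class ReportData { }

    private static ReportData LoadReportFromDatabase()
    {
        return new ReportData();
    }
}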

ASP.NET MVC - how would I maintain user state in Azure in my application

I know that there are a few questions like this, but this question is about my specific situation.
I'm developing a platform for taking tests online. A test is a set of images with associated questions. It's hosted on Azure and built with MVC 4.
What I'd love is that if a user has taken half the test and the browser crashes, or something else pulls them away from the test, they get the option to resume when they come back.
I have one idea myself, but I'd like to know if there are other options. I was considering using localStorage: when a user starts a test, the test information is saved in localStorage, and every time they move on to a new image, the local state is updated. Then, when the test player loads, it checks whether any ongoing tests are available.
How could I do it? Has anyone dealt with a similar problem, or found a solution?
Local Storage is not a good choice, because it is specific to each instance. That means if you have two instances of a Web Role (the recommended minimum), then each instance would have its own local storage. They are not shared, and there is no way to access local storage on a specific machine.
You really have two options. You could use a database like SQL Azure, or use Azure caching. Azure caching is probably easier, since it's super easy to serialize/deserialize complex objects, but the downside is that caching is only valid for 72 hours. If a cached object isn't accessed/updated in 72 hours, it gets purged.
I would not recommend storing this information in the client's browser. The user has access to local storage, cookies, etc., and could modify them. You could store the test start time in your database on the server. Then, every time the user sends a request to answer a question, you verify whether the test is still active or the maximum allowed time has elapsed.
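A rough sketch of that server-side approach (the TestDbContext, the TestAttempt entity, and the 60-minute limit are hypothetical, not taken from the answers above):

using System;
using System.Web.Mvc;

public class TestPlayerController : Controller
{
    private readonly TestDbContext _db = new TestDbContext();   // hypothetical EF context

    [HttpPost]
    public ActionResult Answer(int attemptId, int questionId, string answer)
    {
        var attempt = _db.TestAttempts.Find(attemptId);
        if (attempt == null || attempt.UserName != User.Identity.Name)
            return HttpNotFound();

        // Reject answers once the allowed time has elapsed.
        if (DateTime.UtcNow - attempt.StartedUtc > TimeSpan.FromMinutes(60))
            return RedirectToAction("TimedOut");

        // Persist progress on every answer so the test can be resumed
        // from any instance after a crash (saving the answer text itself
        // is omitted here for brevity).
        attempt.LastAnsweredQuestionId = questionId;
        _db.SaveChanges();

        return RedirectToAction("Question", new { attemptId, questionId = questionId + 1 });
    }
}

Because the progress lives in the database rather than in localStorage or on a single role instance, any instance can serve the resume request.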

Broadcast an admin message to each session user

I have a requirement to inform every user to save their work and log out, so that an admin can reset IIS or make changes to the ASP.NET MVC application server.
Looping through the session object collection is not thread safe, as far as I've learned.
Any other ideas?
And even if I can get hold of the active sessions, how do I send a message to those clients?
Thanks in advance.
Save the message in a database and query the database on every request to see if a message exists.
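One way to do that is a global action filter that checks for a pending message on each request (the MaintenanceMessages lookup below is a hypothetical stand-in for your database query):

using System.Web.Mvc;

// Register with GlobalFilters.Filters.Add(new AdminMessageAttribute()) in Global.asax.
public class AdminMessageAttribute : ActionFilterAttribute
{
    public override void OnActionExecuting(ActionExecutingContext filterContext)
    {
        string message = MaintenanceMessages.GetCurrent();
        if (!string.IsNullOrEmpty(message))
        {
            // The layout view can render this as a banner on every page.
            filterContext.Controller.ViewBag.AdminMessage = message;
        }
    }
}

// Hypothetical placeholder for reading the newest unexpired message from the database.
public static class MaintenanceMessages
{
    public static string GetCurrent() { return null; }
}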
This seems like a poorly-defined requirement.
Serious maintenance should be done at a specific time, and users should be alerted to that time window well in advance.
Simply restarting IIS is a pretty quick procedure... is there any reason users would lose their work over a simple IIS restart? While I've been filling out this Stack Overflow answer, for instance, they could have restarted the server a dozen times. Once I hit Post, if the server is down, it'll either time out and leave my work in the textarea, or it will connect successfully if the server is back in time.
If I'm not submitting data, but just clicking a link, the same applies: either the browser times out, in which case a simple refresh is enough once the server is back up, or it eventually takes the user where they want to go.
If you're doing pure AJAX requests you will need to handle a missing server yourself, rather than relying on the browser to do it, but you'd need to work that out anyway because of the Eight Fallacies of Distributed Computing #1: "The network is reliable." (see http://en.wikipedia.org/wiki/Fallacies_of_Distributed_Computing)
So, I'd actually push back on that requirement. They're asking you to do something that won't really meet the need (users don't lose data, have a reasonably good experience), that will become complicated, and that will be a brittle solution in the end.
Sounds like a case for SignalR!
https://github.com/SignalR/SignalR
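A minimal sketch of broadcasting a maintenance notice with SignalR (this assumes SignalR 2.x; the hub, method, and controller names are made up for illustration):

using System.Web.Mvc;
using Microsoft.AspNet.SignalR;

// Clients connect to this hub through the SignalR JavaScript client.
// SignalR itself must be mapped at startup (e.g. app.MapSignalR() in an OWIN Startup class).
public class MaintenanceHub : Hub
{
}

public class AdminController : Controller
{
    [HttpPost]
    public ActionResult Broadcast(string message)
    {
        // Push the message to every connected client; each page can show it as a banner.
        var hub = GlobalHost.ConnectionManager.GetHubContext<MaintenanceHub>();
        hub.Clients.All.showAdminMessage(message);
        return new EmptyResult();
    }
}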

Persisting ActiveRecord objects across requests in Ruby on Rails

I am trying to figure out a way to persist ActiveRecord objects across requests in the most efficient way possible.
The use case is as follows: a user provides some parameters and clicks "show preview". When that happens, I am doing a lot of background computation to generate the preview. A lot of ActiveRecord objects are created in the process. After seeing the preview, the user clicks "submit". Instead of recomputing everything here, I would like to simply save the ActiveRecord objects created by the previous request. There is no guarantee that these two requests always happen (e.g. the user may opt out after seeing the preview, in which case I would like to remove these objects from the persistence layer).
Are there any proven efficient ways to achieve the above? Seems like it should be a common scenario. And I can't use sessions since the data can exceed the space allotted to session data. Moreover, I'd rather not save these objects to the DB because the user hasn't technically "submitted" the data. So what I am looking for is more of an in-memory persistence layer that can guarantee the existence of these objects upon executing the second request.
Thanks.
You can save yourself a lot of unnecessary work by just saving it to the DB and not adding other not-really-persistent layers to your app.
A possible approach: use a state attribute that tells what state your record is in (e.g. "draft", "committed"). Then have a garbage-collection job run that deletes drafts (and their associated records) which haven't been committed within a specific timeframe.
I'm not sure that avoiding saving the object would be the best option, since you could manage this with some sort of control attribute like state or status.
Having this would also be pretty useful, since you could validate data along the way rather than only when the user decides to submit everything. I know Ryan Bates has a screencast on building these sorts of multi-step forms (http://railscasts.com/episodes/217-multistep-forms).
Hopefully that helps.
Is the reason the data can exceed the space allotted to session data that you're using cookie-based sessions? If you need more space, why not use ActiveRecord-based sessions? It's trivial to switch from cookie-based sessions, and it's actually the recommended approach (why it's not the default, I don't know).

Do I need to store session state when using AspProviders, MVC and Azure

My application uses MVC 3 connected to back-end Azure Table storage. I use the AspProviders for login and logout.
There are times when I can confuse the AspProviders such that I will see the problems listed here.
What I would like to know is why my application even needs to store session state. The way my application works, every page call is independent and could be sent to any running instance. With this in mind, am I adding additional overhead by storing session data, and is it really needed?
I hope someone out there can give me some advice.
Thanks,
Jon Wiley
You do not need session state, and you can even tell the application explicitly that your controller will not be using session state at all:
[SessionState(System.Web.SessionState.SessionStateBehavior.Disabled)]
public class MySessionlessController : Controller
{
...
}
Just remember that there are little "gotchas" that you might run into (TempData, for example, relies on session state by default).
Hope this helps.
If none of your application code uses any Session storage, then you can simply remove the Session provider from your web.config
Please also be careful using the AspProviders from the original PDC08 samples - these were never fully QAed to a commercial level.
