Is it strange to let one user refresh the server cache when it affects all users? - asp.net-mvc

I'm displaying a lot of data on a website that won't change often. Because of this, I'm caching the data in HttpRuntime.Cache, which I understand caches data for all users of the website.
However, I also want to offer the ability to force a refresh in case the cached data becomes stale. Since the data is cached for all users, one user forcing a refresh would affect everyone else using the site at the time. Is this a common pattern? It seems like strange behavior for a site to exhibit, especially since one user could slow everyone down by constantly forcing cache refreshes. Client-side caching still doesn't make sense here, since the data is the same for all users.

Caching data visible to all users is extremely common, and is a good practice. However, giving users the ability to refresh the cache is pretty rare. The better path is to expire your cache when data is saved that would change the contents of a cached page.
Smart cache invalidation means that your users always see the freshest data, but they also get the benefits of your caching efforts. Ideally, you're only expiring the pages affected by a change - not your entire cache.
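As a rough illustration of that invalidate-on-save pattern with HttpRuntime.Cache, here's a minimal sketch; the key name and loader method are made up for the example, not taken from the question:

    using System;
    using System.Collections.Generic;
    using System.Web;

    public static class ReportCache
    {
        private const string CacheKey = "ReportData"; // illustrative key name

        // Returns the shared cached copy, loading from the database on a miss.
        public static IList<string> Get()
        {
            var cached = HttpRuntime.Cache[CacheKey] as IList<string>;
            if (cached != null)
                return cached;

            IList<string> fresh = LoadReportFromDatabase();
            HttpRuntime.Cache.Insert(CacheKey, fresh);
            return fresh;
        }

        // Call this from the code path that saves changes to the underlying
        // data, so every user sees fresh data on the next request.
        public static void Invalidate()
        {
            HttpRuntime.Cache.Remove(CacheKey);
        }

        private static IList<string> LoadReportFromDatabase()
        {
            // Stand-in for the real database query.
            return new List<string> { "row 1", "row 2" };
        }
    }

The key point is that invalidation is triggered by the save, not by a user-facing button, so the cache is only ever cleared when the data actually changed.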

I think it would be careless to give a normal user the ability to invoke a "clear cache" operation. Your cacheable data should have some sort of dependency or expiration defined. See: Cache Expiration Policies
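For example, a sketch of what a defined expiration policy can look like (the ten-minute window and the key name are assumptions for the example, not from the question):

    using System;
    using System.Web;
    using System.Web.Caching;

    public static class CachePolicyExample
    {
        // Insert the data with a ten-minute absolute expiration, so stale
        // entries age out on their own instead of requiring a user-driven
        // "clear cache" action.
        public static void CacheWithExpiration(object data)
        {
            HttpRuntime.Cache.Insert(
                "ReportData",                    // illustrative key
                data,
                null,                            // no CacheDependency
                DateTime.UtcNow.AddMinutes(10),  // absolute expiration
                Cache.NoSlidingExpiration);      // no sliding window
        }
    }

The third parameter also accepts a CacheDependency (e.g. on a file or, via SqlCacheDependency, on a table), which evicts the entry the moment the underlying source changes.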

Related

ServiceWorker on network response strategy not working

I would like to implement the "on network response" caching strategy. But once the cache is populated, it never updates again, so I was wondering whether I have to clear the cache manually for this strategy to work.
The docs say this strategy is ideal for frequently updating resources such as a user's inbox or article contents, and also useful for non-essential content such as avatars, though care is needed.
My content is never updated unless I clear the cache. Is something wrong with my understanding?

PWA offline capability on data that frequently changes

Is the offline capability of Progressive Web Apps a good idea for applications that display frequently changing data, like a bank account balance?
If the user is using the PWA in offline mode and navigates, for example, to the bank product balances section, he is actually viewing outdated data about his balances, and may be allowed to perform operations based on data that may not be current.
Am I missing something about this approach (PWA) for data that changes frequently?
A PWA doesn't mean you capture the entire page. As a developer, you choose what you want to cache. Two types of caching can be done:
1) Static content cache, aka app shell cache - your HTML/CSS/JS and image files. The service worker can refresh these when they change (this happens in the background without the user needing to do anything). This can be done even for sites like a bank transaction page.
2) API data cache - this is where you cache dynamic data, such as the JSON response from your web service. Even this can be implemented for a bank transaction page if the information is displayed responsibly. Say, on top of the transactions, you display a message like "Transactions as of June-6th-2018 5.11PM" in a clearly visible way, so the user knows he is not seeing real-time data but might still be happy to see the old transactions if that's what he is looking for.
Or you can skip caching dynamic data such as API responses or server-rendered HTML containing such data, and cache only what is static.
At the end of the day, it's you as the developer who decides what to cache, and caching something will give you an improvement over no cache, even on such a dynamic-content site.
Here is Google's doc explaining both.

How to ensure data consistency and truly take advantage of Core Data?

I've worked on several iOS apps; some of them utilize Core Data, and some don't. While I consider myself to have a basic to somewhat good understanding of Core Data, there's always something that makes me doubt its usefulness. I've done a lot of reading on the subject, and the general consensus seems to be that the advantages of using it outweigh the disadvantages. I recently submitted an app without using Core Data, and always planned to go back and update the project to utilize it when I had time for some optimization work. Now's the time, but I wonder if it makes sense for the app I'm working on, and whether I have been using it correctly all along. Please advise and point out what I am missing.
The project I am working on is a social networking app, which also has a front-end site. We have standard features like a newsfeed, event listings, the ability to follow/unfollow someone, and a map with POIs at the user's location. Currently, we're using pagination whenever needed when requesting data from the server.
My understanding of why Core Data is great:
Easier to manage data with complicated relationships
Easier to access data without having to pass them around
Easier to manipulate, fetch, and sort your data
Better memory utilization
Improve perceived performance by presenting locally stored data until the latest data is received
The problem I am having is that, since I am using pagination instead of requesting all the data at once, the data stored locally is only a subset of the current state in the database. Let's use the newsfeed as an example. If I do the following, it will cause some problems:
User manually refreshes the newsfeed -> Controller notifies the model that it needs the latest 20 items -> Model requests the latest 20 items in the newsfeed and saves them as NSManagedObjects -> Model notifies the controller that the data is ready -> Fetch the latest 20 items to show in the UITableView
Say user A refreshes the newsfeed and backgrounds the app, and then user B deletes his post in the newsfeed (say it was the 10th item) before user A foregrounds the app again and refreshes the newsfeed. In user A's newsfeed, B's post will still be up there, because according to the createdAt attribute it is indeed one of the latest 20 items in the local store.
To fix this problem, I can think of a few solutions:
Add a flag to the item to indicate it's removed
Always discard local data when new data arrives
Disable pagination
Instead of using the workflow described above, always present only the data returned by the request, rather than fetching the latest items from the local store
Solution 1 means custom code is required to deal with different clients, since the browser doesn't need deleted items but the iOS client does. Even though it can work, it can potentially mess up the pagination mechanism and cause weird behaviours in the client. For example, if a large number of items gets removed, the latest 20 items will contain only a few items that actually show up in the newsfeed when the user refreshes it. And as the user follows more people, more stories will be inserted into his newsfeed as well. This solution won't work very well in that situation.
Solution 2 totally defeats the purpose of using Core Data in the first place unless I am missing something.
Solution 3 means the client always needs to request all the data. This is nearly impossible, because as the data grows, the time needed to retrieve and process it will make the app slow and unresponsive. It also doesn't make sense from a technical or UX point of view.
Solution 4 also somewhat defeats the purpose of using Core Data, because it's the same workflow as when we only store data in memory. You can still fetch and find objects, but they might already be invalid on the server at the time of access.
Am I missing something? How is Core Data supposed to be used in this scenario? How do you ensure data consistency when the client doesn't have all the data? Thank you in advance.

Programmatically caching a bunch of pages all at the same time using the Output Cache

I have an MVC4 website and I'm using the OutputCache to cache the result of a view that displays multiple pages of ranked results. The cached output varies on the Page parameter. The rankings are a function of time, so the results for any given page can be out of sync depending on when they're cached, which is exacerbated by the fact that I'm using an infinite-scroll mechanism where duplicate results can be shown if a result gets pushed to the next page.
The ideal solution to this problem would be to cache some reasonable number of pages all at once. This would require being able to check if the cached output is expired, re-generate cached results if they are expired and then return the cached response. Is this possible?
I should also note that I'm using OutputCaching with the Azure Output caching provider, and I have a dedicated caching role (Note: not the shared caching service).
Any help would be greatly appreciated.
This would require being able to check if the cached output is expired, re-generate cached results if they are expired and then return the cached response. Is this possible?
This is exactly how OutputCaching works: request a page; if it exists in the cache and isn't expired, retrieve it from the cache; otherwise, render the page and update the cache.
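For context, a page-varying output cache declaration along the lines the question describes might look like this (the controller and action names and the 60-second duration are assumptions, not details from the question):

    using System.Web.Mvc;

    public class ResultsController : Controller
    {
        // Cache the rendered output for 60 seconds, keeping a separate
        // cached copy per value of the Page parameter.
        [OutputCache(Duration = 60, VaryByParam = "Page")]
        public ActionResult Index(int page = 1)
        {
            return View(GetRankedResults(page));
        }

        private object GetRankedResults(int page)
        {
            return page; // stand-in for the real ranking query
        }
    }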
If the data really is this dynamic, you are probably creating more work/problems by caching the output without realizing any gains in performance (KISS applies here! Don't create a solution for a problem if you can avoid the problem in the first place).
However, to architect a solution as you describe (if really required), you could use an Azure Queue and a Worker Role. Have your ratings engine put a value in the queue when a rating is added/updated. Then have the Worker Role poll the queue every second (for example) for values. If a value is found, have the Worker Role make a web request against the cached page. This will update the output cache if it has expired. However, you are still limited by the cache expiration, unless you do something like this (from this SO post):
HttpResponse.RemoveOutputCacheItem() is probably the method you want to use. If you can figure out what name the actions are cached under, you can remove just the specific action (try setting a breakpoint or dumping all of the names of cached items to the screen).
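A sketch of how that might look when a rating changes; the route is a guess at how the results action could be cached, and note that RemoveOutputCacheItem targets the default ASP.NET output cache, so whether it reaches a custom provider like the Azure one is worth verifying:

    using System.Web;

    public class RatingsChangeHandler
    {
        // Evict the cached output for the affected page as soon as its
        // rankings change, instead of waiting for the duration to elapse.
        public void OnRatingChanged(int page)
        {
            // The virtual path must match how the action is actually routed.
            HttpResponse.RemoveOutputCacheItem("/Results/Index/" + page);
        }
    }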

How many users is too much when using session variables?

I'm currently using session variables as a cache to cut down on calls to the database.
I'm wondering at roughly how many concurrent users this stops working: 1,000? 10,000? 100,000? Also, will IIS start flaking out at a certain load? And are there any alternatives?
I know it depends on how much data I'm storing per user, but I want to hear about other people's experiences.
I do have it set up so that when the code tries to access a timed-out session, it reloads from the database.
I'm currently using IIS 6, but I could easily move to IIS 7 if it handles sessions better.
Edit: Yes, I'm using application variables for non-user-specific data.
If this is of any concern to you, use a State Server or the SQL storage option for session state. For almost all applications, though, it will not prove to be a problem.
You should probably be looking at Memcached if you're getting to this point.
If you have more than 124,889 users, your server will begin to be unresponsive.
Edit: if your data does not change and can be re-used, try caching it in an application-scoped variable, i.e. reference data.
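A small sketch of that reference-data pattern; "Countries" and the loader are made-up names for illustration:

    using System.Web;

    public static class ReferenceData
    {
        // Reference data is identical for every user, so store it once at
        // application scope instead of once per session.
        public static object GetCountries(HttpApplicationState app)
        {
            if (app["Countries"] == null)
            {
                app.Lock();   // application state requires explicit locking
                try
                {
                    if (app["Countries"] == null)
                        app["Countries"] = LoadCountries();
                }
                finally
                {
                    app.UnLock();
                }
            }
            return app["Countries"];
        }

        private static object LoadCountries()
        {
            return new[] { "France", "Japan", "Peru" }; // stand-in data
        }
    }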
It's very unlikely that session variable capacity will ever be the limiting resource for your server. Any particular reason you're asking?
How about using the Cache instead? It allows automatic cache invalidation.
Cache invalidation can happen both based on a timeout and due to an item being "kicked out" under resource pressure.
You can use the Cache on a per-user basis by giving each item a user-specific key.
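A minimal sketch of the per-user-key idea; the key format and the 20-minute sliding window are assumptions for the example:

    using System;
    using System.Web;
    using System.Web.Caching;

    public static class PerUserCache
    {
        // Store per-user data in the shared Cache under a user-specific key.
        // Entries expire after 20 idle minutes, or earlier if ASP.NET evicts
        // them under memory pressure.
        public static object GetOrLoad(string userId, Func<object> load)
        {
            string key = "UserData:" + userId;
            object value = HttpRuntime.Cache[key];
            if (value == null)
            {
                value = load();
                HttpRuntime.Cache.Insert(
                    key, value, null,
                    Cache.NoAbsoluteExpiration,
                    TimeSpan.FromMinutes(20)); // sliding expiration
            }
            return value;
        }
    }

Unlike session state, the Cache degrades gracefully: under load, ASP.NET simply drops entries and the code above falls back to reloading from the database.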
