I have a search page on which I show results.
I configured Redis and am able to do Redis fragment caching for the result set, and it is working as expected.
In my application, depending on user preferences, some elements of the search results change for individual users. So what I need now is to cache per user.
I'm able to do this part as well with the statement below.
$redis.set("result_#{current_user.id}", rendered_results) if current_user  # rendered_results stands in for the fragment being cached
My question is: how can I delete all of this cache for every user? I won't know which users a cache entry was created for, and I also don't want to perform a SQL query on User to get all of those IDs.
Does redis-namespace help me here, and how can I delete a namespace and the keys under it?
Also, is it best practice to cache user-specific data like this?
Does Redis get cleared on each deployment? I mean, when does the Redis cache get cleared?
You can let Redis remove keys with the LRU algorithm. Configure the memory limit with the maxmemory directive, and then set maxmemory-policy to the value allkeys-lru. You can find more information about how to use Redis as a cache at redis.io/topics/lru-cache.
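If it helps, here is a minimal sketch of those two settings applied from Ruby with the redis-rb client; they are more commonly set once in redis.conf, and the 256mb limit is just a placeholder to tune for your server.
# Cap Redis memory and let it evict least-recently-used keys automatically,
# so the per-user result_* keys age out without you having to track user IDs.
$redis.config(:set, "maxmemory", "256mb")
$redis.config(:set, "maxmemory-policy", "allkeys-lru")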
I need to store a global/class variable that is updated by managers from a web dashboard. The variable will be an array; let's call it car_types. About once a week managers need to go in and change the value. So maybe they'll update from ['suv', 'convertible', 'sedan'] to ['suv', 'convertible'].
What I'm not sure on is where to store this variable.
I could certainly create a database table with one record in it that gets updated, but that seems like overkill.
We use memcached, so I could store the variable there, though I'm not sure that's persistent enough.
I was thinking of having the dashboard update a class variable, but we have dozens of servers running the same app, and I'm unclear if the change would be replicated to all boxes or just stay on one box.
thanks
Global variables are prefixed with $, for example: $cars
But what if your application goes down? The global variable is reinitialized to its default value.
I would recommend a database, possibly with caching if you want to preserve performance.
You could cache your database values in your $cars variable.
That's my personal approach: database + cache for records that aren't updated often. The cache is cleared when a change is made in the table, and it is created (with a DB request) during the first fetch of the record.
As a result it all works well: you keep the flexibility to change the records occasionally, no problems arise when the server goes down or with multi-threading, and the cache keeps performance from suffering.
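As a rough Rails sketch of that approach (the Setting model, its serialized car_types column, and the cache key are all hypothetical names, not anything from your app; it assumes the single settings row already exists):
class Setting < ActiveRecord::Base
  serialize :car_types, Array

  # First fetch hits the database and fills the cache; later fetches are served from the cache.
  def self.car_types
    Rails.cache.fetch("settings/car_types") { first.car_types }
  end

  # Called from the managers' dashboard; clearing the cache forces a fresh DB read next time.
  def self.update_car_types(types)
    first.update_attributes!(car_types: types)
    Rails.cache.delete("settings/car_types")
  end
end
With a memcached-backed Rails.cache, the cached value is shared across all of your app servers, so a change made on one box becomes visible everywhere once the cache key is deleted.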
I have an MVC4 website and I'm using the OutputCache to cache the result of a view that displays multiple pages of ranked results. The cached output varies on the Page parameter. The rankings are a function of time, so the results for any given page can be out of sync depending on when they're cached, which is exacerbated by the fact that I'm using an infinite-scroll mechanism where duplicate results can be shown if a result gets pushed to the next page.
The ideal solution to this problem would be to cache some reasonable number of pages all at once. This would require being able to check if the cached output is expired, re-generate cached results if they are expired and then return the cached response. Is this possible?
I should also note that I'm using OutputCaching with the Azure Output caching provider, and I have a dedicated caching role (Note: not the shared caching service).
Any help would be greatly appreciated.
This would require being able to check if the cached output is expired, re-generate cached results if they are expired and then return the cached response. Is this possible?
This is exactly how OutputCaching works: request a page; if it exists in the cache and isn't expired, it is retrieved from the cache; otherwise, the page is rendered and the cache is updated.
If the data really is this dynamic, you are probably creating more work/problems by caching the output without realizing any gains in performance (KISS applies here! Don't create a solution for a problem if you can avoid the problem in the first place).
However, a solution as you describe (if really required) could be architected with an Azure Queue and a Worker Role. Have your ratings engine stuff a value into the queue when a rating is added/updated. Then have the Worker Role poll the queue every second (for example) for values. If a value is found, have the Worker Role make a web request against the cached page. This will update the output cache if it has expired. However, you are still limited by the cache expiration, unless you do something like the approach from this SO post:
HttpResponse.RemoveOutputCacheItem() is probably the method you want to use. If you can figure out what name the actions are cached under, you can remove just the specific action (try setting a breakpoint or dumping all of the names of cached items to the screen).
I have a system I've built in MVC 3 that currently provides a yearly submission cycle where the system proceeds through a series of seven steps tied to dates stored in the web.config as AppSettings. However, each year I always have to roll the system back and forth between previous steps in order to accommodate the end users. I would like to give the administrator the ability to control the system status without having to contact a developer. What is the best way to do this?
I plan to build a page with proper validation that lets the administrator set the dates. I've considered a couple of options for how I should store those dates, but none of them seem right. Our entire permission system uses these dates, and various bits of text on the pages turn on and off based on what period we're currently in.
So far I've come up with two options:
Option 1: Create a database table – This was my first thought. I've set up properties on the MvcApplication class in the global.asax and pulled them from the database. Using a lazy loader, I can set the properties the first time they're needed. However, when they change in the database, I don't have a way to force the system to "reset" and read the date changes. If I do this on Begin_Request(), I'm constantly opening a connection and resetting the properties for every file the web browser requests from the server, regardless of whether it's static content or not.
I could fetch the dates directly from the database every time I need one of them, but then I'd have to redo a lot of functionality to reduce repeated database calls. I'd like to cache the dates for each request and only pull them when I need them.
Option 2: Allow editing a config file through the application – I've looked up how to split the web.config file so I can have a separate file that just contains the appSettings. Then I could just update the new config file from a controller action. I think this would work nicely, and not require me to rewrite any of the existing functionality, but it feels like I would be introducing a bad design pattern into the code.
I'd vote for the database. For the sake of performance you can cache those parameter values in a static class inside your app and provide a method to reread them from DB in the same class. So:
When a user makes a request, check whether those properties are already cached. If they are, use the cached values; if not, read them from the DB.
When the administrator changes those parameters, store them in the database and force your static caching class to reread them from the DB.
I would suggest an approach that doesn't care whether the settings are stored in a database or as key/value pairs in a config file.
Since you want the settings to be accessed globally by all users, you can cache them, and the cache implementation should be generic and distributed. There are plenty of online resources on how to create such an interface.
Since you want the cache to stay in sync with the underlying data, you have to set up cache dependencies (AppFabric doesn't support SQL cache dependency, see this thread, while NCache supports both SQL and file dependencies).
I would store the values in a database and use a distributed cache to persist the data across the web farm. MS AppFabric Caching has worked well for me. You will need to implement a standard caching pattern (check the cache; if null, load from the DB and insert into the cache). I would probably just create a static Load() method that abstracts this logic away. When the admins update the DB you could update the cache or just delete the cache key.
There are other considerations beyond performance. Namely, if you modify the config file the application pool is reinitialized, while the database solution doesn't cause application reinitialization.
...so do you need to reinitialize the app after the changes or not? If there is no way to avoid the initialization without drastic changes to the application, the config file solution is probably better.
In the process of looking at my logs from Mongrel, I found some SQL statements that I wanted to optimize. While looking into these, I noticed that these entries sometimes have CACHE in front of them, e.g.:
CACHE (0.0ms) SELECT * FROM `customers` WHERE (`customers`.`id` = 35)
Given the execution time, I'm assuming Mongrel really is caching this data. My question is how is this configured? I haven't been able to find much online about caching of model data; most of what I've read has to do with caching static pages or page fragments. I haven't done anything explicitly to enable this caching, so I'm really just looking for a pointer on how it's configured and how it works. Thanks in advance!
It doesn't actually have anything to do with Mongrel. Rails wraps every controller action in ActiveRecord::Base.cache by default. This means that, within the scope of that action, it will cache the results of queries and serve the results from the cache rather than hitting the database again. You should see an identical query higher in the log (within the same action) that is not prefixed with CACHE; that is the original query whose results were stored.
Some more details here.
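For what it's worth, you can wrap your own code in the same per-request query cache outside of a controller action (e.g. in a script or rake task); a minimal sketch, assuming a Customer model behind the customers table in your log:
ActiveRecord::Base.cache do
  Customer.find(35)   # hits the database; the result is stored in the query cache
  Customer.find(35)   # identical query, logged as CACHE and answered from memory
end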
I want to cache query results so that the same results are fetched for more than one request, until I invalidate the cache. For instance, I want to render a sidebar which lists all the pages of a book, much like the book's index. Since I want to show it on every page of the book, I have to load it on every request. I can cache the rendered sidebar using action caching, but I also want to cache the query results that are used to generate the HTML for the sidebar. Does Rails provide a way to do this? How can I do it?
You can cache the query results using ActiveSupport's cache store, which can be backed by an in-memory store, by memcached, or by a database store if you provide your own implementation. Note that you'll want a store that is shared (not the per-process memory store) if the cache is used by multiple Ruby processes, which will be the case if you're deploying to Mongrel Cluster or Phusion Passenger.
This Railscast has the details
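A minimal sketch of that kind of low-level caching for the sidebar, assuming a Book model with a pages association (the cache key and method names are made up):
def sidebar_pages(book)
  # Runs the query only on a cache miss; subsequent requests get the cached array.
  Rails.cache.fetch("books/#{book.id}/sidebar_pages") do
    book.pages.to_a
  end
end

def expire_sidebar_pages(book)
  # Call this when the book's pages change (e.g. from a callback or sweeper).
  Rails.cache.delete("books/#{book.id}/sidebar_pages")
end
Rails.cache here is the same ActiveSupport cache store mentioned above, so with a memcached backend the cached result is shared across processes.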
You could also check for your action cache before querying the database in the controller.