Options for caching data from the database? - asp.net-mvc

My code already has things like this in the controller:
[OutputCache(Duration = 86400)]
public string SelectTopics(bool showAll = true, string topicID = null)
{
    return SelectHelper.Topics(showAll, topicID);
}
Am I correct in saying this will be cached?
What about external, non-controller classes and methods? Is there any way to cache database results in those, or must all database caching go through a controller? What about other ways/means of accessing the data? Is there anything else that allows caching?

Am I correct in saying this will be cached?
Yes. This caches the entire HTML output, which means the controller action will not even be executed if the output is cached, and in that case SelectHelper.Topics won't run.
Other ways of caching data involve the classes in the System.Runtime.Caching namespace. They allow you to store .NET objects in a cache. For example, you could store the results retrieved from a database call in the cache and then check whether the cache contains those results before hitting the database the next time. Take a look at the MemoryCache class for an example.
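Here is a minimal sketch of that check-the-cache-first pattern with MemoryCache; the TopicRepository class and the LoadTopicsFromDatabase method are hypothetical stand-ins for whatever data access code you already have:
using System;
using System.Collections.Generic;
using System.Runtime.Caching;

public class TopicRepository
{
    private static readonly ObjectCache Cache = MemoryCache.Default;

    public IList<string> GetTopics()
    {
        const string cacheKey = "topics";

        // Serve from the cache when possible.
        var cached = Cache.Get(cacheKey) as IList<string>;
        if (cached != null)
        {
            return cached;
        }

        // Cache miss: hit the database and keep the result for an hour.
        IList<string> topics = LoadTopicsFromDatabase();
        Cache.Set(cacheKey, topics, new CacheItemPolicy
        {
            AbsoluteExpiration = DateTimeOffset.Now.AddHours(1)
        });
        return topics;
    }

    private IList<string> LoadTopicsFromDatabase()
    {
        // Placeholder for the real database call.
        return new List<string> { "example" };
    }
}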

Related

How to fix connection close issue in synchronised method?

In our application we use the Grails framework and SQL Server for the database. We have multiple sites, and each site can have multiple users (a few users). If those users access the same method via AJAX it can cause issues, so we made that method synchronized. To minimize database interaction we store data in a map per site, since all the users from one site get the same data; if the data is older than 10 seconds we fetch it from the database and update the map object. We are now getting a lot of database connection closed errors on the very first line of the synchronized method, where we get the site object from the database. What is the issue here and how can we resolve it?
def synchronized getData(params) {
    Site site = Site.get(params.siteId)
    // Here we check whether the site data does not exist in the map,
    // or whether the data has expired (is more than 10 seconds old);
    // if so, we get the data from the database and update the map object.
    // Then we create a new list object from the data in the map object.
    return list
}
Difficult to figure out the exact problem without more information here. Several things stand out...
I'm not especially familiar with using the synchronized keyword in front of a service method; I would recommend trying the @Synchronized annotation with a static object key:
private static final myLock = new Object()

@Synchronized("myLock")
void getData() {
    // do stuff
}
or synchronizing explicitly within the method
void getData() {
    synchronized(myLock) {
        // do stuff
    }
}
I don't know if that's related to your connection closing issues, but worth a try.
But also notably, Grails and Hibernate provide caching of database retrievals, so if you're loading the same data that's already in the Hibernate cache, you don't need to cache it in a Map locally... Grails is already doing that for you. Site site = Site.get(params.siteId) will NOT make a database call if it has been called recently and the object is already cached by the framework.
I would strongly suggest running some performance checks comparing just making that call vs. caching in a Map object, especially if you're expiring the data after ~10s anyway.

Where to store a Doctrine variable created in a component so that it's accessible anywhere?

Note I am referring to one request, and not several requests and sessions.
I have several components that require a Doctrine user object; some are located in the layout, others in templates. Sometimes I need that Doctrine user object in an action too. Currently I have added a function to the sfUser class that loads that object from the database, which means every time I call that function I make a call to the DB. I'd like to know where to store this object so that I can access it without having to query the DB every time I need it. Again, we're talking about a single request, not several requests or something that would require a session.
Can I save it in sfContext somehow? Any other places so that it can be available everywhere?
You can store it in your model's Table class, because tables are always accessed as singletons.
class sfGuardUserTable extends PluginsfGuardUserTable
{
    protected $specialUser = null;

    public function getSpecialUser()
    {
        if (null === $this->specialUser)
        {
            $this->specialUser = $this->findOneById(1);
        }

        return $this->specialUser;
    }
}
Now, you can use this in actions and components like this:
$u = sfGuardUserTable::getInstance()->getSpecialUser();
And you will always end up with one query.
You can also configure the Doctrine cache so that the result of this specific query is always cached. What is so good about it is that if you use, say, the APC backend, you will have it cached across requests. You also get query caching as a bonus (this is not result caching; read the link I provided carefully)!

How can I store user information in MVC between requests

I have an ASP.NET MVC 2 site using Windows authentication.
When the user requests a page I pull some user information from the database. The class I retrieve is a Person class.
How can I get this from the database when the user enters the site, and then pick up the same class without touching the DB on subsequent page requests?
I must admit, I am pretty lost when it comes to session handling in ASP.net MVC.
You can store that kind of information in HttpContextBase.Session.
One option is to retrieve the Person object from your database on the first hit and store it in System.Web.HttpContext.Current.Cache, this will allow extremely fast access and your Person data will be temporarily stored in RAM on the web server.
But be careful: if you are storing a significantly large amount of user data in this way, you could eat up a lot of memory. Nevertheless, this will be perfectly fine if you only need to cache a few thousand objects or so. Clearly, it depends on how many users you expect to be using your app.
You could add like this:
private void CachePersonData(Person data, string storageKey)
{
    if (HttpContext.Current.Cache[storageKey] == null)
    {
        HttpContext.Current.Cache.Add(storageKey,
                                      data,
                                      null,
                                      Cache.NoAbsoluteExpiration,
                                      TimeSpan.FromDays(1),
                                      CacheItemPriority.High,
                                      null);
    }
}
... and retrieve like this:
// Grab data from the cache
Person p = HttpContext.Current.Cache[storageKey] as Person;
Don't forget that the object returned from the cache could be null, so you should check for this and load from the database as necessary (then cache).
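A minimal sketch of that check-then-load pattern, reusing the CachePersonData method shown above; LoadPersonFromDatabase is a hypothetical stand-in for your data access code:
private Person GetPerson(string storageKey)
{
    // Try the cache first.
    Person p = HttpContext.Current.Cache[storageKey] as Person;
    if (p == null)
    {
        // Not cached (or expired): load from the database and cache it.
        p = LoadPersonFromDatabase();
        CachePersonData(p, storageKey);
    }
    return p;
}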
First of all, if you are in a load-balanced environment, I wouldn't recommend any solution that doesn't store the data in a database, because anything kept in one server's memory will eventually fail you.
If you are not in a load balancing environment, you can use TempData to store your object and then retrieve it in the subsequent request.
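A minimal sketch of the TempData approach; LoadPersonFromDatabase is a hypothetical stand-in for your data access code, and note that TempData only survives until it is read in a subsequent request:
using System.Web.Mvc;

public class PersonController : Controller
{
    public ActionResult First()
    {
        // Store the object for the next request.
        TempData["Person"] = LoadPersonFromDatabase();
        return RedirectToAction("Second");
    }

    public ActionResult Second()
    {
        // Read it back; it is removed from TempData once read.
        var person = TempData["Person"] as Person;
        return View(person);
    }

    private Person LoadPersonFromDatabase()
    {
        // Placeholder for the real database call.
        return new Person();
    }
}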
You can also store it directly in the session: HttpContext.Current.Session[key];

Data Access Layer - static list objects and caching

I am developing a site using ASP.NET MVC.
I have a data access layer which basically consists of static list objects that are created from data in my database.
The method that rebuilds this data first clears all the list objects; once they are empty it adds the data back. Here is an example of one of the lists I'm using: a method that loads all the UK postcodes. There are about 50 methods similar to this in my application that return all sorts of information, such as towns, regions, members, emails, etc.
public static List<PostCode> AllPostCodes = new List<PostCode>();
When the rebuild method is called, it first clears the list:
ListPostCodes.AllPostCodes.Clear();
Next it rebuilds the data by calling the GetAllPostCodes() method:
/// <summary>
/// Static method that loads all the UK postcodes.
/// </summary>
public static void GetAllPostCodes()
{
    using (fab_dataContextDataContext db = new fab_dataContextDataContext())
    {
        IQueryable AllPostcodeData = from data in db.PostCodeTables select data;
        IDbCommand cmd = db.GetCommand(AllPostcodeData);
        SqlDataAdapter adapter = new SqlDataAdapter();
        adapter.SelectCommand = (SqlCommand)cmd;
        DataSet dataSet = new DataSet();
        cmd.Connection.Open();
        adapter.FillSchema(dataSet, SchemaType.Source);
        adapter.Fill(dataSet);
        cmd.Connection.Close();
        // create the objects
        foreach (DataRow row in dataSet.Tables[0].Rows)
        {
            PostCode postcode = new PostCode();
            postcode.ID = Convert.ToInt32(row["PostcodeID"]);
            postcode.Outcode = row["OutCode"].ToString();
            postcode.Latitude = Convert.ToDouble(row["Latitude"]);
            postcode.Longitude = Convert.ToDouble(row["Longitude"]);
            postcode.TownID = Convert.ToInt32(row["TownID"]);
            AllPostCodes.Add(postcode);
            postcode = null;
        }
    }
}
The rebuild occurs every hour; this ensures the site has a fresh set of cached data every hour.
The issue I've got is that occasionally, if the server is hit by a request during a rebuild, an exception is thrown: "Index was outside the bounds of the array." It happens while a list is being cleared.
ListPostCodes.AllPostCodes.Clear(); // throws the exception, although it's not always this particular list
Once this exception is thrown the application dies and all users are affected; I have to restart the server to fix it.
I have two questions:
If I utilise caching instead of static objects, would this help?
Is there any way I can say "while the rebuild is taking place, wait for it to complete before accepting requests"?
Any help is most appreciated ;)
truegilly
If I utilise caching instead of static objects, would this help?
Yes, all the things you are doing here are more easily done with the caching functionality that is built into ASP.NET.
Is there any way I can say "while the rebuild is taking place, wait for it to complete before accepting requests"?
The common pattern goes like this:
You request data from the Data layer
If the data layer sees that there is data in the cache, it serves the data from the cache
If no data is in the cache, the data is requested from the database and put into the cache; after that it is served to the client
There are rules (CacheDependency and timeout) that determine when the cache is cleared.
The easiest solution would be to stick to this pattern: the first request hits the database and subsequent requests are served from the cache. You can trigger the refresh by implementing a SqlCacheDependency.
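A minimal sketch of that pattern using the built-in ASP.NET cache; GetAllPostCodesFromDb is a hypothetical stand-in for the existing data access code, and the one-hour expiration is just an example:
using System;
using System.Collections.Generic;
using System.Web;
using System.Web.Caching;

public static class PostCodeCache
{
    private const string CacheKey = "AllPostCodes";

    public static List<PostCode> GetAllPostCodes()
    {
        // Serve from the cache when possible.
        var cached = HttpRuntime.Cache[CacheKey] as List<PostCode>;
        if (cached != null)
        {
            return cached;
        }

        // Cache miss: load from the database and cache the result for an hour.
        List<PostCode> postcodes = GetAllPostCodesFromDb();
        HttpRuntime.Cache.Insert(
            CacheKey,
            postcodes,
            null,                        // could be a SqlCacheDependency instead
            DateTime.UtcNow.AddHours(1), // absolute expiration
            Cache.NoSlidingExpiration);
        return postcodes;
    }

    private static List<PostCode> GetAllPostCodesFromDb()
    {
        // Placeholder for the real database call.
        return new List<PostCode>();
    }
}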
You have to make sure that your list is not modified by one thread while other threads are trying to use it. This would be a problem even if you used the ASP.NET cache since collections are just not thread-safe. One way you can do this is by using a SynchronizedCollection instead of a List. Then make sure to use code like the following when you access the collection:
lock (synchronizedCollection.SyncRoot) {
    synchronizedCollection.Clear();
    // etc...
}
You will also have to use locking when you read the collection. If you are enumerating over it, you should probably make a copy before doing so as you don't want to lock for a long time. For example:
List<whatever> tempCollection;
lock (synchronizedCollection.SyncRoot) {
    tempCollection = new List<whatever>(synchronizedCollection);
}
// use the temp collection to access the cached data
The other option would be to create a ThreadSafeList class that uses locking internally to make the list object itself thread-safe.
I agree with Tom; you will have to do synchronization to make this work. One thing that would improve performance is not clearing the list until you actually receive the new values from the database:
// Modify your function to return a new list instead of filling the existing one.
public static List<PostCode> GetAllPostCodes()
{
    List<PostCode> temp = new List<PostCode>();
    ...
    return temp;
}
And when you rebuild the data:
List<PostCode> temp = GetAllPostCodes();
AllPostCodes = temp;
This makes sure that your cached list is still valid while GetAllPostCodes() is executing. It also has the advantage that you can use a read-only list which makes the synchronization a bit easier.
In your case you need to refresh the data every hour.
1) Use the cache with an absolute expiration of 1 hour, so it expires every hour. Check the cache before using it by doing a null check; if it is null, get the data from the database and repopulate the cache.
2) The disadvantage of the above approach is that the data can be up to an hour stale. If you want the most up-to-date data at all times, use SqlCacheDependency (push), so that whenever the results of the SELECT command you are using change, the cache is refreshed from the database with the updated data.
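A minimal sketch of option 2; this assumes SQL cache dependencies have been enabled for the database and the PostCodeTables table, and the "fab_dataContext" database entry name and GetAllPostCodesFromDb call are hypothetical:
// Requires aspnet_regsql registration plus a matching <sqlCacheDependency>
// entry in web.config for the "fab_dataContext" database.
var dependency = new SqlCacheDependency("fab_dataContext", "PostCodeTables");
HttpContext.Current.Cache.Insert(
    "AllPostCodes",
    GetAllPostCodesFromDb(),   // hypothetical database call
    dependency,
    Cache.NoAbsoluteExpiration,
    Cache.NoSlidingExpiration);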

ASP.NET MVC - Sharing Session State Between Controllers

I am still mostly unfamiliar with Inversion of Control (although I am learning about it now) so if that is the solution to my question, just let me know and I'll get back to learning about it.
I have a pair of controllers which need to share a Session variable. Naturally nothing too special has to happen because of how Session works in the first place, but this got me wondering what the cleanest way is to share related objects between two separate controllers. In my specific scenario I have an UploadController and a ProductController which work in conjunction with one another to upload image files. As files are uploaded by the UploadController, data about the upload is stored in the Session. After this happens I need to access that Session data in the ProductController. If I create a get/set property for the Session variable containing my upload information in both controllers I'll be able to access that data, but at the same time I'll be violating all sorts of DRY, not to mention creating a, at best, confusing design where an object is shared and modified by two completely disconnected objects.
What do you suggest?
Exact Context:
A file upload View posts a file to UploadController.ImageWithpreview(), which then reads in the posted file and copies it to a temporary directory. After saving the file, another class produces a thumbnail of the uploaded image. The paths to both the original file and the generated thumbnail are then returned with a JsonResult to a JavaScript callback, which updates some dynamic content in a form on the page that can be "Saved" or "Cancelled". Whether the uploaded image is saved or skipped, I need to either move or delete both it and the generated thumbnail from the temporary directory. To facilitate this, UploadController keeps track of all the uploaded files and their thumbnails in a Session-maintained Queue object.
Back in the View: after the form is populated with a generated thumbnail of the image that was uploaded, the form posts back to the ProductsController where the selected file is identified (currently I store the filename in a Hidden field, which I realize is a horrible vulnerability), and then copied out of the temp directory to a permanent location. Ideally, I would like to simply access the Queue I have stored in the Session so that the form does not need to contain the image location as it does now. This is how I have envisioned my solution, but I'll eagerly listen to any comments or criticisms.
A couple of solutions come to mind. You could use a "SessionState" class that maps into the request and gets/sets the info as such (I'm doing this from memory so this is unlikely to compile and is meant to convey the point):
internal class SessionState
{
    public string ImageName
    {
        get { return (string)HttpContext.Current.Session["ImageName"]; }
        set { HttpContext.Current.Session["ImageName"] = value; }
    }
}
And then from the controller, do something like:
var sessionState = new SessionState();
sessionState.ImageName = "xyz";
/* Or */
var imageName = sessionState.ImageName;
Alternatively, you could create a controller extension method:
public static class SessionControllerExtensions
{
    public static string GetImageName(this IController controller)
    {
        return (string)HttpContext.Current.Session["ImageName"];
    }

    public static void SetImageName(this IController controller, string imageName)
    {
        HttpContext.Current.Session["ImageName"] = imageName;
    }
}
Then from the controller:
this.SetImageName("xyz");
/* or */
var imageName = this.GetImageName();
This is certainly DRY. That said, I don't particularly like either of these solutions, as I prefer to store as little data in session as possible, if any. But if your intent is to hold onto all of this information without having to load/discern it from some other source, this is the quickest (dirtiest) way I can think of to do it. I'm quite certain there's a much more elegant solution, but I don't have all the information about what it is you're trying to do and what the problem domain is.
Keep in mind that when storing information in the session, you will have to dehydrate/rehydrate the objects via serialization and you may not be getting the performance you think you are from doing it this way.
Hope this helps.
EDIT: In response to additional information
Not sure on where you're looking to deploy this, but processing images "real-time" is a sure fire way to be hit with a DoS attack. My suggestion to you is as follows -- this is assuming that this is public facing and anyone can upload an image:
1) Allow the user to upload an image. This image goes into a processing queue for background processing by the application or some service. Additionally, the name of the image goes into the user's personal processing queue, likely a table in the database. Information about background processing in a web app can be found under "Schedule a job in hosted web server".
2) Process these images and, while processing, display a "processing" graphic. You can have an AJAX request on the product page that checks for images being processed and tries to reload them every X seconds.
3) While an image is being "processed", the user can opt out of processing assuming they're the one that uploaded the image. This is available either on the product page(s) that display the image or on a separate "user queue" view that will allow them to remove the image from consideration.
So, you end up with some more domain objects and those objects are managed by the queue. I'm a strong advocate of convention over configuration so the final destination of the product image(s) should be predefined. Something like:
images/products/{id}.jpg or, if a collection, images/products/{id}/{sequence}.jpg.
You then don't need to know the destination in the form. It's the same for all images.
The queue then needs to know where the temp image was uploaded and what the product id was. The queue worker pops items from the queue, processes them, and stores them accordingly.
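A minimal sketch of such a worker; all of the names here are hypothetical, the in-memory queue could just as well be a database table, and the thumbnail step is elided:
using System.Collections.Concurrent;
using System.IO;

public class ImageProcessingItem
{
    public int ProductId { get; set; }
    public string TempImagePath { get; set; }
}

public class ImageProcessingWorker
{
    private readonly BlockingCollection<ImageProcessingItem> _queue =
        new BlockingCollection<ImageProcessingItem>();

    public void Enqueue(ImageProcessingItem item)
    {
        _queue.Add(item);
    }

    // Run this on a background thread or scheduled job.
    public void ProcessQueue()
    {
        foreach (var item in _queue.GetConsumingEnumerable())
        {
            // Convention over configuration: the destination is predefined.
            string destination = Path.Combine("images", "products", item.ProductId + ".jpg");
            File.Copy(item.TempImagePath, destination, overwrite: true);
            // Thumbnail generation would happen here as well.
            File.Delete(item.TempImagePath);
        }
    }
}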
I know this sounds a little more "structured" than what you originally intended, but I think it's a little cleaner.
Is there complete equivalence between the UploadController and ProductController?
As files are uploaded by the UploadController, data about the upload is stored in the Session. After this happens I need to access that Session data in the ProductController.
As I read it, the UploadController needs read and write access to the upload data, while the ProductController needs only read access.
If that's true then you can make that clear by using an immutable wrapper around the upload information and have the UploadController put that into the session.
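A minimal sketch of such an immutable wrapper; the property names are hypothetical and assume the upload info is an image path plus a thumbnail path:
public class UploadInfo
{
    public string ImagePath { get; private set; }
    public string ThumbnailPath { get; private set; }

    public UploadInfo(string imagePath, string thumbnailPath)
    {
        ImagePath = imagePath;
        ThumbnailPath = thumbnailPath;
    }
}

// UploadController writes it into the session; ProductController only reads it:
// Session["UploadInfo"] = new UploadInfo(tempPath, thumbPath);
// var info = Session["UploadInfo"] as UploadInfo;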
The Session itself is by definition a public shared noticeboard; it decouples explicit relationships at the cost of allowing anyone to get and put. You could allow the ProductController to know about the UploadController and hence remove the need for passing the upload information via the session. My instinct is that the upload info is of general interest, so using the Session is reasonable.
I don't see any DRY violation here; we are explicitly trying to separate responsibilities.
