How to fix a connection close issue in a synchronized method? - grails

In our application we are using the Grails framework with SQL Server as the database. We have multiple sites, and each site can have a few users. If several users hit the same method via AJAX it can cause problems, so we made that method synchronized. To minimize database interaction we store the data in a map, keyed per site, since all users from one site get the same data; if the data is more than 10 seconds old we fetch it from the database again and update the map. We are now getting a lot of database connection closed errors on the very first line of the synchronized method, where we load the site object from the database. What is the issue here and how can we resolve it?
def synchronized getData(params) {
    Site site = Site.get(params.siteId)
    // Here we check whether the site data is missing from the map
    // or has expired (is more than 10 seconds old); if so, we get the
    // data from the database and update the map object.
    // Then we create a new list object from the data in the map object.
    return list
}

It's difficult to figure out the exact problem without more information, but several things stand out.
I'm not especially familiar with using the synchronized keyword in front of a Grails service method; I would recommend trying the @Synchronized annotation with a static lock object instead:
import groovy.transform.Synchronized

private static final myLock = new Object()

@Synchronized("myLock")
void getData() {
    // do stuff
}
or synchronizing explicitly within the method:
void getData() {
    synchronized (myLock) {
        // do stuff
    }
}
I don't know if that's related to your connection-closing issues, but it's worth a try.
But also, notably, Grails and Hibernate provide caching of database reads, so if you're loading the same data that's already in the Hibernate cache, you don't need to cache it in a Map yourself... Grails is already doing that for you. Site site = Site.get(params.siteId) will NOT make a database call if the instance was loaded recently and is still cached by the framework.
I would strongly suggest running some performance checks comparing just making that call against caching in a Map object, especially if you're expiring the data every ~10s anyway.

Related

Unable to clear cache in EF

I am facing a problem while using the factory model in MVC.
When I update and then try to display data from the same table, the update is performed in the database, but the updated data is not fetched from the database.
I suspect it is fetching the data from the entities (the cache) and displaying that.
I have tried ModelState.Clear(), OutputCache, etc., but none of it worked.
code used:
For Update:
public virtual void Update(TObject TObject)
{
    var entry = Context.Entry(TObject);
    DbSet.Attach(TObject);
    entry.State = EntityState.Modified;
}
Calling the Update method in my service and saving changes:
Registry.RepositoryFactory.GetUsersRepository().Update(userobj);
Registry.Context.SaveChanges();
Fetching data after save:
Select:
public virtual IQueryable<TObject> All()
{
    return DbSet.AsQueryable();
}
I am able to update the database, but when I try to retrieve the data immediately afterwards from the same table it does not hit the database; I think it is fetching the data from the cache.
Any pointers are welcome.
Thanks in advance,
Girish.
I followed the link provided by Damon; the Refresh does work, but it takes a few seconds (2 or 3), and the page has to load immediately.
The solution that worked for me is this: while fetching the data from the repository, I get the entity set via Set(), and then I call the Refresh method before fetching the data.
Code used:
DbSet<TObject> set = ((DbContext)Context).Set<TObject>();
((IObjectContextAdapter)Context).ObjectContext.Refresh(System.Data.Objects.RefreshMode.StoreWins, set);
return DbSet as IQueryable<TObject>;
I'm going to guess that you are re-using the same EF context object for both the Update and the Select. If the context is not disposed of between these events you will end up with a stale context, and data will be returned from the EF cache.
Make sure you are disposing of the EF context between calls, the best practice being to wrap it in a using statement. An alternative is to call Refresh() on the context (see this question). You'll still need to dispose of the context at some point, because otherwise it will continue to grow and your application will get slower and slower.
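For illustration, a minimal sketch of the using pattern (the MyDbContext, Users and userId names are assumptions, not from the question):
using (var context = new MyDbContext())
{
    var user = context.Users.Find(userId);
    user.Name = "Updated";
    context.SaveChanges();
} // the context (and its cache) is disposed here

using (var context = new MyDbContext())
{
    // a fresh context has an empty cache, so this read hits the database
    var refreshed = context.Users.Find(userId);
}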
I've answered a similar question here.

MVC 3 - Sessionless controllers datastore options

I've been reading about sessionless controllers lately and it seems an interesting idea, since it improves performance and lets AJAX calls be asynchronous, as they usually should be.
However, I can't figure out a nice way to store data that would previously have been stored in the session. I have a lot of single-fetch data that I get once and carry through several pages. My first thought was to use MemoryCache, but reading this post I began to doubt it, since IIS can let go of my data at any time.
Because of this, I'm a little confused about what I should do to store data in a session-like way. I read a couple of things about NoSQL and MongoDB, but wouldn't that be the same as fetching the data every time I need it?
Can you give me some clarification and suggest technologies I can use as a temporary data store?
Have you considered using HttpContext.Cache? Since you want something session-like, there is no reason you couldn't create a cache key based upon the session ID of the current request:
// cache key
var cacheKey = string.Format("{0}-{1}", "SomeKey", Session.SessionID);
// save to cache
HttpContext.Cache.Insert(cacheKey, <yourobject>, null, Cache.NoAbsoluteExpiration, TimeSpan.FromMinutes(20));
From there it would simply be a matter of passing along the session ID and retrieving the value at a later time:
HttpContext.Cache[cacheKey]
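Since IIS can evict cache entries at any time (the concern raised in the post you linked), treat a null result as a signal to rebuild the entry. A rough sketch, where LoadData() is a hypothetical reload from your data source:
var stored = HttpContext.Cache[cacheKey];
if (stored == null)
{
    stored = LoadData(); // hypothetical: rebuild the data that was evicted
    HttpContext.Cache.Insert(cacheKey, stored, null, Cache.NoAbsoluteExpiration, TimeSpan.FromMinutes(20));
}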

Where to store a Doctrine variable created in a component so that it's accessible anywhere?

Note I am referring to one request, and not several requests and sessions.
I have several components that require the Doctrine user object; some are located in the layout, others in templates. Sometimes I need that Doctrine user object in an action as well. Currently I have added a function to the sfUser class that loads that object from the database, which means every call to that function makes a call to the db. I'd like to know where to store this object so that I can access it without having to query the db every time I need it. Again, we're talking about a single request, not several requests or something that would require a session.
Can I save it in sfContext somehow? Any other places so that it can be available everywhere?
You can store it in your model's Table class, because tables are always accessed as singletons.
class sfGuardUserTable extends PluginsfGuardUserTable
{
    protected $specialUser = null;

    public function getSpecialUser()
    {
        if (null === $this->specialUser)
        {
            $this->specialUser = $this->findOneById(1);
        }

        return $this->specialUser;
    }
}
Now, you can use this in actions and components like this:
$u = sfGuardUserTable::getInstance()->getSpecialUser();
And you will always end up with one query.
You can also configure the Doctrine cache so that the result of this specific query is always cached. What is so good about it is that if you use, say, the APC backend, you will have it cached across requests. You also get query caching as a bonus (this is not result caching; read the link I provided carefully)!

How can I store user information in MVC between requests

I have an MVC2-site using Windows authentication.
When the user requests a page I pull some user information from the database. The class I retrieve is a Person class.
How can I get this from the database when the user enters the site, and then pick up the same object without touching the db on subsequent page requests?
I must admit, I am pretty lost when it comes to session handling in ASP.net MVC.
You can store that kind of information in HttpContextBase.Session.
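For example, in a controller action (a rough sketch; LoadPersonFromDatabase is a hypothetical data-access call, not part of the framework):
public ActionResult Index()
{
    var person = Session["CurrentPerson"] as Person;
    if (person == null)
    {
        person = LoadPersonFromDatabase(User.Identity.Name); // hypothetical db lookup
        Session["CurrentPerson"] = person;
    }
    return View(person);
}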
One option is to retrieve the Person object from your database on the first hit and store it in System.Web.HttpContext.Current.Cache, this will allow extremely fast access and your Person data will be temporarily stored in RAM on the web server.
But be careful: if you are storing a significantly large amount of user data in this way, you could eat up a lot of memory. Nevertheless, this will be perfectly fine if you only need to cache a few thousand objects or so. Clearly, it depends upon how many users you expect to be using your app.
You could add it to the cache like this:
private void CachePersonData(Person data, string storageKey)
{
    if (HttpContext.Current.Cache[storageKey] == null)
    {
        HttpContext.Current.Cache.Add(storageKey,
                                      data,
                                      null,
                                      Cache.NoAbsoluteExpiration,
                                      TimeSpan.FromDays(1),
                                      CacheItemPriority.High,
                                      null);
    }
}
... and retrieve like this:
// Grab the data from the cache (the indexer returns object, so a cast is needed)
Person p = (Person)HttpContext.Current.Cache[storageKey];
Don't forget that the object returned from the cache could be null, so you should check for this and load from the database as necessary (then cache).
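Putting that together with the helper above (GetPersonFromDatabase is a hypothetical data-access call):
Person p = HttpContext.Current.Cache[storageKey] as Person;
if (p == null)
{
    p = GetPersonFromDatabase(); // hypothetical: reload when the cache entry is missing
    CachePersonData(p, storageKey);
}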
First of all, if you are in a load-balanced environment, I wouldn't recommend any solution that doesn't store the data in a database, because it will eventually fail.
If you are not in a load-balanced environment, you can use TempData to store your object and then retrieve it in the subsequent request.
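A rough sketch of the TempData hand-off between two actions (the action names and LoadPersonFromDatabase are illustrative, not from the question):
public ActionResult First()
{
    TempData["CurrentPerson"] = LoadPersonFromDatabase(); // hypothetical db lookup
    return RedirectToAction("Second");
}

public ActionResult Second()
{
    // note: TempData only survives until the next request that reads it
    var person = TempData["CurrentPerson"] as Person;
    return View(person);
}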
HttpContext.Current.Session[key];

Data Access Layer - static list objects and caching

I am developing a site using .NET MVC.
I have a data access layer which basically consists of static list objects that are created from data within my database.
The method that rebuilds this data first clears all the list objects; once they are empty, it adds the data again. Here is an example of one of the lists I'm using: it's a method that loads all the UK postcodes. There are about 50 methods similar to this in my application that return all sorts of information, such as towns, regions, members, emails, etc.
public static List<PostCode> AllPostCodes = new List<PostCode>();
When the rebuild method is called it first clears the list:
ListPostCodes.AllPostCodes.Clear();
Next it rebuilds the data by calling the GetAllPostCodes() method:
/// <summary>
/// Static method that loads all the UK postcodes.
/// </summary>
public static void GetAllPostCodes()
{
    using (fab_dataContextDataContext db = new fab_dataContextDataContext())
    {
        IQueryable AllPostcodeData = from data in db.PostCodeTables select data;
        IDbCommand cmd = db.GetCommand(AllPostcodeData);
        SqlDataAdapter adapter = new SqlDataAdapter();
        adapter.SelectCommand = (SqlCommand)cmd;
        DataSet dataSet = new DataSet();
        cmd.Connection.Open();
        adapter.FillSchema(dataSet, SchemaType.Source);
        adapter.Fill(dataSet);
        cmd.Connection.Close();

        // create the objects
        foreach (DataRow row in dataSet.Tables[0].Rows)
        {
            PostCode postcode = new PostCode();
            postcode.ID = Convert.ToInt32(row["PostcodeID"]);
            postcode.Outcode = row["OutCode"].ToString();
            postcode.Latitude = Convert.ToDouble(row["Latitude"]);
            postcode.Longitude = Convert.ToDouble(row["Longitude"]);
            postcode.TownID = Convert.ToInt32(row["TownID"]);
            AllPostCodes.Add(postcode);
            postcode = null;
        }
    }
}
The rebuild occurs every hour; this ensures that the site always has a fresh set of cached data.
The issue I've got is that occasionally, if the server is hit by a request during a rebuild, an exception is thrown: "Index was outside the bounds of the array." It happens while a list is being cleared.
ListPostCodes.AllPostCodes.Clear(); // throws the exception - although it's not always this particular list
Once this exception is thrown the application dies and all users are affected; I have to restart the server to fix it.
I have two questions:
1. If I utilise caching instead of static objects, would this help?
2. Is there any way I can say "while the rebuild is taking place, wait for it to complete before accepting requests"?
Any help is most appreciated ;)
truegilly
1. If I utilise caching instead of static objects, would this help?
Yes, everything you are doing here is handled more easily by the caching functionality built into ASP.NET.
2. Is there any way I can say "while the rebuild is taking place, wait for it to complete before accepting requests"?
The common pattern goes like this:
You request data from the data layer.
If the data layer sees that there is data in the cache, it serves the data from the cache.
If no data is in the cache, the data is requested from the db and put into the cache; after that it is served to the client.
There are rules (CacheDependency and timeout) that determine when the cache is cleared.
The easiest solution would be to stick to this pattern: the first request hits the database and subsequent requests are served from the cache. You trigger the refresh by implementing a SqlCacheDependency.
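A hedged sketch of what that could look like (the "FabDb" entry name, the table name and LoadAllPostCodes() are assumptions; SQL cache notifications must also be enabled in web.config and on the table, e.g. via aspnet_regsql):
SqlCacheDependency dependency = new SqlCacheDependency("FabDb", "PostCodeTable");
HttpContext.Current.Cache.Insert("AllPostCodes", LoadAllPostCodes(), dependency); // LoadAllPostCodes() is a hypothetical loader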
You have to make sure that your list is not modified by one thread while other threads are trying to use it. This would be a problem even if you used the ASP.NET cache since collections are just not thread-safe. One way you can do this is by using a SynchronizedCollection instead of a List. Then make sure to use code like the following when you access the collection:
lock (synchronizedCollection.SyncRoot) {
    synchronizedCollection.Clear();
    // etc...
}
You will also have to use locking when you read the collection. If you are enumerating over it, you should probably make a copy before doing so as you don't want to lock for a long time. For example:
List<whatever> tempCollection;
lock (synchronizedCollection.SyncRoot) {
    tempCollection = new List<whatever>(synchronizedCollection);
}
// use the temp collection to access the cached data
The other option would be to create a ThreadSafeList class that uses locking internally to make the list object itself thread-safe.
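For reference, a minimal sketch of that idea (only a few members are shown; it is not a drop-in replacement for List<T>):
using System.Collections.Generic;

public class ThreadSafeList<T>
{
    private readonly List<T> _items = new List<T>();
    private readonly object _syncRoot = new object();

    public void Add(T item)
    {
        lock (_syncRoot) { _items.Add(item); }
    }

    public void Clear()
    {
        lock (_syncRoot) { _items.Clear(); }
    }

    public List<T> Snapshot()
    {
        // hand out a copy so callers can enumerate without holding the lock
        lock (_syncRoot) { return new List<T>(_items); }
    }
}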
I agree with Tom; you will have to synchronize to make this work. One thing that would improve performance is not clearing the list until you have actually received the new values from the database:
// Modify your function to return a new list instead of filling the existing one.
public static List<PostCode> GetAllPostCodes()
{
    List<PostCode> temp = new List<PostCode>();
    // ... fill temp from the database as before ...
    return temp;
}
And when you rebuild the data:
List<PostCode> temp = GetAllPostCodes();
AllPostCodes = temp;
This makes sure that your cached list is still valid while GetAllPostCodes() is executing. It also has the advantage that you can use a read-only list which makes the synchronization a bit easier.
In your case you need to refresh the data every hour.
1) Use the cache with an absolute expiration of 1 hour, so it expires every hour. Check the cache before using it by doing a null check; if it is null, get the data from the DB and repopulate the cache.
2) The disadvantage of the above approach is that the data can be up to 1 hour stale. If you want the most up-to-date data at all times, use SqlCacheDependency (push): whenever the result of the select command you are using changes, the cache entry is invalidated and refreshed from the database with the updated data.
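A rough sketch of option 1 (LoadAllPostCodes() is a hypothetical method that builds the list from the database):
public static List<PostCode> GetCachedPostCodes()
{
    var postcodes = HttpContext.Current.Cache["AllPostCodes"] as List<PostCode>;
    if (postcodes == null)
    {
        postcodes = LoadAllPostCodes(); // hit the db only when the cache entry has expired
        HttpContext.Current.Cache.Insert("AllPostCodes",
                                         postcodes,
                                         null,
                                         DateTime.UtcNow.AddHours(1),   // absolute expiration: 1 hour
                                         Cache.NoSlidingExpiration);
    }
    return postcodes;
}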
