Unable to clear cache in EF - asp.net-mvc

I am facing a problem while using the factory model in MVC.
When I update and then try to display data from the same table, the update is performed in the database, but the updated data is not fetched back from the database.
I suspect the data is being served from the Entities cache and displayed from there.
I have tried ModelState.Clear(), OutputCache settings, etc., but none of it worked.
Code used:
For the update:
public virtual void Update(TObject TObject)
{
    var entry = Context.Entry(TObject);
    DbSet.Attach(TObject);
    entry.State = EntityState.Modified;
}
Calling the Update method in my service and saving changes:
Registry.RepositoryFactory.GetUsersRepository().Update(userobj);
Registry.Context.SaveChanges();
Fetching data after save:
Select:
public virtual IQueryable<TObject> All()
{
    return DbSet.AsQueryable();
}
I am able to update the database, but when I try to retrieve the data immediately from the same table it does not hit the database; I think it is fetching the data from the cache.
Any pointers are welcome.
Thanks in advance,
Girish.

I have followed the link provided by Damon. Refresh does work, but it takes a few seconds (2 or 3), and the page has to load immediately.
The solution that worked for me: while fetching the data from the repository, I get the entity set using Set<TObject>() and then call Refresh before fetching the data.
Code used:
DbSet<TObject> set = ((DbContext)Context).Set<TObject>();
((IObjectContextAdapter)Context).ObjectContext.Refresh(System.Data.Objects.RefreshMode.StoreWins, set);
return DbSet as IQueryable<TObject>;

I'm going to guess that you are re-using the same EF Context object for both the Update and the Select. If the Context is not disposed of between these events you will end up with a stale context and data will be returned from the EF cache.
Make sure you are disposing of the EF context between calls, the best practice being to surround it in a using statement. An alternative to this is to call Refresh() on the Context (see this question). You'll still need to dispose of the context at some point, because otherwise it will continue to grow and your application will get slower and slower.
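To illustrate, a minimal sketch of that pattern; MyDbContext and User here are placeholder names, not your actual repository types. Each unit of work gets its own context, so the subsequent read hits the database instead of a stale change tracker.
using System.Data.Entity;
using System.Linq;

// Sketch only: MyDbContext and User are placeholder names.
public class UserService
{
    public void UpdateAndReload(User userobj)
    {
        // One short-lived context per unit of work.
        using (var context = new MyDbContext())
        {
            context.Entry(userobj).State = EntityState.Modified; // attach and mark modified
            context.SaveChanges();
        } // disposed here, so no stale tracked entities survive

        using (var context = new MyDbContext())
        {
            // A fresh context queries the database instead of its change tracker.
            var users = context.Set<User>().ToList();
        }
    }
}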
I've answered a similar question here.

Related

How to fix connection close issue in synchronised method?

In our application we are using the Grails framework with SQL Server as the database. We have multiple sites, and each site can have a few users. If several users access the same method via AJAX at the same time it can cause problems, so we made that method synchronized. To minimize database interaction we store data in a map, keyed by site, since all users from one site get the same data; if the data is more than 10 seconds old we fetch it from the database again and update the map. We are now getting a lot of database-connection-closed errors on the very first line of the synchronized method, where we load the site object from the database. What is the issue here and how can we resolve it?
def synchronized getData(params) {
    Site site = Site.get(params.siteId)
    // Check whether the site's data is missing from the map or is stale
    // (more than 10 seconds old); if so, load it from the database and
    // update the map entry.
    // Then build a new list from the data held in the map.
    return list
}
Difficult to figure out the exact problem without more information here. Several things stand out...
I'm not especially familiar with using the synchronized keyword in front of a service method; I would recommend trying the @Synchronized annotation with a static lock object:
private static final myLock = new Object()
@Synchronized("myLock")
void getData() {
    // do stuff
}
or synchronizing explicitly within the method
void getData() {
    synchronized(myLock) {
        // do stuff
    }
}
I don't know if that's related to your connection closing issues, but worth a try.
But also notably, Grails and Hibernate provide caching of database reads, so if you're loading data that has already been loaded into the Hibernate cache, you don't need to cache it in a Map locally... Grails is already doing that for you. Site site = Site.get(params.siteId) will NOT make a database call if it has been called recently and the object is already cached by the framework.
I would strongly suggest running some performance checks comparing just making that call vs. caching in a Map object, especially if you're expiring the entries after ~10 seconds anyway.

DBContext (entity framework) and pre-loaded entities

I use code first in a web application where I have a form to upload text files and import the data into my database.
Each file may have 20,000+ records to import.
To speed things up I preload some entities so as not to query the DbContext every time. Then when I create an object for insert, I do, for example:
myNewObject.Category = preloadedCategories.First(p => p.Code == code);
I have read some articles on the web because EF is extremely slow on batch inserts, so what I do is:
first set Configuration.AutoDetectChangesEnabled = false;
then every 1,000 records dispose of the context and create a new one.
BUT! Since the preloaded entities were loaded from a DbContext that has since been disposed, after creating a new DbContext I have a problem with preloadedCategories.First(p => p.Code == code): when I call SaveChanges(), EF tries to also save the matched Category object and fails.
So how can I achieve this? I don't want to ask the DbContext every time to load objects that never change. Is it possible?
Thanks
When dealing with a large number of records in EF, a few things will help:
As @janhartmann states, use .AsNoTracking().
As you stated, use Configuration.AutoDetectChangesEnabled = false, which makes the next point necessary.
Use context.Entry(category).State = EntityState.Modified to attach a disconnected entity to a context and mark it as modified.
Also check that preloadedCategories is no longer an IQueryable and that the data really is local rather than being lazy-loaded from the database.
If there are no changes to your Category object and you just want to link your myNewObject to an existing category, you have two options:
Set the foreign key on myNewObject instead of the navigation property.
Use context.Entry(myNewObject).State = EntityState.Added instead of context.Products.Add(myNewObject) to avoid adding the entire graph of navigation properties.
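For illustration, a rough sketch of both options, assuming the product entity exposes a scalar CategoryId foreign key and categories have Code and Id properties (hypothetical names; adjust to your actual model):
// Sketch only: Product.CategoryId and Category.Id are assumed property names.

// Option 1: set the scalar foreign key instead of the navigation property, so the
// detached, preloaded Category object is never handed to the new context.
var myNewObject = new Product
{
    CategoryId = preloadedCategories.First(p => p.Code == code).Id
};
context.Products.Add(myNewObject);

// Option 2: if you do set the navigation property, mark only this entity as Added
// instead of calling Add(), which would mark the whole reachable graph as Added.
// context.Entry(myNewObject).State = EntityState.Added;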
Good luck

Is NonShared DbContext in MVC a bad practice?

It is an MVC application with Entity Framework Code First for the ORM and MEF as the IoC.
If I mark the DbContext with PartCreationPolicy.Shared there is an error saying the object already exists in the container every time I try to perform an edit.
But what if I simply mark the DbContext with PartCreationPolicy.NonShared so it gets created for every request?
Is there a terrible performance impact for that?
Update
Here is the code for save:
Provider IRepository<Provider>.Put(Provider item)
{
    if (item.Id == Guid.Empty)
    {
        item.Id = Guid.NewGuid();
        this.Providers.Add(item);
    }
    else this.Entry<Provider>(item).State = EntityState.Modified;
    return item;
}
And this is the error when it is Shared:
An object with the same key already exists in the ObjectStateManager. The ObjectStateManager cannot track multiple objects with the same key.
You should definitely use PartCreationPolicy.NonShared. Everything you can read about context lifecycle management, whether it's LINQ to SQL, Entity Framework, or NHibernate (sessions), agrees on one thing: the context should be short-lived. An easy rule of thumb: use it for one unit of work, which means create a context, do stuff, call SaveChanges once, dispose. Most of the time this rule works well for me.
A shared (or singleton) context is the pattern that hurts performance, because the context gets bloated over time: the change tracker needs to track more and more objects, relationship fixup gets slower, and you will find yourself refreshing (re-loading) entities time and again.
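For what it's worth, a minimal sketch of the NonShared part, assuming a code-first context (ProviderContext is a placeholder name; Provider is the entity from your Put method). MEF then builds a fresh context per import, which fits the short-lived, one-unit-of-work guidance above.
using System.ComponentModel.Composition;
using System.Data.Entity;

// Sketch only: ProviderContext is a placeholder; Provider is the entity from the question.
[Export]
[PartCreationPolicy(CreationPolicy.NonShared)] // a new instance per import, never a singleton
public class ProviderContext : DbContext
{
    public DbSet<Provider> Providers { get; set; }
}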

Breeze BeforeSaveEntity only allows updates to Added entities

I don't know whether this is intended or a bug, but the code below using BeforeSaveEntity only modifies the entity for newly created records (EntityState = Added) and doesn't work for modified ones. Is this correct?
protected override bool BeforeSaveEntity(EntityInfo entityInfo)
{
    var entity = entityInfo.Entity;
    if (entity is User)
    {
        var user = entity as User;
        user.ModifiedDate = DateTime.Now;
        user.ModifiedBy = 1;
    }
    ...
The root of this issue is that on the breeze server we don't have any built-in change-tracking mechanism for changes made on the server. Server entities can be pure POCOs. The breeze client has rich change-tracking capability for any client-side changes, but once you get to the server you need to manage this yourself.
The problem occurs because of an optimization we perform on the server so that we only update those properties that are changed. i.e. so that any SQL update statements are only made to the changed columns. Obviously this isn’t a problem for Adds or Deletes or those cases where we update a column that was already updated on the client. But if you update a field on the server that was not updated on the client then breeze doesn't know anything about it.
In theory we could snapshot each entity coming into the server and then iterate over every field on the entity to determine if any changes were made during save interception but we really hate the perf implications especially since this case will rarely occur.
So the suggestion made in another answer here to update the server side OriginalValuesMap is correct and will do exactly what you need.
In addition, as of version 1.1.3, there is an additional EntityInfo.ForceUpdate flag that you can set that will tell breeze to update every column in the specified entity. This isn't quite as performant as the suggestion above, but it is simpler, and the effects will be the same in either case.
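As a rough sketch of that flag, assuming breeze 1.1.3 or later and the User entity from the question:
// Sketch only: requires breeze 1.1.3+, where EntityInfo exposes ForceUpdate.
protected override bool BeforeSaveEntity(EntityInfo entityInfo)
{
    if (entityInfo.Entity is User && entityInfo.EntityState == EntityState.Modified)
    {
        var user = (User)entityInfo.Entity;
        user.ModifiedDate = DateTime.Now;
        user.ModifiedBy = 1;
        entityInfo.ForceUpdate = true; // update every column, including ones changed only on the server
    }
    return true;
}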
Hope this helps.
I had the same problem, and I solved it by doing this:
protected override bool BeforeSaveEntity(EntityInfo entityInfo)
{
    if (entityInfo.EntityState == EntityState.Modified)
    {
        dynamic entity = entityInfo.Entity;
        // Record the original value so that breeze treats the column as changed.
        entityInfo.OriginalValuesMap.Add("ModificationDate", entity.ModificationDate);
        entity.ModificationDate = DateTime.Now;
    }
    return true;
}
I think you can apply this easily to your case.

Changes not reflected in Database while using entity framework

I am accessing my database through ADO.NET Entity framework in MVC 3 Application.
I am updating my database through Stored Procedure.
But the changes are not reflected at run time; I mean I am able to see the changes only after restarting the application.
What is the reason for this problem and how can I avoid it?
I am using the Repository pattern, so in my repository the code looks like this.
There is one function which saves changes:
public void SaveNewAnswer(AnswerViewModel answer, string user)
{
    SurveyAdminDBEntities _entities = new SurveyAdminDBEntities();
    _entities.usp_SaveNewAnswer(answer.QuestionId, answer.AnswerName, answer.AnswerText, answer.AnswerOrder, answer.Status, user);
    _entities.SaveChanges();
}
Data retrieval code:
public IEnumerable GetMultipleChoiceQuestions(string questionId)
{
    SurveyAdminDBEntities _entities = new SurveyAdminDBEntities();
    _entities.AcceptAllChanges();
    _entities.SaveChanges();
    return _entities.usp_GetMultipleChoiceQuestions(Int32.Parse(questionId));
}
But the changes are not reflected until I close the browser session and run it again.
Please help!
Thank you in advance.
Are you calling context.SaveChanges() on your Entities (DbContext/ObjectContext) object? Are you using a transaction that you haven't committed?
If you have an uncommitted transaction in your sproc, you can try creating your own entity transaction and seeing if committing your transaction will commit the nested transaction as well. The problem is that calling SaveChanges() automatically begins and commits a transaction, so this may not be any different than that.
I would also call _entities.AcceptAllChanges() in your save operation.
public void SaveNewAnswer(AnswerViewModel answer, string user)
{
    SurveyAdminDBEntities _entities = new SurveyAdminDBEntities();
    _entities.Connection.Open();
    System.Data.Common.DbTransaction tran = _entities.Connection.BeginTransaction();
    try
    {
        _entities.usp_SaveNewAnswer(answer.QuestionId, answer.AnswerName, answer.AnswerText, answer.AnswerOrder, answer.Status, user);
        _entities.SaveChanges(); // automatically uses the open transaction instead of a new one
        tran.Commit();
    }
    catch
    {
        tran.Rollback();
    }
    finally
    {
        if (_entities.Connection.State == System.Data.ConnectionState.Open)
            _entities.Connection.Close();
        _entities.AcceptAllChanges();
    }
}
Is your stored procedure doing an explicit commit? Changes made in a database session are visible within that session, but not to any other session until the work is committed.
When you pull data out of your database into your context that data is kept in memory, separate from the actual database itself.
You will see the changes if you create a new context object instance and load the data from the database with it.
It's good practice not to reuse the same instance of your context object, but to create one on an as-needed basis for individual transactions with the database. In your case, if you're updating via function imports instead of the context.SaveChanges() method, then you need to refresh your context with the updated data after you commit those changes, as sketched below.
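A rough sketch of the retrieval side under that approach (usp_GetMultipleChoiceQuestions_Result is a hypothetical name for the complex type generated for the function import; substitute the real generated type). The context lives only for the duration of the call, so it cannot serve entities cached by the instance that ran the update.
// Sketch only: usp_GetMultipleChoiceQuestions_Result is a hypothetical type name.
// Requires System.Collections.Generic and System.Linq (for ToList()).
public List<usp_GetMultipleChoiceQuestions_Result> GetMultipleChoiceQuestions(string questionId)
{
    // One short-lived context per call.
    using (var entities = new SurveyAdminDBEntities())
    {
        // Materialize with ToList() before the context is disposed.
        return entities.usp_GetMultipleChoiceQuestions(int.Parse(questionId)).ToList();
    }
}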
Add this to your connection string (assuming SQL Server 2005):
transaction binding=Explicit Unbind;
If the data is no longer available after a session reset, then the problem is indeed with a transaction; if the data is available after the reset, then your problem is something different and we'll likely need more details.
