breeze EntityManager metadata load time issue

Loading the application's DOM is quite slow because loading the Breeze metadata is quite slow. Are there any tips to make it load faster and to optimize it?
Please take a look at the following link to see how much it affects loading time.
https://dl.dropboxusercontent.com/u/2781659/8-27-2013%201-02-38%20PM.jpg

The problem with my code was that it started querying the Breeze manager on load. Breeze loads metadata automatically if it is not available when a query executes. I changed my code so that no query fires until the metadata has loaded.
My code change follows: during viewmodel load I invoke a loadMetadata() function with a callback supplied; once the metadata is loaded, the callback fires the initial query.
var manager;
var store;

function loadMetadata(callback) {
    manager = new breeze.EntityManager(serviceName);
    store = manager.metadataStore;
    // Fetch the metadata explicitly; the supplied callback fires the
    // initial query only after the metadata has arrived.
    store.fetchMetadata(serviceName, callback);
}

See "Loading metadata with breeze is slow" for a tip.
But you have other issues. I can think of no reason why the same EntityManager INSTANCE would ask for the same metadata twice. Perhaps you are creating a new EM each time? If that's what you want to do, you can share the same MetadataStore across EMs, as in the sketch below.
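A minimal sketch of sharing one MetadataStore across managers (serviceName is assumed to be defined elsewhere, as in your snippet):

// One MetadataStore, fetched once, shared by every manager.
var sharedStore = new breeze.MetadataStore();

function createManager() {
    // Each new EntityManager reuses the already-loaded metadata,
    // so no second metadata request goes to the server.
    return new breeze.EntityManager({
        serviceName: serviceName, // assumption: defined elsewhere
        metadataStore: sharedStore
    });
}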
Beyond that, you want to find out what is making the metadata request slow in the first place. I doubt that has anything to do with Breeze on either the client or the server.


How to fix connection close issue in synchronised method?

In our application we are using the Grails framework with SQL Server as the database. We have multiple sites, and each site can have multiple users (a few users). If those users access the same method via AJAX at the same time, that can cause problems, so we made the method synchronized. To minimize database interaction we store the data in a map, keyed by site, since all users from one site get the same data; if the data is more than 10 seconds old, we fetch fresh data from the database and update the map. We are getting a lot of database-connection-closed errors on the very first line of the synchronized method, where we fetch the site object from the database. What is the issue here, and how can we resolve it?
def synchronized getData(params) {
    Site site = Site.get(params.siteId)
    // Here we check whether the site's data is missing from the map
    // or has expired (more than 10 seconds old); if so, we fetch it
    // from the database and update the map object.
    // Then we create a new list object from the data in the map.
    return list
}
It's difficult to figure out the exact problem without more information. Several things stand out...
I'm not especially familiar with using the synchronized keyword in front of a service method; I would recommend trying the @Synchronized annotation with a static lock object:
private static final Object myLock = new Object()

@Synchronized("myLock")
void getData() {
    // do stuff
}
or synchronizing explicitly within the method:
void getData() {
    synchronized (myLock) {
        // do stuff
    }
}
I don't know if that's related to your connection closing issues, but worth a try.
But also, notably, Grails and Hibernate provide caching of database reads, so if you're loading the same data that's already in the Hibernate cache, you don't need to cache it in a map yourself... Grails is already doing that for you. Site site = Site.get(params.siteId) will NOT make a database call if it was called recently and the result is already cached by the framework.
I would strongly suggest running some performance checks on just making that call vs. caching in a map object, especially if you're expiring the cache after ~10 seconds anyway.
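Putting the two suggestions together, a hedged sketch of how the service method might look without the hand-rolled map cache (the lock field and the 'items' association on Site are assumptions, not your actual model):

import groovy.transform.Synchronized

class DataService {
    private static final Object myLock = new Object()

    @Synchronized("myLock")
    def getData(params) {
        // Hibernate's caching makes this cheap when the same site
        // was fetched recently; no manual map is needed.
        Site site = Site.get(params.siteId)
        // Build the result list directly from the domain object.
        return site.items.collect { it } // 'items' is hypothetical
    }
}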

Child navigation properties missing in imported entities in custom initializer

I have a custom entity definition like:
var Card = function () {};

var cardInitializer = function (card) {
    // card.fields is defined in the metadata.
    // card._cfields is an in-memory-only field that breeze
    // will not, and should not, track; thus it is added here
    // in the initializer.
    card._cfields = card.fields.slice();
};
When the data loads from the server, everything is fine: the card.fields array has the corresponding data.
EDITED: Added more info and code showing how the manager is being set up.
But when the data is round-tripped through local storage via exportEntities and importEntities, the child data defined in the metadata (represented by the card.fields property in this example) is not loaded during the initializer call (the array has length 0), though it is available on the entity after the import has completed.
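For reference, the round trip in question looks roughly like this (a sketch; the storage key comes from the setup code below):

// Export the cache to localStorage...
localStorage[storage] = manager.exportEntities();

// ...and later import it into a fresh manager; the initializer runs
// during this call, before navigation properties are wired up.
manager.importEntities(localStorage[storage]);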
Here is how the manager is being initialized:
var metadataStore = new breeze.MetadataStore();
metadataStore.importMetadata(options.metadata);

var queryOptions = new breeze.QueryOptions({
    fetchStrategy: breeze.FetchStrategy.FromLocalCache
});

var dataService = new breeze.DataService({
    serviceName: "none",
    hasServerMetadata: false
});

manager = new breeze.EntityManager({
    dataService: dataService,
    metadataStore: metadataStore,
    queryOptions: queryOptions
});

entityExtensions.registerExtensions(manager, breeze);

var entities = localStorage[storage];
if (entities && entities !== 'null') {
    manager.importEntities(entities);
}
Wow. You ask for free support from the harried developer of a free OSS product that you presumably value, and then you shit on him because you think he was being flippant? And downvote his answer.
Could you have responded more generously? Perhaps you might recognize that your question was a bit unclear. I guess that occurred to you, because you edited your question so that I can now see what you're driving at.
Two suggestions for next time: (1) be nice; (2) provide a running code sample that illustrates your issue.
I'll meet you halfway. I wrote a plunker that I believe demonstrates your complaint.
It shows that the navigation properties may not be wired up when importEntities calls an initializer, even though the related entities are in cache.
They do appear to be wired up when the initializer is called during query-result processing.
I cannot explain why the two cases differ in this respect. I will ask.
My personal preference would be to be consistent and to have the entities wired up. But there may be good reasons why we don't do that, or why it is indeterminate even when processing query results. I'll try to get an answer, as I said.
Meanwhile, you'll have to work around this ... which you can do by processing the values returned from the import:
var imported = em2.importEntities(exported);
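For example, a hedged sketch of that post-processing (it assumes importEntities returns an object with an entities array, as in later Breeze releases; the shortName check is an assumption about your model):

// Navigation properties are wired up once the import has completed,
// so the in-memory field can be populated safely here.
imported.entities.forEach(function (entity) {
    if (entity.entityType.shortName === 'Card') {
        entity._cfields = entity.fields.slice();
    }
});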
FWIW, the documentation is silent on this question.
Look at the "Extending Entities" documentation topic again.
You will see that, by design, breeze does not know about properties created in an initializer and therefore ignores such properties during serialization, such as entity export. This is a feature, not a limitation.
If you want breeze to "know" about an unmapped property, you must define it in the entity constructor (Card)... even if you later populate it in the initializer function.
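A minimal sketch of that pattern, using the standard registerEntityTypeCtor registration (names taken from your example):

// Declare the unmapped property in the constructor so breeze
// registers it and round-trips it through export/import.
var Card = function () {
    this._cfields = []; // becomes an unmapped property
};

var cardInitializer = function (card) {
    card._cfields = card.fields.slice();
};

metadataStore.registerEntityTypeCtor('Card', Card, cardInitializer);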
Again, it is best to look at the docs and at examples before setting out on your own.

Unable to clear cache in EF

I am facing a problem while using the factory model in MVC.
When I update and then try to display data from the same table, the update is performed in the database, but the updated data is not fetched from the database.
I suspect it is serving the data from the entities cache and displaying that.
I used ModelState.Clear(), OutputCache, etc., but none of it worked.
code used:
For Update:
public virtual void Update(TObject TObject)
{
    var entry = Context.Entry(TObject);
    DbSet.Attach(TObject);
    entry.State = EntityState.Modified;
}
Calling the Update method in my service and saving changes:
Registry.RepositoryFactory.GetUsersRepository().Update(userobj);
Registry.Context.SaveChanges();
Fetching data after save:
Select:
public virtual IQueryable<TObject> All()
{
    return DbSet.AsQueryable();
}
I am able to update the database, but when I try to retrieve the data immediately from the same table it does not hit the database; I think it is fetching the data from the cache.
Any pointers are welcome.
Thanks in advance,
Girish.
I have followed the link provided by Damon; the Refresh does occur, but it takes a few seconds (2 or 3), and the page has to load immediately.
The solution that has worked for me: while fetching the data from the repository, I get the entity set using Set(), and then I call the Refresh method before fetching the data.
Code used:
DbSet<TObject> set = ((DbContext)Context).Set<TObject>();
((IObjectContextAdapter)Context).ObjectContext
    .Refresh(System.Data.Objects.RefreshMode.StoreWins, set);
return set as IQueryable<TObject>;
I'm going to guess that you are re-using the same EF context object for both the Update and the Select. If the context is not disposed of between these operations, you will end up with a stale context, and data will be returned from the EF cache.
Make sure you are disposing of the EF context between calls, the best practice being to wrap it in a using statement. An alternative is to call Refresh() on the context (see this question). You'll still need to dispose of the context at some point, because otherwise it will continue to grow and your application will get slower and slower.
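A minimal sketch of that using pattern (MyDbContext, Users, and userId are hypothetical names, not from your code):

// Each unit of work gets its own short-lived context, so a read
// after a save never sees a stale first-level cache.
using (var context = new MyDbContext()) // hypothetical context type
{
    var user = context.Users.Find(userId);
    user.Name = "Updated";
    context.SaveChanges();
} // disposed here; its cache dies with it

using (var context = new MyDbContext())
{
    // A fresh context queries the database, not a stale cache.
    var users = context.Users.ToList();
}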
I've answered a similar question here.

MVC 3 - Sessionless controllers datastore options

I've been reading about sessionless controllers lately, and it seems an interesting idea, since it improves performance and lets AJAX calls be asynchronous, as they usually should be.
However, I can't figure out a nice way to store data that would previously have been stored in a session. I have a lot of single-fetch data that I get once and carry through several pages. My first thought was to use MemoryCache, but after reading this post I began to doubt it, since IIS can evict my data at any time.
Because of this, I'm a little confused about what I should do to store data in a session-like way. I read a couple of things about NoSQL and MongoDB, but wouldn't that amount to fetching the data every time I need it?
Can you give me some clarification and suggest technologies I can use as a temporary datastore?
Have you considered using HttpContext.Cache? Since you want it to behave in a session-like way, there is no reason you couldn't create a cache key based on the session ID of the current request:
// cache key
var cacheKey = string.Format("{0}-{1}", "SomeKey", Session.SessionID);

// save to cache with a 20-minute sliding expiration
HttpContext.Cache.Insert(cacheKey, yourObject, null,
    Cache.NoAbsoluteExpiration, TimeSpan.FromMinutes(20));
From there it would simply be a matter of passing along the session ID and retrieving the object at a later time:
var data = HttpContext.Cache[cacheKey];
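One caveat worth noting: cache entries can be evicted under memory pressure, so a null check on retrieval is wise. A hedged sketch (MyData and LoadData are hypothetical placeholders):

// Entries can be evicted at any time, so always null-check.
var data = HttpContext.Cache[cacheKey] as MyData; // hypothetical type
if (data == null)
{
    data = LoadData(); // hypothetical reload from the original source
    HttpContext.Cache.Insert(cacheKey, data, null,
        Cache.NoAbsoluteExpiration, TimeSpan.FromMinutes(20));
}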

Is it OK to open a DB4o file for query, insert, update multiple times?

This is the way I am thinking of using DB4o. When I need to query, I would open the file, read and close:
using (IObjectContainer db = Db4oFactory.OpenFile(Db4oFactory.NewConfiguration(), YapFileName))
{
    try
    {
        List<Pilot> pilots = db.Query<Pilot>().ToList<Pilot>();
    }
    finally
    {
        try { db.Close(); }
        catch (Exception) { }
    }
}
At some later time, when I need to insert, then
using (IObjectContainer db = Db4oFactory.OpenFile(Db4oFactory.NewConfiguration(), YapFileName))
{
    try
    {
        Pilot pilot1 = new Pilot("Michael Schumacher", 100);
        db.Store(pilot1);
    }
    finally
    {
        try { db.Close(); }
        catch (Exception) { }
    }
}
In this way, I thought I would keep the file tidier by having it open only when needed and closed most of the time. But I keep getting an InvalidCastException:
Unable to cast object of type 'Db4objects.Db4o.Reflect.Generic.GenericObject' to type 'Pilot'
What's the correct way to use DB4o?
No, it's not a good idea to work this way. db4o ObjectContainers are intended to be kept open all the time your application runs. A couple of reasons:
db4o maintains a reference system to identify persistent objects, so it can perform updates when you call #store() on an object that is already stored (instead of storing a new object). This reference system is discarded when you close the ObjectContainer, so updates won't work.
Class metadata would have to be read from the database file every time you reopen it. db4o would also have to analyze the structure of all persistent classes again when they are first used. While both operations are quite fast, you probably don't want this overhead every time you store a single object.
db4o has very efficient caches for class and field indexes and for the database file itself. If you close and reopen the file, you take no advantage of them.
With your current setup there could also be failures when you work with multiple threads. What if two threads wanted to open the database file at exactly the same time? A db4o database file can be opened only once. It is possible to run multiple transactions and multiple threads against the same open instance, and you can also use client/server mode if you need multiple transactions.
Later on you may like to try Transparent Activation and Transparent Persistence. Transparent Activation lazily loads object members when they are first accessed. Transparent Persistence automatically stores all objects that were modified in a transaction. For Transparent Activation (TA) and Transparent Persistence (TP) to work you certainly have to keep the ObjectContainer open.
You don't need to worry about constantly having an open database file. One of the key targets of db4o is embedded use in (mobile) devices. That's why we have written db4o in such a way that you can turn your machine off at any time without risking database corruption, even if the file is still open.
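A minimal sketch of the keep-it-open pattern (the Db class and file name are illustrative, not part of your code):

// Open the container once at application startup and share it.
public static class Db
{
    // One ObjectContainer for the whole application lifetime.
    public static readonly IObjectContainer Instance =
        Db4oFactory.OpenFile(Db4oFactory.NewConfiguration(), "pilots.yap");
}

// Queries and stores then reuse the same open container:
//   List<Pilot> pilots = Db.Instance.Query<Pilot>().ToList();
//   Db.Instance.Store(new Pilot("Michael Schumacher", 100));
// Close it only on application shutdown: Db.Instance.Close();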
Possible reasons why you are getting a GenericObject back instead of a Pilot object:
This can happen when the assembly name of the assembly that contains the Pilot class has changed between two runs, either because you let Visual Studio autogenerate the name or because you changed it by hand.
Maybe "db4o" is part of your assembly name? One of the recent builds was too aggressive at filtering out internal classes. This was fixed quite some time ago; you may like to download and try the latest release ("development" or "production" should both be fine).
In a presentation I once gave, I saw really weird symptoms when db4o ObjectContainers were opened in a using block. You probably want to work without that anyway and keep the db4o ObjectContainer open all the time.
It is OK to reopen the database multiple times. The problems would be performance and losing object "identity". Also, you can't keep a reference to the result of a query and iterate it after closing the db (based on your code, it looks like you want to do that).
GenericObjects are instantiated when the class cannot be found.
Can you provide a full, minimal sample that fails for you?
Also, which db4o version are you using?
Best
