I'm trying to save many objects in Core Data, but I get this crash:
Communications error: <OS_xpc_error: <error: 0x19b354af0> { count = 1, contents =
"XPCErrorDescription" => <string: 0x19b354e50> { length = 22, contents = "Connection interrupted" }
}>
Message from debugger: Terminated due to memory issue
I use MagicalRecord:
[MagicalRecord saveInBackgroundWithBlock:^(NSManagedObjectContext *localContext) {
    for (int i = 0; i < json.count; i++) {
        [Product parseWithData:((NSMutableArray *)json)[i]];
    }
}];
Product.m
+ (void)parseWithData:(NSDictionary *)dictionary {
    NSString *xml_id = [dictionary[@"XML_ID"] isKindOfClass:[NSString class]] ? dictionary[@"XML_ID"] : @"";
    Product *product = [Product getProductWithXML_id:xml_id];
    if (!product)
        product = [Product MR_createEntity];
    product.xml_id = xml_id;
    product.code = [dictionary[@"Code"] isKindOfClass:[NSString class]] ? dictionary[@"Code"] : @"";
    ...
}
Can you suggest how I can save this?
When I save my objects to Core Data in a loop, memory grows very fast.
It seems to be a memory issue.
Try surrounding the inner part of your for loop with an autorelease pool:
@autoreleasepool {
    ...
}
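For example, applied to the loop from the question (a sketch, reusing the same json array and Product parser):
[MagicalRecord saveInBackgroundWithBlock:^(NSManagedObjectContext *localContext) {
    for (int i = 0; i < json.count; i++) {
        @autoreleasepool {
            // Temporary objects created while parsing one item are released here,
            // instead of piling up until the whole loop (and block) finishes.
            [Product parseWithData:((NSMutableArray *)json)[i]];
        }
    }
}];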
You need to paginate the way you get the data and/or save it.
By paginate, I mean:
Download the first 1000 (for example; the right number really depends on the content).
When that's done, save the 1000 you just got.
When that is done, get the next 1000, save it again, and so on.
You need to know the total number you're trying to get and use (if I remember correctly) setLimit: and setSkip: on the Parse query. The skip skips the first X elements, and the limit is the maximum number of items that will be downloaded.
That way, you skip 0 with a limit of 1000, then recall the method with skip += limit, and you'll get the second chunk of 1000, and so on. The last chunk will obviously be smaller than 1000.
Doing this will increase the total time taken, but it can happen seamlessly in the background, and the work is spread out enough to need far less memory.
Do it, and see if it makes a big difference. If not, you could always reduce to 500 instead of 1000, or completely change your architecture; maybe you don't even need ALL the items right now!
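A rough sketch of that skip/limit loop, reusing the save code from the question (fetchChunkWithSkip:limit:completion: is a hypothetical stand-in for whatever download call you use, e.g. a Parse query with skip and limit set):
static const NSUInteger kChunkSize = 1000;

// Download one page, save it, then ask for the next page.
- (void)loadChunkWithSkip:(NSUInteger)skip {
    [self fetchChunkWithSkip:skip limit:kChunkSize completion:^(NSArray *items) {
        if (items.count == 0) {
            return; // nothing left to download
        }
        // Parse and save just this chunk, exactly as in the question.
        [MagicalRecord saveInBackgroundWithBlock:^(NSManagedObjectContext *localContext) {
            for (NSDictionary *itemDict in items) {
                @autoreleasepool {
                    [Product parseWithData:itemDict];
                }
            }
        }];
        // Move on to the next chunk (ideally from the save's completion handler,
        // if your MagicalRecord version provides one).
        [self loadChunkWithSkip:skip + kChunkSize];
    }];
}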
Related
Our app has found great success recently. Part of the app consists of a huge list of products, which we store in Firebase's Realtime Database. However, the data set for that list has now exceeded 5,000+ objects, which of course puts great strain on the Realtime Database: the initial load of these products now takes over a minute, and 18+ seconds after caching with Firebase's persistent data cache.
I've been reading other questions about huge data sets on Firebase, and the short answer is "don't do it".
However, what wasn't addressed, and the "pickle" we're in, is what we "can" do. Is there somewhere else we can host our data that will query it substantially faster?
Because of our use case, pagination isn't an option. The JSON is around 12 MB. Any advice, or a "we ran into the same roadblock and this is what we did", would be greatly appreciated! Thanks!
How we query for data:
[_reference observeSingleEventOfType:FIRDataEventTypeValue
                           withBlock:^(FIRDataSnapshot *snapshot) {
    self.dataArray = [NSMutableArray array];
    self.postCountNew = 0;
    for (FIRDataSnapshot *child in snapshot.children) {
        [_dataArray addObject:child.value];
        NSTimeInterval timeInterval =
            [now timeIntervalSinceDate:[_dateFormatter dateFromString:child.value[@"Date"]]];
        if (timeInterval > 0 && timeInterval < 86400) {
            _postCountNew++;
        }
    }
    [self.tableView reloadData];
    completionBlock(YES);
}];
I am using the SDWebImage framework for caching images. The number of downloaded images grows daily, so after 3-4 days of using the application the cache keeps growing, which degrades the application's performance. Is there any way to improve this?
Take a look at this SO post here
You can set the period for how long the cache lasts before it is flushed. In addition, you can play around with the maxCacheSize value that SDImageCache.m uses in cleanDiskWithCompletionBlock:
// If our remaining disk cache exceeds a configured maximum size, perform a second
// size-based cleanup pass. We delete the oldest files first.
if (self.maxCacheSize > 0 && currentCacheSize > self.maxCacheSize) {
    // Target half of our maximum cache size for this cleanup pass.
    const NSUInteger desiredCacheSize = self.maxCacheSize / 2;

    // Sort the remaining cache files by their last modification time (oldest first).
    NSArray *sortedFiles = [cacheFiles keysSortedByValueWithOptions:NSSortConcurrent
                                                    usingComparator:^NSComparisonResult(id obj1, id obj2) {
        return [obj1[NSURLContentModificationDateKey] compare:obj2[NSURLContentModificationDateKey]];
    }];

    // Delete files until we fall below our desired cache size.
    for (NSURL *fileURL in sortedFiles) {
        if ([_fileManager removeItemAtURL:fileURL error:nil]) {
            NSDictionary *resourceValues = cacheFiles[fileURL];
            NSNumber *totalAllocatedSize = resourceValues[NSURLTotalFileAllocatedSizeKey];
            currentCacheSize -= [totalAllocatedSize unsignedIntegerValue];

            if (currentCacheSize < desiredCacheSize) {
                break;
            }
        }
    }
}
You can simply set a global cache limit for the shared instance of SDImageCache.
Just call the following in your AppDelegate:
// limit the SDImageCache disk cache to 50 MB
SDImageCache.shared().config.maxCacheSize = 1_000_000 * 50
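If the rest of your project is Objective-C, the equivalent is roughly the following (a sketch; property names vary between SDWebImage versions, and in older releases they sit directly on SDImageCache rather than on its config object):
// Limit the shared disk cache to 50 MB and expire cached files after one week.
// (Assumes a config-based SDWebImage version; adjust property names to your release.)
[SDImageCache sharedImageCache].config.maxCacheSize = 50 * 1024 * 1024;
[SDImageCache sharedImageCache].config.maxCacheAge = 60 * 60 * 24 * 7;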
I've been having this bug for several weeks already. I searched many forums, read every reply about duplicates, and implemented some of the usual approaches, and it still doesn't work properly.
To give you some context, I'm working on a recipe application that scrapes HTML recipes from the web and stores them in Core Data, simple right? Well, when the client asked for iCloud sync support I thought it was going to be easy, especially since the app targets iOS 7 only, which is supposed to solve most of the problems for you.
The problem arises when the app populates its initial data. I have two related entities called MainCategory [e1] and Category [e2], with a one-to-many relationship between them (e1 <->> e2).
The first time the app starts, it creates 5 MainCategories, and for each MainCategory it adds 5 Categories:
+ (BOOL)initialLoad
{
    DLog(@"Initial Load");
    // Create main and sub categories in the database
    NSDictionary *categoriesDic = @{
        CAT_MEAL_TYPE: @[C_STARTER, C_MAINS, C_DESSERT, C_SOUPS, C_SALAD],
        CAT_INGREDIENT: @[C_BEEF, C_CHICKEN, C_PASTA, C_SALMON, C_CHOCOLATE],
        CAT_CUISINE: @[C_CHINESE, C_FRENCH, C_INDIAN, C_ITALIAN, C_MOROCCAN],
        CAT_SEASON: @[C_CHRISTMAS, C_SUNDAY_ROAST, C_DINNER, C_BBQ, C_NIBBLES],
        CAT_DIET: @[C_WHEATFREE, C_VEGETARIAN, C_LOW_FAT, C_LOW_GI, C_DAIRY_FREE]
    };
    NSArray *mainCategoryKeys = @[CAT_MEAL_TYPE, CAT_INGREDIENT, CAT_CUISINE, CAT_SEASON, CAT_DIET];
    for (NSString *eachMainCategoryName in mainCategoryKeys)
    {
        // Create main category
        MainCategory *eachMainCategory = [MainCategory mainCategoryWithName:eachMainCategoryName];
        NSArray *subCategories = [categoriesDic objectForKey:eachMainCategoryName];
        // Create sub categories and add them to the main category
        for (NSString *eachCategoryName in subCategories)
        {
            /* Category got renamed to zCategory given it's a reserved name in the framework
               and cannot be used */
            zCategory *eachCategory = [zCategory categoryWithName:eachCategoryName];
            [eachMainCategory addCategoriesObject:eachCategory];
        }
    }
    [((AppDelegate *)[UIApplication sharedApplication].delegate) saveContext];
    return TRUE;
}
Then, after saving the context, all this initial data syncs with the database in iCloud, so far so good. The problem comes when the second device runs the same initialLoad code and syncs as well. The result is duplicated MainCategories and Categories, a problem many of you will recognize.
After reading several threads about how to remove them, I used the dateCreated approach: you add an NSDate property to each entity so that every instance gets a timestamp you can use to tell which one is older and which is newer. Then I simply add an observer via NSNotificationCenter for the iCloud import notification, NSPersistentStoreCoordinatorStoresDidChangeNotification, and run a checkTimer that, after 5 seconds, executes a clean-duplicates method on the main thread.
- (void)checkTimer
{
    if (self.cleanTimer)
    {
        [self.cleanTimer invalidate];
        self.cleanTimer = nil;
    }
    // Schedule a timer to clean iCloud duplicates in the database
    self.cleanTimer = [NSTimer scheduledTimerWithTimeInterval:5 target:self selector:@selector(cleanDuplicates:) userInfo:nil repeats:FALSE];
}

- (void)cleanDuplicates:(NSTimer *)timer
{
    [self performSelectorOnMainThread:@selector(cleanCron) withObject:nil waitUntilDone:TRUE];
}
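For reference, the observer registration described above looks roughly like this (the selector and property names here are just illustrative):
// Registered wherever the iCloud-backed Core Data stack is set up.
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(storesDidChange:)
                                             name:NSPersistentStoreCoordinatorStoresDidChangeNotification
                                           object:self.persistentStoreCoordinator];

// Every store-change notification restarts the 5-second debounce timer.
- (void)storesDidChange:(NSNotification *)notification
{
    [self checkTimer];
}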
I invalidate the timer every time checkTimer gets called, in order to restart it, because you normally get several NSPersistentStoreCoordinatorStoresDidChangeNotification notifications when content gets updated/inserted/deleted; this way I know the cleanup runs once, after all the notifications have gone through.
By the way, cleanCron just calls a class method, cleanDuplicates:
- (void)cleanCron
{
    [CTFetchCoreData cleanDuplicates];
}
Here is where the non-magic happens. I fetch all the MainCategories (which will be 10, since they have been duplicated) and order them with the oldest ones at the beginning. Then I iterate and store them in a dictionary keyed by name, so whenever another MainCategory with the same name turns up it just gets deleted. By the way, the e1 <->> e2 relationship has a cascade delete rule, so every time you delete a MainCategory it deletes all of its related Categories with it; there shouldn't be a problem there.
+ (BOOL)cleanDuplicates
{
    @synchronized(self) {
        // Fetch mainCategories from Core Data
        NSArray *mainCategories = [CTFetchCoreData fetchAllMainCategories];
        // Clean duplicate main categories
        NSMutableDictionary *uniqueMainCatDic = [NSMutableDictionary dictionary];
        // Sort the array with the oldest dateCreated first
        mainCategories = [mainCategories sortedArrayUsingComparator:^NSComparisonResult(MainCategory *obj1, MainCategory *obj2) {
            if (obj1.dateCreated == nil || obj2.dateCreated == nil)
            {
                DLog(@"ERROR Date Created");
            }
            return [obj1.dateCreated compare:obj2.dateCreated];
        }];
        // If there are more than five MainCategories, proceed with the cleanup
        if (mainCategories.count > 5)
        {
            for (MainCategory *eachMainCat in mainCategories)
            {
                MainCategory *originalMainCat = [uniqueMainCatDic objectForKey:eachMainCat.name];
                if (originalMainCat == nil)
                {
                    DLog(@"-> %@ = %@", eachMainCat.name, eachMainCat.dateCreated);
                    [uniqueMainCatDic setObject:eachMainCat forKey:eachMainCat.name];
                } else {
                    // Delete the duplicate (its Categories cascade-delete with it)
                    [[self managedObjectContext] deleteObject:eachMainCat];
                    DLog(@"x %@ = %@", eachMainCat.name, eachMainCat.dateCreated);
                }
            }
            DLog(@"Cleaning Main Categories");
        }
    }
    [[AppDelegate sharedInstance] saveContext];
    return TRUE;
}
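(For context, fetchAllMainCategories is nothing special; it's roughly a plain fetch of every MainCategory, something like this:)
+ (NSArray *)fetchAllMainCategories
{
    // Plain fetch of every MainCategory; the caller sorts the results by dateCreated.
    NSFetchRequest *request = [NSFetchRequest fetchRequestWithEntityName:@"MainCategory"];
    NSError *error = nil;
    NSArray *results = [[self managedObjectContext] executeFetchRequest:request error:&error];
    return results ?: @[];
}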
It turns out that after I run this on the second device, I get this output:
Sesame[4145:60b] -> Cuisine = 2014-02-06 16:15:38 +0000
Sesame[4145:60b] -> Meal = 2014-02-06 17:15:54 +0000
Sesame[4145:60b] x Meal = 2014-02-06 17:15:54 +0000
Sesame[4145:60b] -> Ingredients = 2014-02-06 17:15:54 +0000
Sesame[4145:60b] x Ingredients = 2014-02-06 17:15:54 +0000
Sesame[4145:60b] x Cuisine = 2014-02-06 17:15:54 +0000
Sesame[4145:60b] x Cuisine = 2014-02-06 17:15:54 +0000
Sesame[4145:60b] -> Occasion = 2014-02-06 17:15:54 +0000
Sesame[4145:60b] -> Diet = 2014-02-06 17:15:54 +0000
Sesame[4145:60b] x Diet = 2014-02-06 17:15:54 +0000
which means the duplicates being deleted have exactly the same timestamp as the ones being kept! I'm left wondering how iCloud merges the information.
If you know a better approach to cleaning duplicates, apart from the dateCreated property, please tell me, because I've tried this one a lot without luck; there should be a better approach.
Thanks in advance!
Update :
Finally, I've managed to solve my problem. Crazy as it sounds, I was getting duplicate instances from iCloud itself; that's why the dates were the same. I just added an if to check whether both dates are the same, and in that case I don't delete the MainCategory. The next time you open the app, Core Data fixes the merge and updates the database with the correct instances and different date values, as it was supposed to.
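Roughly, the change inside the dedup loop looks like this (a sketch of my fix, not the exact code):
// Inside the for (MainCategory *eachMainCat in mainCategories) loop:
MainCategory *originalMainCat = [uniqueMainCatDic objectForKey:eachMainCat.name];
if (originalMainCat == nil) {
    [uniqueMainCatDic setObject:eachMainCat forKey:eachMainCat.name];
} else if ([originalMainCat.dateCreated isEqualToDate:eachMainCat.dateCreated]) {
    // Same timestamp: these are phantom duplicates from the iCloud merge.
    // Leave them alone and let Core Data sort the merge out on the next launch.
    DLog(@"= %@ = %@ (skipped)", eachMainCat.name, eachMainCat.dateCreated);
} else {
    // A genuinely newer duplicate: safe to delete (its Categories cascade-delete with it).
    [[self managedObjectContext] deleteObject:eachMainCat];
    DLog(@"x %@ = %@", eachMainCat.name, eachMainCat.dateCreated);
}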
I can't see anything obviously wrong with your code, though I would recommend using UUIDs instead of dates for ordering your duplicates. But that is unlikely to be related to what you are seeing.
To be honest, it looks to me like Core Data is really messing things up. (E.g., it also seems there are 3 Cuisine categories.)
I experienced this type of issue with Core Data sync when I tried to delete the cloud data files and didn't give it time to thoroughly remove the files from all devices. You end up with old transaction logs in there, which trigger extra objects being inserted.
Core Data also tries to handle all merging on its own. How that happens is anyone's guess.
Core Data + iCloud is a bit unusual in that it is one of the only sync frameworks which has no concept of global identity. There are actually good reasons for Apple not doing it, which are too subtle to discuss here, but it does make it difficult for developers. Deduping post-merge is an ugly solution IMO. Your store has to become invalid before it can become valid again.
I much prefer the approach of frameworks like Wasabi Sync, TICDS, and Ensembles, which all have a concept of global identity and do not require deduping as a result.
(Disclosure: I founded and develop the Ensembles framework)
Also, avoid using this:
NSMutableDictionary * uniqueMainCatDic = [NSMutableDictionary dictionary];
rather use
NSMutableDictionary * uniqueMainCatDic = [[NSMutableDictionary alloc] init];
I think your duplicate weirdness may go away if you always alloc/init the mutable dictionary. It took me weeks to figure this out; I'm not sure if it's a bug.
All:
I am recording a movie using AVCaptureMovieFileOutput. As various events occur, I wish to store each event's title/time in the QuickTime movie being written. Thus I might have 20-30 data points to associate with a particular movie.
My strategy is to use metadata, but I have not had much luck. Can someone please tell me, first of all:
a) Can I store arbitrary metadata, or only the keys and values defined in AVMetadataFormat.h? I would like to be able to store an array.
b) If I can store an arbitrary array, which key does the trick? If not, could I store my metadata in a comment field? (Ugly, but I could parse 20-30 points quickly enough.)
c) The code shown below does not appear to work: no matter what I put in for item.key (AVMetadataQuickTimeMetadataKeyArtist, AVMetadataCommonKeyArtist, or all sorts of other things ending in Artist), I never see anything in iTunes' Get Info window.
- (IBAction)recordEvent:(id)sender {
    NSLog(@"Record a metadata point here ...");
    // Is there any metadata associated with the file yet?
    NSArray *existingMetaData = self.aMovieFileOutput.metadata;
    NSMutableArray *newMetadataArray = nil;
    if (existingMetaData) {
        newMetadataArray = [existingMetaData mutableCopy];
    } else {
        newMetadataArray = [[NSMutableArray alloc] init];
    }

    AVMutableMetadataItem *item = [[AVMutableMetadataItem alloc] init];
    item.keySpace = AVMetadataKeySpaceCommon;
    item.key = AVMetadataQuickTimeMetadataKeyArtist;
    item.value = @"Enya, really!"; // in practice this will be the title of (UIButton *)sender
    item.time = CMTimeMake(0, 1);

    [newMetadataArray addObject:item];
    self.aMovieFileOutput.metadata = newMetadataArray;
}
Any advice would be greatly appreciated.
Thanks!
Storing metadata in a QuickTime file via AVCaptureMovieFileOutput and AVMutableMetadataItem only allows you to store values for keys predefined in the AVMetadataKeySpaceCommon keyspace, i.e. the AVMetadataCommonKey* keys.
All other data is ignored.
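So, as a sketch, the item from the question should at least be accepted once the key matches the common keyspace (whether iTunes then displays it is a separate question):
AVMutableMetadataItem *item = [[AVMutableMetadataItem alloc] init];
item.keySpace = AVMetadataKeySpaceCommon;
// Use a key from the common keyspace to match AVMetadataKeySpaceCommon;
// AVMetadataQuickTimeMetadataKeyArtist belongs to a different keyspace and gets dropped.
item.key = AVMetadataCommonKeyArtist;
item.value = @"Enya, really!";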
I am getting a dead store warning when I analyze my project, but the project does not crash.
Here is what I am doing:
NSString *graphUrl = nil;
if ([graphArray count] == 1)
{
    objTrial = [graphArray objectAtIndex:0];
    graphUrl = @"http://chart.apis.google.com/chart?cht=s:nda&chf=bg,s,FFFFFF&chs=";
    graphUrl = [graphUrl stringByAppendingString:@"&chd=t:"];
    graphUrl = [graphUrl stringByAppendingString:objTrial.highValue]; // dead store warning here
}
else
{
    // some other operation is done and a value is loaded into aURL
}
I get the dead store warning where marked in the code. How can I prevent this?
It would be great if someone could help me out with this.
The warning is telling you that the store you do in the first line gets thrown away, i.e. you assign a value to the variable and then reassign it afterwards without ever using the original value. Just change the first line to the following and the warning should go away:
NSString *aUrl;
Edit:
You should also change the line where you use it:
aURL = [aValue copy];
"dead store" means something that's not used, or rather something useless.
You get it when you have a variable defined that you never do anything with. So, the Analyzer tells you that you have wasted some storage.
Here you haven't used the aUrl object after assigning it.
It won't cause any problems other than a few bytes of wasted memory. Of course if it's a large object that could be more.
Perhaps someone could chip in with knowledge of compilers, as compiler optimization might take care of dead stores in any case.
A dead store is a value that is assigned but never used. There is nothing to worry about. But if you can't stop yourself from worrying ;-) you can change your code to:
NSString *aUrl = nil;
if ([anArray count] == 1) {
    // a value is stored in aValue
    // then that value is appended to aUrl
    aUrl = [aUrl stringByAppendingString:aValue];
} else {
    aUrl = @"";
    // some other operation is done and a value is loaded into aUrl
}