MagicalRecord, multiple configurations and RestKit - iOS

I am using MagicalRecord in a fairly large iOS project. I use configurations to separate a large seed database from user data. Since MagicalRecord doesn't support configurations, I deconstructed MagicalRecord's setupCoreDataStackWithAutoMigratingSqliteStoreNamed method and replaced it with the following:
+ (void)RB_setupMultipleStores:(NSString *)seedStoreName userStore:(NSString *)userStoreName
{
    /* Change the persistent store to one with multiple configurations.
       Assumes MagicalRecord is initialized to perform auto migration. */
    NSError *error = nil;
    [MagicalRecord cleanUp]; // Tear down MagicalRecord
    NSManagedObjectModel *model = [NSManagedObjectModel MR_defaultManagedObjectModel];
    NSURL *seedURL = [NSPersistentStore MR_urlForStoreName:[seedStoreName stringByAppendingString:@".sqlite"]];
    NSPersistentStoreCoordinator *coordinator = [[NSPersistentStoreCoordinator alloc] initWithManagedObjectModel:model];
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], NSMigratePersistentStoresAutomaticallyOption,
                             [NSNumber numberWithBool:YES], NSInferMappingModelAutomaticallyOption,
                             nil];
    NSPersistentStore *seedStore = [coordinator addPersistentStoreWithType:NSSQLiteStoreType
                                                             configuration:@"Seed"
                                                                       URL:seedURL
                                                                   options:options
                                                                     error:&error];
    if (!seedStore || error)
    {
        NSLog(@"Error setting up seed store: %@ for %@", [error localizedDescription], seedURL);
        exit(-1);
    }
    NSURL *userURL = [NSURL URLForDocumentDirectoryWithAppendedPath:[userStoreName stringByAppendingString:@".sqlite"]];
    NSPersistentStore *userStore = [coordinator addPersistentStoreWithType:NSSQLiteStoreType
                                                             configuration:@"User"
                                                                       URL:userURL
                                                                   options:options
                                                                     error:&error];
    if (!userStore || error)
    {
        NSLog(@"Error setting up user store: %@ for %@", [error localizedDescription], userURL);
        exit(-1);
    }
    [NSPersistentStoreCoordinator MR_setDefaultStoreCoordinator:coordinator];
    // Bring back MagicalRecord with the updated coordinator.
    [NSManagedObjectContext MR_initializeDefaultContextWithCoordinator:coordinator];
    [[NSManagedObjectContext MR_defaultContext] setUndoManager:nil];
}
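For reference, a call site during app launch might look like the sketch below; it assumes the method above is exposed as a category on MagicalRecord, and the store names are placeholders:
// AppDelegate.m -- hypothetical call site for the helper above.
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    // "SeedData" and "UserData" are illustrative; ".sqlite" is appended
    // inside RB_setupMultipleStores:userStore:.
    [MagicalRecord RB_setupMultipleStores:@"SeedData" userStore:@"UserData"];
    return YES;
}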
Now I will be adding RestKit to the mix. I'd need to share the object model and persistent store, and I'd rather use one set of contexts than maintain two different stacks.
I see five potential approaches:
1. Modify https://github.com/blakewatters/RKMagicalRecord to support multiple configurations. This looks trivial, but it requires that I use a category to surface some private methods, and the good MagicalRecord developers recommend against explicitly setting MR's default and root saving contexts.
2. Create the MagicalRecord contexts first, and then assign them to RestKit. Will this even work? Does it make sense at all?
3. Initialize RestKit and MagicalRecord from the same NSPersistentStoreCoordinator. Does this make sense?
4. Create two separate stacks with different but similar NSPersistentStoreCoordinators.
5. Create my own stack and contexts, and make all calls to RestKit and MR using those contexts.
Can anyone recommend any of these, or another approach? Each requires significant effort to even test. I'm about to head down the #1 path.
Thanks...

Fundamentally, Core Data already has ways to deal with this. You can have multiple coordinators point to the same store, or whatever your scenario is. It's up to each framework to allow itself to be used against a stack it did not create or set up.
MagicalRecord boils down to a collection of helper methods for fetching and saving. You don't need to use ANY of the setup methods for MagicalRecord to work its magic on your fetches. The setup methods are there to help you get going faster on new projects, as well as to provide a single access point for a "default" context to use when you perform fetches on the main thread/queue. For everything else, pass the explicit inContext: parameter every time. Using that parameter, you can use whatever MOC you want and everything will work. This is by design. MagicalRecord was never written to replace Core Data; it was written to make the simple tasks easier.
With that, you could just let RestKit deal with your entire Core Data stack and use MagicalRecord only for the convenience APIs for fetching and saving. This would save you from having to implement any of those solutions and let you get back to tackling your app-specific issues...
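As a rough sketch of that arrangement (the RestKit calls shown are the usual 0.20-era APIs, the MR_ helper is from MagicalRecord 2.x, and the Recipe entity and store name are placeholders, so treat the details as assumptions to adapt to your versions):
// RestKit owns the Core Data stack.
NSManagedObjectModel *model = [NSManagedObjectModel mergedModelFromBundles:nil];
RKManagedObjectStore *store = [[RKManagedObjectStore alloc] initWithManagedObjectModel:model];
NSString *storePath = [RKApplicationDataDirectory() stringByAppendingPathComponent:@"App.sqlite"];
NSError *error = nil;
[store addSQLitePersistentStoreAtPath:storePath
               fromSeedDatabaseAtPath:nil
                    withConfiguration:nil
                              options:nil
                                error:&error];
[store createManagedObjectContexts];

// MagicalRecord is used only as a fetch/save convenience, always with an
// explicit context taken from RestKit rather than MR's default context.
NSManagedObjectContext *context = store.mainQueueManagedObjectContext;
NSArray *recipes = [Recipe MR_findAllInContext:context]; // Recipe is a placeholder entity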

Related

Core Data - Lightweight migration doesn't work

I am new to Core Data, and I am currently maintaining an application using Core Data.
Since I need to publish a new release of the application soon, I have to add an Entity to the data model.
I have followed this tutorial, Lightweight tutorial, which was very useful, and I have also read all the Apple documentation as well as this amazing article, Core Data Migration, in order to understand globally how it works.
Since I only had to add one entity to the data model, I understood that a lightweight migration would be OK in this situation.
It's only 1 new Entity (without attributes) that I have to link to the already existing parent Entity.
What I have done so far :
The application is currently on the version 3 of the datamodel
I have created a new data model (version 4) from the version 3
I have chosen data model version 4 as current data model
I have created my new Entity (without attributes), and linked it to the parent Entity.
I have created the generated NSManagedObject subclass for it
Then I modified my UI
Build and run: it works, cool. BUT when I download the current version from the App Store and then install the newly made Archive/IPA from TestFlight over it (install over the old one -> migration scenario), the application runs without my new Entity/data model.
From the Apple documentation, it is very clear that adding an entity is supported by Core Data for lightweight migration.
I know this is not an easy process, but I feel like I have followed everything perfectly.
How can I test the migration without having to archive and publish on TestFlight each time?
If you need any additional information in order to clearly understand my issue and/or write a more detailed answer, feel free to ask in the comments and I will edit my question.
Thank you in advance.
EDIT :
Here is the code for the NSPersistentStoreCoordinator from the AppDelegate.
- (NSPersistentStoreCoordinator *)persistentStoreCoordinator {
    // The persistent store coordinator for the application. This implementation creates and returns a coordinator, having added the store for the application to it.
    if (_persistentStoreCoordinator != nil) {
        return _persistentStoreCoordinator;
    }
    // Create the coordinator and store
    _persistentStoreCoordinator = [[NSPersistentStoreCoordinator alloc] initWithManagedObjectModel:[self managedObjectModel]];
    NSURL *storeURL = [self persistentStoreURL];
    NSError *error = nil;
    NSString *failureReason = @"There was an error creating or loading the application's saved data.";
    if (![_persistentStoreCoordinator addPersistentStoreWithType:NSSQLiteStoreType
                                                    configuration:nil
                                                              URL:storeURL
                                                          options:@{NSMigratePersistentStoresAutomaticallyOption : @YES,
                                                                    NSInferMappingModelAutomaticallyOption : @YES}
                                                            error:&error]) {
        // Report any error we got.
        NSMutableDictionary *dict = [NSMutableDictionary dictionary];
        dict[NSLocalizedDescriptionKey] = @"Failed to initialize the application's saved data";
        dict[NSLocalizedFailureReasonErrorKey] = failureReason;
        dict[NSUnderlyingErrorKey] = error;
        error = [NSError errorWithDomain:@"YOUR_ERROR_DOMAIN" code:9999 userInfo:dict];
        // Replace this with code to handle the error appropriately.
        // abort() causes the application to generate a crash log and terminate. You should not use this function in a shipping application, although it may be useful during development.
        DDLogError(@"Unresolved error %@, %@", error, [error userInfo]);
        abort();
    }
    return _persistentStoreCoordinator;
}
I don't know how I can effectively verify, through tests or logs, that Core Data is using the new data model (version 4) and has performed the migration successfully.
I know it works when I build from Xcode, but that's not the same situation as an update from the App Store.
To set up an app for lightweight migration, you need to add the following options in your Core Data setup where the persistent store is added. These settings enable the options that support lightweight migration (these are from a Swift 3.0 app, so they may vary a bit if you're on 2.3):
NSMigratePersistentStoresAutomaticallyOption as NSObject: true
NSInferMappingModelAutomaticallyOption as NSObject: true
Once these options are in place, Core Data will perform lightweight migrations correctly whenever they're required, including adding new entities, attributes, and relationships, so you should be OK as long as you don't do anything that requires more action on your part, like renaming an entity or property.
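If you also want to verify from logs which model the on-disk store currently matches (and therefore whether the migration has actually run), one option is to inspect the store metadata before adding the store. A rough Objective-C sketch, assuming storeURL comes from your existing persistentStoreURL helper:
NSError *metadataError = nil;
NSDictionary *metadata =
    [NSPersistentStoreCoordinator metadataForPersistentStoreOfType:NSSQLiteStoreType
                                                                URL:storeURL
                                                              error:&metadataError];
// Compare the store's model version hashes with the current (v4) model.
BOOL compatible = [[self managedObjectModel] isConfiguration:nil
                                 compatibleWithStoreMetadata:metadata];
NSLog(@"Store version hashes: %@", metadata[NSStoreModelVersionHashesKey]);
NSLog(@"Store is %@ with the current model", compatible ? @"compatible" : @"NOT compatible (migration pending)");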

'NSObjectInaccessibleException' - CoreData sharing with TodayViewExtension

I've implemented a today view extension with Core Data sharing in my app. I have multiple problems (like only one object showing when I have three) and a big one: "Terminating app due to uncaught exception 'NSObjectInaccessibleException', reason: 'CoreData could not fulfill a fault for '0xd0000000001c0004". Now, this only happens when I have the application open in the background, which leads me to believe that my app is leaving the store in a bad state, but that really shouldn't be happening. I'm using PersistentStack, an external 'library' that manages all of the Core Data setup (instead of the AppDelegate), which is readable at https://gist.github.com/mluisbrown/7015953
CoreData Fetching from TodayView Extension:
- (void)viewDidLoad {
    [super viewDidLoad];
    self.persistentStack = [[PersistentStack alloc] initWithStoreURL:self.storeURL modelURL:self.modelURL];
    self.managedObjectContext = self.persistentStack.managedObjectContext;
}

- (NSURL *)storeURL
{
    // NSURL *documentsDirectory = [[NSFileManager defaultManager] URLForDirectory:NSDocumentDirectory inDomain:NSUserDomainMask appropriateForURL:nil create:YES error:NULL];
    // return [documentsDirectory URLByAppendingPathComponent:@"WhatIOwe.sqlite"];
    NSURL *directory = [[NSFileManager defaultManager] containerURLForSecurityApplicationGroupIdentifier:@"group.com.bittank.io"];
    NSURL *storeURL = [directory URLByAppendingPathComponent:@"WhatIOwe.sqlite"];
    return storeURL;
}

- (NSURL *)modelURL
{
    return [[NSBundle mainBundle] URLForResource:@"WhatIOwe" withExtension:@"momd"];
}

- (NSFetchedResultsController *)fetchedResultsController {
    self.persistentStack = [[PersistentStack alloc] initWithStoreURL:self.storeURL modelURL:self.modelURL];
    self.managedObjectContext = self.persistentStack.managedObjectContext;
    if (_fetchedResultsController != nil) {
        return _fetchedResultsController;
    }
    NSFetchRequest *fetchRequest = [[NSFetchRequest alloc] init];
    NSEntityDescription *entity = [NSEntityDescription entityForName:@"OweInfo"
                                              inManagedObjectContext:self.managedObjectContext];
    [fetchRequest setEntity:entity];
    NSSortDescriptor *sort = [[NSSortDescriptor alloc] initWithKey:@"details.date" ascending:NO];
    [fetchRequest setSortDescriptors:[NSArray arrayWithObject:sort]];
    [fetchRequest setFetchBatchSize:20];
    NSFetchedResultsController *theFetchedResultsController =
        [[NSFetchedResultsController alloc] initWithFetchRequest:fetchRequest
                                            managedObjectContext:self.managedObjectContext
                                              sectionNameKeyPath:nil
                                                       cacheName:@"Root"];
    self.fetchedResultsController = theFetchedResultsController;
    _fetchedResultsController.delegate = self;
    return _fetchedResultsController;
}
As suggested, I've tried a merge policy in my persistent stack:
[self.managedObjectContext.persistentStoreCoordinator addPersistentStoreWithType:NSSQLiteStoreType
                                                                    configuration:nil
                                                                              URL:self.storeURL
                                                                          options:@{ NSPersistentStoreUbiquitousContentNameKey : @"WhatIOwe",
                                                                                     NSMigratePersistentStoresAutomaticallyOption : @YES,
                                                                                     NSInferMappingModelAutomaticallyOption : @YES,
                                                                                     NSMergeByPropertyObjectTrumpMergePolicy : @YES }
                                                                            error:&error];
Another observation on configuring my NSManagedObjectContext: passing
[psc addPersistentStoreWithType:NSSQLiteStoreType configuration:nil URL:self.storeURL options:nil error:&error];
allows the extension to read the store (but it still throws the error I'm having), whereas passing
[psc addPersistentStoreWithType:NSSQLiteStoreType
                  configuration:nil
                            URL:self.storeURL
                        options:@{ NSPersistentStoreUbiquitousContentNameKey : @"iCloudStore",
                                   NSMigratePersistentStoresAutomaticallyOption : @YES,
                                   NSInferMappingModelAutomaticallyOption : @YES }
                          error:&error];
results in the extension not reading any data whatsoever.
Side-note: psc is passed as
__weak NSPersistentStoreCoordinator *psc = self.managedObjectContext.persistentStoreCoordinator;
self.persistentStoreCoordinator = self.managedObjectContext.persistentStoreCoordinator;
What is happening here is that you have two different processes accessing the same Core Data store, each with its own NSPersistentStoreCoordinator.
One process has modified the store - most likely a delete. The other process has no way of knowing that this delete occurred, and already had an object in memory that pointed to the now-deleted data. When that process tries to access that object, it must go to the store to get the data ("firing a fault"), which no longer exists.
Core Data is not designed for this kind of use, and the capabilities of extensions are very different from those of applications. If your extension is able to write to the same data that the application can, you may want to reconsider your approach: make the extension read-only, and never hold the data in memory for long. This will at least mitigate the most common ways of running into these problems.
Replying to the above answer and the question:
"Core Data is not designed for this kind of use"
It totally is. The assessment is correct: something has likely been deleted in the actual app, and the extension is not aware. Fortunately, Core Data provides a way to deal with this case. Check out the stalenessInterval property of NSManagedObjectContext. It defines how long your in-memory cache is good for. If you're having problems because your in-memory cache goes out of date when an external process changes the on-disk store, simply set the staleness interval to 0 in your extension; that tells Core Data to always fetch fresh values from the store and ignore the in-memory cache.
If you're holding a reference to an object in memory and that object is deleted in the store, you may still have issues, so always check that the object you are accessing has not been deleted.
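A minimal sketch of that approach in the extension (the context is whatever your PersistentStack exposes, and `object` stands in for a previously fetched NSManagedObject):
// In the today extension: don't trust the in-memory cache, because the
// containing app may have changed the store behind our back.
self.managedObjectContext.stalenessInterval = 0;

// Before touching a previously fetched object, confirm it still exists
// instead of firing a fault that may no longer be fulfillable.
NSError *error = nil;
NSManagedObject *fresh = [self.managedObjectContext existingObjectWithID:object.objectID error:&error];
if (fresh == nil) {
    // The row was deleted by the containing app; drop the stale reference.
} else {
    [self.managedObjectContext refreshObject:fresh mergeChanges:NO]; // reload fresh values from the store
}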
There are a few other options if you want to get more detail. You could send notifications from your main app to your extension when things get saved, to provide a manual trigger for reloading your data. You could also send across the specific object IDs that have been deleted or modified and use the refreshObject... method to manually refresh. Or check out mergeChangesFromContextDidSaveNotification: - you might be able to manually serialize the dictionary it expects and then send it across.
You could also have the parent app handle all database access and send results back via notifications, but this might be unnecessary.
All of this requires a bit of work, but you're going to run into that with any database system where the database is being accessed across multiple processes and there is caching.
There are multiple issues with your code which are likely leading to a confused Core Data state. I can't be certain that they're causing your problem, but at the moment things are messed up badly enough that trying to debug this specific error is getting ahead of things. These problems include:
Confusion about how you're sharing data between your app and the extension.
You can do this using iCloud, or you can do it by using an app group to share the persistent store directly. You appear to be attempting both, which is not just unnecessary but also likely to cause problems keeping the extension up to date.
If you use iCloud to share data, you do not need the app group, because both the app and the extension will get their data from iCloud. You don't share a local persistent store in this case, instead the data is transferred via iCloud.
If you use an app group to share the persistent store file, you have no need of iCloud. The app and the extension both access the same file, which is located in the group container. Each can both read and write it.
Conflicting iCloud setups. You're using different values for NSPersistentStoreUbiquitousContentNameKey in different places. Even assuming that iCloud is working properly, this is going to prevent you from sharing data. If the app and extension are going to share via iCloud, they need to access the same cloud store, but you seem to be directing them to use separate cloud data stores.
You're not actually using the merge policy you're aiming for. You're passing NSMergeByPropertyObjectTrumpMergePolicy as one of the options when adding the persistent store file, but this isn't actually a valid option there. I would have expected at least a console message about this, but if there isn't one then it means Core Data is just silently ignoring that key. You set a merge policy by setting the value of the mergePolicy attribute on your managed object context. With no merge policy, you're falling back on the default NSErrorMergePolicy.
Unusual, suspicious code when adding the persistent store. In most cases with Core Data you'd add the persistent store and then later on create one or more managed object contexts. You appear to be creating the context first and only later adding a persistent store. That's not necessarily wrong but it's very unusual. If I were working on this code, it'd be a red flag to look over the life cycle of the Core Data stack very carefully. Especially if, as your code comments indicate, you're not getting any data at all in some cases.
Again I'm not sure if the above is directly causing this specific error, but they're going to be causing problems of various types and it wouldn't be surprising if the resulting confused state leads to this error.
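For the merge-policy point specifically, a corrected sketch (assuming the context and store URL come from your PersistentStack) would set the policy on the context and add the app-group store without the iCloud key:
// App-group store only; no NSPersistentStoreUbiquitousContentNameKey.
NSDictionary *options = @{ NSMigratePersistentStoresAutomaticallyOption : @YES,
                           NSInferMappingModelAutomaticallyOption : @YES };
NSError *error = nil;
[self.managedObjectContext.persistentStoreCoordinator addPersistentStoreWithType:NSSQLiteStoreType
                                                                    configuration:nil
                                                                              URL:self.storeURL
                                                                          options:options
                                                                            error:&error];

// The merge policy belongs on the managed object context, not in the store options.
self.managedObjectContext.mergePolicy = NSMergeByPropertyObjectTrumpMergePolicy;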

iOS Core Data data insert

I have an app which asynchronously downloads a JSON file and then it should insert those objects within Core Data for persistent storage. Regarding the insert, is it a good idea to do it from the main thread? What if there are thousands of objects? Should I do the inserts on a different thread? Could you provide me with some snippets regarding this matter? Regarding the fetching of the objects after I've saved them, should I also use a different thread?
My code for inserting into Core Data is:
- (void)insertObjects:(NSArray *)objects ofEntity:(NSString *)entityName
{
    NSString *key;
    NSManagedObject *managedObject;
    NSError *error;
    for (NSDictionary *dict in objects) {
        managedObject = [NSEntityDescription insertNewObjectForEntityForName:entityName
                                                      inManagedObjectContext:_managedObjectContext];
        for (key in dict) {
            [managedObject setValue:dict[key] forKey:key];
        }
    }
    [_managedObjectContext save:&error];
}
PS: The objects are of the same entity. The project runs on iOS 7.0 or higher.
Since I can't comment yet..
What iOS versions do you plan to support? If 5 and higher, this might help: Concurrency stack
Summary of the link:
you create a context of private concurrency type to access your physical data
based on this, you create a context of main concurrency type
on top of this, you use private concurrency type contexts again
Don't forget to save in every context; otherwise the data seems to be saved while the app is running, but after a restart it is lost.
And yes, you want to do it on an extra thread, since it would otherwise block the UI if there are too many items.
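A hedged sketch of that stack (the coordinator is your existing NSPersistentStoreCoordinator, and the insert step is elided to a comment):
// Root context on a private queue talks to the persistent store coordinator.
NSManagedObjectContext *writerContext =
    [[NSManagedObjectContext alloc] initWithConcurrencyType:NSPrivateQueueConcurrencyType];
writerContext.persistentStoreCoordinator = coordinator;

// Main-queue context for the UI, child of the writer context.
NSManagedObjectContext *mainContext =
    [[NSManagedObjectContext alloc] initWithConcurrencyType:NSMainQueueConcurrencyType];
mainContext.parentContext = writerContext;

// Throwaway private-queue context for the import, child of the main context.
NSManagedObjectContext *importContext =
    [[NSManagedObjectContext alloc] initWithConcurrencyType:NSPrivateQueueConcurrencyType];
importContext.parentContext = mainContext;

[importContext performBlock:^{
    // ... insert the downloaded JSON objects into importContext here ...
    NSError *error = nil;
    [importContext save:&error];           // pushes changes up to mainContext
    [mainContext performBlock:^{
        [mainContext save:NULL];           // pushes up to writerContext
        [writerContext performBlock:^{
            [writerContext save:NULL];     // finally writes to disk
        }];
    }];
}];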

What is the fastest way to load a large CSV file into core data

Conclusion
Problem closed, I think.
Looks like the problem had nothing to do with the methodology, but that Xcode did not clean the project correctly between builds.
It looks like after all those tests, the sqlite file that was being used was still the very first one, which wasn't indexed...
Beware of Xcode 4.3.2: I have had nothing but problems with Clean not cleaning, or files added to the project not automatically being added to the bundle resources...
Thanks for the different answers..
Update 3
Since I invite anybody to just try the same steps to see if they get the same results, let me detail what I did:
I start with a blank project
I defined a datamodel with one Entity, 3 attributes (2 strings, 1 float)
The first string is indexed
In application:didFinishLaunchingWithOptions:, I am calling:
[self performSelectorInBackground:@selector(populateDB) withObject:nil];
The code for populateDB is below:
- (void)populateDB {
    NSLog(@"start");
    NSPersistentStoreCoordinator *coordinator = [self persistentStoreCoordinator];
    NSManagedObjectContext *context;
    if (coordinator != nil) {
        context = [[NSManagedObjectContext alloc] init];
        [context setPersistentStoreCoordinator:coordinator];
    }
    NSString *filePath = [[NSBundle mainBundle] pathForResource:@"input" ofType:@"txt"];
    if (filePath) {
        NSString *myText = [[NSString alloc] initWithContentsOfFile:filePath
                                                           encoding:NSUTF8StringEncoding
                                                              error:nil];
        if (myText) {
            __block int count = 0;
            [myText enumerateLinesUsingBlock:^(NSString *line, BOOL *stop) {
                line = [line stringByReplacingOccurrencesOfString:@"\t" withString:@" "];
                NSArray *lineComponents = [line componentsSeparatedByString:@" "];
                if (lineComponents) {
                    if ([lineComponents count] == 3) {
                        float f = [[lineComponents objectAtIndex:0] floatValue];
                        NSNumber *number = [NSNumber numberWithFloat:f];
                        NSString *string1 = [lineComponents objectAtIndex:1];
                        NSString *string2 = [lineComponents objectAtIndex:2];
                        NSManagedObject *object = [NSEntityDescription insertNewObjectForEntityForName:@"Bigram"
                                                                                inManagedObjectContext:context];
                        [object setValue:number forKey:@"number"];
                        [object setValue:string1 forKey:@"string1"];
                        [object setValue:string2 forKey:@"string2"];
                        NSError *error;
                        count++;
                        if (count >= 1000) {
                            if (![context save:&error]) {
                                NSLog(@"Whoops, couldn't save: %@", [error localizedDescription]);
                            }
                            count = 0;
                        }
                    }
                }
            }];
            NSLog(@"done importing");
            NSError *error;
            if (![context save:&error]) {
                NSLog(@"Whoops, couldn't save: %@", [error localizedDescription]);
            }
        }
    }
    NSLog(@"end");
}
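One common refinement for imports of this size (a suggestion, not something the code above does) is to drain memory per line and drop already-saved objects after each batch:
[myText enumerateLinesUsingBlock:^(NSString *line, BOOL *stop) {
    @autoreleasepool {
        // ... parse the line and insert the managed object exactly as above ...
        if (++count >= 1000) {
            NSError *batchError = nil;
            if (![context save:&batchError]) {
                NSLog(@"Batch save failed: %@", [batchError localizedDescription]);
            }
            [context reset]; // release the just-saved objects from memory
            count = 0;
        }
    }
}];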
Everything else is default core data code, nothing added.
I run that in the simulator.
I go to ~/Library/Application Support/iPhone Simulator/5.1/Applications//Documents
There is the sqlite file that is generated
I take that and I copy it in my bundle
I comment out the call to populateDB
I edit persistentStoreCoordinator to copy the sqlite file from bundle to documents at first run
- (NSPersistentStoreCoordinator *)persistentStoreCoordinator
{
    @synchronized (self)
    {
        if (__persistentStoreCoordinator != nil)
            return __persistentStoreCoordinator;
        NSString *defaultStorePath = [[NSBundle mainBundle] pathForResource:@"myProject" ofType:@"sqlite"];
        NSString *storePath = [[[self applicationDocumentsDirectory] path] stringByAppendingPathComponent:@"myProject.sqlite"];
        NSError *error;
        if (![[NSFileManager defaultManager] fileExistsAtPath:storePath])
        {
            if ([[NSFileManager defaultManager] copyItemAtPath:defaultStorePath toPath:storePath error:&error])
                NSLog(@"Copied starting data to %@", storePath);
            else
                NSLog(@"Error copying default DB to %@ (%@)", storePath, error);
        }
        NSURL *storeURL = [NSURL fileURLWithPath:storePath];
        __persistentStoreCoordinator = [[NSPersistentStoreCoordinator alloc] initWithManagedObjectModel:[self managedObjectModel]];
        NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                                 [NSNumber numberWithBool:YES], NSMigratePersistentStoresAutomaticallyOption,
                                 [NSNumber numberWithBool:YES], NSInferMappingModelAutomaticallyOption, nil];
        if (![__persistentStoreCoordinator addPersistentStoreWithType:NSSQLiteStoreType configuration:nil URL:storeURL options:options error:&error])
        {
            NSLog(@"Unresolved error %@, %@", error, [error userInfo]);
            abort();
        }
        return __persistentStoreCoordinator;
    }
}
I remove the app from the simulator and check that ~/Library/Application Support/iPhone Simulator/5.1/Applications/ is now removed. I rebuild and launch again.
As expected, the sqlite file is copied over to ~/Library/Application Support/iPhone Simulator/5.1/Applications//Documents
However, the size of the file is significantly smaller than in the bundle!
Also, doing a simple query with a predicate like predicate = [NSPredicate predicateWithFormat:@"string1 == %@", string1]; clearly shows that string1 is not indexed anymore.
Following that, I create a new version of the data model, with a meaningless update, just to force a lightweight migration.
If run on the simulator, the migration takes a few seconds, the database doubles in size, and the same query now takes less than a second to return instead of minutes.
Forcing a migration would solve my problem, but that same migration takes 3 minutes on the iPad and happens in the foreground.
So that's where I am right now: the best solution for me would still be to prevent the indexes from being removed, since any other importing solution at launch time just takes too much time.
Let me know if you need more clarifications...
Update 2
So the best result I have had so far is to seed the Core Data database with the sqlite file produced by a quick tool with a similar data model, but without the indexes set when producing the sqlite file. Then I import this sqlite file into the Core Data app with the indexes set, allowing for a lightweight migration. For 2 million records on the new iPad, this migration still takes 3 minutes. The final app should have 5 times this number of records, so we're still looking at a long, long processing time.
If I go that route, the new question would be: can a lightweight migration be performed in the background?
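For what it's worth, the store addition (which is when the lightweight migration actually runs) can itself be dispatched off the main queue, so the UI can at least show a waiting screen; a rough sketch, assuming the coordinator and store URL already exist:
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], NSMigratePersistentStoresAutomaticallyOption,
                             [NSNumber numberWithBool:YES], NSInferMappingModelAutomaticallyOption, nil];
    NSError *error = nil;
    // Adding the store triggers the (potentially slow) lightweight migration.
    [coordinator addPersistentStoreWithType:NSSQLiteStoreType
                              configuration:nil
                                        URL:storeURL
                                    options:options
                                      error:&error];
    dispatch_async(dispatch_get_main_queue(), ^{
        // Migration (if any) is finished; safe to create contexts and show the UI.
    });
});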
Update
My question is NOT how to create a tool to populate a Core Data database and then import the sqlite file into my app. I know how to do this; I have done it countless times. But until now, I had not realized that such a method could have a side effect: in my case, an indexed attribute in the resulting database clearly got 'unindexed' when the sqlite file was imported that way.
If you have been able to verify that indexed data is still indexed after such a transfer, I am interested to know how you proceeded, or otherwise what the best strategy would be to seed such a database efficiently.
Original
I have a large CSV file (millions of lines) with 4 columns, strings and floats.
This is for an iOS app.
I need this to be loaded into Core Data the first time the app is launched.
The app is pretty much non-functional until the data is available, so loading time matters; a first-time user obviously does not want the app to take 20 minutes to load before being able to run it.
Right now, my current code takes 20 minutes on the new iPad to process a 2-million-line CSV file.
I am using a background context to avoid locking the UI, and I save the context every 1,000 records.
The first idea I had was to generate the database on the simulator and then copy/paste it into the Documents folder at first launch, as this is the common, unofficial way of seeding a large database. Unfortunately, the indexes don't seem to survive such a transfer, and although the database was available after just a few seconds, performance is terrible because my indexes were lost. I posted a question about the indexes already, but there doesn't seem to be a good answer to that.
So what I am looking for is either:
a way to improve performance when loading millions of records into Core Data
if the database is pre-loaded and moved at first startup, a way to keep my indexes
best practices for handling this kind of scenario. I don't remember using any app that requires me to wait for x minutes before first use (but maybe The Daily, and that was a terrible experience).
Any creative way to make the user wait without him realizing it: background import while going through tutorial, etc...
Not Using Core Data?
...
Pre-generate your database using an offline application (say, a command-line utility) written in Cocoa that runs on OS X and uses the same Core Data framework that iOS uses. You don't need to worry about "indexes surviving" or anything -- the output is a Core Data-generated .sqlite database file, directly and immediately usable by an iOS app.
As long as you can do the DB generation offline, it's the best solution by far. I have successfully used this technique to pre-generate databases for iOS deployment myself. Check my previous questions/answers for a bit more detail.
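A minimal shape for such an offline generator might look like the sketch below (an OS X command-line tool; the model path, store path, and CSV parsing are placeholders or elided):
// main.m of an OS X command-line tool linked against CoreData.framework.
#import <CoreData/CoreData.h>

int main(int argc, const char *argv[])
{
    @autoreleasepool {
        // Load the same compiled model (.momd) that the iOS app uses.
        NSURL *modelURL = [NSURL fileURLWithPath:@"/path/to/MyModel.momd"];
        NSManagedObjectModel *model = [[NSManagedObjectModel alloc] initWithContentsOfURL:modelURL];

        NSPersistentStoreCoordinator *psc =
            [[NSPersistentStoreCoordinator alloc] initWithManagedObjectModel:model];
        NSURL *storeURL = [NSURL fileURLWithPath:@"/tmp/Seed.sqlite"];
        NSError *error = nil;
        [psc addPersistentStoreWithType:NSSQLiteStoreType
                          configuration:nil
                                    URL:storeURL
                                options:nil
                                  error:&error];

        NSManagedObjectContext *context = [[NSManagedObjectContext alloc] init];
        [context setPersistentStoreCoordinator:psc];

        // ... parse the CSV here and insert managed objects, saving in batches ...

        if (![context save:&error]) {
            NSLog(@"Seed save failed: %@", error);
            return 1;
        }
        NSLog(@"Seed store written to %@", storeURL.path);
    }
    return 0;
}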
I'm just starting out with SQLite, and I need to integrate a DB into one of my apps that will have a lot of indexed data in a SQLite database. I was hoping there was some method by which I could bulk insert my information into a SQLite file and add that file to my project. After discovering and reading through your question, the provided answer and the numerous comments, I decided to check out the SQLite source to see if I could make heads or tails of this issue.
My initial thought was that the iOS implementation of SQLite is, in fact, throwing out your indices. The reason is that you initially create your DB index on an x86/x64 system. iOS runs on an ARM processor, and numbers are handled differently. If you want your indexes to be fast, you should generate them in such a way that they are optimized for the processor on which they will be searched.
Since SQLite targets multiple platforms, it would make sense to drop any indices that were created on another architecture and rebuild them. However, since no one wants to wait for an index to rebuild the first time it is accessed, the SQLite devs most likely decided to just drop the index.
After digging into the SQLite code, I've come to the conclusion that this is most likely what is happening. If not for the processor-architecture reason, I did find code (see analyze.c and other meta-information in sqliteInt.h) where indices were being deleted if they were generated under an unexpected context. My hunch is that the context that drives this process is how the underlying b-tree data structure was constructed for the existing key. If the current instance of SQLite can't consume the key, it deletes it.
It is worth mentioning that the iOS Simulator is just that -- a simulator. It is not an emulator of the hardware. As such, your app is running in a pseudo-iOS device, running on an x86/x64 processor.
When your app and SQLite DB are loaded onto your iOS device, an ARM-compiled variant is loaded, which also links to the ARM-compiled libraries within iOS. I couldn't find ARM-specific code associated with SQLite, so I imagine Apple had to modify it to suit their needs. That could also be part of the problem. This may not be an issue with the root SQLite code; it could be an issue with the Apple/ARM-compiled variant.
The only reasonable solution I can come up with is to create a generator application that you run on your iOS device. Run the application, build the keys, and then rip the SQLite file from the device. I'd imagine such a file would work across all devices, since all ARM processors used by iOS are 32-bit.
Again, this answer is a bit of an educated guess. I'm going to re-tag your question as SQLite. Hopefully a guru will find this and be able to weigh in on the issue. I'd really like to know the truth for my own benefit.

NSPersistentStoreCoordinator - how to handle schema incompatibility errors?

Every time I change the Core Data model for my app, it generates an unrecoverable error at the next startup: "The model used to open the store is incompatible with the one used to create the store".
The only reliable way I have found to avoid this is to manually delete the app and let Xcode reinstall it, or to use other techniques to manually blow away the Core Data .sqlite store file. This is obviously not viable for shipping to users.
Apple's default App Delegate template for initializing the NSPersistentStoreCoordinator includes this comment:
__persistentStoreCoordinator = [[NSPersistentStoreCoordinator alloc] initWithManagedObjectModel:[self managedObjectModel]];
if (![__persistentStoreCoordinator addPersistentStoreWithType:NSSQLiteStoreType configuration:nil URL:storeURL options:nil error:&error]) {
    /*
     TODO: Replace this implementation with code to handle the error appropriately.
     ...
     If you encounter schema incompatibility errors during development, you can reduce their frequency by:

     * Simply deleting the existing store:
       [[NSFileManager defaultManager] removeItemAtURL:storeURL error:nil]

     * Performing automatic lightweight migration by passing the following dictionary as the options parameter:
       [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithBool:YES], NSMigratePersistentStoresAutomaticallyOption, [NSNumber numberWithBool:YES], NSInferMappingModelAutomaticallyOption, nil];
     */
I have not been able to find any sample or instructions on how to "handle the error appropriately" though.
I would be happy to simply blow away the database in this case and let the app regenerate it from online data. This is what I do when I blow the database away manually. But how can I do that automatically in response to this error condition?
Are there any good samples explaining best practices?
The best way is to use lightweight migration. See the Apple documentation: http://developer.apple.com/library/mac/#documentation/Cocoa/Conceptual/CoreDataVersioning/Articles/vmLightweightMigration.html.
When you need to make changes to the model in a new app version, you create a new model version. You do this in Xcode: just select your current model and choose Editor/Add Model Version… from the menu. Without this, automatic migration will not work.
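If, after wiring up lightweight migration, you still want the fallback the question asks about (blow the store away on an incompatibility error and let the app rebuild it from online data), a hedged sketch of that error path, reusing the template's variables:
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                         [NSNumber numberWithBool:YES], NSMigratePersistentStoresAutomaticallyOption,
                         [NSNumber numberWithBool:YES], NSInferMappingModelAutomaticallyOption, nil];
if (![__persistentStoreCoordinator addPersistentStoreWithType:NSSQLiteStoreType configuration:nil URL:storeURL options:options error:&error]) {
    // Lightweight migration failed (e.g. the model change was not inferable):
    // delete the incompatible store and try once more with a fresh, empty one.
    [[NSFileManager defaultManager] removeItemAtURL:storeURL error:nil];
    if (![__persistentStoreCoordinator addPersistentStoreWithType:NSSQLiteStoreType configuration:nil URL:storeURL options:options error:&error]) {
        NSLog(@"Unresolved error %@, %@", error, [error userInfo]);
        abort();
    }
}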
