iCloud Core Data Reliability & Timing - iOS

I have been attempting to implement iCloud in my Core Data based small business apps, using a GitHub project called Ubiquity Store Manager (USM) as well as more generic Apple code example methods. It almost seems to work... but there are two major issues that I can't seem to consistently address:
Timing - When the context is saved to the ubiquity container, it is beyond your control when it is uploaded to iCloud. If two transactions are saved less than 3-5 seconds apart, they are often uploaded to iCloud in the reverse of the chronological order in which they were entered/saved. For example: with trans1 saved at 8:01:01 and trans2 at 8:01:04, trans2 will often upload and download onto other devices BEFORE trans1. If these are simple records like appointments or contacts, that's probably not a big deal. With parent-child related records it's a very big deal, as the child records arrive before their parents and are effectively "lost" in iCloud. I have tried adding a timer between transactions; a 5-7 second delay eliminates the problem, but is there a better way to handle this?
Reliability - When testing on 2 devices after a pause of as little as 2 minutes, if 2 successive transactions are saved, frequently the first transaction will not be displayed on the 2nd device. If a "wake up" transaction is created prior to the entry of the real transaction, reliability can be restored. Again, this is a kludgy solution; does anyone have a better way to handle this?
Key-value iCloud transactions are almost instantaneous, error-free, and bulletproof. How can this be achieved using Core Data, or is Core Data just not appropriate for complex (multiple-relationship) business transactions?
Thanks for any help or ideas!

Related

How to keep one unique object synced across devices with iCloud?

The original question was "How to keep one unique object synced across devices in near real-time with iCloud?", and based on the comments that seems impossible with iCloud alone.
What if the acceptable delay is less than 30 seconds? Would that be possible with iCloud or CloudKit? How can it be achieved?
I have a user object containing multiple properties updated by the user at a relatively high frequency (1 update per second), and it needs to be synced across devices.
NSPersistentCloudKitContainer
I have tried a regular Core Data entity, but it seems hard to enforce having only one instance of the entity. A simple fetch-and-update would result in multiple instances of the same object unless some custom merging logic is added.
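One way to add that merging logic is a fetch-or-create helper plus deduplication. Below is a minimal sketch, assuming a hypothetical UserState entity with a createdAt attribute; it keeps a deterministic survivor and deletes any duplicates that CloudKit sync imports from other devices.

```swift
import CoreData

// Sketch: enforce a single "UserState" instance under
// NSPersistentCloudKitContainer. The entity name and createdAt
// attribute are illustrative assumptions.
func fetchOrCreateUserState(in context: NSManagedObjectContext) throws -> NSManagedObject {
    let request = NSFetchRequest<NSManagedObject>(entityName: "UserState")
    // Deterministic ordering so every device keeps the same survivor.
    request.sortDescriptors = [NSSortDescriptor(key: "createdAt", ascending: true)]

    let results = try context.fetch(request)
    guard let survivor = results.first else {
        // No instance yet: create the one and only object.
        return NSEntityDescription.insertNewObject(forEntityName: "UserState",
                                                   into: context)
    }
    // CloudKit sync can import copies created on other devices, so
    // merge anything newer into the survivor and delete the duplicates.
    for duplicate in results.dropFirst() {
        // ...copy newer attribute values onto `survivor` here...
        context.delete(duplicate)
    }
    return survivor
}
```

A common place to run this is in an observer for the .NSPersistentStoreRemoteChange notification, with persistent history tracking enabled on the store description.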
NSUbiquitousKeyValueStore
I also explored syncing UserDefaults-style data with NSUbiquitousKeyValueStore; its documentation page prominently warns against using it for values that change very frequently:
The key-value store is intended for storing data that changes infrequently. As you test your devices, if the app on a device makes frequent changes to the key-value store, the system may defer the synchronization of some changes to minimize the number of round trips to the server.
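For comparison, basic key-value store usage looks like the sketch below (the key name is illustrative). Coalescing writes on the client is one way to respect the "changes infrequently" guidance, but it still won't make KVS suitable for once-per-second updates.

```swift
import Foundation

let store = NSUbiquitousKeyValueStore.default

// Write a value; iCloud uploads it at a time of the system's choosing.
store.set(42 as Int64, forKey: "highScore") // key name is illustrative
store.synchronize()                         // a hint to push pending changes soon

// Observe changes made on other devices.
NotificationCenter.default.addObserver(
    forName: NSUbiquitousKeyValueStore.didChangeExternallyNotification,
    object: store,
    queue: .main
) { note in
    if let keys = note.userInfo?[NSUbiquitousKeyValueStoreChangedKeysKey] as? [String] {
        print("Keys changed remotely: \(keys)")
    }
}
```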

The best way to handle erratic data on iOS

I am working on an application that connects to a database. The database contains from 300MB to 4GB of data, as each customer has their own database. The issue I am having is in gathering the data: because of the potential database size, just downloading and storing the information locally isn't possible. The data can get quite complex and can vary. For example:
A customer has a Job and they want to search for that job from the app.
I then fetch a list of jobs matching the search criteria.
The customer sees the job they want to view and I start the gathering process.
This job can potentially touch many tables, sometimes repeatedly.
There is the jobs table and a relational table to map the job to a person. Then there is another table that contains non-customer relational information, then there are calendar events associated with the job, which in turn can associate different people. Then there are emails attached to the job, which in turn can bring in additional people and events.
So I have a working model that gathers all of this information. The problem I have is that I cannot figure out a good method of signaling to my view that the data is completely downloaded. My initial thought was to use NotificationCenter to message when certain parts of the task were finished, allowing the core Job object to notify the view when everything was complete.
I know this is a pretty generalized question, but I'm honestly stumped as to how to take an unknown number of table results and translate that into a notice that my app can actually use.
My initial recommendation would be Core Data. It's designed for this kind of problem. No, I'm not saying to download the entire database into Core Data. I'm saying to use Core Data to manage your object model, because that's what it's good at.
As you receive data from the server, compose it into NSManagedObjects and stick them in the data store. On the UI side, create an NSFetchedResultsController to keep you informed as the data updates asynchronously. You don't necessarily need to persist this store. You could just keep it in memory and throw it away whenever you're done with the query, but keeping it on disk could be a nice caching solution. Again, don't think of Core Data as "a local database." Think of it as a model persistence engine that you can query for objects.
One advantage of this model is that you can provide the best available data to the user as it becomes available. But say you really don't want to show the information until it's all available. That's fine, too. Just let the network side keep updating its context, and then only save it when everything's complete. That way NSFetchedResultsController gets a single atomic update. The nice thing about Core Data is that it has these concepts built in, so you can adjust your update strategy without requiring a massive redesign.
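A minimal sketch of that flow, assuming an NSPersistentContainer and a hypothetical Job entity with a name attribute: the network layer saves into a background context, and the fetched results controller's delegate fires as the view context picks up the changes.

```swift
import CoreData
import UIKit

// Sketch only: the "Job" entity, its "name" attribute, and the
// container are assumptions, not part of the original question.
final class JobListController: UITableViewController, NSFetchedResultsControllerDelegate {
    let container: NSPersistentContainer

    lazy var results: NSFetchedResultsController<NSManagedObject> = {
        let request = NSFetchRequest<NSManagedObject>(entityName: "Job")
        request.sortDescriptors = [NSSortDescriptor(key: "name", ascending: true)]
        let frc = NSFetchedResultsController(fetchRequest: request,
                                             managedObjectContext: container.viewContext,
                                             sectionNameKeyPath: nil,
                                             cacheName: nil)
        frc.delegate = self
        return frc
    }()

    init(container: NSPersistentContainer) {
        self.container = container
        super.init(style: .plain)
        // Merge background-context saves into the UI context automatically.
        container.viewContext.automaticallyMergesChangesFromParent = true
        try? results.performFetch()
    }

    required init?(coder: NSCoder) { fatalError("not supported") }

    // Called by the networking layer as server responses arrive.
    func importJobs(_ payloads: [[String: Any]]) {
        container.performBackgroundTask { context in
            for payload in payloads {
                let job = NSEntityDescription.insertNewObject(forEntityName: "Job",
                                                              into: context)
                job.setValue(payload["name"], forKey: "name")
            }
            // One save = one atomic update delivered to the UI.
            try? context.save()
        }
    }

    // The fetched results controller reports every change to the data set.
    func controllerDidChangeContent(_ controller: NSFetchedResultsController<NSFetchRequestResult>) {
        tableView.reloadData()
    }
}
```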
The Notification Center will work great for this.
Post the notification at logical points in your data load to trigger a UI update for your users.
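For example, a sketch with an illustrative notification name:

```swift
import Foundation

extension Notification.Name {
    // Illustrative name for the "everything is downloaded" signal.
    static let jobLoadDidComplete = Notification.Name("JobLoadDidComplete")
}

// In the data-gathering code, post at each logical completion point:
func finishedGatheringJob() {
    NotificationCenter.default.post(name: .jobLoadDidComplete, object: nil)
}

// In the view controller, observe on the main queue and refresh the UI:
let observer = NotificationCenter.default.addObserver(
    forName: .jobLoadDidComplete, object: nil, queue: .main
) { _ in
    // reload the table view / hide the loading indicator here
}
```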

Core Data iCloud transaction logs

I'm testing Core Data and iCloud with UIManagedDocument and ubiquity options (NSPersistentStoreUbiquitousContentNameKey and NSPersistentStoreUbiquitousContentURLKey).
Everything is working OK. My devices get synced without problems and in a reasonable time. The DB is small (below 100K).
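For context, the setup under test looks roughly like this sketch of the (now-deprecated) NSPersistentStoreUbiquitous* API; the document name and content name are illustrative:

```swift
import UIKit
import CoreData

// Sketch of the deprecated iCloud + UIManagedDocument setup described
// above. The document name and "MyAppStore" content name are illustrative.
let documentURL = FileManager.default
    .urls(for: .documentDirectory, in: .userDomainMask)[0]
    .appendingPathComponent("MyDocument")

let document = UIManagedDocument(fileURL: documentURL)
document.persistentStoreOptions = [
    NSPersistentStoreUbiquitousContentNameKey: "MyAppStore",
    NSMigratePersistentStoresAutomaticallyOption: true,
    NSInferMappingModelAutomaticallyOption: true
]

// openWithCompletionHandler in Objective-C; this call is where the long
// wait shows up after a reinstall, while iCloud replays transaction logs.
document.open { success in
    print(success ? "Document opened" : "Open failed")
}
```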
As I said, I'm testing the app and making a lot of changes to the DB, and as a result a lot of transaction logs are generated. The problem I have is that if I delete and reinstall the app on one of the devices used for testing (without deleting the iCloud data), the app takes a very long time to open the document. openWithCompletionHandler takes minutes, and sometimes never finishes. If I turn on debugging (-com.apple.coredata.ubiquity.logLevel 3) I can see that there is a long wait, after which the DB is reconstructed from the transaction logs.
If I remove the iCloud data and reinsert the data on the first device, the second one syncs without problems. Because of that, I think the reason for the delay is the high number of transaction logs (20-30 while testing, as I can see on developer.icloud.com).
According to "Managing Core Data iCloud Transaction Logs" in the documentation, Core Data should handle this automatically, but I can't see any deletion happening. Perhaps that needs some more time.
My questions are: Do transaction logs ever get consolidated? Can I force the consolidation of logs? Is there another recommended option?
I only store the subset of essential information needed for syncing in the iCloud Core Data file. I have another local file with the full DB, so I can reconstruct the iCloud DB without any major loss of information. Perhaps I could delete the iCloud DB when I detect a bunch of logs and re-create it. Do you think this is a good option?
Thank you for helping.
Do transaction logs ever get consolidated?
That is how it's supposed to work.
Can I force the consolidation of logs?
No. There is no API that directly affects the existence of transaction logs. The iCloud system will consolidate them at some point, but there's no documentation regarding when that happens, and you can't force it.
Is there another recommended option?
You can limit the number of transaction logs indirectly: save changes less frequently. A transaction log corresponds to one save operation in Core Data. It may not make much of a difference, though; honestly, 20-30 transaction logs is not very many. You might be able to reduce the number of log files, but you'll still have the same amount of data in them.
Transaction logs aren't really your problem. As you observed, there's a long wait before iCloud starts running through the transaction logs. During that delay, iCloud is communicating with Apple's servers and downloading the transaction logs. Some of this is affected by network speed and latency, and the rest is just the way iCloud is.

Bootstrapping data at application startup with Simperium

As someone who experienced the pain of iCloud while trying to prototype iCloud-enabling one of our Core Data apps, I find Simperium very promising, but I'm interested in seeing how it handles some of the sharp edges.
One issue I came across was how to gracefully handle bootstrapping data when the application starts up. The first time a user launches our app, we load some default data into our Core Data database. If a user launches the app first on the iPhone and then later on the iPad, they end up with the bootstrap data duplicated on both devices because of syncing. With iCloud, the solution was to hook into the iCloud merge process.
How would I handle this with Simperium?
There are at least a couple ways to do this.
You can hardcode the simperiumKey for each seeded object. For example, in a notes app, if every new user gets a welcome note, you can locally create that note with the simperiumKey of welcomeNote. This will ensure that only one welcome note will ever exist in that user's account (on any device). With this approach, there can be some redundant data transfer, so it's best if there's not a large amount of seeded data. On the other hand, this approach is good if you want data to be immediately available to new users even if they're offline when they first launch your app.
With Simperium, you also have the option to use a server process. You can seed new user accounts with data by using a Python or Ruby listener that runs some code when accounts are created. This is a good approach if there's a large amount of data, but has the disadvantage that users need to be online before the seeded data will transfer (and of course the transfer itself will take some time).
There are subtleties with these approaches. With the first approach, using the welcomeNote example, if your user deletes the welcomeNote and subsequently reinstalls your app in the future, the welcomeNote will get resurrected (but never duplicated) because it's being created locally. This is often acceptable. With the second approach, the welcomeNote would be seeded once and only once, so it will never get resurrected even if your app is reinstalled.
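A sketch of the first approach, going by the answer's description that a seeded object can be created locally with a fixed simperiumKey; the Note entity and its attributes are illustrative, and the exact way Simperium exposes simperiumKey should be checked against its documentation:

```swift
import CoreData

// Sketch of seeding with a hardcoded simperiumKey, per the first
// approach above. The "Note" entity and its attributes are illustrative,
// and it is assumed (check Simperium's docs) that simperiumKey can be
// assigned on a locally created object before its first save.
func seedWelcomeNote(in context: NSManagedObjectContext) throws {
    let note = NSEntityDescription.insertNewObject(forEntityName: "Note", into: context)
    note.setValue("welcomeNote", forKey: "simperiumKey") // fixed key = never duplicated
    note.setValue("Welcome to the app!", forKey: "content")
    try context.save()
    // Every device seeds the same key, so Simperium resolves them to a
    // single object in the account instead of creating duplicates.
}
```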

iOS app - architecture/sample for synchronizing CoreData against Web-Service

I am looking for either a sample app or a more architectural discussion of building an app which maintains a local persistent store (Core Data) and keeps it in sync with a web service like Flickr. In my case it is Salesforce, but the pattern should be similar for many apps built on Flickr, Twitter, IMAP and so on.
Sample questions:
where are the best points to invoke the syncing?
what are proven data structures for maintaining local changes, e.g. a "changed" BOOL in the local store for every unsynced change (I would prefer a field-level flag over a record-level flag)?
Of course I have to optimize this on my own, knowing the number of records (hundreds), the number of changes (tens per day) and the probability of conflicts (low in my case at the field level).
Here's how I would approach this:
Start by modeling a local Core Data/SQLite database that mirrors your online database.
Add an NSDate lastModified property to every row of each table. This will allow you to track changes at the record level instead of the field level, which helps reduce sync complexity; in most real-world scenarios, record-level syncing is sufficient.
Perform an automatic sync when the app starts, and also provide a prominent "Sync" button in your navigation bar. This way the user always has an updated dataset when the app launches after a long period, and can sync the latest changes over the course of the day. I would avoid doing background sync while the app is being used; it will make your app more complex and error-prone while you're trying to tackle other things, so postpone working on background/automatic sync until you have the rest working.
Once I had sync working reasonably well at launch and on demand, I would try to support background sync. I would also try to eliminate the "Sync" button, so the user never has to think about syncing (it's always up to date as far as the user is concerned). But this is a longer-term enhancement, which I would attempt only after "on-demand" syncing is working rock solid.
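Here is a minimal sketch of the record-level, lastModified-based tracking from step 2; the Job entity, its attributes, and the upload callback are illustrative assumptions:

```swift
import CoreData
import Foundation

// Sketch of record-level change tracking via lastModified (step 2).
// The "Job" entity, its attributes, and the upload callback are
// illustrative assumptions.
func pushLocalChanges(since lastSync: Date,
                      in context: NSManagedObjectContext,
                      upload: ([NSManagedObject]) -> Void) throws {
    let request = NSFetchRequest<NSManagedObject>(entityName: "Job")
    // Only records touched since the last successful sync are sent.
    request.predicate = NSPredicate(format: "lastModified > %@", lastSync as NSDate)
    upload(try context.fetch(request))
}

func applyRemoteRecord(_ remote: [String: Any], to local: NSManagedObject) {
    let remoteDate = remote["lastModified"] as? Date ?? .distantPast
    let localDate = local.value(forKey: "lastModified") as? Date ?? .distantPast
    // Record-level conflict resolution: the newest write wins.
    if remoteDate > localDate {
        local.setValue(remote["name"], forKey: "name")
        local.setValue(remoteDate, forKey: "lastModified")
    }
}
```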
Hope this helps you get started. I would love to hear if you think differently about any of this.
