Migrating iCloud Core Data manually

I have an app that reads wind readings at sites around the world. I decided to use iCloud and Core Data using a shoe-box style app.
The wind readings update hourly. After a few weeks of using the app I realised this was a bad idea, as the iCloud/Core Data store just fills up with megabytes of transactions, and restoring to a fresh device takes 10 minutes to download the store.
My solution to this was to use Core Data configurations, so that the "sites" are stored in the iCloud store while the hourly changing "wind readings" (which get deleted after 12 hours) are stored in a local store. If it makes it easier to imagine, it works much like RSS: "sites" and "entries" that change hourly.
This all works great, but I can't work out how to write the migration code for the 2.0 version of my app. After reading up on how configurations work, I had to remove the parent/child relationship between sites and wind readings and use fetch requests to link them up via a common siteIdentifier UUID.
Doing it this way, I assume I cannot use lightweight migration? Also, loading the versioned .momd model file just gives me the latest model, so how do I get hold of the original model to open the store and do everything manually?
On the other hand, is this just too complicated? Would I be better off removing iCloud support, or is there another way you'd recommend?

You should be able to use a lightweight migration in this situation.
The reason is, as far as your 'iCloud' configuration is concerned, you are just deleting an entity and dropping a property (i.e. dropping a table and a column). Automatic migration can handle that just fine.
However...
There is a catch. It won't copy your existing data to the 'local' configuration for you, so you will need to do that manually before the migration. Here are the basic steps (a sketch of the first two follows the list):
1. Determine whether this migration needs to occur.
2. Copy the sqlite file to "local.sqlite".
3. Stand up the iCloud configuration; this will delete the readings.
4. Stand up the local configuration; this will delete the sites.
5. Test, test, test again, and keep testing.
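A minimal sketch of steps 1 and 2, assuming the store lives in the Documents directory; the file names ("Store.sqlite", "local.sqlite") are placeholders for your own. It compares the store's metadata against the current merged model to decide whether the copy is needed:

    // Check the existing store against the current model and, if a
    // migration is pending, duplicate the store so the local-only
    // readings survive the iCloud configuration's migration.
    - (void)prepareLocalStoreCopyIfNeeded
    {
        NSFileManager *fm = [NSFileManager defaultManager];
        NSURL *docs = [[fm URLsForDirectory:NSDocumentDirectory
                                  inDomains:NSUserDomainMask] lastObject];
        NSURL *storeURL = [docs URLByAppendingPathComponent:@"Store.sqlite"];
        NSURL *localURL = [docs URLByAppendingPathComponent:@"local.sqlite"];

        if (![fm fileExistsAtPath:[storeURL path]]) return; // fresh install

        // Step 1: is the old store incompatible with the v2 model?
        NSError *error = nil;
        NSDictionary *metadata = [NSPersistentStoreCoordinator
            metadataForPersistentStoreOfType:NSSQLiteStoreType
                                         URL:storeURL
                                       error:&error];
        NSManagedObjectModel *model =
            [NSManagedObjectModel mergedModelFromBundles:nil];
        if (!metadata || [model isConfiguration:nil
                    compatibleWithStoreMetadata:metadata]) {
            return; // no migration pending
        }

        // Step 2: copy the store aside before the lightweight migration
        // drops the readings from the iCloud store. (If the store uses
        // WAL journaling, copy the -wal/-shm files as well.)
        [fm removeItemAtURL:localURL error:NULL];
        if (![fm copyItemAtURL:storeURL toURL:localURL error:&error]) {
            NSLog(@"Store copy failed: %@", error);
        }
    }

As for the asker's sub-question: the compiled .momd bundle contains one .mom file per version, so an individual (original) model version can be loaded directly; the model names here are hypothetical:

    NSURL *v1URL = [[NSBundle mainBundle] URLForResource:@"Model"
                                           withExtension:@"mom"
                                            subdirectory:@"Model.momd"];
    NSManagedObjectModel *v1Model =
        [[NSManagedObjectModel alloc] initWithContentsOfURL:v1URL];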


Periodic iCloud backup of SQLite database

Let me get this out of the way right now: yes, it was almost certainly a mistake to not use Core Data. However, I was new to iOS development when I made these decisions, and I had no idea I'd be hamstrung like this. Moreover, the app is intended to also run on Android (eventually), so I avoided platform-specific APIs wherever possible.
I have an iOS app that stores data in a local SQLite database file. The data stored in the file is provided by the user, so it's important that it be kept safe. I had plans to "do this later", and later is now here. I am quickly coming to the realization that it won't be as straightforward as I had hoped...
I now understand that it won't be possible to seamlessly synchronize data across devices, and I'm willing to accept that limitation until I manage to migrate to Core Data. However, in the meantime I'd at least like the SQLite database to be backed up periodically so users can feel safe using the app on a single device. I was thinking I would do this:
- periodically (e.g. once a week) copy the SQLite file from local storage into cloud storage, thus ensuring it is backed up
- when the app starts, if the local store is missing or corrupted but the file exists in cloud storage, ask the user if they would like to copy it over
The biggest problem with this approach is that the user could run the app on multiple devices and therefore the data stored in iCloud could be from any one of those devices, but only one. To combat that, I thought I could just use a per-device, unique name for the file in cloud storage. I would generate this using UIDevice.identifierForVendor.
So my startup logic would be:
1. Determine the unique name for the cloud file.
2. Is the local file missing or corrupted, and if so, does the cloud file exist?
2.1. Ask the user if they would like to restore from the cloud file. Make it really hard for them to say no, because doing so will lose all their data.
2.2. If they say yes, copy the cloud file to local storage.
3. Open the local database file.
And running in the background I would occasionally copy the database file from local to cloud storage.
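That copy step could look roughly like this; a sketch assuming the app has an iCloud container entitlement, with hypothetical file names. iCloud documents should be written through NSFileCoordinator, and URLForUbiquityContainerIdentifier: should be called off the main thread:

    // Hypothetical backup step: copy the local SQLite file into the
    // app's ubiquity container. Run this on a background queue.
    - (void)backupDatabaseToCloud
    {
        NSFileManager *fm = [NSFileManager defaultManager];
        NSURL *container = [fm URLForUbiquityContainerIdentifier:nil];
        if (!container) return; // iCloud not available / not signed in

        NSURL *docs = [[fm URLsForDirectory:NSDocumentDirectory
                                  inDomains:NSUserDomainMask] lastObject];
        NSURL *localDB = [docs URLByAppendingPathComponent:@"app.sqlite"];
        NSURL *cloudDB = [[container URLByAppendingPathComponent:@"Documents"]
                           URLByAppendingPathComponent:@"backup.sqlite"];

        NSFileCoordinator *coordinator =
            [[NSFileCoordinator alloc] initWithFilePresenter:nil];
        NSError *coordError = nil;
        [coordinator coordinateWritingItemAtURL:cloudDB
                                        options:NSFileCoordinatorWritingForReplacing
                                          error:&coordError
                                     byAccessor:^(NSURL *newURL) {
            NSError *copyError = nil;
            [fm removeItemAtURL:newURL error:NULL];
            if (![fm copyItemAtURL:localDB toURL:newURL error:&copyError]) {
                NSLog(@"Backup failed: %@", copyError);
            }
        }];
    }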
I would like to know whether this is a sensible approach until I do the Core Data integration. Also, are there any hidden "gotchas" that I'm perhaps missing?
UPDATE: as @TomHarrington pointed out in a comment, it turns out my database file is already sitting in /Documents, which is backed up to iTunes and any iCloud account. So my question morphs into this:
Should I simply ensure my database has a device-specific name so that it is not clobbered by the app running on another device connected to the same iCloud account?
I'm going to answer my question, since I ended up going down this path and finding a MASSIVE blocker. There is a bug in the UIDevice.identifierForVendor API that causes it to regenerate every time a new version of the app is installed! See here. This of course rules out using it as a device identifier. sigh
I think I'm SOL with that approach. Instead, I might generate a GUID on first execution and use that as my identifier. Problem is, I need to store that somewhere that isn't backed up to iCloud.
Ugh, I may just give up here and say my app can't be run on multiple devices until Core Data integration is done.
UPDATE: I ended up generating an identifier on first run and storing it in the keychain (as a local entry only so it isn't backed up to iCloud).
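That final approach condenses to something like the following; the service name is hypothetical, and kSecAttrAccessibleAfterFirstUnlockThisDeviceOnly is what keeps the item from being restored onto a different device:

    #import <Security/Security.h>

    // Hypothetical helper: fetch (or create on first run) a per-device
    // ID stored as a this-device-only keychain item.
    NSString *DeviceIdentifier(void)
    {
        NSDictionary *query = @{
            (__bridge id)kSecClass: (__bridge id)kSecClassGenericPassword,
            (__bridge id)kSecAttrService: @"com.example.app.deviceid",
            (__bridge id)kSecReturnData: @YES,
        };
        CFTypeRef result = NULL;
        if (SecItemCopyMatching((__bridge CFDictionaryRef)query, &result)
                == errSecSuccess) {
            NSData *data = (__bridge_transfer NSData *)result;
            return [[NSString alloc] initWithData:data
                                         encoding:NSUTF8StringEncoding];
        }

        // First run: generate a GUID and store it device-only, so it
        // is never restored onto a different device.
        NSString *guid = [[NSUUID UUID] UUIDString];
        NSDictionary *attrs = @{
            (__bridge id)kSecClass: (__bridge id)kSecClassGenericPassword,
            (__bridge id)kSecAttrService: @"com.example.app.deviceid",
            (__bridge id)kSecValueData:
                [guid dataUsingEncoding:NSUTF8StringEncoding],
            (__bridge id)kSecAttrAccessible:
                (__bridge id)kSecAttrAccessibleAfterFirstUnlockThisDeviceOnly,
        };
        SecItemAdd((__bridge CFDictionaryRef)attrs, NULL);
        return guid;
    }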

Multi-Dimension Lookup Table

I need to define a large amount of data to be stored within an app and used as a lookup table. For instance, I have an array of manufacturer names, each with a mfg code. Each manufacturer can make different products, each with their own code as well.
A,7 could be deciphered to mean
Manufacturer: Apple(A)
Product: MacMini(7)
I see several ways of defining this, but I'm not sure which would be best.
Option 1) #define these constants in a separate header file such as:
    #define MFG_APPLE @"A"
    #define MFG_DELL @"B"
    #define PRODUCT_MAC_MINI 7
    #define PRODUCT_INSPIRON 2
Option 2) create a dictionary object filled with dictionary objects, to allow me to index through them more easily.
Option 3) use Core Data to create a database of these mfgs and products and relationships.
If option 2 or 3 is suggested, are there easy ways to pre-populate these data structures instead of hard-coding them to populate during program startup? (See the plist sketch after this list.)
Option 4) Create a web service to tie this back to a server, where the data can be updated more often. A JSON query will send the mfg and product codes to the server, where it can respond with the mfg and product names.
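For option 2, the easiest pre-population route is to ship the nested dictionaries as a bundled plist and load it at startup. A minimal sketch, assuming a hypothetical "Manufacturers.plist" keyed by mfg code, with a "name" string and a "products" dictionary per manufacturer:

    // Load the lookup table shipped in the app bundle (file name hypothetical).
    NSURL *plistURL = [[NSBundle mainBundle] URLForResource:@"Manufacturers"
                                              withExtension:@"plist"];
    NSDictionary *lookup = [NSDictionary dictionaryWithContentsOfURL:plistURL];

    // Decipher "A,7": outer key is the mfg code, inner key the product code.
    NSDictionary *mfg = lookup[@"A"];
    NSString *mfgName = mfg[@"name"];                // e.g. "Apple"
    NSString *productName = mfg[@"products"][@"7"];  // e.g. "MacMini"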
You should consider the following: if the database is shipped with the app, you will have to release an update for the app each time the database must be updated. So the question is, how frequently will you have to update the data? If it's fine to update the database once every couple of months, or maybe just once a year, shipping the database with your app might be an option; if you need to update it every month or even weekly, you should definitely host the database somewhere on the web, since releasing an update at such short intervals is not feasible.
Another thing you should consider: if the database exists solely as a web service and each lookup requires a JSON call to the server, it won't be possible to perform a lookup while the user is offline (currently has no network access, for whatever reason). Each lookup also costs the user traffic, so if the user has a monthly limit yet needs to perform plenty of lookups a day, using your app may quickly cause him to exceed that limit, leaving him without any Internet service (or a very throttled one).
From my experience, it is best to host such a database online, yet cache it for offline access if possible. The app itself ships with a database copy that was up to date the day you built the app for distribution. Each time the app is started (and maybe once a day, in case the app is never quit), it queries a web server for the current "version" of the database. If this version is newer than the one shipped with the app, it downloads a copy of the new database to its local cache and switches to the cached copy for future lookups. If the cached copy gets lost (caches may be flushed by the system at any time), it will have to be re-downloaded; in the meantime, the app can use the shipped database, which is outdated yet better than nothing. If downloading is not possible (e.g. not enough free space on the device), the app may want to make online queries directly while the user is online, fall back to the outdated shipped database when offline, and retry the download at some later time (maybe the device will have more free space available then).
So basically your app will have a workflow as follows (a code sketch follows the list):
1. START
2. A locally cached copy exists? If NO Goto 6.
3. The locally cached copy is up-to-date? If NO Goto 5.
4. Perform the lookup using the local cached copy. Goto 12.
5. Delete the outdated cached copy. Goto 1.
6. The shipped database is up-to-date? If NO Goto 8.
7. Perform the lookup using the shipped database. Goto 12.
8. Download the updated database.
9. Download succeeded? If YES Goto 4.
10. The user is currently online? If NO Goto 7.
11. Perform the lookup using a JSON webservice. Goto 12.
12. END
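Rendered as code, the flow collapses to a few nested checks; every helper method below is hypothetical and stands in for app-specific logic:

    // Steps refer to the numbered workflow above. All helpers
    // (cachedCopyExists, downloadUpdatedDatabase, ...) are hypothetical.
    - (NSString *)lookupProduct:(NSString *)code
    {
        if ([self cachedCopyExists]) {                       // step 2
            if ([self cachedCopyIsUpToDate]) {               // step 3
                return [self lookup:code inDatabase:self.cachedDB];  // step 4
            }
            [self deleteCachedCopy];                         // step 5
        }
        if (![self shippedCopyIsUpToDate]) {                 // step 6
            if ([self downloadUpdatedDatabase]) {            // steps 8-9
                return [self lookup:code inDatabase:self.cachedDB];  // step 4
            }
            if ([self isOnline]) {                           // step 10
                return [self lookupViaJSONWebservice:code];  // step 11
            }
        }
        return [self lookup:code inDatabase:self.shippedDB]; // step 7
    }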
If you only ever add more entries to the database, and existing entries never change, there is another, even better option: keep two databases. One ships with the app, and one contains only the updates (new entries) added after the last app release. This dramatically shrinks the amount of data that needs to be downloaded and cached. In that case your app must always perform two lookups: first in the shipped database, and if nothing is found there, in the downloaded cached copy, which does not contain the entries already present in the shipped database (or directly online, if no cached copy is available yet the user currently has Internet access). Each time you release an update of the app, it gets a new full copy of the database, so you can reset the update database back to zero entries and only keep adding new entries there (or you can keep different update databases on the server for different app versions that shipped with different databases, if you don't think that is too much hassle to manage).
The update database for download may even be created dynamically by the server; that would of course be the best option. E.g. after shipping the app, you add 3 vendors and 30 products to the database, and every vendor and product has a unique ID (strictly increasing with each new entry added); the app can then tell the server that the highest vendor ID it knows is X and the highest product ID is Y, and the server sends out an update database with all vendors and products whose IDs are higher than X and Y.
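The delta request itself can be as simple as two query parameters; a sketch against a hypothetical endpoint:

    // Ask the server only for rows the app doesn't have yet.
    NSInteger maxVendorID = 42;    // highest vendor ID present locally
    NSInteger maxProductID = 317;  // highest product ID present locally
    NSString *urlString = [NSString stringWithFormat:
        @"https://example.com/api/updates?maxVendor=%ld&maxProduct=%ld",
        (long)maxVendorID, (long)maxProductID];
    NSURL *updateURL = [NSURL URLWithString:urlString];
    // Download asynchronously and merge the returned JSON rows into
    // the cached update database.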
All these decisions influence the database format to use. Generally it sounds a lot like a job for Core Data, yet if you want dynamic update databases, the updates should be delivered in a different format (JSON, XML, CSV, or something else a server can easily generate) and be converted to Core Data by the app after the download is completed, since dynamically generating Core Data databases on a server is rather hard and definitely not recommended.

Core Data Migration - Migrating selected data from a previous version

We're performing our first iOS app update, and also our first Core Data migration.
It seems more complicated than the examples of the Standard and Lightweight Core Data migrations I've seen online, but perhaps I'm missing something.
Our scenario is that we've updated the .xcdatamodel (simply added a new field), and also a lot of the reference data used in our app (stored in our Core Data database), but we need to retain some user data (stored in the same Core Data database).
I've added multiple versions of the model definition into our .xcdatamodeld file, and have played around with a Lightweight Core Data migration process (using a Mapping Model, an .xcmappingmodel file), which successfully updates the model, but I can't see any obvious way in which it would allow us to import selected data (the user's data) from a previous version of the database into a new one bundled with the next version of the app (containing our updated reference data).
Any advice on how to approach this scenario would be very much appreciated.
Thanks in advance, Ted
Your users' database will be upgraded "in place". There won't be any migration or importing/exporting necessary. When the user runs the new version of your app, the existing database will be upgraded with the new fields. I'm not sure if this answers your question, but there won't be any "importing" going on.
In the end we've worked around this situation by putting the user's data into a plist file (there's a fairly limited amount of this), and retaining the Core Data database to use solely for reference data in the system, so it can be overwritten in future without worry.
A lightweight migration updates the data model on first run. Then a one-off migration call creates and populates the user-data plist file, renames the v1 Core Data persistent store to *_migrated.sqlite, copies the v2 sqlite database from the bundle into the Documents directory, resets the MOM, and sets the MOM, MOC and persistent store coordinator to nil, so that the next time Core Data starts up it uses the v2 sqlite database as its persistent store.
Phew. I hope this makes some sense to anyone reading it, feel free to ask for any other details, but it was honestly a lot simpler than it all sounds!
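A sketch of that one-off swap, with hypothetical file and property names, under the same assumptions the poster describes (the user data has already been written out to the plist):

    // Hypothetical one-off v1 -> v2 store swap, run once after the
    // lightweight migration and the plist export.
    - (void)swapInVersion2Store
    {
        NSFileManager *fm = [NSFileManager defaultManager];
        NSURL *docs = [[fm URLsForDirectory:NSDocumentDirectory
                                  inDomains:NSUserDomainMask] lastObject];
        NSURL *v1Store = [docs URLByAppendingPathComponent:@"Data.sqlite"];
        NSURL *retired = [docs URLByAppendingPathComponent:@"Data_migrated.sqlite"];
        NSURL *v2Bundled = [[NSBundle mainBundle] URLForResource:@"Data"
                                                   withExtension:@"sqlite"];

        // Retire the old store, then drop in the freshly bundled v2 copy.
        [fm moveItemAtURL:v1Store toURL:retired error:NULL];
        [fm copyItemAtURL:v2Bundled toURL:v1Store error:NULL];

        // Tear down the Core Data stack so it is rebuilt lazily
        // against the new store file on next access.
        self.managedObjectContext = nil;
        self.managedObjectModel = nil;
        self.persistentStoreCoordinator = nil;
    }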

iOS - how to structure database to conform to iCloud backup rules

I've been having trouble getting an app submitted to the App Store. This is due to the fact that the database, which is updatable, is too large for the iCloud backup limitations. Most of the data in the db is static, but one table records the user's schedule for reviewing words (this is a vocabulary quiz).
As far as I can tell, I have two or three realistic options. The first is to put the whole database into the Library/Caches directory. This should be accepted, because that directory is not backed up to iCloud. However, there's no guarantee that it will be maintained during app updates, per this entry in "Make App Backups More Efficient" at this URL:
http://developer.apple.com/library/IOs/#documentation/iPhone/Conceptual/iPhoneOSProgrammingGuide/PerformanceTuning/PerformanceTuning.html
Files Saved During App Updates
When a user downloads an app update, iTunes installs the update in a new app directory. It then moves the user’s data files from the old installation over to the new app directory before deleting the old installation. Files in the following directories are guaranteed to be preserved during the update process:
<Application_Home>/Documents
<Application_Home>/Library
Although files in other user directories may also be moved over, you should not rely on them being present after an update.
The second option is to put the data into the Documents or Library directory and mark it with the skipBackupFlag. However, one problem is that this flag doesn't work on iOS 5.0 and earlier, per this entry in "How do I prevent files from being backed up to iCloud and iTunes?" at
https://developer.apple.com/library/ios/#qa/qa1719/_index.html
Important: The new "do not back up" attribute will only be used by iOS 5.0.1 or later. On iOS 5.0 and earlier, applications will need to store their data in <Application_Home>/Library/Caches to avoid having it backed up. Since this attribute is ignored on older systems, you will need to ensure your app complies with the iOS Data Storage Guidelines on all versions of iOS that your application supports.
This means that even if I use the skipBackupFlag, I'll still have the problem that the database gets backed up to the cloud on iOS 5.0, I think.
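For reference, on iOS 5.1 and later the flag in question is set per file URL; this is essentially the pattern from the QA1719 document cited above:

    // Exclude a file from iCloud/iTunes backup (iOS 5.1 and later).
    - (BOOL)addSkipBackupAttributeToItemAtURL:(NSURL *)URL
    {
        NSError *error = nil;
        BOOL success = [URL setResourceValue:@YES
                                      forKey:NSURLIsExcludedFromBackupKey
                                       error:&error];
        if (!success) {
            NSLog(@"Error excluding %@ from backup: %@",
                  [URL lastPathComponent], error);
        }
        return success;
    }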
So, the third option, which is pretty much an ugly hack, is to split the database in two: put the updatable part into the Library or Documents directory, and leave the rest in the app resources. This would have the small, updatable part stored on the cloud, and leave the rest in the app resources directory. The problem is that this splits the db for no good reason, and introduces possible performance issues with having two databases open at once.
So, my question is, is my interpretation of the rules correct? Am I going to have to go with option 3?
P.S. I noticed that in my last post cited URLs were edited into links without the URL showing. How do I do this?
Have you considered using external file references, as described in https://developer.apple.com/library/IOS/#releasenotes/DataManagement/RN-CoreData/_index.html ? Specifically, refer to "setAllowsExternalBinaryDataStorage:" at https://developer.apple.com/library/IOS/documentation/Cocoa/Reference/CoreDataFramework/Classes/NSAttributeDescription_Class/reference.html#//apple_ref/occ/instm/NSAttributeDescription/setAllowsExternalBinaryDataStorage: . Pushing large data out into a separate file can help reduce database size.
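In practice you would usually just tick "Allows External Storage" on the binary attribute in the model editor, but it can also be set in code before the model is handed to a coordinator; a sketch with hypothetical entity/attribute names:

    // Let Core Data store large blobs outside the SQLite file.
    NSManagedObjectModel *model =
        [NSManagedObjectModel mergedModelFromBundles:nil];
    NSEntityDescription *entity = [model entitiesByName][@"Word"];
    NSAttributeDescription *audio =
        [entity attributesByName][@"pronunciationData"];
    [audio setAllowsExternalBinaryDataStorage:YES];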

Syncing a local sqlite file to iCloud

I store some data in my iOS app directly in a local .sqlite file.  I chose to do this instead of CoreData because the data will need to be compatible with non-Apple platforms.
Now, I'm trying to come up with the best way to sync this file over iCloud.  I know you can't sync it directly, for many reasons.  I know CoreData is able to sync its DBs, but even ignoring that using CD would essentially lock this file into Apple platforms (I think? I've only looked into CD a bit), I need the iCloud syncing of this file to work across ALL of iCloud's supported platforms - which is supposed to include Windows.  I have to assume that there won't be any compatibility for the CoreData files in the Windows API.  Planning out the best way to accomplish this would be a lot easier if Apple would tell us any more than "There will be a Windows API [eventually?]"
In addition, I'll eventually need to implement at least one more sync service to support platforms that iCloud does not.  It would be helpful, though not required, if the method I use for iCloud can be mostly reused for future services.
For these reasons, I don't think CoreData can help me with this.  Am I correct in thinking this?
Moving on from there, I need to devise an algorithm for this, or find an existing one or an existing 3rd-party solution. I haven't stumbled across anything yet. However, I have been mulling over a couple of possible methods I could implement:
Method 1:
Do something similar to how CoreData syncs sqlite DBs: send "transaction logs" to iCloud instead and build each local sqlite file off of those.
I'm thinking each device would send a (uniquely named) text file listing all the SQL commands that that device executed, with timestamps. The device would store how far along in each list of commands it has executed, and continue from that point each time the file is updated. If it received updates to multiple log files at once, it would execute each command in timestamp order.
Things could get 'interesting' efficiency-wise once these files get large, but it seems like a solvable problem.  
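A toy sketch of replaying one device's log, assuming each log line is "timestamp<TAB>SQL statement" and using the raw sqlite3 C API (all names hypothetical):

    #import <sqlite3.h>

    // Replay log entries beyond the last-applied index for one
    // device's log file.
    void ApplyTransactionLog(sqlite3 *db, NSArray *logLines, NSUInteger start)
    {
        for (NSUInteger i = start; i < logLines.count; i++) {
            // Each line: "2013-04-02T09:15:00Z\tINSERT INTO ..."
            NSArray *parts = [logLines[i] componentsSeparatedByString:@"\t"];
            const char *sql = [[parts lastObject] UTF8String];
            char *errmsg = NULL;
            if (sqlite3_exec(db, sql, NULL, NULL, &errmsg) != SQLITE_OK) {
                NSLog(@"Replay failed at line %lu: %s",
                      (unsigned long)i, errmsg);
                sqlite3_free(errmsg);
                break;
            }
        }
        // Persist i elsewhere as this log's new high-water mark.
    }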
Method 2:
Periodically sync a copy of the working database to iCloud. Have a modification timestamp field in every record. When an updated copy of the DB comes through, query all the records with newer timestamps than some reference time and update the records in the local DB from the new data.
I see many potential problems with this method:
- I'd have to implement something further to recognize record deletion.
- The DB file could get conflicts. It might be possible to deal with them by handling each conflict version in timestamp order.
- Determining the date to check each update from could be tricky, as it depends on which device the update is coming from.
There are a lot of potential problems with method 2, but method 1 seems doable to me...
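For comparison, the merge query in method 2 is itself simple; the hard parts are the caveats above. A sketch, assuming a modified_at column of Unix timestamps and that the incoming copy has been attached as "incoming" via ATTACH DATABASE:

    // Upsert rows newer than the last sync from the attached copy.
    sqlite3_int64 referenceTime = 0; // last successful sync (placeholder)
    const char *merge =
        "INSERT OR REPLACE INTO readings "
        "SELECT * FROM incoming.readings WHERE modified_at > ?;";
    sqlite3_stmt *stmt = NULL;
    if (sqlite3_prepare_v2(db, merge, -1, &stmt, NULL) == SQLITE_OK) {
        sqlite3_bind_int64(stmt, 1, referenceTime);
        sqlite3_step(stmt);
        sqlite3_finalize(stmt);
    }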
Does anyone have any suggestions as to what might be the best course of action? Any better ideas than my "Method 1" (or reasons why it wouldn't work)?
Try these two solutions from Ray Wenderlich:
Exporting/Importing data through mail:
http://www.raywenderlich.com/1980/how-to-import-and-export-app-data-via-email-in-your-ios-app
File Sharing with iTunes:
http://www.raywenderlich.com/1948/how-integrate-itunes-file-sharing-with-your-ios-app
I found it quite complex, but it helped me a lot.
Both method 1 and method 2 seem doable. Perhaps a combination of the two, in fact: use iCloud to send a separate database file that is a subset of the data, i.e. just the changed items. Or maybe another file format instead of a sqlite db: XML/JSON/CSV etc.
Another alternative is to do it outside of iCloud - i.e. a simple custom web service for syncing. So each change gets submitted to a central server via JSON/XML over HTTP, and then other devices pull updates from that.
Obviously it depends how much data and how many devices you want to sync across, and whether you have access to an appropriate server and/or budget to cover running such a server. iCloud will do that for "free" but all it really does is transfer files. A custom solution allows you to define your syncing model as you wish, but you have to develop and manage it and pay for it.
I've considered the possibility of transferring a database file through iCloud, but I think I would run into classic problems of timing (slow start for the user) and corrupted databases if the app is run on multiple devices simultaneously (iPad and iPhone, for example).
Sooo. I've had to use the transaction logs method. It really is difficult to implement, but once in place, seems ok.
I am using Apple's SharedCoreData sample as the base for this work. This link requires an Apple Developer Account.
I did find a much better solution from Tim Roadley, however this only works for iOS and I needed both iOS and OS X.
<rant>iCloud development really has to get easier and more stable!</rant>
