I've already managed to program a webapp for personal use that I'm really satisfied with. Since it isn't meant for public use or distribution, I didn't want to go through the hassle of jailbreaking my device just to run my own application, so I made this seamless-looking and -behaving webapp (and of course I've added it next to the other apps by saving it as a "Home application").
Since the start time can be a bit slow and I'm constantly pushing my data to and from a remote server, can I force the use of HTML5 offline browsing (with a cache manifest) even when I am online? Also, I'm thinking of persisting the data in local storage and syncing it to the server from time to time. Since I've never used HTML5 local storage, how reliable is it? Can I lose my data?
Is this a viable pattern to quickly create a personal iPhone app? Thanks
Yes, you can force that. Assets listed in the cache manifest are served from the application cache even when you are online, so startup doesn't depend on the network. For your data, you basically do a very simple check:
if (localStorage.getItem('mycontent') !== null) {
    // the data is cached locally: work offline with it
} else {
    // retrieve it from the server database and cache it
}
Regarding your question:
Also, I'm thinking of persisting the data as local storage and from
time to time synch it to the server. Since I've never used html5 local
storage, how much reliable is it? Can I lose my data?
The answer is: it depends. If the data is static (or can only be changed by you and not by other users) it's reliable. You also have to decide when data should be considered expired, so localStorage can be refilled with fresh data from the server.
But take note that clearing the browser history also removes your data, so only use localStorage as a cache/mirror of the data on the server.
window.localStorage.setItem('x', y);
window.localStorage.getItem('x');
window.localStorage.removeItem('x');
These let you store, read and delete persistent data in HTML5. See https://developer.mozilla.org/en-US/docs/Web/API/Web_Storage_API/Using_the_Web_Storage_API
But note that on iOS, Safari puts this data in the cache folder, which occasionally gets expunged. So do plan a server sync and restore of this data if it is important.
Alternatively, use a local SQLite database for more durable persistence.
Let me get this out of the way right now: yes, it was almost certainly a mistake to not use Core Data. However, I was new to iOS development when I made these decisions, and I had no idea I'd be hamstrung like this. Moreover, the app is intended to also run on Android (eventually), so I avoided platform-specific APIs wherever possible.
I have an iOS app that stores data in a local SQLite database file. The data stored in the file is provided by the user, so it's important that it be kept safe. I had plans to "do this later", and later is now here. I am quickly coming to the realization that it won't be as straightforward as I had hoped...
I now understand that it won't be possible to seamlessly synchronize data across devices, and I'm willing to accept that limitation until I manage to migrate to Core Data. However, in the meantime I'd at least like the SQLite database to be backed up periodically so users can feel safe using the app on a single device. I was thinking I would do this:
periodically (e.g. once a week) copy the SQLite file from local storage into cloud storage, thus ensuring it is backed up
when the app starts, if the local store is missing or corrupted but the file exists in the cloud storage, ask the user if they would like to copy it over
The biggest problem with this approach is that the user could run the app on multiple devices, so the data stored in iCloud could come from any one of those devices, but only one of them. To combat that, I thought I could use a unique, per-device name for the file in cloud storage, generated using UIDevice.identifierForVendor.
So my startup logic would be:
1. Determine the unique name for the cloud file.
2. Is the local file missing or corrupted, and if so, does the cloud file exist?
   2.1. Ask the user if they would like to restore from the cloud file. Make it really hard for them to say no, because doing so will lose all their data.
   2.2. If they say yes, copy the cloud file to the local file storage.
3. Open the local database file.
And running in the background I would occasionally copy the database file from local to cloud storage.
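Something like this minimal sketch is what I have in mind, assuming the app has an iCloud container entitlement; the function name and file naming are mine:

import Foundation

// Copy the local SQLite file into the app's iCloud ubiquity container
// under a device-specific name, replacing any previous backup.
func backupDatabaseToCloud(localURL: URL, deviceID: String) throws {
    let fm = FileManager.default
    guard let container = fm.url(forUbiquityContainerIdentifier: nil)?
        .appendingPathComponent("Documents") else {
        return // iCloud is not available right now; try again later
    }
    try fm.createDirectory(at: container, withIntermediateDirectories: true)
    let cloudURL = container.appendingPathComponent("backup-\(deviceID).sqlite")
    if fm.fileExists(atPath: cloudURL.path) {
        try fm.removeItem(at: cloudURL)
    }
    try fm.copyItem(at: localURL, to: cloudURL)
}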
I would like to know whether this a sensible approach until I do Core Data integration. Also, are there any hidden "gotchas" that I'm perhaps missing?
UPDATE: as @TomHarrington pointed out in a comment, it turns out my database file is already sitting in /Documents, which is backed up to iTunes and any iCloud account. So my question morphs into this:
Should I simply ensure my database has a device-specific name so that it is not clobbered by the app running on another device connected to the same iCloud account?
I'm going to answer my question, since I ended up going down this path and finding a MASSIVE blocker. There is a bug in the UIDevice.identifierForVendor API that causes it to regenerate every time a new version of the app is installed! See here. This of course rules out using it as a device identifier. sigh
I think I'm SOL with that approach. Instead, I might generate a GUID on first execution and use that as my identifier. Problem is, I need to store that somewhere that isn't backed up to iCloud.
Ugh, I may just give up here and say my app can't be run on multiple devices until Core Data integration is done.
UPDATE: I ended up generating an identifier on first run and storing it in the keychain (as a local entry only so it isn't backed up to iCloud).
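For reference, a minimal sketch of that approach; the service name is a placeholder, and the ThisDeviceOnly accessibility class is what keeps the entry from migrating to other devices:

import Foundation
import Security

// Return a stable per-device ID, generating and storing it on first run.
func deviceIdentifier() -> String {
    let service = "com.example.myapp.device-id" // placeholder service name
    let query: [String: Any] = [
        kSecClass as String: kSecClassGenericPassword,
        kSecAttrService as String: service,
        kSecReturnData as String: true
    ]
    var result: AnyObject?
    if SecItemCopyMatching(query as CFDictionary, &result) == errSecSuccess,
       let data = result as? Data,
       let id = String(data: data, encoding: .utf8) {
        return id // generated on a previous run
    }
    let id = UUID().uuidString
    let attributes: [String: Any] = [
        kSecClass as String: kSecClassGenericPassword,
        kSecAttrService as String: service,
        // the *ThisDeviceOnly class keeps the item from migrating to other devices
        kSecAttrAccessible as String: kSecAttrAccessibleAfterFirstUnlockThisDeviceOnly,
        kSecValueData as String: Data(id.utf8)
    ]
    SecItemAdd(attributes as CFDictionary, nil)
    return id
}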
I have an iOS app that timestamps some events and loads them into the UI. To persist the data I use CloudKit, which is quite fine and fast enough for my use case.
The first problem is that when an event occurs I save it instantly to CloudKit, so if the internet is not reachable I have to handle that situation. I had the idea of keeping a local SQLite db, tracking connectivity, and dealing with syncing manually. That is where the second problem comes in: any inconsistency in the data must be handled by me (e.g. if the user edits the data while offline).
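To illustrate the kind of manual handling I mean, a rough sketch; saveLocallyForLater is a hypothetical local queue, not a real API:

import CloudKit

// Try CloudKit first; on a network failure, stash the record locally
// so it can be retried when connectivity returns.
func save(event record: CKRecord, to database: CKDatabase) {
    database.save(record) { _, error in
        if let ckError = error as? CKError,
           ckError.code == .networkUnavailable || ckError.code == .networkFailure {
            saveLocallyForLater(record) // hypothetical SQLite-backed queue
        } else if let error = error {
            print("CloudKit save failed: \(error)")
        }
    }
}

func saveLocallyForLater(_ record: CKRecord) {
    // persist the record to a local queue table for a later retry
}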
I found YapDatabase library doing the exact things but I feel the learning curve is a little bit steep.
Another idea could be using an SQLite db but saving the file in iCloud Drive. This way it would be persisted automatically while I still have direct SQLite access from the app. Any change in the db tables would change the file signature and fire an iCloud Drive sync.
What's the best choice?
I have an iOS app that uses Core Data and ParcelKit to sync with Dropbox. However, the Dropbox Datastore API only allows records of 100K, so it will not sync the images that I store in the database. Is there any workaround other than storing images as separate files, with the filenames stored in the database? That approach is a little fragile, since the user can alter the contents of the image-file folder, thus breaking the link to the database.
You should not store large images in the Core Data persistent store. Apple recommends that you should only store small images, such as thumbnails, perhaps 20K max. If you go beyond that, performance will eventually degrade significantly.
Thus, you cannot really avoid storing the images in separate files and storing their name/location in Core Data. This is the recommended pattern.
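A minimal sketch of that pattern, assuming JPEG data and Application Support as the destination (the function name is mine):

import UIKit

// Write the image to disk and return the file name; only that string
// goes into the Core Data record.
func storeImage(_ image: UIImage) throws -> String {
    let fileName = UUID().uuidString + ".jpg"
    let dir = try FileManager.default.url(for: .applicationSupportDirectory,
                                          in: .userDomainMask,
                                          appropriateFor: nil,
                                          create: true)
    guard let data = image.jpegData(compressionQuality: 0.9) else {
        throw CocoaError(.fileWriteUnknown)
    }
    try data.write(to: dir.appendingPathComponent(fileName), options: .atomic)
    return fileName
}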
I do not see why you think this is fragile. Presumably you will store the images in the app sandbox, where there is no way the user can fiddle with them unless the iPhone is jailbroken.
The Dropbox sync should be managed independently from this setup.
FYI, Dropbox just killed the Datastore API and will take it offline in 2016. :-(
You should monitor this ParcelKit Issue:
Dropbox Datastore is Deprecated #34
https://github.com/overcommitted/ParcelKit/issues/34
I am creating an app where data needs to be displayed right away from the local datastore. I fire off a background thread when the app starts to determine if iCloud is available.
I looked everywhere and can't find the solution to this: when iCloud becomes available, can I change the "options" on the persistentStore to start using iCloud transactions?
I'm not sure what the proper approach is in this situation. Everything I try causes the application to crash.
Originally I had it so the iCloud checking wasn't in a background thread and the app worked fine, but occasionally timed out.
You don't need to know when iCloud becomes available. You just work with your data; you don't send it directly to iCloud. iOS does that for you, so only the system knows when and how the data should be sent.
No, you can't change the options on an NSPersistentStore object once it exists. You can only specify the options when adding the persistent store to the NSPersistentStoreCoordinator. The closest you could get to changing options would be to tear down the entire Core Data stack and start over with different options.
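For context, a sketch of what adding the store with iCloud options looked like (Apple has since deprecated these ubiquity APIs); the content name is an assumption:

import CoreData

// The ubiquity option can only be supplied when the store is added;
// switching it means removing the store and adding it again.
func addStore(to coordinator: NSPersistentStoreCoordinator,
              at url: URL, useICloud: Bool) throws {
    var options: [AnyHashable: Any] = [
        NSMigratePersistentStoresAutomaticallyOption: true,
        NSInferMappingModelAutomaticallyOption: true
    ]
    if useICloud {
        options[NSPersistentStoreUbiquitousContentNameKey] = "MyStore" // assumed name
    }
    try coordinator.addPersistentStore(ofType: NSSQLiteStoreType,
                                       configurationName: nil,
                                       at: url,
                                       options: options)
}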
That wouldn't help, though, because:
Even if you have detected that iCloud is available (I'm guessing using NSFileManager, either via its ubiquityIdentityToken or by calling URLForUbiquityContainerIdentifier:), your call to addPersistentStoreWithType:configuration:URL:options:error: might still block for a while. If there's new data available in iCloud, it doesn't start downloading until you add the persistent store, and that method doesn't return until the download process is finished. And sometimes, iCloud just makes that method block for a while for no readily apparent reason.
If you let the user make any changes to the data while using non-iCloud options, those changes will not get automatically sent to the cloud later. Core Data only sends changes to iCloud when the data changes while iCloud is active-- which makes it generate a transaction. You'd have to load and re-save any changes the user made, or those changes would never make it to the cloud.
You have, unfortunately, hit on one of the major stumbling points when using Core Data with iCloud. You can't make the full data store available until Core Data finishes communicating with iCloud-- because your call to add the persistent store doesn't return until then. And you can't do anything to speed up that process. This is just one of the headaches you'll run into if you continue trying to use iCloud with Core Data.
Depending on the nature of your data you might be able to use two data stores, one purely local and one synced via iCloud. You could make the purely local data store available while the iCloud one tries to get its act together well enough to be useful. If you stick with one data store though, you're stuck with the delay.
I have a lot of data in a database on my webserver. Each time I start the iPad application after downloading it, I want all this data to be copied into the SQLite database in my application. The application should then work from that data.
We are currently using XML, and on 3G it sometimes takes about 20 minutes, which is completely unacceptable. After the first time, it syncs incrementally using a time log, and that works without any problem.
Is there any other way I could get all the data and make it populated into my sqlite db?
If it is a large database it might be worth doing it in the background. And even better if it happens only over Wi-Fi (otherwise you'll be eating up your users' data).
What I usually do is ship a local copy of the database with the app, so the user can use that, and update it in the background. It might be worth creating some endpoints that serve just the updated content, downloading that, and then updating your database accordingly, rather than downloading everything every time.
This would depend entirely on your implementation however.
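For the ship-a-local-copy idea, a minimal sketch; both file names are placeholders:

import Foundation

// On first launch, copy the bundled SQLite file into Documents so the
// app is usable immediately; background updates can refresh it later.
func installSeedDatabaseIfNeeded() throws {
    let fm = FileManager.default
    let docs = try fm.url(for: .documentDirectory, in: .userDomainMask,
                          appropriateFor: nil, create: true)
    let target = docs.appendingPathComponent("app.sqlite") // placeholder name
    guard !fm.fileExists(atPath: target.path),
          let seed = Bundle.main.url(forResource: "seed", withExtension: "sqlite")
    else { return }
    try fm.copyItem(at: seed, to: target)
}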
On initial app launch you could download just enough data to make the app functional. Then download the rest of the data in the background.
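A rough sketch of that two-phase load; both URLs are placeholders:

import Foundation

// Fetch a small bootstrap payload first, then pull the bulk of the
// data in the background without blocking the UI.
func loadInitialData(completion: @escaping (Data?) -> Void) {
    let bootstrapURL = URL(string: "https://example.com/api/bootstrap")! // placeholder
    URLSession.shared.dataTask(with: bootstrapURL) { data, _, _ in
        completion(data) // enough to make the app functional
        let bulkURL = URL(string: "https://example.com/api/full-dump")! // placeholder
        URLSession.shared.dataTask(with: bulkURL) { bulkData, _, _ in
            // insert bulkData into the local SQLite db here
        }.resume()
    }.resume()
}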
This is a common strategy used by sync frameworks and works pretty well. I have personally tried it when synchronizing more than a thousand objects using OpenMobster's sync service.
Which data to download initially should be decided by the requirements of the app.
Thanks