RestKit + Rails API or TouchDB + CouchDB? - iOS

Does anyone out there know which approach is simpler for cloud-data-driven iOS apps?
RestKit + a RESTful API, or TouchDB + CouchDB?

It is really hard to answer your question without knowing what you want to do. You can certainly make both work for different use cases.
There are a couple of things to consider:
Whether offline mode is important.
TouchDB is a fully functional NoSQL datastore running on the device, and it lets users read and write data even without a connection. RestKit needs a connection to be fully useful.
The size of the dataset to replicate.
TouchDB will replicate data to the device, and that is easier if you have a relatively small dataset. The size is determined by how many documents are in your database and by the size of those documents.
Also, most of the time the device only needs to do a full replication when the app first starts (the initial replication), so you can reduce that cost (for example, by embedding most of the data in the app bundle itself) and only replicate the delta afterwards.
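To make the replication point concrete, here is a minimal sketch of kicking off a one-shot pull replication with the CouchCocoa wrapper around TouchDB. The database name and server URL are placeholders, and the method names are written from memory of that wrapper, so treat them as assumptions to verify against the version you actually use.

```objc
#import <CouchCocoa/CouchCocoa.h>

// Open (or create) a local TouchDB database on the device.
CouchTouchDBServer *server = [CouchTouchDBServer sharedInstance];
CouchDatabase *database = [server databaseNamed:@"cities"];      // placeholder database name

NSError *error = nil;
if (![database ensureCreated:&error]) {
    NSLog(@"Could not create the local database: %@", error);
}

// Pull from the remote CouchDB; documents already present on the device
// (for example, shipped inside the app bundle) are not transferred again.
NSURL *remoteURL = [NSURL URLWithString:@"https://example.com/cities"];  // placeholder URL
CouchReplication *pull = [database pullFromDatabaseAtURL:remoteURL];
pull.continuous = NO;   // one-shot replication at app start; set YES for live sync
```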
By the way, you can certainly use both, and get the benefits of both.

Employing CouchDB + TouchDB takes the hassle of sync away from you completely. You don't need to care about sync; it just works. You get a notification when a sync completes, update your UI, and that's it.
Replacing the Core Data stack with TouchDB is also fairly easy. The model objects basically stay the same, except that they now inherit from CouchModel instead of NSManagedObject. It's nearly trivial.
Querying is slightly different from Core Data. You define a set of views (indexes) that slice and sort your data along different criteria, and then query these indexes with a start and end key. So, there's no explicit query language, but that's no inconvenience, really.
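As a rough illustration of that view/index style of querying (not the answerer's code; it assumes `database` is an open CouchDatabase, and the view name, keys and CouchCocoa-style calls are my assumptions):

```objc
// Define a view (index) that orders notes by their "date" field.
CouchDesignDocument *design = [database designDocumentWithName:@"app"];
[design defineViewNamed:@"notesByDate"
               mapBlock:MAPBLOCK({
                   // Emit one index row per document that has a date.
                   id date = [doc objectForKey:@"date"];
                   if (date) emit(date, nil);
               })
                version:@"1.0"];

// Instead of a query language, ask the index for a key range.
CouchQuery *query = [design queryViewNamed:@"notesByDate"];
query.startKey = @"2014-01-01";
query.endKey   = @"2014-12-31";
for (CouchQueryRow *row in query.rows) {
    NSLog(@"Matched document: %@", row.documentID);
}
```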
I've moved a Core Data app to TouchDB, and it was completely painless. In about 3 days, I had CRUD and sync up and running.

Related

Is it good practice to store items retrieved from a web service as Core Data objects?

I've heard of iOS developers retrieving items from a RESTful web service and storing them as Core Data objects right away. I can see why that may be useful if you want to save or cache these items so the user can see them later (e.g. Facebook feed), but are there any other reasons to do so? I have items in my web service that are invalid within an hour, so caching is out of the question. If it's a good practice to do so, why?
For me, there are two reasons to store data locally:
Better UX: first show the old content, then update in the background, and refresh your UI when the new content is available.
Working offline whenever a connection is unavailable.
Even if your items become invalid within an hour, if you do not cache them locally your application has to call the web service again to retrieve them, and that takes time.
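A minimal sketch of that "show cached content first, refresh in the background" pattern, assuming `context` is your main-queue NSManagedObjectContext, an `Item` entity, and a hypothetical `parseAndStoreItems:` helper that maps the JSON into managed objects on a background context:

```objc
// 1. Show whatever is already cached, immediately.
NSFetchRequest *request = [NSFetchRequest fetchRequestWithEntityName:@"Item"];
request.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"updatedAt" ascending:NO]];
NSArray *cachedItems = [context executeFetchRequest:request error:NULL];
[self reloadTableWithItems:cachedItems];            // hypothetical UI update method

// 2. Refresh from the web service in the background.
NSURL *url = [NSURL URLWithString:@"https://api.example.com/items"];   // placeholder URL
[[[NSURLSession sharedSession] dataTaskWithURL:url
                              completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
    if (!data) return;
    [self parseAndStoreItems:data];                 // hypothetical: save on a background context, merge into `context`
    dispatch_async(dispatch_get_main_queue(), ^{
        // 3. Re-fetch and update the UI once the new content is available.
        NSArray *freshItems = [context executeFetchRequest:request error:NULL];
        [self reloadTableWithItems:freshItems];
    });
}] resume];
```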
Caching almost never hurts and CoreData is a very nice way to cache data which comes in as a pile of similar records.
I am one of those devs you mentioned who store almost anything using Core Data. Because I do, a lot of useful code and self-made frameworks has accumulated over time that makes working with Core Data and RESTful APIs a breeze. And if connecting an API to Core Data is just a matter of a few lines of code, there really isn't any reason not to.
While I cannot share my libraries, I'd strongly recommend taking a look at RestKit, which does pretty much the same thing: mapping a RESTful API to Core Data. And if you're not used to Core Data yet, fear not. It is a very powerful tool, and getting used to it is definitely worth the while!
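To give a feel for what "a few lines of code" looks like with RestKit 0.20-style mappings; the entity name, attributes and path are illustrative, not from the answer, and an already-configured RKObjectManager and RKManagedObjectStore are assumed:

```objc
#import <RestKit/RestKit.h>

// Map the JSON returned by GET /items onto an "Item" Core Data entity.
RKEntityMapping *mapping = [RKEntityMapping mappingForEntityForName:@"Item"
                                               inManagedObjectStore:[RKManagedObjectStore defaultStore]];
[mapping addAttributeMappingsFromDictionary:@{@"id"         : @"itemID",
                                              @"title"      : @"title",
                                              @"updated_at" : @"updatedAt"}];
mapping.identificationAttributes = @[@"itemID"];   // update existing objects instead of duplicating them

RKResponseDescriptor *descriptor =
    [RKResponseDescriptor responseDescriptorWithMapping:mapping
                                                 method:RKRequestMethodGET
                                            pathPattern:@"/items"
                                                keyPath:nil
                                            statusCodes:RKStatusCodeIndexSetForClass(RKStatusCodeClassSuccessful)];
[[RKObjectManager sharedManager] addResponseDescriptor:descriptor];

// One call fetches, maps and persists the objects into Core Data.
[[RKObjectManager sharedManager] getObjectsAtPath:@"/items" parameters:nil success:nil failure:nil];
```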

SQLite vs Core Data: saving large amounts of data

I'm making a statistical analysis app which needs to store large amounts of data locally. To give you an idea of what I'm talking about, I've made this illustration (not exactly the right information, but very similar):
The app will keep track of several thousand destinations, with each destination's population, temperature, number of cars, etc. These numbers will be shown in a graph so you can see how they develop over time. This spans a long period; in other words, thousands of dates for each data type for each of thousands of cities.
To achieve this I need to save large amounts of data, and preferably locally (is that crazy?). I'm torn between digging deep into the foundations of Core Data and using my already decent SQLite skills.
If you suggest I should use SQLite, could you refer to how you would implement this into your app (some code perhaps?).
If you suggest I should use Core Data (mainly for performance), please show me how you would implement this type of data model with entities, attributes, relationships, etc. I imagine dictionaries saved in Core Data could be a good solution?
Thank you in advance.
If you're going with SQLite with Swift, I highly recommend using this project. I am using it in my current project and it is absolutely fantastic and works perfectly (I'm not affiliated in any way with the project or its author). You drag that project into your own project and it becomes a subproject; then you just set it up as (1) a target dependency, (2) a framework to link with, and (3) a copy-framework build phase, and it just works. Then you can handle your database with brilliantly constructed Swift interfaces rather than ugly, cumbersome libsqlite calls.
I have used it for modest amounts of data. A few databases and multiple tables. Clean and intuitive. So far I haven't found a single bug of any kind. And Stephen Celis, the author, was responsive when I asked a question about a feature that wasn't documented (but actually is present and works, it turns out). It's a prodigious effort.
It's so clean and tightly integrated with Swift that, if I didn't know better, I'd think Apple itself had added SQLite support to the Swift language.
https://github.com/stephencelis/SQLite.swift
Core Data is an object persistence model, and there's your answer really: every object has a little overhead, so having thousands of objects in memory at one time is problematic. If you already know SQL, then that's another plus for SQLite.
(Discussion of the overall merits of Core Data is outside the scope of this question. The Music app pulls it off using Core Data with thousands of items, I think, but only because it needs to display a subset of items at a time. Map Kit drops down to C, and handles itself quite impressively with tens of thousands of single-digit-byte items, as you can see by running Instruments with a Map Kit app.)
I've used SQLite in iOS and it's not a problem, SQLite being C-based and Objective-C a strict superset of C. There are various "wrappers", but look at these carefully in case they take you back to every-row-an-object. I didn't use one and found the SQLite C setup tricky but fine. EDIT: I used this tutorial, which is dated now for the Objective-C, but it gives a clear overview of how to integrate SQLite.
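For reference, the "tricky but fine" raw C setup boils down to open / prepare / bind / step / finalize. A minimal sketch; the file, table and column names are made up for illustration:

```objc
#import <sqlite3.h>

// Open the database file in the app's Documents directory.
sqlite3 *db = NULL;
NSString *path = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES)[0]
                  stringByAppendingPathComponent:@"stats.sqlite"];
if (sqlite3_open([path UTF8String], &db) == SQLITE_OK) {
    // Prepare a parameterised query, bind a value, and step through the rows.
    sqlite3_stmt *stmt = NULL;
    const char *sql = "SELECT date, population FROM measurements WHERE city_id = ? ORDER BY date;";
    if (sqlite3_prepare_v2(db, sql, -1, &stmt, NULL) == SQLITE_OK) {
        sqlite3_bind_int(stmt, 1, 42);   // hypothetical city id
        while (sqlite3_step(stmt) == SQLITE_ROW) {
            const char *date = (const char *)sqlite3_column_text(stmt, 0);
            int population   = sqlite3_column_int(stmt, 1);
            NSLog(@"%s -> %d", date, population);
        }
        sqlite3_finalize(stmt);
    }
    sqlite3_close(db);
}
```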
The only catch is that the bundle/documents distinction in iOS will catch you if you ship with large amounts of data and want to write to that database: you can't modify files in the bundle, so you need to create or copy the SQL database into the Documents folder once, perhaps on first app launch, and use it from there. Since you can't delete from the bundle, you then have your database twice. That's a problem inherent to the way iOS is set up.
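A common way to handle that one-time copy on first launch, sketched below (the file name is a placeholder):

```objc
// Copy the read-only seed database shipped in the bundle into Documents, once.
NSString *documentsPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES)[0]
                           stringByAppendingPathComponent:@"stats.sqlite"];
NSFileManager *fileManager = [NSFileManager defaultManager];

if (![fileManager fileExistsAtPath:documentsPath]) {
    NSString *bundlePath = [[NSBundle mainBundle] pathForResource:@"stats" ofType:@"sqlite"];
    NSError *error = nil;
    if (![fileManager copyItemAtPath:bundlePath toPath:documentsPath error:&error]) {
        NSLog(@"Could not copy the seed database: %@", error);
    }
}
// From here on, read and write only the copy in Documents, never the bundled file.
```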

UIManagedDocument vs NSManagedObject: Performance

I'm trying to write an iOS note taking app that is blazingly fast for a large number of notes and that syncs without ever blocking the UI. (Don't worry, it's just a learning project; I know there are a billion note apps for iOS.) I have decided to use Core Data (mostly because of the excellent posts by Brent Simmons about Vesper). I know UIManagedDocument can do async reads and writes and has a lot of functionality built in, so I'm wondering if there is any information on which would be faster for a fairly simple notes app. I can't really find a lot of information about people using UIManagedDocuments for anything other than a centralized, basically singleton, persistent store. Is it suitable for thousands of documents? Would it be faster or slower than just a database of NSManagedObjects? It seems like most information I can find about Core Data is oriented towards people using NSManagedObject, so any information about UIManagedDocuments being used in production apps would be really helpful. At this point, the only thing I can think of is to just write the whole app both ways, load 10,000 notes into it, and see what happens.
Update
To clarify: the "learning project" doesn't mean I'm learning iOS development and Objective-C; it mostly means that I've never used Core Data and would like to learn how to write a really performant Core Data application.
UIManagedDocument is designed/intended for document based applications. One UIManagedDocument instance per document. If you are not building a document based application then you should not be using UIManagedDocument.
Everything that people like about UIManagedDocument can be accomplished with very little effort using the Core Data stack directly. UIManagedDocument abstracts you away from what your persistence layer is doing, which is something you really do not want.
If you want a high performance Core Data application you do not want to be using UIManagedDocument. You will run into issues with it. It will do things at random times and cause performance issues.
You are far, far, better off learning the framework properly.
In the case of Vesper, those are not documents; they are too small. Think of documents as Word files, or Excel files. Large complicated data structures that are 100% isolated from each other.
Also, whether you use a UIManagedDocument or not, you will be using NSManagedObject instances. NSManagedObject, NSManagedObjectContext, NSPersistentStoreCoordinator are all foundational objects in Core Data. UIManagedDocument is just an abstraction layer on top.
Finally, Core Data is not a database. Thinking of it that way will get you into a jam. Core Data is an object model that can persist to disk and one of the persistence formats happens to be SQLite.
Update (Running Into Problems)
UIManagedDocument is an abstraction on top of Core Data. To use UIManagedDocument you actually need to learn more about Core Data than if you just used the primary Core Data stack.
UIManagedDocument uses a parent/child context internally. Don't understand that yet? See the point above. What it also means is that your requests for it to save are "taken under advisement" as opposed to being saved right then and there. This can lead to unexpected results if you don't understand the point of it or don't want it to save when it feels like it.
UIManagedDocument uses asynchronous saves and at most you can request that it save. Doesn't mean it is going to save now, nor does it mean you can easily stop and wait for the save to complete. You need to trust that it will complete it. In addition, it may decide to save at an inopportune moment.
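For context, this is what "requesting" a save looks like with UIManagedDocument, assuming `document` is an open UIManagedDocument instance; a minimal sketch, not the answerer's code:

```objc
// Option 1: tell the document it has changes; it will autosave whenever it decides to.
[document updateChangeCount:UIDocumentChangeDone];

// Option 2: explicitly request an asynchronous save and react when it finishes.
[document saveToURL:document.fileURL
   forSaveOperation:UIDocumentSaveForOverwriting
  completionHandler:^(BOOL success) {
      if (!success) {
          NSLog(@"Save failed");
      }
      // Only at this point is it safe to assume the data reached disk.
  }];
```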
When you start looking for performance gains with Core Data, you tend to want to build the stack in a very specific way to maximize the benefit to your application. That is application dependent, and with the abstractions in UIManagedDocument you get limited very quickly.
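By contrast, standing up the stack yourself with an explicit parent/child pair is only a handful of lines, and you decide exactly when each context saves. A sketch of one common arrangement, assuming a model file named "Model" and an assumed `storeURL` pointing at the .sqlite file:

```objc
// Private-queue "writer" context owns the persistent store and does the disk I/O.
NSURL *modelURL = [[NSBundle mainBundle] URLForResource:@"Model" withExtension:@"momd"];
NSManagedObjectModel *model = [[NSManagedObjectModel alloc] initWithContentsOfURL:modelURL];
NSPersistentStoreCoordinator *coordinator =
    [[NSPersistentStoreCoordinator alloc] initWithManagedObjectModel:model];
[coordinator addPersistentStoreWithType:NSSQLiteStoreType
                          configuration:nil
                                    URL:storeURL        // assumed NSURL to the .sqlite file
                                options:nil
                                  error:NULL];

NSManagedObjectContext *writerContext =
    [[NSManagedObjectContext alloc] initWithConcurrencyType:NSPrivateQueueConcurrencyType];
writerContext.persistentStoreCoordinator = coordinator;

// Main-queue child feeds the UI; saving it only pushes changes up to the writer.
NSManagedObjectContext *mainContext =
    [[NSManagedObjectContext alloc] initWithConcurrencyType:NSMainQueueConcurrencyType];
mainContext.parentContext = writerContext;

// Save the UI context, then persist to disk in the background, on your own schedule.
[mainContext performBlock:^{
    [mainContext save:NULL];
    [writerContext performBlock:^{
        [writerContext save:NULL];
    }];
}];
```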
Even in a situation where I was building a document based application I would still not use UIManagedDocument. There is just too much going on behind the curtain.
Performance is likely to come down to how the rest of your code is implemented, and not necessarily the difference between a UIManagedDocument or a NSManagedObject.
Think of UIManagedDocument as a specific niche implementation of Core Data that already has the parts of Core Data you want baked into its structure, to save you (the developer) a little bit of time writing code. It's been purpose-built for handling UIDocuments and multithreading.
Under the hood, it's likely that UIManagedDocument uses Core Data as well as you would (assuming you know what you're doing), but it's theoretically possible you could cut some corners by knowing the exact details of your own implementation.
If you're new to CoreData, or GCD and NSOperationQueue, then you'll likely save a ton of developer time by leveraging UIManagedDocument instead of rolling your own.
A very apt analogy would be using an NSFetchedResultsController to run your UITableView, rather than rolling your own Core Data and UITableView implementation.
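To stretch that analogy, an NSFetchedResultsController takes only a few lines to stand up and then feeds your table view its sections and rows (entity and key names are illustrative; `context` is an assumed main-queue context):

```objc
NSFetchRequest *request = [NSFetchRequest fetchRequestWithEntityName:@"Note"];
request.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"modifiedAt" ascending:NO]];

NSFetchedResultsController *frc =
    [[NSFetchedResultsController alloc] initWithFetchRequest:request
                                        managedObjectContext:context
                                          sectionNameKeyPath:nil
                                                   cacheName:nil];
frc.delegate = self;   // delegate callbacks drive the table view's inserts/deletes/updates
[frc performFetch:NULL];

// In the table view data source, use:
//   [[frc sections][section] numberOfObjects]  and  [frc objectAtIndexPath:indexPath]
```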
If you're new to Objective-C, I'd recommend you bang out something functional with UIManagedDocument first. Later you can get lost in the weeds of dispatch_async() and NSFetchRequest, and pick up a few milliseconds of performance here and there.
Cheers!
Just store a plain text file on the disk for each note.
Keep a separate database (using SQLite, or perhaps just -[NSDictionary writeToFile:atomically:]) to store metadata, such as the last user modification date and the last server sync date for each note.
Periodically talk to the server, asking it for a list of stuff that has changed since the last time you did a sync, and sending any data on your end that has changed in the same time period.
This should be perfectly fast as long as you have less than a million note files. Do all network and filesystem operations on an NSOperationQueue.
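A sketch of that layout, assuming one plain text file per note plus a metadata plist, with the disk work pushed onto an NSOperationQueue (the note id and paths are made up):

```objc
NSOperationQueue *ioQueue = [[NSOperationQueue alloc] init];
ioQueue.maxConcurrentOperationCount = 1;   // serialize filesystem access

NSString *notesDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES)[0]
                      stringByAppendingPathComponent:@"Notes"];
NSString *metadataPath = [notesDir stringByAppendingPathComponent:@"metadata.plist"];

[ioQueue addOperationWithBlock:^{
    [[NSFileManager defaultManager] createDirectoryAtPath:notesDir
                               withIntermediateDirectories:YES
                                                attributes:nil
                                                     error:NULL];

    // One plain text file per note.
    NSString *notePath = [notesDir stringByAppendingPathComponent:@"note-1234.txt"];   // hypothetical note id
    [@"Buy milk" writeToFile:notePath atomically:YES encoding:NSUTF8StringEncoding error:NULL];

    // Separate metadata store: last local edit (and, after a sync, the last server sync date).
    NSMutableDictionary *metadata =
        [NSMutableDictionary dictionaryWithContentsOfFile:metadataPath] ?: [NSMutableDictionary dictionary];
    metadata[@"note-1234"] = @{@"modifiedAt": [NSDate date]};
    [metadata writeToFile:metadataPath atomically:YES];
}];
```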

What type of data store should I use for my ios app?

I am pretty new to iOS and to using servers, so forgive me.
I am building an ios app for research. I need to monitor things that the user does and then push it up to a server for analysis (yes, with user and IRB permission). On the client's side I need to keep quite a bit of data that won't really change except in the case of pulling an updated version from the server, and then a minimal amount of user-specific data. Most of the data I will collect needs to be pushed to a server for analysis and then can be deleted from the client side.
I am struggling to figure out what kind of data store I need to use, especially since I am not quite sure how the pushing and pulling from the server works yet. Does it make sense to use Core Data? XML? SQLite? I like the Core Data idea, but I am not sure what kind of problems I will run into when I need to move large amounts of data between it and the server. I imagine I might need to send data in a different form than it is stored in on either end, so what kind of overhead am I likely to run into converting that data? Is there a good format to save stuff in that would work well on both ends AND for sending the data?
As you can probably tell, I could use some advice. Thanks!
Core Data is probably the way to go.
Either Core Data or SQLite is likely to be great for this sort of app. Core Data actually uses SQLite behind the scenes. But Core Data has some advantages over SQLite and really is the preferred iOS database technology.
Regarding your performance concerns, I wouldn't worry about it. Core Data (or SQLite) is plenty fast enough. The bandwidth to the server will be the gating factor, so you should be fine there.
It sounds like your data structure is likely to be rich enough or large enough that I wouldn't contemplate other approaches (plists, NSUserDefaults, other file formats, etc.).

Keeping Core Data Objects in multiple stores

I'm developing an iOS application using Core Data. I want to have the persistent store located in a shared location, such as a network drive, so that multiple users can work on the data (at different times i.e. concurrency is not part of the question).
But I also want to offer the ability to work on the data "offline", i.e. by keeping a local persistent store on the iPad. So far, I read that I could do this to some degree by using the persistent store coordinator's migration function, but this seems to imply the old store is then invalidated. Furthermore, I don't necessarily want to move the complete store "offline", but just a part of it: going with the simple "company department" example that Apple offers, I want users to be able to check out one department, along with all the employees associated with that department (and all the attributes associated with each employee). Then, the users can work on the department data locally on their iPad and, some time later, synchronize those changes back to the server's persistent store.
So, what I need is to copy a core data object from one store to another, along with all objects referenced through relationships. And this copy process needs to also ensure that if an object already exists in the target persistent store, that it's overwritten rather than a new object added to the store (I am already giving each object a UID for another reason, so I might be able to re-use the UID).
From all I've seen so far, it looks like there is no simple way to synchronize or copy Core Data persistent stores, is that a fair assessment?
So would I really need to write a piece of code that does the following (a rough sketch follows the list):
retrieve object "A" through a MOC
retrieve all objects, across all entities, that have a relationship to object "A"
instantiate a new MOC for the target persistent store
for each object retrieved, check the target store if the object exists
if the object exists, overwrite it with the attributes from the object retrieved in steps 1 & 2
if the object doesn't exist, create it and set all attributes as per object retrieved in steps 1 & 2
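A rough sketch of steps 4 through 6, the upsert into the target store, assuming every entity has a `uid` string attribute as mentioned in the question; the attribute-copy loop ignores relationships, which would need a separate pass:

```objc
// sourceObject comes from the source store's MOC; targetContext is attached to the target store.
NSManagedObject *upsertObject(NSManagedObject *sourceObject, NSManagedObjectContext *targetContext) {
    NSString *entityName = sourceObject.entity.name;

    // Step 4: check whether an object with the same UID already exists in the target store.
    NSFetchRequest *request = [NSFetchRequest fetchRequestWithEntityName:entityName];
    request.predicate = [NSPredicate predicateWithFormat:@"uid == %@", [sourceObject valueForKey:@"uid"]];
    request.fetchLimit = 1;
    NSManagedObject *target = [[targetContext executeFetchRequest:request error:NULL] firstObject];

    if (target == nil) {
        // Step 6: it doesn't exist, so create it.
        target = [NSEntityDescription insertNewObjectForEntityForName:entityName
                                               inManagedObjectContext:targetContext];
    }
    // Step 5 (and 6): overwrite or set the plain attributes from the source object.
    for (NSString *key in sourceObject.entity.attributesByName) {
        [target setValue:[sourceObject valueForKey:key] forKey:key];
    }
    return target;
}
```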
While it's not the most complicated thing in the world to do, I would've still thought that this requirement for "online / offline editing" is common enough for some standard functionality be available for synchronizing parts of persistent stores?
Your points of view are greatly appreciated,
thanks,
da_h-man
I was just half-kidding with the comment above. You really are describing a pretty hard problem - it's very difficult to nail this sort of synchronization, and there's seldom, in any development environment, going to be a turn-key solution that will "just work". I think your pseudo-code description above is a pretty accurate description of what you'll need to do. Although some of the work of traversing the relationships and checking for existing objects can be generalized, you're talking about some potentially complicated exception handling situations - for example, if you're updating an object and only 1 out of 5 related objects is somehow out of date, do you throw away the update or apply part of it? You say "concurrency" is not part of the question, but if multiple users can "check out" objects at the same time, unless you plan to have a locking mechanism on those, you will start having conflicts when trying to make updates.
Something to check into are the new features in Core Data for leveraging iCloud - I doubt that's going to help with your problem, but it's generally related.
Since you want to be out on the network with your data, another thing to consider is whether Core Data is the right fit to your problem in general. Since Core Data is very much a technology designed to support the UI and MVC pattern in general, if your data needs are not especially bound to the UI, you might consider another type of DB solution.
If you are in fact leveraging Core Data in significant ways beyond just modeling, in terms of driving your UI, and you want to stick with it, I think you are correct in your analysis: you're going to have to roll your own solution. I think it will be a non-trivial thing to build and test.
An option to consider is CouchDB and an iOS implementation called TouchDB. It would mean adopting more of a document-oriented (JSON) approach to your problem, which may in fact be suitable, based on what you've described.
From what I've seen so far, I reckon the best approach is RestKit. It offers a Core Data wrapper that uses JSON to move data between remote and local stores. I haven't fully tried it yet, but from what the documentation says, it sounds quite powerful and ideally suited to my needs.
You should definitely check out these things:
Parse.com - a cloud-based data store
PFIncrementalStore https://github.com/sbonami/PFIncrementalStore - a subclass of NSIncrementalStore which allows your Persistent Store Coordinator to store data both locally and remotely (on the Parse cloud) at the same time
All of this is well documented. Parse.com is also going to release an iOS local datastore SDK http://blog.parse.com/2014/04/30/take-your-app-offline-with-parse-local-datastore/ which is going to help keep your data synced.
