Save NSUndoManager transactions one by one - ios

I need to save changes not only locally into Core Data, but on server too.
My concern is that, in my case, the user can perform a bunch of interactions in a short time. Between interactions there is not enough time to receive the success message returned from the server. So either I lock the GUI until the next message returns - this is what I do now - or I choose a different approach.
My new approach would be to let the user perform many interactions and put the transactions onto the undo stack provided by NSUndoManager, enabled on the NSManagedObjectContext, BUT save / commit ONLY the transaction for which a success message was received. How can I move the undo "cursor" one step at a time and commit records one by one, even though the context already contains plenty of unsaved changes?

NSUndoManager is not really suited to this task. You can tell it to undo or redo actions, but you can't inspect those actions or selectively save data in the current undo stack.
What I've done in the past is create my own queue of outgoing changes. Whenever changes are saved locally, add them to a list of un-synced outgoing changes. Then use a separate process to drain that queue: send each change to the server and, if the server reports success, clear it from the queue. You can use NSManagedObjectContextWillSaveNotification and/or NSManagedObjectContextDidSaveNotification to monitor changes and update the outbound queue.
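That outbound queue can be sketched roughly like this (all type and method names here are hypothetical, and the change representation is simplified to plain strings instead of real managed-object data):

```swift
import Foundation

// Sketch of an outbound change queue. Changes are appended when a local
// save succeeds and removed only after the server acknowledges them, so
// un-synced changes survive network failures.
struct PendingChange: Equatable {
    let objectID: String            // e.g. an NSManagedObjectID URI string
    let payload: [String: String]   // simplified change data
}

final class OutboundQueue {
    private var changes: [PendingChange] = []
    private let lock = NSLock()

    // Called from a save-notification handler.
    func enqueue(_ change: PendingChange) {
        lock.lock(); defer { lock.unlock() }
        changes.append(change)
    }

    // Hand the oldest change to the uploader; it stays queued until acked.
    func next() -> PendingChange? {
        lock.lock(); defer { lock.unlock() }
        return changes.first
    }

    // Called only when the server reports success for that change.
    func acknowledge(_ change: PendingChange) {
        lock.lock(); defer { lock.unlock() }
        if let i = changes.firstIndex(of: change) {
            changes.remove(at: i)
        }
    }
}
```

The important property is that removal happens only on acknowledgement, never on send, so a request lost to the network is simply retried later.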
This means that the iOS device may have queued changes that the server doesn't know about, especially if the network is unreliable or unavailable. That's pretty much unavoidable in those situations though, unless you do something awful like refuse to let people make new changes until the network comes back up.

Related

Synchronizing data between local data and database

I wanted to ask for the best solution for synchronizing large amounts of data between local data storage and a database (offline and online synchronization).
Let's say I have two tables and these tables contain customers.
Now if I change some data, e.g. the name of a customer, I have to send this to the server before I can pull the server-side update, otherwise there would be a conflict.
Let's imagine the app is offline and has no internet connection, so all the changes are made locally.
What I have been doing up till now is save the initial data pull locally and set queue states for the changes. If I change the name of a customer, I set that customer's queue state to AWAITING, and once the application is back online I run a query for all customers whose queue state is AWAITING and send them to the database for synchronization, before doing a data pull.
I believe this model is called eventual consistency, which I think is not very good, and I wonder if there is a better way to do this.
I had a different idea where I would create a queue in which every change is stored as JSON; once the app gets internet, the queue would be run and all of these API calls would be sent. But this works weirdly if, say, I change the customer's name from John to Matt and then back from Matt to John: two new API calls are added to the queue, first the change to Matt and then the change back to John, which is what we started with and is therefore not needed. Such changes are very hard to keep track of, and as it stands the synchronization feels very unstable.
Do you know any better approaches for synchronizing local (offline) changes with the server-side database?
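The John → Matt → John problem in the question can be avoided by coalescing edits per field instead of queueing one API call per edit. A minimal sketch, with hypothetical names and string-valued fields for simplicity:

```swift
import Foundation

// Coalescing queued edits per record field: keep only the latest value
// per field, and drop the entry entirely if the latest value equals what
// the server is already known to have (John -> Matt -> John becomes no
// call at all).
struct ChangeLog {
    // Last value the server is known to have, per field key.
    private let serverState: [String: String]
    private var pending: [String: String] = [:]

    init(serverState: [String: String]) {
        self.serverState = serverState
    }

    mutating func record(field: String, value: String) {
        if serverState[field] == value {
            pending.removeValue(forKey: field)   // back to server's value: no-op
        } else {
            pending[field] = value               // later edit replaces earlier one
        }
    }

    // What would actually be sent when the device comes back online.
    var outgoing: [String: String] { pending }
}
```

This trades a faithful replay of every user action for a single "current state vs. server state" diff, which is usually what a REST update endpoint wants anyway.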

Can RealmCollectionChange be used as a way of syncing data back to server?

I noticed that in Realm Swift, there is a RealmCollectionChange
https://realm.io/docs/swift/latest/#realm-notifications
It seems to contain the objects that have changed. Can I use that notification block to add code to sync the data back to a back end database?
Is the notification block running on the main queue?
You can certainly use the provided notification mechanisms to propagate changes to a server. You should make sure, though, that your requests to the server don't cause new changes once the server responds; otherwise you can run into a situation where you are constantly notified about new updates, as described in the related docs section User-Driven Updates.
The notification block runs on the thread on which you add it. But these APIs are only available on auto-updating Realms, which require a runloop. By default only the main thread has a runloop, unless you run additional ones yourself on dedicated background threads.
Be aware that synchronizing is a non-trivial problem, and these notifications alone won't give you a full solution for every challenge in that problem space.
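As a rough sketch of wiring such a notification block to an upload (assuming a Customer object type and a hypothetical uploadToServer function; the exact observation API depends on your Realm version):

```swift
import RealmSwift

// Sketch: observe a Realm collection and push changes to a server.
// Because the token is created on the main thread, the block is also
// delivered on the main thread's runloop.
final class CustomerSyncObserver {
    private var token: NotificationToken?

    func start(realm: Realm) {
        let customers = realm.objects(Customer.self)
        token = customers.observe { changes in
            switch changes {
            case .initial:
                break // first delivery contains the whole collection
            case .update(let results, _, let insertions, let modifications):
                // Push only what changed. Make sure handling the server
                // response does not write back into Realm, or you will be
                // notified about your own sync writes in a loop.
                for index in insertions + modifications {
                    uploadToServer(results[index]) // hypothetical
                }
            case .error(let error):
                print("Realm notification failed: \(error)")
            }
        }
    }

    deinit { token?.invalidate() }
}
```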

Realm database locking?

On sync I overwrite all my local data with the server's data. For this I first call realm.delete(realm.objects(MyObj)) for all my objects. Then I save the response's objects with realm.add(obj, update: false). Everything is in a single transaction. The payload can take a while to process but it's not big enough to justify implementing pagination.
Can the user use the app normally during this process? Can they store new items that are deleted in the clearing part of the transaction, or that would trigger an error or be overwritten during the adding part? If yes how can I avoid this?
Realm uses a Multi-Version-Concurrency-Control algorithm. This uses locks to ensure exclusive write, while other threads can keep reading previous versions of the data. We have an article on our blog, which explains how that works in more depth.
Be aware that what you attempt to solve here is a non-trivial challenge.
Can they store new items that are deleted in the clearing part of the transaction, or that would trigger an error or be overwritten during the adding part?
While the background transaction is in progress, other write transactions would be blocked. If you do these writes from the main thread, you would block the main thread. If you do them from background threads, they would queue up and be executed after your sync transaction is completed.
The objects which are deleted at the beginning become inaccessible (which you can check via invalidated), because write transactions always operate on the latest version. If your objects have consistent primary keys across your sync operations, you can use those to re-fetch them and redo all modifications on the fresh instances. But note that you need to store the primary keys (and all other object data) in memory before beginning the write transaction, which implies an implicit refresh.
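A sketch of that re-fetch pattern, assuming a Customer class with a string primary key and a draft value captured before the sync transaction (names hypothetical):

```swift
import RealmSwift

// Sketch: surviving a wipe-and-replace sync. Plain values (primary key,
// edited fields) are captured *before* the sync transaction; afterwards
// the old object instances are invalidated, so we re-fetch by primary
// key and reapply the user's edit to the fresh instance.
func reapplyDraft(realm: Realm, customerID: String, draftName: String) throws {
    try realm.write {
        // ... delete-all + re-add from the server response happens here ...
    }
    // Old object references are now invalidated; fetch the fresh instance.
    if let fresh = realm.object(ofType: Customer.self, forPrimaryKey: customerID) {
        try realm.write {
            fresh.name = draftName
        }
    }
}
```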

Responsive design: slow internet communication vs. high frequency of user interactions

Sometimes operations depend on each other; the user cannot update a record until it has been inserted. How can client changes be synchronized / uploaded to the server without blocking the GUI, if user interactions are more frequent than the internet connection allows?
In the current version of my app, I store changes in Core Data and at the same time send the change to the backend, and until the success message returns I block the GUI to keep client and server storage in a consistent state. But I know this is not a good approach, especially if there are a lot of controls in the same GUI and the user could manipulate them quickly with short delays. It is annoying to have to wait.
What general approach do you recommend for building a responsive app that hides the relative slowness of internet communication? Any good tutorial about it?
This is a theoretical question, and I expect a theoretical answer.
A good way is to have this setup with parent and child managed object contexts:
SavingContext (background, saves to persistent store)
MainContext (main thread, child context of saving context)
WorkerContext (background, child context of main context)
Thanks Marcus Zarra.
UI initiated changes in the model get saved right away in the main context. You then send them to the backend in a spawned worker context. You can have several of these without problem.
Once the response comes from the server, you save the changes in the worker context which "pushes" them up to the main context. Here you can define a merge policy that resolves any conflicts (for details please ask a new question).
The main context can now update the UI with the new information (if any).
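The three-context setup above could be wired roughly like this (a sketch; the merge policy shown is just one possible choice, and you would spawn a fresh worker context per server round trip):

```swift
import CoreData

// Sketch of the saving/main/worker context stack described above.
func makeContextStack(coordinator: NSPersistentStoreCoordinator)
    -> (saving: NSManagedObjectContext,
        main: NSManagedObjectContext,
        worker: NSManagedObjectContext) {

    // Background context that actually writes to the persistent store.
    let saving = NSManagedObjectContext(concurrencyType: .privateQueueConcurrencyType)
    saving.persistentStoreCoordinator = coordinator

    // Main-thread context for the UI; saving it pushes changes to `saving`.
    let main = NSManagedObjectContext(concurrencyType: .mainQueueConcurrencyType)
    main.parent = saving

    // Disposable background context for handling server responses; saving
    // it pushes changes up to `main`, where the UI can pick them up.
    let worker = NSManagedObjectContext(concurrencyType: .privateQueueConcurrencyType)
    worker.parent = main
    worker.mergePolicy = NSMergeByPropertyObjectTrumpMergePolicy

    return (saving, main, worker)
}
```

The point of the extra `saving` layer is that disk I/O happens off the main thread: saving the main context is cheap (an in-memory push to its parent), and the parent persists in the background.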

Core data with REST api [duplicate]

Hey, I'm working on the model layer for our app here.
Some of the requirements are like this:
It should work on iPhone OS 3.0+.
The source of our data is a RESTful Rails application.
We should cache the data locally using Core Data.
The client code (our UI controllers) should have as little knowledge about any network stuff as possible and should query/update the model with the Core Data API.
I've checked out the WWDC10 Session 117 on Building a Server-driven User Experience, spent some time checking out the Objective Resource, Core Resource, and RestfulCoreData frameworks.
The Objective Resource framework doesn't talk to Core Data on its own and is merely a REST client implementation. The Core Resource and RestfulCoreData all assume you talk to Core Data in your code and they solve all the nuts and bolts in the background on the model layer.
All looks okay so far, and initially I thought either Core Resource or RestfulCoreData would cover all of the above requirements, but... there are a couple of things neither of them seems to solve correctly:
The main thread should not be blocked while saving local updates to the server.
If the saving operation fails the error should be propagated to the UI and no changes should be saved to the local Core Data storage.
Core Resource happens to issue all of its requests to the server when you call - (BOOL)save:(NSError **)error on your Managed Object Context, and is therefore able to provide a correct NSError instance if the underlying requests to the server fail somehow. But it blocks the calling thread until the save operation finishes. FAIL.
RestfulCoreData keeps your -save: calls intact and doesn't introduce any additional waiting time for the client thread. It merely watches for the NSManagedObjectContextDidSaveNotification and then issues the corresponding requests to the server in the notification handler. But this way the -save: call always completes successfully (well, given Core Data is okay with the saved changes), and the client code that actually called it has no way to know the save might have failed to propagate to the server because of some 404 or 421 or whatever server-side error occurred. And even more, the local storage ends up with the updated data, but the server never knows about the changes. FAIL.
So, I'm looking for a possible solution / common practices in dealing with all these problems:
I don't want the calling thread to block on each -save: call while the network requests happen.
I want to somehow get notifications in the UI that some sync operation went wrong.
I want the actual Core Data save to fail as well if the server requests fail.
Any ideas?
You should really take a look at RestKit (http://restkit.org) for this use case. It is designed to solve the problems of modeling and syncing remote JSON resources to a local Core Data backed cache. It supports an offline mode for working entirely from the cache when there is no network available. All syncing occurs on a background thread (network access, payload parsing, and managed object context merging) and there is a rich set of delegate methods so you can tell what is going on.
There are three basic components:
The UI Action and persisting the change to CoreData
Persisting that change up to the server
Refreshing the UI with the response of the server
An NSOperation + NSOperationQueue will help keep the network requests orderly. A delegate protocol will help your UI classes understand what state the network requests are in, something like:
@protocol NetworkOperationDelegate
- (void)operation:(NSOperation *)op willSendRequest:(NSURLRequest *)request forChangedEntityWithId:(NSManagedObjectID *)entity;
- (void)operation:(NSOperation *)op didSuccessfullySendRequest:(NSURLRequest *)request forChangedEntityWithId:(NSManagedObjectID *)entity;
- (void)operation:(NSOperation *)op encounteredAnError:(NSError *)error afterSendingRequest:(NSURLRequest *)request forChangedEntityWithId:(NSManagedObjectID *)entity;
@end
The protocol format will of course depend on your specific use case but essentially what you're creating is a mechanism by which changes can be "pushed" up to your server.
Next there's the UI loop to consider, to keep your code clean it would be nice to call save: and have the changes automatically pushed up to the server. You can use NSManagedObjectContextDidSave notifications for this.
- (void)managedObjectContextDidSave:(NSNotification *)saveNotification {
    NSSet *inserted = [[saveNotification userInfo] objectForKey:NSInsertedObjectsKey];
    for (NSManagedObject *obj in inserted) {
        // create a new NSOperation for this entity which will invoke the appropriate REST API
        // add it to the operation queue
    }
    // do the same thing for the deleted and updated objects
}
The computational overhead for inserting the network operations should be rather low, however if it creates a noticeable lag on the UI you could simply grab the entity ids out of the save notification and create the operations on a background thread.
If your REST API supports batching, you could even send the entire array across at once and then notify your UI that multiple entities were synchronized.
The only issue I foresee, and for which there is no "real" solution is that the user will not want to wait for their changes to be pushed to the server to be allowed to make more changes. The only good paradigm I have come across is that you allow the user to keep editing objects, and batch their edits together when appropriate, i.e. you do not push on every save notification.
This becomes a sync problem and not one easy to solve. Here's what I'd do: In your iPhone UI use one context and then using another context (and another thread) download the data from your web service. Once it's all there go through the sync/importing processes recommended below and then refresh your UI after everything has imported properly. If things go bad while accessing the network, just roll back the changes in the non UI context. It's a bunch of work, but I think it's the best way to approach it.
Core Data: Efficiently Importing Data
Core Data: Change Management
Core Data: Multi-Threading with Core Data
You need a callback function that runs on the other thread (the one where the actual server interaction happens) and then puts the result code / error info into a semi-global data structure which is periodically checked by the UI thread. Make sure that writing the number that serves as the flag is atomic, or you are going to have a race condition: say your error response is 32 bytes, then you need an int (which has atomic access) that you keep in the off/false/not-ready state until the larger data block has been written, and only then flip it to "true" to flip the switch, so to speak.
For the correlated saving on the client side you have to either keep the data unsaved until you get an OK from the server, or make sure that you have some kind of rollback option - say, a way to delete if the server failed.
Beware that it's never going to be 100% safe unless you do a full two-phase commit procedure (the client save or delete can still fail after the signal from the server), but that's going to cost you 2 trips to the server at the very least (it might cost you 4 if your sole rollback option is delete).
Ideally, you'd run the whole blocking version of the operation on a separate thread, but you'd need iOS 4.0 for that.
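The flag-plus-payload handoff described above can be sketched like this (hypothetical names; an NSLock stands in for the atomic int, which is the simpler option in Swift):

```swift
import Foundation

// Sketch of a flag-plus-payload handoff between a networking thread and
// a polling UI thread. The payload is written first and only then is the
// ready flag flipped, so the reader never sees a half-written result.
final class SyncResultBox {
    private let lock = NSLock()
    private var payload: String?
    private var ready = false

    // Called from the networking thread when the server responds.
    func publish(_ result: String) {
        lock.lock(); defer { lock.unlock() }
        payload = result   // write the data first...
        ready = true       // ...then flip the flag
    }

    // Polled periodically by the UI thread; returns nil until a result
    // is ready, then hands it over exactly once.
    func take() -> String? {
        lock.lock(); defer { lock.unlock() }
        guard ready else { return nil }
        ready = false
        return payload
    }
}
```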
