Is there any way to add transactionality to NSUserDefaults? I need something like the well-known begin/commit/rollback operations on database handles, so that I can revert a modification to the user defaults in certain cases. Of course, other users of these user defaults would be blocked from writing during the transaction.
Note 1: The synchronize method of that class does not do this, because:
according to the docs, it is also called from time to time by the framework itself
there is no "revert"
Note 2: I have seen dictionaryRepresentation and registerDefaults, with which I could implement my own transaction mechanism (holding a copy of the old defaults in memory, or even saved to a plist, during the transaction). But maybe there is a ready-made solution for this?
My use case:
I have a wizard-like flow of screens where the user can edit some settings on each screen. In the current implementation, these settings are stored in the defaults immediately as the user moves to the next screen of the wizard. This wizard can be interrupted by other events (the user can even choose to exit/cancel the wizard at any screen), and in that case I would like to roll back the modifications.
One possible solution is to defer setting the values until the end of your wizard. This can easily be done, for example, with a proxy that records the messages sent to it and later replays them on the real NSUserDefaults. Recording the messages is pretty simple:
- (void) forwardInvocation: (NSInvocation*) invocation
{
    // Keep the arguments alive until the invocation is replayed later.
    [invocation retainArguments];
    [invocations addObject:invocation];
}
(An NSProxy subclass must also implement -methodSignatureForSelector: for forwarding to work; here it can simply return the signature from NSUserDefaults.)
Where invocations is a mutable array. Replaying the messages back is also simple:
- (void) replayOnTarget: (id) target
{
    for (NSInvocation *op in invocations)
        [op invokeWithTarget:target];
}
This way the wizard does not have to know anything about the transactions. It would get the recording proxy instead of the expected NSUserDefaults instance and send the messages as usual. After the calling code knows the wizard succeeded, it can replay the messages from the proxy on the shared user defaults. (I have added some sample code on GitHub.)
Maybe this is overkill, but since the recording proxy is generic and can be reused elsewhere, it may be worth it. The same thing can also be done with blocks:
[transaction addObject:[^{
    [defaults setObject:… forKey:…];
} copy]];
Where transaction is a mutable array, again. When the wizard succeeds, you would simply iterate over the array and execute the stored blocks.
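For illustration, here is a minimal Swift sketch of the same closure-recording idea; DefaultsTransaction is a hypothetical helper, not a Foundation API:

```swift
// Hypothetical helper: records writes as closures and applies them only on
// commit; rollback simply discards them, so nothing touches the defaults.
final class DefaultsTransaction {
    private var pendingWrites: [() -> Void] = []

    func record(_ write: @escaping () -> Void) {
        pendingWrites.append(write)     // defer the write
    }

    func commit() {
        pendingWrites.forEach { $0() }  // replay in recorded order
        pendingWrites.removeAll()
    }

    func rollback() {
        pendingWrites.removeAll()       // wizard cancelled: nothing was written
    }
}
```

Each wizard screen would call something like `transaction.record { UserDefaults.standard.set(value, forKey: key) }`, and the caller decides at the end whether to commit or roll back.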
Related
I have a login view controller in which the user enters their preferences, such as whether or not they want to activate certain UI features.
I store these as computed properties whose getters and setters access UserDefaults directly. Here is an example of one of them:
class Preferences {
    static var likesSpaghetti: Bool {
        set(likesSpaghetti) {
            UserDefaults.standard.set(likesSpaghetti, forKey: "likesSpaghetti")
        }
        get {
            return UserDefaults.standard.bool(forKey: "likesSpaghetti")
        }
    }
}
So that whenever I want to set any of these I simply write something like this:
Preferences.likesSpaghetti = false
Now, my question is: can I set these variables every time the user flicks the on/off switch, or should I keep the preference in a local variable and then only set:
Preferences.likesSpaghetti = spaghettiSwitch.isOn
when the user segues away from the loginViewController? Is every access of UserDefaults instant and quick, or is it laggy and best used sparingly?
Edit after closing this question: I learned not to optimize prematurely, and that this is probably fine within the scope of a few dozen elements. I'm going to update the defaults every time the user modifies anything, so that my code is easier to read and maintain.
Thanks everyone!
Your code is just fine. Don't worry about such optimizations until you actually encounter an issue. Trust that UserDefaults is implemented smartly (because it is). There is nothing "laggy" about setting something as simple as a Bool in UserDefaults.
You may also wish to review another one of my answers related to this question: When and why should you use NSUserDefaults's synchronize() method?
Actually, user defaults (backed by a plist file) exist precisely for this purpose: storing lightweight app settings. Mirroring many settings in variables consumes memory, and not writing a user's change through to the defaults immediately can lead to unexpected stale settings being read at the moment of the change, for example by a localized alert or by code (such as a push-notification callback) that checks the same setting the user believes has already taken effect.
Adding to both @rmaddy and @Sh_Khan: if you consider the security aspect, NSUserDefaults is meant exactly for app-specific details such as settings, preferences and configuration that are not security sensitive. Things like passwords, usernames and other sensitive data are not recommended to be stored in UserDefaults; for those you should use a service like the keychain, which is encrypted.
I have an iOS application and have released it twice. For better understanding, let's refer to the first release as V1 and the second release as V2.
There is a UITableView displaying a list of events fetched from the server. For better user experience and performance, I cache the events downloaded from the server, so the next time the user opens the app I can show a list of events immediately from the cached data instead of a loading indicator.
V1 app: the table view cell is configured with an event, which is an NSDictionary, and gets the name property like this:
// Get the event name
NSString *name = [event objectForKey:@"name"];
V2 app: after the event got more complicated, I decided to create a dedicated model for it, let's call it Event, and get the name property like this:
// Get the event name directly from the Event model's name property
NSString *name = event.name;
After releasing V2, here comes the problem: the V2 app crashes because it tries to read the name property from an event dictionary that was cached by the V1 app.
My question is: what strategy or best practice is there for invalidating cached data? Some methods come to mind, but they all have flaws.
1) Clean all cached data on every release.
This is the safest way, no matter how the program differs from the previous version, but throwing away all the cached data may be overkill.
2) Write more code to check whether the cached data needs to be ignored.
Taking my V2 app as an example, I might check the cached event data like this to ensure everything is fine:
// Validation code in the V2 app
if ([cachedEvent isKindOfClass:[Event class]]) {
    NSString *name = cachedEvent.name;
    // other code
} else {
    // Ignore the cached event and fetch the event from the server
}
This method works, but sometimes isKindOfClass: is not enough and you have to write more and more validation code. I can see it becoming very troublesome to check every piece of cached data.
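For what it's worth, one middle ground I can think of is to version the cache: stamp every cache write with a schema version and drop anything older on mismatch. A minimal Swift sketch (the version constant and function are hypothetical, not an existing API):

```swift
// Bump this whenever the cached format changes (e.g. V1 NSDictionary
// payloads -> V2 Event models); any cache written under an older schema
// is discarded wholesale instead of being validated field by field.
let currentCacheSchemaVersion = 2

func eventsFromCache(storedVersion: Int, cachedEvents: [String]) -> [String] {
    guard storedVersion == currentCacheSchemaVersion else {
        return []   // stale schema: ignore the cache and refetch from the server
    }
    return cachedEvents
}
```

In practice the stored version would live next to the cache (in UserDefaults, say), and a mismatch would also delete the cache files.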
I'd love to hear your thoughts and approaches. Thanks in advance.
There are several view controllers in my app where I need to sync local content with the server using a method running on a background thread. Sometimes I need to insert data into my database on the server if the user has created any. The approach I am using is to set a flag (something like isSynced = NO) on the objects that need to be synced with the server (these objects are in Core Data). When the syncing completes, the method clears the flag (isSynced = YES) so the object won't be sent again next time.
Now the problem is that the syncing method takes quite long to complete (1 or 2 seconds). If the user pops this particular view controller and quickly comes back, the previous call may still be in progress while the next one is kicked off. The consequence is possible duplication in the database.
My approach now is to have the syncing method called by a singleton object:
@property (nonatomic) BOOL isSyncing;

// Before each sync, check whether a sync is already in flight
if (!self.isSyncing) {
    self.isSyncing = YES;
    // sync with the server
    // when complete:
    self.isSyncing = NO;
    // post a notification so the view controller can reload its table
} else {
    // cancel, because the previous call has not finished
}
My concern is that if the call is cancelled, my view controller will never receive the notification it is waiting for. I could fix this by posting another notification in the event of cancellation, but I am wondering whether that is the right way to do it. This problem must be pretty common in iOS development, so I suspect there is a standard way to deal with it.
Your singleton approach may not be necessary. I don't see the harm in sending a database insert for each new object. You will still need to ensure each object is synced, that is, update the isSynced flag. Keep each object that needs to be synced in a "need to sync" list.
Then update the isSynced flag by performing a background query on the database to check whether the object exists, and use the result of the query to set the flag.
If the query result indicates the object is not in the database, resend the object and leave its isSynced flag set to NO.
If the query result indicates the object is in the database, set the isSynced flag to YES and remove the object from your "need to sync" list.
An approach to preventing duplicate database entries is to use a unique key. For example, tag each object with a hash based on the date and time, then configure the table to ensure each key is unique.
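As a Swift sketch of that last idea: mint a unique key on the client when the object is created, so a retried insert cannot produce a second row. The SyncRecord type and clientKey field are assumptions, and a UUID is used here instead of a date-based hash, since two objects created in the same instant would collide:

```swift
import Foundation

// Each record carries a key minted exactly once, at creation time. The
// server-side table declares this column UNIQUE, so resending the same
// record after a timeout is rejected as a duplicate instead of inserted twice.
struct SyncRecord {
    let payload: String
    let clientKey: String

    init(payload: String) {
        self.payload = payload
        self.clientKey = UUID().uuidString
    }
}
```

With this in place, the sync can safely be retried without the isSyncing gate: the worst case is a rejected duplicate insert, not a duplicate row.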
I pretty much know why my iPad app is crashing, but I'm having trouble coming up with a scheme to get around the scenario. The app is a jigsaw puzzle app. I've got a working version that's more stable than the one in the app store, but I still have a pesky problem I can't quite lick.
The root of the problem is a clash between user activity and automated saves. The save essentially stores the state of the puzzle as a property list. The property list contains, among other things, a compilation of all the palettes in the puzzle and, for each palette, the details of all the pieces on it. It works well, except that user activity can change these details. A palette is essentially a UIView containing puzzle pieces as subviews; a user can move pieces around on a palette or from palette to palette.
I have the save process working in two phases. The first phase is kicked off by a timer. At regular intervals, this phase checks whether there is user activity that warrants a save. It sets a property abortSave to NO and then starts a non-repeating timer that waits for another interval before starting phase two.
In phase two, the save takes place as long as abortSave is NO.
Meanwhile, if the user performs any operation that affects the save, abortSave is set to YES. The idea is that the delay between phase one and phase two is longer than a user operation takes, so if abortSave is still NO, it should be safe to save.
This process has eliminated 95% or so of the crashes, but I'm still getting crashes.
Of course, for decent performance of the app, both the user activity and the save operation take place on background threads.
The crash is usually a mutation during fast enumeration, or something similar: some user action is changing a collection during the save process. Copying the object being enumerated and working on the copy doesn't help; sometimes the error happens on the copy statement itself. If the object is an array, I use a regular for loop instead of fast enumeration to work through it, which helps a bit.
I hope this question isn't too generic. I suppose I could post some code, but I'm not sure how helpful it really would be. And I don't want to needlessly clutter the question.
One thing I have not tried yet is a flag working the other way: saveProcessActive is set to YES right before the save happens and to NO when it finishes, and all user actions are stalled while saveProcessActive is YES. The problem with this scenario is that it delays the user's action, potentially visibly, though perhaps any delay is insignificant: it only needs to last until the save's next check of abortSave. The aborted save process would then set saveProcessActive to NO when it acknowledges the abort request. Is there a better solution?
Making a copy of the current game state in memory should be a fast action. When you want to save, make that copy, and then hand it to your background queue to save it with dispatch_async(). Doing it this way gets rid of all the concurrency issues because each piece of data is only ever accessed on a single queue.
EDIT: Here is how I've typically addressed such issues (untested):
- (void)fireSave:(NSTimer *)timer {
    // Snapshot the model on the main thread; only the copy crosses threads.
    id thingToSave = [self.model copyOfThingToSave];
    dispatch_async(self.backgroundSavingSerialQueue, ^{
        [self writeToDisk:thingToSave];
    });
}
- (void)saveLater {
    // Debounce: cancel any pending save and schedule a new one.
    [self.timer invalidate];
    self.timer = [NSTimer scheduledTimerWithTimeInterval:5
                                                  target:self
                                                selector:@selector(fireSave:)
                                                userInfo:nil
                                                 repeats:NO];
}
Now, anywhere you modify data, you call [self saveLater]. Everything here is on the main thread except for writeToDisk: (which is passed a copy of the data). Since writeToDisk: always runs on its own serial queue, it also avoids race conditions, even if you ask it to save faster than it can.
You will need to synchronize access to the data, both while saving and while altering it during normal play. As writing to a file will likely take longer than making a copy, to minimize lock time you should make the copy while holding the lock, then release the lock and write the data to disk. There are a few ways to do this, but the easiest is a @synchronized block:
-(void) save
{
    NSMutableDictionary *old = self.data;
    NSDictionary *new;
    @synchronized(old) {
        new = [old copy];   // cheap snapshot taken under the lock
    }
    [self writeData:new];   // slow disk write happens outside the lock
}
And remember to synchronize changes too:
-(void) updateDataKey:(id<NSCopying>)key toValue:(id)val
{
    NSMutableDictionary *old = self.data;
    @synchronized(old) {
        old[key] = val;
    }
}
data obviously doesn't have to be an NSMutableDictionary; it was just a convenient example.
I'm optimising my first iOS app before it hits the store, noting methods which take seemingly large amounts of time. I have a fairly simple master-detail app where entities from the Core Data SQLite store are shown in a UITableView; tapping one brings up a detail view where the user can mark it as a favorite (setting a BOOL flag on that object to YES). As soon as they hit their Favorite button, I call save on the NSManagedObjectContext to ensure their changes are reflected immediately, and in case of an unscheduled termination, etc.
This save operation is currently taking around 205ms when testing on my iPhone 4S. There are around 4,500 entries in the database, each with a few strings, and a few boolean values (wrapped in NSNumbers).
First question: should it take 200 ms to make this change? I'm only setting one boolean value and then saving the context, but I've never used Core Data before, so I don't know whether this is normal.
Second question: the code I'm using is below – am I doing something wrong in the code to make the method take this long to execute?
- (IBAction) makeFavorite: (id) sender
{
    [self.delegate detailViewControllerDidMakeFavorite];
    [_selectedLine setIsLiked:[NSNumber numberWithBool:YES]];
    [_selectedLine setIsDisliked:[NSNumber numberWithBool:NO]];

    NSError *error = nil;
    if (![[[CDManager sharedManager] managedObjectContext] save:&error]) {
        NSLog(@"Saving changes failed: %@, %@", error, [error userInfo]);
    }
}
Perhaps I'm worrying over nothing (I am still a relatively new programmer), but on a wider note, 200 ms is enough of a delay to at least try to address, right? :)
Consider UIManagedDocument. It automatically handles saving in a background context. I especially recommend it if you are on iOS 6. If you are not passing object IDs around, or merging with other contexts, then you should be able to use it fairly easily and reliably.
Your simple use case seems tailor made for it.
1) Should the save of one change of boolean value take 200 ms?
Yes, it might take this long. You are performing an I/O operation, and according to the documentation:
When Core Data saves a SQLite store, SQLite updates just part of the store file. Loss of that partial update would be catastrophic, so you may want to ensure that the file is written correctly before your application continues. Unfortunately, doing so means that in some situations saving even a small set of changes to an SQLite store can take considerably longer than saving to, say, an XML store.
2) am I doing something wrong in the code to make the method take this long to execute?
No, you are saving directly to the store (under the assumption that you have no parent context).
3) Is 200 ms enough for me to at least try to address this issue?
Yes. 200 ms is a noticeable delay for a human and will be felt. You could try to perform the save in the background, but that is unsafe according to the documentation, or you could move the save to the end of the entire object-editing flow.
My advice would be to read up and see whether you can make some compromises in your context architecture (your Core Data stack structure).
From my experience, saving in the background is not that bad.