dealloc call order - iOS

I have a custom hierarchy of views. Each view retains its children.
Nobody but the parent retains a view.
When a view's dealloc is called, it calls [children release].
When I want to destroy a view and delete the related resources,
I call:
[mainView release];
[resourceManager deleteRelatedResources];
Most of the time it works well and the order of calls is:
mainView dealloc;
mainView's children's dealloc;
mainView's grandchildren's dealloc etc.
deleteRelatedResources
But sometimes (about 1% of the time) I get a different order:
mainView dealloc;
deleteRelatedResources
mainView's children's dealloc;
mainView's grandchildren's dealloc etc.;
I've found a recommendation from Apple not to rely on dealloc calls for resource management. Is it true that the children's dealloc might not be called right after [children release]? Are there any workarounds? (My project has gone too far to change its resource-management scheme.)

The order of dealloc invocation is indeterminate; if you have code with order dependencies between the dealloc implementations of different classes, consider that code broken.
Certainly, there are workarounds in that you can go to great lengths to try to guarantee order. But no workaround is bulletproof and you'll just be digging the hole deeper.
The only thing you can assume is that an object released in A's dealloc will always be deallocated after A. When? You can't definitively know.
(One issue is that any object is free to be retain/autoreleased at any time.)
One potentially quick fix is to add an invalidate pattern. That is, move resource management out of dealloc by implementing a method across all the classes with dependent resource management where you control the invocation of said method.
You then do something like:
[myObject invalidateAllResources]; // traverses object graph, in order, invalidating resources
[myObject release]; // do the normal release-maybe-dealloc dance
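A minimal sketch of that invalidate pattern, assuming a hypothetical Invalidatable protocol, an NSArray ivar named children, and a hypothetical per-view call on your resource manager (adapt the names to your hierarchy):
@protocol Invalidatable <NSObject>
// Free or unregister external resources; safe to call before the eventual release.
- (void)invalidateAllResources;
@end

// In each view class of the hierarchy:
- (void)invalidateAllResources {
    // Deterministic, parent-first traversal, independent of when dealloc runs.
    [resourceManager deleteResourcesForView:self]; // hypothetical per-view cleanup call
    for (id<Invalidatable> child in children) {
        [child invalidateAllResources];
    }
}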

My guess is that some of mainView's child views are being retained elsewhere for some reason (animation, or for some other use) so they aren't getting dealloc'd immediately.
You could test this by checking the retainCount of the view's children when it's being dealloc'd.
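For what it's worth, a rough debugging sketch of that check from the parent's dealloc; it assumes the children live in an NSArray ivar named children as in the question, and keep in mind that retainCount is only a hint and generally unreliable:
- (void)dealloc {
    for (id child in children) {
        // A count greater than 1 here suggests someone else still retains the child,
        // so its dealloc will be deferred until that other reference goes away.
        NSLog(@"%@ retainCount before release: %lu",
              child, (unsigned long)[child retainCount]);
    }
    [children release];
    [super dealloc];
}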

Related

ObjC: is an object released IMMEDIATELY when no one references it?

I'm having a problem with a view controller that's dismissed and not referenced but still in memory. Just wondering, in general: when is an object actually released from memory once no one references it?
The way I test it: I installed the PVC tool from Facebook and use it to print out the view hierarchy when the view controller is presented. After it's dismissed, I make sure no one is referencing it and pause execution so I can po the memory address of the view controller from the earlier PVC output, but I can still see the view controller instance there.
Thanks!
You appear to be confusing being released with being cleared from memory. When the object is destroyed, the memory it occupied is not zeroed, just like when you delete a file in the filesystem the disk blocks are not zeroed either.
This would simply take too much time and have very little benefit.
Being released simply means the memory the object occupied can now be re-used.
One way to see whether the object has been destroyed is to add a log in the dealloc method:
- (void)dealloc
{
    NSLog(@"I'm being destroyed");
}

Cocos2d iOS - different memory release behaviour of CCLayer

I'm pretty sure I somehow made a stupid mistake but I seem to be unable to fix it. I have a subclass of CCScene, which in turn has a subclass of CCLayer as a layer, roughly looking like this:
@interface MyLayer : CCLayer {
}
// some methods
@end

@interface MyScene : CCScene {
    MyLayer *_myLayer;
}
// some methods
@end
On constructing the scene I do the following:
-(id) init {
    if (self = [super init]) {
        _myLayer = [MyLayer node];
        [self addChild:_myLayer];
        // more stuff
    }
    return self;
}
I need the reference in _myLayer because I need to interact with the layer. However, this leaves me with a retain count of 2 (once in _myLayer, once as a child node of the scene). No problem so far - at least as I'm understanding it. This, however, also means, I have to release it. So, the dealloc of MyScene looks like this:
-(void) dealloc {
    [_myLayer release];
    _myLayer = nil;
    [super dealloc];
}
If I now release the scene during runtime, everything works fine. I followed the process of releases and it's all good; I can release the whole scene including the layer without problems. MyScene::release is called once and lowers the retain count by one when actively calling [_myLayer release], MyLayer::release is called once (through the [super dealloc] - delete all children in CCNode), everyone is happy.
However, as soon as I quit the whole game (kill the app on the device) and CCDirector::end is called, the whole thing breaks, because it in fact tries to release _myLayer twice - once with the explicit call, once through releasing the children.
Now I could understand that I made some kind of mistake if it were the same in both cases - but it isn't. In the first case it works as expected, lowering the retain count by one and then releasing it; in the other case it works differently. And I have no clue why that is.
I tried scrapping the [_myLayer release] altogether. In that case _myLayer doesn't get released at all during runtime, but everything works out fine during shutdown. So it's kind of consistent there, but that doesn't really help me.
First of all: retainCount is useless.
This here returns an autoreleased instance of MyLayer:
_myLayer = [MyLayer node];
It is the equivalent of:
_myLayer = [[[MyLayer alloc] init] autorelease];
If you were to leave it at that with no other code, _myLayer would become a dangling pointer some time after the init method returns - definitely the next time the autorelease pool is purged, which if I remember correctly happens every frame in cocos2d. Under ARC, the ivar by itself defaults to being a strong reference, so the layer would be retained and would not deallocate, as you would expect. So if you don't like autorelease and how it behaves, use ARC. ;)
Then you add the layer as child, which puts it in an array, which means this retains the layer:
[self addChild:_myLayer];
So as long as the layer remains a child of the scene, it will not deallocate.
And now, like so many before you, you were looking in the wrong place to fix the problem of the layer not releasing. By doing this in dealloc you add an extraneous release:
[_myLayer release];
Now this works fine for the moment because the actual problem is the layer not releasing, and you force it to be released here. However some time later the scene's children array will be released, which sends release to each object, which then causes the crash due to over-releasing the layer.
Hence the actual problem that you should be tracking down is why the layer doesn't deallocate. And here I sense more problems:
I can release the whole scene
If by that you mean you were sending the release message to the scene, then that's wrong. And again this would over-release the scene. Cocos2d will clean out the scene when you call replaceScene or similar methods. The scene itself is typically autoreleased as well, definitely when created through the node or scene class methods.
If that's not what you're doing and the layer doesn't release, then check if maybe you have a retain cycle. Or perhaps you're simply expecting the layers to deallocate before the scene? That doesn't necessarily have to be in this order, with autorelease no less.
You can easily create a retain cycle by having two (or more) sibling nodes holding on to each other (ie layer A retains layer B and layer B retains layer A) or by a sibling retaining its parent, for example _myLayer holding a retained reference to the scene (which btw is the same as accessing it via self.parent).
Now I'm saying to use ARC because it makes all those problems go away almost instantly, except for retain cycles. But for retain cycles it provides a very simple and effective cure: zeroing weak references.
For example you could declare the layer reference as weak in the interface and no longer worry about it retaining the layer:
__weak MyLayer *_myLayer;
Moreover, when the layer is released, _myLayer will automatically be set to nil. And this happens the instant the layer has no more strong references, not at some later time as is the case with autorelease.
The positive side effect is that you can now safely do the following:
@interface LayerA : CCLayer
@property (weak) LayerB* layerB;
@end

@interface LayerB : CCLayer
@property (weak) LayerA* layerA;
@end
Under MRC this would create a retain cycle if you assign the layers accordingly and don't set them to nil before the dealloc method. The tricky thing about retain cycles is that they can not be resolved within the dealloc method since by definition all objects participating in a retain cycle won't be deallocating. A typical place to do so in cocos2d is the cleanup method.
But now with ARC there is absolutely no problem. Either or both layers will deallocate because the reference in the other layer is not retaining (weak) and when either of the layers is deallocated the reference is set to nil, so there won't be any crashes.
I have been developing exclusively with ARC for two years. I never looked back. My rate of errors relating to memory management went down to almost zero, it's ridiculous. About the only thing I occasionally have to look into is when a weak reference is nil when I don't expect it to be. Usually that's when I incorrectly assume the object has a strong reference somewhere, but it doesn't.
Using ARC is such a huge timesaver that even if you really really really really absolutely badly want to learn this, you better be really really really really almost exclusively interested in how memory management used to work back in the old days of Objective-C development. You know, like when my grandma was still writing code for the very first iPhone! :)
PS: it's a myth that you give up control over memory management when using ARC. There are a lot of neat tricks you can do to gain control back, even for a short time, mainly by using bridge casting. So even if ARC can take a couple more cycles and that may add up in a tight loop, you can still hand-optimize the code (if you have to; you'll probably never ever have to) with MRC. This is beyond the scope of this answer (I already went too far), but these options do exist; one hardly ever needs to exercise them.
This is why I'm brash about using ARC because not doing so is borderline irresponsible. IMO.
In dealloc, don't release _myLayer; it is already being retained for you (when you addChild:). Instead, I tend to call removeAllChildrenWithCleanup:YES, which will release _myLayer. What happens here (I suspect) is that on end, the director is trying to do just that, BUT you already released _myLayer, so it gets a zombie in its array of children.
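Under that approach, and assuming cocos2d's stock CCNode API, MyScene's dealloc would look roughly like this (a sketch, not a drop-in fix):
-(void) dealloc {
    // Don't release _myLayer directly; the children array is what owns it.
    // Removing the children (with cleanup) drops that ownership exactly once.
    [self removeAllChildrenWithCleanup:YES];
    _myLayer = nil; // just clear the convenience pointer, no release
    [super dealloc];
}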

Strange behaviour during deallocation of an object

I have a view, which is a subclass of UIWebView. It has a property called Contact, which is a managed object. The view uses a templating engine to create HTML from the object and then loads it into the UIWebView. I thought it would be a better idea to monitor the object in the view itself, so that whenever something changes in the object, the view refreshes automatically. So I observe certain attributes of the managed object in the view itself. And then, to deal with notification coalescing, I made it so the reload is done with
[self performSelector:@selector(refresh) withObject:nil afterDelay:0];
It refreshes the web view automatically whenever it finds a change, but it also gives a strange crash. The crash says [MyWebView retain] message sent to deallocated object. I know I have properly removed the observers in the dealloc method. But it seems like dealloc gets triggered after a while. I have a strange issue related to releasing the view: the view stays around for a while, although the view controller is already released, and then releases after maybe 2-3 seconds. It is really strange. I think the crash is because of this.
Please suggest any ideas; I will be glad to hear your suggestions. Something is certainly wrong; if anybody could point it out, I would really be grateful.
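For reference, a sketch of the kind of observe-and-coalesce setup described above (the key path and helper method names are illustrative, not taken from the actual code):
- (void)startObservingContact {
    [self.contact addObserver:self forKeyPath:@"fullName" options:0 context:NULL];
}

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context {
    // Coalesce several change notifications into a single refresh on the next run-loop pass.
    [NSObject cancelPreviousPerformRequestsWithTarget:self
                                             selector:@selector(refresh)
                                               object:nil];
    [self performSelector:@selector(refresh) withObject:nil afterDelay:0];
}

- (void)dealloc {
    // Cancel any pending refresh and stop observing before the view goes away.
    [NSObject cancelPreviousPerformRequestsWithTarget:self];
    [self.contact removeObserver:self forKeyPath:@"fullName"];
    [super dealloc];
}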
Using the delegate design pattern can cause EXC_BAD_ACCESS / KERN_INVALID_ADDRESS crashes if not used properly. If you have processing running on background threads that uses the delegate design pattern, where you set SELF as the delegate of an object, then you must remove SELF as the delegate in the dealloc method (even under ARC) by setting the delegate reference to nil. Otherwise there is a possibility that the object will try to call back into your deallocated object through the delegate. So if you have something like this in your object:
[_xmlParser setDelegate:self];
then you should always have a dealloc method, even under ARC, to prevent the possibility of a crash in the case where your object gets destroyed while still doing work. It is very common for your object to be destroyed while doing work. Imagine a UIViewController that shows images from the internet. If you had a FetchImage class that used the delegate design pattern to look up images and then calls a routine on the object when the lookup finishes, it is easy for the user to pop into and out of your UIViewController while your FetchImage object is still doing work on the background thread. You might not ever notice this when testing, but if you have hundreds of users, some of them will notice, because the app will crash when your object tries to call a method on the SELF reference.
If your object uses the delegate design pattern, always have this to cleanup:
#pragma mark - dealloc - cleanup delegate references to prevent callbacks into deallocated objects (EXC_BAD_ACCESS / KERN_INVALID_ADDRESS)
- (void)dealloc
{
    [_xmlParser setDelegate:nil];
    // for non-ARC code you would also call: [super dealloc];
}
Search every class in your project: if you have setDelegate:self or delegate = self and you don't have a dealloc cleanup method as described above, then your users are most likely experiencing race-condition crashes with your app. If you don't have the dealloc, add it even if you never see crashes when testing. -rrh

What data is better to initialize in loadView versus init

When memory is low and views get cleaned up by the OS, my understanding is that viewDidUnload is an appropriate place to clean up objects and memory used by your UIViewController (that otherwise wouldn't get cleaned up as a function of being in the view hierarchy). This data is then re-initialized when loadView is called again to create the view. Can someone give examples of what sort of things might be cleaned up (and likewise initialized in loadView)?
I have some data I currently initialize in loadView which sets the stage for my view controller to run a complex animation script involving captions, images, etc. I figured it would make sense to release and clean up that data if my view were removed by the OS (and viewDidUnload were called), but then I thought to myself: why wouldn't I just initialize that data in init and clean it up in dealloc instead of repeatedly initializing and cleaning up the same data (it doesn't change as a function of when the view is loaded or shown)? Would this be a better place for it?
Basically, my thinking is:
Yes, I should just initialize it in init and release it in dealloc because it never changes.
Initializing things in loadView (and subsequently cleaning up in viewDidUnload) is an appropriate practice when either that data will initialize differently based on when the view is loaded (or, even more appropriately, when the view appears, in viewWillAppear/viewWillDisappear), or it is a good candidate for freeing memory because it takes up a lot of memory that you'd like to see freed if the view is not active.
Can anyone give me some clarification on my questions and/or my line of thinking?
If you're going to be going back and forth between that view and another and the view controller will be kept around, you could indeed move the initialisation to init and clean it up in dealloc. But you would also want to clean it up in - (void)didReceiveMemoryWarning (be careful not to use self.view in didReceiveMemoryWarning, otherwise that'll reload the view :) ). Then you could use lazy loading to re-create it in viewDidLoad (i.e. if it doesn't already exist, initialise the data; otherwise don't), as sketched below.
Of course, you can't do any initialisation in init that depends on the view being present; viewDidLoad is the place for that.
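A rough sketch of that arrangement under MRC; AnimationScript and the _animationData ivar are hypothetical stand-ins for whatever expensive data your controller keeps:
- (void)viewDidLoad {
    [super viewDidLoad];
    // Lazy loading: only build the data if it doesn't already exist.
    if (_animationData == nil) {
        _animationData = [[AnimationScript alloc] init]; // hypothetical expensive setup
    }
}

- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
    // Don't touch self.view in here, or you'll force the view to reload.
    [_animationData release];
    _animationData = nil;
}

- (void)dealloc {
    [_animationData release];
    [super dealloc];
}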

UIViewController prevent view from unloading

When my iPhone app receives a memory warning, the views of UIViewControllers that are not currently visible get unloaded. In one particular controller, unloading the view and the outlets is rather fatal.
I'm looking for a way to prevent this view from being unloaded. I find this behavior rather stupid - I have a cache mechanism, so when a memory warning comes I unload tons of data myself and free enough memory, but I definitely need this view untouched.
I see UIViewController has a method unloadViewIfReloadable, which gets called when the memory warning comes. Does anybody know how to tell Cocoa Touch that my view is not reloadable?
Any other suggestions how to prevent my view from being unloaded on memory warning?
Thanks in advance
Apple docs about the view life cycle of a view controller say:
didReceiveMemoryWarning - The default implementation releases the view only if it determines that it is safe to do so
Now ... I override didReceiveMemoryWarning with an empty function which just calls NSLog to let me know a warning was received. However, the view gets unloaded anyway. Plus, on what criteria exactly is it decided whether a view is safe to unload ... oh! So many questions!
According to the docs, the default implementation of didReceiveMemoryWarning releases the view if it is safe to do so (i.e. superview == nil).
To prevent the view from being released you could override didReceiveMemoryWarning, but in your implementation do not call [super didReceiveMemoryWarning]. That's where the view is released by default (if not visible).
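Something along these lines (a sketch; as discussed further down, skipping super may have other side effects):
- (void)didReceiveMemoryWarning {
    // Deliberately NOT calling [super didReceiveMemoryWarning], which is where
    // the default implementation releases an off-screen view.
    NSLog(@"Memory warning received; keeping the view loaded");
}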
The default didReceiveMemoryWarning releases the view by calling [viewController setView:nil], so you could override that instead.
What appears to be working for me is to override setView: to ignore setting it to nil. It's kludgy, but then, this is a kludgy issue, and this did the trick:
-(void)setView:(UIView*)view {
    // okayToUnloadView is a custom BOOL flag on this controller; only allow the
    // view to be cleared when we explicitly say it's okay.
    if(view != nil || self.okayToUnloadView) {
        [super setView:view];
    }
}
Could it be so simple?
Even though this is mentioned nowhere in the documentation, it seems that if I explicitly retain my view in viewDidLoad, then it does not get released on a memory warning. I tried with several consecutive warnings in the simulator and all still seems good.
So ... the trick for the moment is a retain in viewDidLoad and a release in dealloc - this way the view controller is "stuck" with the view until the time it needs to be released.
I'll test some more, and write about the results
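In code, that workaround amounts to something like this (a sketch; _retainedView is a hypothetical UIView ivar added to hold the extra reference, and as the next answer notes, this effectively leaks the view):
- (void)viewDidLoad {
    [super viewDidLoad];
    _retainedView = [self.view retain]; // extra retain so a memory warning can't drop the view
}

- (void)dealloc {
    [_retainedView release]; // balance the extra retain from viewDidLoad
    [super dealloc];
}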
I don't think any of these ideas work. I tried overriding didReceiveMemoryWarning, and that worked for some phones, but I found one phone unloaded the view BEFORE that method was even called (it must have been in extremely low memory or something). Overriding setView: produces loads of log warnings, so I wouldn't risk that with Apple. Retaining the view will just leak that view - it'll prevent crashes but not really work - the view will be replaced next time the controller's UI is loaded.
So really you've just got to plan on your views being unloaded any time they're off-screen, which is not ideal but there you go. The best patterns I've found to work with this are immediate commit so your UI is always up-to-date, or copy-edit-copy, where you copy your model to a temporary instance, populate your views and use immediate commit with that instance, then copy the changes back to your original model when the user hits 'save' or whatever.
Because the accepted solution has problems with viewDidUnload still getting called even though the view was blocked from being cleared, I'm using a different though still fragile approach. The system unloads the view using an unloadViewForced: message to the controller so I'm intercepting that to block the message. This prevents the confused call to viewDidUnload. Here's the code:
@interface UIViewController (Private)
- (void)unloadViewForced:(BOOL)forced;
@end

// In the view controller subclass (_safeToUnloadView is a BOOL ivar you manage):
- (void)unloadViewForced:(BOOL)forced {
    if (!_safeToUnloadView) {
        return;
    }
    [super unloadViewForced:forced];
}
This has obvious problems since it's intercepting an undocumented message in UIViewController.
progrmr posted an answer above which recommends intercepting didReceiveMemoryWarning instead. Based on the stack traces I've seen, intercepting that should also work. I haven't tried that route though because I'm concerned there may be other memory cleanup which would also be blocked (such as causing it to not call child view controllers with the memory warning message).
