I'm working on my first ARC & Core Data project, basing this stage on Xcode's (Universal) Master-Detail template.
I note that Xcode 5 has a memory display in the Debug Navigator, but when I use it, its graph bears little resemblance to the memory usage displayed in Instruments when running a Leaks & Allocations trace.
I've done the Instruments tracing with the Simulator (simulating both iPhone & iPad, in case the 'unloading' of the detail view with the latter makes a difference) and on an iPad 2 & an iPod touch. The results are broadly the same:
iPhone 6.1 simulator
Generation A--------1.13 MB
Generation B--------397.70 KB
Generation C--------76.96 KB
Generation D--------11.70 KB
Generation E--------1.56 KB
Generation F--------3.48 KB
an overall growth of about 30%
where Generation A shows the growth up to the loading of the Master table, and each successive Generation shows the growth after the Detail view has been visited and interacted with (entailing the fetching of NSManagedObjects and the creation of NSObjects, respectively).
The growth trend with the other devices was broadly similar (with the Generation A growth being iPad simulator: 1.42 MB; iPad 2: 1.57 MB; iPod touch: 0.94 MB, but tailing off similarly).
According to the Debug Navigator, however, the total usage at each point comes out at:
iPhone 6.1 Debug Navigator
Generation A--------4.2 MB
Generation B--------6.9 MB--growth 2.7 MB
Generation C--------7.1 MB--growth 0.2 MB
Generation D--------7.8 MB--growth 0.7 MB
Generation E--------8.0 MB--growth 0.2 MB
Generation F--------8.4 MB--growth 0.4 MB
an overall growth of 100%!
Referring to other similar questions, I don't have Zombies enabled.
Have others seen such discrepancies? Am I right in being inclined to trust Instruments over the Debug Navigator's summary figure?
PS. The Debug Navigator's summary figure doesn't seem to be available when running on the real devices (both on versions of iOS 5). Is this normal?
This may not be a complete answer for you, but it is the explanation I arrived at after researching this issue myself.
The debug navigator shows the same thing as the "Activity Monitor" instrument. It is not showing the memory currently allocated by your app; it's showing the memory currently granted to your app by the OS.
Say I create a for loop to create lots of objects in memory, but then I discard half of them because they don't fit my search criteria (bad coding, I know, but hypothetically here). The OS gets a request from my app for enough memory to create all of the objects, but when I check my allocations in Instruments after the loop, it shows only the surviving objects, because ARC has already released the discarded ones. The OS may or may not notice those releases, but it doesn't immediately take back the memory it just gave you. I'm not sure of the overhead of handing memory to an app and reclaiming it, but I'm sure that's taken into account. I've noticed that if I leave my app alone long enough, the OS takes back some of the memory I'm not using.
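To make that hypothetical concrete, here is a minimal sketch (the MyRecord class, its initializer, and the matchesSearchCriteria filter are all made up for illustration):

    // Hypothetical sketch; MyRecord and -matchesSearchCriteria are invented names.
    NSMutableArray *kept = [NSMutableArray array];
    for (NSUInteger i = 0; i < 100000; i++) {
        MyRecord *record = [[MyRecord alloc] initWithIndex:i];
        if ([record matchesSearchCriteria]) {
            [kept addObject:record];
        }
        // Records that don't match are released right here by ARC, so Instruments'
        // live allocations only show the survivors. The OS, however, already handed
        // the process enough memory for the peak, and doesn't take it back immediately.
    }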
Just think of the debugging memory information as your app's full memory allotment from the OS. You may not be using all of it, but the OS gave it to you anyway (for one reason or another). This number will increase based on your app's requests and usage. It will decrease when memory-pressure warnings or inactivity lead the OS to decide it can safely reclaim memory from you. It will likely never match Instruments' allocated-memory figure, because there is always transient memory used in applications that has to be allocated somewhere, even if only for a short time.
Again, this is my conclusion based on when I was wondering the same thing you are. Hope it helps a little.
Related
I'm trying to debug why our SceneKit-based app is using so much memory but Xcode and Instruments / Allocations seem to have very different values for the amount of memory being used. When I look in Xcode I see something like 600 MB but when I transfer the same running session over to Instruments / Allocations, I see a very different number for persistent bytes, like 150 MB.
Which one is correct? Why the difference? Are they measuring different things?
(Regardless of whether I transfer an Xcode debug session or start fresh in Instruments, it doesn't seem to make much difference.)
The reason that I care is that iOS is killing the app for excessive memory use (according to Xcode) but I can't seem to find the problem via Instruments.
I've tried turning off all GPU and Metal debug options but they don't seem to make a difference.
Which one is correct?
My intuition is: Instruments. It uses DTrace to (sorry) instrument your code and watch actual allocations and deallocations as they happen, at the expense of performance. The Xcode debug navigator memory graph is more of an outside view designed to give a very general sense of what's happening. That is exactly why the latter offers you a way to switch to the former: because that (Instruments) is where you're going to get real measurements.
(However, let's keep in mind that Instruments may fail to include in the total you're seeing some virtual memory backing stores for graphics. There are plenty of WWDC videos discussing this topic in more detail.)
I know that this answer is quite late, but for the sake of future developers with the same problem, I would advise you to check the images in your assets folder. If any of your images have dimensions larger than 1000 x 1000, you should scale them down. A 1000 x 1000 image comprises 1,000,000 pixels. Given how images are decoded (4 bytes per pixel), that means 4 MB of memory is used just to load the image. Unbeknownst to me, I had an image of roughly 3600 x 4000 in my assets folder. Doing the maths, that was over 50 MB of memory usage!
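As a rough way to reason about this, here is a small sketch of the estimate being used (assuming the usual 4 bytes per pixel for an RGBA bitmap; the helper name is mine, not an API):

    #import <UIKit/UIKit.h>

    // Rough estimate of the memory a UIImage needs once decoded:
    // pixel width x pixel height x 4 bytes per pixel (RGBA).
    // Helper name is illustrative, not a UIKit API.
    static NSUInteger estimatedDecodedSizeInBytes(UIImage *image) {
        NSUInteger pixelWidth  = (NSUInteger)(image.size.width  * image.scale);
        NSUInteger pixelHeight = (NSUInteger)(image.size.height * image.scale);
        return pixelWidth * pixelHeight * 4;
    }

    // e.g. a 3600 x 4000 image: 3600 * 4000 * 4 bytes ≈ 57.6 MB once decoded,
    // no matter how small the compressed file is on disk.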
Hello, my question is perhaps general; I am not asking for code.
I am developing only for iPhones, with iOS 6.1 and above.
When I run my application, the RAM it uses only ever grows as I switch between views (I have around 15 views).
However, after I ran the static analyzer, it didn't find any leaks.
No leaks were found by the Leaks instrument either.
Although my application doesn't exceed 20 MB of RAM, I am still worried that something may not be right.
I am using ARC, but the RAM usage still goes up.
Is there any way I can check what could cause this one-way growth in RAM allocation?
If the memory continues to go up, it could be a variety of different things, but a strong reference cycle is the prime suspect. Sadly, this won't necessarily show up in the Leaks tool in Instruments, either.
Take snapshots/generations in the Allocations tool and identify what's not getting released (notably whether it includes any of your classes) and go from there. Specifically, run the app through its paces, then mark a snapshot/generation, do a bit more, and then mark another snapshot/generation. Look at that second snapshot and see what was allocated (but not released) since the prior snapshot, with a focus on your classes. You'll find the culprit pretty quickly that way.
See the WWDC video iOS App Performance: Memory for a practical demonstration.
For example, here is a healthy app that I profiled with Instruments' "Leaks" template, though I'm going to focus on the "Allocations" tool:
In this profile, I waited for the app to quiet down, then tapped the "Mark Generation" button (resulting in "Generation A", the first flag in my timeline). I then went to a view, dismissed it, and did "Mark Generation" again, getting "Generation B". The "Growth" column tells me that between Generations A and B, 100 KB was consumed but not released. I'm not worried about this yet, because there might be some internal iOS caching of UIKit elements. So I repeat the process one more time to get "Generation C". Now that's interesting: it reports a growth of only 8.26 KB, which is negligible. This, combined with a clean bill of health from the Leaks instrument, makes me feel pretty confident that there are no serious memory problems.
Now, let's contrast that with some code that has a seriously problematic "strong reference cycle":
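(The original post included a screenshot of the offending code and its Allocations profile here. As a stand-in, below is a minimal sketch of the kind of cycle being described; the class names are invented, and the pattern is simply a presented view controller and a helper that hold strong references to each other, plus a block that captures self strongly.)

    #import <UIKit/UIKit.h>

    // Illustrative names only; not the code from the original screenshot.
    @interface DataPoller : NSObject
    @property (nonatomic, strong) id owner;            // strong back-reference: bug #1
    @property (nonatomic, copy) void (^tick)(void);
    @end
    @implementation DataPoller
    @end

    @interface LeakyDetailViewController : UIViewController
    @property (nonatomic, strong) DataPoller *poller;
    @end

    @implementation LeakyDetailViewController
    - (void)viewDidLoad {
        [super viewDidLoad];
        self.poller = [DataPoller new];
        self.poller.owner = self;                      // self -> poller -> self: a cycle
        self.poller.tick = ^{
            [self refresh];                            // block captures self strongly: bug #2
        };
    }
    - (void)refresh { /* reload data */ }
    @end

    // Fix: declare `owner` as weak, and capture a weak reference in the block:
    //   __weak typeof(self) weakSelf = self;
    //   self.poller.tick = ^{ [weakSelf refresh]; };

With either bug in place, dismissing the view controller never deallocates it, so each present/dismiss pass adds another copy to the heap, exactly the stair-step growth described next.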
Now this is a completely different picture, even though the process was the same "present and dismiss" process, repeated twice. This is now telling me that I had a 14mb growth between generations, and more notably, I can clearly see the problematic growth curve. What's remarkable is that while the Allocations tool is clearly catching a serious problem, the Leaks tool reports nothing.
Now, in practice, the real-world experience with the Allocations tool will probably rest somewhere between these two extremes. Your app may have its own caches or model objects that slowly take up memory, but if you're properly responding to memory warnings, you should be recovering that memory. Frankly, though, most well designed apps should not be generating memory warnings at all (usually accomplished by properly configuring caches, avoiding imageNamed where appropriate, moving to persistent storage for large or infrequently accessed data, etc.). The goal is to get to a point where the app stabilizes around some reasonable baseline memory allocation level, consistently returning back to that baseline.
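As one illustration of "properly configuring caches" (this is not code from the answer itself), purgeable data can go in an NSCache and anything rebuildable can be dropped on a memory warning; a minimal sketch, with the class name and count limit chosen arbitrarily:

    #import <UIKit/UIKit.h>

    @interface ThumbnailStore : NSObject   // illustrative class name
    @property (nonatomic, strong) NSCache *cache;
    @end

    @implementation ThumbnailStore
    - (instancetype)init {
        if ((self = [super init])) {
            _cache = [NSCache new];
            _cache.countLimit = 100;       // arbitrary cap on cached thumbnails
            [[NSNotificationCenter defaultCenter]
                addObserver:self
                   selector:@selector(purge)
                       name:UIApplicationDidReceiveMemoryWarningNotification
                     object:nil];
        }
        return self;
    }
    - (void)purge {
        [self.cache removeAllObjects];     // drop rebuildable data under pressure
    }
    - (void)dealloc {
        [[NSNotificationCenter defaultCenter] removeObserver:self];
    }
    @end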
But it's going to be impossible for us to advise you until you do some basic profiling of your app and diagnose the sorts of memory issues that you're having.
I'm in the process of learning how to put Instruments to better use. I just finished a leak-management exercise, and Instruments is reporting very few leaks. I'll figure those out later. In the meantime, my app is crashing, and it appears to be related to memory pressure.
So I looked at this in Instruments. I have Allocations and Memory Monitor in use. Allocations shows a pretty steady 3 to 4 MB of Live Bytes while I just let my app initialize and come to equilibrium. Overall Bytes, however, jumps to over 50 MB. I didn't think much of this until I looked at Memory Monitor and saw that memory usage goes up and down, causing memory warnings. (It seems strange to me that this doesn't show up on the Allocations graph at the same time.)
The app should be at an equilibrium point, but apparently it's not. My question is: how can I use Instruments to help me understand why memory usage is rising and falling?
Instruments as a debugging tool is simply excellent. From what I can understand, you have been trying to use the Allocations tool, so I'll go over that. Allocations details the number of objects your application allocates during its execution, along with their in-memory references, locations, and even the calling code that allocates them. When Instruments starts running the Allocations tool, your application begins reporting all allocations as blue dots, which pile up higher and higher as your application executes (naturally, as you allocate more and more objects). Overall Bytes displays the memory of every allocation your app has ever made, added together. I want to stress this for your case: it does not mean your app is currently using 50 MB of memory! It just means your app has used 50 MB in total over its run. Your app is obviously limited by the amount of memory the device has, and 3-4 MB is not a lot when you consider that the first-generation iPhone had about 128 MB; for more complicated applications, the OS will usually kill off other applications before it kills yours.
As for the other allocations graph, the one with spikes rather than a continuous line: it shows the number of allocations happening at each point in time. Usually the spikes can be ignored, unless there are a lot of large spikes within a short period.
Anyway, to address your specific memory-warning problem, it honestly depends on how many memory warnings you are receiving and at what level. And as for your leaks, my only advice is: squash them as soon as possible! When you see a leak (a red bar in the Leaks tool), click on the bar and find the objects that are being leaked. When you select a leaked object and then open the right sidebar, it will show you the code that is leaking. When you double-click on any part of the right sidebar, it will even open the specific line and class the leak originated from!
I am profiling an application with Instruments. The profiling is done using the Allocations tool in two ways:
By choosing Allocations directly when I run the app for profiling
By choosing Leaks when I run the app for profiling
In both cases I had the Allocations tool enabled for testing. But surprisingly, I got two different kinds of output for Allocations in these two runs.
Are they supposed to behave differently, or is this a problem with Instruments?
When I profile with the Leaks template:
In the Allocations graph:
1. I get a lot of peaks in the graph; the Live Bytes and Overall Bytes figures are the same.
2. I get black flags (I think they signal memory warnings) after about a minute of use. Then, after a series of flags, my app crashes. (This happens at times even when I run the app directly on the device.)
When I profile with the Allocations template:
In the Allocations graph:
1. I do not get peaks as often as in the case above. Live Bytes was always far less than Overall Bytes.
2. I used the app for over 20 minutes and never got black flags.
One thing I have learned is that when Live Bytes and Overall Bytes are equal, NSZombieEnabled may be turned on.
Has anyone else encountered this problem?
UPDATE 1:
I faced another problem with the first case. Even though I profiled for only a short duration (compared to profiling in the second case), the app got lots of black flags and crashed (due to memory warnings).
Yet when I tried the same step-by-step use of the application in the second case, it did not crash and produced no flags.
Why this discrepancy?
In the first case, you are only tracking live allocations because the "Leaks" template configures the Allocations instrument that way. In the second, you are tracking both live and deallocated allocations. (As CocoaFu said).
Both are useful, but for slightly different reasons.
Only tracking live allocations (in combination with Heapshot Analysis, typically), is a great way to analyze permanent heap growth in your application. Once you know what is sticking around forever, you can figure out why and see if there are ways to optimize it away.
Tracking all allocations, live and dead, is a very effective means of tracking allocation bandwidth. You can sort by Overall Bytes and start with the largest number. Have a look at all the points of allocation (click the little arrow next to the label in the Category column of the selected row) and see where all the allocations are coming from.
For example, your graph shows that there are 1.27MB of 14 byte allocations -- 9218 allocations -- over that period of time. All have been free()d [good!], but that still represents a bunch of work to allocate, fill with data (presumably), and free each one of those. It may be a problem, maybe not.
(To put this in perspective, I used this technique to optimize an application. By merely focusing on reducing the number of transient, short-lived allocations, I was able to make the primary algorithms of the application 5x faster and reduce memory use by 85%. It turned out the app was copying strings many, many times.)
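To illustrate the kind of change that was involved (this is not the actual app's code, just a sketch of the pattern): building a string by repeated immutable concatenation creates a new transient copy on every pass, whereas appending into a single mutable buffer does not.

    // Assume `pieces` is an array of string fragments to concatenate (sample data here).
    NSArray *pieces = @[@"alpha", @"beta", @"gamma"];

    // Before: each iteration allocates a brand-new immutable string and discards
    // the previous one, so Overall Bytes balloons even though Live Bytes stays small.
    NSString *result = @"";
    for (NSString *piece in pieces) {
        result = [result stringByAppendingString:piece];
    }

    // After: one mutable buffer, far fewer transient allocations.
    NSMutableString *buffer = [NSMutableString string];
    for (NSString *piece in pieces) {
        [buffer appendString:piece];
    }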
Not sure why your app crashed as you described. Since it is a memory warning, you should see what is most frequently allocated.
Keep in mind that if you have zombie detection enabled, that takes a lot of additional memory.
Depending on the way Allocations is instantiated, there are different options. Check the options by clicking the "i" symbol in the Allocations tile.
Yes, I find this annoying also.
I want to make sure I'm reading the Allocations instrument correctly. I'm testing an iPad app that's receiving memory warnings at levels 1, 2 and 3.
I want to know the memory my app is currently using, which I think has to be the "Live Bytes" column? It shows All Allocations at 2.42 MB, which I think is low.
What do the other columns (# Transitory, Overall Bytes) report?
Also, if my app uses only 3 MB of memory, can it be killed after a level 3 memory warning without releasing anything?
Thank you.
Don't use the Object Allocations instrument for looking at your total memory usage. It does not display the true total memory size of your application, for reasons that I speculate about in my answer here.
Instead, pair Object Allocations with the Memory Monitor instrument, the latter of which will show the true total size of your application. I'm willing to bet that it's way larger than the 2.42 MB you're seeing in Object Allocations (for example, I had an application with 700k of memory usage according to ObjectAlloc, but its actual size was ~25 MB in memory). If you are receiving memory warnings on an iPad, your application is probably chewing up quite a bit of memory.
Object Allocations is handy for telling you what you have resident in memory, but it's not an accurate indicator of the size of those items. It's also a great tool for showing you steady increases in allocated objects by using the heap shot functionality (the "Mark Heap" button on the left-hand side of the instrument).
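If you want to sanity-check that "true total size" from inside the app itself, you can ask Mach for the process's resident size; a minimal sketch (this is an approximation and won't match either Instruments figure exactly):

    #import <mach/mach.h>

    // Rough resident-size query; closer to what Memory Monitor reports than
    // Object Allocations' live bytes. Returns 0 on failure.
    static vm_size_t residentMemoryBytes(void) {
        struct task_basic_info info;
        mach_msg_type_number_t count = TASK_BASIC_INFO_COUNT;
        kern_return_t kr = task_info(mach_task_self(),
                                     TASK_BASIC_INFO,
                                     (task_info_t)&info,
                                     &count);
        return (kr == KERN_SUCCESS) ? info.resident_size : 0;
    }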
Your memory usage looks fine. Check to see which app is being sent the memory warnings; it is probably not your app, assuming your app is not in the background. The only way you should be getting memory warnings is if the app is in the background and another app needs more memory.
When I was looking at the logs, I noticed other apps were receiving them while my app was running; apps such as Mail or navigation apps (Navigon) do run in the background and will cause memory pressure. You might get a memory warning, but you should not be terminated.
For a description of the memory columns see Explanation of Live Bytes & Overall Bytes.
As @Brad points out, use the Memory Monitor tool as well.