What's the right statistic for iOS memory footprint: Live Bytes? Real Memory? Other?

I'm definitely confused on this point.
I have an iPad application that shows 'Live Bytes' usage of 6-12 MB in the Allocations instrument. If I pull up Memory Monitor or Activity Monitor, the 'Real Memory' column consistently climbs to around 80-90 MB after some serious usage.
So do I have a normal memory footprint or a high one?
This answer and this answer claim you should watch 'Live Bytes', as the 'Real Memory' column shows memory blocks that have been released but that the OS hasn't yet reclaimed.
On the other hand, this answer claims you need to pay attention to that memory monitor, as 'Live Bytes' doesn't include things like interface elements.
What is the deal with iOS memory footprint!? :)

Those are simply two different metrics for measuring memory use. Which one is the "right" one depends on what question you're trying to answer.
In a nutshell, the difference between "live bytes" and "real memory" is the difference between the amount of memory currently used for stuff that your app has created and the total amount of physical memory currently attributed to your app. There are at least two reasons that those are different:
code: Your app's code has to be loaded into memory, of course, and the virtual memory system surely attributes that to your app even though it's not memory that your app allocated.
memory pools: Most allocators work by maintaining one or more pools of memory from which they can carve off pieces for individual objects or allocated memory blocks. Most implementations of malloc work that way, and I expect that the object allocator does too. These pools aren't automatically resized downward when an object is deallocated -- the memory is just marked 'free' in the pool, but the whole pool will still be attributed to your app.
There may be other ways that memory is attributed to your app without being directly allocated by your code, too.
So, what are you trying to learn about your application? If you're trying to figure out why your app crashed due to low memory, look at both "live bytes" (to see what your app is using now) and "real memory" (to see how much memory the VM system says your app is using). If you're trying to improve your app's memory performance, looking at "live bytes" or "live objects" is more likely to help, since that's the memory that you can do something about.
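To make the "memory pools" point concrete, here is a small sketch (my own illustration, not from the original answer) that reads the process's resident size with task_info() before and after churning through a pile of small malloc blocks. The "live" memory drops the moment you call free(), but the resident figure often stays up because the freed blocks remain in malloc's pools and are still attributed to your app:

    #import <Foundation/Foundation.h>
    #import <mach/mach.h>
    #include <stdlib.h>
    #include <string.h>

    // Roughly what the "Real Memory" / Memory Monitor figure corresponds to.
    static vm_size_t residentSize(void) {
        struct task_basic_info info;
        mach_msg_type_number_t count = TASK_BASIC_INFO_COUNT;
        kern_return_t kr = task_info(mach_task_self(), TASK_BASIC_INFO,
                                     (task_info_t)&info, &count);
        return (kr == KERN_SUCCESS) ? info.resident_size : 0;
    }

    int main(void) {
        @autoreleasepool {
            NSLog(@"Resident before: %lu KB", (unsigned long)residentSize() / 1024);

            // ~50 MB of small allocations, which come out of malloc's pools.
            enum { kBlockCount = 200000, kBlockSize = 256 };
            static char *blocks[kBlockCount];
            for (int i = 0; i < kBlockCount; i++) {
                blocks[i] = malloc(kBlockSize);
                memset(blocks[i], 0xFF, kBlockSize);   // touch the pages so they become resident
            }
            NSLog(@"Resident while allocated: %lu KB", (unsigned long)residentSize() / 1024);

            for (int i = 0; i < kBlockCount; i++) {
                free(blocks[i]);                        // "live bytes" drop right here...
            }
            // ...but the resident size usually stays high, because the pools
            // keep the pages and they are still attributed to this process.
            NSLog(@"Resident after free: %lu KB", (unsigned long)residentSize() / 1024);
        }
        return 0;
    }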

Seeing as how I wrote the last answer you linked to, I'll have to stand by that. If you want a total, accurate count of the current memory usage for your application, use the Memory Monitor instrument.
For reasons that I describe in this answer, Allocations hides the memory sizes of certain elements, meaning that its memory usage totals are significantly lower than your application's in-memory size. Many people find this out the hard way when they try to get their application working on older iOS devices. On the older hardware you had a hard memory ceiling of ~30 MB; exceed it and your application was hard-killed.
Many developers (myself included) saw that we only had ~1-2 MB of live bytes in Allocations and thought we were good, until our applications started receiving memory warnings and early terminations. If you looked at Memory Monitor, you could see the true in-memory size of these applications being >20 MB, and you could see the applications being terminated the instant they crossed the 30 MB barrier in Memory Monitor.
Therefore, if you want an accurate assessment of your total application memory usage, use Memory Monitor. Allocations is great to find out the specific objects that are in memory, particularly when you use the heap shots to find things that might be accumulating (as leaks, retain cycles, or for other reasons). Just don't trust it when determining your application's actual size in memory.
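As a practical aside (my own sketch, not part of the answer above): since those hard kills are usually preceded by memory warnings, it can help to listen for them and drop anything you can rebuild lazily. Assuming ARC, something along these lines:

    #import <UIKit/UIKit.h>

    @interface MemoryWarningResponder : NSObject
    // Anything you can cheaply rebuild later -- decoded images, parsed data, etc.
    @property (nonatomic, strong) NSCache *imageCache;
    @end

    @implementation MemoryWarningResponder

    - (instancetype)init {
        if ((self = [super init])) {
            _imageCache = [[NSCache alloc] init];
            [[NSNotificationCenter defaultCenter]
                addObserver:self
                   selector:@selector(handleMemoryWarning:)
                       name:UIApplicationDidReceiveMemoryWarningNotification
                     object:nil];
        }
        return self;
    }

    - (void)handleMemoryWarning:(NSNotification *)note {
        // The OS is telling you it's close to killing something -- free what you can.
        NSLog(@"Memory warning received, emptying caches");
        [self.imageCache removeAllObjects];
    }

    - (void)dealloc {
        [[NSNotificationCenter defaultCenter] removeObserver:self];
    }

    @end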

'Live bytes' means memory allocated by your code (for example by malloc), so you have access to this memory. 'Real memory' shows the physical amount of memory used by your app. This also includes OpenGL textures and (possibly) sounds from OpenAL.
Live bytes is useful for checking where you allocate and release memory in your code. Real memory is a good indicator of how effective your memory optimization is, and it is its growth that causes 'low memory' warnings.
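To give a feel for how big those "invisible" items can be, here is a rough illustration (my own, not from the answer): a decoded image or texture costs roughly width x height x 4 bytes, so a 1024x1024 RGBA texture is ~4 MB of real memory even if the compressed file, and the object you see in Allocations, is tiny.

    #import <UIKit/UIKit.h>

    // Rough in-memory size of a decoded UIImage/texture.
    // bytesPerRow already accounts for bits per pixel and any row padding.
    static size_t decodedImageSize(UIImage *image) {
        CGImageRef cgImage = image.CGImage;
        return CGImageGetBytesPerRow(cgImage) * CGImageGetHeight(cgImage);
    }

    // Example (the asset name is hypothetical):
    // UIImage *photo = [UIImage imageNamed:@"photo"];
    // NSLog(@"Decoded size: ~%zu KB", decodedImageSize(photo) / 1024);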

Related

Apple Instruments slows down app when analyzing memory allocations

When running my app in the simulator and analyzing its memory allocations using Instruments, the app runs very slowly, at less than 1/30 of its normal speed.
The app uses about 50 MB of RAM and has approximately 900,000 live objects (according to Instruments).
Could this be the reason for the slow performance?
When running the app on the device, or in the simulator without Instruments attached, it performs well (except for the memory issue I am trying to debug).
Do you have any idea how to solve this issue?
Did you encounter slow performance using the Memory Allocations instrument?
Would you consider having more than 900,000 live objects "concerning"?
Considering your Analyzer performance issue
In your specific case monitoring the app over a long period of time will not be necessary, as you reach the state of high memory consumption very soon. You could simply stop recording at this point. Then you won't have problems navigating through the different views and statistics to find the cause of the memory issue.
Analyzing the memory issue
Slowing down is normal, but 1/30 sounds quite alarming.
You probably should track how the number of live objects and the memory usage change while you use the app.
It is difficult to decide whether a certain number of live objects at a specific point in time is critical (though 900,000 seems very high).
In general: if live objects and memory usage grow continuously and don't shrink, that is a bad sign.
If you take a look at Statistics -> Object Summary, Live Bytes should be a lot smaller than Overall Bytes, and the number of #Living objects should be a lot smaller than the number of #Transitory objects.
The second thing you can look at, is the Call Tree view.
It gives you a nice overview of which parts of the application are responsible for reserving large amounts of memory.
Possible solutions
Once you detect the parts of your code that are responsible for reserving large amounts of memory, you can look for retain cycles, or you could try using more autorelease pools in those spots.
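For the retain-cycle half of that advice, here is a minimal sketch (my own example, assuming ARC) of the kind of cycle that makes objects pile up in Allocations/heap shots, and the usual __weak fix:

    #import <Foundation/Foundation.h>

    @interface Downloader : NSObject
    @property (nonatomic, copy) void (^completion)(void);
    - (void)finish;
    @end

    @implementation Downloader

    // Cycle: self retains the block, the block captures self strongly.
    // Neither can ever be deallocated, so every Downloader stays "living".
    - (void)startLeaky {
        self.completion = ^{
            [self finish];
        };
    }

    // Fix: capture a weak reference so the cycle is broken.
    - (void)startFixed {
        __weak typeof(self) weakSelf = self;
        self.completion = ^{
            [weakSelf finish];
        };
    }

    - (void)finish {
        NSLog(@"done");
    }

    @end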
Check that you have enough available disk space. I had 8 GB left and it seems that was too little; Instruments was extremely slow, took a minute just to start, and barely got anywhere at all.
I cleared out more disk space and then it suddenly went back to being as fast as before.

iOS -- Using and Understanding Instruments using Allocations and Memory Monitor (Physical Memory Free)

I'm in the process of understanding how to put Instruments to better use. I just finished a leak management exercise, and Instruments is reporting very few leaks. I'll figure those out later. In the meantime, my app is crashing, and it appears that it's related to memory pressure.
So I looked at this in Instruments. I have Allocations and Memory Monitor in use. Allocations shows a pretty steady 3 to 4 MB of Live Bytes while I just let my app initialize and come to equilibrium. Overall Bytes, however, jumps to over 50 MB. I didn't think much of this until I looked at Memory Monitor and saw that memory usage goes up and down, causing memory warnings. (It seems strange to me that this doesn't show up on the Allocations graph at the same time.)
The app should be at an equilibrium point, but apparently it's not. My question is: how can I use Instruments to help me understand why memory usage is rising and falling?
Instruments as a tool for debugging is simply excellent. From what I can understand, you have been trying to use the Allocations tool, so I'll go over that.
Allocations details the number of objects your application allocates during its execution, along with their in-memory references, locations, and even the calling code that allocates those objects. When Instruments starts running the Allocations tool, your application begins reporting all allocations as blue dots, which pile up higher and higher as your application executes (naturally, as you should be allocating more and more objects).
Overall Bytes displays the amount of memory of EVERY allocation your app has made, added together. I want to stress this for your case: it does not mean your app is currently using 50 MB of memory! It just means that your app has used 50 MB in total. Your app is obviously limited to the amount of memory the device has, and 3-4 MB is not a lot when you consider that the first-generation iPhone had about 128 MB, but for more complicated applications, the OS will usually kill off other applications before it kills yours.
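A tiny demo of that distinction (my own example, not from the answer): run this under Allocations and "Overall Bytes" climbs by roughly 1 MB per pass, while "Live Bytes" stays essentially flat, because every buffer is freed before the next one is allocated.

    #include <stdlib.h>
    #include <string.h>

    void churnMemory(void) {
        for (int i = 0; i < 50; i++) {
            size_t length = 1024 * 1024;     // 1 MB per pass
            char *buffer = malloc(length);
            memset(buffer, 0, length);       // touch it so the allocation is real
            // ... pretend to do some work with the buffer ...
            free(buffer);                    // Live Bytes drop again right here,
                                             // but Overall Bytes keeps the running tally
        }
    }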
As for the other allocations graph, the one with spikes rather than a continuous line: it shows the number of allocations going on at each point in time. Usually the spikes can be ignored, unless there are a lot of large spikes in a small amount of time.
Anyway, to address your specific memory warning problem, it honestly depends on how many memory warnings you are receiving and at what level they are. As for your leaks, my only word of advice is: squash them as soon as possible! When you see a leak (a red bar in the Leaks tool), click on the bar and find the objects that are being leaked. When you select a leaked object and then select the right sidebar, it will show you the code that is leaking. If you double-click on any part of the right sidebar, it'll even open up the specific line and class the leak originated from!

High virtual memory usage + low allocations on iOS

I have code that has a low amount of active allocations (about 5 MB according to Instruments), but a high amount of system memory usage (over 100 MB). I know the code is leak-free, and I'm not seeing any allocation spikes after some optimization, but I'm still crashing due to the high amount of memory usage.
I Googled around a lot and saw that I'm supposed to be using the VM Tracker instrument, which confirms my high memory usage, but I'm not sure how to address this situation. I'm using as little memory as possible, it's still too much on an iPad 1, and I don't have the knowledge or tools to figure out how to get the OS to not mark so much memory as dirty when I'm not actually using it. Where do I go from here?
Use the Profile tool and select the Memory + Allocations template. Click the VM Tracker and take a snapshot. This results in a list with resident, dirty, and virtual memory usage per type, which will give you an indication of where to look.
I think the most common problem is that you have a lot of autoreleased objects that reside in the autorelease pool. The following link explains more about how to handle autorelease pools:
How does the NSAutoreleasePool autorelease pool work?
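Along the lines of that linked answer, here is a small sketch (my own, assuming ARC and a hypothetical list of file paths) of the nested-pool pattern: without the inner @autoreleasepool, the autoreleased temporaries from every iteration would sit in memory until the run loop drains the outer pool.

    #import <Foundation/Foundation.h>

    void processFiles(NSArray *paths) {
        for (NSString *path in paths) {
            @autoreleasepool {
                // Convenience constructors return autoreleased objects; the
                // nested pool releases them at the end of each iteration.
                NSData *data = [NSData dataWithContentsOfFile:path];
                NSString *text = [[NSString alloc] initWithData:data
                                                       encoding:NSUTF8StringEncoding];
                NSLog(@"%@: %lu characters", path, (unsigned long)text.length);
            }   // temporaries from this pass are released here
        }
    }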

Checking iOS application used memory in instruments

I want to make sure I'm reading the Allocations plug-in correctly. I'm testing an iPad app that's receiving memory warning levels 1, 2 & 3.
I want to know the memory currently used by my app, which I think has to be the "Live Bytes" column? It shows All Allocations at 2.42 MB, which I think is low.
What do the other columns (#Transitory, Overall Bytes) report?
Also, if my app uses only 3 MB of memory, can it be killed if I get a level 3 memory warning without releasing anything?
Thank you.
Don't use the Object Allocations instrument for looking at your total memory usage. It does not display the true total memory size of your application, for reasons that I speculate about in my answer here.
Instead, pair Object Allocations with the Memory Monitor instrument, the latter of which will show the true total size of your application. I'm willing to bet that it's way larger than the 2.42 MB you're seeing in Object Allocations (for example, I had an application with 700k of memory usage according to ObjectAlloc, but its actual size was ~25 MB in memory). If you are receiving memory warnings on an iPad, your application is probably chewing up quite a bit of memory.
Object Allocations is handy for telling you what you have resident in memory, but it's not an accurate indicator of the size of those items. It's also a great tool for showing you steady increases in allocated objects by using the heap shot functionality (the "Mark Heap" button on the left-hand side of the instrument).
Your memory usage looks fine. Check to see which app is being sent the memory warnings; it is probably not your app, assuming your app is not in the background. The only way you should be getting memory warnings is if the app is in the background and another app needs more memory.
When I was looking at logs I noticed other apps were getting them while my app was running; apps such as Mail or navigation apps (Navigon) do run in the background and will cause memory pressure. You might get a memory warning but should not be terminated.
For a description of the memory columns see Explanation of Live Bytes & Overall Bytes.
As @Brad points out, use the Memory Monitor tool as well.

How to use AQTime's memory allocation profiler in a program that uses a large amount of memory?

I'm finding AQTime hard to use because it interferes with the original program too much. If I have a program that uses, for example, 300MB of ram I can use AQTime's allocation profiler without a problem, and find out where most of the memory is being used. However I notice that running under AQTime, the original program uses more like 1GB while it's being profiled.
Right now I'm trying to reduce memory usage in a program which is using 1.4GB of memory. If I run it under AQTime, then the original program uses all of the 2GB address space and crashes. I can of course invent a smaller set of test data and estimate how the memory usage will scale with the full data set - but the reason I'm using a profiler in the first place is to try to avoid this sort of guesswork.
I already have AQTime set to 'Collect stack information - None' and all the check boxes to do with checking memory integrity are switched off, and I've tried restricting the area being profiled to just a few classes but this doesn't seem to improve anything. Is there a way to use AQTime that produces a smaller overhead? Or failing that, what other approaches are there to get a good idea of the memory being used?
The app is written in Delphi 2010 and I'm using AQTime 6.
NB: On top of the increased memory usage, running under AQTime slows the app down an awful lot, making the whole exercise not just impossible but impractical too :-P
AFAIK the allocation profiler will track memory block allocations regardless of profiling areas; profiling areas are used to track class instantiation. Of course, memory-profiling an application that allocates a large amount of memory is an issue. You may try to use the LARGEADDRESSAWARE flag and the /3GB boot switch, or use a 64-bit system (as long as you have at least 4 GB of memory, or more). You can also take a snapshot of the application state before it crashes, to see where the memory is allocated. Profiling takes time anyway; you may have to let it run for a while.
