I am using the "mark generation" button of the allocation instrument.
Every snapshot generation adds about 200 KB to the growth column.
I can't find anything suspicious, so could the growth be due to the system libraries or maybe some Core Data caches?
It's possible that the system libraries allocate a few new MB on the system heap. What's more, they are supposed to do this! And you cannot do anything about that...
You could try looking at this tutorial, which I was going through a few days ago to give myself an easy start with Instruments.
I'm trying to debug why our SceneKit-based app is using so much memory but Xcode and Instruments / Allocations seem to have very different values for the amount of memory being used. When I look in Xcode I see something like 600 MB but when I transfer the same running session over to Instruments / Allocations, I see a very different number for persistent bytes, like 150 MB.
Which one is correct? Why the difference? Are they measuring different things?
(Regardless of whether I Transfer an Xcode debug session or start fresh in Instruments, it doesn't seem to make much difference.)
The reason that I care is that iOS is killing the app for excessive memory use (according to Xcode) but I can't seem to find the problem via Instruments.
I've tried turning off all GPU and Metal debug options but they don't seem to make a difference.
Which one is correct?
My intuition is: Instruments. It uses DTrace to (sorry) instrument your code and watch actual allocations and deallocations as they happen, at the expense of performance. The Xcode debug navigator memory graph is more of an outside view, designed to give a very general sense of what's happening. That is exactly why the latter offers you a way to switch to the former: Instruments is where you're going to get real measurements.
(However, keep in mind that Instruments may fail to include some virtual memory backing stores for graphics in the total you're seeing. There are plenty of WWDC videos discussing this topic in more detail.)
I know that this answer is quite late, but for the sake of future developers with the same problem, I would advise you to check the images in your assets folder. If any of your images have dimensions larger than 1000 x 1000, you should scale them down. In that example, the image comprises 1,000,000 pixels. Since images are decoded at 4 bytes per pixel, that means 4 MB of memory is used just to load the image. Unbeknownst to me, I had an image of roughly 3600 x 4000 in my assets folder. Doing the maths, this was over 50 MB of memory usage!
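As a rough sketch (the function names and the 1000-point limit are just illustrative assumptions, not part of the original answer), you can estimate an image's decoded size and downscale oversized assets before displaying them:

    import UIKit

    // Rough estimate of the decoded bitmap size: pixel width * pixel height * 4 bytes.
    func estimatedDecodedBytes(of image: UIImage) -> Int {
        let pixelWidth = Int(image.size.width * image.scale)
        let pixelHeight = Int(image.size.height * image.scale)
        return pixelWidth * pixelHeight * 4
    }

    // Redraw an image so its longest side is at most maxDimension points.
    func downscaled(_ image: UIImage, maxDimension: CGFloat) -> UIImage {
        let longestSide = max(image.size.width, image.size.height)
        guard longestSide > maxDimension else { return image }
        let ratio = maxDimension / longestSide
        let newSize = CGSize(width: image.size.width * ratio,
                             height: image.size.height * ratio)
        return UIGraphicsImageRenderer(size: newSize).image { _ in
            image.draw(in: CGRect(origin: .zero, size: newSize))
        }
    }

For a 3600 x 4000 asset, estimatedDecodedBytes reports roughly 57 MB, which matches the figure above.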
So I already asked this question on the Unity forums, but that didn't help me a lot :/. Today I decided to give my game a try with friends, so I created a server that other clients/friends could connect to. After about 20 minutes of playing, my game crashed. I quickly checked Task Manager and my game was using about 3.5 GB of RAM! When I looked at the log it said something like this:
Unity Player [version: Unity 4.6.1f1_d1db7a1b5196]
mono.dll caused an Access Violation (0xc0000005)
in module mono.dll at 0023:101071a1.
Error occurred at 2016-07-04_163050.
C:\Users\Cooler Master\Desktop\TL\TL.exe, run by Cooler Master.
77% memory in use.
0 MB physical memory [1814 MB free].
0 MB paging file [1039 MB free].
0 MB user address space [68 MB free].
Write to location 00000000 caused an access violation.
So in my game there is something that keeps eating my RAM. It takes about 3 MB of RAM per second when the game is running. I found some information on the internet about Resources.UnloadUnusedAssets, but I don't know how to write a script like that! I know how to write in C#, but only easy scripts like GUI, networking, physics, etc. Also, I downloaded Unity Pro from kickasstorrent only to see the Profiler window, but that didn't help me at all.
Thanks very much for any reply!
Also, if you want to check out the game and see what's happening: http://gamejolt.com/games/thuglifealpha/61660
Thanks again! :)
Without seeing your code it is difficult to find the exact issues, but you should start by finding the code where new objects are instantiated and destroyed. That churn causes the GC to collect unused objects and allocate new memory, and soon your free memory becomes so fragmented that there is no suitable memory area left to allocate a new object.
I strongly recommend reading articles about optimization for Unity.
I get frequent memory warnings in my application but I don't know why.
Here is a snapshot from the Allocations instrument.
I know that we don't have any control over the virtual memory assigned to us, but I am trying to understand what that 26.50 MB number means for a developer.
1. What does a high VM value mean? Does it lead to a jetsam? Is it cause for any other concern?
2. Is this value dependent on the device?
3. Does a low VM value mean that your app is memory efficient?
4. Does a high VM value lead to memory warnings in your app?
5. What causes this value to change?
6. What steps should a developer take when they see a high VM value for their app (like 300 MB)?
7. Is the VM Tracker instrument related to this value?
Anonymous VM covers a lot of things, some of which are things you want to minimize and some that are generally less important. The short version of "anonymous VM" is that it's addresses you have mapped but not named. Heap allocations get "named" which lets you track them as objects. But there are lots (and lots) of non-objecty things that fall into the "anonymous VM" bucket.
Things allocated with malloc can wind up in this region, but so can memory-mapped files. Your executable is a memory-mapped file, but since it's never dirty, parts of it can be swapped out. So "it's complicated." But in big, vague terms, yes, you do care about this section, though you may not care about all of it very much. Heap allocations tend to track your ObjC stuff. Anonymous VM often tracks things that you don't have a lot of direct control over (like CALayer backing storage).
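For example (a minimal sketch; the file path is hypothetical): a plain malloc block is tracked as a heap allocation, while a memory-mapped file shows up as a mapped VM region rather than a heap object:

    import Foundation

    // Heap allocation: shows up in the Allocations instrument's heap statistics.
    let heapBuffer = malloc(1_000_000)

    // Memory-mapped file: shows up as a mapped-file VM region, not a heap object.
    // (The path is hypothetical.)
    let mappedData = try? Data(contentsOf: URL(fileURLWithPath: "/tmp/big-file.bin"),
                               options: .mappedIfSafe)
    print("mapped \(mappedData?.count ?? 0) bytes")

    free(heapBuffer)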
All that said, the Instruments output you provide doesn't look like any major problem. I suspect it's not indicative of a time you're pressuring memory. You'll need to get yourself into a memory warning situation and see what's going on then, and dig into the specifics of what is using memory.
For much more detail on this, you should watch WWDC 2013 session 704 "Building Efficient OS X Apps" which goes into depth on much of this. While iOS has a somewhat different memory system, and some OS X tools aren't available on iOS, many of the concepts still apply.
I've just been analyzing an iPad app I'm developing using Instruments. In particular, I was interested in the memory usage, as I have been receiving some memory warnings.
First of all, Activity Monitor reports some 40 MB of memory used just after starting the application. This really seems like a lot to me, especially as nothing really fancy is going on after startup.
So I have been analyzing the app in the VM Tracker.
First of all, can somebody explain how to interpret the dirty memory? I mean, the iPad doesn't really have virtual memory in the sense that there is no swapping, etc.
OK, the really weird thing is that I have some 40 MB of dirty memory that is resident! Some 38 MB are listed under IOKit. Under IOKit there is no further information about what that actually means.
So what exactly does IOKit do?
What could be causing these huge values?
Any kind of hint is appreciated! :)
Try Heapshot Analysis; bbum has a great tutorial here.
Basically, you take a heapshot, run some procedure, then take another heapshot, for several iterations. This will help find memory that is lost but is not a leak. I use this method often.
I have used Heapshot many times to great advantage; many thanks to bbum.
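The loop you run between heapshots can be as simple as this sketch (the openAndCloseDetailScreen() action is hypothetical; substitute whatever operation you suspect):

    import Foundation

    // Hypothetical app-specific action you suspect of leaking; replace with your own.
    func openAndCloseDetailScreen() {
        // e.g. push and pop a view controller, open and discard a document, ...
    }

    // Repeatable operation to exercise between heapshots ("Mark Generation" clicks).
    // If every new generation keeps a growing set of persistent objects, those
    // objects are lingering without being classic leaks.
    func exerciseSuspectPath(iterations: Int = 5) {
        for _ in 0..<iterations {
            autoreleasepool {
                openAndCloseDetailScreen()
            }
        }
    }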
What is dirty memory?
According to this session, dirty memory is:
- memory written by an app
- all heap allocations
- decoded image buffers
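As a minimal sketch of the first item above (the 50 MB size is arbitrary), reserving address space does not dirty memory; writing to it does:

    import Foundation

    let byteCount = 50 * 1024 * 1024  // 50 MB, arbitrary size for illustration

    // Allocation alone mostly reserves clean, zero-fill-on-demand pages.
    let buffer = UnsafeMutableRawPointer.allocate(byteCount: byteCount, alignment: 16)

    // Writing touches every page, so roughly 50 MB now shows up as dirty memory.
    memset(buffer, 0xFF, byteCount)

    buffer.deallocate()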
The VM profile shows some info about the dirty memory, like the dirty memory size; these regions are anonymous.
vmmap --summary App.memgraph
In this session, the Apple developer uses heap to get more info about the object sizes.
heap App.memgraph
An old application started to consume a lot of memory after a server update. Memory usage seems to rise without limit until the program hangs.
According to FastMM4 and EurekaLog, there's no memory leak (except 28 bytes), so I assume all memory is freed when the application shuts down.
Are there any tools or strategies suitable for tracking this kind of memory problem?
Since September 2012, there is a very simple and convenient way to find this type of "run-time only" memory leak.
FastMM4991 introduced a new method, LogMemoryManagerStateToFile:
Added the LogMemoryManagerStateToFile call. This call logs a summary of
the memory manager state to file: The total allocated memory, overhead,
efficiency, and a breakdown of allocated memory by class and string type.
This call may be useful to catch objects that do not necessarily leak, but
do linger longer than they should.
To discover the leak at run time, you only need these steps:
add a call to LogMemoryManagerStateToFile('memory.log', '') in a place where it will be called at intervals
run the application
open the log file with a tail program (for example BareTail), which will auto-refresh when the file content changes
watch the first lines of the file; they will contain the memory allocations which occupy the highest amount of memory
if you see that a class or memory type constantly has a growing number of instances, this can be the reason for your leak
The growing memory consumption is an application issue; it is not a bug that FastMM4 or EurekaLog can discover. From their point of view, the application is just using memory correctly.
Using AQTime, MemProof (hard to find; D7 is the last supported version (?)), SleuthQA (similar to MemProof) or similar memory profilers, you can track the memory usage outside of the application in real time.
Using FastMM4's GetMemoryManagerState / GetMemoryManagerUsageSummary, you can track memory usage from inside the application. Output this information into a trace file and analyze it after the run. Or make a simple wrapper function for one of the above procedures, which returns the current memory usage, and call it from the IDE debugger's Evaluate/Modify, add it to Watches, or pass it to OutputDebugString, and see the current memory usage.
Note: if memory is eaten by some DLL, then you may not see its memory usage using (3). Use (2).
By analyzing the memory usage and the tasks performed by the application, you may discover what leads to the increased memory usage.
AQTime (a commercial tool which is quite expensive) can report your memory usage, down to the line of source code that allocated each object. In the case of very large memory usage scenarios, you might want the AQTime functionality that can show the number of objects and the size (total plus individual instance size) for each object. AQTime worked great for me, starting with Delphi 7, and all later versions, including your version (2006) and the latest versions (XE and XE2).
As the program's memory usage grows, AQTime can be used to grab "snapshots" of the runtime heap, which you can use to understand the memory usage of your application: what is being created, and how many of each object exist. Even when no leaks exist, understanding the runtime behaviour of your application in terms of the objects it creates and manages is very important, and AQTime is the most powerful tool I know of for Delphi users.
If you are willing to upgrade to Delphi XE/XE2, you might already have an included light version of AQTime; if so, check it out. If not, I recommend you try their demo. I am unaware of any free or open source alternatives that can provide the same functionality.
Lesser functionality could be cobbled together manually by writing lots of trace messages, or by using the FastMM full debug mode. If you could write a complete dump of your memory usage into a very large file, you might be able to write some tools to parse it and create a summary. The problem I have with FastMM in this case is that you will be drowned in detail information, without the ability to extract exactly the summary information that helps you understand your situation. So you can try to write your own tool to summarize the memory usage. In one application that used a series of components I knew would use a lot of memory, I wrote a dialog box into my application that showed the current memory usage by these large memory-blob-of-data objects.
Have you ever thought about the leak that the IDE itself is causing... it is so huge!!!
In my case (2 GB of RAM) I do the following...
1. Open the IDE
2. Leave it minimized for nearly six hours
3. See how physical memory is getting used
The result:
While the IDE is open (remember, I also do the test with it minimized) it uses more and more RAM... till there is no more RAM free.
It takes all 2 GB of RAM + all the page file space on the hard disk (I have it configured to a max of 4 GB).
In less than six hours (doing nothing in the IDE) it tries to use more than 6 GB.
That is what I call a memory leak caused by the IDE... I do not type a single letter in the IDE, do not compile anything, do not even open any project... I just open the IDE and minimize it... leave the computer without doing anything on it for about six hours, and the IDE is consuming 6 GB of memory.
Of course, after that, the IDE starts showing annoying SystemOutOfMemory messages... and I must kill it... then all those 6 GB are freed!!!
When the hell will this get fixed?
Please note I have all patches applied; I also tested without applying each patch/hotfix, etc...
The best I got was by disabling some options under Tools, like the one that underlines bad code, etc... so why on earth does that option have any influence... I am not typing anything in the IDE (during the tests)... and if I have it disabled, the memory leak is reduced a lot...
Of course, if I use the IDE (writing code in an open project) without even compiling/running it... things get much worse... a memory leak of up to 6 GB can be reached in less than an hour; sometimes it occurs after 15 minutes of copying/pasting source code.
It seems there will not be a solution any time soon!!!
So I have the following workaround, which works perfectly:
- Close the IDE and reopen it every 15 minutes or less
An ugly solution, I know... but it works!!!