We use the Instruments Automation tool to test our iOS app (Xcode 6.4).
The tests take many screenshots, which are then checked with ImageMagick.
The problem I see is that the memory usage of the Instruments application itself grows over time. It becomes slower and slower and finally almost hangs. Activity Monitor shows 10 GB of memory used by the instruments process, while the machine has only 8 GB of physical memory.
I also noticed that when I uncheck the "Continuously Log Results" option, memory does not grow. But I need that option, since I want access to the screenshot files. I don't understand why writing screenshots to disk should consume all that memory.
I made a simple test to reproduce the problem:
var target = UIATarget.localTarget();
for (var i = 0; i < 100; i++) {
    target.captureScreenWithName("scr" + i);
}
UIALogger.logMessage("done");
If "Continuously Log Results" is checked, then each run adds hundreds of megabytes to instruments process memory usage (I think it depends on screen resolution of device). This memory is not released even if I close window with current run. Only if I quit instruments process completely, the memory is released.
Any thoughts about what could be wrong would be appreciated. It looks like a memory leak in Instruments to me.
Related
I'm attempting to download a large number of images using AFNetworking 2.5 and stream them to disk. According to the memory monitor in Xcode, this causes unbounded memory growth (and eventually memory warnings and forced termination), but profiling the memory usage with the Allocations instrument shows it to be stable.
I'd like to think Xcode is just wrong here, but then why would my app be getting killed by iOS?
In Instruments, the peaks are Core Data queries and the rest is the image downloads: the peaks reach about 9.5 MB, and the baseline sits at about 8.5 MB.
I've also tried Heapshot Analysis, which shows a tiny bit of growth, but nowhere near the amount reported by Xcode.
Xcode's memory monitor, on the other hand, grows by multiple MB per iteration.
Is there any way to get Instruments to show me whatever Xcode is seeing? Or is there a better instrument to use to find out where all this memory is going?
Thanks!
According to the memory monitor in Xcode, this is causing unbounded memory growth (and eventually memory warnings and force quits) - but profiling the memory usage using the allocations instrument shows the memory usage to be stable.
Believe Instruments and the Allocations information - not the memory monitor in Xcode. The memory monitor graph is completely irrelevant. Ignore it.
This is not because the memory monitor in Xcode is useless or wrong. It is because memory management is completely different for a debug build than for a release build (Instruments uses a release build). This is especially true in Swift (you don't say whether you're using Swift).
Observe memory usage only on the device and only in a release build. Otherwise, you'll be completely misled.
My app is running out of memory. In Xcode's memory report I can see the memory usage on the device rise to a little over 500 MB before the app is shut down.
When profiled in Instruments (with either the Allocations tool or the Leaks tool), this does not happen. The process runs up to about 100 MB and levels out as it works through the memory-intensive portion of the task. The app does not crash when run under Instruments.
What would cause the discrepancy?
The memory-intensive task uses a UIWebView to determine the length of a number of pages of content. The web view sits in the background and loads a page; when the load completes, it calculates the size and loads the next page, until all pages have had their lengths calculated.
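Not the asker's actual code, but a minimal sketch of the pattern being described, assuming made-up names (PageMeasurer, pageURLs, heights): an off-screen UIWebView loads one page at a time and records its rendered height before moving on.

import UIKit

// Sketch only: class and property names are illustrative.
final class PageMeasurer: NSObject, UIWebViewDelegate {
    private let webView = UIWebView(frame: UIScreen.main.bounds)
    private var pageURLs: [URL]
    private var heights: [CGFloat] = []

    init(pageURLs: [URL]) {
        self.pageURLs = pageURLs
        super.init()
        webView.delegate = self
    }

    func start() { loadNext() }

    private func loadNext() {
        guard !pageURLs.isEmpty else {
            print("Measured heights: \(heights)")
            return
        }
        let url = pageURLs.removeFirst()
        webView.loadRequest(URLRequest(url: url))
    }

    // Called when the current page finishes rendering; record its height
    // and kick off the next load.
    func webViewDidFinishLoad(_ webView: UIWebView) {
        heights.append(webView.scrollView.contentSize.height)
        loadNext()
    }
}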
Since I have been unable to reproduce the same memory issues in Instruments, I added logging to the init and dealloc methods of all of the major parts and can confirm they are being allocated and released as expected.
After that, I assumed that allocation and deallocation were happening properly, but that I was simply allocating faster than the system could reclaim the memory. I tried stopping the process early, before memory ran out, to see if memory usage would drop. Xcode's memory report does show a small drop, but not a significant one, even after letting it sit for a few minutes.
My next step is to try to simplify the process until the problem is eliminated.
Has anyone else come across something like this, where an app behaves completely differently under Instruments than outside it, or have any explanation for what might cause that?
I would look at the two scheme actions and make sure the settings are the same. It's possible that Profile is using a non-debug configuration while Run is using the debug configuration.
I'd pay special attention to the "Enable Zombie Objects" in the "Diagnostics" tab of the "Run" configuration, as that can take up memory keeping track of all of the deallocated objects. Zombies are a wonderful diagnostic tool, but you want to turn that off in order to ensure you reclaim all of the memory associated with the deallocated objects.
For information on getting to the scheme configuration, see https://developer.apple.com/library/mac/recipes/xcode_help-scheme_editor/Articles/SchemeDialog.html.
I'm working on my first ARC & Core Data project, basing this stage on Xcode's (Universal) Master-Detail template.
I note that Xcode 5 has a memory display in the Debug Navigator, but when using it I find its graph bears little resemblance to the memory usage displayed in Instruments when running a Leaks & Allocations trace.
I've done the Instruments tracing with the Simulator (simulating both iPhone and iPad, in case the 'unloading' of the Detail view on the latter makes a difference) and on an iPad 2 and an iPod Touch. The results are broadly the same:
iPhone 6.1 simulator
Generation A--------1.13 MB
Generation B--------397.70 KB
Generation C--------76.96 KB
Generation D--------11.70 KB
Generation E--------1.56 KB
Generation F--------3.48 KB
an overall growth of about 30%,
where Generation A shows the growth up to the loading of the Master table, and each successive generation shows the growth after the Detail view has been visited and interacted with (entailing the fetching of NSManagedObjects and the creation of NSObjects, respectively).
The growth trend on the other devices was broadly similar (with the Generation A growth being iPad simulator: 1.42 MB; iPad 2: 1.57 MB; iPod Touch: 0.94 MB, but tailing off similarly).
According to the Debug Navigator, however, the total usage at each point comes out at:
iPhone 6.1 Debug Navigator
Generation A--------4.2 MB
Generation B--------6.9 MB--growth 2.7
Generation C--------7.1 MB--growth 0.2
Generation D--------7.8 MB--growth 0.7
Generation E--------8.0 MB--growth 0.2
Generation F--------8.4 MB--growth 0.4
an overall growth of 100%!
Referring to other similar questions, I don't have Zombies enabled.
Have others seen such discrepancies? Am I right in being inclined to trust Instruments over the Debug Navigator's summary figure?
PS: The Debug Navigator's summary figure doesn't seem to be available when running on the real devices (both on versions of iOS 5). Is this normal?
This may not be a very good answer, but it is my explanation of the issue based on the research I have done.
The Debug Navigator shows the same thing as the "Activity Monitor" instrument. It is not showing the memory your app currently has allocated; it is showing the memory the OS has currently granted to your app.
Say I write a for loop that creates lots of objects in memory, but then I discard half of them because they don't fit my search criteria (bad coding, I know, but hypothetically). The OS gets a request from the app for enough memory to create all of the objects, but after the loop, when you check your allocations in Instruments, it shows only the kept objects, because the discarded ones have already been released. The OS may or may not know about those releases, but it doesn't take back the memory it just gave you. I'm not sure of the overhead of granting memory to an app and reclaiming it again, but I'm sure that is taken into account. I've noticed that if I leave my app alone long enough, the OS takes back some of the memory I'm not using.
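A toy Swift version of that hypothetical loop (the sizes and counts are arbitrary, purely for illustration):

import Foundation

var kept: [Data] = []
for i in 0..<1_000 {
    let buffer = Data(count: 100 * 1_024)    // ~100 KB of zeroed bytes
    if i % 2 == 0 {
        kept.append(buffer)                  // half the buffers stay "live"
    }
    // The other half go out of scope and are released immediately, but the
    // pages the OS handed to the process are not necessarily returned right
    // away, so the Xcode gauge can stay near the peak while Allocations
    // shows only the surviving half.
}
print("buffers still live: \(kept.count)")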
Just think of the debug memory figure as the full amount of memory the OS has allotted to your app. You may not be using all of it, but the OS gave it to you anyway (for one reason or another). That number will increase with your app's requests and usage, and it will decrease when memory-pressure warnings occur or when the app has been idle long enough that the OS thinks it can safely take some memory back. It will likely never match the Instruments allocation figures, because there is always transient memory in an application that has to be allocated somewhere, even if only for a short time.
Again, this is my conclusion based on when I was wondering the same thing you are. Hope it helps a little.
Our app runs fine. In Instruments (on the simulator), when trying to profile memory, under Allocations I see Overall Bytes rising very fast, while Live Bytes remain small. There are no leaks and VM usage seems stable (if large).
This is not a show-stopper, but makes profiling that much more difficult. Any ideas?
Thanks!
I'm in the process of learning how to put Instruments to better use. I just finished a leak-management exercise, and Instruments is reporting very few leaks; I'll figure those out later. In the meantime, my app is crashing, and it appears to be related to memory pressure.
So I looked at this in Instruments, with Allocations and Memory Monitor in use. Allocations shows a pretty steady 3 to 4 MB of Live Bytes while I just let my app initialize and settle into equilibrium. Overall Bytes, however, jumps to over 50 MB. I didn't think much of this until I looked at Memory Monitor and saw that memory usage goes up and down, triggering memory warnings. (It seems strange to me that this doesn't show up on the Allocations graph at the same time.)
The app should be at an equilibrium point, but apparently it's not. My question is: how can I use Instruments to help me understand why memory usage is rising and falling?
Instruments is an excellent debugging tool. From what I can tell you have been using the Allocations instrument, so I'll go over that. Allocations tracks the objects your application allocates during its execution, along with their in-memory references, locations, and even the calling code that allocated them. When the Allocations instrument starts running, your application begins reporting every allocation, and they pile up higher and higher as it executes (naturally, since you keep allocating more objects). Overall Bytes is the sum of EVERY allocation your app has ever made. I want to stress this for your case: it does not mean your app is currently using 50 MB of memory; it just means your app has allocated 50 MB in total over its lifetime. Your app is obviously limited by the amount of memory the device has, and 3-4 MB of live memory is not a lot when you consider that the first-generation iPhone had about 128 MB; for more complicated applications, the OS will usually kill off other apps before it kills yours.
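As a made-up illustration of that distinction (not the asker's code): the loop below would push Overall Bytes up by roughly 500 MB while Live Bytes stays essentially flat, because each buffer dies as soon as its autorelease pool drains.

import Foundation

for _ in 0..<10_000 {
    autoreleasepool {
        // Allocate a short-lived ~50 KB buffer; it is released at the end
        // of each pass, so it counts toward Overall Bytes but not Live Bytes.
        _ = NSMutableData(length: 50 * 1_024)
    }
}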
As for the other allocation graph, the one with spikes rather than a continuous line: it shows the number of allocations happening at each point in time. Usually the spikes can be ignored, unless there are a lot of large spikes within a small amount of time.
Anyway, to address your specific memory-warning problem: it honestly depends on how many memory warnings you are receiving and at what level. As for your leaks, my only advice is to squash them as soon as possible. When you see a leak (a red bar in the Leaks tool), click on the bar and find the objects that are being leaked. When you select a leaked object and open the right sidebar, it will show you the code that is leaking. Double-clicking any entry in the right sidebar will even open the specific line and class the leak originated from.