(iOS) Increasing memory consumption related to GameScene

When a GameScene has been presented, two categories constantly increase in persistent count: '_NSArrayM' and 'CAMetalDrawable'. As a result, memory consumption grows by roughly 0.1 MB every 20 seconds. However, when I set SKView.isPaused = true, memory consumption stops increasing. Is this normal, or is it bad?
Additionally, how can I clear the persistent data?

If you're still looking for a solution, I had a similar issue.
Editing the scheme and disabling Metal API Validation and GPU Frame Capture seemed to fix it for me.
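As a stopgap in code, the isPaused behavior the questioner noticed can also be used deliberately: pause the view whenever it is offscreen. A minimal Swift sketch, with the view controller and outlet names purely illustrative (not from the original project):

import SpriteKit
import UIKit

class GameViewController: UIViewController {
    @IBOutlet weak var skView: SKView!   // hypothetical outlet to the presenting view

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        skView.isPaused = false           // resume rendering while visible
    }

    override func viewDidDisappear(_ animated: Bool) {
        super.viewDidDisappear(animated)
        skView.isPaused = true            // halts the _NSArrayM / CAMetalDrawable growth
    }
}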

Related

SceneKit scenes lag when resuming app

In my app, I have several simple scenes (each a single 80-segment sphere with a 500 px by 1000 px texture, rotating once a minute) displaying at once. When I open the app, everything runs smoothly: a constant 120 fps with less than 50 MB of memory usage and around 30% CPU usage.
However, if I minimize the app and come back to it a minute later, or just stop interacting with the app for a while, the scenes all lag terribly at around 4 fps, despite Xcode reporting 30 fps, normal memory usage, and very low (~3%) CPU usage.
I get this behavior when testing on a real iPhone 7 running iOS 10.3.1, and I'm not sure whether it also occurs on other devices or the simulator.
Here is a sample project I pulled together to demonstrate the issue. (link here) Am I doing something wrong here? How can I make the scenes wake up and resume using as much CPU as they need to maintain good fps?
I probably won't answer the question you've asked directly, but I can give you some points to think about.
I launched your demo app on my 6th-generation iPod touch (64-bit, iOS 10.3.1), and it lags from the very beginning, at 2-3 fps, for about a minute. Then, after some time, it starts to spin smoothly. The same happens after going background and foreground. This can be explained by some caching of textures.
I resized one of the SCNViews so that it fills the screen, leaving the other views behind it, and set v4.showsStatistics = true.
And here is what I got:
As you can see, "Metal flush" takes about 18.3 ms per frame, and that is for only one SCNView.
According to this answer on Stack Overflow:
"So, if my interpretation is correct, that would mean that "Metal flush" measures the time the CPU spends waiting on video memory to free up so it can push more data and request operations to the GPU."
So we might suspect that the problem is four different SCNViews working with the GPU simultaneously.
Let's check that. Compared with the second point above, I deleted the three SCNViews behind and moved their three planets into the front one, so that a single SCNView holds all four planets at once. And here is the screenshot:
As you can see, Metal flush now takes up to 5 ms, and it does so from the beginning; everything runs smoothly. You may also notice that the triangle count (top-right figure) is four times what we saw on the first screenshot.
To sum up: try combining all the SCNNodes into one SCNView, and you may well get a speedup.
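A minimal Swift sketch of that consolidation, under the assumptions stated in the question (four planets, 80-segment spheres, one rotation per minute); positions and radii are placeholders:

import SceneKit
import UIKit

// One SCNView hosting all four planets instead of four separate views.
let sceneView = SCNView(frame: UIScreen.main.bounds)
sceneView.scene = SCNScene()
sceneView.showsStatistics = true   // the statistics bar used for the measurements above

for index in 0..<4 {
    let sphere = SCNSphere(radius: 1.0)
    sphere.segmentCount = 80                      // as in the original scenes
    let planet = SCNNode(geometry: sphere)
    planet.position = SCNVector3(x: Float(index) * 3.0 - 4.5, y: 0, z: -10)
    // Rotate once per minute, as in the original scenes.
    planet.runAction(.repeatForever(.rotateBy(x: 0, y: .pi * 2, z: 0, duration: 60)))
    sceneView.scene?.rootNode.addChildNode(planet)
}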
So, I finally figured out a partially functional solution, even though it's not what I thought it would be.
The first thing I tried was keeping all the nodes in a single global scene, as suggested by Sander's answer, and setting the delegate on one of the SCNViews, as suggested in the second answer to this question. Maybe this used to work, or it worked in a different context, but it didn't work for me.
Where Sander's answer did help me was the performance statistics, which I didn't know existed. I enabled them for one of my scenes, and something stood out:
In the first few seconds of running, before the app hit the dramatic frame drops, the performance display read 240 fps. Why was that, I thought? Who needs 240 fps on a phone with a 60 Hz display, especially when the SceneKit default is 60? Then it hit me: 60 * 4 = 240.
My guess is that each update in a single scene triggered a "Metal flush", meaning the scenes were collectively being flushed 240 times per second. I would guess this slowly fills a GPU buffer (or memory? I have no idea), eventually SceneKit has to start clearing it out, and 240 fps across 4 views is simply too much for it to keep up with (which explains why it initially performs well before dropping off completely).
My solution (and this is why I said "partial solution") was to set preferredFramesPerSecond on each SCNView to 15, for a total of 60. (I can also get away with 30 on my phone, but I'm not sure that holds up on weaker devices.) Unfortunately 15 fps is noticeably choppy, but it's far better than the terrible performance I was getting originally.
Maybe in the future Apple will allow each SCNView to refresh independently.
TL;DR: set preferredFramesPerSecond so that it sums to 60 across all of your SCNViews.
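A small Swift sketch of that budget split; the helper name and the 60 fps default are illustrative, not from the original project:

import SceneKit

// Divide a total frame-rate budget evenly across several SCNViews,
// e.g. four views sharing a 60 fps budget get 15 fps each.
func capFrameRates(of sceneViews: [SCNView], totalBudget: Int = 60) {
    guard !sceneViews.isEmpty else { return }
    let perView = max(totalBudget / sceneViews.count, 1)
    for view in sceneViews {
        view.preferredFramesPerSecond = perView
    }
}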

Xcode Memory Graph - showing increasing memory use - what exactly does it show?

When watching the debug gauge in Xcode 6 (and probably 5 too) while running my application, memory use continues to rise as I place more of a certain object on the screen and animate its movement. It does not seem to decrease when I remove them, and once removed I believe there are no more references to them.
See screenshot:
http://i.stack.imgur.com/SnhbK.png
However, when I use Instruments to try to identify what's going on, only around 12 MB is persisting, and Total Bytes continues to rise, as expected.
See screenshot:
http://i.stack.imgur.com/VBwce.png
Is this normal behaviour? What exactly is the graph in Xcode showing? Am I overlooking something?
In Instruments I have Allocation Lifespan set to All Allocations and Allocation Type set to All Heap and Anonymous VM for the screenshots above.
UPDATE
By running Instruments with Activity Monitor I was able to see that "Real Memory" was increasing at the same rate displayed in Xcode. Mark Szymczyk pointed out that OpenGL ES texture memory allocations are not shown in the Allocations instrument.
By purging the texture cache at regular intervals with the following command in Cocos2D 3.1, memory use consistently drops back down to around 18 MB and begins increasing again as I add more sprites:
[[CCDirector sharedDirector] purgeCachedData];
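For illustration, one way to wire that purge to a timer from Swift; the bridged call CCDirector.sharedDirector() is an assumption about how the Objective-C API above would be exposed to Swift, and the 30-second interval is arbitrary:

import Foundation

// Purge Cocos2D's texture cache at regular intervals.
let purgeTimer = Timer.scheduledTimer(withTimeInterval: 30, repeats: true) { _ in
    CCDirector.sharedDirector().purgeCachedData()   // assumed Swift bridging of the call above
}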
Credits go to Mark Szymczyk for pointing me in this direction - thanks!
Looking at your screenshots, the Xcode graph is probably showing the equivalent of the Total Bytes column in your Instruments screenshot. When you remove an object, the persistent bytes will decrease, but the total bytes won't. That would explain why the memory use never goes down in the Xcode graph.
The Persistent Bytes column in Instruments is what you should be looking at to determine your app's memory usage.
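To see the distinction in miniature, a toy Swift loop (purely illustrative):

import Foundation

// Each pass allocates a 1 MB buffer that dies at the end of the pool.
// In Allocations, Total Bytes climbs by roughly 1 MB per pass, while
// Persistent Bytes stays flat because nothing outlives the loop.
for _ in 0..<1_000 {
    autoreleasepool {
        var buffer = [UInt8](repeating: 0, count: 1_024 * 1_024)
        buffer[0] = 1   // touch the buffer so the allocation isn't elided
    }
}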

Why is memory usage changing when no code is being executed? (AS3)

Hi there,
As you can see from the profiler graph, over the space of one minute the memory rises by about 2 MB and then drops back down, only to rise again to the same point. This is on an almost blank screen where no code is running and no new objects are being created. I've also noticed that on iOS the CPU usage rises and falls in a similar pattern, from 20% up to 70%.
Thanks for reading.
There are many possible reasons. I recently had a similar situation where CPU usage was strangely high.
My debugging methodology was to comment out ALL code other than the boilerplate document class constructor and slowly reintroduce variables, classes and methods (in blocks rather than one at a time!) until the issue reappeared.
In my particular case it turned out to be a network monitor class that I had set up incorrectly.

iPad: Allocations show 3 MB memory while dirty memory continuously increases and reaches greater than 100 MB

I am working on an iPad app that includes lots of images, animations and videos. The app crashes after running for 12 to 13 minutes. Allocations shows me only 3 MB of memory usage, while in VM Tracker the dirty memory continuously increases until it reaches 130 MB, and then the app crashes. But VM Tracker gives me no insight into what is actually happening in the code and which piece of code is responsible for the growing dirty memory.
I am badly stuck and have tried every possible thing: I am not using [UIImage imageNamed:], I have avoided autoreleased objects as far as I can, and I have used autorelease pools wherever a convenience constructor is used. I am using FTUtils for animations in most places, while in some places I am using [UIView beginAnimations:context:], but I am unable to reach any conclusion. Any help will be highly appreciated.
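For reference, a minimal Swift sketch of two of the mitigations mentioned above; the helper and file names are placeholders, not from the poster's code:

import UIKit

// UIImage(named:) caches decoded images for the life of the app, which shows
// up as dirty memory; UIImage(contentsOfFile:) loads without that cache.
// The autorelease pool lets temporaries from each pass die early.
func loadFrames(named names: [String]) -> [UIImage] {
    var frames: [UIImage] = []
    for name in names {
        autoreleasepool {
            if let path = Bundle.main.path(forResource: name, ofType: "png"),
               let image = UIImage(contentsOfFile: path) {
                frames.append(image)
            }
        }
    }
    return frames
}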

fake call to applicationDidReceiveMemoryWarning being triggered even if I have around 80 MB of RAM left

I have NSLog-ed the remaining memory from a timer that repeats every 1 second; it just prints the free memory.
The app's runtime memory requirement is around 20 MB max. The log shows 90+ MB of free memory when I launch the app.
There is a tab bar in which one of the tabs has a Google MapKit map.
Once the application is up and running, free RAM is around 80 MB.
When I scroll through the zoomed map: BOOM! applicationDidReceiveMemoryWarning fires, while the logger still shows the free memory counter at around 75-80 MB.
This causes my other views' data to be released.
Anyway, even when RAM is available and the app doesn't crash, panning the map drastically reduces free RAM from 70-80 MB to 3-4 MB. At that point, if the app asks for more memory, say for a captured image, BOOM again: applicationDidReceiveMemoryWarning.
Has anyone experienced this before? Any helpful comments?
If you haven't, just try it out with Apple's weather map sample app or the native Maps application on your phone, and see how the map eats up runtime memory when you pan a zoomed map.
That's how it's supposed to work. Memory is there to be used; any that isn't used is being wasted. As long as you have no memory leaks and you unload and reload your views and caches correctly, there is no problem here.
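As an illustration of "unloading caches correctly", a minimal Swift sketch (the controller and cache property are hypothetical): shed reproducible data when the system signals pressure, rather than watching a free-memory counter.

import UIKit

class MapViewController: UIViewController {
    // Hypothetical cache of rendered map tiles, rebuildable on demand.
    var tileImageCache: [String: UIImage] = [:]

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Drop anything reproducible; the warning is driven by overall
        // system pressure, not by the app's own free-memory reading.
        tileImageCache.removeAll()
    }
}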
