Too much memory consumed with setContentView

I have been facing a problem for many days now and I cannot figure out what's going wrong. I have a layout with 3 buttons (3 more appear when buttons are pressed), a background image, and a background on the action bar. The buttons are 294x286 (32-bit color), the background image is 367x592, and my screen resolution is 1080x1092,
which I got from the code below:
Display display = getWindowManager().getDefaultDisplay();
android.graphics.Point size = new android.graphics.Point();
display.getSize(size);
Log.e("MemoryToUse", Integer.toString(size.x) + "/" + Integer.toString(size.y));
When I use this layout in setContentView(...), memory usage increases by 25 MB, and my first activity (not the whole app) needs over 52 MB to load.
When I set all the bitmap backgrounds to "@null", the size dropped to 25 MB.
I am using the same background and about 3 buttons in every activity, but my first activity's layout is taking 25 MB while the other activities (even with more GUI elements, at least 3 buttons, and more lines of code) take about 20 MB.
I need to reduce the first activity's memory footprint, but I have no clue how.
Any help or suggestions would be highly appreciated. My activity is empty (I used a blank activity to check how much memory the layout itself takes). Let me know if you have any questions, or if any specific code snippet is needed for analysis.
Thanks.
GB.

A friend of mine shared the Stack Overflow link below, and it worked:
Android background image memory usage

Related

SceneKit scenes lag when resuming app

In my app, I have several simple scenes (a single 80 segment sphere with a 500px by 1000px texture, rotating once a minute) displaying at once. When I open the app, everything goes smoothly. I get constant 120fps with less than 50mb of memory usage and around 30% cpu usage.
However, if I minimize the app and come back to it a minute later, or just stop interacting with the app for a while, the scenes all lag terribly and get around 4 fps, despite Xcode reporting 30fps, normal memory usage, and super low (~3%) cpu usage.
I get this behavior when testing on a real iPhone 7 running iOS 10.3.1, and I'm not sure whether it also happens on other devices or in the simulator.
Here is a sample project I pulled together to demonstrate this issue. (link here) Am I doing something wrong here? How can I make the scenes wake up and resume using as much CPU as they need to maintain a good frame rate?
I probably won't answer the question you've asked directly, but I can give you some points to think about.
I launched your demo app on my 6th-gen iPod touch (64-bit), iOS 10.3.1, and it lags from the very beginning for about a minute, at 2-3 FPS. Then, after some time, it starts to spin smoothly. The same thing happens after going background and foreground. This can be explained by some caching of textures.
I resized one of the SCNViews so that it fills the screen, leaving the other views behind it, and set v4.showsStatistics = true.
Here is what I got:
as you can see, Metal flush takes about 18.3 ms per frame, and that's for just one SCNView.
According to this answer on Stack Overflow:
"So, if my interpretation is correct, that would mean that 'Metal flush' measures the time the CPU spends waiting on video memory to free up so it can push more data and request operations to the GPU."
So we might suspect that the problem is the 4 different SCNViews working with the GPU simultaneously.
Let's check it. Compared to the second point above, I deleted the 3 SCNViews behind and moved the 3 planets from those views into the front one, so that a single SCNView shows all 4 planets at once. Here is the screenshot:
as you can see, Metal flush now takes at most about 5 ms, right from the start, and everything runs smoothly. You may also notice that the triangle count (top right) is four times what we see in the first screenshot.
To sum up, try combining all the SCNNodes into one SCNView; you may well get a speedup.
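To make that concrete, here is a minimal Swift sketch of a single SCNView hosting all four spheres. The texture asset names, positions, and view-controller name are my own assumptions, not taken from the sample project.

import SceneKit
import UIKit

// A single SCNView whose scene contains all four spheres,
// instead of four SCNViews each rendering one sphere.
final class CombinedPlanetsViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        let scnView = SCNView(frame: view.bounds)
        scnView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        scnView.scene = SCNScene()
        scnView.showsStatistics = true   // the same statistics overlay used above
        view.addSubview(scnView)

        for index in 0..<4 {
            let sphere = SCNSphere(radius: 1.0)
            sphere.segmentCount = 80
            // "planetTexture0"..."planetTexture3" are assumed asset names.
            sphere.firstMaterial?.diffuse.contents = UIImage(named: "planetTexture\(index)")

            let node = SCNNode(geometry: sphere)
            node.position = SCNVector3(Float(index) * 2.5 - 3.75, 0, -8)

            // One full rotation per minute, as in the question.
            let spin = SCNAction.rotateBy(x: 0, y: .pi * 2, z: 0, duration: 60)
            node.runAction(SCNAction.repeatForever(spin))

            scnView.scene?.rootNode.addChildNode(node)
        }
    }
}

Since the scene has no camera node, SceneKit falls back to its default camera, which is enough for a quick comparison of the statistics overlay.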
So, I finally figured out a partially functional solution, even though it's not what I thought it would be.
The first thing I tried was keeping all the nodes in a single global scene, as suggested by Sander's answer, and setting the delegate on one of the SCNViews, as suggested in the second answer to this question. Maybe this used to work, or it worked in a different context, but it didn't work for me.
Where Sander's answer did end up helping me was the performance statistics, which I didn't know existed. I enabled them for one of my scenes, and something about the performance stood out:
In the first few seconds of running, before the app hits the dramatic frame drops, the performance display read 240 fps. "Why is that?", I thought. Who would need 240 fps on a mobile phone with a 60 Hz display, especially when the SceneKit default is 60? Then it hit me: 60 * 4 = 240.
What I guess is happening is that each update in each scene triggers a "Metal flush", so across the four views the renderer is being flushed 240 times per second. I would guess that this slowly fills the GPU buffer (or memory? I have no idea), and eventually SceneKit needs to start clearing it out, and 240 flushes per second across 4 views is simply too much for it to keep up with (which explains why performance is initially good before dropping completely).
My solution (and this is why I said "partial solution") was to set the preferredFramesPerSecond of each SCNView to 15, for a total of 60. (I can also get away with 30 on my phone, but I'm not sure whether that holds up on weaker devices.) Unfortunately 15 fps is noticeably choppy, but it is way better than the terrible performance I was getting originally.
Maybe in the future Apple will enable unique refreshes per SceneView.
TL;DR: set preferredFramesPerSecond to sum to 60 over all of your SceneViews.
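A minimal Swift sketch of that TL;DR, assuming the four SCNViews are collected in an outlet collection (the class and property names are mine, not from the question):

import SceneKit
import UIKit

final class MultiSceneViewController: UIViewController {
    // Hypothetical outlet collection holding the four SCNViews.
    @IBOutlet var sceneViews: [SCNView]!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Split a ~60 fps budget evenly across the views (60 / 4 = 15).
        let perViewRate = 60 / max(sceneViews.count, 1)
        for scnView in sceneViews {
            scnView.preferredFramesPerSecond = perViewRate
        }
    }
}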

Optimizing a lot of UIImageViews

I've run into an optimization problem. I have to display a lot of images on screen at the same time (up to 150). The images are laid out as if on a sphere that can roll.
So far it runs at 60 fps on a new device, but there are some micro-lags and I don't know what causes them. The profiler tells me 59-60 fps, but the missing frames are visible and feel like micro-stutter.
If I drastically reduce the number of image views, the problem disappears.
It doesn't seem to be GPU-limited, because I use at most about 30% of the GPU, but the CPU never goes above 50% either.
What could cause this, and how can I track down the cause of the micro-lag? (The images are initialized with contentsOfFile.)
What is the best way to improve performance, knowing that at least half of the images are not on screen (the other side of the sphere)?
- Recycle UIImageViews
- Set image views that aren't shown to hidden (that's what I'm doing right now; a rough sketch of this is below the link)
- Use a rasterization layer (but since the size changes all the time, it will re-render, right?)
- Another magic solution that I'm not aware of.
If you want to try the app:
https://itunes.apple.com/us/app/oolala-instant-hangout-app/id972713282?mt=8
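For reference, here is a rough sketch of the "hide the far side" approach mentioned in the second bullet above, simplified to a single rotating ring rather than a real sphere; the property names and projection math are illustrative, not from the app.

import UIKit

final class SphereGalleryView: UIView {
    // Each image view sits at a fixed angle around the sphere's vertical axis.
    private var imageViews: [UIImageView] = []
    private var angles: [CGFloat] = []           // base angle of each item, in radians

    // Updated by the pan gesture that "rolls" the sphere.
    var rotation: CGFloat = 0 {
        didSet { layoutSphere() }
    }

    private func layoutSphere() {
        let radius = bounds.width / 2
        for (index, imageView) in imageViews.enumerated() {
            let angle = angles[index] + rotation
            let depth = cos(angle)               // > 0 means the item faces the viewer

            // Anything on the back half of the sphere is skipped entirely.
            imageView.isHidden = depth <= 0
            guard depth > 0 else { continue }

            // Very rough projection onto the screen plane, just for illustration.
            imageView.center = CGPoint(x: bounds.midX + sin(angle) * radius,
                                       y: bounds.midY)
            let scale = 0.5 + 0.5 * depth        // shrink items near the silhouette
            imageView.transform = CGAffineTransform(scaleX: scale, y: scale)
        }
    }
}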
1) The way those shapes are created also matters. The most performant way to do it (that I'm aware of) would be to pre-render the images with an alpha channel.
2) There is also the "classic" JPEG decompression problem (a sketch of the usual workaround follows this answer).
3) We used CALayer directly to render ~300 images in a UIScrollView on an iPhone 4S screen without dropped frames. Might be worth a shot.
_layer = [CALayer new];
_layer.contents = (id)_image.CGImage;
_layer.opaque = _opaque;
_layer.contentsScale = [UIScreen mainScreen].scale;
It was a long time ago and was used on iOS 5. You should probably disregard this suggestion unless you have a fast way to test it.
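On point 2, a common workaround (sketched below; the helper name is mine and this is not taken from the answer above) is to draw each image once on a background queue so decompression does not happen during scrolling:

import UIKit

// Forces the JPEG/PNG data to be decoded up front, off the main thread,
// so the first on-screen draw does not pay the decompression cost.
func decodedImage(atPath path: String, completion: @escaping (UIImage?) -> Void) {
    DispatchQueue.global(qos: .userInitiated).async {
        guard let image = UIImage(contentsOfFile: path) else {
            DispatchQueue.main.async { completion(nil) }
            return
        }
        // Drawing the image once forces the compressed bitmap to be decoded now.
        let renderer = UIGraphicsImageRenderer(size: image.size)
        let decoded = renderer.image { _ in
            image.draw(at: .zero)
        }
        DispatchQueue.main.async { completion(decoded) }
    }
}

On newer iOS versions, UIImage's preparingForDisplay() achieves essentially the same thing without the manual draw.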

UICollectionView low frame rate

I have a collection view that displays 3 images, two labels, and 1 attributed string (the strings have different colors and font sizes, and the values are not unique for every cell). One of the images comes from the web, and I use AFNetworking to handle the downloading and caching. The collection view displays 15 cells simultaneously.
When I scroll I can only achieve 25 frames/sec.
Below are the things I have done:
- Processing of the data is done ahead of time and cached in objects
- Images and views are opaque
- Cells and views are reused
I have done all the optimizations I know of, but I can't reach even 55 frames/sec.
It would be great if you could share other techniques to speed up the reuse of cells.
I was even thinking of pre-rendering the subviews off screen and caching them somewhere, but I am not sure how that is done.
When I run the app on the iPhone it is fast, since it only shows around four cells at a time.
The first thing you need to do is fire up Instruments and find out whether you're CPU-bound (computation or regular I/O is the bottleneck) or GPU-bound (the graphics card is struggling); depending on which of these is the issue, the solution varies.
Here's a video tutorial that shows how to do this (among other things). It's from Sean Woodhouse at Itty Bitty Apps (they make the fine Reveal tool).
NB: In the context of performance tuning we usually talk about being I/O-bound or CPU-bound as separate concerns; however, I've grouped them together here to mean "due to either slow computation or slow I/O, data is not getting to the graphics card fast enough". If this is indeed the problem, then the next step is to find out whether it is related to waiting on I/O or to the CPU being maxed out.
Instruments can be really confusing at first, but the videos above helped me harness its power.
And here's another great tutorial from Anthony Egerton.
What is the size of the image you are using?
One optimization technique that would work is to
resize the image so that it matches the size of the view you are displaying it in.
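A small sketch of that resizing step, assuming it is done once when the image arrives and before it is assigned to the cell (the helper and property names are mine):

import UIKit

// Downscales an image to the point size of the view that will display it,
// so the GPU is not scaling a much larger bitmap on every frame.
func resizedImage(_ image: UIImage, toFit targetSize: CGSize) -> UIImage {
    let format = UIGraphicsImageRendererFormat.default()
    format.scale = UIScreen.main.scale                 // keep Retina sharpness
    let renderer = UIGraphicsImageRenderer(size: targetSize, format: format)
    return renderer.image { _ in
        image.draw(in: CGRect(origin: .zero, size: targetSize))
    }
}

// Usage when configuring a cell (names are illustrative):
// cell.thumbnailView.image = resizedImage(downloadedImage,
//                                         toFit: cell.thumbnailView.bounds.size)

Ideally this runs on a background queue right after the download completes, so scrolling never waits on the resize.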

Faster Alternatives to UIImageView in iOS

I have a list of PNG images that I want to show one after another as an animation. In most cases I use a UIImageView with animationImages and it works fine. But in a couple of cases my PNGs are 1280x768 (full-screen iPad) animations with 100+ frames. Using UIImageView for these is quite slow in the simulator (it takes too long to load the first time), and I believe it will be even slower on the device.
Is there any alternative that can show an image sequence smoothly? Maybe Core Animation? Is there a working example I can look at?
Core Animation can be used for vector/keyframe-based animation, not image sequences. Loading over a hundred full-screen PNGs on an iPad is a really bad idea; you'll almost certainly get a memory warning, if not outright termination.
You should be using a video to display this kind of animation; performance will be considerably better. Is there any reason why you couldn't use an H.264 video for your animation?
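A minimal sketch of that video approach, assuming the frame sequence has already been re-encoded as an H.264 file named animation.mp4 in the app bundle (both the file and the class name are assumptions):

import AVFoundation
import UIKit

final class AnimationViewController: UIViewController {
    private var player: AVPlayer?

    override func viewDidLoad() {
        super.viewDidLoad()
        // "animation.mp4" is an assumed asset: the PNG sequence re-encoded as H.264.
        guard let url = Bundle.main.url(forResource: "animation", withExtension: "mp4") else { return }

        let player = AVPlayer(url: url)
        let playerLayer = AVPlayerLayer(player: player)
        playerLayer.frame = view.bounds
        playerLayer.videoGravity = .resizeAspectFill
        view.layer.addSublayer(playerLayer)

        self.player = player       // keep a strong reference so playback continues
        player.play()
    }
}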
Make a video of your pictures. It is the simplest and probably most reasonable approach.
If you want really good performance and full control over your animation, you can convert the pictures to pvrtc4 format and draw them as billboards (textured sprites) with OpenGL. This can be a lot of work if you don't know how to do it.
Look at the second example at http://www.modejong.com/iPhone/
Extracts from that page:
There is also the UIImageView.animationImages API, but it quickly eats up all the system memory when using more than a couple of decent-sized images.
I wanted to show a full-screen animation that lasts 2 seconds; at 15 FPS that is a total of 30 PNG images of size 480x320. This example implements an animation-oriented view controller that simply waits to read the PNG image data for a frame until it is needed.
Instead of allocating many megabytes, this class runs in about half a megabyte of memory, with about 5-10% CPU utilization on a 2nd-gen iPhone. The example has also been updated to include the ability to optionally play an audio file via AVAudioPlayer while the animation is displayed.
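In the same lazy-loading spirit (and only as a sketch: the class, the 15 fps cap, and the frame_000.png naming scheme are assumptions, not modejong's code), a frame can be decoded just before it is shown:

import UIKit

// Decodes one frame just before it is shown, instead of holding the
// whole PNG sequence in memory like animationImages does.
final class LazyFrameAnimator: NSObject {
    private let imageView: UIImageView
    private let frameCount: Int
    private var frameIndex = 0
    private var displayLink: CADisplayLink?

    init(imageView: UIImageView, frameCount: Int) {
        self.imageView = imageView
        self.frameCount = frameCount
        super.init()
    }

    func start() {
        let link = CADisplayLink(target: self, selector: #selector(step))
        link.preferredFramesPerSecond = 15        // matches the 15 FPS example above
        link.add(to: .main, forMode: .common)
        displayLink = link
    }

    func stop() {
        // The display link retains its target, so call this when the animation ends.
        displayLink?.invalidate()
        displayLink = nil
    }

    @objc private func step() {
        // "frame_000.png", "frame_001.png", ... is an assumed naming scheme.
        let name = String(format: "frame_%03d", frameIndex)
        if let path = Bundle.main.path(forResource: name, ofType: "png") {
            // contentsOfFile avoids UIImage(named:)'s cache, so old frames can be released.
            imageView.image = UIImage(contentsOfFile: path)
        }
        frameIndex = (frameIndex + 1) % frameCount
    }
}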

What is the maximum width for a frame in an iPad application?

My application is like a calendar view, and I am setting up all the views on load, so I have to build 12 grid views for every year. But the iPad 3 gives me memory warnings after building about 13 grid views, which makes sense because that is a lot of data. So I searched for another way and found the infinite-scrolling StreetScroller sample, but when I tried it the data seemed to have to be fixed (e.g. I can't change what data is loaded based on the scroll position, to load the next year). Is that correct, or am I missing something? Is there a way to use this approach that I don't know about?
Please help me find a solution.
From what I understand, you're implementing a calendar-type view and you get memory issues with large amounts of data.
In answer to the title of this question: there is no maximum width for a frame (I assume you mean a UIView) in an iOS application. But it is important to make sure you manage your memory correctly. For example, I could have a view that is 749202 points wide and contains detailed charts/text/images/etc. If I filled out this view in full when it first loaded, the application would crash; it would use too much memory.
To make sure this doesn't happen, I need to optimize my memory usage. For example, I know that the maximum width of the iPad in portrait is 768 pts and in landscape is 1024 pts (discoverable via the width and height of the view's window). So I will only create/render my content when it is (or is about to be) in view. This means I only have to render about 0.1% of my total width at any time, which is far more manageable memory-wise.
As for your specific situation, the description is vague and there isn't any code, so this is the best information I can give. Regarding the StreetScroller example (if I remember correctly, it is the WWDC sample project for UIScrollView), I don't know exactly what issue you're having with it.
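To make the "only render what is about to be visible" idea concrete, here is a small sketch using assumed numbers (one 1024 pt page per year grid, 12 pages); the page factory is a placeholder, not the asker's code.

import UIKit

final class YearScrollViewController: UIViewController, UIScrollViewDelegate {
    private let scrollView = UIScrollView()
    private let pageWidth: CGFloat = 1024              // one year grid per page (assumed size)
    private let pageCount = 12
    private var visiblePages: [Int: UIView] = [:]      // index -> page view currently alive

    override func viewDidLoad() {
        super.viewDidLoad()
        scrollView.frame = view.bounds
        scrollView.contentSize = CGSize(width: pageWidth * CGFloat(pageCount),
                                        height: view.bounds.height)
        scrollView.delegate = self
        view.addSubview(scrollView)
        scrollViewDidScroll(scrollView)                 // build the initial pages
    }

    func scrollViewDidScroll(_ scrollView: UIScrollView) {
        let first = max(0, min(pageCount - 1, Int(scrollView.contentOffset.x / pageWidth) - 1))
        let last = min(pageCount - 1,
                       Int((scrollView.contentOffset.x + scrollView.bounds.width) / pageWidth) + 1)

        // Tear down pages that have scrolled far off screen so their memory is reclaimed.
        for (index, page) in visiblePages where index < first || index > last {
            page.removeFromSuperview()
            visiblePages[index] = nil
        }
        // Build only the pages that are visible (plus one on each side).
        for index in first...last where visiblePages[index] == nil {
            let page = makeGridView(forYearPage: index)
            page.frame = CGRect(x: CGFloat(index) * pageWidth, y: 0,
                                width: pageWidth, height: scrollView.bounds.height)
            scrollView.addSubview(page)
            visiblePages[index] = page
        }
    }

    private func makeGridView(forYearPage index: Int) -> UIView {
        // Placeholder: build the real calendar grid for this page here.
        let page = UIView()
        page.backgroundColor = index % 2 == 0 ? .systemGray5 : .systemGray6
        return page
    }
}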
