CIContext, iOS 9 and memory issues - ios

So I recently updated iOS to 9.0.2.
I've been using RosyWriter, Apple's example to capture and filter video frames using CIFilter and CIContext.
And it worked great in iOS 7 and 8.
It all broke down in iOS 9.
Now the memory report in both RosyWriter and my app shows memory growing steadily, and eventually the app crashes.
I call [_ciContext render:toCVPixelBuffer:bounds:colorSpace:] and [CIImage imageWithCVPixelBuffer:]. It looks like CIContext has an internal memory leak when I call these two methods.
After spending about 4 days on this, I found that if I create a new CIContext instance every time I want to render a buffer and release it afterwards, memory stays down. But that is not a real solution, because creating a CIContext is too expensive.
Does anyone else have this problem? Is there a way around it?
Thanks.

I can confirm that this memory leak still exists on iOS 9.2. (I've also posted on the Apple Developer Forum.)
I get the same memory leak on iOS 9.2. I've tested dropping EAGLContext in favor of MetalKit and MTLDevice, and I've tested different CIContext methods such as drawImage, createCGImage and render, but nothing seems to work.
It is very clear that this is a bug as of iOS 9. Try it yourself: download the example app from Apple (see below), run the same project on a device with iOS 8.4 and then on one with iOS 9.2, and watch the memory gauge in Xcode.
Download
https://developer.apple.com/library/ios/samplecode/AVBasicVideoOutput/Introduction/Intro.html#//apple_ref/doc/uid/DTS40013109
Add this to APLEAGLView.h:20
@property (strong, nonatomic) CIContext* ciContext;
Replace APLEAGLView.m:118 with this
[EAGLContext setCurrentContext:_context];
_ciContext = [CIContext contextWithEAGLContext:_context];
And finally, replace APLEAGLView.m:341-343 with this
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
@autoreleasepool
{
CIImage* sourceImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
CIFilter* filter = [CIFilter filterWithName:@"CIGaussianBlur" keysAndValues:kCIInputImageKey, sourceImage, nil];
CIImage* filteredImage = filter.outputImage;
[_ciContext render:filteredImage toCVPixelBuffer:pixelBuffer];
}
glBindRenderbuffer(GL_RENDERBUFFER, _colorBufferHandle);

Just use the code below after you are done with the context
context = [CIContext contextWithOptions:nil];
and release CGImageRef object also
CGImageRelease(<CGImageRef IMAGE OBJECT>);

Krafter,
Are you writing custom filters? I'm finding that the dod (domain of definition) works differently in iOS 9.
It looks like if dod.origin.x and dod.origin.y are not close to whole numbers stored as doubles (e.g. 31.0, 333.0), then the extent.size of the output image will be (dod.size.width + 1.0, dod.size.height + 1.0). Before iOS 9.0, the extent.size of the output image was always dod.size. So, if (you are cycling the same image through a custom CIFilter over and over && your dod isn't close to nice, even whole numbers) { you get an image whose dimensions increase by 1.0 each time the filter runs, and that might produce a memory profile like yours. }
I'm assuming this is a bug in iOS 9, because the size of the output image should always match the size of dod.
My setup: iOS v9.2, iPhone 5C and iPad 2, Xcode 7.2, Obj-C

Related

"[CALayer renderInContext]" crashes on iPhone X

I have a custom UIView that I want to render as an UIImage. The custom UIView is a subclass of UIImageView.
Inside this view, I am rendering some UI elements (drawing a bunch of circles over the image). The number of circles added can go up to the order of thousands.
I am using this simple code snippet to render the view as an UIImage:
// Create the UIImage (code runs on the main thread inside an @autoreleasepool block)
UIGraphicsBeginImageContextWithOptions(viewToSave.image.size, viewToSave.opaque, 1.0);
[viewToSave.layer renderInContext:UIGraphicsGetCurrentContext()];
imageToSave = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
// Since I'm rendering some stuff inside the .layer of the UIView,
// I don't think I can use "drawViewHierarchyInRect: afterScreenUpdates:"
Here are the memory allocations taken from Instruments, in an example with ~3000 circles added as subviews:
Now here's the strange part... This runs fine, and I can render the image multiple times in a row and save it to the image gallery, on devices like the iPhone 5, iPhone 5s, iPhone 6s, iPad Air 2 and iPad Mini 4... But the same code triggers a memory warning on the iPhone X and eventually crashes the application...
Unfortunately, I do not have access to an iPhone X and the person who reported this doesn't have access to a Mac, so I cannot investigate deeper.
I really don't know if I am doing anything wrong... Are you aware of anything different about the iPhone X? I've been struggling with this issue for quite a while...
I guess the problem has to do with how CALayer renderInContext: handles the drawing of thousands of views in a context that requires them to be scaled up. Would it be possible to try rendering the sub-views yourself? Then compare, and verify with Instruments whether it works better.
UIImage *imageToSave = [self imageFromSublayers:viewToSave];
- (UIImage *)imageFromSublayers:(UIImageView *)imageView {
CGSize size = imageView.image.size;
UIGraphicsBeginImageContextWithOptions(size, YES, .0);
CGContextRef context = UIGraphicsGetCurrentContext();
for (CALayer *layer in imageView.layer.sublayers)
[layer renderInContext:context];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return image;
}
As always, the answer lies in the smallest (and not included in the question) detail...
What I really didn't consider (and found out after quite a while) is that the iPhone X has a scale factor of @3x. This is the difference between the iPhone X and all the other devices that were running the code just fine...
At some point in my code, I was setting the .contentScaleFactor of the subviews to be equal to [UIScreen mainScreen].scale. This means that on higher-end devices, the image quality should be better.
For the iPhone X, [UIScreen mainScreen].scale returns 3.
For all of the other devices I have tested with, [UIScreen mainScreen].scale returns 2. This means that on the iPhone X, the amount of memory used to render the image is far higher.
Fun fact number two: from another useful SO post, I found out that on the iPhone X, if you try to allocate more than 50% of its total amount of memory (1392 MB), the app crashes. On the other hand, in the case of the iPhone 6s, for example, the percentage is higher: 68% (1396 MB). This means that on some older devices you have more memory to work with than on the iPhone X.
Sorry for misleading you; it was an honest mistake on my part. Thank you all for your answers!
I recently wrote a method into an app I'm making to convert UIViews into UIImages (so I could display gradients on progress views/tab bars). I ended up settling on the following code. I'm using it to render tab bar buttons, and it works on all devices, including the X.
Objective C:
UIGraphicsImageRenderer *renderer = [[UIGraphicsImageRenderer alloc] initWithSize:gradientView.bounds.size];
UIImage *gradientImage = [renderer imageWithActions:^(UIGraphicsImageRendererContext * _Nonnull rendererContext) {
[gradientView drawViewHierarchyInRect:gradientView.bounds afterScreenUpdates:true];
}];
Swift 4:
let renderer = UIGraphicsImageRenderer(size: gradientView.bounds.size)
let image = renderer.image { ctx in
gradientView.drawHierarchy(in: gradientView.bounds, afterScreenUpdates: true)
}
I used this code in a sample project I wrote up, here is a link to the project files:
Swift,
Objective C
As you will see, both projects run perfectly on the iPhone X!
I know the following sounds weird, but try making the target image one pixel larger than the one you are drawing. This solved it for me (my particular problem: "[CALayer renderInContext]" crashes on iPhone X).
In code:
UIGraphicsBeginImageContextWithOptions(
CGSizeMake(viewToSave.image.size.width + 1,
viewToSave.image.size.height + 1),
viewToSave.opaque,
1.0
);

GLKView snapshot returns distorted image

I am using Apple's CIFunHouse demo in my project to apply filter effects. When I try to take a snapshot of the GLKView on an iPad Air by making
UIImage* imageCaptured = [(GLKView*)_videoPreviewView snapshot];
the generated UIImage has distortion. See the image.
Any ideas? How can I fix this?
Thanks in advance.

CIImage to CMSampleBufferRef conversion

I am using captureStillImageAsynchronouslyFromConnection: to get an image sample buffer from the camera. After that I run the image through OpenGL (on the GPU) to apply filters, but unfortunately someone at Apple put a camera on the iPhone 4 whose output is bigger than the maximum texture size.
Brad Larson's explanation: the iPhone 4 is a special case, in that it can take photos large enough (2592x1936) that they just exceed the maximum texture size of the GPU on those devices (2048x2048). This currently causes the processing to fail. All other devices either don't take photos that large, or support larger texture sizes (the iPad 2, iPad 3 and iPhone 4S support these larger sizes).
So the code I have scales down the image, but then I have to create a CMSampleBufferRef after resizing (on the iPhone 4 only) to cheat the capture process... Does anyone know how to get a CMSampleBufferRef from a CIImage?
Objective-C
CIImage *ciImage = [CIImage imageWithCVPixelBuffer:CMSampleBufferGetImageBuffer(sampleBuffer)
options:[NSDictionary dictionaryWithObjectsAndKeys:[NSNull null], kCIImageColorSpace, nil]];
ciImage = [[ciImage imageByApplyingTransform:myScaleTransform] imageByCroppingToRect:myRect];

CATiledLayer PDF Performance is Poor on iPad 3 Retina Display

I'm using rather straightforward code to display a zoomable PDF in a scrollview, and it has been working beautifully on the iPad 2 and the original iPad. But it's staggeringly slow on the iPad 3. I know I'm pushing more pixels, but the rendering performance is simply unacceptable.
In iOS 5.0 and later, the tileSize property is arbitrarily clamped at 1024, which means tiles appear half that size on the retina display. Has anyone found a way to overcome this limitation?
Otherwise, has anyone found a way to improve the speed of the CATiledLayer on the iPad 3?
Have you tried setting shouldRasterize to YES on the layer?
Did you run a time profiler on these draws and did you rule out the possibility of redundant draws?
I've had some weird double drawing, which was easily found using:
- (void)drawLayer:(CALayer *)layer inContext:(CGContextRef)context
{
NSLog(@"draw %@", NSStringFromCGRect(CGContextGetClipBoundingBox(context)));
// draw pdf
}
There's also a variety of settings to play with:
tiledLayer.levelsOfDetail = 2
tiledLayer.levelsOfDetailBias = 4
tiledLayer.tileSize = self.bounds.size
CGContextSetInterpolationQuality(context, kCGInterpolationLow)
CGContextSetRenderingIntent(context, kCGRenderingIntentDefault)
self.contentScaleFactor = 1.0

OpenGL GL_RGB texture format isn't working on iOS (GL_RGBA works great)

Hello people of the wasteland :),
Brief: There is a problem with GL_RGB internal texture format on iOS platform.
In my application I try to save some memory by using GL_RGB instead of GL_RGBA as an internal format.
I'm using the following piece of code to achieve this. Nothing else is changed.
glTexImage2D(_textureTargetType,
0,
GL_RGB, // pixel internalFormat
texWidth, // image width
texHeight, // image height
0, // border
GL_RGBA, // pixel format
GL_UNSIGNED_BYTE, // pixel data type
bitmapData);
On macOS these changes went smoothly, no problems. But on iOS, particularly 4.3 (OpenGL ES 2.0), it gives me GL_INVALID_OPERATION every time I try to render textured polygons with this texture. As nothing except this format has changed, I think the problem is an incompatibility between the GL_RGB internal format and OpenGL ES 2.0. This is just my guess; I'm no guru.
This doesn't work in the simulator, nor on an iPod touch 4th gen.
Thank you for any reasonable suggestion.
According to the documentation, "internalformat must match format. No conversion between formats is supported during texture image processing." See the Khronos website. Desktop OpenGL does not have this limitation, which is why this code works on Mac OS but not with the more limited OpenGL ES on iOS devices.
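One way to satisfy the matching-format rule is to repack the RGBA bitmap into a tightly packed RGB buffer before upload, so both internalformat and format can be GL_RGB. A sketch of the repacking step (remember that GL_UNPACK_ALIGNMENT defaults to 4, so set it to 1 when the row length in bytes isn't a multiple of 4):

```c
#include <stdlib.h>

/* Repack an RGBA8 buffer into tightly packed RGB8, dropping alpha,
   so glTexImage2D can use GL_RGB for both internalformat and format.
   Caller owns (and must free) the returned buffer. */
static unsigned char *rgba_to_rgb(const unsigned char *rgba,
                                  int width, int height) {
    unsigned char *rgb = malloc((size_t)width * height * 3);
    if (!rgb) return NULL;
    for (int i = 0; i < width * height; i++) {
        rgb[i * 3 + 0] = rgba[i * 4 + 0]; /* R */
        rgb[i * 3 + 1] = rgba[i * 4 + 1]; /* G */
        rgb[i * 3 + 2] = rgba[i * 4 + 2]; /* B; alpha is discarded */
    }
    return rgb;
}
```

The upload then becomes glPixelStorei(GL_UNPACK_ALIGNMENT, 1) followed by glTexImage2D with GL_RGB/GL_RGB/GL_UNSIGNED_BYTE. Note the memory saving may be smaller than expected: many GPUs pad RGB textures to 32 bits per texel internally anyway.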
