UIImage initWithCGImage leads to memory issue in Obj-C

I am developing an application that grabs images from the iPhone's rear camera; these images are then processed asynchronously.
I am using AVFoundation in Objective-C. My problem is that my app crashes with a memory error while capturing images.
Here is the code I use in the captureOutput callback:
- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    connection.videoOrientation = AVCaptureVideoOrientationLandscapeLeft;
    CVPixelBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *ciimage = [[CIImage alloc] initWithCVPixelBuffer:imageBuffer];
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [context createCGImage:ciimage fromRect:[ciimage extent]];
    @synchronized(self) {
        UIImage *image = [[UIImage alloc] initWithCGImage:cgImage];
        self.uiimageBuffer = image;
    }
    CGImageRelease(cgImage);
}
As I need to asynchronously process the grabbed image elsewhere in the application, I introduced a buffer called uiimageBuffer. This buffer is updated every time captureOutput is called, as shown below:
UIImage *image = [[UIImage alloc] initWithCGImage:cgImage];
self.uiimageBuffer = image;
But allocating a new UIImage leads to a memory issue very quickly (within a few seconds).
So my question is: how can I update my buffer without allocating a new UIImage on every call to captureOutput?
PS: the same piece of code written in Swift 4 doesn't lead to a memory issue.
Thank you

How about @autoreleasepool? This has helped me several times, in both captureOutput and requestMediaDataWhenReady.
https://developer.apple.com/library/archive/documentation/Cocoa/Conceptual/MemoryMgmt/Articles/mmAutoreleasePools.html
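For illustration, here is a minimal sketch (not from the original answer) of the asker's callback wrapped in an explicit @autoreleasepool, so the temporary CIImage, CGImage, and UIImage created for each frame are drained once per call instead of accumulating while the capture queue keeps delivering frames; the uiimageBuffer property is taken from the question:

- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    @autoreleasepool {
        // Everything created below is released when the pool drains at the end
        // of each callback, rather than at some later, queue-dependent time.
        CVPixelBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CIImage *ciimage = [[CIImage alloc] initWithCVPixelBuffer:imageBuffer];
        CIContext *context = [CIContext contextWithOptions:nil];
        CGImageRef cgImage = [context createCGImage:ciimage fromRect:[ciimage extent]];
        @synchronized(self) {
            self.uiimageBuffer = [[UIImage alloc] initWithCGImage:cgImage];
        }
        CGImageRelease(cgImage);
    }
}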

Related

Lag when setting AVCaptureConnection video orientation

The problem in question uses AVFoundation to set up a camera whose output is displayed in an AVCaptureVideoPreviewLayer and is also processed as a pixel buffer. In order for the pixel buffer to be processed by the -processSampleBuffer: method, it must be provided in the correct orientation, which depends on the device orientation.
As far as I know, this can be done either by rotating the pixel buffer as it's delivered to the sample buffer delegate method, by accessing the raw pixel values in -captureOutput:didOutputSampleBuffer:fromConnection:, or by setting the videoOrientation property on the appropriate AVCaptureConnection, which ensures the pixel buffer is provided in the desired orientation. An outline of the setup is as follows:
- (void)setupCamera
{
    AVCaptureSession *session = [AVCaptureSession new];
    AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:backCamera error:nil];
    AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    [session addInput:deviceInput];

    dispatch_queue_t videoOutputQueue = dispatch_queue_create("com.MyApp.videoQueue", DISPATCH_QUEUE_SERIAL);
    dispatch_set_target_queue(videoOutputQueue, dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0));

    AVCaptureVideoDataOutput *videoOutput = [AVCaptureVideoDataOutput new];
    videoOutput.alwaysDiscardsLateVideoFrames = YES;
    [videoOutput setSampleBufferDelegate:self queue:videoOutputQueue];
    [session addOutput:videoOutput];
    // more setup
}
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    connection.videoOrientation = [self getCurrentOrientation]; // setting this to a new value causes the preview layer to freeze momentarily
    [self processSampleBuffer:sampleBuffer]; // some arbitrary image processing method
}
This works as intended as far as the orientation of the pixel buffer is concerned. However, whenever the device is rotated to a new orientation, giving connection.videoOrientation a new value, the preview layer freezes for a fraction of a second. Blocking the delegate method's thread (e.g. by adding a sleep) doesn't freeze the preview layer, so that's not the problem. Any help towards a solution is hugely appreciated!
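One detail worth noting about the code above, not raised in the original post: connection.videoOrientation is assigned on every frame, even when its value has not changed. A minimal sketch of guarding the assignment (reusing the poster's getCurrentOrientation helper) so the connection is only reconfigured when the orientation actually changes:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Only touch the connection when the orientation actually changed,
    // so the capture pipeline is not reconfigured on every frame.
    AVCaptureVideoOrientation desired = [self getCurrentOrientation];
    if (connection.videoOrientation != desired) {
        connection.videoOrientation = desired;
    }
    [self processSampleBuffer:sampleBuffer];
}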

Xcode iOS OpenCV memory leak, app is crashing

I am currently working on iOS with OpenCV.
I am trying to convert a cv::Mat to a UIImage.
But the app is crashing after a few seconds!
(Terminated due to Memory Error)
This is the code I am currently using:
using namespace cv;

#import "ViewController.h"

@interface ViewController ()
@end

@implementation ViewController
{
    CvVideoCamera *videoCamera;
    CADisplayLink *run_loop;
    UIImage *image2;
}

- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    videoCamera = [[CvVideoCamera alloc] initWithParentView:_liveview];
    videoCamera.defaultAVCaptureDevicePosition = AVCaptureDevicePositionBack;
    videoCamera.defaultAVCaptureSessionPreset = AVCaptureSessionPresetiFrame1280x720;
    videoCamera.defaultAVCaptureVideoOrientation = AVCaptureVideoOrientationLandscapeLeft;
    videoCamera.defaultFPS = 30;
    videoCamera.delegate = self;
    [videoCamera start];

    run_loop = [CADisplayLink displayLinkWithTarget:self selector:@selector(update)];
    [run_loop setFrameInterval:2];
    [run_loop addToRunLoop:[NSRunLoop currentRunLoop] forMode:NSRunLoopCommonModes];
}
- (void)didReceiveMemoryWarning
{
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}

- (void)update
{
    _smallliveview.image = image2;
}
- (UIImage *)UIImageFromMat:(cv::Mat)image
{
    cvtColor(image, image, CV_BGR2RGB);
    NSData *data = [NSData dataWithBytes:image.data length:image.elemSize() * image.total()];
    CGColorSpaceRef colorSpace;
    if (image.elemSize() == 1) {
        colorSpace = CGColorSpaceCreateDeviceGray();
    } else {
        colorSpace = CGColorSpaceCreateDeviceRGB();
    }
    CGDataProviderRef provider = CGDataProviderCreateWithCFData((__bridge CFDataRef)data);

    // Creating CGImage from cv::Mat
    CGImageRef imageRef = CGImageCreate(image.cols,                                    // width
                                        image.rows,                                    // height
                                        8,                                             // bits per component
                                        8 * image.elemSize(),                          // bits per pixel
                                        image.step.p[0],                               // bytesPerRow
                                        colorSpace,                                    // colorspace
                                        kCGImageAlphaNone | kCGBitmapByteOrderDefault, // bitmap info
                                        provider,                                      // CGDataProviderRef
                                        NULL,                                          // decode
                                        false,                                         // should interpolate
                                        kCGRenderingIntentDefault);                    // intent

    // Getting UIImage from CGImage
    UIImage *finalImage = [UIImage imageWithCGImage:imageRef];
    //[self.imgView setImage:finalImage];
    CGImageRelease(imageRef);
    CGDataProviderRelease(provider);
    CGColorSpaceRelease(colorSpace);
    return finalImage;
}
#pragma mark - Protocol CvVideoCameraDelegate

#ifdef __cplusplus
- (void)processImage:(Mat &)image
{
    image2 = [self UIImageFromMat:image];
}
#endif

@end
What should I do?
It would be very nice if somebody could help me! (;
Greetings, David
Throw away every bit of what you wrote to create a UIImage and use the MatToUIImage() function instead. Simply pass the Mat to the function, and you have your image.
Although you didn't ask: you shouldn't use a run loop or display link here. Instead, time and initiate any related work from the processImage method that OpenCV calls.
Also, make sure you're using the latest version. This has nothing to do with your problem, but it's good practice.
To import OpenCV 3 into your Xcode 8 project:
Install 'OpenCV2' with CocoaPods (it says '2', but it's still version 3). Don't install the 'devel' build.
Open your project in the workspace CocoaPods created for you (not the project file you created), and rename every implementation file that uses OpenCV to the .mm extension (versus .m). You'll get strange error messages if you don't.
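A minimal sketch of that suggestion, assuming OpenCV 3's iOS helpers are available (MatToUIImage() is declared in opencv2/imgcodecs/ios.h), applied to the asker's delegate callback:

#import <opencv2/imgcodecs/ios.h> // declares MatToUIImage()

- (void)processImage:(cv::Mat &)image
{
    // OpenCV delivers BGR; convert to RGB before handing the frame to UIKit.
    cv::Mat rgb;
    cv::cvtColor(image, rgb, CV_BGR2RGB);
    image2 = MatToUIImage(rgb); // replaces the hand-rolled UIImageFromMat:
}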

Getting NSData from UIImage object that contains CGImage

In order to upload an image file to my server, I need to get its NSData. I am having trouble doing this right now, because my UIImage contains a CGImage.
Let me show you how I am doing things. When a user takes a photo, this is what I do with the captured photo:
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
UIImage *image = [[UIImage alloc] initWithData:imageData];
_subLayer = [CALayer layer];
image = [self selfieCorrection:image];
image = [self rotate:image andOrientation:image.imageOrientation];
CGRect cropRectFinal = CGRectMake(cropRect.origin.x, 140, cropRect.size.width, cropRect.size.height);
CGImageRef imageRef = CGImageCreateWithImageInRect([image CGImage], cropRectFinal);
_subLayer.contents = (id)[UIImage imageWithCGImage:imageRef].CGImage;
In the above code, I create a UIImage and initialize it with the imageData object. I also create a layer object called _subLayer that will display the final image on screen. The image's orientation is then rotated and corrected, and I set up a CGRect to crop the part of the image I want to keep.
The most important parts are the last two statements: I create a CGImageRef called imageRef from the original UIImage, and in the last statement I set the contents property of _subLayer to my final image.
A little further down in this same view controller, inside of an IBAction, I have the following statement which passes the image to my next view controller: sendFriendsVC.finalPhotoToSend = _subLayer.contents;
finalPhotoToSend is a property that I have setup in the header file of my next view controller like this:
@property (nonatomic, retain) UIImage *finalPhotoToSend;
The data is successfully passed when I segue to the next view controller, but when I NSLog finalPhotoToSend, even though it is declared as a UIImage, it prints this to the console:
<CGImage 0x1563dac0>
In order to upload photos to my backend service, I need to create an object using NSData. I have tried these two methods to get the NSData out of the finalPhotoToSend object:
NSData *imageData = UIImageJPEGRepresentation(finalPhotoToSend, 0.7);
and
NSData *imageData = UIImagePNGRepresentation(finalPhotoToSend);
But those always give me the following error in Xcode:
NSInvalidArgumentException', reason: '-[__NSCFType CGImage]: unrecognized selector sent to instance
I am not sure what to do. Is there a different method I should be using to get NSData out of my UIImage, since it is technically holding a CGImage? Should I be doing something differently before I even pass the data to my next view controller?
Edit for Michael:
- (void)prepareForSegue:(UIStoryboardSegue *)segue sender:(id)sender
{
    if ([segue.identifier isEqualToString:@"mediaCaptureToSendToFriendsSegue"]) {
        SendToFriendsViewController *sendFriendsVC = segue.destinationViewController;
        sendFriendsVC.finalPhotoToSend = _subLayer.contents;
    }
}
I just figured it out. Here's what I ended up using for the segue method implementation:
- (void)prepareForSegue:(UIStoryboardSegue *)segue sender:(id)sender
{
    if ([segue.identifier isEqualToString:@"mediaCaptureToSendToFriendsSegue"]) {
        SendToFriendsViewController *sendFriendsVC = segue.destinationViewController;
        UIImage *image2 = [UIImage imageWithCGImage:(__bridge CGImageRef)(_subLayer.contents)];
        sendFriendsVC.finalPhotoToSend = image2;
    }
}
I'm new to Objective-C and not entirely sure what the following statement is even doing:
UIImage *image2 = [UIImage imageWithCGImage:(__bridge CGImageRef)(_subLayer.contents)];
But Xcode suggested it and it works. Now, on the next view controller, the finalPhotoToSend object NSLogs as a UIImage instead of a CGImage.
Better answer:
You need to actually set a UIImage object (and not a CGImage pointer) on a property that you've declared to be a UIImage.
Original:
if "finalPhotoToSend" is a property, you should be doing:
NSData *imageData = UIImageJPEGRepresentation(self.finalPhotoToSend, 0.7);
and not:
NSData *imageData = UIImageJPEGRepresentation(finalPhotoToSend, 0.7);

How to fix the Core Foundation object with a +1 retain count (ARC)?

Here is my method:
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(aSampleBuffer);
CIImage *ciImage = [CIImage imageWithCVPixelBuffer:imageBuffer];
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef myImage = [context createCGImage:ciImage
                                   fromRect:CGRectMake(0, 0,
                                                       CVPixelBufferGetWidth(imageBuffer),
                                                       CVPixelBufferGetHeight(imageBuffer))];
return [UIImage imageWithCGImage:myImage];
But the last line shows me "Potential leak of an object stored into 'myImage'", and the line creating myImage shows "Method returns a Core Foundation object with a +1 retain count". My application is ARC-enabled, so I can't release anything. How can I fix this? Thanks.
My application is ARC enabled, so I can't release something
Wrong. ARC prevents you from sending the release message to Objective-C objects, since it manages their memory for you.
However, you still have to manage memory manually in every other case, e.g. for Core Foundation objects. You can, and must, call the corresponding retain/release functions on such objects where needed.
In this case you have to call CGImageRelease on myImage to balance the retain count, by doing (as proposed by H2CO3):
UIImage *retVal = [UIImage imageWithCGImage:myImage];
CGImageRelease(myImage);
return retVal;
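A variant worth noting (not from the original answer, and assuming iOS 7 or later): CFAutorelease() hands the +1 reference to the enclosing autorelease pool, which reads closer to the usual Objective-C idiom. Here rect stands for the CGRectMake(...) expression from the question:

CGImageRef myImage = [context createCGImage:ciImage fromRect:rect];
// Transfer the +1 reference to the current autorelease pool; the analyzer
// no longer reports a potential leak, and no explicit release is needed.
CFAutorelease(myImage);
return [UIImage imageWithCGImage:myImage];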

Memory Leak iOS (UIImageView,UIImage,CGImage) not freed on ARC

I'm trying to implement a simple video stream, but for some reason my memory won't get freed:
- (void)updateImage:(UIImage *)image
{
    self.indicator.hidden = YES;
    //CGImageRelease([self.imageView.image CGImage]);
    self.imageView.image = nil;
    self.imageView.image = image;
    [self.imageView setNeedsDisplay];
}
If I use
CGImageRelease([self.imageView.image CGImage]);
the memory is freed. But when I return to a previous view controller, the app crashes as it tries to free the memory allocated for that image, which I already freed. This method is called from an async task which creates the image using:
UIImage *image = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);
CGDataProviderRelease(provider);
CFRelease(data);
As I understood it, the UIImage now owns the CGImage and I shouldn't have to release it.
So is there any way to ensure that the UIImage is freed when I update the UIImageView with a new image?
OK, so I finally figured out the problem.
As I said, I was using a background thread to perform the image update; the solution was to wrap it in an autorelease pool, as follows:
@autoreleasepool {
    [self.delegate performSelectorOnMainThread:@selector(updateImage:) withObject:[self fetchImage] waitUntilDone:NO];
}
