Memory warning when using dispatch_async in iOS

I have the code below that captures JPEG frames at 30fps and records the video in mp4 format. I'm trying to wrap the processFrame method in a dispatch_async call so that the recording process won't lock up the video player. The problem is that I'm getting Memory Warning level 2, and the app ultimately crashes after a few seconds. I can see that dispatch_async keeps loading blocks into the queue as it tries to append each frame to the recorded video output, and at 30fps it doesn't have enough time to process each frame and release the memory used. I tried using dispatch_after to delay execution of processFrame, but it doesn't help. Any ideas? Should I be doing this differently?
This method gets called around 30 times per second.
//Process the data sent by the server and send follow-up commands if needed
-(void)processServerData:(NSData *)data
{
    //render the frame in the image view
    UIImage *image = [UIImage imageWithData:data];
    imageCtrl.image = image;

    //record the frame in the background
    dispatch_async(recordingQueue, ^{ [self processFrame:image]; });
}
processFrame method
//function for processing each frame for recording
-(void)processFrame:(UIImage *)image
{
    if (myRecorder.frameCounter < myRecorder.maxFrames)
    {
        if ([myRecorder.writerInput isReadyForMoreMediaData])
        {
            CMTime frameTime = CMTimeMake(1, myRecorder.timeScale);
            CMTime lastTime = CMTimeMake(myRecorder.frameCounter, myRecorder.timeScale);
            CMTime presentTime = CMTimeAdd(lastTime, frameTime);
            buffer = [Recorder pixelBufferFromCGImage:image.CGImage size:myRecorder.imageSize];
            if (buffer)
            {
                [myRecorder.adaptor appendPixelBuffer:buffer withPresentationTime:presentTime];
                myRecorder.frameCounter++;
                CVBufferRelease(buffer);
                if (myRecorder.frameCounter == myRecorder.maxFrames)
                {
                    [myRecorder finishSession];
                    myRecorder.frameCounter = 0;
                    myRecorder.isRecording = NO;
                }
            }
            else
            {
                NSLog(@"Buffer is empty");
            }
        }
        else
        {
            NSLog(@"adaptor not ready frameCounter=%d", myRecorder.frameCounter);
        }
    }
}

Solved! I discovered that I can use a dispatch semaphore to keep the queue from being flooded with more requests than it can process, so it has time to release the resources it allocates.
Here's my updated code:
long success = dispatch_semaphore_wait(recordingSemaphore, DISPATCH_TIME_FOREVER);
if (success != 0)
{
    NSLog(@"Frame skipped");
}
else
{
    dispatch_async(recordingQueue, ^{
        //free a slot as soon as this block starts executing,
        //so at most 50 blocks can be waiting in the queue
        dispatch_semaphore_signal(recordingSemaphore);
        [self processFrame:image];
    });
}
The dispatch semaphore is created elsewhere in my code. Here I tell it to allow at most 50 pending requests; processing has to catch up before any more are accepted.
dispatch_semaphore_t recordingSemaphore = dispatch_semaphore_create((long) 50); //so far stable at 50
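One caveat worth noting about the code above: with DISPATCH_TIME_FOREVER, dispatch_semaphore_wait always returns 0, so the "Frame skipped" branch never executes; the caller simply blocks until a slot frees up. If the goal is to actually skip frames rather than block the caller, a variant with DISPATCH_TIME_NOW would look like this (a minimal sketch using the same recordingQueue and recordingSemaphore, with the signal moved after processFrame so the semaphore counts in-flight work too):
//Sketch: drop frames instead of blocking when the queue is saturated.
//Assumes recordingQueue and recordingSemaphore (count of 50) exist as above.
if (dispatch_semaphore_wait(recordingSemaphore, DISPATCH_TIME_NOW) != 0)
{
    NSLog(@"Frame skipped"); //no free slot: drop this frame instead of waiting
}
else
{
    dispatch_async(recordingQueue, ^{
        @autoreleasepool { //drain per-frame allocations promptly
            [self processFrame:image];
        }
        dispatch_semaphore_signal(recordingSemaphore); //free the slot once the frame is done
    });
}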

Related

Saving ARSCNView snapshot in didUpdateFrame method causes camera to freeze

I am creating a video in ARKit during the session. When I press the record button, the camera freezes. The code I have written in the didUpdateFrame delegate causes the problem: there I save scene.snapshot into an array. Also, when I create the video from these images, the app crashes with the following message in the debugger:
Message from debugger: Terminated due to memory issue
-(void)session:(ARSession *)session didUpdateFrame:(ARFrame *)frame
{
    if (_recordButton.state == UIControlStateSelected)
    {
        currentState = Recording;
        [self saveImage];
    }
    else if (previousState == Recording)
    {
        NSLog(@"Stop recording");
        currentState = NotRecording;
        recordTime = NULL;
        self.nextButton.enabled = YES;
    }
    //update recording state per frame update
    previousState = currentState;
}

-(void)saveImage
{
    UIImage *image = self.sceneView.snapshot;
    [self.bufferArray addObject:image];
    image = nil;
}
Do not use ARSCNView.snapshot inside an implementation of ARSessionDelegate.didUpdateFrame. I had the same issue, and the solution was to not implement ARSessionDelegate.didUpdateFrame at all. I used a CADisplayLink with ARSCNView.snapshot instead, and it works well.
I also tried to use ARFrame.capturedImage, but it does not contain the AR objects at all; ARSCNView.snapshot does.
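A minimal sketch of the CADisplayLink approach described above; the displayLink property, the captureSnapshot: selector, and the frame rate are illustrative assumptions, not from the original post:
//Sketch: drive snapshot capture from a CADisplayLink instead of didUpdateFrame.
- (void)startCapturing
{
    self.displayLink = [CADisplayLink displayLinkWithTarget:self
                                                   selector:@selector(captureSnapshot:)];
    self.displayLink.preferredFramesPerSecond = 30; //iOS 10+; throttles capture rate
    [self.displayLink addToRunLoop:[NSRunLoop mainRunLoop]
                           forMode:NSRunLoopCommonModes];
}

- (void)captureSnapshot:(CADisplayLink *)link
{
    if (currentState != Recording) {
        return;
    }
    //the snapshot includes rendered AR objects, unlike ARFrame.capturedImage
    [self.bufferArray addObject:self.sceneView.snapshot];
}

- (void)stopCapturing
{
    [self.displayLink invalidate]; //also removes the link from the run loop
    self.displayLink = nil;
}
Note that even with this change, accumulating every snapshot in bufferArray keeps memory growing at the capture rate; feeding frames to the video writer incrementally, rather than building the whole array first, is what ultimately avoids the memory termination.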

Why a method works perfectly but crashes if I put it inside a dispatch_async?

I have this camera app.
I take the image from the camera, process it with a filter, and at some point inside captureOutput:didOutputSampleBuffer:fromConnection: I take the final image and write it to a file using this:
CFRetain(sampleBuffer);
[myself writeToVideoImage:resultadoFinal
         withSampleBuffer:sampleBuffer];
CFRelease(sampleBuffer);
This works wonderfully, but if I put it inside a queue, like this:
CFRetain(sampleBuffer);
dispatch_async(_writeToVideoQueue, ^{
    [myself writeToVideoImage:resultadoFinal
             withSampleBuffer:sampleBuffer];
    CFRelease(sampleBuffer);
});
it crashes on the line
[_assetWriter startSessionAtSourceTime:presentationTime];
of
- (void)writeToVideoImage:(CIImage *)resultadoFinal
         withSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    CFRetain(sampleBuffer);
    CMTime presentationTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    CGRect extent = [resultadoFinal extent];
    if (!_readyToWriteVideo) {
        _readyToWriteVideo = [self setupAssetWriterVideoInputWithSize:extent.size];
        return; //NB: this early return skips the CFRelease below, leaking the retain above
    }
    else if (_videoWritingStarted) {
        _videoWritingStarted = NO;
        // ***** CRASHES HERE ************
        [_assetWriter startSessionAtSourceTime:presentationTime];
    }
    CVPixelBufferRef renderedOutputPixelBuffer = NULL;
    OSStatus err = CVPixelBufferPoolCreatePixelBuffer(NULL,
                                                      _pixelBufferAdaptor.pixelBufferPool,
                                                      &renderedOutputPixelBuffer);
    if (err) return; //NB: also leaks the retained sampleBuffer
    [_ciContext render:resultadoFinal
       toCVPixelBuffer:renderedOutputPixelBuffer
                bounds:extent
            colorSpace:_sDeviceRgbColorSpace];
    [self writeToFile:renderedOutputPixelBuffer
        comSampleTime:presentationTime
                 size:extent.size];
    CFRelease(renderedOutputPixelBuffer);
    CFRelease(sampleBuffer);
}
I don't have a clue of what is going on.
It appears that something is being deallocated. I first suspected sampleBuffer, but I am retaining it both inside and outside the function, just in case. I also tried creating a copy of resultadoFinal inside the block before calling the method, with no success.
Xcode shows the error
[AVAssetWriter startSessionAtSourceTime:] Cannot call method when status is 0'
There are questions on SO about that. I have tried all suggestions without success.
Any ideas?
The error "Cannot call method when status is 0" sounds like something isn't initialized properly - remember when you use dispatch_async you are running on a different thread, so you need to initialize your AVAssetWriter on that thread.

Auto-repeating count down timer in ReactiveCocoa

I'm new to ReactiveCocoa and there is a problem I couldn't yet find a way to solve. I have a network request in my app that returns data to be encoded in a QR code that is valid for only 30 seconds. The network request returns a RACSignal, and I send the data to be encoded in that signal to my view model. In the view model I map that data to a QR image and expose it as a property on the view model interface.
After I create the QR image, I want to update a timeLeftString property that says "This code is valid for only 30 seconds", where the seconds count down as time progresses. Once the 30 seconds are up, I want to make another request to fetch new QR code data valid for another 30 seconds, and after that completes another request, and so on, until the screen is dismissed. How do I go about implementing this?
Currently I have this to get the data:
- (RACSignal *)newPaymentSignal
{
    @weakify(self);
    return [[[[APIManager sharedManager] newPayment] map:^id(NSString *paymentToken) {
        ZXMultiFormatWriter *writer = [ZXMultiFormatWriter writer];
        ZXBitMatrix *result =
            [writer encode:paymentToken format:kBarcodeFormatQRCode width:250 height:250 error:nil];
        if (!result) {
            return nil;
        }
        CGImageRef cgImage = [[ZXImage imageWithMatrix:result] cgimage];
        UIImage *image = [UIImage imageWithCGImage:cgImage];
        return UIImagePNGRepresentation(image);
    }] doNext:^(NSData *data) {
        @strongify(self);
        self.qrImageData = data;
    }];
}
and this for the timer:
- (RACSignal *)timeRemainingSignal
{
    @weakify(self);
    return [[[RACSignal interval:0.5 onScheduler:[RACScheduler scheduler]] //
        startWith:[NSDate date]] //
        initially:^{
            @strongify(self);
            self.expiryDate = [[NSDate date] dateByAddingTimeInterval:30];
        }];
}
The flow is: get data from the API, start the timer, and when the time is up make a new request to get new data and start the timer again, repeating forever.
1- How do I start the timer after I get data from the API?
2- How do I make this flow repeat forever?
3- How do I stop the timer before the 30 seconds complete and restart the flow from the beginning if the user taps a button in the user interface?
4- I have an expiryDate property, which is the current date plus 30 seconds, because I thought I would take the difference between expiryDate and [NSDate date] to decide whether the time is up - is there a better way to implement this?
5- How do I break the flow when it's repeating forever and unsubscribe from everything when the screen is dismissed (or, say, when the user taps another button)?
Thanks so much in advance for the answers.
I think the missing piece of the puzzle is the very useful flattenMap operator. It essentially replaces each next from its incoming signal with the nexts of the signal returned by its block.
Here's one approach to solving your problem (I replaced your newPaymentSignal method with a simple signal sending a string):
- (RACSignal *)newPaymentSignal
{
    return [[RACSignal return:@"token"] delay:2];
}

- (void)start
{
    NSInteger refreshInterval = 30;
    RACSignal *refreshTokenTimerSignal =
        [[RACSignal interval:refreshInterval onScheduler:[RACScheduler mainThreadScheduler]]
            startWith:[NSDate date]];

    [[[[refreshTokenTimerSignal
        flattenMap:^RACStream *(id _)
        {
            return [self newPaymentSignal];
        }]
        map:^NSDate *(NSString *paymentToken)
        {
            // display paymentToken here
            NSLog(@"%@", paymentToken);
            return [[NSDate date] dateByAddingTimeInterval:refreshInterval];
        }]
        flattenMap:^RACStream *(NSDate *expiryDate)
        {
            return [[[[RACSignal interval:1 onScheduler:[RACScheduler mainThreadScheduler]]
                startWith:[NSDate date]]
                takeUntil:[refreshTokenTimerSignal skip:1]]
                map:^NSNumber *(NSDate *now)
                {
                    return @([expiryDate timeIntervalSinceDate:now]);
                }];
        }]
        subscribeNext:^(NSNumber *remaining)
        {
            // update timer readout here
            NSLog(@"%@", remaining);
        }];
}
Every time the outer refreshTokenTimerSignal fires, it gets mapped to a new newPaymentSignal, which in turn, when it returns a value, gets mapped to an expiry date, which is used to create a new "inner" timer signal that fires every second.
The takeUntil operator on the inner timer completes that signal as soon as the outer refresh timer sends a next.
(One peculiar thing here was that I had to add a skip:1 to the refreshTokenTimerSignal, otherwise the inner timer never got started. I would have expected it to work even without the skip:1, maybe someone better versed in the internals of RAC could explain why this is.)
To break the flow of the outer signal in response to various events, you can experiment with using takeUntil and takeUntilBlock on that too.
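For questions 3 and 5 specifically, one option is a manual trigger: pipe the chain through takeUntil with a RACSubject you own and send on it when the button is tapped or the screen is dismissed. A minimal sketch, where cancelSignal is a hypothetical subject and countdownSignal stands in for the composed signal built in -start above:
//Sketch: manual cancellation trigger; cancelSignal and countdownSignal are
//illustrative names, not part of the original answer.
RACSubject *cancelSignal = [RACSubject subject];

[[countdownSignal takeUntil:cancelSignal]
    subscribeNext:^(NSNumber *remaining) {
        NSLog(@"%@", remaining);
    }];

//later, on a button tap or screen dismissal:
[cancelSignal sendNext:nil];
Alternatively, ReactiveCocoa's rac_willDeallocSignal can serve as the takeUntil trigger, so everything unsubscribes automatically when the owning object deallocates.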

Strange behavior of background music in cocos2d iOS

I am using cocos2d v2 and experiencing very strange behaviour.
I have a couple of audio tracks that are supposed to be played as background music one after another. But I noticed that while these tracks are playing in the background, no updates to the screen (rendering) take place.
For instance, I added a new sprite marker after every new track, but nothing showed on screen until all the tracks were done playing. I also tried displaying the track number using CCLabelBMFont, but that also didn't show anything on screen until all tracks finished playing.
Here's the code:
NSString *keyString;
CCARRAY_FOREACH([[GameManager sharedGameManager] _musicItems], keyString) {
    if ([[[GameManager sharedGameManager] _soundEngine] isBackgroundMusicPlaying]) {
        int waitCycles = 0;
        while (waitCycles < AUDIO_MAX_WAITTIME) {
            [NSThread sleepForTimeInterval:0.1f];
            if (![[[GameManager sharedGameManager] _soundEngine] isBackgroundMusicPlaying]) {
                break;
            }
            waitCycles += 1;
        }
    }

    //play sound file
    CCLOG(@"Playing Sound file: %@", keyString);
    [[GameManager sharedGameManager] playBackgroundTrack:keyString];

    /******** changed to include dispatch: start *********/
    dispatch_async(dispatch_get_main_queue(), ^{
        CCLOG(@"on main thread");
        CCSprite *marker = [CCSprite spriteWithSpriteFrameName:@"marker.png"];
        [marker setPosition:ccp(100 * count, 200)];
        [self addChild:marker z:100];
    });
    /***************** end **********************/
}
EDIT:
Here's the implementation of the audio setup:
-(void)setupAudioEngine
{
    if (_hasAudioBeenInitialized) {
        return; //sound engine already initialized
    }
    else {
        _hasAudioBeenInitialized = YES;
        NSOperationQueue *queue = [[NSOperationQueue new] autorelease];
        NSInvocationOperation *asyncSetupOperation =
            [[NSInvocationOperation alloc] initWithTarget:self
                                                 selector:@selector(initAudioAsync)
                                                   object:nil];
        [queue addOperation:asyncSetupOperation];
        [asyncSetupOperation autorelease];
    }
}

-(void)initAudioAsync
{
    //Initialize audio engine asynchronously
    CCLOG(@"Audio Manager Initializing");
    _managerSoundState = kAudioManagerInitializing;

    //start audio engine
    [CDSoundEngine setMixerSampleRate:CD_SAMPLE_RATE_HIGH];

    //Init audio manager asynchronously as it can take a few seconds
    //The kAMM_FxPlusMusic mode ensures only this game plays audio
    [CDAudioManager initAsynchronously:kAMM_FxPlusMusic];

    //wait for audio manager to initialize
    while ([CDAudioManager sharedManagerState] != kAMStateInitialised) {
        [NSThread sleepForTimeInterval:0.1];
    }

    CDAudioManager *audioManager = [CDAudioManager sharedManager];
    if (audioManager.soundEngine == nil || audioManager.soundEngine.functioning == NO) {
        CCLOG(@"CocosDenshion failed to init. No audio will play");
        _managerSoundState = kAudioManagerFailed;
    }
    else {
        [audioManager setResignBehavior:kAMRBStopPlay autoHandle:YES];
        _soundEngine = [SimpleAudioEngine sharedEngine];
        _managerSoundState = kAudioManagerReady;
        CCLOG(@"CocosDenshion is ready now");
    }
}
Does anyone have ideas why this is happening?
Your sprites are never drawn because you are blocking the main thread with those sleep calls. You should dispatch the waiting loop asynchronously to a background queue and, when you want to make changes to the UI (adding or manipulating sprites), dispatch back to the main queue.
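A minimal sketch of that restructuring, reusing the question's own identifiers (the global queue choice is illustrative, and the simplified wait loop drops the AUDIO_MAX_WAITTIME guard for brevity):
//Sketch: run the blocking wait loop on a background queue; touch the scene
//graph only on the main queue.
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    int count = 0;
    NSString *keyString;
    CCARRAY_FOREACH([[GameManager sharedGameManager] _musicItems], keyString) {
        //sleeping here no longer stalls rendering: we are off the main thread
        while ([[[GameManager sharedGameManager] _soundEngine] isBackgroundMusicPlaying]) {
            [NSThread sleepForTimeInterval:0.1f];
        }
        [[GameManager sharedGameManager] playBackgroundTrack:keyString];

        const int markerIndex = ++count;
        dispatch_async(dispatch_get_main_queue(), ^{
            //sprite and label updates must happen on the main thread
            CCSprite *marker = [CCSprite spriteWithSpriteFrameName:@"marker.png"];
            [marker setPosition:ccp(100 * markerIndex, 200)];
            [self addChild:marker z:100];
        });
    }
});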

How to tear down a GCD queue?

I have a GCD drawing queue to update my OpenGL ES scene which is structured like this:
- (void)drawFrame {
    dispatch_async(drawingQueue, ^{
        if (dispatch_semaphore_wait(frameRenderingSemaphore, DISPATCH_TIME_NOW) != 0) {
            return;
        }
        @autoreleasepool {
            [self startDrawing];
            // drawing code
            [self endDrawing];
        }
        dispatch_semaphore_signal(frameRenderingSemaphore);
    });
}
When the app resigns active or enters the background (both), I stop the OpenGL drawing run loop by invalidating the CADisplayLink.
The problem, however, is that dispatch_async keeps dispatching drawing blocks even after the CADisplayLink has been invalidated. When the user presses the home button, my app crashes because it attempts to draw an OpenGL frame even though iOS has already torn down the context.
Is there a way to kill or pause a GCD queue so it doesn't dispatch anything anymore?
I think the easiest way would be to have a flag in your application that you check before executing the block. For example:
- (void)drawFrame {
    dispatch_async(drawingQueue, ^{
        if (appIsTerminated || appIsInBackground) {
            return;
        }
        if (dispatch_semaphore_wait(frameRenderingSemaphore, DISPATCH_TIME_NOW) != 0) {
            return;
        }
        @autoreleasepool {
            [self startDrawing];
            // drawing code
            [self endDrawing];
        }
        dispatch_semaphore_signal(frameRenderingSemaphore);
    });
}
You can set those values in your app delegate in applicationDidEnterBackground and applicationWillTerminate.
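A sketch of that flag wiring (the property names and the view-controller reference are assumptions, not from the original answer):
//Sketch: setting the flags from the app delegate, as described above.
//appIsInBackground / appIsTerminated are assumed properties visible to drawFrame.
- (void)applicationDidEnterBackground:(UIApplication *)application
{
    self.glViewController.appIsInBackground = YES;
}

- (void)applicationWillEnterForeground:(UIApplication *)application
{
    self.glViewController.appIsInBackground = NO;
}

- (void)applicationWillTerminate:(UIApplication *)application
{
    self.glViewController.appIsTerminated = YES;
}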
You could also try this:
dispatch_suspend(drawingQueue);
dispatch_release(drawingQueue);
Not quite sure about those though.
Here's all the details: https://developer.apple.com/library/mac/#documentation/Performance/Reference/GCD_libdispatch_Ref/Reference/reference.html
