I am creating a video in ARKit during the session. When I press the record button, the camera freezes. The code I have written in the didUpdateFrame delegate causes the problem: there I save the scene snapshot into an array. Also, when I create the video from these images, the app crashes with the following message in the debugger:
Message from debugger: Terminated due to memory issue
- (void)session:(ARSession *)session didUpdateFrame:(ARFrame *)frame
{
    if (_recordButton.state == UIControlStateSelected)
    {
        currentState = Recording;
        [self saveImage];
    }
    else if (previousState == Recording)
    {
        NSLog(@"Stop recording");
        currentState = NotRecording;
        recordTime = NULL;
        self.nextButton.enabled = YES;
    }
    // update recording state per frame update
    previousState = currentState;
}
- (void)saveImage
{
    UIImage *image = self.sceneView.snapshot;
    [self.bufferArray addObject:image];
    image = nil;
}
Do not use ARSCNView.snapshot from ARSessionDelegate's didUpdateFrame. I had the same issue, and the solution was not to implement session:didUpdateFrame: at all. I used a CADisplayLink to drive ARSCNView.snapshot instead, and it works well.
I also tried to use ARFrame.capturedImage, but it does not contain the AR objects at all; ARSCNView.snapshot does contain them.
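For reference, here is a minimal sketch of the CADisplayLink approach (the method names and the 30 fps cap are just illustrative, not part of the original answer):

- (void)startCapturing
{
    self.displayLink = [CADisplayLink displayLinkWithTarget:self
                                                   selector:@selector(captureSnapshot:)];
    // Capture below the display refresh rate to reduce load.
    self.displayLink.preferredFramesPerSecond = 30;
    [self.displayLink addToRunLoop:[NSRunLoop mainRunLoop]
                           forMode:NSRunLoopCommonModes];
}

- (void)captureSnapshot:(CADisplayLink *)link
{
    // snapshot renders the scene including the AR objects,
    // unlike ARFrame.capturedImage.
    UIImage *image = self.sceneView.snapshot;
    [self.bufferArray addObject:image];
}

- (void)stopCapturing
{
    [self.displayLink invalidate];
    self.displayLink = nil;
}

Note that accumulating full-size snapshots in an array will still exhaust memory quickly; handing each frame to an AVAssetWriter as it arrives, instead of buffering them all, avoids the memory termination described above.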
I have this camera app.
I take the image from the camera, process it with a filter, and at some point inside captureOutput:didOutputSampleBuffer:fromConnection: I take the final image and write it to a file using this:
CFRetain(sampleBuffer);
[myself writeToVideoImage:resultadoFinal
withSampleBuffer:sampleBuffer];
CFRelease(sampleBuffer);
It works wonderfully, but if I put this inside a queue, like this:
CFRetain(sampleBuffer);
dispatch_async(_writeToVideoQueue, ^{
    [myself writeToVideoImage:resultadoFinal
             withSampleBuffer:sampleBuffer];
    CFRelease(sampleBuffer);
});
it crashes on the line
[_assetWriter startSessionAtSourceTime:presentationTime];
of
- (void)writeToVideoImage:(CIImage *)resultadoFinal
         withSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    CFRetain(sampleBuffer);
    CMTime presentationTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    CGRect extent = [resultadoFinal extent];

    if (!_readyToWriteVideo) {
        _readyToWriteVideo = [self setupAssetWriterVideoInputWithSize:extent.size];
        return;
    }
    else if (_videoWritingStarted) {
        _videoWritingStarted = NO;
        // ***** CRASHES HERE ************
        [_assetWriter startSessionAtSourceTime:presentationTime];
    }

    CVPixelBufferRef renderedOutputPixelBuffer = NULL;
    OSStatus err = CVPixelBufferPoolCreatePixelBuffer(NULL,
                                                      _pixelBufferAdaptor.pixelBufferPool,
                                                      &renderedOutputPixelBuffer);
    if (err) return;

    [_ciContext render:resultadoFinal
       toCVPixelBuffer:renderedOutputPixelBuffer
                bounds:extent
            colorSpace:_sDeviceRgbColorSpace];

    [self writeToFile:renderedOutputPixelBuffer
        comSampleTime:presentationTime
                 size:extent.size];

    CFRelease(renderedOutputPixelBuffer);
    CFRelease(sampleBuffer);
}
I don't have a clue what is going on.
It appears that something is being deallocated. I first suspected sampleBuffer was being deallocated, but I am retaining it both inside and outside the function, just in case. I have also tried creating a copy of resultadoFinal inside the block, before calling the method, with no success.
Xcode shows the error
[AVAssetWriter startSessionAtSourceTime:] Cannot call method when status is 0'
There are questions on SO about that. I have tried all suggestions without success.
Any ideas?
The error "Cannot call method when status is 0" sounds like something isn't initialized properly - remember when you use dispatch_async you are running on a different thread, so you need to initialize your AVAssetWriter on that thread.
Hi, I am using the Sync API to download images from Dropbox and to sync my app data to Dropbox. Now I want to download images from Dropbox with a progress bar. I tried the code below, but I don't get any progress value, and the download completes with the warning shown underneath. This is my code:
DBFile *orignalImg = [[DBFilesystem sharedFilesystem] openFile:imgInfo.imgPath error:nil];
__weak DBFile *oFile = orignalImg;
if (oFile)
{
    [orignalImageArray addObject:oFile]; // keep a reference to the file
}
if (orignalImg.status.cached)
{
    // already displayed
}
else
{
    [orignalImg addObserver:self block:^(void)
    {
        DBFileStatus *fileStatus = oFile.status;
        DBFileStatus *newerStatus = oFile.newerStatus;
        UIImage *aImage = [UIImage imageWithData:[oFile readData:nil]];
        if (fileStatus.cached) // if image downloaded
        {
            // display image
        }
        else if (fileStatus.state == DBFileStateDownloading) // show progress bar
        {
            // show progress
            [self showProgress:strPath andProgressValue:fileStatus.progress];
        }
        else if (newerStatus && newerStatus.state == DBFileStateDownloading) // show progress bar
        {
            [self showProgress:strPath andProgressValue:fileStatus.progress];
        }
    }];
}
The warning is: dropbox_file_wait_for_ready should not be called on the main thread
@rmaddy is right... calling readData before the file has finished downloading will cause that call to block, so you won't see any progress. (That's also presumably what is causing the warning.)
If you don't do that, you should be able to see progress as the file downloads, but it looks like you haven't implemented that part yet.
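Something like this sketch, with readData moved into the cached branch (and the last branch reading newerStatus.progress, which looks like the intent of the original code):

[orignalImg addObserver:self block:^{
    DBFileStatus *fileStatus = oFile.status;
    DBFileStatus *newerStatus = oFile.newerStatus;

    if (fileStatus.cached) {
        // Only read the bytes once the download has finished;
        // before that, readData blocks the calling thread.
        UIImage *aImage = [UIImage imageWithData:[oFile readData:nil]];
        // display aImage ...
    }
    else if (fileStatus.state == DBFileStateDownloading) {
        [self showProgress:strPath andProgressValue:fileStatus.progress];
    }
    else if (newerStatus && newerStatus.state == DBFileStateDownloading) {
        [self showProgress:strPath andProgressValue:newerStatus.progress];
    }
}];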
I am using cocos2d v2 and experiencing some very strange behaviour.
I have a couple of audio tracks which are supposed to play as background music one after another. But I noticed that while these tracks are playing, no updates on screen (rendering) are applied.
For instance, I add a new sprite marker after every new track, but nothing shows on screen until all the tracks are done playing. I also tried displaying the track number using CCLabelBMFont, but that also didn't show anything on screen until all tracks finished playing.
Here's the code:
NSString *keyString;
CCARRAY_FOREACH([[GameManager sharedGameManager] _musicItems], keyString) {
    if ([[[GameManager sharedGameManager] _soundEngine] isBackgroundMusicPlaying]) {
        int waitCycles = 0;
        while (waitCycles < AUDIO_MAX_WAITTIME) {
            [NSThread sleepForTimeInterval:0.1f];
            if (![[[GameManager sharedGameManager] _soundEngine] isBackgroundMusicPlaying]) {
                break;
            }
            waitCycles += 1;
        }
    }
    // play sound file
    CCLOG(@"Playing Sound file: %@", keyString);
    [[GameManager sharedGameManager] playBackgroundTrack:keyString];

    /******** changed to include dispatch: start *********/
    dispatch_async(dispatch_get_main_queue(), ^{
        CCLOG(@"on main thread");
        CCSprite *marker = [CCSprite spriteWithSpriteFrameName:@"marker.png"];
        [marker setPosition:ccp(100 * count, 200)];
        [self addChild:marker z:100];
    });
    /***************** end **********************/
}
EDIT:
Here's the implementation of the audio setup:
- (void)setupAudioEngine {
    if (_hasAudioBeenInitialized) {
        return; // sound engine already initialized
    }
    else {
        _hasAudioBeenInitialized = YES;
        NSOperationQueue *queue = [[NSOperationQueue new] autorelease];
        NSInvocationOperation *asyncSetupOperation =
            [[NSInvocationOperation alloc] initWithTarget:self
                                                 selector:@selector(initAudioAsync)
                                                   object:nil];
        [queue addOperation:asyncSetupOperation];
        [asyncSetupOperation autorelease];
    }
}

- (void)initAudioAsync {
    // Initialize audio engine asynchronously
    CCLOG(@"Audio Manager Initializing");
    _managerSoundState = kAudioManagerInitializing;
    // start audio engine
    [CDSoundEngine setMixerSampleRate:CD_SAMPLE_RATE_HIGH];
    // Init audio manager asynchronously as it can take a few seconds
    // The kAMM_FxPlusMusic mode ensures only this game plays audio
    [CDAudioManager initAsynchronously:kAMM_FxPlusMusic];
    // wait for audio manager to initialize
    while ([CDAudioManager sharedManagerState] != kAMStateInitialised) {
        [NSThread sleepForTimeInterval:0.1];
    }
    CDAudioManager *audioManager = [CDAudioManager sharedManager];
    if (audioManager.soundEngine == nil || audioManager.soundEngine.functioning == NO) {
        CCLOG(@"COCOS Dension failed to init. No audio will play");
        _managerSoundState = kAudioManagerFailed;
    }
    else {
        [audioManager setResignBehavior:kAMRBStopPlay autoHandle:YES];
        _soundEngine = [SimpleAudioEngine sharedEngine];
        _managerSoundState = kAudioManagerReady;
        CCLOG(@"COCOS Dension is ready now");
    }
}
Does anyone have an idea why this is happening?
Your sprites are never drawn because you are blocking the main thread with that sleep loop. You should dispatch the waiting loop asynchronously to a background queue, and when you want to make changes to the UI (adding or manipulating sprites), dispatch back to the main queue.
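For illustration, a sketch of the same loop moved off the main thread (keyString, count, and AUDIO_MAX_WAITTIME are reused from the question; the wait loop is folded into one condition); only the sprite work hops back to the main queue:

dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    NSString *keyString;
    CCARRAY_FOREACH([[GameManager sharedGameManager] _musicItems], keyString) {
        // Sleeping here no longer stalls the cocos2d scheduler or renderer.
        int waitCycles = 0;
        while ([[[GameManager sharedGameManager] _soundEngine] isBackgroundMusicPlaying]
               && waitCycles < AUDIO_MAX_WAITTIME) {
            [NSThread sleepForTimeInterval:0.1f];
            waitCycles += 1;
        }
        [[GameManager sharedGameManager] playBackgroundTrack:keyString];

        // Touch cocos2d nodes only on the main thread.
        dispatch_async(dispatch_get_main_queue(), ^{
            CCSprite *marker = [CCSprite spriteWithSpriteFrameName:@"marker.png"];
            [marker setPosition:ccp(100 * count, 200)];
            [self addChild:marker z:100];
        });
    }
});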
I have a GCD drawing queue to update my OpenGL ES scene which is structured like this:
- (void)drawFrame {
    dispatch_async(drawingQueue, ^{
        if (dispatch_semaphore_wait(frameRenderingSemaphore, DISPATCH_TIME_NOW) != 0) {
            return;
        }
        @autoreleasepool {
            [self startDrawing];
            // drawing code
            [self endDrawing];
        }
        dispatch_semaphore_signal(frameRenderingSemaphore);
    });
}
When the app resigns active or enters the background (both), I stop the OpenGL drawing run loop by invalidating the CADisplayLink.
The problem, however, is that dispatch_async dispatches a drawing block even after the CADisplayLink has been invalidated. When the user presses the home button, my app crashes because it attempted to draw a frame in OpenGL even though iOS had already torn down the context.
Is there a way to kill / pause a GCD queue so it doesn't dispatch anything anymore?
I think the easiest way would be to have a flag in your application that you check before executing the block. For example:
- (void)drawFrame {
    dispatch_async(drawingQueue, ^{
        if (appIsTerminated || appIsInBackground) {
            return;
        }
        if (dispatch_semaphore_wait(frameRenderingSemaphore, DISPATCH_TIME_NOW) != 0) {
            return;
        }
        @autoreleasepool {
            [self startDrawing];
            // drawing code
            [self endDrawing];
        }
        dispatch_semaphore_signal(frameRenderingSemaphore);
    });
}
You can set those values in your app delegate in applicationDidEnterBackground and applicationWillTerminate.
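For instance (the glViewController property and flag names here are hypothetical; wire them up to wherever your drawing code lives):

// In the app delegate; self.glViewController is a hypothetical reference
// to the object that owns drawFrame and the flags.
- (void)applicationDidEnterBackground:(UIApplication *)application {
    self.glViewController.appIsInBackground = YES;
}

- (void)applicationWillEnterForeground:(UIApplication *)application {
    self.glViewController.appIsInBackground = NO;
}

- (void)applicationWillTerminate:(UIApplication *)application {
    self.glViewController.appIsTerminated = YES;
}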
You could also try this:
dispatch_suspend(drawingQueue);
dispatch_release(drawingQueue);
Not quite sure about those, though: dispatch_suspend prevents further queued blocks from starting but does not interrupt a block that is already running, and you must balance the suspend with a resume before the queue is finally released.
Here's all the details: https://developer.apple.com/library/mac/#documentation/Performance/Reference/GCD_libdispatch_Ref/Reference/reference.html
I have the code below that captures JPEG frames at 30 fps and records the video in MP4 format. I'm trying to wrap the processFrame method in a dispatch_async call so that the recording process will not lock up the video player. The problem is that I'm getting Memory Warning level 2 and the app ultimately crashes after a few seconds. I can see that dispatch_async loads the queue into memory as it tries to append each frame to the recorded video output, and at 30 fps it doesn't have enough time to process the frame and release the used memory. I tried using dispatch_after to delay execution of processFrame, but it doesn't help. Any ideas? Should I be doing this differently?
This method gets called around 30 times per second:
// Process the data sent by the server and send follow-up commands if needed
- (void)processServerData:(NSData *)data {
    // render the video in the UIImage control
    UIImage *image = [UIImage imageWithData:data];
    imageCtrl.image = image;
    // record the frame in the background
    dispatch_async(recordingQueue, ^{ [self processFrame:image]; });
}
The processFrame method:
// function for processing each frame for recording
- (void)processFrame:(UIImage *)image {
    if (myRecorder.frameCounter < myRecorder.maxFrames)
    {
        if ([myRecorder.writerInput isReadyForMoreMediaData])
        {
            CMTime frameTime = CMTimeMake(1, myRecorder.timeScale);
            CMTime lastTime = CMTimeMake(myRecorder.frameCounter, myRecorder.timeScale);
            CMTime presentTime = CMTimeAdd(lastTime, frameTime);
            buffer = [Recorder pixelBufferFromCGImage:image.CGImage size:myRecorder.imageSize];
            if (buffer)
            {
                [myRecorder.adaptor appendPixelBuffer:buffer withPresentationTime:presentTime];
                myRecorder.frameCounter++;
                CVBufferRelease(buffer);
                if (myRecorder.frameCounter == myRecorder.maxFrames)
                {
                    [myRecorder finishSession];
                    myRecorder.frameCounter = 0;
                    myRecorder.isRecording = NO;
                }
            }
            else
            {
                NSLog(@"Buffer is empty");
            }
        }
        else
        {
            NSLog(@"adaptor not ready frameCounter=%d ", myRecorder.frameCounter);
        }
    }
}
Solved! I discovered that I can use a dispatch semaphore to prevent the queue from being overloaded with more requests than it has time to release allocated resources for.
Here's my updated code:
long success = dispatch_semaphore_wait(recordingSemaphore, DISPATCH_TIME_FOREVER);
if (success != 0)
{
    NSLog(@"Frame skipped");
}
else
{
    dispatch_async(recordingQueue, ^{
        dispatch_semaphore_signal(recordingSemaphore);
        [self processFrame:image];
    });
}
The dispatch semaphore is created somewhere earlier in my code. Here, I told the semaphore to accept only up to 50 requests at first and to finish processing before accepting any more:
dispatch_semaphore_t recordingSemaphore = dispatch_semaphore_create((long) 50); //so far stable at 50
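One caveat: with DISPATCH_TIME_FOREVER the wait never returns nonzero, so the "Frame skipped" branch above is dead code; the caller simply blocks until a slot frees up. If the goal is to genuinely drop frames once 50 are already in flight, a DISPATCH_TIME_NOW variant, signalling after the work instead of before it, would behave that way:

// Returns nonzero immediately when all 50 slots are taken.
if (dispatch_semaphore_wait(recordingSemaphore, DISPATCH_TIME_NOW) != 0) {
    NSLog(@"Frame skipped");
}
else {
    dispatch_async(recordingQueue, ^{
        [self processFrame:image];
        // Signal only after the frame is processed, so the semaphore
        // counts frames still in flight rather than blocks merely queued.
        dispatch_semaphore_signal(recordingSemaphore);
    });
}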