How to stream video as it is being recorded? - iOS

Okay, so in my app I have a ViewController which handles recording a video from the camera and then saves it into the Documents directory of my application's folder. What I want to do is, while recording the video, simultaneously upload parts of the file currently being written to a server (I am new at this stuff, but I am guessing an HTTP server). The reason I am doing this is that I wish to add support for streaming to the Chromecast while videoing. This should be possible because the EZCast app already performs a similar function.
I have already worked out how to upload video to an HTTP server, send a video from the HTTP server to the Chromecast, and actually record the video, using these sources:
Chromecast: https://developers.google.com/cast/
Chromecast: https://github.com/googlecast/CastVideos-ios
HTTP server: https://github.com/robbiehanson/CocoaHTTPServer
Recording from the iDevice camera: https://github.com/BradLarson/GPUImage
To cast a video I obviously have to connect first; my app requires an established connection before it lets me reach the record view, so my code to simply cast an .mp4 video looks like this:
-(void)startCasting
{
[self establishServer];
self.mediaControlChannel = [[GCKMediaControlChannel alloc] init];
self.mediaControlChannel.delegate = self;
[self.deviceManager addChannel:self.mediaControlChannel];
[self.mediaControlChannel requestStatus];
NSString *path = [NSString stringWithFormat:@"http://%@%@%hu%@%@", [self getIPAddress], @":", [httpServer listeningPort], @"/", @"Movie.mp4"];
NSString *image;
NSString *type;
self.metadata = [[GCKMediaMetadata alloc] init];
image = @"";//Image HERE
[self.metadata setString:@"" forKey:kGCKMetadataKeySubtitle];//Description HERE
type = @"video/mp4";//Video type
[self.metadata setString:[NSString stringWithFormat:@"%@%@", @"Casting ", @"Movie.mp4"] forKey:kGCKMetadataKeyTitle];//Title HERE
//define media information
GCKMediaInformation *mediaInformation =
[[GCKMediaInformation alloc] initWithContentID:path
streamType:GCKMediaStreamTypeNone
contentType:type
metadata:self.metadata
streamDuration:0
customData:nil];
//cast video
[self.mediaControlChannel loadMedia:mediaInformation autoplay:TRUE playPosition:0];
}
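(As a side note: since the file being cast is still growing while recording, one possible variation, which I have not verified, is to describe the media to the receiver as a live stream so it does not expect a fixed duration. The initializer is the same one used above; only the stream type changes.)
//Sketch, untested: describe the growing file as a live stream instead
GCKMediaInformation *liveInformation =
[[GCKMediaInformation alloc] initWithContentID:path
streamType:GCKMediaStreamTypeLive //instead of GCKMediaStreamTypeNone
contentType:type
metadata:self.metadata
streamDuration:0
customData:nil];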
- (NSString *)getIPAddress {
NSString *address = @"error";
struct ifaddrs *interfaces = NULL;
struct ifaddrs *temp_addr = NULL;
int success = 0;
// retrieve the current interfaces - returns 0 on success
success = getifaddrs(&interfaces);
if (success == 0) {
// Loop through linked list of interfaces
temp_addr = interfaces;
while(temp_addr != NULL) {
if(temp_addr->ifa_addr->sa_family == AF_INET) {
// Check if interface is en0 which is the wifi connection on the iPhone
if([[NSString stringWithUTF8String:temp_addr->ifa_name] isEqualToString:@"en0"]) {
// Get NSString from C String
address = [NSString stringWithUTF8String:inet_ntoa(((struct sockaddr_in *)temp_addr->ifa_addr)->sin_addr)];
}
}
temp_addr = temp_addr->ifa_next;
}
}
// Free memory
freeifaddrs(interfaces);
return address;
}
Now, before casting, I need to establish my HTTP server. This is simple and requires only a small amount of implementation after adding CocoaHTTPServer to your project. My code to start the server looks like this:
static const int ddLogLevel = LOG_LEVEL_VERBOSE;
-(void)establishServer
{
[httpServer stop];
// Do any additional setup after loading the view from its nib.
// Configure our logging framework.
// To keep things simple and fast, we're just going to log to the Xcode console.
[DDLog addLogger:[DDTTYLogger sharedInstance]];
// Create server using our custom MyHTTPServer class
httpServer = [[HTTPServer alloc] init];
// Tell the server to broadcast its presence via Bonjour.
// This allows browsers such as Safari to automatically discover our service.
[httpServer setType:@"_http._tcp."];
// Normally there's no need to run our server on any specific port.
// Technologies like Bonjour allow clients to dynamically discover the server's port at runtime.
// However, for easy testing you may want to force a certain port so you can just hit the refresh button.
// [httpServer setPort:12345];
// Serve files from our embedded Web folder
NSString *webPath = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/"];
DDLogInfo(@"Setting document root: %@", webPath);
[httpServer setDocumentRoot:webPath];
[self startServer];
}
- (void)startServer
{
// Start the server (and check for problems)
NSError *error;
if([httpServer start:&error])
{
DDLogInfo(@"Started HTTP Server on port %hu", [httpServer listeningPort]);
}
else
{
DDLogError(@"Error starting HTTP Server: %@", error);
}
}
Lastly, I use this code to begin displaying and recording from the iPhone's camera:
- (void)viewDidLoad
{
[super viewDidLoad];
videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];
// videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionFront];
// videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset1280x720 cameraPosition:AVCaptureDevicePositionBack];
// videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset1920x1080 cameraPosition:AVCaptureDevicePositionBack];
videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
videoCamera.horizontallyMirrorFrontFacingCamera = NO;
videoCamera.horizontallyMirrorRearFacingCamera = NO;
// filter = [[GPUImageSepiaFilter alloc] init];
// filter = [[GPUImageTiltShiftFilter alloc] init];
// [(GPUImageTiltShiftFilter *)filter setTopFocusLevel:0.65];
// [(GPUImageTiltShiftFilter *)filter setBottomFocusLevel:0.85];
// [(GPUImageTiltShiftFilter *)filter setBlurSize:1.5];
// [(GPUImageTiltShiftFilter *)filter setFocusFallOffRate:0.2];
// filter = [[GPUImageSketchFilter alloc] init];
filter = [[GPUImageFilter alloc] init];
// filter = [[GPUImageSmoothToonFilter alloc] init];
// GPUImageRotationFilter *rotationFilter = [[GPUImageRotationFilter alloc] initWithRotation:kGPUImageRotateRightFlipVertical];
[videoCamera addTarget:filter];
GPUImageView *filterView = (GPUImageView *)self.view;
// filterView.fillMode = kGPUImageFillModeStretch;
// filterView.fillMode = kGPUImageFillModePreserveAspectRatioAndFill;
// Record a movie for 10 s and store it in /Documents, visible via iTunes file sharing
NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.mp4"];
unlink([pathToMovie UTF8String]); // If a file already exists, AVAssetWriter won't let you record new frames, so delete the old movie
NSURL *movieURL = [NSURL fileURLWithPath:pathToMovie];
movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(480.0, 640.0)];
movieWriter.encodingLiveVideo = YES;
// movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(640.0, 480.0)];
// movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(720.0, 1280.0)];
// movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(1080.0, 1920.0)];
[filter addTarget:movieWriter];
[filter addTarget:filterView];
[videoCamera startCameraCapture];
}
bool recording;
- (IBAction)Record:(id)sender
{
if (recording == YES)
{
Record.titleLabel.text = @"Record";
recording = NO;
double delayInSeconds = 0.1;
dispatch_time_t stopTime = dispatch_time(DISPATCH_TIME_NOW, delayInSeconds * NSEC_PER_SEC);
dispatch_after(stopTime, dispatch_get_main_queue(), ^(void){
[filter removeTarget:movieWriter];
videoCamera.audioEncodingTarget = nil;
[movieWriter finishRecording];
NSLog(@"Movie completed");
// [videoCamera.inputCamera lockForConfiguration:nil];
// [videoCamera.inputCamera setTorchMode:AVCaptureTorchModeOff];
// [videoCamera.inputCamera unlockForConfiguration];
});
UIAlertView *message = [[UIAlertView alloc] initWithTitle:@"Do You Wish To Store This Footage?"
message:@"Recording has finished. Do you wish to store this video into your camera roll?"
delegate:self
cancelButtonTitle:nil
otherButtonTitles:@"Yes", @"No", nil];
[message show];
[self dismissViewControllerAnimated:YES completion:nil];
}
else
{
double delayToStartRecording = 0.5;
dispatch_time_t startTime = dispatch_time(DISPATCH_TIME_NOW, delayToStartRecording * NSEC_PER_SEC);
dispatch_after(startTime, dispatch_get_main_queue(), ^(void){
NSLog(@"Start recording");
videoCamera.audioEncodingTarget = movieWriter;
[movieWriter startRecording];
// NSError *error = nil;
// if (![videoCamera.inputCamera lockForConfiguration:&error])
// {
// NSLog(#"Error locking for configuration: %#", error);
// }
// [videoCamera.inputCamera setTorchMode:AVCaptureTorchModeOn];
// [videoCamera.inputCamera unlockForConfiguration];
recording = YES;
Record.titleLabel.text = @"Stop";
});
[self startCasting];
}
}
Now, as you can probably see, I am attempting to cast the video being recorded straight after recording starts, pointing the server at its location. This is not working, because I believe the file is not officially located at that path until the stop button has been pressed. How can I fix this? Can anybody help?
ChromeCast supported media types:
https://developers.google.com/cast/docs/media
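One way to sanity-check the suspicion above (that Movie.mp4 is not yet a usable file at that path while recording) is to inspect the file just before starting the cast. This is only a diagnostic sketch; it assumes pathToMovie is the same path used in viewDidLoad:
//Diagnostic sketch: does the file the HTTP server is about to serve actually exist yet?
NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.mp4"];
NSDictionary *attrs = [[NSFileManager defaultManager] attributesOfItemAtPath:pathToMovie error:nil];
if (attrs == nil || [attrs fileSize] == 0) {
    NSLog(@"Movie.mp4 is not readable yet - the server has nothing complete to serve");
} else {
    NSLog(@"Movie.mp4 currently has %llu bytes on disk", [attrs fileSize]);
}
Even when the file does exist, an MP4 written by AVAssetWriter is generally not playable until finishRecording finalizes it, which would be consistent with the cast only working after Stop has been pressed.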

Related

Black screen appears before the actual video on re-recording

I am recording a video in my app whose recording time is sent by the server. The server tests for certain gestures in the video and, if it fails, sends back a flag asking to re-record and resend.
Problem: the initial video has the correct duration. But in the event of a server 'retry request', the videos are prepended with a black screen for a certain amount of time, and only then does the recorded video play.
flow:
- (void)viewDidLoad {
[super viewDidLoad];
// get recording time
[self getRecordingData];
}
recording function:
-(void)startRecording {
self.isRecording = YES;
self.isProcessing = YES;
[self startWritingToVideoFile];
[self logicMethodToStopRecording];
}
writing to video file function:
-(void)startWritingToVideoFile{
NSDictionary *outputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:640], AVVideoWidthKey, [NSNumber numberWithInt:480], AVVideoHeightKey, AVVideoCodecTypeH264, AVVideoCodecKey,nil];
self.assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:outputSettings];
[self.assetWriterInput setTransform: CGAffineTransformMakeRotation( ( 90 * M_PI ) / 180 )];
self.pixelBufferAdaptor =
[[AVAssetWriterInputPixelBufferAdaptor alloc]
initWithAssetWriterInput:self.assetWriterInput
sourcePixelBufferAttributes:
[NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:kCVPixelFormatType_32BGRA],
kCVPixelBufferPixelFormatTypeKey,
nil]];
self.videoPath = [Utilities pathForTemporaryFileWithSuffix:@"mov"];
NSURL *videoURL = [NSURL fileURLWithPath:self.videoPath];
/* Asset writer with MPEG4 format*/
self.assetWriterMyData = [[AVAssetWriter alloc]
initWithURL: videoURL
fileType:AVFileTypeMPEG4
error:nil];
[self.assetWriterMyData addInput:self.assetWriterInput];
self.assetWriterInput.expectsMediaDataInRealTime = YES;
[self.assetWriterMyData startWriting];
[self.assetWriterMyData startSessionAtSourceTime:kCMTimeZero];
self.isReadyToWrite = YES;
}
logic method to stop recording:
-(void)logicMethodToStopRecording{
float time = [self.timeFromServer floatValue];
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, time * NSEC_PER_SEC), dispatch_get_main_queue(), ^{
[self stopRecording];
});
}
actual stop recording Method:
-(void)stopRecording{
[self stopWritingToVideoFile];
}
-(void)stopWritingToVideoFile {
[self.assetWriterMyData finishWritingWithCompletionHandler:^{
[self showLoading];
if(!self.continueRunning){
return;
}
[self sendVideo];
// sends video file referenced in startWritingToVideoFile method
}];
}
The sending video method:
-(void)sendVideo{
[self.xxxxxx sendAPI:self.userToVerifyUserId videoPath:self.videoPath callback:^(NSString *jsonResponse){
[self handleResponse:jsonResponse];
}];
}
The response is handled by the following method:
-(void)handleResponse: (NSString*)result{
[Utilities deleteFile:self.videoPath];
//if no success but retry is true. re-record and re-send.
if(!self.success && retry){
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, 3.0 * NSEC_PER_SEC), dispatch_get_main_queue(), ^{
[self startRecording]; // start recording called again
});
}
}
The initial file I create has, say, x seconds of video, and it is being generated correctly.
However, during re-attempts:
I am deleting the old file and recreating new files, yet the video output of the second and third attempts has a black screen for n+x and 2n+x respectively before the footage starts. Please help.
************** EDIT ****************************************************
Additionally the callback method for AVCaptureSession does the following:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
fromConnection:(AVCaptureConnection *)connection {
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
// a very dense way to keep track of the time at which this frame
// occurs relative to the output stream, but it's just an example!
static int64_t frameNumber = 0;
if(self.assetWriterInput.readyForMoreMediaData){
if(self.pixelBufferAdaptor != nil){
[self.pixelBufferAdaptor appendPixelBuffer:imageBuffer
withPresentationTime:CMTimeMake(frameNumber, 25)];
frameNumber++;
}
}
}
**************************EDIT - 2 *********************************************
+(NSString *)pathForTemporaryFileWithSuffix:(NSString *)suffix
{
NSString * result;
CFUUIDRef uuid;
CFStringRef uuidStr;
uuid = CFUUIDCreate(NULL);
assert(uuid != NULL);
uuidStr = CFUUIDCreateString(NULL, uuid);
assert(uuidStr != NULL);
result = [NSTemporaryDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"%@.%@", uuidStr, suffix]];
assert(result != nil);
CFRelease(uuidStr);
CFRelease(uuid);
return result;
}
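A hedged guess at where the black lead-in comes from, assuming the static frameNumber in captureOutput:didOutputSampleBuffer: above is the cause: the counter keeps its value across attempts while each new writer session starts at kCMTimeZero, so the first frame of a retry is stamped with an already large presentation time and the player shows black until it reaches it. One sketch of a fix is to stamp frames with the sample buffer's own time, measured from the first frame of the current attempt; the properties self.hasFirstSample and self.firstSampleTime are illustrative, not from the original code:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
fromConnection:(AVCaptureConnection *)connection {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CMTime timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    if (!self.hasFirstSample) {
        // remember when this attempt actually started
        self.firstSampleTime = timestamp;
        self.hasFirstSample = YES;
    }
    if (self.isReadyToWrite && self.assetWriterInput.readyForMoreMediaData && self.pixelBufferAdaptor != nil) {
        // presentation time relative to the start of this attempt, so every retry begins at zero
        CMTime relativeTime = CMTimeSubtract(timestamp, self.firstSampleTime);
        [self.pixelBufferAdaptor appendPixelBuffer:imageBuffer withPresentationTime:relativeTime];
    }
}
For this to work, self.hasFirstSample would also need to be reset to NO in startWritingToVideoFile so that each attempt re-anchors its timeline.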

The AVAudioEngine gets data from the network and plays it with noise or silence

I want to use AVAudioEngine to play data from the network, but when I receive the data and use scheduleBuffer to enqueue it, there is noise or the sound simply breaks up.
//init
- (void)viewDidLoad {
[super viewDidLoad];
[self initWebSocket:server_ip];
self.engine = [[AVAudioEngine alloc] init];
self.playerNode = [[AVAudioPlayerNode alloc] init];
[self.engine attachNode:self.playerNode];
self.format = [[AVAudioFormat alloc] initWithCommonFormat:AVAudioPCMFormatFloat32
sampleRate:(double)48000.0
channels:(AVAudioChannelCount)1
interleaved:NO];
AVAudioMixerNode *mainMixer = [self.engine mainMixerNode];
[self.engine connect:self.playerNode to:mainMixer format:self.format];
if (!self.engine.isRunning) {
[self.engine prepare];
NSError *error;
BOOL success;
success = [self.engine startAndReturnError:&error];
NSAssert(success, @"couldn't start engine, %@", [error localizedDescription]);
}
[self.playerNode play];
}
// get data from network
- (void)webSocket:(SRWebSocket *)webSocket didReceiveMessage:(id)message{
NSData *data = message;
AVAudioPCMBuffer *pcmBuffer = [[AVAudioPCMBuffer alloc]
initWithPCMFormat:self.format
frameCapacity:(uint32_t)(data.length) / self.format.streamDescription->mBytesPerFrame];
pcmBuffer.frameLength = pcmBuffer.frameCapacity;
[data getBytes:*pcmBuffer.floatChannelData length:data.length];
[self.playerNode scheduleBuffer:pcmBuffer completionHandler:nil];
}
You are setting frameLength to four times as many frames as the data actually contains.
Your audio format uses Float32 (32-bit, i.e. 4 bytes per sample), but NSData is a stream of UInt8 (8-bit) bytes.
So pcmBuffer.frameLength must be data.length / 4.
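A sketch of the corrected buffer creation the answer describes, assuming the incoming NSData already holds mono Float32 PCM at 48 kHz (i.e. it matches self.format): one frame is one Float32 sample, so the frame count is the byte length divided by 4.
- (void)webSocket:(SRWebSocket *)webSocket didReceiveMessage:(id)message{
    NSData *data = message;
    // One mono Float32 frame occupies sizeof(float) == 4 bytes
    AVAudioFrameCount frameCount = (AVAudioFrameCount)(data.length / sizeof(float));
    AVAudioPCMBuffer *pcmBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:self.format
                                                                frameCapacity:frameCount];
    pcmBuffer.frameLength = frameCount;
    // Copy the raw samples into the buffer's single channel
    memcpy(pcmBuffer.floatChannelData[0], data.bytes, frameCount * sizeof(float));
    [self.playerNode scheduleBuffer:pcmBuffer completionHandler:nil];
}
If the server actually sends a different sample rate, channel count or integer samples, the bytes would have to be converted to match self.format before scheduling, or the noise will remain.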

How to edit brightness, contrast and saturation and add subtitles to an already created video saved in the library?

I am working on a video app in which I have to adjust the brightness, contrast and saturation of an already created video. I also have to add subtitles, like in movies. I have read a lot about this and learned that we can adjust brightness, contrast and saturation at the time of creating a video, but not edit them in an already created one. I have also found out how to add text to a video, but I want it to appear like subtitles, at intervals while the video plays, as in movies.
Using GPUImage, I changed the brightness like this at recording time:
GPUImageFilter *selectedFilter = nil ;
selectedFilter = [[GPUImageBrightnessFilter alloc] init];
[(GPUImageBrightnessFilter*)selectedFilter setBrightness:brightnesSlider.value];
But I need to edit a video which is already made and saved in the gallery. Any clue?
References:
Apple Edit Demo
RAY WENDERLICH
Brightness,Contrast and saturation
Here is the code which worked for me. I used GPUImage.
viewController.h
#import "GPUImage.h"
GPUImageMovie *movieFile;
GPUImageOutput<GPUImageInput> *filter;
GPUImageMovieWriter *movieWriter;
int ArrayIndex;
UISlider *mSlider;
ViewController.m
NSURL *sampleURL = [[NSBundle mainBundle] URLForResource:@"sample_iPod" withExtension:@"m4v"];
mSlider=[[UISlider alloc]initWithFrame:CGRectMake(60,380,200, 30)];
mSlider.continuous=YES;
[mSlider addTarget:self action:@selector(updatePixelWidth:) forControlEvents:UIControlEventValueChanged];
[self.view addSubview:mSlider];
movieFile = [[GPUImageMovie alloc] initWithURL:sampleURL];
movieFile.runBenchmark = YES;
movieFile.playAtActualSpeed = YES;
if(ArrayIndex==0)
{
filter=[[GPUImageBrightnessFilter alloc]init];
mSlider.maximumValue=1.0;
mSlider.minimumValue=-1.0;
mSlider.value=0.0;
}
else if(ArrayIndex==1)
{
filter=[[GPUImageContrastFilter alloc]init];
mSlider.minimumValue=0.0;
mSlider.maximumValue=4.0;
mSlider.value=1.0;
}
else if(ArrayIndex==2)
{
filter=[[GPUImageSaturationFilter alloc]init];
mSlider.minimumValue=0.0;
mSlider.maximumValue=2.0;
mSlider.value=1.0;
}
[movieFile addTarget:filter];
// Only rotate the video for display, leave orientation the same for recording
GPUImageView *filterView = (GPUImageView *)self.view;
[filter addTarget:filterView];
// In addition to displaying to the screen, write out a processed version of the movie to disk
NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.m4v"];
unlink([pathToMovie UTF8String]); // If a file already exists, AVAssetWriter won't let you record new frames, so delete the old movie
NSURL *movieURL = [NSURL fileURLWithPath:pathToMovie];
movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(640.0, 480.0)];
[filter addTarget:movieWriter];
// Configure this for video from the movie file, where we want to preserve all video frames and audio samples
movieWriter.shouldPassthroughAudio = YES;
movieFile.audioEncodingTarget = movieWriter;
[movieFile enableSynchronizedEncodingUsingMovieWriter:movieWriter];
[movieWriter startRecording];
[movieFile startProcessing];
[movieWriter setCompletionBlock:^{
[filter removeTarget:movieWriter];
[movieWriter finishRecording];
if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum (pathToMovie)) {
UISaveVideoAtPathToSavedPhotosAlbum (pathToMovie, self, @selector(video:didFinishSavingWithError:contextInfo:), nil);
}
}];
- (void)updatePixelWidth:(id)sender
{
if(ArrayIndex==0)
{
[(GPUImageBrightnessFilter *)filter setBrightness:[(UISlider *)sender value]];
}
else if (ArrayIndex==1)
{
[(GPUImageContrastFilter *)filter setContrast:[(UISlider *)sender value]];
}
else if (ArrayIndex==2)
{
[(GPUImageSaturationFilter *)filter setSaturation:[(UISlider *)sender value]];
}
}
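The code above covers brightness, contrast and saturation but not the subtitle part of the question. A hedged sketch of one common AVFoundation approach for burning a subtitle into an already saved video (separate from GPUImage, and with the method name exportSubtitledVideoAtURL:toURL:text: purely illustrative) overlays a CATextLayer via AVVideoCompositionCoreAnimationTool and re-exports the movie:
//Requires AVFoundation and QuartzCore
- (void)exportSubtitledVideoAtURL:(NSURL *)sourceURL toURL:(NSURL *)outputURL text:(NSString *)text
{
    AVAsset *asset = [AVAsset assetWithURL:sourceURL];
    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    CGSize size = videoTrack.naturalSize;
    //Composition containing just the original video track
    AVMutableComposition *composition = [AVMutableComposition composition];
    AVMutableCompositionTrack *compTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [compTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:videoTrack atTime:kCMTimeZero error:nil];
    //Text layer acting as the subtitle, pinned near the bottom of the frame
    CATextLayer *subtitleLayer = [CATextLayer layer];
    subtitleLayer.string = text;
    subtitleLayer.fontSize = 28;
    subtitleLayer.alignmentMode = kCAAlignmentCenter;
    subtitleLayer.frame = CGRectMake(0, 20, size.width, 40);
    CALayer *videoLayer = [CALayer layer];
    videoLayer.frame = CGRectMake(0, 0, size.width, size.height);
    CALayer *parentLayer = [CALayer layer];
    parentLayer.frame = videoLayer.frame;
    [parentLayer addSublayer:videoLayer];
    [parentLayer addSublayer:subtitleLayer];
    //Video composition that renders the movie into videoLayer underneath the text
    AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
    videoComposition.renderSize = size;
    videoComposition.frameDuration = CMTimeMake(1, 30);
    videoComposition.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
    AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);
    AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compTrack];
    instruction.layerInstructions = @[layerInstruction];
    videoComposition.instructions = @[instruction];
    //Export the composed movie to disk
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetHighestQuality];
    exporter.outputURL = outputURL;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    exporter.videoComposition = videoComposition;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        NSLog(@"Subtitle export finished with status %ld", (long)exporter.status);
    }];
}
Subtitles that appear and disappear at intervals can be built the same way by adding one CATextLayer per line and animating its opacity with a beginTime measured from AVCoreAnimationBeginTimeAtZero.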

Record video in iOS app without leaving app

PhoneGap offers a plug-in for recording video, but this plug-in seems to force users to leave the app for the Camera app in order to record video.
Is it possible to initiate and stop video recording without leaving the app?
The two methods below are the primary methods we are using to capture video (the rest of the plug-in is viewable from the link above). How should we modify this to allow recording from the front camera, without leaving the app, when the user taps a button?
Can anyone provide guidance please? Thanks.
Capture.m:
- (void)captureVideo:(CDVInvokedUrlCommand*)command
{
NSString* callbackId = command.callbackId;
NSDictionary* options = [command.arguments objectAtIndex:0];
if ([options isKindOfClass:[NSNull class]]) {
options = [NSDictionary dictionary];
}
// options could contain limit, duration and mode
// taking more than one video (limit) is only supported if you provide your own controls via the cameraOverlayView property
NSNumber* duration = [options objectForKey:@"duration"];
NSString* mediaType = nil;
if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
// there is a camera, it is available, make sure it can do movies
pickerController = [[CDVImagePicker alloc] init];
NSArray* types = nil;
if ([UIImagePickerController respondsToSelector:@selector(availableMediaTypesForSourceType:)]) {
types = [UIImagePickerController availableMediaTypesForSourceType:UIImagePickerControllerSourceTypeCamera];
// NSLog(@"MediaTypes: %@", [types description]);
if ([types containsObject:(NSString*)kUTTypeMovie]) {
mediaType = (NSString*)kUTTypeMovie;
} else if ([types containsObject:(NSString*)kUTTypeVideo]) {
mediaType = (NSString*)kUTTypeVideo;
}
}
}
if (!mediaType) {
// don't have video camera return error
NSLog(@"Capture.captureVideo: video mode not available.");
CDVPluginResult* result = [CDVPluginResult resultWithStatus:CDVCommandStatus_ERROR messageToErrorObject:CAPTURE_NOT_SUPPORTED];
[self.commandDelegate sendPluginResult:result callbackId:callbackId];
pickerController = nil;
} else {
pickerController.delegate = self;
pickerController.sourceType = UIImagePickerControllerSourceTypeCamera;
pickerController.allowsEditing = NO;
// iOS 3.0
pickerController.mediaTypes = [NSArray arrayWithObjects:mediaType, nil];
if ([mediaType isEqualToString:(NSString*)kUTTypeMovie]){
if (duration) {
pickerController.videoMaximumDuration = [duration doubleValue];
}
//NSLog(@"pickerController.videoMaximumDuration = %f", pickerController.videoMaximumDuration);
}
// iOS 4.0
if ([pickerController respondsToSelector:@selector(cameraCaptureMode)]) {
pickerController.cameraCaptureMode = UIImagePickerControllerCameraCaptureModeVideo;
// pickerController.videoQuality = UIImagePickerControllerQualityTypeHigh;
// pickerController.cameraDevice = UIImagePickerControllerCameraDeviceRear;
// pickerController.cameraFlashMode = UIImagePickerControllerCameraFlashModeAuto;
}
// CDVImagePicker specific property
pickerController.callbackId = callbackId;
SEL selector = NSSelectorFromString(@"presentViewController:animated:completion:");
if ([self.viewController respondsToSelector:selector]) {
[self.viewController presentViewController:pickerController animated:YES completion:nil];
} else {
// deprecated as of iOS >= 6.0
[self.viewController presentModalViewController:pickerController animated:YES];
}
}
}
- (CDVPluginResult*)processVideo:(NSString*)moviePath forCallbackId:(NSString*)callbackId
{
// save the movie to photo album (only avail as of iOS 3.1)
/* don't need, it should automatically get saved
NSLog(@"can save %@: %d ?", moviePath, UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(moviePath));
if (&UIVideoAtPathIsCompatibleWithSavedPhotosAlbum != NULL && UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(moviePath) == YES) {
NSLog(#"try to save movie");
UISaveVideoAtPathToSavedPhotosAlbum(moviePath, nil, nil, nil);
NSLog(#"finished saving movie");
}*/
// create MediaFile object
NSDictionary* fileDict = [self getMediaDictionaryFromPath:moviePath ofType:nil];
NSArray* fileArray = [NSArray arrayWithObject:fileDict];
return [CDVPluginResult resultWithStatus:CDVCommandStatus_OK messageAsArray:fileArray];
}
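For the front-camera part of the question: UIImagePickerController can stay inside the app (captureVideo: above already presents it with presentViewController:animated:completion:) and be pointed at the front camera. A hedged sketch of the extra lines, which would go where the commented-out cameraDevice assignment already sits, with an availability check first:
// Prefer the front camera when it exists; this is an addition, not part of the original plug-in
if ([UIImagePickerController isCameraDeviceAvailable:UIImagePickerControllerCameraDeviceFront]) {
    pickerController.cameraDevice = UIImagePickerControllerCameraDeviceFront;
}
Because the picker is presented as a view controller rather than handing off to the Camera app, recording starts and stops without ever leaving the app; the user taps the picker's own record button, or the plug-in could supply a cameraOverlayView with its own start/stop control.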

captureOutput:didFinishRecordingToOutputFileAtURL:fromConnections:error not being called

I am writing a video capture app for iOS 4+. It works fine on devices with iOS 5+, but on iOS 4 the delegate method didFinishRecordingToOutputFileAtURL is not being called after the recording has stopped. I have checked Apple's reference, which says "This method is always called for each recording request, even if no data is successfully written to the file."
https://developer.apple.com/library/ios/#documentation/AVFoundation/Reference/AVCaptureFileOutputRecordingDelegate_Protocol/Reference/Reference.html
Any suggestions ?
Here is the complete code:
//
// HomeViewController.m
// MyAgingBooth
//
// Created by Mahmud on 29/10/11.
// Copyright 2011 __MyCompanyName__. All rights reserved.
//
#import "HomeViewController.h"
#import "Globals.h"
#import <MobileCoreServices/MobileCoreServices.h>
#import <MediaPlayer/MPMoviePlayerController.h>
#import "SharedData.h"
#import "ResultViewController.h"
@implementation HomeViewController
@synthesize BtnFromCamera, PreviewLayer;
- (id)initWithNibName:(NSString *)nibNameOrNil bundle:(NSBundle *)nibBundleOrNil
{
self = [super initWithNibName:nibNameOrNil bundle:nibBundleOrNil];
if (self) {
// Custom initialization
isPlaying=NO;
playerScore=0;
playerTurn=0;
}
return self;
}
- (void)dealloc
{
//[levelTimer release];
[super dealloc];
}
- (void)didReceiveMemoryWarning
{
// Releases the view if it doesn't have a superview.
[super didReceiveMemoryWarning];
// Release any cached data, images, etc that aren't in use.
}
#pragma mark - View lifecycle
- (void)viewDidLoad
{
[super viewDidLoad];
self.navigationController.navigationBar.barStyle = UIBarStyleBlackTranslucent;
playerName.text=[NSString stringWithFormat: @"Player %d", (playerTurn+1)];
//add a right bar button item proceed to next.
UIBarButtonItem *proceedButton = [[UIBarButtonItem alloc] initWithTitle:@"Proceed" style:UIBarButtonItemStylePlain target:self action:@selector(proceedToNext)];
//[proceedButton setImage:[UIImage imageNamed:@"info.png"]];
self.navigationItem.rightBarButtonItem=proceedButton;
[proceedButton release];
[BtnFromCamera setTitle:@"Start" forState:UIControlStateNormal];
//[self.BtnFromCamera setImage:camera forState:UIControlStateNormal];
//[self.BtnFromCamera setImage:camera forState:UIControlStateSelected];
NSArray *words=[NSArray arrayWithObjects:@"SAY: Girls",@"SAY: Shut up", @"SAY: Tiger",@"SAY: Absurd",@"SAY: Tonight", @"SAY: Amsterdam", nil];
[word setText:[words objectAtIndex:arc4random()%6]];
[self initCaptureSession];
}
-(void) proceedToNext
{
self.title=@"Back";
ResultViewController *resultViewController= [[ResultViewController alloc] initWithNibName:@"ResultViewController" bundle:nil];
[self.navigationController pushViewController:resultViewController animated:YES];
[resultViewController release];
}
- (void)viewDidUnload
{
[super viewDidUnload];
// Release any retained subviews of the main view.
// e.g. self.myOutlet = nil;
}
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation
{
// Return YES for supported orientations
return (interfaceOrientation == UIInterfaceOrientationPortrait);
}
//Action handlers for the buttons
// take snap with camera
-(void) initCaptureSession
{
NSLog(@"Setting up capture session");
CaptureSession = [[AVCaptureSession alloc] init];
//----- ADD INPUTS -----
NSLog(@"Adding video input");
//ADD VIDEO INPUT
AVCaptureDevice *VideoDevice = [self frontFacingCameraIfAvailable ];
//[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if (VideoDevice)
{
NSError *error;
VideoInputDevice = [AVCaptureDeviceInput deviceInputWithDevice:VideoDevice error:&error];
if (!error)
{
if ([CaptureSession canAddInput:VideoInputDevice])
[CaptureSession addInput:VideoInputDevice];
else
NSLog(@"Couldn't add video input");
}
else
{
NSLog(@"Couldn't create video input");
}
}
else
{
NSLog(@"Couldn't create video capture device");
}
//ADD AUDIO INPUT
NSLog(@"Adding audio input");
AVCaptureDevice *audioCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
NSError *error = nil;
AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioCaptureDevice error:&error];
if (audioInput)
{
[CaptureSession addInput:audioInput];
}
//----- ADD OUTPUTS -----
//ADD VIDEO PREVIEW LAYER
NSLog(@"Adding video preview layer");
[self setPreviewLayer:[[[AVCaptureVideoPreviewLayer alloc] initWithSession:CaptureSession] autorelease]];
PreviewLayer.orientation = AVCaptureVideoOrientationPortrait; //<<SET ORIENTATION. You can deliberately set this wrong to flip the image and may actually need to set it wrong to get the right image
[[self PreviewLayer] setVideoGravity:AVLayerVideoGravityResizeAspectFill];
//ADD MOVIE FILE OUTPUT
NSLog(@"Adding movie file output");
MovieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
Float64 TotalSeconds = 60; //Total seconds
int32_t preferredTimeScale = 30; //Frames per second
CMTime maxDuration = CMTimeMakeWithSeconds(TotalSeconds, preferredTimeScale); //<<SET MAX DURATION
MovieFileOutput.maxRecordedDuration = maxDuration;
MovieFileOutput.minFreeDiskSpaceLimit = 1024 * 1024; //<<SET MIN FREE SPACE IN BYTES FOR RECORDING TO CONTINUE ON A VOLUME
if ([CaptureSession canAddOutput:MovieFileOutput])
[CaptureSession addOutput:MovieFileOutput];
AudioOutput = [[AVCaptureAudioDataOutput alloc] init];
if([CaptureSession canAddOutput:AudioOutput])
{
[CaptureSession addOutput:AudioOutput];
NSLog(@"AudioOutput added");
}
//SET THE CONNECTION PROPERTIES (output properties)
[self CameraSetOutputProperties]; //(We call a method as it also has to be done after changing camera)
//----- SET THE IMAGE QUALITY / RESOLUTION -----
//Options:
// AVCaptureSessionPresetHigh - Highest recording quality (varies per device)
// AVCaptureSessionPresetMedium - Suitable for WiFi sharing (actual values may change)
// AVCaptureSessionPresetLow - Suitable for 3G sharing (actual values may change)
// AVCaptureSessionPreset640x480 - 640x480 VGA (check its supported before setting it)
// AVCaptureSessionPreset1280x720 - 1280x720 720p HD (check its supported before setting it)
// AVCaptureSessionPresetPhoto - Full photo resolution (not supported for video output)
NSLog(@"Setting image quality");
[CaptureSession setSessionPreset:AVCaptureSessionPresetMedium];
if ([CaptureSession canSetSessionPreset:AVCaptureSessionPreset640x480]) //Check size based configs are supported before setting them
[CaptureSession setSessionPreset:AVCaptureSessionPreset640x480];
//----- DISPLAY THE PREVIEW LAYER -----
//Display it full screen under out view controller existing controls
NSLog(@"Display the preview layer");
CGRect layerRect = CGRectMake(10,44,300,290); //[[[self view] layer] bounds];
[PreviewLayer setBounds:layerRect];
[PreviewLayer setPosition:CGPointMake(CGRectGetMidX(layerRect),
CGRectGetMidY(layerRect))];
//[[[self view] layer] addSublayer:[[self CaptureManager] previewLayer]];
//We use this instead so it goes on a layer behind our UI controls (avoids us having to manually bring each control to the front):
UIView *CameraView = [[[UIView alloc] init] autorelease];
[[self view] addSubview:CameraView];
//[self.view sendSubviewToBack:CameraView];
[[CameraView layer] addSublayer:PreviewLayer];
//----- START THE CAPTURE SESSION RUNNING -----
[CaptureSession startRunning];
}
//********** CAMERA SET OUTPUT PROPERTIES **********
- (void) CameraSetOutputProperties
{
AVCaptureConnection *CaptureConnection=nil;
//SET THE CONNECTION PROPERTIES (output properties)
NSComparisonResult order = [[UIDevice currentDevice].systemVersion compare: @"5.0.0" options: NSNumericSearch];
if (order == NSOrderedSame || order == NSOrderedDescending) {
// OS version >= 5.0.0
CaptureConnection = [MovieFileOutput connectionWithMediaType:AVMediaTypeVideo];
if (CaptureConnection.supportsVideoMinFrameDuration)
CaptureConnection.videoMinFrameDuration = CMTimeMake(1, CAPTURE_FRAMES_PER_SECOND);
if (CaptureConnection.supportsVideoMaxFrameDuration)
CaptureConnection.videoMaxFrameDuration = CMTimeMake(1, CAPTURE_FRAMES_PER_SECOND);
if (CaptureConnection.supportsVideoMinFrameDuration)
{
CMTimeShow(CaptureConnection.videoMinFrameDuration);
CMTimeShow(CaptureConnection.videoMaxFrameDuration);
}
} else {
// OS version < 5.0.0
CaptureConnection = [self connectionWithMediaType:AVMediaTypeVideo fromConnections:[MovieFileOutput connections]];
}
//Set landscape (if required)
if ([CaptureConnection isVideoOrientationSupported])
{
AVCaptureVideoOrientation orientation = AVCaptureVideoOrientationLandscapeRight; //<<<<<SET VIDEO ORIENTATION IF LANDSCAPE
[CaptureConnection setVideoOrientation:orientation];
}
//Set frame rate (if required)
//CMTimeShow(CaptureConnection.videoMinFrameDuration);
//CMTimeShow(CaptureConnection.videoMaxFrameDuration);
}
- (IBAction) StartVideo
{
if (!isPlaying) {
self.navigationItem.rightBarButtonItem.enabled=NO;
[BtnFromCamera setTitle:@"Stop" forState:UIControlStateNormal];
playerName.text=[NSString stringWithFormat:@"Player %d", playerTurn+1];
playerScore=0;
count=0;
isPlaying=YES;
//Create temporary URL to record to
NSString *outputPath = [[NSString alloc] initWithFormat:@"%@%@", NSTemporaryDirectory(), @"output.mov"];
NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:outputPath];
NSFileManager *fileManager = [NSFileManager defaultManager];
if ([fileManager fileExistsAtPath:outputPath])
{
NSError *error;
if ([fileManager removeItemAtPath:outputPath error:&error] == NO)
{
//Error - handle if required
NSLog(@"file remove error");
}
}
[outputPath release];
//Start recording
[MovieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];
[outputURL release];
//NSString *DestFilename = @"output.mov";
//Set the file save to URL
/* NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSDateFormatter *dateFormatter = [[NSDateFormatter alloc] init];
[dateFormatter setDateFormat:@"yyyy-MM-dd_HH-mm-ss"];
NSString *destinationPath = [documentsDirectory stringByAppendingFormat:@"/output_%@.mov", [dateFormatter stringFromDate:[NSDate date]]];
[dateFormatter release];
NSURL* saveLocationURL = [[NSURL alloc] initFileURLWithPath:destinationPath];
[MovieFileOutput startRecordingToOutputFileURL:saveLocationURL recordingDelegate:self];
[saveLocationURL release]; */
levelTimer = [NSTimer scheduledTimerWithTimeInterval:0.05 target: self selector: @selector(levelTimerCallback:) userInfo: nil repeats: YES];
}
else
{
isPlaying=NO;
NSLog(@"STOP RECORDING");
[MovieFileOutput stopRecording];
[levelTimer invalidate];
[BtnFromCamera setTitle:@"Start" forState:UIControlStateNormal];
self.navigationItem.rightBarButtonItem.enabled=YES;
}
}
//********** DID FINISH RECORDING TO OUTPUT FILE AT URL **********/
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
fromConnections:(NSArray *)connections
error:(NSError *)error
{
NSLog(@"didFinishRecordingToOutputFileAtURL - enter");
BOOL RecordedSuccessfully = YES;
if ([error code] != noErr)
{
// A problem occurred: Find out if the recording was successful.
id value = [[error userInfo] objectForKey:AVErrorRecordingSuccessfullyFinishedKey];
if (value)
{
RecordedSuccessfully = [value boolValue];
}
}
if (RecordedSuccessfully)
{
//----- RECORDED SUCCESSFULLY -----
NSLog(@"didFinishRecordingToOutputFileAtURL - success");
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:outputFileURL])
{
[library writeVideoAtPathToSavedPhotosAlbum:outputFileURL
completionBlock:^(NSURL *assetURL, NSError *error)
{
if (error)
{
NSLog(@"File save error");
}
else
{
playerScore=(playerScore/count);
NSDictionary *dict = [[NSDictionary alloc] initWithObjectsAndKeys:playerName.text, @"PlayerName", [NSNumber numberWithFloat:playerScore], @"Score", assetURL, @"VideoURL",nil];
SharedData *d=[SharedData sharedManager];
[d.PlayerStats addObject:dict];
[dict release];
playerName.text=[NSString stringWithFormat:@"Score %f", playerScore];
playerTurn++;
}
}];
}
else {
NSString *assetURL=[self copyFileToDocuments:outputFileURL];
if(assetURL!=nil)
{
playerScore=(playerScore/count);
NSDictionary *dict = [[NSDictionary alloc] initWithObjectsAndKeys:playerName.text, @"PlayerName", [NSNumber numberWithFloat:playerScore], @"Score",assetURL , @"VideoURL",nil];
SharedData *d=[SharedData sharedManager];
[d.PlayerStats addObject:dict];
[dict release];
playerName.text=[NSString stringWithFormat:@"Score %f", playerScore];
playerTurn++;
}
}
[library release];
}
}
- (NSString*) copyFileToDocuments:(NSURL *)fileURL
{
NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSDateFormatter *dateFormatter = [[NSDateFormatter alloc] init];
[dateFormatter setDateFormat:@"yyyy-MM-dd_HH-mm-ss"];
NSString *destinationPath = [documentsDirectory stringByAppendingFormat:@"/output_%@.mov", [dateFormatter stringFromDate:[NSDate date]]];
[dateFormatter release];
NSError *error;
if (![[NSFileManager defaultManager] copyItemAtURL:fileURL toURL:[NSURL fileURLWithPath:destinationPath] error:&error]) {
NSLog(@"File save error %@", [error localizedDescription]);
return nil;
}
return destinationPath;
}
- (void)levelTimerCallback:(NSTimer *)timer {
AVCaptureConnection *audioConnection = [self connectionWithMediaType:AVMediaTypeAudio fromConnections:[MovieFileOutput connections]];
//return [audioConnection isActive];
for (AVCaptureAudioChannel *channel in audioConnection.audioChannels) {
float avg = channel.averagePowerLevel;
// float peak = channel.peakHoldLevel;
float vol=powf(10, avg)*1000;
NSLog(@"Power: %f",vol);
if (isPlaying && vol > 0) {
playerScore=playerScore+vol;
count=count+1;
}
}
}
- (AVCaptureConnection *)connectionWithMediaType:(NSString *)mediaType fromConnections:(NSArray *)connections
{
for ( AVCaptureConnection *connection in connections ) {
for ( AVCaptureInputPort *port in [connection inputPorts] ) {
if ( [[port mediaType] isEqual:mediaType] ) {
return connection;
}
}
}
return nil;
}
- (AVCaptureDevice *)frontFacingCameraIfAvailable
{
// look at all the video devices and get the first one that's on the front
NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
AVCaptureDevice *captureDevice = nil;
for (AVCaptureDevice *device in videoDevices)
{
if (device.position == AVCaptureDevicePositionFront)
{
captureDevice = device;
break;
}
}
// couldn't find one on the front, so just get the default video device.
if ( ! captureDevice)
{
captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
}
return captureDevice;
}
@end
Fixed the issue. I accidentally added an extra audio output. Removing the following fragment works for me.
AudioOutput = [[AVCaptureAudioDataOutput alloc] init];
if([CaptureSession canAddOutput:AudioOutput])
{
[CaptureSession addOutput:AudioOutput];
NSLog(@"AudioOutput added");
}

Resources