Output from video recording with AVFoundation - iOS

I am recording a video using AVFoundation, with the code below.
-(IBAction)record:(id)sender
{
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    [session setSessionPreset:AVCaptureSessionPresetHigh];
    AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error;
    AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:&error];
    if ([session canAddInput:deviceInput])
        [session addInput:deviceInput];
    AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    [previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    CALayer *rootLayer = [[self view] layer];
    [rootLayer setMasksToBounds:YES];
    [previewLayer setFrame:CGRectMake(-70, 0, rootLayer.bounds.size.height, rootLayer.bounds.size.height)];
    [rootLayer insertSublayer:previewLayer atIndex:0];
    [session startRunning];
}
I can see the video on my preview layer. How can I turn this into NSData so that I can send it to the server instantaneously while recording, for a live broadcast?

You will need to create an AVCaptureVideoDataOutput and connect it to your session, and then implement
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
Inside this callback, sampleBuffer holds the data of the frame.
Again, I recommend you look at the RosyWriter sample app, as it has an example of getting the pixel data of each frame.
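For illustration only, here is a minimal sketch of that approach. It assumes the output is added inside the record: method above, that the view controller adopts AVCaptureVideoDataOutputSampleBufferDelegate, and that BGRA is an acceptable pixel format; none of these specifics come from the original answer.
// 1) Inside record:, after adding the camera input (assumed placement):
AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
videoDataOutput.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
dispatch_queue_t captureQueue = dispatch_queue_create("video.capture.queue", DISPATCH_QUEUE_SERIAL);
[videoDataOutput setSampleBufferDelegate:self queue:captureQueue];
if ([session canAddOutput:videoDataOutput])
    [session addOutput:videoDataOutput];
// 2) Delegate callback: every captured frame arrives here as an uncompressed CMSampleBufferRef.
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    void *baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer);
    size_t length = CVPixelBufferGetBytesPerRow(pixelBuffer) * CVPixelBufferGetHeight(pixelBuffer);
    NSData *frameData = [NSData dataWithBytes:baseAddress length:length];
    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    // frameData now holds one frame of raw BGRA pixels; for a live broadcast you would
    // normally compress frames (e.g. with AVAssetWriter or VideoToolbox) before sending them.
}
Note that raw frames are large, so sending them unencoded over the network is rarely practical; pipelines like this usually encode first.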

Related

AVFoundation custom video recorder audio issues

I am trying to learn how to create a custom video recorder. Everything works fine except that the recorded audio has a lot of noise. In the system camera, Instagram, or any other camera app, the audio quality is very good and does not have this kind of low-level noise. Besides that, the recorded volume of the video is also significantly lower compared to other camera apps. I couldn't find any answer on Stack Overflow; most answers are about AVAudioSession, not AVCaptureSession.
Here is how I implement the recorder:
_session = [[AVCaptureSession alloc] init];
NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio];
AVCaptureDeviceInput *audioInput = [[AVCaptureDeviceInput alloc] initWithDevice:[devices objectAtIndex:0] error:nil];
if ([_session canAddInput:audioInput])
    [_session addInput:audioInput];
AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:nil];
if ([_session canAddInput:deviceInput])
    [_session addInput:deviceInput];
_videoOutput = [[AVCaptureVideoDataOutput alloc] init];
_videoOutput.alwaysDiscardsLateVideoFrames = NO;
if ([_session canAddOutput:_videoOutput])
    [_session addOutput:_videoOutput];
_audioOutput = [[AVCaptureAudioDataOutput alloc] init];
if ([_session canAddOutput:_audioOutput])
    [_session addOutput:_audioOutput];
AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_session];
[previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
CALayer *rootLayer = [[self view] layer];
[rootLayer setMasksToBounds:YES];
[previewLayer setFrame:CGRectMake(0, 0, rootLayer.bounds.size.width, rootLayer.bounds.size.height)];
[rootLayer insertSublayer:previewLayer atIndex:0];
[_session setSessionPreset:AVCaptureSessionPresetHigh];
dispatch_queue_t queue = dispatch_queue_create("MyQueue", NULL);
[_videoOutput setSampleBufferDelegate:self queue:queue];
[_audioOutput setSampleBufferDelegate:self queue:queue];
AVCaptureConnection *videoConnection = [_videoOutput connectionWithMediaType:AVMediaTypeVideo];
[videoConnection setVideoOrientation:AVCaptureVideoOrientationPortrait];
[_session startRunning];
The video quality is fine and the code works; I'm just having trouble with the audio part. So:
How can I improve the audio quality when recording a video?
How can I increase the recording volume of the video?
I know how to increase it when recording pure audio, using the code below, but not when recording video.
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord withOptions:(AVAudioSessionCategoryOptionDefaultToSpeaker | AVAudioSessionCategoryOptionAllowBluetooth) error:&sessionCategoryError];
OK, I finally figured it out. When setting up the AVCaptureSession, do the following:
AVAudioSession *audioSession = [AVAudioSession sharedInstance];
[audioSession setCategory:AVAudioSessionCategoryPlayAndRecord withOptions:(AVAudioSessionCategoryOptionDefaultToSpeaker | AVAudioSessionCategoryOptionAllowBluetooth) error:nil];
[audioSession setActive:YES error:nil];
_session = [[AVCaptureSession alloc]init];
// The two lines below are very important.
_session.usesApplicationAudioSession = YES;
_session.automaticallyConfiguresApplicationAudioSession = NO;
Now the output audio of the video is even better than the system camera's.

AVFoundation camera not visible

I have just started with AVFoundation and I'm trying to capture an image. The problem is that I'm able to capture the image but unable to see the camera preview in my view. Here is my code:
session = [AVCaptureSession new];
if ([session canSetSessionPreset:AVCaptureSessionPresetHigh])
    [session setSessionPreset:AVCaptureSessionPresetHigh]; // specifies photo quality
AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error;
AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:&error];
[session addInput:deviceInput];
AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc]initWithSession:session];
[previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
CALayer *rootLayer = [[self view]layer];
[rootLayer setMasksToBounds:YES];
[previewLayer setFrame:_cameraView.frame];
[rootLayer insertSublayer:previewLayer atIndex:0];
stillImageOutput = [AVCaptureStillImageOutput new];
[stillImageOutput setOutputSettings:[[NSDictionary alloc]initWithObjectsAndKeys:AVVideoCodecJPEG,AVVideoCodecKey,nil]];
[session addOutput:stillImageOutput];
[session startRunning];
Is there anything wrong with my code?
Try [rootLayer addSublayer:previewLayer];
It works for me!
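As a related illustration (an assumption on my part, not from the answer above): if _cameraView from the question is meant to host the preview, attaching the layer to that view's own layer and sizing it to the view's bounds may be clearer, since the layer then lives in _cameraView's coordinate space:
// Sketch only: host the preview directly in the camera view.
previewLayer.frame = _cameraView.bounds;   // bounds, not frame, once it is a sublayer of _cameraView
[_cameraView.layer addSublayer:previewLayer];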

Recording a video with AVFoundation

I've created the session, the device, the input, and the output, but when I call the startRecordingToOutputFileURL:recordingDelegate: method, it immediately calls the didFinishRecordingToOutputFileAtURL delegate method, so it finishes without ever starting!
This is my code:
-(void)viewWillAppear:(BOOL)animated {
    [super viewWillAppear:animated];
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    [session setSessionPreset:AVCaptureSessionPresetPhoto];
    AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error;
    AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:&error];
    if ([session canAddInput:deviceInput]) {
        [session addInput:deviceInput];
    }
    AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    [previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    CALayer *rootLayer = [[self view] layer];
    [rootLayer setMasksToBounds:YES];
    CGRect frame = self.view.frame;
    [previewLayer setFrame:frame];
    [rootLayer insertSublayer:previewLayer atIndex:0];
    _movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
    if ([session canAddOutput:_movieFileOutput]) {
        [session addOutput:_movieFileOutput];
    }
    [session startRunning];
}
-(void)buttonClicked {
    [_movieFileOutput startRecordingToOutputFileURL:url recordingDelegate:self];
}
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
      fromConnections:(NSArray *)connections
                error:(NSError *)error {
    // This method is getting called when I press the button
    NSLog(@"Finished Recording");
}
So what is wrong with my code?
Did I miss anything?
Thanks
Finally it worked!
The problem was the URL.
I tried many URLs and none of them worked; the output file has to be in the app's documents directory (NSDocumentDirectory), that's it!
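For reference, a minimal sketch of building such a URL in the documents directory and starting the recording; the file name movie.mov is just an example:
NSString *documentsPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) firstObject];
NSString *outputPath = [documentsPath stringByAppendingPathComponent:@"movie.mov"];
// AVCaptureMovieFileOutput will not record over an existing file, so remove any leftover one first.
[[NSFileManager defaultManager] removeItemAtPath:outputPath error:nil];
NSURL *url = [NSURL fileURLWithPath:outputPath];
[_movieFileOutput startRecordingToOutputFileURL:url recordingDelegate:self];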

Custom Camera View Not Working on iOS 8/Xcode 6

So I had this code working in another app of mine to take photos with a custom camera view when I had iOS 7 on my phone and Xcode 5.1. Now, on iOS 8 and Xcode 6, the camera works but I can't see the live view of the camera in my leftVertical UIView. Here's my code; I'd appreciate any help. Thanks!
#import <AVFoundation/AVFoundation.h>
session = [[AVCaptureSession alloc] init];
if ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPhone)
    [session setSessionPreset:AVCaptureSessionPreset352x288];
else
    [session setSessionPreset:AVCaptureSessionPreset352x288];
AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error;
AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:&error];
if ([session canAddInput:deviceInput]) {
    [session addInput:deviceInput];
}
AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
[previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
CALayer *rootLayer = [[self view] layer];
[rootLayer setMasksToBounds:YES];
CGRect frame = self.leftVertical.frame;
[previewLayer setFrame:frame];
[rootLayer insertSublayer:previewLayer atIndex:0];
//////////////////////////
stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImageOutput setOutputSettings:outputSettings];
[session addOutput:stillImageOutput];
[session startRunning];
Try this:
In -viewWillAppear:, start the camera capture on the main thread, like this:
dispatch_async(dispatch_get_main_queue(), ^{
    if (![session isRunning])
    {
        [session startRunning];
    }
});

Custom camera works, live preview does not

I'm attempting to build a custom camera app.
Tasks:
Shows a live image preview.
Allows you to tap anywhere on the screen to capture an image.
Displays the resulting image above the live image preview.
Currently everything works except for task 1. The view that should contain my live image preview keeps showing its background color (or remains transparent when no background color is set), even though the camera is operational and an image is displayed when the screen is tapped. Any ideas? I've referred to this previous discussion and I think I'm covering all my bases: AVFoundation camera preview layer not working
What am I missing?
-(void)viewWillAppear:(BOOL)animated{
session = [[AVCaptureSession alloc] init];
[session setSessionPreset:AVCaptureSessionPresetPhoto];
AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error;
AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:&error];
if ([session canAddInput:deviceInput]) {
    [session addInput:deviceInput];
}
AVCaptureVideoPreviewLayer *previewlayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
[previewlayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
CALayer *rootLayer = [[self view] layer];
[rootLayer setMasksToBounds:YES];
CGRect frame = self.frameforCapture.frame;
[previewlayer setFrame:frame];
[rootLayer insertSublayer:previewlayer atIndex:0];
stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImageOutput setOutputSettings:outputSettings];
[session addOutput:stillImageOutput];
[session startRunning];
//....
}
