AVFoundation custom video recorder audio issues - ios

I am trying to learn how to create a custom video recorder. Everything works fine except that the audio has a lot of noise. With the system camera, Instagram, or any other camera app, the audio quality is very good and doesn't have that kind of low-level hiss. Besides, the recorded volume of the video is also significantly lower compared to other camera apps. I couldn't find an answer on Stack Overflow; most of the answers are about AVAudioSession, not AVCaptureSession.
Here is how I implement the recorder:
_session = [[AVCaptureSession alloc]init];
NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio];
AVCaptureDeviceInput *audioInput = [[AVCaptureDeviceInput alloc] initWithDevice:[devices objectAtIndex:0] error:nil];
if ([_session canAddInput:audioInput])
[_session addInput:audioInput];
AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:nil];
if ( [_session canAddInput:deviceInput] )
[_session addInput:deviceInput];
_videoOutput = [[AVCaptureVideoDataOutput alloc] init];
_videoOutput.alwaysDiscardsLateVideoFrames = NO;
if ( [_session canAddOutput:_videoOutput] )
[_session addOutput:_videoOutput];
_audioOutput = [[AVCaptureAudioDataOutput alloc] init];
if ( [_session canAddOutput:_audioOutput] )
[_session addOutput:_audioOutput];
AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_session];
[previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
CALayer *rootLayer = [[self view] layer];
[rootLayer setMasksToBounds:YES];
[previewLayer setFrame:CGRectMake(0, 0, rootLayer.bounds.size.width, rootLayer.bounds.size.height)];
[rootLayer insertSublayer:previewLayer atIndex:0];
[_session setSessionPreset:AVCaptureSessionPresetHigh];
dispatch_queue_t queue = dispatch_queue_create("MyQueue", NULL);
[_videoOutput setSampleBufferDelegate:self queue:queue];
[_audioOutput setSampleBufferDelegate:self queue:queue];
AVCaptureConnection *videoConnection = [_videoOutput connectionWithMediaType:AVMediaTypeVideo];
[videoConnection setVideoOrientation:AVCaptureVideoOrientationPortrait];
[_session startRunning];
The video quality is fine and the code works; I'm just having trouble with the audio. So:
How can I improve the audio quality when recording a video?
How can I increase the recording volume of the video?
I know how to increase it when recording pure audio, with the code below, but not when recording video.
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord withOptions:(AVAudioSessionCategoryOptionDefaultToSpeaker | AVAudioSessionCategoryOptionAllowBluetooth) error:&sessionCategoryError];

OK, I finally figured it out. Do the following when setting up the AVCaptureSession:
AVAudioSession *audioSession = [AVAudioSession sharedInstance];
[audioSession setCategory:AVAudioSessionCategoryPlayAndRecord withOptions:(AVAudioSessionCategoryOptionDefaultToSpeaker | AVAudioSessionCategoryOptionAllowBluetooth) error:nil];
[audioSession setActive:YES error:nil];
_session = [[AVCaptureSession alloc]init];
// The two lines below are very important.
_session.usesApplicationAudioSession = YES;
_session.automaticallyConfiguresApplicationAudioSession = NO;
Now the output audio of the video is even better than the system camera's.

Related

AVFoundation camera not visible

I have just started with AVFoundation and I'm trying to capture an image. The problem is that I'm able to capture the image but unable to see the camera in my view. Here is my code:
session = [AVCaptureSession new];
if([session canSetSessionPreset:AVCaptureSessionPresetHigh])
[session setSessionPreset:AVCaptureSessionPresetHigh];//specifies photo quality
AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error;
AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:&error];
[session addInput:deviceInput];
AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc]initWithSession:session];
[previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
CALayer *rootLayer = [[self view]layer];
[rootLayer setMasksToBounds:YES];
[previewLayer setFrame:_cameraView.frame];
[rootLayer insertSublayer:previewLayer atIndex:0];
stillImageOutput = [AVCaptureStillImageOutput new];
[stillImageOutput setOutputSettings:[[NSDictionary alloc]initWithObjectsAndKeys:AVVideoCodecJPEG,AVVideoCodecKey,nil]];
[session addOutput:stillImageOutput];
[session startRunning];
Is there anything wrong with my code?
Try [rootLayer addSublayer:previewLayer]; (rootLayer is already a CALayer, so there is no -layer accessor to call on it).
It works for me!
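To expand on that answer: the preview layer only becomes visible if it is attached to a layer that is actually in the view hierarchy and has a nonzero frame at the time it is added. A hedged sketch, reusing names from the question's code (inserting at index 0 can also hide the layer behind opaque subviews, which is an assumption about this particular layout):

```objc
// Sketch: attach the preview layer with a known-good frame, on top of
// the view's layer rather than behind it at index 0.
AVCaptureVideoPreviewLayer *previewLayer =
    [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
[previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
previewLayer.frame = self.view.layer.bounds; // _cameraView.frame may still be CGRectZero here
[self.view.layer addSublayer:previewLayer];
```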

Capture video full screen with AVCaptureSession

I need my app to display a live preview of what the front camera is filming, and this video stream's dimensions need to match the screen size of the device.
But there are black bars appearing at the top and bottom of the preview view...
How can I capture and preview video with the dimensions of the full screen? Note that I don't want to resize the video!
Below is my code:
session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetPhoto;
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
[session addInput:input];
UIView *cameraPreviewView = [[UIView alloc] initWithFrame:self.view.frame] ;
[self.view addSubview:cameraPreviewView];
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
[session addOutput:output];
dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
[output setSampleBufferDelegate:self queue:queue];
AVCaptureVideoPreviewLayer *cameraPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
cameraPreviewLayer.frame = cameraPreviewView.bounds;
[cameraPreviewView.layer addSublayer:cameraPreviewLayer];
[cameraPreviewLayer setBackgroundColor:[[UIColor blackColor] CGColor]];
[cameraPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspect];
[session startRunning];
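For reference (this is not from the original thread): the screen's aspect ratio rarely matches the camera's, so a gravity that preserves aspect without cropping will always letterbox. If cropping the preview is acceptable, the commonly suggested compromise is aspect-fill; note this only changes how the preview is displayed, not the dimensions of the captured frames:

```objc
// Sketch: fill the screen by letting the preview crop at the edges.
// The captured video itself keeps the camera's native aspect ratio.
[cameraPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
cameraPreviewLayer.frame = cameraPreviewView.bounds;
```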

Output from video recording with AVFoundation

I am recording a video using AVFoundation, with the code below.
-(IBAction)record:(id)sender
{
AVCaptureSession *session = [[AVCaptureSession alloc] init];
[session setSessionPreset:AVCaptureSessionPresetHigh];
AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error;
AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:&error];
if ( [session canAddInput:deviceInput] )
[session addInput:deviceInput];
AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
[previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
CALayer *rootLayer = [[self view] layer];
[rootLayer setMasksToBounds:YES];
[previewLayer setFrame:CGRectMake(-70, 0, rootLayer.bounds.size.height, rootLayer.bounds.size.height)];
[rootLayer insertSublayer:previewLayer atIndex:0];
[session startRunning];
}
I can see the video on my preview layer. How can I convert this to NSData so that I can send it to a server instantaneously while recording, for a live broadcast?
You will need to create an AVCaptureVideoDataOutput, connect it to your session, and then implement:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
Inside this callback, sampleBuffer holds the data for the frame.
Again, I recommend you look at the RosyWriter sample app, as it has an example of getting a frame's pixel data.
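A hedged sketch of that delegate method follows. The raw-bytes copy is an assumption about what "the data of the frame" means, and -sendFrameToServer: is a hypothetical helper; a real broadcaster would compress the frames (e.g. with VideoToolbox) before sending them.

```objc
// Sketch: copy a frame's raw pixel bytes into NSData inside the
// AVCaptureVideoDataOutput delegate callback.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (pixelBuffer == NULL) return; // audio buffers carry no image buffer

    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    void *baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer);
    size_t dataSize   = CVPixelBufferGetDataSize(pixelBuffer);

    // Raw, uncompressed pixels; hand this to an encoder before upload.
    NSData *frameData = [NSData dataWithBytes:baseAddress length:dataSize];
    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

    [self sendFrameToServer:frameData]; // hypothetical networking helper
}
```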

AVCaptureSession does not record audio on iPad 2

I've been making an app that records video together with audio. The video recording works as it should, and on most devices, so does the audio recording.
Except on an iPad 2 (iOS 6.1.3), the audio recording does not work. In the official Camera app on the same device, audio recording works flawlessly, so it's not a hardware problem.
This is the code:
NSURL *outputFileURL = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@%@", NSTemporaryDirectory(), @"output.mov"]];
NSError *error;
AVCaptureDevice *device = [self frontCamera];
AVCaptureDeviceInput *inputVideo = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
self.session = [[AVCaptureSession alloc] init];
[self.session beginConfiguration];
if([device supportsAVCaptureSessionPreset:AVCaptureSessionPreset1280x720]){
[self.session setSessionPreset:AVCaptureSessionPreset1280x720];
} else if([device supportsAVCaptureSessionPreset:AVCaptureSessionPresetiFrame960x540]){
[self.session setSessionPreset:AVCaptureSessionPresetiFrame960x540];
} else if([device supportsAVCaptureSessionPreset:AVCaptureSessionPreset640x480]){
[self.session setSessionPreset:AVCaptureSessionPreset640x480];
}
self.recorder = [[AVCamRecorder alloc] initWithSession:self.session outputFileURL:outputFileURL];
[self.recorder setDelegate:self];
AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
NSError *audioError;
AVCaptureDeviceInput *inputAudio = [[AVCaptureDeviceInput alloc] initWithDevice:audioDevice error:&audioError];
NSLog(@"AudioError: %@", audioError);
NSLog(@"InputAudio: %@", inputAudio);
[self.session addInput:inputVideo];
[self.session addInput:inputAudio];
[self.session commitConfiguration];
[self.session startRunning];
In the log, audioError is nil and inputAudio appears to be a valid object.
Any idea on how to fix this?

Double AVCaptureSession output on iOS

I'm trying to take a picture from both cameras on an iOS device at the same time. I'd also like to have a live preview of both cameras on screen. I use this code:
- (void)prepareCameraView:(UIView *)window
{
NSArray *captureDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
{
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetMedium;
CALayer *viewLayer = window.layer;
NSLog(@"viewLayer = %@", viewLayer);
AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
captureVideoPreviewLayer.frame = CGRectMake(0.0f, 0.0f, window.bounds.size.width/2.0f, window.bounds.size.height);
[window.layer addSublayer:captureVideoPreviewLayer];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:[captureDevices objectAtIndex:0] error:&error];
if (!input)
{
NSLog(@"ERROR : trying to open camera : %@", error);
}
[session addInput:input];
[session startRunning];
}
{
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetMedium;
CALayer *viewLayer = window.layer;
NSLog(@"viewLayer = %@", viewLayer);
AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
captureVideoPreviewLayer.frame = CGRectMake(window.bounds.size.width/2.0f, 0.0f, window.bounds.size.width/2.0f, window.bounds.size.height);
[window.layer addSublayer:captureVideoPreviewLayer];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:[captureDevices objectAtIndex:1] error:&error];
if (!input)
{
NSLog(@"ERROR : trying to open camera : %@", error);
}
[session addInput:input];
[session startRunning];
}
}
But when the app starts the session for the front camera, the session of the back camera stops and leaves a still image.
Is there a way to display live output from both cameras?
Thanks
No, it's not possible. Only one camera feed can be used at a time with AVCaptureSession.
Multiple simultaneous AVCaptureInputs are not allowed, so as soon as one session begins, the other stops.
Your best bet is to create two sessions: start the first, and as soon as it reports a frame, stop it and start the second; then stop the second and restart the first, and keep alternating. This will work, but there will be noticeable latency in the frames you receive.
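The alternating-session idea could be sketched like this. The ping-pong logic and the self.frontSession / self.backSession property names are assumptions; each is taken to be a fully configured single-camera AVCaptureSession delivering frames to this delegate.

```objc
// Sketch: ping-pong between two single-camera sessions, since only one
// can run at a time.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // ... handle the frame (update the matching preview, save it, etc.) ...

    // Swap sessions after every frame: stop the one that just delivered,
    // then start the other.
    dispatch_async(self.sessionQueue, ^{ // assumed serial background queue
        if (self.frontSession.isRunning) {
            [self.frontSession stopRunning];
            [self.backSession startRunning];
        } else {
            [self.backSession stopRunning];
            [self.frontSession startRunning];
        }
    });
}
```

Note that -startRunning blocks until the session starts, which is why the swap is dispatched to a background queue here. On iOS 13 and later, AVCaptureMultiCamSession supports true simultaneous capture from both cameras on supported devices, making this workaround unnecessary there.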