I have an AVCaptureSession which manages video and image capture for my application.
I also have sound effects in the application, which are of course silenced when the
user flips the Ring/Silent switch on their iPhone.
However, when I add an AVCaptureAudioDataOutput to the session, the sound effects are no longer silenced when the silent switch is set.
Here is the code used for adding audio:
NSError *errorAud;
AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
AVCaptureDeviceInput *audioDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&errorAud];
if (errorAud) {
    NSLog(@"%@", [errorAud localizedDescription]);
}
if ([avSession canAddInput:audioDeviceInput]) {
    [avSession addInput:audioDeviceInput];
    [self setAudioCaptureDeviceInput:audioDeviceInput];
}

audioOutput = [[AVCaptureAudioDataOutput alloc] init];
[audioOutput setSampleBufferDelegate:self queue:vsessionQueue];
if ([avSession canAddOutput:audioOutput]) {
    [avSession addOutput:audioOutput];
}
for (AVCaptureConnection *connection in [audioOutput connections]) {
    for (AVCaptureInputPort *port in [connection inputPorts]) {
        if ([[port mediaType] isEqual:AVMediaTypeAudio]) {
            audioConnection = connection;
            break;
        }
    }
}
AVAuthorizationStatus audioAuthorizationStatus = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeAudio];
// This sort of fixes the silent-mode issue, but it makes the capture
// session's outputs stop calling didOutputSampleBuffer whenever a
// sound effect is played:
//avSession.usesApplicationAudioSession = NO;
//avSession.automaticallyConfiguresApplicationAudioSession = NO;
As you can see from the above, I tried messing with usesApplicationAudioSession and automaticallyConfiguresApplicationAudioSession, but setting usesApplicationAudioSession = NO
makes the capture session stop calling
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
whenever I play a sound effect using AVAudioPlayer.
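For reference, a minimal sketch of the direction the related answers below take (assuming the avSession from the question; note that AVAudioSessionCategoryPlayAndRecord itself ignores the Ring/Silent switch for playback, so this alone may not restore the silencing behaviour):

// Leave usesApplicationAudioSession at its default (YES) so the outputs keep
// delivering buffers; only disable the automatic configuration, then configure
// the shared audio session yourself before calling startRunning.
avSession.automaticallyConfiguresApplicationAudioSession = NO;

NSError *audioSessionError = nil;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord
                                 withOptions:AVAudioSessionCategoryOptionMixWithOthers
                                       error:&audioSessionError];
[[AVAudioSession sharedInstance] setActive:YES error:&audioSessionError];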
Related
I use GPUImageMovieWriter to record video while another music app is playing in the background, but as soon as I set audioEncodingTarget to the movieWriter, the music stops.
How can I keep it playing?
I resolved this problem: you must set automaticallyConfiguresApplicationAudioSession to NO before startCameraCapture, like this:
self.gpuCamera.captureSession.automaticallyConfiguresApplicationAudioSession = NO;
[self.gpuCamera startCameraCapture];
I set automaticallyConfiguresApplicationAudioSession = NO and
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord withOptions:AVAudioSessionCategoryOptionMixWithOthers error:nil];
but the background music still stopped. That was because we called [camera addAudioInputsAndOutputs]:
the settings above must be applied first, or the preview will flash (auto focus) during configuration.
- (BOOL)addAudioInputsAndOutputs
{
    if (audioOutput)
        return NO;

    [_captureSession beginConfiguration];

    _microphone = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    audioInput = [AVCaptureDeviceInput deviceInputWithDevice:_microphone error:nil];
    if ([_captureSession canAddInput:audioInput])
    {
        [_captureSession addInput:audioInput];
    }
    audioOutput = [[AVCaptureAudioDataOutput alloc] init];
    if ([_captureSession canAddOutput:audioOutput])
    {
        [_captureSession addOutput:audioOutput];
    }
    else
    {
        NSLog(@"Couldn't add audio output");
    }
    [audioOutput setSampleBufferDelegate:self queue:audioProcessingQueue];

    [_captureSession commitConfiguration];
    return YES;
}
It seems to be caused by the audio input and output.
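Putting the pieces of this answer together, the ordering would presumably look like this (a sketch; gpuCamera is a GPUImageVideoCamera as in the snippets above):

// Apply the audio-session settings *before* adding audio I/O or starting capture.
self.gpuCamera.captureSession.automaticallyConfiguresApplicationAudioSession = NO;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord
                                 withOptions:AVAudioSessionCategoryOptionMixWithOthers
                                       error:nil];

[self.gpuCamera addAudioInputsAndOutputs]; // only after the two calls above
[self.gpuCamera startCameraCapture];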
I am using the QuickBlox iOS SDK for video chatting in my app. It works fine. Now I want to record the chat video and save it to the camera roll. How can I do that?
I have gone through their documentation and implemented this:
-(IBAction)record:(id)sender {
    // Create video chat
    videoChat = [[QBChat instance] createAndRegisterVideoChatInstance];
    [videoChat setIsUseCustomVideoChatCaptureSession:YES];

    // Create capture session
    captureSession = [[AVCaptureSession alloc] init];
    // ... setup capture session here

    /* We create a serial queue to handle the processing of our frames */
    dispatch_queue_t callbackQueue = dispatch_queue_create("cameraQueue", NULL);
    [videoCaptureOutput setSampleBufferDelegate:self queue:callbackQueue];

    /* We start the capture */
    [captureSession startRunning];
}
-(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    // Do something with samples
    // ...

    // forward video samples to SDK
    [videoChat processVideoChatCaptureVideoSample:sampleBuffer];
}
But I am not sure what to do from here. How should I get the video data?
From the QuickBlox docs:
To set up a custom video capture session you simply follow these steps:
create an instance of AVCaptureSession
set up the input and output
implement the frames callback and forward all frames to the QuickBlox iOS SDK
tell the QuickBlox SDK that you will use your own capture session
To set up a custom video capture session, set up the input and output:
-(void)setupVideoCapture {
    self.captureSession = [[AVCaptureSession alloc] init];

    __block NSError *error = nil;

    // set preset
    [self.captureSession setSessionPreset:AVCaptureSessionPresetLow];

    // Setup the video input
    AVCaptureDevice *videoDevice = [self frontFacingCamera];
    //
    AVCaptureDeviceInput *captureVideoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
    if (error) {
        QBDLogEx(@"deviceInputWithDevice Video error: %@", error);
    } else {
        if ([self.captureSession canAddInput:captureVideoInput]) {
            [self.captureSession addInput:captureVideoInput];
        } else {
            QBDLogEx(@"cantAddInput Video");
        }
    }

    // Setup the video output
    AVCaptureVideoDataOutput *videoCaptureOutput = [[AVCaptureVideoDataOutput alloc] init];
    videoCaptureOutput.alwaysDiscardsLateVideoFrames = YES;
    //
    // Set the video output to store frames in BGRA (it is supposed to be faster)
    NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
    NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
    [videoCaptureOutput setVideoSettings:videoSettings];

    /* And we add the output to the capture session */
    if ([self.captureSession canAddOutput:videoCaptureOutput]) {
        [self.captureSession addOutput:videoCaptureOutput];
    } else {
        QBDLogEx(@"cantAddOutput");
    }
    [videoCaptureOutput release];

    // set FPS
    int framesPerSecond = 3;
    AVCaptureConnection *conn = [videoCaptureOutput connectionWithMediaType:AVMediaTypeVideo];
    if (conn.isVideoMinFrameDurationSupported) {
        conn.videoMinFrameDuration = CMTimeMake(1, framesPerSecond);
    }
    if (conn.isVideoMaxFrameDurationSupported) {
        conn.videoMaxFrameDuration = CMTimeMake(1, framesPerSecond);
    }

    /* We create a serial queue to handle the processing of our frames */
    dispatch_queue_t callbackQueue = dispatch_queue_create("cameraQueue", NULL);
    [videoCaptureOutput setSampleBufferDelegate:self queue:callbackQueue];
    dispatch_release(callbackQueue);

    // Add preview layer
    AVCaptureVideoPreviewLayer *prewLayer = [[[AVCaptureVideoPreviewLayer alloc] initWithSession:self.captureSession] autorelease];
    [prewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    CGRect layerRect = [[myVideoView layer] bounds];
    [prewLayer setBounds:layerRect];
    [prewLayer setPosition:CGPointMake(CGRectGetMidX(layerRect), CGRectGetMidY(layerRect))];
    myVideoView.hidden = NO;
    [myVideoView.layer addSublayer:prewLayer];

    /* We start the capture */
    [self.captureSession startRunning];
}

- (AVCaptureDevice *)cameraWithPosition:(AVCaptureDevicePosition)position {
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices) {
        if ([device position] == position) {
            return device;
        }
    }
    return nil;
}

- (AVCaptureDevice *)backFacingCamera {
    return [self cameraWithPosition:AVCaptureDevicePositionBack];
}

- (AVCaptureDevice *)frontFacingCamera {
    return [self cameraWithPosition:AVCaptureDevicePositionFront];
}
Implement frames callback:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    // Usually we just forward camera frames to the QuickBlox SDK,
    // but we can also do something with them first, for example apply some video filters
    [self.videoChat processVideoChatCaptureVideoSample:sampleBuffer];
}
Tell the QuickBlox iOS SDK that we use our own video capture session:
self.videoChat = [[QBChat instance] createAndRegisterVideoChatInstance];
self.videoChat.viewToRenderOpponentVideoStream = opponentVideoView;
//
// we use our own video capture session
self.videoChat.isUseCustomVideoChatCaptureSession = YES;
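The docs only cover forwarding frames to the SDK; they don't record anything. To actually record the chat and save it to the camera roll, one approach (a sketch under assumptions, not a QuickBlox API: the _writer/_writerInput ivars, the dimensions, and the output URL are hypothetical) is to append the same sample buffers to an AVAssetWriter and save the finished file with ALAssetsLibrary, which was current in this iOS 7 era:

// Hypothetical ivars: AVAssetWriter *_writer; AVAssetWriterInput *_writerInput;
- (void)startRecordingToURL:(NSURL *)url {
    _writer = [AVAssetWriter assetWriterWithURL:url fileType:AVFileTypeQuickTimeMovie error:nil];
    NSDictionary *settings = @{ AVVideoCodecKey:  AVVideoCodecH264,
                                AVVideoWidthKey:  @(480),    // assumed dimensions
                                AVVideoHeightKey: @(320) };
    _writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                      outputSettings:settings];
    _writerInput.expectsMediaDataInRealTime = YES;
    [_writer addInput:_writerInput];
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    // forward to QuickBlox as before
    [self.videoChat processVideoChatCaptureVideoSample:sampleBuffer];

    // ... and also append the frame to the writer
    if (_writer.status == AVAssetWriterStatusUnknown) {
        [_writer startWriting];
        [_writer startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
    }
    if (_writerInput.isReadyForMoreMediaData) {
        [_writerInput appendSampleBuffer:sampleBuffer];
    }
}

- (void)stopRecording {
    [_writerInput markAsFinished];
    [_writer finishWritingWithCompletionHandler:^{
        // save the finished movie to the camera roll
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        [library writeVideoAtPathToSavedPhotosAlbum:_writer.outputURL
                                    completionBlock:^(NSURL *assetURL, NSError *error) {}];
    }];
}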
I am using AVCaptureSession to record audio and video and I pass this data to:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    if (!_isPaused) {
        // pass the sample buffer to the encoder
    }
}
Everything works nicely (the delegate is called and I am processing frames) until I open Control Center and play music (I pause video processing from the app delegate). After stopping the music and trying to continue recording, the delegate method is never called. I think that playing audio somehow "ruins" my AVCaptureSession or its delegate, and I need to fix this issue...
Here is the method where I initialize the AVCaptureSession:
- (void)startup
{
    if (_session == nil)
    {
        NSLog(@"Starting camera");

        // create capture device with video input
        _session = [[AVCaptureSession alloc] init];
        AVCaptureDevice *backCamera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:backCamera error:nil];
        [_session addInput:input];

        // audio input from default mic
        AVCaptureDevice *mic = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
        AVCaptureDeviceInput *micinput = [AVCaptureDeviceInput deviceInputWithDevice:mic error:nil];
        [_session addInput:micinput];

        _captureQueue = dispatch_queue_create("myQueue", DISPATCH_QUEUE_SERIAL);

        AVCaptureVideoDataOutput *videoout = [[AVCaptureVideoDataOutput alloc] init];
        [videoout setSampleBufferDelegate:self queue:_captureQueue];
        NSDictionary *setcapSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                        [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange], kCVPixelBufferPixelFormatTypeKey,
                                        nil];
        videoout.videoSettings = setcapSettings;
        [_session addOutput:videoout];
        _videoConnection = [videoout connectionWithMediaType:AVMediaTypeVideo];
        NSDictionary *actual = videoout.videoSettings;

        AVCaptureAudioDataOutput *audioout = [[AVCaptureAudioDataOutput alloc] init];
        [audioout setSampleBufferDelegate:self queue:_captureQueue];
        [_session addOutput:audioout];
        _audioConnection = [audioout connectionWithMediaType:AVMediaTypeAudio];

        [_session startRunning];
    }
}
And here is my interface:
@interface CameraController () <AVCaptureVideoDataOutputSampleBufferDelegate, AVCaptureAudioDataOutputSampleBufferDelegate>
{
    AVCaptureSession *_session;
    AVCaptureVideoPreviewLayer *_preview;
    dispatch_queue_t _captureQueue;
    AVCaptureConnection *_audioConnection;
    AVCaptureConnection *_videoConnection;
}
And in the app delegate I pause and resume recording like this:
-(void)applicationDidBecomeActive:(UIApplication *)application {
    [[CameraController controller] resumeCapture]; // this only sets _isPaused = NO
}
-(void)applicationWillResignActive:(UIApplication *)application {
    [[CameraController controller] pauseCapture]; // this only sets _isPaused = YES
}
I am at a dead end; thank you for any help.
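One thing worth checking (an assumption on my part, not something the question confirms): playing music can interrupt the capture session's audio, and the session does not always resume on its own. A sketch of observing the interruption notifications from startup and restarting the session:

// In startup, after creating _session:
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(captureInterruptionEnded:)
                                             name:AVCaptureSessionInterruptionEndedNotification
                                           object:_session];

// Elsewhere in CameraController:
- (void)captureInterruptionEnded:(NSNotification *)note
{
    // The session sometimes needs an explicit restart before
    // didOutputSampleBuffer: starts firing again.
    if (!_session.isRunning) {
        [_session startRunning];
    }
}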
I have a single view application in which I am trying to test iOS 7's AVCaptureMetadataOutput, based on this explanation. My ViewController conforms to AVCaptureMetadataOutputObjectsDelegate and the code looks like this (almost exactly the same as Mattt's):
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.

    // Testing the VIN scanner before I make it part of the library
    NSLog(@"Setting up the vin scanner");
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;

    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                        error:&error];
    if (input) {
        [session addInput:input];
    } else {
        NSLog(@"Error: %@", error);
    }

    AVCaptureMetadataOutput *output = [[AVCaptureMetadataOutput alloc] init];
    [output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
    [session addOutput:output];

    [session startRunning];
}
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputMetadataObjects:(NSArray *)metadataObjects
       fromConnection:(AVCaptureConnection *)connection
{
    NSString *code = nil;
    for (AVMetadataObject *metadata in metadataObjects) {
        if ([metadata.type isEqualToString:AVMetadataObjectTypeCode39Code]) {
            code = [(AVMetadataMachineReadableCodeObject *)metadata stringValue];
            break;
        }
    }
    NSLog(@"code: %@", code);
}
When I run this on an iOS 7 device (I've tried an iPhone 4 and an iPhone 4S), Xcode logs "Setting up the vin scanner" but the camera (i.e. the AVCaptureSession) never opens.
Edit 1:
I added the following code to show the camera output on screen:
AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
// Display full screen
previewLayer.frame = self.view.frame;
// Add the video preview layer to the view
[self.view.layer addSublayer:previewLayer];
But the display is very odd: it does not conform to the screen, and the way it rotates does not make sense. The other issue is that when I focus the camera on a barcode, the metadata delegate method is never called.
The camera will not open the way it does for the UIImagePickerController. The problem is that your code does nothing with the output. You'll need to add a preview layer to display the output of the camera as it streams in.
AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
// Display full screen
previewLayer.frame = CGRectMake(0.0, 0.0, self.view.frame.size.width, self.view.frame.size.height);
// Add the video preview layer to the view
[self.view.layer addSublayer:previewLayer];
[session startRunning];
Edit:
After taking a deeper look at your code I noticed a few more issues.
First, you also need to set the metadataObjectTypes you want to search for; right now you're not looking for any valid object types. This should be added after you add the output to the session. You can view the full list of available types in the documentation.
[output setMetadataObjectTypes:@[AVMetadataObjectTypeQRCode, AVMetadataObjectTypeEAN13Code]];
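Since your delegate filters for AVMetadataObjectTypeCode39Code (VINs are typically Code 39 barcodes), you would presumably want that type in the list instead, e.g.:

[output setMetadataObjectTypes:@[AVMetadataObjectTypeCode39Code]]; // matches the check in the delegate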
Second, your AVCaptureSession *session is a local variable in viewDidLoad; take it out and declare it as a property just after your @interface ViewController (), as shown below.
@interface ViewController ()
@property (nonatomic, strong) AVCaptureSession *session;
@end

@implementation ViewController

- (void)viewDidLoad
{
    [super viewDidLoad];

    // Testing the VIN scanner before I make it part of the library
    NSLog(@"Setting up the vin scanner");
    self.session = [[AVCaptureSession alloc] init];
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;

    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                        error:&error];
    if (input) {
        [self.session addInput:input];
    } else {
        NSLog(@"Error: %@", error);
    }

    AVCaptureMetadataOutput *output = [[AVCaptureMetadataOutput alloc] init];
    [output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
    [self.session addOutput:output];
    [output setMetadataObjectTypes:@[AVMetadataObjectTypeQRCode, AVMetadataObjectTypeEAN13Code]];

    [self.session startRunning];
}
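As for the odd rotation mentioned in the edit, that should be fixable on the preview layer's connection (a sketch; portrait orientation assumed):

// Lock the preview to portrait so it no longer rotates oddly,
// and size it to the view.
previewLayer.connection.videoOrientation = AVCaptureVideoOrientationPortrait;
previewLayer.frame = self.view.bounds;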
I'm developing an app which captures photos with my iPad's front camera.
The photos come out very dark.
Does someone have an idea how to fix this issue, please?
Here is my code and some explanations:
1) I initialize my capture session
-(void)viewDidAppear:(BOOL)animated{
    [super viewDidAppear:animated];

    captureSession = [[AVCaptureSession alloc] init];

    NSArray *devices = [AVCaptureDevice devices];
    AVCaptureDevice *frontCamera;
    for (AVCaptureDevice *device in devices){
        if ([device position] == AVCaptureDevicePositionFront) {
            frontCamera = device;
        }
    }

    if ([frontCamera isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure]){
        NSError *error = nil;
        if ([frontCamera lockForConfiguration:&error]){
            frontCamera.exposureMode = AVCaptureExposureModeContinuousAutoExposure;
            frontCamera.focusMode = AVCaptureFocusModeAutoFocus;
            [frontCamera unlockForConfiguration];
        }
    }

    NSError *error = nil;
    AVCaptureDeviceInput *frontFacingCameraDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:frontCamera error:&error];
    [captureSession addInput:frontFacingCameraDeviceInput];
    [captureSession setSessionPreset:AVCaptureSessionPresetHigh];

    captureVideoOutput = [[AVCaptureVideoDataOutput alloc] init];
    captureImageOutput = [[AVCaptureStillImageOutput alloc] init];
    [captureSession addOutput:captureVideoOutput];
    [captureSession addOutput:captureImageOutput];
}
2) When the user presses the Record button, it starts a timer and previews the camera content in a preview layer
- (IBAction)but_record:(UIButton *)sender {
    MainInt = 4;
    timer = [NSTimer scheduledTimerWithTimeInterval:1.0 target:self selector:@selector(countup) userInfo:nil repeats:YES];

    previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:captureSession];
    previewLayer.connection.videoOrientation = AVCaptureVideoOrientationPortrait;
    CGRect rect = CGRectMake(0, 0, self.aView.bounds.size.width, self.aView.bounds.size.height);
    previewLayer.frame = rect;
    [self.aView.layer addSublayer:previewLayer];

    [captureSession startRunning];
}
3) At the end of the timer, the photo is taken and saved
- (void)countup {
    MainInt -= 1;
    if (MainInt == 0) {
        [timer invalidate];
        timer = nil;

        [captureSession stopRunning];

        AVCaptureConnection *videoConnection = nil;
        for (AVCaptureConnection *connection in captureImageOutput.connections)
        {
            for (AVCaptureInputPort *port in [connection inputPorts])
            {
                if ([[port mediaType] isEqual:AVMediaTypeVideo])
                {
                    videoConnection = connection;
                    break;
                }
            }
            if (videoConnection) { break; }
        }

        [captureImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
        {
            CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
            stillImage = [[UIImage alloc] initWithData:imageData];
        }];

        [captureSession startRunning];
        [captureSession stopRunning];
    }
}
4) Finally, when the user presses the save button, the image is saved to a specific album
- (IBAction)but_save:(UIButton *)sender {
    UIImage *img = stillImage;
    [self.library saveImage:img toAlbum:@"mySpecificAlbum" withCompletionBlock:^(NSError *error) {
        // handle any save error here
    }];
}
In fact, all the code works properly but the resulting images are very dark...
This was happening to me as well and it turned out I was trying to capture too soon and the camera didn't have enough time to stabilize. I had to add about 0.5 seconds of delay before the pictures would be normal brightness.
HTH
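A sketch of what that looks like, using the names from the question's countup method (0.5 s is just the value that worked for me and may need tuning):

// Give the camera ~0.5 s to settle its auto-exposure before capturing.
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.5 * NSEC_PER_SEC)),
               dispatch_get_main_queue(), ^{
    [captureImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
        completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        stillImage = [[UIImage alloc] initWithData:imageData];
    }];
});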
I had the same issue on a 5th generation iPod touch with iOS 7, but not on a 4th generation iPod touch with iOS 6.1.
I found that the fix is to show a preview of the camera:
// Setup camera preview image
AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_captureSession];
previewLayer.frame = _previewImage.bounds; // size the layer (assumption: match the host view) so the preview is visible
[_previewImage.layer addSublayer:previewLayer];
As instructed at https://developer.apple.com/library/ios/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/04_MediaCapture.html#//apple_ref/doc/uid/TP40010188-CH5-SW22
Note: I did not investigate accomplishing this without a preview