Use Bluetooth device to record audio in AVCaptureSession - iOS

I am using Apple's RosyWriter sample to record video and audio. Now I need to record audio using a Bluetooth headset, but it is not working for me. This is what I am doing:
captureSession = [[AVCaptureSession alloc] init];
/*
* Create audio connection
*/
if(SYSTEM_VERSION_GREATER_THAN_OR_EQUAL_TO(@"7.0")){
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord withOptions:AVAudioSessionCategoryOptionAllowBluetooth error:nil];
captureSession.usesApplicationAudioSession = true;
captureSession.automaticallyConfiguresApplicationAudioSession = true;
}
AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
// AVCaptureDeviceInput *audioIn = [[AVCaptureDeviceInput alloc] initWithDevice:[self audioDevice] error:nil];
AVCaptureDeviceInput *audioIn = [[AVCaptureDeviceInput alloc] initWithDevice:audioDevice error:nil];
if ([captureSession canAddInput:audioIn])
[captureSession addInput:audioIn];
I have also followed this question

I was able to do this by changing the audio output settings as below.
_audioCompressionSettings = [[audioOut recommendedAudioSettingsForAssetWriterWithOutputFileType:AVFileTypeQuickTimeMovie] copy];
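For context, here is a minimal sketch (not from the original post) of how those recommended settings are typically handed to the asset writer; audioOut is the AVCaptureAudioDataOutput already attached to the session, and assetWriter is an AVAssetWriter assumed to be created elsewhere in your code.
// Ask the audio output for settings matched to the output container.
NSDictionary *audioSettings =
    [[audioOut recommendedAudioSettingsForAssetWriterWithOutputFileType:AVFileTypeQuickTimeMovie] copy];

// Feed those settings to an asset writer input for the audio track.
AVAssetWriterInput *audioWriterInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                       outputSettings:audioSettings];
audioWriterInput.expectsMediaDataInRealTime = YES;

if ([assetWriter canAddInput:audioWriterInput]) {
    [assetWriter addInput:audioWriterInput];
}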

Related

iOS get Camera orientation

On iOS, is it possible to get the camera orientation (in degrees) like on Android?
Camera.CameraInfo info = new Camera.CameraInfo();
Camera.getCameraInfo(cameraId, info);
int degrees = info.orientation;
I tried
NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
for(AVCaptureDevice *camera in devices) {
AVCaptureDeviceInput *deviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:camera error:nil];
AVCaptureSession *session = [[AVCaptureSession alloc] init];
[session addInput:deviceInput];
AVCaptureMetadataOutput *output = [[AVCaptureMetadataOutput alloc] init];
[session addOutput:output];
AVCaptureConnection *connection = [output connectionWithMediaType:AVMediaTypeMetadata];
NSLog(@"%d", (int)connection.videoOrientation);
}
but it returns "0" for all devices.
Thanks
AVCaptureConnection has the videoOrientation property.
You can also try using the device orientation if you need to transform the photo output somehow; here you can find some instructions on how to work with the device orientation if needed.
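If you go the device-orientation route, here is a minimal sketch (assuming you already have an AVCaptureConnection named connection from a running session) of mapping UIDeviceOrientation to AVCaptureVideoOrientation:
UIDeviceOrientation deviceOrientation = [UIDevice currentDevice].orientation;
AVCaptureVideoOrientation videoOrientation;
switch (deviceOrientation) {
    case UIDeviceOrientationPortraitUpsideDown:
        videoOrientation = AVCaptureVideoOrientationPortraitUpsideDown;
        break;
    case UIDeviceOrientationLandscapeLeft:
        // The landscape cases are intentionally swapped: the two enums
        // define left/right from opposite points of view.
        videoOrientation = AVCaptureVideoOrientationLandscapeRight;
        break;
    case UIDeviceOrientationLandscapeRight:
        videoOrientation = AVCaptureVideoOrientationLandscapeLeft;
        break;
    default:
        videoOrientation = AVCaptureVideoOrientationPortrait;
        break;
}
if (connection.videoOrientationSupported) {
    connection.videoOrientation = videoOrientation;
}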

Camera takes dark image with iPhone

I am working on a media app and using AVCapture for capturing images. Everything is working fine, but I am facing an issue: when I take 5 images in a row with the camera, the first picture usually comes out darker than the others.
My code for capture is as below:
[device lockForConfiguration:nil];
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetMedium;
[session beginConfiguration];
// Create device input and add to current session
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error: nil];
[session addInput:input];
// Create video output and add to current session
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
[session addOutput:output];
Does anyone know where my issue comes from?
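As an aside (not an answer to the exposure question itself), the snippet above calls lockForConfiguration and beginConfiguration without the matching unlock/commit calls; lockForConfiguration is only needed when changing device-level properties such as focus or exposure. A balanced sketch of the same session setup, using the question's variable names, would look like this:
AVCaptureSession *session = [[AVCaptureSession alloc] init];
[session beginConfiguration];
session.sessionPreset = AVCaptureSessionPresetMedium;

// Create device input and add to current session
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil];
if ([session canAddInput:input]) {
    [session addInput:input];
}

// Create video output and add to current session
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
if ([session canAddOutput:output]) {
    [session addOutput:output];
}

// Commit the batched changes and start the session;
// the original snippet never calls commitConfiguration.
[session commitConfiguration];
[session startRunning];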

AVCaptureSession addInput causing glitch in background audio

I'm making a video-capturing iOS app and I want to be able to record audio from the microphone while allowing background music to play. I can do all of this, but the background audio skips (pauses briefly) whenever the view with the camera enters or exits the foreground. I have isolated the bug to AVCaptureSession addInput:
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.automaticallyConfiguresApplicationAudioSession = NO;
AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
AVCaptureDeviceInput *audioDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:nil];
// this line causes the background music to skip
[session addInput:audioDeviceInput];
How can I prevent adding microphone input from affecting the background audio?
FYI - in didFinishLaunchingWithOptions I set the AVAudioSession category:
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord
withOptions:AVAudioSessionCategoryOptionMixWithOthers | AVAudioSessionCategoryOptionDefaultToSpeaker
error:nil];
Apparently there is no workaround.
https://forums.developer.apple.com/message/74778#74778

Xcode Barcode Scanner PhoneGap Auto Focus

I am developing a PhoneGap iOS application and I am using the ZXing barcode scanner library, but I have a problem:
How do I implement camera auto focus?
Thank you.
My Code:
-(NSString*)setUpCaptureSession {
NSError* error = nil;
AVCaptureSession* captureSession = [[[AVCaptureSession alloc] init] autorelease];
self.captureSession = captureSession;
AVCaptureDevice* device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if (!device) return @"unable to obtain video capture device";
AVCaptureDeviceInput* input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (!input) return @"unable to obtain video capture device input";
AVCaptureVideoDataOutput* output = [[[AVCaptureVideoDataOutput alloc] init] autorelease];
if (!output) return @"unable to obtain video capture output";
NSDictionary* videoOutputSettings = [NSDictionary
dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
forKey:(id)kCVPixelBufferPixelFormatTypeKey
];
output.alwaysDiscardsLateVideoFrames = YES;
output.videoSettings = videoOutputSettings;
[output setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
if (![captureSession canSetSessionPreset:AVCaptureSessionPresetMedium]) {
return #"unable to preset medium quality video capture";
}
captureSession.sessionPreset = AVCaptureSessionPresetMedium;
if ([captureSession canAddInput:input]) {
[captureSession addInput:input];
}
else {
return #"unable to add video capture device input to session";
}
if ([captureSession canAddOutput:output]) {
[captureSession addOutput:output];
}
else {
return #"unable to add video capture output to session";
}
// setup capture preview layer
self.previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:captureSession];
// run on next event loop pass [captureSession startRunning]
[captureSession performSelector:@selector(startRunning) withObject:nil afterDelay:0];
return nil;
}
Unfortunately it appears that the plugin you are using doesn't expose the capture device directly. It does, however, expose the AVCaptureSession via the captureSession property. From this property you should be able to work backwards to get the AVCaptureDeviceInput:
AVCaptureSession *session = [zxing captureSession]; // Assuming zxing is the variable holding a reference to your ZXing instance
NSArray *inputs= [session inputs];
AVCaptureDeviceInput *input = (AVCaptureDeviceInput *)inputs[0]; // Obtain the first input device
AVCaptureDevice *device=input.device;
NSError *error;
if ([device lockForConfiguration:&error])
{
device.focusMode=AVCaptureFocusModeContinuousAutoFocus;
[device unlockForConfiguration];
}
else
{
// TODO Handle the device lock error
}
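As a small refinement (a sketch against the same device variable, not part of the original answer), it may be worth checking that the device actually supports continuous autofocus before assigning it inside the lockForConfiguration block, since fixed-focus cameras (for example many front cameras) do not support it:
// Goes inside the lockForConfiguration block shown above.
if ([device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus])
{
    device.focusMode = AVCaptureFocusModeContinuousAutoFocus;
}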

Camera feed slow to load with AVCaptureSession on iOS - How can I speed it up?

Right now I'm trying to allow users to take pictures in my app without using UIImagePickerController. I'm using AVCaptureSession and all the related classes to load a camera feed as a sublayer on a full-screen view I have on one of my view controllers. The code works, but unfortunately the camera is very slow to load; it usually takes 2-3 seconds. Here is my code:
session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetMedium;
if ([session canSetSessionPreset:AVCaptureSessionPresetHigh])
//Check size based configs are supported before setting them
[session setSessionPreset:AVCaptureSessionPresetHigh];
[session setSessionPreset:AVCaptureSessionPreset1280x720];
CALayer *viewLayer = self.liveCameraFeed.layer;
//NSLog(#"viewLayer = %#", viewLayer);
AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
captureVideoPreviewLayer.frame = viewLayer.bounds;
[viewLayer addSublayer:captureVideoPreviewLayer];
AVCaptureDevice *device;
if(isFront)
{
device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
}
else
{
device = [self frontCamera];
}
AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
AVCaptureDeviceInput * audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:nil];
[session addInput:audioInput];
NSError *error = nil;
input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (!input) {
// Handle the error appropriately.
//NSLog(#"ERROR: trying to open camera: %#", error);
}
[session addInput:input];
[session startRunning];
stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImageOutput setOutputSettings:outputSettings];
[session addOutput:stillImageOutput];
Is there any way to speed it up? I've already tried loading it on another thread using Grand Central Dispatch and NSThread, and though that stopped the app from freezing, it made the camera take even longer to load. Any help is appreciated.
In my case, I needed to wait for the session to start running:
// `queue` is a background dispatch queue created elsewhere (pre-Swift 3 GCD syntax)
dispatch_async(queue) {
self.session.startRunning()
dispatch_async(dispatch_get_main_queue()) {
self.delegate?.cameraManDidStart(self)
let layer = AVCaptureVideoPreviewLayer(session: self.session)
}
}
Waiting for AVCaptureSession's startRunning was my solution too. You can call startRunning on a global queue and then add your AVCaptureVideoPreviewLayer on the main thread.
Swift 4 sample:
DispatchQueue.global().async {
self.captureSession.startRunning()
DispatchQueue.main.async {
let videoPreviewLayer = AVCaptureVideoPreviewLayer(session: self.captureSession)
}
}
You can load the AVCaptureSession in viewWillAppear. It works for me: when I switch to the view with the AVCaptureSession from another view, I see the camera running immediately.
For anyone interested, the solution I came up with was preloading the camera on a different thread and keeping it open.
I tried all the above methods but it was not as good as Instagram or Facebook, so I loaded the AVCaptureDevice, AVCaptureVideoPreviewLayer, and AVCaptureSession in the parent screen and passed them as parameters to the child screen. It loaded very rapidly.
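Putting the last few answers together, here is a rough sketch (assuming a captureSession that is already configured and a previewLayer property; names are illustrative and not taken from any of the answers above) of preloading in viewWillAppear and starting the session off the main thread:
- (void)viewWillAppear:(BOOL)animated
{
    [super viewWillAppear:animated];
    // Start the already configured session on a background queue so the
    // main thread is not blocked while the camera starts up.
    dispatch_async(dispatch_get_global_queue(QOS_CLASS_USER_INITIATED, 0), ^{
        if (![self.captureSession isRunning]) {
            [self.captureSession startRunning];
        }
        dispatch_async(dispatch_get_main_queue(), ^{
            // Attach (or reveal) the preview layer only once the session is running.
            self.previewLayer.session = self.captureSession;
        });
    });
}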
