AVCaptureDevice is always null on the simulator - iOS

I am trying to capture live microphone audio data.
I took the following from Apple's example for AVCaptureSession:
AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
AVCaptureDevice *audioCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
NSError *error = nil;
AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioCaptureDevice error:&error];
if (audioInput) {
    [captureSession addInput:audioInput];
}
else {
    // Handle the failure.
    NSLog(@"ERROR");
}
audioCaptureDevice and audioInput are both null.

Yes, it should be, because the simulator doesn't have a microphone. You should always test any audio, video, or rendering related task on a real device.
Take a look at the Limitations of Testing in iOS Simulator:
Hardware Limitations
While most of the functionality of iOS devices can be simulated in iOS Simulator, there are some hardware features that must be tested directly on a device. The hardware features that cannot be simulated are:
Accelerometer
Gyroscope
Camera
Proximity Sensor
Microphone Input

The simulator cannot use the Mac's microphone as a source. You need a real device to test that.

The simulator has no mic and camera, so check for the device before proceeding:
if let captureDevice = AVCaptureDevice.default(for: AVMediaType.audio) {
    // captureDevice is non-nil here; safe to build the input
}
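For Objective-C callers, the equivalent guard is a plain nil check on the device before building the input. A minimal sketch of the same idea, assuming an existing captureSession as in the question:
AVCaptureDevice *audioCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
if (audioCaptureDevice) {
    NSError *error = nil;
    AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioCaptureDevice error:&error];
    if (audioInput) {
        [captureSession addInput:audioInput];
    }
} else {
    // nil on the simulator (no microphone hardware to return)
    NSLog(@"No audio capture device available");
}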

Related

How to use a video bitstream from an external accessory for preview?

In iOS, to preview video, I found that I should use AVCaptureVideoPreviewLayer with an instance of AVCaptureSession. For example:
AVCaptureSession *captureSession = <#Get a capture session#>;
AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:captureSession];
UIView *aView = <#The view in which to present the layer#>;
previewLayer.frame = aView.bounds;
[aView.layer addSublayer:previewLayer];
And AVCaptureSession needs some AVCaptureDevices and AVCaptureDeviceInputs. For example:
AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
AVCaptureDevice *audioCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
NSError *error = nil;
AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioCaptureDevice error:&error];
if (audioInput) {
    [captureSession addInput:audioInput];
}
else {
    // Handle the failure.
}
I referred to Apple's developer documentation for the above examples.
But whether the devices are audio or video, all the examples use only the built-in camera and mic of the iPhone/iPad.
My project doesn't use the built-in camera and mic but an external accessory which supports MP4 and is already MFi compliant.
I've already tested MFi authentication and identification, and the MP4 bitstream arriving at the iPhone via the External Accessory framework.
But I have no idea how I can use the bitstream from the external accessory (instead of the built-in camera and mic) to display a preview in a UIView on the iPhone.
Is there any expert on this kind of problem?
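One possible direction (an assumption on my part, not a confirmed solution): AVCaptureVideoPreviewLayer only renders from an AVCaptureSession and its built-in devices, so an externally sourced stream would instead have to be decoded into CMSampleBuffers and fed to an AVSampleBufferDisplayLayer (iOS 8+). A sketch, assuming you already demux/decode the accessory's MP4 bitstream into CMSampleBufferRefs (e.g., via VideoToolbox):
AVSampleBufferDisplayLayer *displayLayer = [[AVSampleBufferDisplayLayer alloc] init];
displayLayer.frame = aView.bounds;
displayLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[aView.layer addSublayer:displayLayer];

// Called for each decoded frame from the accessory:
void enqueueFrame(AVSampleBufferDisplayLayer *layer, CMSampleBufferRef sampleBuffer) {
    if (layer.isReadyForMoreMediaData) {
        [layer enqueueSampleBuffer:sampleBuffer];
    }
}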

App-recorded videos sometimes play no audio when played on the phone

Our app records and plays (those) videos, but somehow some of the videos are played without sound.
If I copy the video files to the Mac via iTunes, it plays the videos with sound, so the videos do have sound.
I checked the videos with a tool (called GSpot) and they all have the same audio codec and bitrate.
I tried about everything I found on SO, but some of the videos don't play audio; the code is always the same, and the videos do have sound, as you can hear it on a Mac.
This is how I set up the player:
AVAudioSession *audioSession = [AVAudioSession sharedInstance];
NSError *setCategoryError = nil;
[audioSession setActive:YES error:&setCategoryError];
[audioSession setCategory:AVAudioSessionCategoryPlayback error:&setCategoryError];
_mpviewController = [[MPMoviePlayerViewController alloc] initWithContentURL:[NSURL fileURLWithPath:moviepath]];
_mpviewController.moviePlayer.movieSourceType = MPMovieSourceTypeFile;
_mpviewController.moviePlayer.controlStyle = MPMovieControlStyleNone;
[_mpviewController.moviePlayer setScalingMode:MPMovieScalingModeAspectFill];
_mpviewController.moviePlayer.shouldAutoplay = YES;
[_mpviewController.moviePlayer prepareToPlay];
_mpviewController.view.frame = CGRectMake(0, 0, _screenWidth, _screenHeight);
[self addSubview:_mpviewController.view];
Some time later, on a button press, it starts playing.
The recording settings:
[_session beginConfiguration];
[_session setSessionPreset:AVCaptureSessionPreset640x480];
AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if (videoDevice == nil) {
    assert(0);
}
BOOL lock = [videoDevice lockForConfiguration:&error];
if (lock) videoDevice.focusMode = AVCaptureFocusModeContinuousAutoFocus;
[videoDevice unlockForConfiguration];
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
if (error) {
    assert(0);
}
[_session addInput:input];
AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:nil];
if (audioInput != nil) {
    [_session addInput:audioInput];
}
[_session commitConfiguration];
I don't know if it is related, but the audio seems not to play on videos that had a different view overlaid during the recording. It is possible to overlay a fullscreen view over the video preview while recording; it records fine, the file itself has audio, and the videos play fine on a PC/Mac. Only on the device, within our app, do those videos (again, not sure if related) have no audio. The app uses the same code for every video, so I can rename them, swap them, etc.; the behaviour has something to do with the audio channel of the video.
edit
Additional observation:
For the videos where there is no sound, QuickTime (only on the Mac) also plays no audio. QT on Windows does play audio, and all other players on Windows play audio as well.
Somehow the phone corrupts its own recording so that afterwards it doesn't recognize the audio channel it just recorded.
edit 2
iPhone 6, iOS 8.1 beta 2 -> as described
iPhone 4S, iOS 7.1.2 -> NO issue
iPad mini, iOS 8.0.2 -> always has the issue; not one video's audio channel can be read, but it always exists and Windows can play the audio
I want to wrap this up.
The file was saved as ".mp4", and this causes the sporadic error.
Saving the file as ".mov" works - same codec etc., but without errors.
NSString *path = [[documentsDirectoryPath stringByAppendingPathComponent:
                   [NSString stringWithFormat:@"/%@", name]] stringByAppendingString:@".mov"];
outputURL = [[NSURL alloc] initFileURLWithPath:path];
[mAVCaptureMovieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];
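Not part of the original answer, but if an .mp4 file is ultimately required, one hedged follow-up is to record to .mov and then remux with AVAssetExportSession; whether passthrough into an MP4 container succeeds depends on the recorded codecs, hence the supportedFileTypes check. The mp4URL destination below is hypothetical:
// Remux the recorded .mov into an .mp4 container after recording finishes.
AVAsset *asset = [AVAsset assetWithURL:outputURL];
AVAssetExportSession *exporter =
    [[AVAssetExportSession alloc] initWithAsset:asset
                                     presetName:AVAssetExportPresetPassthrough];
if ([exporter.supportedFileTypes containsObject:AVFileTypeMPEG4]) {
    exporter.outputFileType = AVFileTypeMPEG4;
    exporter.outputURL = mp4URL; // hypothetical destination URL
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        NSLog(@"export status: %ld", (long)exporter.status);
    }];
}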

Where is the audio SampleBuffer in the OpenTok / TokBox iOS SDK?

I am using the OpenTok iOS SDK to stream from an iPhone to Chrome. What I would like to do is record a high-res version of the video while streaming.
Using a custom video capturer via the OTVideoCapture interface from Example 2, "Let's Build OTPublisher", I can successfully record the video sample buffer to a file. The problem is, I cannot find any reference to the audio data gathered from the microphone.
I assume an audio input (AVCaptureDeviceInput) feeding an audio output (AVCaptureAudioDataOutput) via AVCaptureAudioDataOutputSampleBufferDelegate is used somewhere.
Does anyone know how to access it from the OpenTok iOS SDK?
In captureOutput:didOutputSampleBuffer:fromConnection:, the fromConnection parameter will differentiate the audio and video connections and provide the corresponding buffer.
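A minimal sketch of such a delegate method (an illustration, not OpenTok's internal code); here the buffers are told apart by the class of the output object rather than by connection:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    if ([captureOutput isKindOfClass:[AVCaptureAudioDataOutput class]]) {
        // audio sample buffer from the microphone
    } else if ([captureOutput isKindOfClass:[AVCaptureVideoDataOutput class]]) {
        // video sample buffer from the camera
    }
}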
To set up the audio input/output, you can try the following in Let-Build-OTPublisher's initCapture method:
//add audio input / outputs
AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
_audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:nil];
if ([_captureSession canAddInput:_audioInput])
{
    NSLog(@"added audio device input");
    [_captureSession addInput:_audioInput];
}
_audioOutput = [[AVCaptureAudioDataOutput alloc] init];
if ([_captureSession canAddOutput:_audioOutput])
{
    NSLog(@"audio output added");
    [_captureSession addOutput:_audioOutput];
}
[_audioOutput setSampleBufferDelegate:self queue:_capture_queue];

Camera not focusing on iPhone 4, running iOS 7.1

We are having trouble after the iOS upgrade from 7.0.6 to 7.1.0. I don't see this issue on the iPhone 4S, 5, 5C, or 5S running iOS 7.1. So much for all the non-fragmentation talk. I am posting the camera initialization code:
- (void)initCapture
{
    //Setting up the AVCaptureDevice (camera)
    AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *cameraError;
    if ([inputDevice lockForConfiguration:&cameraError])
    {
        if ([inputDevice isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus])
        {
            NSLog(@"AVCaptureDevice is set to video with continuous auto focus");
            CGPoint autofocusPoint = CGPointMake(0.5f, 0.5f);
            [inputDevice setFocusPointOfInterest:autofocusPoint];
            [inputDevice setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
        }
        [inputDevice unlockForConfiguration];
    }
    //setting up the input streams
    AVCaptureDeviceInput *captureInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:nil];
    //setting up the AVCaptureVideoDataOutput
    AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];
    captureOutput.alwaysDiscardsLateVideoFrames = YES;
    [captureOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
    //setting up video settings
    NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
    NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
    //passing the settings to the AVCaptureVideoDataOutput
    [captureOutput setVideoSettings:videoSettings];
    //setting up the AVCaptureSession
    captureSession = [[AVCaptureSession alloc] init];
    captureSession.sessionPreset = AVCaptureSessionPresetMedium;
    [captureSession addInput:captureInput];
    [captureSession addOutput:captureOutput];
    if (!prevLayer)
    {
        prevLayer = [AVCaptureVideoPreviewLayer layerWithSession:captureSession];
    }
    NSLog(@"initCapture preview Layer %p %@", self.prevLayer, self.prevLayer);
    self.prevLayer.frame = self.view.bounds;
    self.prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.view.layer addSublayer:self.prevLayer];
    [self.captureSession startRunning];
}
Any help would be greatly appreciated...
The code provided by Apple that you are using is outdated - they have fully rewritten it now. I'd try my luck and go for the new workflow.
Check it out here.
To close this thread up: we were using the camera for scanning QR codes with libzxing. We decided to implement the native iOS 7.0 AVCaptureMetadataOutputObjectsDelegate instead of the older AVCaptureVideoDataOutputSampleBufferDelegate. The metadata delegate is much simpler and cleaner, and we found the example at http://nshipster.com/ios7/ very helpful.
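For reference, a minimal sketch of that metadata-based approach (assuming a captureSession that already has a video input; this is not the poster's exact code):
AVCaptureMetadataOutput *metadataOutput = [[AVCaptureMetadataOutput alloc] init];
if ([captureSession canAddOutput:metadataOutput]) {
    [captureSession addOutput:metadataOutput];
    [metadataOutput setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
    // metadataObjectTypes must be set after the output is added to the session
    metadataOutput.metadataObjectTypes = @[AVMetadataObjectTypeQRCode];
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputMetadataObjects:(NSArray *)metadataObjects
       fromConnection:(AVCaptureConnection *)connection
{
    for (AVMetadataMachineReadableCodeObject *code in metadataObjects) {
        if ([code.type isEqualToString:AVMetadataObjectTypeQRCode]) {
            NSLog(@"QR payload: %@", code.stringValue);
        }
    }
}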
Here are some ideas to diagnose your problem (a consolidated sketch follows this list):
You have no else case for if ([inputDevice lockForConfiguration:&cameraError]). Add one.
In the else case, log the error contained in cameraError.
You have no else case for if ([inputDevice isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus]). Add one; log that, or add a breakpoint there to test in your debugging.
You don't check the return value of the focusPointOfInterestSupported property before attempting setFocusPointOfInterest.
Consider calling setFocusMode before setFocusPointOfInterest (not sure if it matters, but that's what I have).
In general, you may want to do all your checks before attempting to lock the configuration.
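Putting those checks together, the configuration block might look like this (a sketch of the suggestions above, reusing the inputDevice from the question, not the poster's code):
NSError *cameraError = nil;
if ([inputDevice lockForConfiguration:&cameraError]) {
    if ([inputDevice isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus]) {
        [inputDevice setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
    } else {
        NSLog(@"continuous autofocus not supported on this device");
    }
    if ([inputDevice isFocusPointOfInterestSupported]) {
        [inputDevice setFocusPointOfInterest:CGPointMake(0.5f, 0.5f)];
    } else {
        NSLog(@"focus point of interest not supported");
    }
    [inputDevice unlockForConfiguration];
} else {
    NSLog(@"lockForConfiguration failed: %@", cameraError);
}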
Following neuman8's comment stating that something in libzxing is preventing the refocus, I did some investigating myself.
I found the following line in the Decoder.mm file to be the culprit:
ArrayRef<char> subsetData (subsetBytesPerRow * subsetHeight);
It seems that ArrayRef is a class in the zxing/common/Array.h file that attempts to allocate an array of the specified size. It did not seem to do anything wrong, but I guessed that allocating an array of about 170k char elements may take some time and be the culprit, slowing down the blocking call enough to prevent other threads from running.
So, I tried to put in a brute-force solution to test the hypothesis. I added a sleep just after the allocation:
[NSThread sleepForTimeInterval:0.02];
The camera started focusing again and was able to decipher the QR codes.
I am still unable to find a better way to resolve this. Is there anyone who is able to figure out a more efficient allocation of the large array, or who has a more elegant way of yielding the thread for the camera focus? Otherwise this should solve the problem for now, even if it is ugly.

iOS Audio Recording: How to Check if the Mic / Playback is Busy Before Taking the Mic

If anything is playing or recording, how do we check whether the mic is available (idle) for recording? Currently using:
AVCaptureDevice *audioCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioCaptureDevice error:&error];
AVCaptureAudioDataOutput *audioOutput = [[AVCaptureAudioDataOutput alloc] init];
[captureSession addInput:audioInput];
[captureSession addOutput:audioOutput];
[captureSession startRunning];
We need to check before grabbing the mic/playback from something that already has it.
The mic device cannot be busy, and access to it cannot be locked; even if you call [AVCaptureDevice lockForConfiguration:] on a mic device, it will not lock it, and it remains accessible to the foreground application.
To see if other audio is playing, you can check kAudioSessionProperty_OtherAudioIsPlaying, e.g.:
UInt32 audioIsAlreadyPlaying = 0;
UInt32 propertySize = sizeof(UInt32);
AudioSessionGetProperty(kAudioSessionProperty_OtherAudioIsPlaying, &propertySize, &audioIsAlreadyPlaying);
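As a side note (not in the original answer), the C Audio Session API is deprecated; the equivalent check through AVAudioSession, available since iOS 6, would be something like:
// Equivalent check using AVAudioSession instead of the deprecated C API
if ([AVAudioSession sharedInstance].otherAudioPlaying) {
    // another app is currently playing audio
}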
Additionally, the Audio Session Programming Guide states: "There is no programmatic way to ensure that an audio session is never interrupted. The reason is that iOS always gives priority to the phone. iOS also gives high priority to certain alarms and alerts."
