Where is the audio SampleBuffer in the OpenTok (TokBox) iOS SDK? - iOS

I am using the OpenTok iOS SDK to stream from an iPhone to Chrome. What I would like to do is record a high-res version of the video while streaming.
Using a custom video capturer via the OTVideoCapture interface from "Example 2: Let's Build OTPublisher", I can successfully record the video sample buffers to a file. The problem is, I cannot find any reference to the audio data gathered from the microphone.
I assume an audio input (AVCaptureDeviceInput) feeding an audio output (AVCaptureAudioDataOutput) through AVCaptureAudioDataOutputSampleBufferDelegate is used somewhere.
Does anyone know how to access it from the OpenTok iOS SDK?

In captureOutput:didOutputSampleBuffer:fromConnection:, the fromConnection parameter differentiates the audio and video connections and provides the corresponding buffer.
To set up the audio input/output, you can try the following in the Let's-Build-OTPublisher example's initCapture method:
// Add audio input/output alongside the existing video capture.
AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
NSError *error = nil;
_audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];
if ([_captureSession canAddInput:_audioInput]) {
    NSLog(@"added audio device input");
    [_captureSession addInput:_audioInput];
}

_audioOutput = [[AVCaptureAudioDataOutput alloc] init];
if ([_captureSession canAddOutput:_audioOutput]) {
    NSLog(@"audio output added");
    [_captureSession addOutput:_audioOutput];
}
// Reuse the capture queue so audio and video arrive on the same serial queue.
[_audioOutput setSampleBufferDelegate:self queue:_capture_queue];
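Once both outputs share the delegate, a minimal sketch of telling the buffers apart (comparing against _audioOutput is the simplest equivalent of inspecting the fromConnection argument; the recording steps themselves are placeholders):
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    if (captureOutput == _audioOutput) {
        // Microphone sample buffer: append it to your high-res recording,
        // e.g. via an AVAssetWriterInput configured for audio.
    } else {
        // Camera sample buffer: the existing video path (OTVideoCapture
        // frame delivery plus your video recording).
    }
}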

Related

How to use any video bitstream from external accessory for preview?

In iOS, to preview video, I found that I should use AVCaptureVideoPreviewLayer with an instance of AVCaptureSession. For example:
AVCaptureSession *captureSession = <#Get a capture session#>;
AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:captureSession];
UIView *aView = <#The view in which to present the layer#>;
previewLayer.frame = aView.bounds;
[aView.layer addSublayer:previewLayer];
And AVCaptureSession needs some AVCaptureDevice and AVCaptureDeviceInput instances. For example:
AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
AVCaptureDevice *audioCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
NSError *error = nil;
AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioCaptureDevice error:&error];
if (audioInput) {
    [captureSession addInput:audioInput];
} else {
    // Handle the failure.
}
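For an actual preview you also need a video input and a running session; a minimal sketch continuing the snippets above (same pattern, built-in camera assumed):
AVCaptureDevice *videoCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoCaptureDevice error:&error];
if (videoInput && [captureSession canAddInput:videoInput]) {
    [captureSession addInput:videoInput];
}
// Frames now flow to any attached AVCaptureVideoPreviewLayer.
[captureSession startRunning];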
I referred to the Apple developer documentation for the above examples.
But whether the devices are audio or video, all the examples use only the built-in camera and mic of the iPhone/iPad.
My project doesn't use the built-in camera and mic, but an external accessory that supplies MP4 and is already MFi compliant.
I've already tested MFi authentication and identification, and the MP4 bitstream arriving on the iPhone via the External Accessory framework.
But I have no idea how I can use the bitstream from the external accessory (instead of the built-in camera and mic) to display a preview in a UIView on the iPhone.
Is there any expert in this kind of problem?
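No authoritative answer here, but one direction worth sketching: AVCaptureSession only fronts built-in devices, so you would demux/decode the accessory's MP4 bitstream yourself and hand CMSampleBuffers to an AVSampleBufferDisplayLayer (iOS 8+), which plays the role AVCaptureVideoPreviewLayer plays for the camera. How the sample buffers are built from the External Accessory stream is assumed, not shown:
// Present CMSampleBuffers from a non-camera source on screen.
AVSampleBufferDisplayLayer *displayLayer = [[AVSampleBufferDisplayLayer alloc] init];
displayLayer.frame = aView.bounds;
displayLayer.videoGravity = AVLayerVideoGravityResizeAspect;
[aView.layer addSublayer:displayLayer];

// Later, whenever your accessory code has wrapped a video sample (with its
// CMVideoFormatDescription) in a CMSampleBufferRef named sampleBuffer:
if (displayLayer.isReadyForMoreMediaData) {
    [displayLayer enqueueSampleBuffer:sampleBuffer];
}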

Multiple AVCaptureVideoDataOutputs for a single AVCaptureDevice at the same time

Scenario
I am working on an application that does video processing and streaming. I already have video capture from the back camera streaming perfectly. The problem is that I also have to process the video data, but only locally. As it turns out, the API I am using for the local video processing requires a different pixel format than the APIs I am using to stream the data to my server. It seems I need two separate sessions capturing video from the back camera simultaneously, which would allow one session to do the processing and one to do the streaming.
Problem
Every time I attempt to create a new session to use the same AVCaptureDevice (back), my streaming immediately stops. Code below:
captureSession = [[AVCaptureSession alloc] init];
AVCaptureDeviceInput *videoIn = [[AVCaptureDeviceInput alloc]
    initWithDevice:[self videoDeviceWithPosition:AVCaptureDevicePositionBack]
             error:nil];
if ([captureSession canAddInput:videoIn]) {
    [captureSession addInput:videoIn];
}

AVCaptureVideoDataOutput *videoOut = [[AVCaptureVideoDataOutput alloc] init];
[videoOut setAlwaysDiscardsLateVideoFrames:YES];
[videoOut setVideoSettings:
    @{(id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA)}];
dispatch_queue_t videoCaptureQueue =
    dispatch_queue_create("Video Process Queue", DISPATCH_QUEUE_SERIAL);
[videoOut setSampleBufferDelegate:self queue:videoCaptureQueue];
if ([captureSession canAddOutput:videoOut]) {
    [captureSession addOutput:videoOut];
}
I receive the interruption reason videoDeviceInUseByAnotherClient:
videoDeviceInUseByAnotherClient: An interruption caused by the video device temporarily being made unavailable (for example, when used by another capture session).
I have also tried adding the output of the original capture session to the new session, but every time the canAddOutput: method returns NO. My guess is that this is because there is already a session associated with that output.
Question
How do I use the same AVCaptureDevice to output to two separate AVCaptureVideoDataOutputs at the same time? Or how can I achieve the same thing as the diagram below?
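One workaround worth sketching (an assumption, not confirmed by this thread): keep the single session and its 32BGRA output, and convert each frame to the second pixel format yourself instead of running two sessions. Here _ciContext is a long-lived CIContext and _conversionPool is a hypothetical CVPixelBufferPool preallocated with the streaming API's pixel format attributes:
// Convert a captured BGRA frame for the consumer that needs another format.
- (CVPixelBufferRef)convertedBufferFrom:(CMSampleBufferRef)sampleBuffer
{
    CVPixelBufferRef source = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferRef converted = NULL;
    CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, _conversionPool, &converted);
    if (converted == NULL) {
        return NULL;
    }
    CIImage *image = [CIImage imageWithCVPixelBuffer:source];
    [_ciContext render:image toCVPixelBuffer:converted];
    return converted; // the caller is responsible for CVPixelBufferRelease
}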

AVCaptureDevice is always null on simulator

I am trying to capture live microphone audio data.
I took the following from the Apple example for AVCaptureSession:
AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
AVCaptureDevice *audioCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
NSError *error = nil;
AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioCaptureDevice error:&error];
if (audioInput) {
    [captureSession addInput:audioInput];
} else {
    // Handle the failure.
    NSLog(@"ERROR");
}
audioCaptureDevice and audioInput are both null.
Yes, it should be, because the Simulator doesn't have a microphone. You should always test any audio, video, or rendering-related task on a real device.
Take a look at "Limitations of Testing in iOS Simulator":
Hardware Limitations: While most of the functionality of iOS devices can be simulated in iOS Simulator, there are some hardware features that must be tested directly on a device. The hardware features that cannot be simulated are:
Accelerometer
Gyroscope
Camera
Proximity sensor
Microphone input
The simulator cannot take the Mac microphone as a source. You need to use a real device to test that.
The Simulator has no mic and no camera, so check first and then proceed:
if let captureDevice = AVCaptureDevice.default(for: AVMediaType.audio) {
    // allocate AVCaptureDevice
}

App-recorded videos sometimes play no audio when played on phone

Our app records and plays (those) videos, but somehow some of the videos are played without sound.
If I copy the video files to a Mac via iTunes, it plays the videos with sound, so the videos do have sound.
I checked the videos with a tool (called GSpot), and they all have the same audio codec and bitrate.
I have tried about everything I found on SO, but some of the videos don't play audio. The code is always the same, and the videos do have sound, as you can hear them on a Mac.
This is how I set up the player:
AVAudioSession *audioSession = [AVAudioSession sharedInstance];
NSError *setCategoryError = nil;
// Set the category before activating the session.
[audioSession setCategory:AVAudioSessionCategoryPlayback error:&setCategoryError];
[audioSession setActive:YES error:&setCategoryError];

_mpviewController = [[MPMoviePlayerViewController alloc] initWithContentURL:[NSURL fileURLWithPath:moviepath]];
_mpviewController.moviePlayer.movieSourceType = MPMovieSourceTypeFile;
_mpviewController.moviePlayer.controlStyle = MPMovieControlStyleNone;
[_mpviewController.moviePlayer setScalingMode:MPMovieScalingModeAspectFill];
_mpviewController.moviePlayer.shouldAutoplay = YES;
[_mpviewController.moviePlayer prepareToPlay];
_mpviewController.view.frame = CGRectMake(0, 0, _screenWidth, _screenHeight);
[self addSubview:_mpviewController.view];
Some time later, on a button press, it starts playing.
The record settings:
[_session beginConfiguration];
[_session setSessionPreset:AVCaptureSessionPreset640x480];

NSError *error = nil;
AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
// Check the device before configuring it.
if (videoDevice == nil) {
    assert(0);
}
BOOL lock = [videoDevice lockForConfiguration:&error];
if (lock) videoDevice.focusMode = AVCaptureFocusModeContinuousAutoFocus;
[videoDevice unlockForConfiguration];

AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
if (error) {
    assert(0);
}
[_session addInput:input];

AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:nil];
if (audioInput != nil) {
    [_session addInput:audioInput];
}
[_session commitConfiguration];
I don't know if it is related, but the audio seems not to play on videos that had a different view overlaid during the recording. It is possible to overlay a fullscreen view over the video preview while recording; it records fine, the file itself has audio, and those files play fine on a PC/Mac. Only on the device, within our app, do those videos (again, not sure if related) have no audio. The app uses the same code for every video, so I can rename them, swap them, etc.; the behaviour has something to do with the audio channel of the video.
edit
Additional observation:
The videos with no sound also play no audio in QuickTime, but only on the Mac. QuickTime on Windows does play their audio, and all other players on Windows play the audio as well.
Somehow the phone corrupts its own recording so that afterwards it doesn't recognize the audio channel it just recorded.
edit 2
iPhone 6, iOS 8.1 beta 2 -> as described
iPhone 4S, iOS 7.1.2 -> NO issue
iPad mini, iOS 8.0.2 -> always has the issue; not one video's audio channel can be read, but the channel always exists, and Windows can play the audio
I want to wrap this up.
The file was saved as ".mp4", and this caused the sporadic error.
Saving the file as ".mov" works. Same codec etc., but without errors.
NSString *path = [[documentsDirectoryPath stringByAppendingPathComponent:
    [NSString stringWithFormat:@"/%@", name]] stringByAppendingString:@".mov"];
outputURL = [[NSURL alloc] initFileURLWithPath:path];
[mAVCaptureMovieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];

iOS Audio Recording: How to Check if the Mic / Playback Is Busy Before Taking the Mic

If anything is playing or recording, how do we check whether the mic is available (idle) for recording? Currently using:
AVCaptureDevice *audioCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
NSError *error = nil;
AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioCaptureDevice error:&error];
AVCaptureAudioDataOutput *audioOutput = [[AVCaptureAudioDataOutput alloc] init];
[captureSession addInput:audioInput];
[captureSession addOutput:audioOutput];
[captureSession startRunning];
We need to check before grabbing the mic / playback from something that already has it.
The mic device cannot be busy, and access to it cannot be locked: even if you call lockForConfiguration: on a mic device, it will not lock it, and it remains accessible to the foreground application.
To see whether other audio is playing, you can check kAudioSessionProperty_OtherAudioIsPlaying, e.g.:
UInt32 propertySize, audioIsAlreadyPlaying = 0;
propertySize = sizeof(UInt32);
AudioSessionGetProperty(kAudioSessionProperty_OtherAudioIsPlaying, &propertySize, &audioIsAlreadyPlaying);
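AudioSessionGetProperty is deprecated; from iOS 6 on, the same check is available through AVAudioSession:
// Modern equivalent of kAudioSessionProperty_OtherAudioIsPlaying.
BOOL otherAudioPlaying = [[AVAudioSession sharedInstance] isOtherAudioPlaying];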
Additionally, the Audio Session Programming Guide states: "There is no programmatic way to ensure that an audio session is never interrupted. The reason is that iOS always gives priority to the phone. iOS also gives high priority to certain alarms and alerts."
