iPhone: Toggling front/back AVCaptureDeviceInput camera when processing individual frames via setSampleBufferDelegate

I've run into an interesting issue when I attempt to switch from the front camera to the back camera while processing individual frames via the AVCaptureVideoDataOutput setSampleBufferDelegate: selector. The camera swap works and the preview I'm displaying looks great; it's just that the frames I capture afterwards are no longer in portrait, they are in landscape. Swapping from the front back to the back camera also leaves the back camera capturing landscape frames, so I suspect something is getting reset when I swap out the input rather than the input itself being wrong. I verified this theory by starting the AVCaptureSession with the front-facing camera: the frames passed to the buffer delegate are correctly in portrait. I've also tried explicitly stopping the AVCaptureSession while the device input is being swapped, with no difference in results.
I cribbed from the AVCam demo for inspiration. The suspicious difference between that code and mine is that it records to an AVCaptureMovieFileOutput - it's not processing individual frames.
Any ideas? Why would the orientation of the frames sent to my processor change when I swap out the device input?
Thanks for any response!

Ah ha! I figured it out. For some reason, after switching the device input, my video output's AVCaptureConnection was getting its orientation reset to landscape-right. To solve the problem, after I swap the input I explicitly set the video output's AVCaptureConnection orientation back to portrait.
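A minimal sketch of that fix (the session, output, and input variable names here are illustrative, not from the original code):

```objc
// Swap camera inputs, then restore the video connection's orientation.
// Assumes self.session (AVCaptureSession), self.videoOutput
// (AVCaptureVideoDataOutput), oldInput, and newInput already exist.
[self.session beginConfiguration];
[self.session removeInput:oldInput];
if ([self.session canAddInput:newInput]) {
    [self.session addInput:newInput];
}
// Swapping inputs can rebuild the connection, so re-apply portrait explicitly.
AVCaptureConnection *connection =
    [self.videoOutput connectionWithMediaType:AVMediaTypeVideo];
if (connection.isVideoOrientationSupported) {
    connection.videoOrientation = AVCaptureVideoOrientationPortrait;
}
[self.session commitConfiguration];
```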

Related

Sound not working for some AVCaptureDeviceFormat... is this a bug?

I see this question on SO about the same problem, but my case is slightly different.
On that question, the poster says he cannot record audio when he sets the app up to shoot video at the highest resolution a camera can provide under AVFoundation.
On the original question the poster mentions that his AVCaptureConnection has no audio. I believe he means inside captureOutput:didOutputSampleBuffer:fromConnection:, but in my case the problem is slightly different: that method is never called for audio at all. Every time it is called, the connection is a video one - in other words, the audio data output delegate is never invoked.
I have checked the captureSession and the microphone is there, so captureSession contains an AVCaptureDeviceInput for audio.
(lldb) po _captureSession.inputs
<__NSArrayI 0x170227e00>(
<AVCaptureDeviceInput: 0x17422e2e0 [Back Camera]>,
<AVCaptureDeviceInput: 0x17422e8e0 [iPad Microphone]>
)
I am testing this on an iPad Pro 9.7. I have checked all resolutions of the front and back camera of this device and I have no audio for these:
FRONT CAMERA: 960p @ 30 or 60 fps
BACK CAMERA: 4032x3024 @ 30 fps
I have tried removing and re-adding the audio device after changing the resolution, but the captureSession hangs and the preview freezes. The app keeps running - no crash - but the preview stays frozen.
Is this a bug? I don't see any mention in the documentation saying I cannot record audio at the highest resolutions a camera can provide.
NOTE: To demo the problem, I have uploaded a modified version of Apple's CIFunHouse here. I have adjusted line 459 of FHViewController.m with 4032x3024 that is the maximum resolution of my iPad. You should adjust that for the maximum resolution of your device's rear camera.
For some strange reason, when you do that, the app crashes when it tries to initialize the audio. My code, which is based on that sample, initializes fine but does not record sound. I left the code crashing because it may be more helpful that way. You will see that channelLayout and basicDescription are both NULL for that video format. Reduce the resolution and the audio will initialize fine.
Here is a hand-waving answer: 4032x3024 is not a commonly encountered video resolution. 480p, 720p and 1080p are though. And if you read about 4K resolution video you'll see that 3840x2160 is too.
In fact "2160p" does capture both audio and video on my iPhone 6s, so why not try that?
Will AVAssetWriter be able to encode 2160p? Who knows? Maybe.
But don't be too harsh on AVFoundation - it does a valiant job of putting a sane face on the craziness of hardware. If anything you should log functionality and documentation bugs.
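If you want to try the suggested 2160p route, a hedged sketch of picking a standard video preset with a fallback (session is assumed to be your configured AVCaptureSession):

```objc
// Fall back from the sensor's native still size to a standard video size.
// AVCaptureSessionPreset3840x2160 is available on devices that support 4K video.
if ([session canSetSessionPreset:AVCaptureSessionPreset3840x2160]) {
    session.sessionPreset = AVCaptureSessionPreset3840x2160;
} else if ([session canSetSessionPreset:AVCaptureSessionPreset1920x1080]) {
    session.sessionPreset = AVCaptureSessionPreset1920x1080;
}
```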

AVCaptureSession captures black/dark frames after changing preset

I'm developing an app which supports both still image and video capture with AVFoundation. Capturing them requires different AVCaptureSession presets. I check with canSetSessionPreset(), begin the change with beginConfiguration(), set the required preset via sessionPreset, and end with commitConfiguration().
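The preset change described above might look like this (a minimal sketch; session is assumed to be a configured AVCaptureSession):

```objc
// Switch presets inside a begin/commit pair so the change is applied atomically.
NSString *preset = AVCaptureSessionPresetPhoto; // or a video preset
if ([session canSetSessionPreset:preset]) {
    [session beginConfiguration];
    session.sessionPreset = preset;
    [session commitConfiguration];
}
```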
I found that if I capture a still image with AVCaptureStillImageOutput immediately after changing the preset, it returns no errors, but the resulting image is sometimes black or very dark.
If I start capturing video with AVCaptureMovieFileOutput immediately after changing the preset, the first several frames in the resulting file are also black or very dark at times.
Right after changing the preset the screen flickers, likely because the camera is adjusting exposure. So it looks like immediately after the preset change the camera starts metering from a very fast shutter speed, which produces the black/dark frames.
Both problems go away if I insert a 0.1-second delay between changing the preset and starting capture, but that's ugly and no one can guarantee it will work every time on every device.
Is there a clean solution to this problem?
This is for future users...
It was happening for me because I was setting the sessionPreset to high, and as soon as recording started I was reconfiguring the video output connection and setting focus. Moving that configuration into the camera setup code fixed it!
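In other words, do the one-time configuration during camera setup rather than at record time. A rough sketch of that ordering (session, movieOutput, and device names are illustrative):

```objc
// Do all one-time configuration before startRunning, not when recording begins.
[session beginConfiguration];
session.sessionPreset = AVCaptureSessionPresetHigh;

AVCaptureConnection *conn =
    [movieOutput connectionWithMediaType:AVMediaTypeVideo];
if (conn.isVideoOrientationSupported) {
    conn.videoOrientation = AVCaptureVideoOrientationPortrait;
}

// Focus mode is set once here, while the device is locked for configuration.
if ([device lockForConfiguration:NULL]) {
    if ([device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus]) {
        device.focusMode = AVCaptureFocusModeContinuousAutoFocus;
    }
    [device unlockForConfiguration];
}
[session commitConfiguration];
[session startRunning];
```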

how to capture a video in specific part rather than full screen in iOS

I am capturing video in my iOS app using AVFoundation. I am able to record the video and play it back.
But my problem is that I am showing the capture preview in a view that is around 200 points tall, so I expected the video to be recorded with the same dimensions. When I play the video back, however, the whole screen has been recorded.
So I want to know: is there any way to record only the camera preview that was visible to the user? Any help would be appreciated.
You cannot think of video resolution in terms of UIView dimensions (or even screen size, for that matter). The camera is going to record at a certain resolution depending on how you set up the AVCaptureSession. For instance, you can change the video quality by setting the session preset via:
[self.captureSession setSessionPreset:AVCaptureSessionPreset640x480]
(It is already set by default to the highest setting.)
Now, when you play the video back, you do have a bit of control over how it is presented. For instance, if you want to play it in a smaller view (whose layer is of type AVPlayerLayer), you can set the video gravity via:
AVPlayerLayer *playerLayer = (AVPlayerLayer *)self.playerView.layer;
[playerLayer setVideoGravity:AVLayerVideoGravityResizeAspect];
And depending on what you pass for the gravity parameter, you will get different results.
Hopefully this helps. Your question is a little unclear, as it seems like you want the camera to record only part of its input - but you'd have to put your hand over part of the lens to accomplish that ;)

How do I prevent the video stream rotating in landscape - AVCaptureSession and OpenGL

I have created an application that captures a live video stream, does some processing on it using OpenGL ES, and displays it on screen in a UIView.
Essentially the data flow is:
AVCaptureSession -> AVCaptureDeviceInput -> AVCaptureVideoDataOutput -> gl buffers -> draws to a UIView
I am not using AVCaptureVideoPreviewLayer for displaying the content.
When I display video using this system in portrait it works as expected, the video is in the same orientation as the display.
However, my app is intended to be used in landscape. In the superview's shouldAutorotateToInterfaceOrientation: method I only allow UIInterfaceOrientationLandscapeLeft and UIInterfaceOrientationLandscapeRight.
My issue is that the video stream is rotated: the phone is oriented in landscape, but the video stream is rotated by 90 degrees. I would like the video stream to be in the same orientation as the display.
I do not understand why the video stream is rotated. I guess it is trying to compensate for landscape mode, but I have no idea how to prevent this rotation. Any ideas?
Thanks
Mike
Thanks for the response VinceBurn.
Ultimately I tried out the methods in this post
And settled on the example code for willRotateToInterfaceOrientation:duration:
Don't know if that would work, I'm just guessing something here.
If no simple solution comes, maybe try to apply a 90° rotation transform on your UIView layer.
(I don't like this idea, but could work)
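Another option, since the frames arrive through an AVCaptureVideoDataOutput, is to set the output connection's orientation to match the interface so no compensating rotation is needed on the GL side (a sketch; videoOutput is assumed to be your AVCaptureVideoDataOutput):

```objc
// Ask the capture connection to deliver buffers already in landscape,
// matching whichever landscape orientation the interface is using.
AVCaptureConnection *connection =
    [videoOutput connectionWithMediaType:AVMediaTypeVideo];
if (connection.isVideoOrientationSupported) {
    connection.videoOrientation = AVCaptureVideoOrientationLandscapeRight;
}
```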

AV Foundation camera preview layer gets zoomed in, how to zoom out?

The application I am currently working on has as its main functionality continuous QR/bar code scanning using the Zxing library (http://code.google.com/p/zxing/). For continuous frame capturing I initialize the AVCaptureSession, AVCaptureVideoDataOutput, and AVCaptureVideoPreviewLayer as described in Apple's Q&A http://developer.apple.com/iphone/library/qa/qa2010/qa1702.html.
My problem is that when I run the camera preview, the image I see through the video device is much larger (about 1.5x) than the image seen through the iPhone's still camera. Our customer needs to hold the iPhone about 5 cm from the bar code when scanning, but at that distance the whole QR code isn't visible and the decoding fails.
Why does the video camera on the iPhone 4 enlarge the image (as seen through the AVCaptureVideoPreviewLayer)?
This is a function of the AVCaptureSession video preset, accessible by using the .sessionPreset property. For example, after configuring your captureSession, but before starting it, you would add
captureSession.sessionPreset = AVCaptureSessionPresetPhoto;
See the documentation here:
iOS Reference Document
The default preset for video is 1280x720 (I think) which is a lower resolution than the max supported by the camera. By using the "Photo" preset, you're getting the raw camera data.
You see the same behaviour with the built-in iPhone Camera app. Switch between still and video capture modes and you'll notice that the default zoom level changes. You see a wider view in still mode, whereas video mode zooms in a bit.
My guess is that continuous video capture needs to use a smaller area of the camera sensor to work optimally. If it used the whole sensor perhaps the system couldn't sustain 30 fps. Using a smaller area of the sensor gives the effect of "zooming in" to the scene.
I am answering my own question again. This was not answered even in the Apple Dev forum, so I filed a technical support request with Apple directly, and they replied that this is a known issue that will be fixed in a future release. So there is nothing we can do but wait and see.
