iOS AVFoundation Video Capture Orientation Options

I have an app that I would like to have video capture for the front-facing camera only. That's no problem. But I would like the video capture to always be in landscape, even when the phone is being held in portrait.
I have a working implementation based on the AVCamDemo code that Apple published. And borrowing from the information in this tech note, I am able to specify the orientation. There's just one catch: while the video frame is oriented correctly, the contents still appear as though shot in portrait.
I'm wondering if I'm just getting boned by the physical constraints of the hardware: is the image sensor just oriented this way? The tech note referenced above says:
Important: Setting the orientation on a still image output and movie
file output doesn't physically rotate the buffers. For the movie file
output, it applies a track transform (matrix) to the video track so
that the movie is rotated on playback, and for the still image output
it inserts exif metadata that image viewers use to rotate the image
properly when viewing later.
But my playback of that video suggests otherwise. Any insight or suggestions would be appreciated!
Thanks,
Aaron.

To answer your question, yes, the image sensor is just oriented that way. The video camera is an approx 1-megapixel "720p" camera that has a fixed orientation. The 5MP (or 8MP for 4S, etc) still camera also has a fixed orientation. The lenses themselves don't rotate, nor do any of the other camera bits, and hence the feed itself has a fixed orientation.
"But wait!", you say, "pictures I take with the camera app (or API) get rotated correctly. Why is that?" That's cuz iOS takes a look at the orientation of the phone when a picture is taken and stores that information with the picture (as an Exif attachment). Yet video isn't so flagged -- and each frame would have to be individually flagged, and then there's issues about what to do when the user rotates the phone during video....
So, no, you can't ask a video stream or a still image what orientation the phone was in when the video was captured. You can, however, directly ask the phone what orientation it is in now:
UIDeviceOrientation currentOrientation = [UIDevice currentDevice].orientation;
If you do that at the start of video capture (or when you grab a still image from a video feed), you can then use that information to do your own rotation of playback.
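For example, here is a minimal sketch (assuming an already-configured AVCaptureMovieFileOutput named movieOutput, and building on the currentOrientation value above) that applies a matching orientation to the output's video connection when recording starts:

// Sketch: map the device orientation, read at the moment recording starts,
// onto the movie file output's video connection. movieOutput is assumed to
// be an AVCaptureMovieFileOutput already attached to your session.
AVCaptureConnection *connection = [movieOutput connectionWithMediaType:AVMediaTypeVideo];
if (connection.isVideoOrientationSupported) {
    switch (currentOrientation) {
        case UIDeviceOrientationLandscapeLeft:
            // Device landscape-left (home button on the right) = landscape-right video.
            connection.videoOrientation = AVCaptureVideoOrientationLandscapeRight;
            break;
        case UIDeviceOrientationLandscapeRight:
            connection.videoOrientation = AVCaptureVideoOrientationLandscapeLeft;
            break;
        case UIDeviceOrientationPortraitUpsideDown:
            connection.videoOrientation = AVCaptureVideoOrientationPortraitUpsideDown;
            break;
        default:
            connection.videoOrientation = AVCaptureVideoOrientationPortrait;
            break;
    }
}

If you always want landscape output regardless of how the phone is held, you can skip the switch entirely and pin the connection to one of the landscape values.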

Related

Know if picture is Selfie or Portrait

I'm coding an app where users can upload pictures and add some filters to them.
The problem is that when I apply a filter, the picture gets rotated, but ONLY if the picture was taken with the back camera.
If it was a selfie, the picture is not rotated
If the picture is in portrait mode, the picture is not rotated
The problem is that I don't know how I can get this information, in order to rotate the picture only when I need to.
You're thinking about this the wrong way. It may be the case that images taken with your phone's rear camera appear rotated after applying a filter, but you cannot make this assumption for all devices. Instead, you can read the imageOrientation property on UIImage to obtain information about whether the image has an unusual rotation.
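A minimal sketch of that check, assuming the uploaded picture arrives as a UIImage (the NormalizedImage helper name is just illustrative):

// Sketch: inspect the orientation metadata before filtering and, if the
// pixels aren't stored "up", redraw the image so the filter pipeline
// always sees upright pixels.
UIImage *NormalizedImage(UIImage *sourceImage) {
    if (sourceImage.imageOrientation == UIImageOrientationUp) {
        return sourceImage;  // pixels already upright, nothing to do
    }
    // Redrawing through UIKit applies the stored orientation for us.
    UIGraphicsBeginImageContextWithOptions(sourceImage.size, NO, sourceImage.scale);
    [sourceImage drawInRect:CGRectMake(0, 0, sourceImage.size.width, sourceImage.size.height)];
    UIImage *upright = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return upright;
}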

Does CMSampleBufferRef always have the same video resolution?

I'm trying to capture video with AVAssetWriter and AVCaptureOutput.
You can find sample project here.
The video should be in portrait mode at any resolution. The main problem is that it should be in portrait mode.
I'm trying different settings, but in the end the video is rotated and scaled to 1920x1080 on an iPhone SE.
Is it possible to control this resolution? Or at least orientation?
Video resolution is determined by the AVCaptureSession sessionPreset. You're setting that to medium, so you're getting the resolution that comes with that. If you want a different resolution, pass a different session preset, or use AVCaptureDevice to set a specific capture format. (For a good overview of capture session presets vs device formats, go back to this WWDC13 video.)
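For example (a sketch, assuming your session is called captureSession):

// Sketch: request a specific resolution via a session preset instead of
// AVCaptureSessionPresetMedium. captureSession is assumed to be your
// already-created AVCaptureSession.
if ([captureSession canSetSessionPreset:AVCaptureSessionPreset1280x720]) {
    captureSession.sessionPreset = AVCaptureSessionPreset1280x720;
}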
Per this Apple Developer Q&A, you need to set an orientation on the capture connection after you start the capture session in order to get "physically" rotated frame buffers (at a capture performance cost), or set the transform property on your asset writer (so that buffers are recorded in the sensor's native orientation, but clients display it in your intended orientation).
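Roughly, the two options look like this (a sketch; videoDataOutput and writerInput stand in for your own capture output and AVAssetWriterInput):

// Option 1: ask the capture connection for physically rotated buffers
// (set this after the session has started running; it costs some capture
// performance).
AVCaptureConnection *connection = [videoDataOutput connectionWithMediaType:AVMediaTypeVideo];
if (connection.isVideoOrientationSupported) {
    connection.videoOrientation = AVCaptureVideoOrientationPortrait;
}

// Option 2: record buffers in the sensor's native orientation and stamp a
// transform on the asset writer input so players rotate on playback
// (set this before you start writing).
writerInput.transform = CGAffineTransformMakeRotation(M_PI_2);  // 90 degrees for portrait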

Retrieve iOS video camera resolution

I need to retrieve the resolution in pixels of a movie captured by the iPhone's camera...
Is there a library like UIDevice that checks which type of device I'm using, and also gives the camera information?
Everything depends on which mode you start capturing video in.
According to this link: https://developer.apple.com/library/mac/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/04_MediaCapture.html#//apple_ref/doc/uid/TP40010188-CH5-SW30
You can try to capture video with the AVCaptureSessionPresetHigh preset, and then check the size of the captured images.
This should give you the highest video resolution in recording mode.
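One way to do that check, sketched below, is to attach an AVCaptureVideoDataOutput and read the pixel dimensions of the buffers it delivers (this assumes self is set as the output's sample buffer delegate):

// Sketch: log the actual pixel dimensions the session is delivering.
- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    size_t width  = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);
    NSLog(@"Capturing at %zux%zu", width, height);
}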

Does captureOutput:didOutputSampleBuffer:fromConnection: carry any orientation information?

I apply tons of image processing to a camera live preview layer via OpenGL, and I want to get some information about the input's orientation before I ask for the image from OpenGL (to apply corresponding transformations).
The capturing takes place in a portrait-only view controller, and I'm afraid notifications of the device orientation (or similar) may not be synchronized with the actual image orientation.
Does CMSampleBuffer carry any orientation info? Or AVCaptureConnection?
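For what it's worth, the connection handed to the sample buffer delegate does expose an orientation, but it reports whatever orientation the connection was configured to deliver, not the live physical device orientation; a minimal sketch:

// Inside captureOutput:didOutputSampleBuffer:fromConnection:, the connection
// argument reports the configured delivery orientation of the buffers:
AVCaptureVideoOrientation configured = connection.videoOrientation;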

How to crop a video in iOS

I was having a look at the RosyWriter sample code provided by Apple as a starting point, and I'd like to find a way to crop a video.
So I have the full-resolution video from the iPhone's camera, but I just want to use a cropped part of it (and also rotate this subpart).
I figured that in captureOutput:didOutputSampleBuffer:fromConnection: I can modify each frame by modifying the CMSampleBufferRef that I get passed.
So my questions now are:
Is this the right place to crop my video?
Where do I specify that the final video (that gets saved to disk) has a smaller resolution than the full video captured by AVCaptureSession? Setting the AVVideoWidthKey and AVVideoHeightKey has no effect.
How can I crop the video and still have good performance?
Any help is appreciated!
Thanks a lot!
EDIT:
Maybe I just need to know how I can turn a video that was shot in portrait into a landscape one by rotating the frames of the video by 90 degrees and then zooming in to fit the width again...?!?
In AVVideoSettings.h there is the AVVideoScalingModeKey. This key, combined with its defined values, controls how the video is scaled/cropped when encoding the images into the video container. For example, if you specify a value of AVVideoScalingModeFit, then cropping is used. Check out the header for how the other values affect the video images.
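As a sketch of where that key goes (the dimensions here are just example values), you pass it in the output settings of the AVAssetWriterInput that encodes the video:

// Sketch: ask the encoder to scale/crop incoming frames to the requested
// output dimensions instead of writing them at full capture size.
NSDictionary *outputSettings = @{
    AVVideoCodecKey       : AVVideoCodecH264,
    AVVideoWidthKey       : @720,
    AVVideoHeightKey      : @1280,
    AVVideoScalingModeKey : AVVideoScalingModeResizeAspectFill
};
AVAssetWriterInput *writerInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:outputSettings];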
