Capture 60fps on iOS to reduce motion blur - ios

I need to reduce motion blur in my video, so I tried every sessionPreset on AVCaptureSession to get 60fps video capture but none of them seem to work. I found a few old threads that mention the 1280 preset would work on iOS 5, but I had no success with any preset on iOS 10 (iPhone 6).
It just fails when I try to configure activeVideoMin/MaxFrameDuration to 60fps.
The queried format also says it only supports a range of 2 to 30 fps.
Is there really no way to capture 60fps (or reduce video motion blur) on iOS 10?

The iPhone 6 and 6 Plus have two device formats for 1080p: one for 1080p30 and another for 1080p60. When you set AVCaptureSessionPresetHigh, the session uses the 1080p30 format.
To record video at 60fps, iterate through the AVCaptureDevice formats, find the format you want, and then set it via AVCaptureDevice **setActiveFormat** instead of AVCaptureSession setSessionPreset.
https://developer.apple.com/library/content/technotes/tn2409/_index.html
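A sketch of that approach in Objective-C (the 60 fps target and the "pick the highest-rate format" policy are illustrative; which formats exist depends on the device):

```objc
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceFormat *bestFormat = nil;
AVFrameRateRange *bestRange = nil;

// Walk every device format and remember one whose frame-rate range reaches 60 fps.
for (AVCaptureDeviceFormat *format in device.formats) {
    for (AVFrameRateRange *range in format.videoSupportedFrameRateRanges) {
        if (range.maxFrameRate >= 60 &&
            (bestRange == nil || range.maxFrameRate > bestRange.maxFrameRate)) {
            bestFormat = format;
            bestRange = range;
        }
    }
}

if (bestFormat) {
    NSError *error = nil;
    if ([device lockForConfiguration:&error]) {
        device.activeFormat = bestFormat;                       // instead of a sessionPreset
        device.activeVideoMinFrameDuration = CMTimeMake(1, 60); // lock the frame rate to 60 fps
        device.activeVideoMaxFrameDuration = CMTimeMake(1, 60);
        [device unlockForConfiguration];
    }
}
```

Setting activeVideoMin/MaxFrameDuration only succeeds while the matching format is active, which is why the session-preset route fails: the 1080p30 format the preset selects simply doesn't support a 60 fps range.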

Related

CMSampleBufferRef have always same video resolution?

I'm trying to capture video with AVAssetWriter and AVCaptureOutput.
You can find a sample project here.
The video should be in portrait mode with any resolution. The main problem is that it must be in portrait mode.
I've tried different settings, but in the end the video is rotated and scaled to 1920x1080 on an iPhone SE.
Is it possible to control this resolution? Or at least orientation?
Video resolution is determined by the AVCaptureSession sessionPreset. You're setting that to medium, so you're getting the resolution that comes with that. If you want a different resolution, pass a different session preset, or use AVCaptureDevice to set a specific capture format. (For a good overview of capture session presets vs device formats, go back to this WWDC13 video.)
Per this Apple Developer Q&A, you need to set an orientation on the capture connection after you start the capture session in order to get "physically" rotated frame buffers (at a capture performance cost), or set the transform property on your asset writer (so that buffers are recorded in the sensor's native orientation, but clients display it in your intended orientation).
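Both options from that Q&A can be sketched as follows (`videoDataOutput` and `assetWriterInput` stand in for whatever output and writer input your project already creates):

```objc
// Option 1: physically rotate the frame buffers (costs capture performance).
// Set this after the session has started running.
AVCaptureConnection *connection =
    [videoDataOutput connectionWithMediaType:AVMediaTypeVideo];
if (connection.isVideoOrientationSupported) {
    connection.videoOrientation = AVCaptureVideoOrientationPortrait;
}

// Option 2: record in the sensor's native orientation and tag the track,
// so players rotate it on display instead.
assetWriterInput.transform = CGAffineTransformMakeRotation(M_PI_2);
```

Option 2 is usually the cheaper path for recording; option 1 is needed only if downstream code must see portrait-oriented pixel data.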

Retrieve iOS videocamera resolution

I need to retrieve the resolution in pixels of a movie captured by the iPhone's camera...
Is there a library like UIDevice that checks which type of device I'm using, and also provides camera information?
Everything depends on which mode you start capturing video in.
According to this link: https://developer.apple.com/library/mac/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/04_MediaCapture.html#//apple_ref/doc/uid/TP40010188-CH5-SW30
You can try capturing video with the AVCaptureSessionPresetHigh preset, and then check the size of the captured image.
This should give you the highest video resolution in recording mode.
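One way to do that check without waiting for a frame is to query the device's active format after the session is configured (a sketch; the device must already be attached to a running session for activeFormat to reflect the preset):

```objc
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

// The active format's description carries the pixel dimensions the camera
// will deliver for the current session configuration.
CMVideoDimensions dims =
    CMVideoFormatDescriptionGetDimensions(device.activeFormat.formatDescription);
NSLog(@"Capture resolution: %d x %d", dims.width, dims.height);
```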

iPad retina screen recording

Two parts:
Correct me if I'm wrong, but isn't it true that there's no standard video file format that holds 2048 x 1536 frames (i.e., that recording the full resolution of the iPad retina display is impossible)?
My app uses a glReadPixels call to record the screen, and appends the pixel buffers to an AVAssetWriterInputPixelBufferAdaptor. If the video needs to be resized to export, what's the best way to do this? I'm trying right now with AVMutableVideoCompositionLayerInstructions and CGAffineTransforms, but it's not working. Any ideas?
Thanks
Sam
Yes, it is possible. My app also records large-frame video.
Don't use glReadPixels; it causes a lot of delay, especially when you record frames as big as 2048 x 1536.
Since iOS 5.0 you can use a faster way: a texture cache (link).
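The texture-cache idea, roughly: render straight into a CVPixelBuffer-backed texture so the pixels are already in the buffer you hand to the writer, with no glReadPixels copy. A sketch (`eaglContext` is your EAGLContext and `adaptor` your AVAssetWriterInputPixelBufferAdaptor; the 2048 x 1536 size is the example from the question):

```objc
// Create a texture cache tied to the GL context (available since iOS 5).
CVOpenGLESTextureCacheRef textureCache;
CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, eaglContext, NULL,
                             &textureCache);

// Get a pixel buffer from the adaptor's pool and wrap it in a GL texture.
CVPixelBufferRef pixelBuffer;
CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                   adaptor.pixelBufferPool, &pixelBuffer);

CVOpenGLESTextureRef texture;
CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache,
    pixelBuffer, NULL, GL_TEXTURE_2D, GL_RGBA, 2048, 1536,
    GL_BGRA, GL_UNSIGNED_BYTE, 0, &texture);

// Attach the texture to an FBO and render into it; the rendered pixels land
// directly in pixelBuffer, which you then append via the adaptor.
glBindTexture(CVOpenGLESTextureGetTarget(texture),
              CVOpenGLESTextureGetName(texture));
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, CVOpenGLESTextureGetName(texture), 0);
```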

iOS Camera Programming - How to get maximum resolution images in the didOutputSampleBuffer callback

I have this camera App where I'd like to get the max resolution image in the didOutputSampleBuffer callback. Right now all the frames I receive in the callback are 852 x 640 (I am using an iPhone 4 for testing). Only when I request for a still image capture (via captureStillImageAsynchronouslyFromConnection) do I get one - and only one - frame corresponding to the actual image captured in the highest resolution of the device - 2592x1936.
Is it possible to set things up so that I constantly receive frames of resolution - 2592x1936 in didOutputSampleBuffer? Then I would like to save some of these frames as images in the callback without having to go through captureStillImageAsynchronouslyFromConnection to capture an image.
Video output can't support the full resolution that you see when capturing still images. Look at the table given in Use Capture Outputs to Get Output from a Session for a list of supported resolutions.
If you want to change the resolution set the appropriate setting on your camera session like so:
cameraSession.sessionPreset = AVCaptureSessionPresetHigh;
Note that AVCaptureSessionPresetPhoto isn't supported for video capture.
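To verify what a given preset actually delivers, you can log the dimensions of each frame inside the delegate callback itself (a sketch of the standard AVCaptureVideoDataOutput delegate method):

```objc
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Each sample buffer wraps a pixel buffer whose size reflects the preset.
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    size_t width  = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    NSLog(@"Frame: %zu x %zu", width, height);
}
```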

iOS FullScreen AVCaptureSession

I am developing a realtime video processing app for iOS 5. The video stream dimensions need to match the screen size of the device. I currently only have a iPhone 4 to develop against. For the iPhone 4 I set the AVCaptureSession preset to AVCaptureSessionPresetMedium:
AVCaptureSession *session = [AVCaptureSession new];
[session setSessionPreset:AVCaptureSessionPresetMedium];
The captured images (via CMSampleBufferRef) have the size of the screen.
My question: Is the assumption correct that the images captured with a session preset of AVCaptureSessionPresetMedium have the full screen device dimensions on iPhone 4s and iPad2 as well? I unfortunately cannot verify that myself.
I looked at the apple documentation:
http://developer.apple.com/library/mac/#documentation/AVFoundation/Reference/AVCaptureSession_Class/Reference/Reference.html#//apple_ref/doc/constant_group/Video_Input_Presets
but I cannot find an iPad 2 dimension preset of 1024x768, and I would like to avoid the performance penalty of resizing images in real time.
What's the recommended path to take?
The resolution of the camera and the resolution of the screen aren't really related anymore. You say
"The captured images (via CMSampleBufferRef) have the size of the screen"
but I don't think this is actually true (and it may vary by device). A medium capture on an iPad 2 and an iPhone 4s is 480x360. Note this isn't even the same aspect ratio as the screen on a phone or iPod: the camera is 4:3 but the screen is 3:2.
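If only the on-screen display needs to fill the screen (the processing can still run on the 4:3 buffers), you can let the preview layer crop to fill instead of resizing buffers yourself. A sketch, where `session` is the configured capture session and `view` the host view:

```objc
AVCaptureVideoPreviewLayer *previewLayer =
    [AVCaptureVideoPreviewLayer layerWithSession:session];
previewLayer.frame = view.bounds;
// Scale and crop the 4:3 camera feed to cover the 3:2 screen.
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[view.layer addSublayer:previewLayer];
```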
