UIImage from AVCaptureSession - ios

What I'm trying to achieve is grabbing a still frame from an AVCaptureSession for something like a QR code scanner. I have a view with an AVCaptureVideoPreviewLayer sublayer.
I already tried AVCaptureStillImageOutput, which worked, but it plays the shutter sound, and since there's no way to mute that, I can't use it. After that I tried taking a screenshot of the entire screen, which also failed, because a screenshot can't capture an AVCaptureVideoPreviewLayer. Now I'm kind of lost; the only remaining option seems to be feeding the video feed into my OCR library, but that would lag too much / be a lot of work.
Are there any other options?

Here's a tech note describing exactly what you want to achieve:
How to capture video frames from the camera as images using AV Foundation on iOS
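The approach from that tech note, sketched in Swift: attach an AVCaptureVideoDataOutput to the session and convert sample buffers to UIImages on demand. No still-image capture is triggered, so no shutter sound plays. The class name `FrameGrabber` is mine, and this assumes you already have a configured, running session.

```swift
import AVFoundation
import UIKit

// Sketch: grab frames from a running AVCaptureSession as UIImages,
// without AVCaptureStillImageOutput and without the shutter sound.
final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let output = AVCaptureVideoDataOutput()
    private let ciContext = CIContext()          // reused across frames
    private(set) var latestImage: UIImage?       // most recent frame

    func attach(to session: AVCaptureSession) {
        output.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String:
                                kCVPixelFormatType_32BGRA]
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "frames"))
        if session.canAddOutput(output) { session.addOutput(output) }
    }

    // Called for every frame; keeps the latest one as a UIImage.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        guard let cgImage = ciContext.createCGImage(ciImage, from: ciImage.extent) else { return }
        latestImage = UIImage(cgImage: cgImage)
    }
}
```

When the QR scanner needs a still, it reads `latestImage` rather than triggering a capture, which is also cheaper than running OCR on every frame.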

Related

Is it possible to record video and overlay with CALayer simultaneously using AVVideoCompositionCoreAnimationTool?

I'm currently working on an app that records video of the user and binds particle emitters to hand landmarks on the preview. As a result, the effects are shown on the camera preview, but they aren't captured in the video.
I saw a great tutorial on AVVideoCompositionCoreAnimationTool (https://www.raywenderlich.com/6236502-avfoundation-tutorial-adding-overlays-and-animations-to-videos), but with that approach it is only possible to render animations onto an already recorded video.
I wonder if there is any chance to use AVVideoCompositionCoreAnimationTool to record video from the camera together with the animations in real time. Or if you know another method to do it, without diving deep into Metal and so on.
Thanks in advance!

Is there any way to capture video or image from Unity (which use Vuforia) iOS application?

I have Augmented Reality functionality built with Unity + the Vuforia plugin, which I integrated into an iOS application. The app uses the camera as the background, and when you point the camera at a marker, a 3D object appears on it.
My task is to add buttons that start and stop capturing video (or an image) from the camera. The output should be a video of the camera scene plus the 3D object.
I did some investigation, but the only solution I found is to convert the view backing the AVCaptureVideoPreviewLayer, on which the camera preview is shown, into a video (or image). In my opinion, though, this solution is inefficient and inflexible.
Is there any way to get the current AVCaptureSession instance from Unity (or maybe from the Vuforia plugin)? Or maybe there is another way to solve my problem?
Any advice or guides would be very helpful.
I don't think you should use AVCaptureSession to get the preview, or do the capture operation in Cocoa Touch at all; instead, capture the image in Unity and pass the data to the native Cocoa Touch API.
Here is the link describing how to capture a screenshot in Unity.

AVFoundation, create images all video's frames while playing its audio

I'm at a bit of a loss here.
Basically, I need to create a frame server that will give my app an image of each frame contained in a video, at that video's frame rate, while also playing its audio.
I'll be using these frames as a texture source for certain geometries in a SceneKit scene.
I've never used AVFoundation, so any pointers, tutorials, or suggestions are welcome.
That's a very general question.
Since you've never used AVFoundation, the AVFoundation Programming Guide should be your first stop.
In particular, the Still and Video Media Capture section shows how to set a delegate for the capture process.
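For the playback case described in the question (frames plus audio from a file, rather than the camera), AVPlayerItemVideoOutput lets you pull each frame as a pixel buffer while AVPlayer plays the audio in sync. A minimal sketch; the class name `FrameServer` is mine, and the caller is assumed to drive `currentFrame()` from something like a CADisplayLink:

```swift
import AVFoundation
import QuartzCore

// Sketch: play a video (audio included) with AVPlayer while pulling each
// frame as a CVPixelBuffer, e.g. to use as a SceneKit texture source.
final class FrameServer {
    let player: AVPlayer
    private let videoOutput = AVPlayerItemVideoOutput(pixelBufferAttributes: [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
    ])

    init(url: URL) {
        let item = AVPlayerItem(url: url)
        item.add(videoOutput)                 // taps the video track
        player = AVPlayer(playerItem: item)   // audio plays as usual
    }

    // Call once per display refresh (e.g. from a CADisplayLink callback);
    // returns nil when no new frame is due yet.
    func currentFrame() -> CVPixelBuffer? {
        let itemTime = videoOutput.itemTime(forHostTime: CACurrentMediaTime())
        guard videoOutput.hasNewPixelBuffer(forItemTime: itemTime) else { return nil }
        return videoOutput.copyPixelBuffer(forItemTime: itemTime,
                                           itemTimeForDisplay: nil)
    }
}
```

The pixel buffer can be handed to SceneKit as a material's diffuse contents, and because AVPlayer owns the clock, audio and frames stay in sync at the video's native frame rate.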

I want to capture square video in iOS with custom controls

I am able to show a square preview of the recording using the AVCaptureVideoPreviewLayer class, but it saves the video as a rectangle to the library. Instead of a rectangle I want a square video. I have used a composition class to crop the video, and it works, but it takes too much time. The Vine app, for comparison, produces square video output.
Please give me a suggestion.
It's a late answer, but it may help others: see https://github.com/piemonte/PBJVision, which can record square video.

Simple way to capture a high quality image with AVCapture

All I need to do is capture an image, and all I can find is complicated code on capturing video or multiple frames. I can't use UIImagePickerController because I do not want to see the camera shutter animation and I have a custom overlay, and my app is landscape only. What is the simplest way to manually capture an image from the front or back live camera view in the correct orientation? I don't want to save it to the camera roll, I want to present it in a view controller for editing.
Take a look at the SquareCam example from Apple (http://developer.apple.com/library/ios/#samplecode/SquareCam/Introduction/Intro.html). It contains everything you need for high-quality image capture. I recently copy-pasted code from this project myself when solving the same task as you. It works well :)
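The core of that sample in modern form (AVCapturePhotoOutput has since replaced the AVCaptureStillImageOutput that SquareCam used): one session, one photo output, one delegate callback that hands back a UIImage for display rather than saving to the camera roll. A sketch; the class name `StillCapture` and the `onImage` callback are mine:

```swift
import AVFoundation
import UIKit

// Sketch: simplest manual still capture from the live camera, for
// presenting in a view controller (nothing is saved to the camera roll).
final class StillCapture: NSObject, AVCapturePhotoCaptureDelegate {
    let session = AVCaptureSession()
    private let photoOutput = AVCapturePhotoOutput()
    var onImage: ((UIImage) -> Void)?

    func configure() throws {
        session.sessionPreset = .photo
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }
        if session.canAddOutput(photoOutput) { session.addOutput(photoOutput) }
        session.startRunning()
    }

    func capture() {
        photoOutput.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard error == nil,
              let data = photo.fileDataRepresentation(),
              let image = UIImage(data: data) else { return }
        onImage?(image)   // hand off to your editing view controller
    }
}
```

Because you drive the preview and capture yourself, there is no shutter animation, your custom overlay stays on top, and orientation can be fixed by setting the photo output connection's videoOrientation before capturing.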
