I want to capture square video in iOS with custom controls

I am able to show a square preview of the recording using the AVCaptureVideoPreviewLayer class, but the video is saved to the library in a rectangular format. Instead of a rectangle I want a square video. I have used the composition classes to crop the video and that works, but it takes too much time. I checked the Vine app, which produces square video output.
Please give me a suggestion.

It's a late answer, but it may be useful to others. See my answer:
https://github.com/piemonte/PBJVision
It records video in a square format.
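If you would rather stay with plain AVFoundation than pull in PBJVision, one option (a minimal sketch, not the library's code) is to record through an AVAssetWriter whose video input is configured with a square size and aspect-fill scaling, so each frame is cropped as it is written rather than in a slow composition pass afterwards. Here, outputURL and the capture session / sample buffer delegate plumbing are assumed to already exist:

// Square output settings: equal width/height plus aspect-fill scaling means the
// writer crops the rectangular camera frames to a square as they are appended.
NSDictionary *videoSettings = @{
    AVVideoCodecKey       : AVVideoCodecH264,
    AVVideoWidthKey       : @480,
    AVVideoHeightKey      : @480,
    AVVideoScalingModeKey : AVVideoScalingModeResizeAspectFill
};

NSError *error = nil;
AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:outputURL      // assumed file URL
                                                 fileType:AVFileTypeQuickTimeMovie
                                                    error:&error];
AVAssetWriterInput *videoInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:videoSettings];
videoInput.expectsMediaDataInRealTime = YES;
[writer addInput:videoInput];

// Then, in captureOutput:didOutputSampleBuffer:fromConnection:, append frames
// while recording:
// if (videoInput.readyForMoreMediaData) {
//     [videoInput appendSampleBuffer:sampleBuffer];
// }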

Related

How to play video over SKScene

Basically I have an SKScene, and I want to play a video over the scene. The video is confetti falling with an alpha background. It will play when the player gets a high score. I am using an SKScene with shapes and images drawn with shape nodes and image nodes. I was just wondering if anyone could tell me how to play the video over the screen while still seeing the game behind it, and still being able to touch the buttons through the video. It is supposed to look like an animation playing.
I am using a video because I was just thinking that playing a video would be more processor efficient than having the game generate particles.
There is no built-in iOS solution. You can play 24BPP (fully opaque) movies under iOS, but the only built-in way to display alpha channel video would be to load a series of PNG images with alpha. The downside is that this takes up a huge amount of memory and bloats the app download. If you want to look at some working examples of this kind of functionality in a third-party app, see Alpha Channel Examples. You might also be interested in this blog post, which shows example code for how alpha channel textures can be implemented in OpenGL; the same approach could be built on top of SpriteKit too. The cube example shows rendering an alpha channel movie onto a cube; it was adapted from a Ray Wenderlich tutorial.
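For the PNG-sequence route, the usual built-in vehicle is UIImageView's animationImages. A rough sketch (the frame names and count are made up, and the memory warning above still applies):

// Play a short sequence of transparent PNGs over the SKView. Frame names
// ("confetti_0", "confetti_1", ...) are placeholders.
NSMutableArray *frames = [NSMutableArray array];
for (NSUInteger i = 0; i < 30; i++) {
    UIImage *frame = [UIImage imageNamed:[NSString stringWithFormat:@"confetti_%lu", (unsigned long)i]];
    if (frame) [frames addObject:frame];
}

UIImageView *overlay = [[UIImageView alloc] initWithFrame:self.view.bounds];
overlay.animationImages = frames;
overlay.animationDuration = 1.0;      // one pass through all frames, in seconds
overlay.animationRepeatCount = 1;     // play once
overlay.userInteractionEnabled = NO;  // touches fall through to the game underneath
[self.view addSubview:overlay];       // sits above the SKView
[overlay startAnimating];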
Here is an answer showing how to do that with GPUImageView. There is also a project on GitHub and a similar question on Stack Overflow.
The video stack doesn't yet support formats with an alpha channel. For confetti, you should use SKEmitterNode. Size it to the area you envisioned for your video, and see Creating Particle Effects, i.e., its link to "Add a particle emitter to your project", and try out the "Snow" effect. It looks more like confetti when you give it a color other than white. Click the dot under "Color Ramp" to set the color.
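A minimal sketch of that emitter setup in code (the Confetti.sks file name is an assumption; it could simply be the "Snow" template saved under that name and tweaked in the particle editor):

// Load a particle file and tint it so it reads as confetti rather than snow.
NSString *path = [[NSBundle mainBundle] pathForResource:@"Confetti" ofType:@"sks"];
SKEmitterNode *confetti = [NSKeyedUnarchiver unarchiveObjectWithFile:path];

confetti.position = CGPointMake(CGRectGetMidX(self.frame), CGRectGetMaxY(self.frame));
confetti.particlePositionRange = CGVectorMake(self.size.width, 0); // emit along the top edge
confetti.particleColor = [SKColor redColor];   // anything but white looks more like confetti
confetti.particleColorBlendFactor = 1.0;
confetti.zPosition = 100;                      // draw above the game nodes
[self addChild:confetti];                      // self is the SKScene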

Add Vintage Projector/jitter effect while recording a video

I want to develop a feature in an application by which a vintage projector effect can be applied to a video while it is being recorded, or to a pre-recorded video. If you want, I can share an image; I want an effect similar to it: one part of the recorded video shows in the bottom frame and the bottom part of the recorded video shows in the top frame. Along with this, the whole view should shake like a real vintage projector recording.
Did you take a look at GPUImage?
It has lots of options for video recording/processing and allows you to add and combine different filters.
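As a rough illustration of how a GPUImage recording pipeline is wired together (GPUImageSepiaFilter here is only a stand-in for a vintage look; the split-frame/jitter effect itself would need a custom filter or shader, and movieURL is assumed to point at a writable file):

// Camera -> filter -> on-screen preview and movie writer.
GPUImageVideoCamera *camera =
    [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480
                                        cameraPosition:AVCaptureDevicePositionBack];
camera.outputImageOrientation = UIInterfaceOrientationPortrait;

GPUImageSepiaFilter *vintageFilter = [[GPUImageSepiaFilter alloc] init];
[camera addTarget:vintageFilter];

GPUImageView *previewView = [[GPUImageView alloc] initWithFrame:self.view.bounds];
[self.view addSubview:previewView];
[vintageFilter addTarget:previewView];

GPUImageMovieWriter *movieWriter =
    [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(480.0, 640.0)];
movieWriter.encodingLiveVideo = YES;
[vintageFilter addTarget:movieWriter];

[camera startCameraCapture];
[movieWriter startRecording];
// ... later: [movieWriter finishRecording]; [camera stopCameraCapture];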

How to capture video in a specific part of the screen rather than full screen in iOS

I am capturing video in my iOS app using AVFoundation. I am able to record the video and play it back as well.
But my problem is that I am showing the capture preview in a view that is around 200 points high, so I expected the video to be recorded with the same dimensions. But when I play the video back, it shows that the whole screen has been recorded.
So I want to know whether there is any way to record only the part of the camera preview that was visible to the user. Any help would be appreciated.
You cannot think of video resolution in terms of UIView dimensions (or even screen size, for that matter). The camera is going to record at a certain resolution depending on how you set up the AVCaptureSession. For instance, you can change the video quality by setting the session preset via:
[self.captureSession setSessionPreset:AVCaptureSessionPreset640x480];
(It is already set by default to the highest setting.)
Now, when you play the video back, you do have a bit of control over how it is presented. For instance, if you want to play it in a smaller view (whose layer is of type AVPlayerLayer), you can set the video gravity via:
AVCaptureVideoPreviewLayer *previewLayer = (AVCaptureVideoPreviewLayer*)self.previewView.layer;
[previewLayer setVideoGravity:AVLayerVideoGravityResizeAspect];
And depending on what you pass for the gravity parameter, you will get different results.
Hopefully this helps. Your question is a little unclear, as it seems like you want the camera to record only a certain amount of its input, but you'd have to put your hand over part of the lens to accomplish that ;)
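For the playback side mentioned above, a minimal sketch with AVPlayerLayer might look like this (recordedVideoURL and self.playbackContainer, the small 200-point-high view, are assumptions):

// Play the full-resolution recording inside a small container view and let
// video gravity decide how it is fitted.
AVPlayer *player = [AVPlayer playerWithURL:recordedVideoURL];

AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
playerLayer.frame = self.playbackContainer.bounds;
// AspectFill crops the video to fill the small view; Aspect letterboxes it instead.
playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[self.playbackContainer.layer addSublayer:playerLayer];

[player play];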

UIImage from AVCaptureSession

What I'm trying to achieve is grabbing a still frame from an AVCaptureSession for something like a QR code scanner. I have a view with an AVCaptureVideoPreviewLayer sublayer.
I already tried using AVCaptureStillImageOutput, which worked, but that method plays the shutter sound, and since there's no way to mute it I can't use it. After that I tried to take a screenshot of the entire screen, which also failed, because it can't capture an AVCaptureVideoPreviewLayer. Now I'm kind of lost; the only other real way to do this would be to feed a video feed into my OCR library, but that would lag too much and be a lot of work.
Are there any other options?
Here's a tech note describing exactly what you want to achieve:
How to capture video frames from the camera as images using AV Foundation on iOS
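In outline, the approach from that note is to attach an AVCaptureVideoDataOutput to your existing session and convert the BGRA sample buffers to UIImage yourself, with no shutter sound. A condensed sketch (not the note's verbatim code; self.captureSession is assumed):

// Session setup: deliver BGRA frames to a delegate on a background queue.
AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
videoOutput.videoSettings =
    @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
[videoOutput setSampleBufferDelegate:self queue:dispatch_queue_create("frames", NULL)];
[self.captureSession addOutput:videoOutput];

// AVCaptureVideoDataOutputSampleBufferDelegate
- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(CVPixelBufferGetBaseAddress(pixelBuffer),
                                                 CVPixelBufferGetWidth(pixelBuffer),
                                                 CVPixelBufferGetHeight(pixelBuffer),
                                                 8,
                                                 CVPixelBufferGetBytesPerRow(pixelBuffer),
                                                 colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef cgImage = CGBitmapContextCreateImage(context);
    UIImage *frame = [UIImage imageWithCGImage:cgImage]; // hand this to the QR/OCR library

    CGImageRelease(cgImage);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
}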

Simple way to capture a high quality image with AVCapture

All I need to do is capture an image, and all I can find is complicated code on capturing video or multiple frames. I can't use UIImagePickerController because I do not want to see the camera shutter animation and I have a custom overlay, and my app is landscape only. What is the simplest way to manually capture an image from the front or back live camera view in the correct orientation? I don't want to save it to the camera roll, I want to present it in a view controller for editing.
Take a look at the SquareCam example from Apple (http://developer.apple.com/library/ios/#samplecode/SquareCam/Introduction/Intro.html). It contains everything you need for high-quality image capture. I recently copy-pasted code from this project myself when I solved the same task as you. It works well :)
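The still-capture part of that sample boils down to roughly the following (a condensed sketch; self.captureSession is assumed to be configured and running, and the orientation value should match your landscape UI):

// Add a JPEG still image output to the running session, then capture on demand.
AVCaptureStillImageOutput *stillOutput = [[AVCaptureStillImageOutput alloc] init];
stillOutput.outputSettings = @{ AVVideoCodecKey : AVVideoCodecJPEG };
[self.captureSession addOutput:stillOutput];

AVCaptureConnection *connection = [stillOutput connectionWithMediaType:AVMediaTypeVideo];
connection.videoOrientation = AVCaptureVideoOrientationLandscapeRight;

[stillOutput captureStillImageAsynchronouslyFromConnection:connection
                                         completionHandler:^(CMSampleBufferRef imageBuffer, NSError *error) {
    if (imageBuffer) {
        NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageBuffer];
        UIImage *image = [UIImage imageWithData:jpegData];
        dispatch_async(dispatch_get_main_queue(), ^{
            // Present `image` in your editing view controller instead of saving it.
        });
    }
}];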
