iOS - Use live camera output for SKVideoNode

Is it possible to use live camera output (like an AVCaptureSession) as input for an SKVideoNode? All the examples and docs I've found only use prerecorded video, either through an AVPlayer or a file URL.
I'm able to render the video in the background by making the scene transparent, but I'd like the video to be part of the scene (for screenshotting and other manipulations that are better done within the SKScene).
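One workaround that sidesteps SKVideoNode entirely is to pull frames from an AVCaptureVideoDataOutput and swap them into an SKSpriteNode's texture each frame, so the camera feed lives inside the scene. A minimal sketch, assuming a scene that exposes a `cameraSprite` node (the names here are illustrative, not from any Apple sample):

```swift
import AVFoundation
import SpriteKit
import CoreImage

// Sketch: feed camera frames into an SKSpriteNode instead of an SKVideoNode.
final class CameraFeed: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let ciContext = CIContext()
    weak var cameraSprite: SKSpriteNode?   // assumed node in your SKScene

    func start() throws {
        guard let device = AVCaptureDevice.default(for: .video) else { return }
        session.addInput(try AVCaptureDeviceInput(device: device))
        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera.frames"))
        session.addOutput(output)
        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        guard let cgImage = ciContext.createCGImage(ciImage, from: ciImage.extent) else { return }
        let texture = SKTexture(cgImage: cgImage)
        DispatchQueue.main.async {
            // Swapping the texture each frame makes the camera feed part of
            // the scene, so screenshots and node effects include it.
            self.cameraSprite?.texture = texture
        }
    }
}
```

Creating a texture per frame has a cost, but it keeps everything inside SpriteKit.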

Related

Is it possible to record video and overlay with CALayer simultaneously using AVVideoCompositionCoreAnimationTool?

I'm currently working on an app that records video of the user and binds particle emitters to hand landmarks on the preview. As a result, the effects are shown on the camera preview, but they aren't captured in the recorded video.
I saw a great tutorial on AVVideoCompositionCoreAnimationTool, https://www.raywenderlich.com/6236502-avfoundation-tutorial-adding-overlays-and-animations-to-videos, but that way it is only possible to render animations on already recorded video.
I wonder if there is any chance of using AVVideoCompositionCoreAnimationTool to record video from the camera together with the animations in real time, or if you know another method to do it, without diving deep into Metal and so on.
Thanks in advance!
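The usual real-time route is to skip AVVideoCompositionCoreAnimationTool (which only works on compositions of existing assets) and composite each camera frame yourself before writing it out. A minimal sketch of the per-frame composite step, assuming frames arrive as CVPixelBuffers (e.g. from AVCaptureVideoDataOutput) and the emitters live in a CALayer; the names here are illustrative, and the result would then go to an AVAssetWriter (see the answers further down):

```swift
import CoreImage
import UIKit

let ciContext = CIContext()

func composite(cameraFrame: CVPixelBuffer, overlayLayer: CALayer) -> CIImage {
    let background = CIImage(cvPixelBuffer: cameraFrame)

    // Snapshot the animated overlay layer (call this on the main thread).
    let renderer = UIGraphicsImageRenderer(bounds: overlayLayer.bounds)
    let snapshot = renderer.image { context in
        overlayLayer.render(in: context.cgContext)
    }
    guard let overlay = CIImage(image: snapshot) else { return background }

    // Source-over: the emitter overlay on top of the camera frame.
    return overlay.composited(over: background)
}
```

Snapshotting a layer every frame is not free; whether it keeps up at your target frame rate needs to be measured on a device.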

Record video with custom camera UI but prevent save

I'm using AVCaptureFileOutputRecordingDelegate's didFinishRecordingToOutputFileAt inside my custom camera UI, but I don't want to go through this method, because the video gets saved when it finishes recording.
For legacy reasons I can't keep the video saved locally; I want to take it into a static instance and delete the local copy.
How can I do that?
The AVFoundation framework has only the following outputs for a capture session:
AVCaptureMovieFileOutput - to record and output a movie file
AVCaptureVideoDataOutput - to process frames from the video being captured
AVCaptureAudioDataOutput - to process the audio data being captured
AVCaptureStillImageOutput - to capture still image output
Since you don't want to save the recorded video to a file, the next best option would be to use AVCaptureVideoDataOutput, get each frame of the continuously recording video, and create a video from the image buffers. Note that you will not have audio output in this case. Again, we can add AVCaptureAudioDataOutput and embed the audio separately in the recorded video, but this workaround will not work for higher frame rates. So the best suggestion is to save the video into a temp folder and delete it later.
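A minimal sketch of that last suggestion: record to a temporary file, read the data into memory (the "static instance" from the question), then delete the file. The names are illustrative:

```swift
import AVFoundation

final class TempRecordingDelegate: NSObject, AVCaptureFileOutputRecordingDelegate {
    static var capturedData: Data?   // held in memory instead of on disk

    static func temporaryURL() -> URL {
        FileManager.default.temporaryDirectory
            .appendingPathComponent(UUID().uuidString)
            .appendingPathExtension("mov")
    }

    func fileOutput(_ output: AVCaptureFileOutput,
                    didFinishRecordingTo outputFileURL: URL,
                    from connections: [AVCaptureConnection],
                    error: Error?) {
        guard error == nil else { return }
        // Keep the bytes in memory, then remove the on-disk copy.
        Self.capturedData = try? Data(contentsOf: outputFileURL)
        try? FileManager.default.removeItem(at: outputFileURL)
    }
}
```

Start recording with `movieFileOutput.startRecording(to: TempRecordingDelegate.temporaryURL(), recordingDelegate: delegate)`; the file only exists briefly between finishing and deletion.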

Obj C - How do I apply a GPUImage filter on top of a video that is playing on a loop in an AVPlayer (video player)?

I have a video I am playing using AVPlayer, but I want to apply a swipeable filter view, like Snapchat's, to the video while it's playing in the AVPlayer. So I was wondering how I can use GPUImage filters (such as black and white) to apply filters to the AVPlayer.
NOTE: I know how to write the filter onto a movie file, but I will only do that when the user saves the video; I want them to be able to swipe through filters first and see what each one looks like.
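GPUImage 1.x has a GPUImageMovie source that can wrap an AVPlayerItem, which covers exactly this live-preview case: filter the playing item on screen, and only bake the filter into a file on save. A minimal sketch, assuming the GPUImage 1.x (Objective-C) API bridged into Swift; check your version's headers:

```swift
import AVFoundation
import GPUImage  // Brad Larson's GPUImage 1.x

// Route an AVPlayerItem through a filter for the on-screen preview.
// Swiping between filters would just swap the filter in this chain.
func makeFilteredPreview(url: URL, in view: GPUImageView) -> AVPlayer {
    let item = AVPlayerItem(url: url)
    let player = AVPlayer(playerItem: item)

    let movie = GPUImageMovie(playerItem: item)
    movie.playAtActualSpeed = true

    let filter = GPUImageGrayscaleFilter()   // the "black and white" example
    movie.addTarget(filter)
    filter.addTarget(view)                   // preview only; write to a file
                                             // later, when the user saves
    movie.startProcessing()
    player.play()
    return player
}
```

To loop, observe AVPlayerItemDidPlayToEndTime and seek back to zero, as with any AVPlayer.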

iOS overlaying alpha channel video on another video

I have been trying to create a video template which uses alpha channel video overlaid on MP4 videos and images.
This is an example of the kind of video I need to create: http://viewptch.ptchcdn.com/rendered/52b28a9f8d4f980f3a3f99c3_cb44bf2b/52b28a9f8d4f980f3a3f99c3_lrg_main_main.mov
For overlaying alpha video on other videos I have used AVAnimator, and I succeeded in playing a preview using AVFoundation, AVSynchronizedLayer and AVAnimator.
When rendering video from the composition, however, the frames of the alpha channel videos render very slowly.
I need to create a video with alpha channel video on top of another video.
Can anyone please suggest possible ways to render a video like http://viewptch.ptchcdn.com/rendered/52b28a9f8d4f980f3a3f99c3_cb44bf2b/52b28a9f8d4f980f3a3f99c3_lrg_main_main.mov ?
You mention that you have looked at AVAnimator; did you download the KittyBoom example project and try it out? The specifics of how it works are detailed in this post. One thing to note: when you build and run on the device, you need to turn Debug mode off, otherwise it will not execute quickly, because a number of extra checks are done in debug mode. Also, make sure to test on an actual device; the simulator is not a good measure of performance on real hardware. Performance is a key problem with video that contains an alpha channel, as iOS does not support alpha channel video by default.

Capture video from cam + custom view into single video file

I wonder if it's possible in iOS 4 or 5 to save into a single video file not just the stream from the camera, but the camera stream WITH custom view(s) overlaid. The custom view will contain a few labels with transparent backgrounds. Those labels will show additional info: the current time and GPS coordinates. Every video player must then be able to play back that additional info.
I think you can use AVCaptureVideoDataOutput to process each frame and AVAssetWriter to record the processed frames. You can refer to this answer:
https://stackoverflow.com/a/4944594/379941 .
You can process each CVImageBufferRef and then use AVAssetWriterInputPixelBufferAdaptor's appendPixelBuffer:withPresentationTime: method to export.
I also strongly suggest using OpenCV to process the frames; this is a nice tutorial: http://aptogo.co.uk/2011/09/opencv-framework-for-ios/. The OpenCV library is great.
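A minimal sketch of the write side described above: an AVAssetWriter with a pixel buffer adaptor that appends each processed frame with its timestamp. Drawing the labels onto the buffer would happen before the append; the sizes and names here are assumptions:

```swift
import AVFoundation
import CoreVideo

final class FrameWriter {
    private let writer: AVAssetWriter
    private let input: AVAssetWriterInput
    private let adaptor: AVAssetWriterInputPixelBufferAdaptor

    init(outputURL: URL, width: Int, height: Int) throws {
        writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)
        input = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: width,
            AVVideoHeightKey: height,
        ])
        input.expectsMediaDataInRealTime = true
        adaptor = AVAssetWriterInputPixelBufferAdaptor(
            assetWriterInput: input, sourcePixelBufferAttributes: nil)
        writer.add(input)
    }

    func start(at time: CMTime) {
        writer.startWriting()
        writer.startSession(atSourceTime: time)
    }

    func append(_ buffer: CVPixelBuffer, at time: CMTime) {
        // Drop the frame if the input isn't ready rather than blocking.
        guard input.isReadyForMoreMediaData else { return }
        adaptor.append(buffer, withPresentationTime: time)
    }

    func finish(completion: @escaping () -> Void) {
        input.markAsFinished()
        writer.finishWriting(completionHandler: completion)
    }
}
```

The presentation time for each append should come from the capture output's sample buffer (CMSampleBufferGetPresentationTimeStamp) so the overlaid video stays in sync.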
