Unity ReplayKit: how to bypass the native preview? (iOS)

Using the Unity ReplayKit API from https://docs.unity3d.com/550/Documentation/ScriptReference/Apple.ReplayKit.ReplayKit.html
I can record, preview, and share screen recordings just fine. However, due to design and UX requirements, I need custom buttons to invoke sharing/saving of the recorded clip, and I need to show the clip right after capture (for example, as a video texture on a plane) in the background, with custom share-button overlays.
Is there a way to access the clip captured by ReplayKit and bypass the iOS native preview screen?

With iOS 11, you get direct access to the video frames. See https://developer.apple.com/videos/play/wwdc2017/606/
You can save them using AVAssetWriter, then do whatever you want with the file after that.
This project doesn't completely work, but it will point you in the right direction: https://medium.com/@giridharvc7/replaykit-screen-recording-8ee9a61dd762
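To make that approach concrete, here is a minimal Swift sketch (iOS 11+) that captures screen frames with RPScreenRecorder.startCapture and writes them to a file with AVAssetWriter. The class name, output path, and wiring are illustrative assumptions, not the linked project's code; audio handling and error handling are omitted, and in a Unity project this would live in a native iOS plugin called from C#.

import ReplayKit
import AVFoundation
import UIKit

final class ClipRecorder: NSObject {
    private var writer: AVAssetWriter?
    private var videoInput: AVAssetWriterInput?
    private var sessionStarted = false

    // Hypothetical output location for the finished clip.
    private let outputURL = FileManager.default.temporaryDirectory
        .appendingPathComponent("clip.mp4")

    func start() throws {
        try? FileManager.default.removeItem(at: outputURL)
        let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)

        let settings: [String: Any] = [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: UIScreen.main.bounds.width * UIScreen.main.scale,
            AVVideoHeightKey: UIScreen.main.bounds.height * UIScreen.main.scale
        ]
        let input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
        input.expectsMediaDataInRealTime = true
        writer.add(input)

        self.writer = writer
        self.videoInput = input

        // Deliver raw CMSampleBuffers instead of going through the native preview UI.
        RPScreenRecorder.shared().startCapture(handler: { [weak self] sampleBuffer, type, error in
            guard let self = self, error == nil, type == .video else { return }
            if !self.sessionStarted {
                writer.startWriting()
                writer.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
                self.sessionStarted = true
            }
            if input.isReadyForMoreMediaData {
                input.append(sampleBuffer)
            }
        }, completionHandler: nil)
    }

    func stop(completion: @escaping (URL) -> Void) {
        RPScreenRecorder.shared().stopCapture { [weak self] _ in
            guard let self = self, let writer = self.writer else { return }
            self.videoInput?.markAsFinished()
            writer.finishWriting {
                completion(self.outputURL)
            }
        }
    }
}

Once finishWriting completes, the MP4 at outputURL can be played back however you like (for example, with Unity's VideoPlayer on a plane) while your own share/save buttons sit on top, so the native RPPreviewViewController is never shown.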

Related

How to make a screen-record guiding video for iOS device

I'm creating an iOS app that records the screen using ReplayKit.
I want to make a user-operation video that guides users through recording the screen with my app.
That means I need to record a video of the act of recording the screen, which is a bit awkward.
I tried several approaches, such as Screen Mirroring via AirPlay and QuickTime, but the recording button in Control Center is disabled while Screen Mirroring is active.
The effect I want is as follows:
https://support.apple.com/library/content/dam/edam/applecare/images/en_US/iOS/ios13-iphone-xs-control-center-screen-record-animation.gif
How was this video made?
I think you could recreate that video by recording your screen while performing every action you want to show, saving the recording, and then compositing it on top of an iPhone image asset in an editor.

Output UIView as video stream

Is it possible to stream the content of a UIView as a direct video stream in Swift? I am not really looking for a "view screenshotting" approach where frames are then assembled into a video; that solution works, but the frame rate is far from ideal.
Update: maybe using OpenGL view?
1. View screenshots: what are you currently using to drive the timing? If you use CADisplayLink, you can get a better frame rate. In my project, I get roughly 15-20 fps when live-streaming a full-screen video view on an iPhone 7 Plus.
2. Using ReplayKit: I don't think I need to rewrite the introduction in my own words, because Apple's docs are already clear:
Record or stream video from the screen, and audio from the app and microphone. Using the ReplayKit framework, users can record video from the screen, and audio from the app and microphone. They can then share their recordings with other users through email, messages, and social media. You can build app extensions for live broadcasting your content to sharing services. ReplayKit is incompatible with AVPlayer content.
The frame rate is noticeably higher than drawing screenshots of views, but ReplayKit currently only supports capturing the whole screen.
So if you want to capture just one view, you could approach it this way: crop the relevant region out of the CMSampleBuffer frames it outputs (a minimal sketch of this appears below).
edit: If it's about mirroring a view to an external screen, then there are other solutions besides ReplayKit or view screenshots.
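To make the cropping idea concrete, here is a small Swift sketch that captures the whole screen with RPScreenRecorder.startCapture and crops each frame down to one view's region using Core Image. The class name, coordinate handling, and the place where you would feed the cropped frame to an encoder are illustrative assumptions, not code from the original answer.

import ReplayKit
import CoreImage
import UIKit

final class ViewRegionCapturer {
    private let ciContext = CIContext()

    func start(cropping targetView: UIView) {
        RPScreenRecorder.shared().startCapture(handler: { [weak self] sampleBuffer, type, error in
            guard let self = self, error == nil, type == .video,
                  let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

            // Convert the view's frame (points, top-left origin) into the pixel
            // buffer's coordinate space (pixels, bottom-left origin for Core Image).
            let scale = UIScreen.main.scale
            let bufferHeight = CGFloat(CVPixelBufferGetHeight(pixelBuffer))
            let frameInWindow = targetView.convert(targetView.bounds, to: nil)
            let cropRect = CGRect(
                x: frameInWindow.origin.x * scale,
                y: bufferHeight - (frameInWindow.maxY * scale),
                width: frameInWindow.width * scale,
                height: frameInWindow.height * scale)

            let cropped = CIImage(cvPixelBuffer: pixelBuffer).cropped(to: cropRect)
            if let cgImage = self.ciContext.createCGImage(cropped, from: cropped.extent) {
                // Hand the cropped frame to your encoder or streaming pipeline here.
                _ = cgImage
            }
        }, completionHandler: nil)
    }

    func stop() {
        RPScreenRecorder.shared().stopCapture(handler: nil)
    }
}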

HTML Recording video on iOS

I'm new to HTML coding, and I'm currently trying to build an iOS app like Snapchat that takes the user's camera feed and keeps recording even if the user goes into the main menu or elsewhere. I'm looking for some HTML5 code that will let the main interface simply be the back camera's output, with buttons that I'll overlay on top.
A few searches have led me here: http://www.html5rocks.com/en/tutorials/getusermedia/intro/
I have tried to make this work, but iOS does not support it.
I'm basically asking: how do I make an app that records video, with the camera output on screen from the start?
You could write a web app to do this, but not a native app (i.e. one distributed through the App Store). For a native app, you'll need to learn Objective-C or Swift and then take a look at the AVFoundation framework.
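If you do go the native route, a minimal Swift sketch of what that AVFoundation suggestion looks like might be the following. All names and the output path are illustrative assumptions; you also need camera and microphone usage descriptions in Info.plist, and error handling is omitted.

import AVFoundation
import UIKit

final class CameraViewController: UIViewController {
    private let session = AVCaptureSession()
    private let movieOutput = AVCaptureMovieFileOutput()

    override func viewDidLoad() {
        super.viewDidLoad()

        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video, position: .back),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input), session.canAddOutput(movieOutput) else { return }

        session.addInput(input)
        session.addOutput(movieOutput)

        // The full-screen preview layer is the "main interface"; overlay your own buttons on top.
        let preview = AVCaptureVideoPreviewLayer(session: session)
        preview.frame = view.bounds
        preview.videoGravity = .resizeAspectFill
        view.layer.insertSublayer(preview, at: 0)

        session.startRunning()

        // Start recording as soon as the camera view appears.
        let url = FileManager.default.temporaryDirectory.appendingPathComponent("capture.mov")
        movieOutput.startRecording(to: url, recordingDelegate: self)
    }
}

extension CameraViewController: AVCaptureFileOutputRecordingDelegate {
    func fileOutput(_ output: AVCaptureFileOutput,
                    didFinishRecordingTo outputFileURL: URL,
                    from connections: [AVCaptureConnection],
                    error: Error?) {
        // The finished clip is at outputFileURL; save or share it from here.
    }
}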

Swift - Slow Motion Camera in App

I've been learning Swift and iOS development and am interested in making an application that uses the camera on the back of the iPhone to shoot slow-motion video and then lets the user move through the video with a slider, both normally and frame by frame. Of course, this would be controlled by a button that switches the user from "normal navigation" to frame-by-frame mode. Can anyone point me in the right direction? Are there any resources where I can read about using the "slo-mo" feature in a custom app? Thanks.
If you don't need the user to be able to save an actual slow-motion video, what you could do is take a normal video and then have the app play it back in slow motion. You can do all of this using the AVFoundation framework. Here are some links to help you out:
Taking Control of the iPhone Camera in iOS 8
Objective-C: How to do slow-motion video in iOS (yes, this is for Objective-C, but you can change the code to Swift or just copy and paste it into a new Objective-C file)
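As an illustration of that record-then-play-back-slowly idea, here is a small Swift sketch built on AVPlayer; the class and method names are hypothetical, and the slider is assumed to report a 0...1 fraction.

import AVFoundation

final class SlowMoPlaybackController {
    let player: AVPlayer

    init(videoURL: URL) {
        player = AVPlayer(url: videoURL)
    }

    /// Slow-motion playback, e.g. rate = 0.25 for quarter speed.
    func playSlowMotion(rate: Float = 0.25) {
        player.play()
        player.rate = rate
    }

    /// Frame-by-frame mode: advance (or rewind with -1) a single frame.
    func stepForwardOneFrame() {
        player.pause()
        player.currentItem?.step(byCount: 1)
    }

    /// "Normal navigation": scrub to a position driven by a slider value in 0...1.
    func scrub(to fraction: Float) {
        guard let duration = player.currentItem?.duration, duration.isNumeric else { return }
        let target = CMTimeMultiplyByFloat64(duration, multiplier: Float64(fraction))
        player.seek(to: target, toleranceBefore: .zero, toleranceAfter: .zero)
    }
}

For genuine high-frame-rate ("slo-mo") capture, you would additionally pick an AVCaptureDevice format whose videoSupportedFrameRateRanges includes 120 or 240 fps and set the device's activeVideoMinFrameDuration accordingly, but that is beyond this sketch.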

iOS App: AVFoundation Video camera recording delegates

I am trying to develop an app that records video using the rear camera.
I want the video to be tweaked (adding an overlay image, green screen) before saving it to a file.
I am using AVFoundation and have researched everywhere, but I still can't put together the correct code to preview and record (even without the tweaking).
Some code samples I found on the internet use a buffer delegate, while the rest use a recording delegate.
Do I need to use both to achieve what I want, or just one of them?
Could you please point me to how I can achieve this?
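For context on the two delegates the question mentions: AVCaptureMovieFileOutput's AVCaptureFileOutputRecordingDelegate writes frames straight to a file with no per-frame access, while AVCaptureVideoDataOutput's AVCaptureVideoDataOutputSampleBufferDelegate hands you each frame, which is what per-frame tweaking such as overlays or green-screen keying requires. A minimal Swift sketch of the buffer-delegate side follows; the class name and overlay handling are illustrative assumptions, and writing the processed frames out with AVAssetWriter is not shown.

import AVFoundation
import CoreImage

final class FrameProcessor: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let videoOutput = AVCaptureVideoDataOutput()
    private let ciContext = CIContext()

    /// Optional overlay to composite onto every frame (hypothetical input).
    var overlayImage: CIImage?

    func configure() {
        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video, position: .back),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input), session.canAddOutput(videoOutput) else { return }

        session.addInput(input)
        videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "video.frames"))
        session.addOutput(videoOutput)
        session.startRunning()
    }

    // Called once per captured frame.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        var frame = CIImage(cvPixelBuffer: pixelBuffer)

        // Example tweak: composite an overlay image on top of the frame.
        // Green-screen keying would be a custom filter applied here instead.
        if let overlay = overlayImage {
            frame = overlay.composited(over: frame)
        }

        // Append `frame` to an AVAssetWriter (via a pixel buffer adaptor) to save it.
        _ = frame
    }
}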
