Output UIView as video stream - iOS

Is it possible to stream the contents of a UIView as a direct video stream in Swift? I am not really looking for a "view screenshotting" approach and then assembling a video; that solution is possible, but the frame rate is far from ideal.
Update: maybe using an OpenGL view?

1. View screenshots: What are you currently using as a timing mechanism?
I believe that if you use CADisplayLink, you can get a better frame rate. In my project I get roughly 15–20 fps live-streaming a full-screen video view on an iPhone 7 Plus.
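For reference, here is a minimal sketch of the CADisplayLink approach, assuming you push each captured image into your own encoder/streaming pipeline (the `handleFrame` closure is a hypothetical hook, not a framework API):

```swift
import UIKit

// Grabs frames of a view on every display-link tick, capped at ~20 fps.
final class ViewFrameGrabber {
    private var displayLink: CADisplayLink?
    private let view: UIView
    var handleFrame: ((UIImage) -> Void)?   // hypothetical hook into your pipeline

    init(view: UIView) { self.view = view }

    func start() {
        let link = CADisplayLink(target: self, selector: #selector(captureFrame))
        link.preferredFramesPerSecond = 20   // cap at what capture can sustain
        link.add(to: .main, forMode: .common)
        displayLink = link
    }

    func stop() {
        displayLink?.invalidate()
        displayLink = nil
    }

    @objc private func captureFrame() {
        let renderer = UIGraphicsImageRenderer(bounds: view.bounds)
        let image = renderer.image { _ in
            _ = view.drawHierarchy(in: view.bounds, afterScreenUpdates: false)
        }
        handleFrame?(image)
    }
}
```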
2. Using ReplayKit: I don't think I need to rewrite the introduction, because Apple's docs are already clear:
Record or stream video from the screen, and audio from the app and microphone.
Using the ReplayKit framework, users can record video from the screen, and audio from the app and microphone. They can then share their recordings with other users through email, messages, and social media. You can build app extensions for live broadcasting your content to sharing services. ReplayKit is incompatible with AVPlayer content.
The frame rate is considerably higher than drawing screenshots of views, but ReplayKit currently only supports capturing the whole screen. So if you want to capture just one view, consider this approach: crop the pixel buffer of each output CMSampleBuffer frame (see the sketch below).
Edit: if this is about mirroring a view to an external screen, then there are other solutions besides ReplayKit or view screenshots.
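A minimal sketch of the cropping idea, assuming iOS 11+ for startCapture; `viewRectInPixels` is a hypothetical rectangle you would compute from the view's frame multiplied by the screen scale:

```swift
import ReplayKit
import CoreMedia
import CoreImage

// Captures the whole screen with ReplayKit and crops each frame to a view's region.
final class ViewCapturer {
    func startCapturing(viewRectInPixels: CGRect,
                        onFrame: @escaping (CIImage) -> Void) {
        RPScreenRecorder.shared().startCapture(handler: { sampleBuffer, type, error in
            guard error == nil, type == .video,
                  let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
            // Crop the full-screen frame down to the region covered by the view.
            let fullFrame = CIImage(cvPixelBuffer: pixelBuffer)
            onFrame(fullFrame.cropped(to: viewRectInPixels))
        }, completionHandler: { error in
            if let error = error { print("Capture failed to start: \(error)") }
        })
    }

    func stop() {
        RPScreenRecorder.shared().stopCapture { _ in }
    }
}
```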

Related

YouTube Iframe Player API Developer Policy Inquiries

I would like to ask about the developer policies of the YouTube Iframe Player API (https://developers.google.com/youtube/iframe_api_reference).
Our team is currently developing an iOS app that includes the embedded YouTube player, and we would like to check whether our use cases below violate the rules for the embedded player.

1. Would it be okay to use an invisible player (opacity set to zero) for the purpose of getting thumbnail images from a video? The thumbnails are taken by the invisible player by seeking to a specific time and capturing the view at that moment. They are used solely for a progress bar made up of sequential images (the green box in the image below), where each image represents a part of the video; there is no other purpose. The images are created and used only in the client's local environment and are not saved in our database. Additionally, may we provide thumbnails extracted this way with the background removed?
2. Would it be possible to hide the title bar that is shown when the player is paused, by adjusting the CSS style or the view frame size? Is there another way to hide the title bar? It seems that other mobile app services in Korea, such as 'Cake' (https://apps.apple.com/kr/app/cake-케이크-영어회화/id1350420987), also use the embedded player without a title bar.
3. We are developing a 'Metronome' feature using BPM (beats per minute). For this we need the audio PCM data of the video, and we are currently examining three cases: Is it allowed to extract an audio file from the video to get PCM data? Is it allowed to access the device's audio buffer to get PCM data? Is it allowed to record the audio with the mobile device to produce an audio file?
4. Is it possible to provide a metronome function that repeatedly plays a specific sound simultaneously with the audio of the YouTube video?
5. Would it be okay to mirror the embedded player horizontally? We plan to provide a mirror mode so that users can practice dancing more conveniently.
6. Is it possible to provide a function that repeatedly plays a specific time range (e.g. from 00:05 to 00:10)?
7. Is it possible to provide a function that, when the play button is pressed, does not start playback immediately but counts down for a certain period of time, such as 3 seconds, before playing?

Displaying 360 Video from an iOS App to an AirPlay Device

A client's app displays two versions of a video: one regular, one a 360° view. The regular video is handled by AVPlayer. The 360° video is rendered by the open-source package Swifty360Player (which works very well, by the way). The client would like to be able to display either video on a big screen using AirPlay.
For the normal video, this is no problem. The 360° video, however, is produced by a SceneKit view, so it's technically more akin to a 3D game than to a video. I know that we can display game scenes on an AirPlay device if/when the user manually mirrors their iPhone/iPad to the AirPlay screen.
But I wonder: is there any way to generate a live video stream from the SceneKit view and transmit that stream to the AirPlay device in real time? My wish is that the user could then use an AVRoutePickerView to select an AirPlay device from within the app.
ReplayKit does this for streaming services like Twitch, but this app isn't looking to broadcast, just to share video with a single screen in the same room.
Is there any way to accomplish this?
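For what it's worth, presenting the system route picker from inside the app is the easy part; a minimal sketch (this only shows the picker UI, it does not by itself route SceneKit content anywhere):

```swift
import AVKit
import UIKit

// Adds the system AirPlay route picker (iOS 11+) to a view controller.
final class PlayerViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let routePicker = AVRoutePickerView(frame: CGRect(x: 20, y: 60, width: 44, height: 44))
        routePicker.activeTintColor = .systemBlue   // tint while an AirPlay route is active
        view.addSubview(routePicker)
    }
}
```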

Add scroll playback controls to AVPlayer like in Photos app

I'm creating an AVPlayer view to watch a stream from CCTV, and I want playback controls like the ones in the Photos app on iPhone (open Photos, play some video; at the bottom there is a UIScrollView with thumbnails of your video), as in the image below.
Could I ask for some help on how to create something like this?
This is my first question here, so sorry if something is wrong. =)
(Image: scroll-view playback controls)
AVPlayer cannot do this for streams. The suggestion of taking snapshots would also not come close to what you want, which is the behaviour of the local video preview in the Photos app on iOS (a sketch of the snapshot approach, and its limitation, is below).
If you are looking for this kind of experience, I would recommend heading over to movi.ai to get your hands on our cross-platform solution with this ready out of the box.
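For completeness, a minimal sketch of the snapshot approach using AVAssetImageGenerator; note that it needs a seekable asset with a finite duration, so it will not work for a live CCTV stream:

```swift
import AVFoundation
import UIKit

// Generates a row of evenly spaced thumbnails for a seekable asset, suitable
// for populating a filmstrip-style UIScrollView.
func generateFilmstrip(for asset: AVAsset,
                       thumbnailCount: Int,
                       completion: @escaping ([UIImage]) -> Void) {
    let duration = asset.duration.seconds
    guard duration.isFinite, duration > 0, thumbnailCount > 0 else {
        completion([])   // live streams report an indefinite duration
        return
    }

    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true
    generator.maximumSize = CGSize(width: 120, height: 0)   // scale width, keep aspect

    let times = (0..<thumbnailCount).map {
        NSValue(time: CMTime(seconds: duration * Double($0) / Double(thumbnailCount),
                             preferredTimescale: 600))
    }

    var images: [UIImage] = []
    var handled = 0
    generator.generateCGImagesAsynchronously(forTimes: times) { _, cgImage, _, result, _ in
        if result == .succeeded, let cgImage = cgImage {
            images.append(UIImage(cgImage: cgImage))
        }
        handled += 1
        if handled == times.count {   // naive completion check
            DispatchQueue.main.async { completion(images) }
        }
    }
}
```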

I want to design a screen that shows a different number of videos on the same screen based on certain conditions

This is an iOS problem.
I want to design a screen that shows a video full-screen. After some time, based on a backend condition, if another video becomes available I have to show it by splitting the screen into two vertical halves. After some more time, if a third video becomes available, I have to split the screen again and show it horizontally along the bottom of the screen.
I am new to iOS and cannot manage the screen split at runtime based on the backend condition. Please help me in this regard.
Using AVPlayer it is possible to play multiple videos in a single view. You can use Apple's AVPlayer:
An AVPlayer is a controller object used to manage the playback and timing of a media asset. You can use an AVPlayer to play local and remote file-based media, such as QuickTime movies and MP3 audio files, as well as audiovisual media served using HTTP Live Streaming.
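A minimal sketch of one way to handle the split, assuming the layout described in the question (one full-screen video, then two vertical halves, then a third video across the bottom); `addVideo(url:)` is a hypothetical entry point your backend-condition handler would call:

```swift
import AVFoundation
import UIKit

// Hosts up to three AVPlayerLayers and re-lays them out as videos arrive.
final class MultiVideoView: UIView {
    private var playerLayers: [AVPlayerLayer] = []

    func addVideo(url: URL) {
        guard playerLayers.count < 3 else { return }
        let player = AVPlayer(url: url)
        let playerLayer = AVPlayerLayer(player: player)
        playerLayer.videoGravity = .resizeAspectFill
        layer.addSublayer(playerLayer)
        playerLayers.append(playerLayer)
        setNeedsLayout()
        player.play()
    }

    override func layoutSubviews() {
        super.layoutSubviews()
        let b = bounds
        CATransaction.begin()
        CATransaction.setDisableActions(true)   // no implicit animations on relayout
        switch playerLayers.count {
        case 1:   // single video fills the screen
            playerLayers[0].frame = b
        case 2:   // two vertical halves, side by side
            playerLayers[0].frame = CGRect(x: 0, y: 0, width: b.width / 2, height: b.height)
            playerLayers[1].frame = CGRect(x: b.width / 2, y: 0, width: b.width / 2, height: b.height)
        case 3:   // two on top, third across the bottom
            playerLayers[0].frame = CGRect(x: 0, y: 0, width: b.width / 2, height: b.height / 2)
            playerLayers[1].frame = CGRect(x: b.width / 2, y: 0, width: b.width / 2, height: b.height / 2)
            playerLayers[2].frame = CGRect(x: 0, y: b.height / 2, width: b.width, height: b.height / 2)
        default:
            break
        }
        CATransaction.commit()
    }
}
```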

How to implement screen recording with audio in iOS programmatically?

I have a requirement for screen recording with audio. I have done some googling and learned how to implement screen recording, but I am wondering how to save the audio at the same time.
Is there any possibility that we can merge the video and audio and then save the final data to disk?
I am not sure that will be feasible, because there will be differences in syncing the audio and video frames.
For screen recording I found a link to ScreenCaptureView, which actually lets you save the screen recording.
On iOS 9 there is ReplayKit, a framework that can be used to make screen recordings of video games. It seems you can also use it for common screen capture (a sketch is below).
For lower platforms it's kind of a big deal: video screen capture exists but lives in a private framework (IOSurface, I guess). There are some workarounds, as in this project; basically it grabs single screenshots and appends them to a movie file, without audio.
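A minimal sketch of the ReplayKit route with microphone audio enabled (the `startRecording(handler:)` form shown is iOS 10+); stopping hands you a preview controller the user can save or share from, so you don't have to merge audio and video yourself:

```swift
import ReplayKit
import UIKit

// Records the screen plus microphone audio and presents the system preview UI.
final class RecorderViewController: UIViewController, RPPreviewViewControllerDelegate {
    func startRecording() {
        let recorder = RPScreenRecorder.shared()
        recorder.isMicrophoneEnabled = true   // capture mic audio alongside the screen
        recorder.startRecording { error in
            if let error = error { print("Could not start recording: \(error)") }
        }
    }

    func stopRecording() {
        RPScreenRecorder.shared().stopRecording { previewController, error in
            guard let previewController = previewController, error == nil else { return }
            previewController.previewControllerDelegate = self
            self.present(previewController, animated: true)
        }
    }

    func previewControllerDidFinish(_ previewController: RPPreviewViewController) {
        previewController.dismiss(animated: true)
    }
}
```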
