YouTube IFrame Player API Developer Policy Inquiries

I would like to ask about the developer policies of the YouTube IFrame Player API (https://developers.google.com/youtube/iframe_api_reference).
Our team is currently developing an iOS app that includes the YouTube embedded player.
We would like to check whether the use cases below violate the rules for the embedded YouTube player.
Would it be okay if we use an invisible player (with its opacity set to zero) for the purpose of getting thumbnail images from a video?
The thumbnails are produced by the invisible player by seeking to a specific time and capturing the view at that moment.
The thumbnails are used solely for a progress bar that consists of sequential images (the green box in the image below), where each image represents a part of the video; there is no other purpose.
The images are created and used only in the client's local environment and are not saved to our database.
Additionally, can we provide thumbnails extracted by the above method with the background removed?
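For reference, here is a minimal sketch of the capture step described above, assuming the invisible player is hosted in an ordinary UIView (for example a web view); the playerView property, the timing of the seek, and whether the rendered video frame actually shows up in the snapshot all depend on how the player is hosted, so treat this as an assumption rather than part of the IFrame API:

    import UIKit

    final class ThumbnailCapturer {
        // The view hosting the invisible embedded player (hypothetical).
        private let playerView: UIView

        init(playerView: UIView) {
            self.playerView = playerView
            playerView.alpha = 0   // keep the player laid out but invisible
        }

        // Draws the player view's current contents into a UIImage.
        // Call this after seeking to the desired timestamp and waiting
        // for the frame to render.
        func captureCurrentFrame() -> UIImage {
            // A fully transparent view renders as transparent, so restore
            // the opacity just for the draw and hide the view again afterwards.
            let previousAlpha = playerView.alpha
            playerView.alpha = 1
            defer { playerView.alpha = previousAlpha }

            let renderer = UIGraphicsImageRenderer(bounds: playerView.bounds)
            return renderer.image { _ in
                _ = playerView.drawHierarchy(in: playerView.bounds, afterScreenUpdates: true)
            }
        }
    }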
Would it be possible to hide the title bar shown below, which appears when the player is paused, by adjusting the CSS style or the view frame size?
Is there another way to hide the title bar?
It seems that other mobile app services in Korea also use the embedded player without a title bar, for example 'Cake' (https://apps.apple.com/kr/app/cake-케이크-영어회화/id1350420987).
We are developing a 'Metronome' feature based on BPM (beats per minute). For this we need the audio PCM data from the video, and we are currently examining three approaches.
Is it allowed to extract an audio file from the video to get the PCM data?
Is it allowed to access the device's audio buffer to get the PCM data?
Is it allowed to record the audio with the mobile device to create an audio file?
Is it possible to provide a metronome function that repeatedly plays a specific sound simultaneously with the audio of the YouTube video?
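For what it's worth, a minimal sketch of that last case, assuming the metronome is just a short click sound played repeatedly at the chosen BPM while the embedded player keeps producing its own audio; the click file is a placeholder and BPM detection is out of scope here:

    import AVFoundation

    final class Metronome {
        private let clickPlayer: AVAudioPlayer
        private var timer: Timer?

        // clickURL points at a short click sound bundled with the app (placeholder).
        init(clickURL: URL) throws {
            clickPlayer = try AVAudioPlayer(contentsOf: clickURL)
            clickPlayer.prepareToPlay()
        }

        // Starts ticking at the given tempo; the embedded player keeps
        // playing its own audio independently of this.
        func start(bpm: Double) {
            stop()
            let interval = 60.0 / bpm
            timer = Timer.scheduledTimer(withTimeInterval: interval, repeats: true) { [weak self] _ in
                self?.clickPlayer.currentTime = 0
                self?.clickPlayer.play()
            }
        }

        func stop() {
            timer?.invalidate()
            timer = nil
        }
    }

A Timer drifts slightly over time, so a stricter version would schedule the clicks against an audio clock, but the idea is the same.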
Would it be okay if we mirror the embedded player horizontally (invert left and right)? We plan to provide a mirror mode so that users can practice dance routines more conveniently.
Is it possible to provide a function that repeatedly plays a specific time range (e.g. from 00:05 to 00:10)?
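As a rough sketch of what such a loop could look like when the IFrame player runs inside a WKWebView: the assumption here is that the embed page exposes a global player object, so getCurrentTime and seekTo can be reached through evaluateJavaScript; this is not an official API of any native SDK:

    import WebKit

    final class SegmentLooper {
        private weak var webView: WKWebView?
        private var timer: Timer?

        init(webView: WKWebView) {
            self.webView = webView
        }

        // Polls the player's current time and jumps back to `start`
        // whenever playback runs past `end` (times in seconds).
        func loop(from start: Double, to end: Double) {
            timer?.invalidate()
            timer = Timer.scheduledTimer(withTimeInterval: 0.25, repeats: true) { [weak self] _ in
                // Assumes the embed page defines a global `player` object (hypothetical).
                self?.webView?.evaluateJavaScript("player.getCurrentTime();") { result, _ in
                    guard let current = result as? Double else { return }
                    if current >= end {
                        self?.webView?.evaluateJavaScript("player.seekTo(\(start), true);", completionHandler: nil)
                    }
                }
            }
        }

        func stop() {
            timer?.invalidate()
            timer = nil
        }
    }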
Is it possible to provide a function that does not start playback immediately when the play button is pressed, but instead counts down for a certain period, such as 3 seconds, before playing?

Related

I want to design a screen which will show a different number of videos on the same screen based on certain conditions

This is an iOS problem.
I want to design a screen which shows a video in full screen. After some time, based on a backend condition, if another video becomes available I have to show it by splitting the screen into two vertical halves. After some more time, if a third video is available, I have to split the screen again and show the third video horizontally at the bottom of the screen.
I am new to iOS and I am not able to manage the screen split at runtime based on the backend condition. Please help me in this regard.
Using AVPlayer it is possible to play multiple videos in a view. You can use Apple's AVPlayer:
An AVPlayer is a controller object used to manage the playback and timing of a media asset. You can use an AVPlayer to play local and remote file-based media, such as QuickTime movies and MP3 audio files, as well as audiovisual media served using HTTP Live Streaming.
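For example, here is a minimal sketch of the two-vertical-halves stage, just to show that several AVPlayers can live in one view; the URLs are placeholders, and the same idea extends to the third, horizontal split by adjusting the frames (or by driving the layout with a UIStackView whose arranged subviews you add or remove when the backend condition changes):

    import UIKit
    import AVFoundation

    final class SplitPlayerViewController: UIViewController {
        private let leftPlayer = AVPlayer()
        private let rightPlayer = AVPlayer()
        private let leftLayer = AVPlayerLayer()
        private let rightLayer = AVPlayerLayer()

        override func viewDidLoad() {
            super.viewDidLoad()
            leftLayer.player = leftPlayer
            rightLayer.player = rightPlayer
            leftLayer.videoGravity = .resizeAspectFill
            rightLayer.videoGravity = .resizeAspectFill
            view.layer.addSublayer(leftLayer)
            view.layer.addSublayer(rightLayer)

            // Placeholder URLs; replace with the streams your backend provides.
            leftPlayer.replaceCurrentItem(with: AVPlayerItem(url: URL(string: "https://example.com/a.mp4")!))
            rightPlayer.replaceCurrentItem(with: AVPlayerItem(url: URL(string: "https://example.com/b.mp4")!))
            leftPlayer.play()
            rightPlayer.play()
        }

        override func viewDidLayoutSubviews() {
            super.viewDidLayoutSubviews()
            // Two vertical halves; recompute whenever the layout changes.
            let half = view.bounds.width / 2
            leftLayer.frame = CGRect(x: 0, y: 0, width: half, height: view.bounds.height)
            rightLayer.frame = CGRect(x: half, y: 0, width: half, height: view.bounds.height)
        }
    }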

Output UIView as video stream

Is it possible to stream the content of a UIView as a direct video stream in Swift? I am not really looking for a "view screenshotting" approach where the screenshots are then assembled into a video; that solution is possible, but the frame rate is far from ideal.
Update: maybe using OpenGL view?
1. View screenshots: What is your current timing mechanism?
I believe that if you use CADisplayLink you can get a better frame rate. In my project I get roughly 15-20 fps when live-streaming a full-screen video view on an iPhone 7 Plus.
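A rough sketch of that CADisplayLink-driven capture, with the frame consumer left as a placeholder for whatever encodes or streams the images:

    import UIKit

    final class ViewCapturer {
        private let targetView: UIView
        private var displayLink: CADisplayLink?

        // Called once per captured frame; plug your encoder/streamer in here (placeholder).
        var handleFrame: ((UIImage) -> Void)?

        init(targetView: UIView) {
            self.targetView = targetView
        }

        func start() {
            let link = CADisplayLink(target: self, selector: #selector(captureFrame))
            link.preferredFramesPerSecond = 20   // capturing is expensive, so cap the rate
            link.add(to: .main, forMode: .common)
            displayLink = link
        }

        func stop() {
            displayLink?.invalidate()
            displayLink = nil
        }

        @objc private func captureFrame() {
            let renderer = UIGraphicsImageRenderer(bounds: targetView.bounds)
            let image = renderer.image { _ in
                _ = targetView.drawHierarchy(in: targetView.bounds, afterScreenUpdates: false)
            }
            handleFrame?(image)
        }
    }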
2. Using ReplayKit: I don't think I need to rewrite the introduction in my own words, because Apple's docs are already clear:
Record or stream video from the screen, and audio from the app and microphone. Using the ReplayKit framework, users can record video from the screen, and audio from the app and microphone. They can then share their recordings with other users through email, messages, and social media. You can build app extensions for live broadcasting your content to sharing services. ReplayKit is incompatible with AVPlayer content.
The frame rate is considerably higher than drawing screenshots of views, but currently it only supports capturing the whole screen.
So if you want to capture just a single view, you might think about it this way: crop the pixel buffer of each output CMSampleBuffer frame.
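A minimal sketch of the ReplayKit route (iOS 11+), which starts a full-screen capture and hands each frame over as a CMSampleBuffer; the cropping mentioned above is only indicated by a comment:

    import ReplayKit

    final class ScreenStreamer {
        private let recorder = RPScreenRecorder.shared()

        func start() {
            recorder.startCapture(handler: { sampleBuffer, bufferType, error in
                guard error == nil, bufferType == .video else { return }
                // sampleBuffer covers the whole screen; crop its pixel buffer
                // here if only one view is of interest (placeholder), e.g. via
                // CMSampleBufferGetImageBuffer(sampleBuffer).
            }, completionHandler: { error in
                if let error = error {
                    print("Could not start capture: \(error)")
                }
            })
        }

        func stop() {
            recorder.stopCapture { error in
                if let error = error {
                    print("Could not stop capture: \(error)")
                }
            }
        }
    }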
Edit: if it is about mirroring a view to an external screen, then there are other possible solutions besides ReplayKit or view screenshots.

How do I change the YouTube player on my web site

How do I change the YouTube player on my web site? Something like http://vidsshare.com/.
Use the IFrame API for the embedded video. The IFrame Player API lets you embed a YouTube video player on your website and control the player using JavaScript.
Using the API's JavaScript functions, you can queue videos for playback; play, pause, or stop those videos; adjust the player volume; or retrieve information about the video being played. You can also add event listeners that will execute in response to certain player events, such as a player state change or a video playback quality change.
Just check the link above to learn more about it.
For more information, check the YouTube Player Demo, which demonstrates the YouTube Player API's functions. Embedded players must have a viewport that is at least 200px by 200px. If the player displays controls, it must be large enough to fully display the controls without shrinking the viewport below the minimum size.

How can we overcome the missing muted/volume property on HTML5 video on UIWebView/iOS Safari?

As many hybrid app developers know, Apple has decided to disallow setting the volume property of HTML5 video elements in JavaScript. This also applies to the muted property. The concept of muted videos which autoplay when scrolled into view, with the option of unmuting on tap, is growing increasingly popular (pioneered by Vine, Facebook, etc.). I'm trying to find a way around this limitation in my design. From what I've been able to read on the subject, there isn't any hack or solution that solves this design requirement of mine.
Here's my thoughts so far:
I could split the audio from the video into a separate stream, sync its current time with the video, and call play() when the user taps. However, iOS Safari/UIWebView does not support simultaneous audio/video streams. Thus, this is simply not an option.
I could encode two videos, one with sound and one without. I could then swap the src on tap. However, this requires reloading the entire stream and also nearly doubles the amount of data required. The latency is noticeable. Thus, this won't be a viable solution.
I could embed a native AVPlayer class element in the webview. However, this would be an overlay and not be manageable from within the webview. Custom controls and UI interaction from within the dom would not be possible. Thus, this is not an option.
I could simply disable the output of the app and dynamically switch it on whenever the user taps a video element. However, to my knowledge this is not possible. I could show the native software volume slider, but that would defeat the purpose of this whole thing.
Do you have any suggestions or ways around this limitation?
I managed to find an acceptable solution. I split the videos into three files: one without audio, one without video, and one with both video and audio for desktop browsers/Android.
It seems that running simultaneous streams CAN work as long as they don't conflict with each other, which basically means a separate audio track and a video with no audio channels play just fine in unison.

AVComposition breaks on AirPlay

I have a video composition which I'd like to play over AirPlay (without mirroring). The app works as expected when using normal AirPlay mirroring, but I'd like to get the speed, reliability, and resolution bump you get from using AirPlay video instead.
The problem is that when I set
player.usesAirPlayVideoWhileAirPlayScreenIsActive = YES;
...the player goes blank.
Notes:
Since I don't create separate windows for each display, they are both trying to use the same AVPlayer.
My AVVideoComposition contains different files and adds opacity ramps between them.
This unanswered question suggests that the problem is more likely due to the fact that I'm playing an AVComposition than the use of a shared player: AVComposition doesn't play via Airplay Video
Two questions:
Do I have to get rid of the player on the iPad?
Can an AVVideoComposition ever be played over AirPlay?
I can't make comments so I had to post this as an answer although it might not fully respond to the questions.
I had a similar issue, and in the end I found out that when AVPlayer plays an AVComposition it simply doesn't display anything on the external display. That's why I had to handle it myself, by listening for UIScreen connection notifications.
I have to say it all worked pretty much perfectly. I first check whether there is more than one screen, and if there is, I simply move the AVPlayer to that screen while displaying a simple message on the device's screen that the content is playing on... plus the name of the AirPlay device. This way I can put whatever I want on the external display, and it is not very complicated. I do the same thing when I receive UIScreenDidConnectNotification.
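A condensed sketch of that approach, assuming a single shared AVPlayer whose layer is re-parented into a window on the external screen when one connects (on iOS 13+ this would go through window scenes instead); the "now playing on ..." message on the device's own screen is omitted:

    import UIKit
    import AVFoundation

    final class ExternalDisplayController {
        private let playerLayer: AVPlayerLayer
        private var externalWindow: UIWindow?

        init(playerLayer: AVPlayerLayer) {
            self.playerLayer = playerLayer
            NotificationCenter.default.addObserver(self,
                selector: #selector(screenDidConnect(_:)),
                name: UIScreen.didConnectNotification, object: nil)
            NotificationCenter.default.addObserver(self,
                selector: #selector(screenDidDisconnect(_:)),
                name: UIScreen.didDisconnectNotification, object: nil)
        }

        @objc private func screenDidConnect(_ note: Notification) {
            guard let screen = note.object as? UIScreen else { return }
            // Build a window on the external screen and move the player layer there.
            let window = UIWindow(frame: screen.bounds)
            window.screen = screen
            window.rootViewController = UIViewController()
            window.isHidden = false
            playerLayer.frame = window.bounds
            window.rootViewController?.view.layer.addSublayer(playerLayer)
            externalWindow = window
        }

        @objc private func screenDidDisconnect(_ note: Notification) {
            // Tear down the external window; re-attach the player layer
            // to the device's own view hierarchy here.
            externalWindow = nil
        }
    }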
That was fine until I noticed that the composition plays back really choppily on the external display, even if it consists of only one video without any complex edits or overlays. The same video plays perfectly if I save it to the Camera Roll or if I use MPMoviePlayerController.
I've tried many things, like lowering resolutions, lowering the renderScale, and so on, but with no success.
One thing that bothers me more is how Apple actually does this in iMovie: if you have AirPlay enabled and you play a project (note that it is still not rendered, so it must be using a composition in order to display it), then right after tapping the play button it opens a player that plays the content really smoothly on the external monitor. If, however, you activate AirPlay from within the player, it closes and starts rendering the project; after that it plays it, I think, by using MPMoviePlayerController.
I'm still trying to find a solution and will post back if I have any success.
So for the two questions:
I don't see why you would have to get rid of it.
Yes, it can be played, but with a different technique and, obviously, some issues.
In the app's .plist, create a new item called:
Required background modes (the raw key is UIBackgroundModes)
and add a new array element called:
App plays audio or streams audio/video using AirPlay (the raw value is audio)
Not sure if you have already tried this, but you don't mention it in your post.
Cheers!
