YouTube Player API for Android does not automatically change the video quality - android-youtube-api

I hope someone can help me.
We are developing an Android app that integrates the latest YouTube Player API. Everything seems fine when the user loads a video (using the video ID), and the user is able to watch it.
However, in our lab we have identified strange behaviour in the YouTube player when the quality of the user's network degrades. We expected the player to adapt the video quality automatically to the network conditions (we make no explicit quality selection, assuming the default is automatic), but it does not, whereas the native YouTube app does.
Let me describe the test in which we observed this behaviour:
An Android device with the native YouTube app and our app installed, connected to our WLAN network
The WLAN provides internet access, and we can inject impairments into that network (e.g. reduce the bandwidth)
Configure excellent bandwidth in the network
Start the video and watch for a few seconds
Configure poor bandwidth (e.g. 256 kbit/s) in the network
After a few seconds:
a. The YouTube app stalls briefly, then decreases the quality and continues playing the video.
b. Our app (AT4-App) stalls for longer and then continues playing at the initial quality.
We believe we are already using the YouTube API to its full extent, so we don't know whether we can upgrade our app to behave exactly like the native YouTube app; it would make little sense for the API to offer less functionality than the native app, which presumably relies on the same APIs.
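For reference, our integration is essentially the minimal pattern sketched below (the developer key, video ID, and resource names are placeholders); as far as we can tell, the public YouTubePlayer interface exposes playback control but no explicit quality setter, which is why we rely on the default behaviour:

```java
import android.os.Bundle;

import com.google.android.youtube.player.YouTubeBaseActivity;
import com.google.android.youtube.player.YouTubeInitializationResult;
import com.google.android.youtube.player.YouTubePlayer;
import com.google.android.youtube.player.YouTubePlayerView;

public class PlayerActivity extends YouTubeBaseActivity
        implements YouTubePlayer.OnInitializedListener {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_player);                        // placeholder layout
        YouTubePlayerView playerView =
                (YouTubePlayerView) findViewById(R.id.youtube_view);     // placeholder view id
        playerView.initialize("YOUR_API_KEY", this);                     // placeholder developer key
    }

    @Override
    public void onInitializationSuccess(YouTubePlayer.Provider provider,
                                        YouTubePlayer player, boolean wasRestored) {
        if (!wasRestored) {
            // No quality is selected anywhere; the player is left to manage it.
            player.loadVideo("VIDEO_ID");                                // placeholder video ID
        }
    }

    @Override
    public void onInitializationFailure(YouTubePlayer.Provider provider,
                                        YouTubeInitializationResult error) {
        // Handle initialization errors (e.g. the YouTube app is missing or out of date).
    }
}
```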
Thanks

Related

Adding features to Twilio Video SDKs

We want to use the Twilio Video SDKs in our Web, Android, and iOS applications. Twilio provides everything for conferencing out of the box, but we need to add some features of our own on top of the Twilio SDK in order to use them in our application.
Feature needed: we want a minimize button in the video conference room that shrinks the video call screen so the user can use the base application concurrently (similar to a WhatsApp video call). A maximize button will then be shown while the call is minimized so the user can switch back to the video call.
Our basic requirements are:
Audio and Video conferencing
Screen sharing
Recording of meetings
Mute options control for audio/video
Participants limit:
Minimum: 3, Maximum: 50
Duration limit:
Minimum: 30 minutes, Maximum: 240 minutes
Requirements specific to the Web application (in React):
Conferencing control resides within the application; the existing Web app will be the base interface for video conferencing.
Any participant can mute the audio/video of any other participant at will.
Requirements specific to the mobile application (in Flutter):
Flexibility for the user to switch between the video call and our application (identical to how a WhatsApp video call works): the video call screen is minimized and the user can use the application normally while still being present in the conference.
How should I go about this? Any help?
For the Flutter solution we are building an open-source plugin hosted on pub.dev. Screen sharing has not been added yet, but it will be eventually. The API docs can be found here.
Regarding the WhatsApp-style switching, you need to look into Picture-in-Picture (PiP) mode, which is also part of this plugin's development roadmap; see the sketch below. You can find the milestone here and the related issues here.
You mention React for the Web. There are already React plugins for the web, but Web support will also be implemented in the Flutter solution.
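On Android, the platform feature behind that minimize/maximize behaviour is picture-in-picture mode (Android 8.0 / API 26 and up). A rough sketch of entering it from the native call activity, independent of the Flutter plugin, might look like this (the helper and activity names are just illustrative):

```java
import android.app.Activity;
import android.app.PictureInPictureParams;
import android.os.Build;
import android.util.Rational;

public final class PipHelper {

    // Shrinks the given call activity into a floating picture-in-picture window,
    // so the user can keep using the rest of the app while the call continues.
    public static void minimizeCall(Activity callActivity) {
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
            PictureInPictureParams params = new PictureInPictureParams.Builder()
                    .setAspectRatio(new Rational(16, 9)) // typical video aspect ratio
                    .build();
            callActivity.enterPictureInPictureMode(params);
        }
    }
}
```

The activity also needs android:supportsPictureInPicture="true" in the manifest, and expanding the PiP window through its built-in controls brings the activity back to full screen, which covers the "maximize" part of the requirement.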

iOS requirement for live streaming

I'm working on a live streaming app like Periscope and am researching the requirements and restrictions on iOS.
I found out that Apple requires HLS (HTTP Live Streaming) under certain conditions. I found the conditions below on Apple's site:
If your app delivers video over cellular networks, and the video exceeds either 10 minutes duration or 5 MB of data in a five minute period, you are required to use HTTP Live Streaming. (https://developer.apple.com/library/content/documentation/NetworkingInternet/Conceptual/StreamingMediaGuide/UsingHTTPLiveStreaming/UsingHTTPLiveStreaming.html#//apple_ref/doc/uid/TP40008332-CH102-SW5)
But I'm not sure whether HLS must be used for both publishing and watching video, or whether using it only for watching is acceptable, because I'm thinking of using RTMP for publishing and HLS for watching.
I wrote an app similar to Periscope that is on the App Store now; it can use 2 Mbps and connects via the RTMP protocol to send the data, so my guess is they no longer enforce that rule. I also believe that when it was written, cellular network load was possibly too high and they hoped HLS would help with that. Now, with 4G LTE, the networks handle the load a little better. Again, that is just a guess. My app went up with no problem and no mention of the rule, and the review team was well aware of what the app did.

YouTube integration on a set-top box and questions

We are trying to integrate the YouTube app on an STB that uses RDK middleware (capable of running HTML5/JavaScript applications). I have been through the "YouTube TV HTML5 Technical Requirements 2016" document and have some questions.
1) As I understand it, YouTube is an open-source app and integration work will be required; is that correct? Will any customization be required? For example, search functionality differs between device types. When the YouTube app runs in a browser on a PC, a text box is available where you type what you want to search for and then press the search icon next to it to start the search. However, on devices like smart TVs and set-top boxes, where the user has no pointing device or keyboard, a soft keyboard usually has to be shown on screen and the search starts automatically after a certain number of characters have been entered. I want to know whether this functionality is customized by the app integrator or whether there are different code bases for different device types.
I have a similar question about the settings menu. For example, to support the DIAL 2.0 protocol for remotely launching the YouTube application from another device, you need a settings menu that lets you pair/unpair the device. So the settings menu also seems to differ between device types.
2) Similarly, there are differences in how the user can fast-forward/rewind during playback. In a PC browser I have seen that the user can seek to any position in a stream with the mouse, whereas on smart TVs there are rewind/forward buttons that seek -/+ 10 seconds. I have not seen trick modes in any implementation. Are trick modes required, and how are they performed? If they are required, are they done via seeking or via some sort of I-frame index file to allow smooth trick modes? Again, doesn't that part come from the app itself?
3) I'm trying to find out whether YouTube supports any or all of the adaptive-bit-rate protocols MPEG-DASH, Apple HLS, and Microsoft Smooth Streaming, but I'm not having much luck. I tried capturing packets with Wireshark, launched the YouTube application, and played back a video, but I was unable to see any HTTP calls hinting that the app uses any of the above ABR formats (perhaps all of the communication was over TLS and therefore encrypted, so I couldn't see what was going on). Even with the YouTube app running in a browser on a PC, when I play back a video the Settings -> Quality setting stays at "Auto, 480p" for the whole playback, and if I change the quality to some value, e.g. 720p, it stays there for the rest of the playback. This suggests to me that it is not using any of the ABR formats, so I guess these ABR formats are probably for future use?
4) In the YouTube specifications I can see that the target device must implement at least the com.youtube.playready and com.widevine.alpha (for 4K content) DRM systems. I tried to find out whether YouTube has any content available in these formats but was unable to find any. Can you please confirm?
I would appreciate it if someone could answer these questions or point me in the right direction.
Best Regards,
Farhan

Stream live video from iPad? [duplicate]

I'd like to stream video from the camera on an iOS device to a receiver via Wi-Fi, in effect turning the device into a wireless webcam. Is there a way to build a small app that captures video input on an iOS device and sends it via an RTSP stream or similar?
As this is an ad hoc experiment, I'm not concerned about App Store guidelines and can jailbreak if necessary.
If I interpret your question correctly, you more or less need to solve four problems:
Get the camera feed.
Convert/encode this to the right format.
Stream the data.
Prevent the phone from locking itself and going into deep sleep.
The first one is fairly simple, and Apple has, as always, provided good documentation and examples -> API link. Make sure you check out the example at the end, as you will get a CMSampleBufferRef data object back.
For the second and third parts, you should check out the CFNetwork framework, and especially CFFTPStream for streaming over FTP.
If you are only building this for yourself, you can always turn off the Auto-Lock feature in Settings. If, on the other hand, you would like to distribute this to other users, you could use the trick of playing a muted sound every 10 seconds; this is more or less how all the alarm-clock apps in the App Store work. Here's a tutorial. =)
I hope I helped a little bit at least.
Good luck and best regards!
I'm 70% of the way to doing the same thing. Here's how I did it:
Capture content from video input
Chop the video into files for use in HTTP Live Streaming.
Spin up a web server on the iPhone and make the video files available.
Connect to the IP address of the phone and voilà! You've got live streaming video.
The last time I touched the code I was trying to debug why my live streaming wasn't working. I'll try to get my source code posted on GitHub this weekend if you'd like to take a look.

How do I download a YouTube video in iOS

I know how to launch a YouTube video in an iOS web view, but how can I download that video and save it in my iPad app? Is that possible? Do I need to make some call with an NSURLConnection object and then save the received data in the proper file format?
One specific point I am unsure about is what URL to request the file from. When viewing a video on YouTube, downloadable videos are triggered with a button, so it's not obvious to me what the URL would be.
You would need to download an iOS-compatible version of the video, possibly using the same technique employed by the Safari FlashToHTML5 plugin.
YouTube DOES offer video download links:
http://lifehacker.com/#!5152236/youtube-offers-official-downloads-and-purchases-for-videos
.. make sure you do not violate their TOS.
In general, when downloading large files, you might consider implementing resumable file downloads, especially because connectivity on a mobile device is transient.
On top of that, make sure you're on a Wi-Fi network when downloading (or resuming a download), using Reachability. Note that excessive bandwidth usage over cellular networks is a reason for app rejection.
