I'm developing an iOS app that uses HLS. In our video platform we use the Nginx proxy_pass module to redirect from one origin to the other when one of them becomes unavailable (we have two origins in total). Switching between origins is transparent to the client: it is handled by the balancer, and the playlist URI stays unchanged for the client.
We ran into a problem with AVPlayer under this switching workflow, which also appears in QuickTime.
According to the network log, the following happens:
At the moment of switching, AVPlayer requests the live playlist again, and when it finishes playing the chunks loaded before the switch, it starts playing not the next chunk in the playlist but the first one!
Yet the AVPlayer currentTime property does not jump: it keeps advancing as if no switch to the first chunk had happened (a normal seek to the start of the playlist would reset currentTime to 0). No player item status changes occur, no notifications are posted, there is nothing unusual in the access log, and the error log is completely empty.
So we cannot update the user interface (i.e. the seek bar) or alert the user that he was redirected to another point in the live stream. The problem is even worse given that we must not show the user the live stream outside the bounds of a particular broadcast.
Any suggestions how to solve this? Or is it a core AVPlayer bug (given that the issue also appears in QuickTime)?
The solution was:
1) use different playlist names on the two origins,
2) return 404 (or another error) to the client when it tries to refresh a playlist from a disabled origin,
3) use a fallback in the playlists: add a second, alternative playlist from the second origin to the multi-bitrate (master) playlist. This is described in the HLS documentation; when AVPlayer receives an error while refreshing a playlist, it retries from the fallback playlist,
4) to provide manual quality selection, we also had to wrap the playlists for particular qualities in intermediate variant playlists, each containing the primary playlist and its fallback. The FMS we use for generating playlists from the live stream cannot do that, so we generate the variant playlists on the Nginx side.
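For illustration, step 3 corresponds to HLS "redundant streams": the master playlist lists the same variant twice, once per origin, and the client falls over to the next entry when the current one errors out. A hypothetical master playlist (hostnames and bitrates are placeholders):

```
#EXTM3U
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=1280000
http://origin-master.example.com/stream_720p.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=1280000
http://origin-slave.example.com/stream_720p.m3u8
```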
The result is a fault-tolerant video platform with transparent switching between the master and slave (second) origins, working with both automatic and manual quality selection.
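Step 2 (failing playlist refreshes from a disabled origin) can be sketched on the Nginx side like this; the flag file, location, and upstream name are hypothetical:

```
# Refuse playlist refreshes while this origin is administratively disabled,
# so the client falls over to the fallback entry in the master playlist.
location /hls/ {
    if (-f /etc/nginx/origin_disabled) {
        return 404;
    }
    proxy_pass http://origin_upstream;
}
```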
I have a React application hosted on Heroku that relies heavily on playing audio with an audio tag. There are two different "pages" (I'm using conditional rendering rather than React Router, so everything is all one page) that essentially do the same thing:
Bring the user to a screen with album links
Update the src attribute of a hidden audio tag to point to a Google Drive-hosted mp3 file when a given album link's play button is clicked, then use the autoPlay attribute to play the resource.
To ensure fair use (this is a hobby project), copyrighted audios have been trimmed to 25 seconds before hosting. These are pulled into the "music player" section of the app (accessible through the nav bar), one at a time as needed. I have other audio files of my own recordings that are not trimmed and run about 3.5 to 5 minutes, which appear in the "covers" section of the app.
All of the album links update the audio src and play correctly on Android and desktop devices. On iOS, regardless of browser, the "covers" audios will not play, but the "music player" audios will. I have spent hours trying to find a workaround for this, because it doesn't seem to have to do with browser auto-play policies as much as with some policy on iOS itself? The site is rendered in such a way that the user must interact with it before playing audio, so it satisfies autoPlay policies. Has anyone run across a similar issue?
You can find the site here:
thekillersmusic.herokuapp.com
This issue has been resolved. The problem for me was that the audio files had been converted from wav to mp3 by changing the extension on my local device before hosting. This caused Range headers to be included on the HTTP requests made for the resources, and partial-content responses to be sent back. For whatever reason, iOS devices had trouble resolving the partial-content responses, while Android and desktop systems did not. After re-downloading the files as actual mp3s rather than renamed wavs and hosting the new files, the requests were answered with 200 status codes rather than 206, and the audio worked across all platforms.
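For reference, the mechanics behind those status codes (a sketch; the helper names are mine, not from the post): a client sends a Range header such as bytes=0-1023, and a server that honors it answers 206 Partial Content, while one that ignores it answers 200 with the whole file.

```javascript
// Parse a single-range "Range: bytes=start-end" header value.
function parseByteRange(headerValue) {
  const m = /^bytes=(\d+)-(\d*)$/.exec(headerValue);
  if (!m) return null; // malformed or multi-range: not handled here
  return { start: Number(m[1]), end: m[2] === '' ? null : Number(m[2]) };
}

// 206 means the server returned only the requested byte range;
// 200 means it ignored the Range header and sent the full resource.
function isPartialContent(statusCode) {
  return statusCode === 206;
}
```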
I have been trying to track this down and there doesn't seem to be a consistent answer. If I have a website that tries to play multiple songs in a row (think playlist) using the HTML 5 audio element, can it continue to work on the iOS lock screen?
For some background, this answer seems to indicate it may be possible. But then this article suggests it is not.
I tried following the closest example of what Apple recommends, as found here, to replicate this. I am using plain, vanilla javascript and HTML, nothing fancy. I copied their example exactly and just substituted the audio tag for the video one, picking two random mp3 songs. All it does is wait for one song to end, then switch the src, load, and play the next track.
When I hit Play on the website, I then lock the iPhone. The first song will play, then stop. It does not continue to the next song.
If the website is open to the page, it will properly transition to the next song. On Android, it will continue to the next song even if the phone is locked.
I tried this with iOS 11 and 12. Neither worked. I have read many differing answers about how javascript is stopped when the website isn't in the foreground, and how iOS requires user interaction to play audio (even going from one song to the next). So it doesn't seem like this would work. But then other answers out there seem to indicate this is possible.
Is there a definitive yes or no? Am I doing something wrong here? Or is it just not possible?
There are multiple issues, which is causing some of the confusion.
First track plays, second does not due to user permission
If the first track was started with user permission, then the only way you can switch to a new track via script is with Apple's recommendation: handle the ended event, swap in a new src, and call .play() before the callback completes. Otherwise, your code will not have permission to start the new audio.
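A minimal sketch of that pattern (the function and track-list names are mine, not from Apple's docs):

```javascript
// Advance a playlist inside the "ended" handler. The src swap and the
// .play() call both happen synchronously before the handler returns,
// so the new track is still covered by the original user gesture.
function wirePlaylist(audio, tracks) {
  let index = 0;
  audio.addEventListener('ended', () => {
    index += 1;
    if (index >= tracks.length) return; // playlist finished
    audio.src = tracks[index];          // swap in the next source
    audio.load();
    audio.play();                       // must run before the callback completes
  });
  return () => index;                   // current-track getter, handy for UI
}
```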
Cannot get to ended event due to time-constrained source (#t=5,10)
Say you have a 30-second audio file and you tell the browser to only load seconds 5 through 10, via something like #t=5,10 at the end of the URL. When you reach 10, a pause event will fire, not ended. Therefore, it isn't possible to go on to the next track normally, as this event is not "blessed" by Apple to count as a relay of the previous user interaction.
I got around this by having my JavaScript check the currentTime repeatedly, and when it crossed a threshold, seek it to the duration (end) of the file myself. This allows ended to fire normally.
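Here's a sketch of that workaround (the function name is mine):

```javascript
// iOS fires "pause", not "ended", at the end of a media fragment
// (#t=5,10). Watch playback progress and, once past the fragment end,
// seek to the file's duration so a genuine "ended" event fires.
function forceEndedAtFragmentEnd(audio, fragmentEnd) {
  audio.addEventListener('timeupdate', () => {
    if (audio.currentTime >= fragmentEnd && audio.currentTime < audio.duration) {
      audio.currentTime = audio.duration; // jumping to the end fires "ended"
    }
  });
}
```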
iOS doesn't allow scripts to run after the screen is locked or other apps are in use
This one is a real problem to debug, since it doesn't turn up in any of the simulators. In any case, you're right, your JavaScript is going to get suspended so even if you follow Apple's recommendations, you're out of luck. Or, are you?
A hack: set up a ScriptProcessorNode on a Web Audio context. Set the buffer size to something like 512 samples. Then, in the script-processing callback, force a timeupdate event on your audio element. This keeps your JavaScript ticking away; ScriptProcessorNode evidently has special privileges here, by its nature.
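A sketch of that hack (the function name is mine; the constructor is passed in so the helper is self-contained):

```javascript
// Keep JavaScript alive by hanging a ScriptProcessorNode off a Web Audio
// context. Its audio callback keeps firing even when timers are throttled,
// and from it we dispatch a synthetic "timeupdate" so the playlist code
// keeps running.
function keepScriptAlive(audioElement, AudioContextCtor) {
  const ctx = new AudioContextCtor();
  const node = ctx.createScriptProcessor(512, 1, 1); // small buffer: frequent callbacks
  node.onaudioprocess = () => {
    audioElement.dispatchEvent(new Event('timeupdate'));
  };
  node.connect(ctx.destination); // must be connected, or the callback never runs
  return node;                   // keep a reference so it isn't garbage-collected
}
```

In a browser you would call it as keepScriptAlive(audio, window.AudioContext || window.webkitAudioContext).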
Apple: If you read this, please fix your browser. It would be great if we didn't have to hack around things to make something as simple as an audio playlist work. Other browsers don't have nearly this level of problems.
The Soundcloud widget works fine playing multiple tracks in succession. If I then hide that frame, play a youtube video in a youtube iframe, and then switch back to a new track in the Soundcloud widget, it loads but will not play (ignoring the autoplay setting and any widget.play() calls). I had this working on Chromecast with the developer preview SDK and the 1.0 cast receiver but now with the 2.0.0 receiver it's broken. Any ideas how to proceed?
Currently there is no supported mechanism in the SDK to play YouTube videos outside of the YouTube app. Note that in general, applications may not allow other senders to launch or control their receiver side; for example, Hulu+ may not like it if you want to write your own app to launch and control their application on your Chromecast. If they decide to allow such a model, they need to publish the steps (for example, they can publish their App ID and additional custom data that links deep into their application). YouTube is no different in that respect.
OK, got this working, so hopefully this is useful to others. Assuming only one player is active and visible at a time, the trick is to destroy the prior widgets rather than try to reuse them. For YouTube this does not mean reloading the iframe_api but simply calling YTPlayer.destroy() and new YT.Player() next time around. For SoundCloud, keep a handle to the iframe, call iframe.parentNode.removeChild(iframe) to destroy it, and create it again next time.
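A sketch of that teardown/recreate cycle (the YT namespace is passed in so the helpers are self-contained; function names are mine):

```javascript
// Destroy the previous YouTube player and build a fresh one; the
// iframe_api script itself does not need reloading.
function recreateYouTubePlayer(yt, oldPlayer, containerId, videoId) {
  if (oldPlayer) oldPlayer.destroy();             // tear down the stale widget
  return new yt.Player(containerId, { videoId }); // fresh player next time around
}

// Remove the SoundCloud widget's iframe entirely; re-insert a new iframe
// (and re-wrap it with the widget API) the next time it is needed.
function destroySoundCloudWidget(iframe) {
  iframe.parentNode.removeChild(iframe);
}
```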
I'm trying to learn more about YouTube's TOS. More specifically:
II. Prohibitions
8: separate, isolate, or modify the audio or video components of any YouTube audiovisual content made available through the YouTube API;
I'm working inside of a Google Chrome Extension which consists of a persistent background page and a foreground pop-up page. I would like to display audiovisual content in the foreground to users. This is fine and works; however, upon closing the foreground, the audiovisual content ceases because the page has been destroyed.
As such, I would like to sync two YouTube players such that the one in the background is unmuted and the one in the foreground is muted, with its visual content synced to that of the background. Would this violate YouTube's TOS? I'm hoping the answer is no: it seems akin to having a tab open. Sometimes the visual content can be seen (at the user's discretion), but the audio content would be uninterrupted.
Thanks
If I interpret that correctly:
"to separate or isolate" means to cut off the video or the audio part (or even different channels of it, if any) of the returned/streamed media
"to modify" means that you transform the data in some way and display it to the user instead of the original data (i.e. you are prohibited from making a video streaming application that displays every movie in black and white).
So, unfortunately, I think your requirement does indeed violate the TOS.
In short:
There's no way for you to make an app that allows you to listen to a YouTube song with your display off...
(I believe they want you to see the ads, or they want to prove to those who pay for the ads that people see them on all YouTube videos.)
I have a content management server application written in Java. A background process goes through a list of video IDs and fetches the details for those videos using the YouTube API.
I would like to check if a particular video entry is available for mobile or not.
I checked whether syndication is allowed, like this:
String videoEntryUrl = "http://gdata.youtube.com/feeds/api/videos/" + videoID;
VideoEntry videoEntry = service.getEntry(new URL(videoEntryUrl), VideoEntry.class);
if (!videoEntry.getXmlBlob().getBlob().contains("yt:accessControl permission='denied' action='syndicate'")) {
    System.out.println("The video is syndicatable");
}
Checking for syndication still did not solve the problem; the server still lets in videos that cannot play on an Android phone.
What is the right way to filter only the videos that can be played on mobile?
There's no single check to see whether a video is playable "on mobile".
There are a variety of different reasons why a particular video might not be playable on a particular platform, and unfortunately the only way to be absolutely sure whether a particular video will play in a particular player is to attempt to play it.
That being said, this blog post goes into more detail about the types of common playback restrictions that crop up: http://apiblog.youtube.com/2011/12/understanding-playback-restrictions.html