Greetings to all fellow front-end coders.
I have quite a headache because of Apple's autoplay policy. As you've probably experienced, you can't autoplay a video/audio element with sound without user interaction. (I'm referring to this.)
I'm currently building my own music web application that will support playlists. I work with the Next.js framework and use react-player as the player. Since react-player does not natively support playlists, I created a two-player pendulum system, where one player plays the current track while the other preloads the next; then they swap roles.
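For reference, here is a simplified sketch of the pendulum idea (the component name, the playlist prop, and the state handling are illustrative, not my exact code):

import { useState } from "react";
import ReactPlayer from "react-player";

function PendulumPlayer({ playlist }) {
  const [active, setActive] = useState(0); // which of the two slots is audible
  const [track, setTrack] = useState(0);   // index of the current song

  const next = (track + 1) % playlist.length;

  const handleEnded = () => {
    setActive((a) => 1 - a); // swap roles: the preloading player takes over
    setTrack(next);
  };

  return (
    <>
      <ReactPlayer
        url={active === 0 ? playlist[track] : playlist[next]}
        playing={active === 0}
        onEnded={handleEnded}
      />
      <ReactPlayer
        url={active === 1 ? playlist[track] : playlist[next]}
        playing={active === 1}
        onEnded={handleEnded}
      />
    </>
  );
}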
Everything works great in all browsers, except those on iOS, because this player uses autoplay to start playing.
I've been stuck on this for several days. I've already considered making Apple users press the play button every time a song changes, as a punishment, but of course that's not a good solution.
I think a lot of you have encountered this and you have certainly found a solution. Please help me :(
Related
I have been trying to track this down and there doesn't seem to be a consistent answer. If I have a website that tries to play multiple songs in a row (think playlist) using the HTML 5 audio element, can it continue to work on the iOS lock screen?
For some background, this answer seems to indicate it may be possible. But then this article suggests it is not.
I tried following the closest example of what Apple recommends, as found here, to replicate this. I am using plain, vanilla JavaScript and HTML, nothing fancy. I copied their example exactly and just substituted an audio tag for the video one, picking two random MP3 songs. All it does is wait for one song to end, then switch the src, load, and play the next track.
When I hit Play on the website, I then lock the iPhone. The first song will play, then stop. It does not continue to the next song.
If the website is open to the page, it will properly transition to the next song. On Android, it will continue to the next song even if the phone is locked.
I tried this with iOS 11 and 12. Neither worked. I have read many differing answers about how JavaScript is stopped when the website isn't in the foreground, and how iOS requires user interaction to play audio (even when going from one song to the next). So it doesn't seem like this would work. But then other answers out there seem to indicate it is possible.
Is there a definitive yes or no? Am I doing something wrong here? Or is it just not possible?
There are multiple issues, which is causing some of the confusion.
First track plays, second does not due to user permission
If the first track was started with user permission, then the only way you can switch to a new track via script is Apple's recommended pattern: handle the ended event, swap in a new src, and call .play() before the callback completes. Otherwise, your code will not have permission to start the new audio.
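In code, that pattern looks roughly like this (a minimal sketch; the playlist array and the player element are assumed):

var audio = document.getElementById('player'); // your existing <audio> element
var index = 0;

audio.addEventListener('ended', function () {
  index += 1;
  if (index >= playlist.length) return;
  audio.src = playlist[index];
  audio.load();
  // Must be called before this handler returns; deferring it
  // (setTimeout, a promise callback) loses the playback permission.
  audio.play();
});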
Cannot get to ended event due to time-constrained source (#t=5,10)
Say you have a 30-second audio file and you tell the browser to load only seconds 5 through 10, via a media fragment like #t=5,10 at the end of the URL. When playback reaches 10, a pause event fires, not ended. Therefore, it isn't possible to move on to the next track normally, as this event is not "blessed" by Apple to count as a relay of the previous user interaction.
I got around this by having my JavaScript check the currentTime repeatedly, and when it crossed a threshold, seek it to the duration (end) of the file myself. This allows ended to fire normally.
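Roughly like this (a sketch; the 10-second boundary is hard-coded to match the #t=5,10 example, and audio is the same element as above):

var FRAGMENT_END = 10; // seconds, matching #t=5,10

audio.addEventListener('timeupdate', function () {
  // Just before the fragment boundary, jump to the real end of the file
  // so a genuine 'ended' event fires instead of 'pause'.
  if (audio.currentTime >= FRAGMENT_END - 0.05) {
    audio.currentTime = audio.duration;
  }
});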
iOS doesn't allow scripts to run after the screen is locked or other apps are in use
This one is a real problem to debug, since it doesn't turn up in any of the simulators. In any case, you're right, your JavaScript is going to get suspended so even if you follow Apple's recommendations, you're out of luck. Or, are you?
A hack: set up a ScriptProcessorNode on a Web Audio context, with a small buffer size, say 512 samples. Then, in its audio-processing callback, force a timeupdate event for your audio element. This keeps your JavaScript ticking away; by its nature, ScriptProcessorNode gets special privileges here.
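Sketched out, with the same audio element assumed (ScriptProcessorNode is deprecated, but it still works where this hack matters):

var ctx = new (window.AudioContext || window.webkitAudioContext)();
var processor = ctx.createScriptProcessor(512, 1, 1);

processor.onaudioprocess = function () {
  // This callback keeps firing even when the screen is locked,
  // so use it to synthesize the event the player logic relies on.
  audio.dispatchEvent(new Event('timeupdate'));
};

processor.connect(ctx.destination); // must be connected to run; it outputs silence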
Apple: If you read this, please fix your browser. It would be great if we didn't have to hack around things to make something as simple as an audio playlist work. Other browsers don't have nearly this level of problems.
How can I access another app's currently playing audio? Ideally the actual audio item, but metadata is welcome too. I can see that this question has been asked a lot, but with few solutions offered over the years. I understand Apple's philosophy in probably not wanting an app to be able to do this. I also understand that such a request is probably outside the iOS API. With that being said, I would really like some kind of solution.
Logically, I feel that
MPNowPlayingInfoCenter.defaultCenter().nowPlayingInfo
should return the info for whatever is currently playing; however, as others have mentioned, this value is always nil for audio played outside your own app. Notably, many popular audio apps don't seem to use the MPNowPlayingInfoCenter class at all, so their audio never appears there.
If using the default music app, one can use
MPMusicPlayerController.systemMusicPlayer().nowPlayingItem
However, what is a more consistent way to access audio playing through the Podcasts app, Spotify, Pandora, Safari, etc.?
Has anyone found a solution to this? Are there any old Objective-C frameworks that support this functionality?
One approach might be viable if there is some way to access the audio path of the item currently being played. For example, if I could get the path of the currently playing item, I could create an AV object from it:
AVAudioPlayer(contentsOfURL: audioUrl)
So is there a way I can get the audio url of the currently playing item and use it that way?
Is another approach better?
If a native solution does not exist, is it possible to bodge something together for it to work? Any advice or ideas welcome.
Edit: I don't believe anybody has been able to achieve this; however, I think a lot of people would like to. If this has been addressed, please link it! :)
This isn't currently possible in iOS. Just changing your AVAudioSession category options to .MixWithOthers, which would be a prerequisite for reading song info from other apps, causes your own nowPlayingInfo to be ignored.
iOS only considers non-mixing apps for inclusion in MPNowPlayingInfoCenter, because there is uncertainty as to which app would show up in (e.g.) Control Center if there are multiple mixing apps playing at the same time.
Proposal: an option would be to use a music fingerprinting algorithm to recognize what is being played, by recording it from your app.
Some interesting projects in this direction:
Gracenote (https://developer.gracenote.com/): Gracenote (which is owned by Sony) has opened up its SDKs and APIs and has a proper dev portal.
EchoNest & Spotify API (http://developer.echonest.com/): merged with the Spotify API in March 2016.
ACRCloud (https://www.acrcloud.com/): offers ACR solutions for custom files such as TV commercials, music, etc.
I'm researching how to start developing an audio streaming app. The first thing I'd like to ask is which framework to use. I've tested DOUAudioStreamer, tumtumtum/StreamingKit, RadioKit, etc.
The framework needs to be able to seek within a song to a given point with low latency. Spotify has that amazing seek feature: take a song you've never listened to before and play it, then drag the time slider to the middle, and the song continues playing with low latency, as if the whole song had been downloaded beforehand.
I guess you could do some predictive analysis as to what songs the user will listen to next, i.e. start caching songs before they actually start to play. That will at least improve the experience somewhat.
I'm building an app in which I need to stream multiple tracks of audio that make up a song. They all need to be synchronized so the song plays back naturally.
I've been able to play back local multitrack audio very well with the solution on this thread: multi track mp3 playback for iOS application, but it looks like AVAudioPlayer isn't able to stream.
I've been looking into DOUAudioStreamer because I've read that it's the best solution for streaming audio on iOS without going pretty low-level, but it seems to lack an equivalent of the -playAtTime: method, which is how the tracks were synced up with AVAudioPlayer.
Does anyone know a workaround for this using DOUAudioStreamer, or have any advice on another way I should approach this? Thanks.
I have built an app designed to play each sound as soon as the previous one finishes, using the 'ended' event.
In my initial version, each sound has its own audio element, resulting in something like:
var i = 0; // index into the sounds array, defined elsewhere

function play_next_audio() {
    // Each sound has its own <audio> element, looked up by id.
    var speaker = $('audio#' + sounds[i++]).get(0);
    speaker.addEventListener('ended', play_next_audio);
    speaker.play();
}
This works great on all desktop browsers, including Safari, but does not go beyond the very first letter on iOS.
I have also tried a different approach: a single audio element that loads each sound in turn. There I experimented with binding the 'ended' event, as well as loading first and binding the 'canplaythrough' event instead. In both cases it fails to work even in desktop Safari, this time successfully playing the first two letters.
Here is the isolated test:
http://dev.connectfu.com:42001/app/test-sounds.html
Note that audio.load() is commented out in several places; having it in or out seems to make no difference.
What am I doing wrong? How can I play series of sounds on iOS? Thank you so much for the help!
Update: as of 2017, the ended event doesn't fire in Safari (or Chrome) on iOS under several conditions. More information can be found here: It's almost 2017, and HTML5 audio is still broken on iOS.
I've built an HTML5 audio player (Chavah Messianic Radio) that "works" on Safari on iOS.
By "works", I mean, it plays the best it can on Apple's crippled iOS <audio> implementation. At the time of this writing, this includes iOS 5. I've tested this on iPhone 3 and up, and iPad original, iPad 2, and iPad 3.
Here are my findings:
Apple does not allow you to call .play() on any audio until user interaction. For me, this means detecting iOS, then showing the music as paused until the user clicks play. Their reasoning is that autoplay would consume data and battery; in practice, though, it just cripples web apps and stifles the evolution of the web.
If you want to play successive sounds (one after another), use a single audio element. When it's time to play the next sound, set existingAudio.src, then call existingAudio.load(), then call existingAudio.play(); see the sketch after these findings.
Audio events don't fire if Safari is in the background. While audio will continue playing if the user switches to a different app, the .ended event won't fire. This means it's practically impossible to build a music player app.
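For the first two findings, the skeleton looks something like this (playButton, the element id, and the track list are placeholders):

var isIOS = /iPad|iPhone|iPod/.test(navigator.userAgent); // crude, but enough here
var existingAudio = document.getElementById('player');
var tracks = ['first.mp3', 'second.mp3'];
var n = 0;

existingAudio.src = tracks[0];
if (!isIOS) {
  existingAudio.play(); // other platforms can start right away
}
// On iOS, show the player paused; the first .play() comes from a tap.
playButton.addEventListener('click', function () {
  existingAudio.play();
});

existingAudio.addEventListener('ended', function () {
  n += 1;
  if (n >= tracks.length) return;
  // The single-element dance: set src, then load(), then play().
  existingAudio.src = tracks[n];
  existingAudio.load();
  existingAudio.play();
});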
Hope this helps.
<rant>
In the meantime: Apple, please, please, please give us better support for HTML5 in iOS Safari. Here are your action items, Apple:
Let HTML5 audio work in the background.
Support OGG.
Support pre-loading audio.
Support concurrent audio.
Let us play audio without user interaction.
Do those things, Apple, and you'll be the industry leader in mobile HTML5 audio. Everyone will emulate you, you'll once again be leading the way, and web apps will work perfectly on your platforms while staying crippled on other mobile platforms. Yes, these features will use data and battery, but native apps already do this. Stop crippling web apps and be the leader. Make HTML5 <audio> a first-class citizen on iOS.
</rant>
I don't believe the .play() method is supported for audio or video in iOS. Apple does not like the idea of videos or audio automatically playing upon visiting a page.
Here is a helpful reading about the state of HTML5 audio support across platforms: http://www.phoboslab.org/log/2011/03/the-state-of-html5-audio