How to play background audio in iOS with Trigger.io?

I'm attempting to build a streaming audio application with Trigger.io's framework.
Since its API currently does not feature any audio-related methods, I tested basic HTML5 audio tags instead. Streaming MP3 and AAC worked in the foreground, but playback stopped when the application was backgrounded, even on iOS 5, which normally allows background audio in web apps.
Is there any way to play background audio in a Trigger.io application?

Update, April 2013: We added a native audio playback API on our v1.4.39 platform: http://docs.trigger.io/en/v1.4/modules/media.html#createaudioplayer
Using this API, audio continues to play when the user switches away from the app.
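For reference, a minimal sketch of how that module can be driven from JavaScript. The stream URL is made up, and the exact signatures of forge.file.cacheURL and forge.media.createAudioPlayer should be verified against the linked docs; the general shape is to obtain a file object, create a native player from it, then call play():

    // Hedged sketch -- check the linked v1.4 media docs for exact signatures.
    // Cache a remote MP3 (URL is hypothetical), then hand the resulting file
    // object to the native audio player, which keeps playing when backgrounded.
    forge.file.cacheURL("http://example.com/stream.mp3", function (file) {
        forge.media.createAudioPlayer(file, function (player) {
            player.play(function () {
                forge.logging.log("playback started");
            }, function (err) {
                forge.logging.log("play failed: " + JSON.stringify(err));
            });
        }, function (err) {
            forge.logging.log("could not create player: " + JSON.stringify(err));
        });
    }, function (err) {
        forge.logging.log("could not cache URL: " + JSON.stringify(err));
    });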
Original answer:
Trigger doesn't yet support background audio playback.
The device manufacturers (rightly!) put some restrictions on what apps can do as a background service: we've not yet linked starting and controlling playback from your JavaScript with the native (non-JavaScript) code that would actually do the playing.
It's on our roadmap, however: currently scheduled for completion in May 2012.

Related

iOS screen recording - enable Safari audio?

I'm making screen recordings of an in-house web app, using iOS Safari and the built-in iOS screen recorder. No audio is recorded. If I enable microphone recording, the audio is recorded, but poorly, mixed with room sound (the microphone is likely picking up the device's speaker). The same is true of native apps with WKWebViews.
I'm curious what's happening here. Is Safari/WebKit actively blocking audio recording? Is there a JavaScript or WebKit instruction to enable audio recording?
Note that I'm not trying to get around any protection from recording commercial videos, just sounds generated from my own site/app, in order to make promotional videos. So changes made within my app or web site are fine.
Disable microphone recording and connect the device to an external audio output (e.g. headphones).

Is there a way to intercept audio output from within your app to display back an audio visualizer on iOS?

We're currently using the Linphone library to make VoIP calls, and it has its own solution for audio playback. However, we would like to display a visualizer for the audio that Linphone outputs from within our own app. Is there a way to intercept this data (perhaps through sample buffering) in order to draw audio waves/a volume meter in the user interface?
AVAudioPlayer or AVPlayer is out of the question, since we do not have access to those objects. Is there a solution using AVAudioSession or Core Audio?
Only if the audio-output app exports its audio data using Inter-App Audio or Audiobus. Otherwise the iOS security sandbox will hide that audio output from your app.

Intercept/modify audio stream on iOS

I am looking at the feasibility of capturing the raw audio stream that is currently playing and doing things with it, such as streaming it over Bluetooth or equalizing it. Is there any way to do this in iOS 8?
For example: apps such as Pandora/Spotify are playing music and I want to access the audio they are playing.
To process audio from another app, that app needs to participate in Inter-App Audio.
I don't know if your example apps do that.

How does Audiobus for iOS work?

What SDKs does Audiobus use to provide inter-app audio routing? I am not aware of any Apple SDK that could facilitate inter-app communication on iOS, and I was under the impression that apps were sandboxed from each other, so I'm really intrigued to hear how they pulled this off.
iOS allows inter-app communication via MIDI SysEx messages. Audiobus works by sending audio as MIDI SysEx messages. You can read the details from the developer himself:
http://atastypixel.com/blog/thirteen-months-of-audiobus/
My guess is that they use some sort of audio-over-network transport, because I've seen log statements when our app gets started, even on a different device.
I don't really know the details of the implementation, but this could be a way of staying within the "sandbox" constraint.
The Audiobus SDK (probably) uses the Audio Session rules to "organize" all the sound output from the apps using their SDK; as you can see in their videos (at the bottom of the page), the apps have a lateral menu to switch back and forth between apps.
The Audio Session Category states:
Allows mixing: if yes, audio from other applications (such as the iPod) can continue playing when your application plays sound.
This way Audiobus can "control" the sound and keep the session persistent across apps.

HTML5 Audio callback fails on Safari/iOS

I have built an app designed to play each sound when the previous one finishes, using the 'ended' event.
In my initial version, each sound has its own audio element, resulting in something like:
var i = 0;
function play_next_audio() {
    // Look up the next <audio> element by id, chain the following sound on 'ended', and play it
    var speaker = $('audio#' + sounds[i++]).get(0);
    speaker.addEventListener('ended', play_next_audio);
    speaker.play();
}
This works great on all desktop browsers, including Safari, but does not go beyond the very first letter on iOS.
I have also tried a different approach: a single audio element that loads each sound in turn. There I experimented with binding the 'ended' event, as well as loading first and binding the 'canplaythrough' event instead. In both cases it fails even on desktop Safari, this time successfully playing only the first two letters.
Here is the isolated test:
http://dev.connectfu.com:42001/app/test-sounds.html
Note that audio.load() is commented out in several places; having it in or out seems to make no difference.
What am I doing wrong? How can I play series of sounds on iOS? Thank you so much for the help!
Update: As of 2017, the 'ended' event doesn't fire in Safari (or Chrome) on iOS under several conditions. More information can be found here: It's almost 2017, and HTML5 audio is still broken on iOS.
I've built an HTML5 audio player (Chavah Messianic Radio) that "works" on Safari on iOS.
By "works", I mean, it plays the best it can on Apple's crippled iOS <audio> implementation. At the time of this writing, this includes iOS 5. I've tested this on iPhone 3 and up, and iPad original, iPad 2, and iPad 3.
Here are my findings:
Apple does not allow you to call .play() on any audio until there has been user interaction. For me, this means detecting iOS and showing the music as paused until the user clicks play. Their reasoning is that this will consume data and battery; in practice, though, it just cripples web apps and stifles the evolution of the web.
If you want to play successive sounds (one after another), use a single audio element. When it's time to play the next sound, set existingAudio.src, then call existingAudio.load(), then call existingAudio.play(). This lets you play successive sounds; see the sketch after these findings.
Audio events don't fire if Safari is in the background. While audio will continue playing if the user switches to a different app, the .ended event won't fire. This means it's practically impossible to build a music player app.
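A minimal sketch of those findings in plain JavaScript (the sound list and element ids are hypothetical): playback is kicked off from a click handler to satisfy the user-interaction rule, and a single <audio> element is reused for each successive sound by swapping src, calling load(), then play() from the 'ended' handler.

    // Hypothetical sound list and element ids, for illustration only.
    var sounds = ['a.mp3', 'b.mp3', 'c.mp3'];
    var index = 0;
    var existingAudio = document.getElementById('speaker'); // a single <audio> element

    function playNext() {
        if (index >= sounds.length) { return; }
        existingAudio.src = sounds[index++];
        existingAudio.load();  // reload after changing src (needed on iOS)
        existingAudio.play();
    }

    // Chain successive sounds from the 'ended' event of the same element.
    existingAudio.addEventListener('ended', playNext);

    // iOS only starts audio in response to a user gesture, so begin from a tap.
    document.getElementById('play-button').addEventListener('click', playNext);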
Hope this helps.
<rant>
In the meantime: Apple, please, please, please give us better support for HTML5 in iOS Safari. Here are your action items, Apple:
Let HTML5 audio work in the background.
Support OGG.
Support pre-loading audio.
Support concurrent audio.
Let us play audio without user interaction.
Do those things, Apple, and you'll be the industry leader in mobile HTML5 audio; everyone will emulate you, you'll once again be leading the way, and web apps will work perfectly on your platforms while being crippled on other mobile platforms. Yes, these features will use data and battery, but native apps already do this. Stop crippling web apps and be the leader. Make HTML5 <audio> a first-class citizen on iOS.
</rant>
I don't believe calling .play() without user interaction is supported for audio or video on iOS. Apple does not like the idea of video or audio automatically playing when a page is visited.
Here is a helpful reading about the state of HTML5 audio support across platforms: http://www.phoboslab.org/log/2011/03/the-state-of-html5-audio
