How to detect if audio can't be muted? - iOS

Is there a way to detect if audio can't be muted? On iOS devices, audio volume cannot be changed through JavaScript. I have a button in my app that mutes audio played through SoundManager2 and HTML5 video, but it doesn't work on iOS devices. I could check for iOS and handle it accordingly, but I am not sure whether any other devices behave this way, and it's better practice to use feature detection rather than device-specific changes. I tried checking the muted parameter: it does change, it just isn't reflected in the actual output, so I can't do something like this:
soundManager.mute();
if (!soundManager.muted) {
    alert("Can't Mute");
}
The mute function also just returns null, so it's not as if I can inspect its return value either.
Is there something I am missing or do I need to check for iOS specifically?

I use feature detection.
First I save the current volume, then I set it to 50% and check whether it really is set to 50%.
If yes, the device can change the volume.
Finally, I set the volume back to the original value.
var tmpvol = myAudio.volume;
myAudio.volume = 0.5;
if (myAudio.volume === 0.5) {
    // the volume is adjustable on this device
}
myAudio.volume = tmpvol;
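The same check can be packaged as a reusable helper. A minimal sketch; the function name canChangeVolume and the probe values are my own, not from the answer:

```javascript
function canChangeVolume(audio) {
    var original = audio.volume;
    // probe with a value different from the current one, so the read-back
    // comparison is meaningful even if the volume is already 0.5
    var probe = original === 0.5 ? 0.4 : 0.5;
    audio.volume = probe;
    var ok = audio.volume === probe; // iOS silently ignores the write
    audio.volume = original;         // restore the saved volume
    return ok;
}
```

On a device where volume writes are ignored, the read-back still returns the old value, so the helper returns false and the original level is left untouched.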

Related

AirPlay background streaming like Spotify / Amazon Music

Is it possible to do AirPlay audio streaming like Spotify or Amazon Music? When I set up an AirPlay stream with audio from my app, the screen (on the Apple TV) turns black and shows only the progress bar.
Is it possible to show the small hint in the top corner with all the audio information, which disappears after a few seconds and doesn't block the whole Apple TV UI?
Or is this kind of a Spotify / Amazon Music privilege?
We had this problem as well. I believe that there are some bugs here in Apple's court, but we found a decent workaround that seems to be pretty safe from side effects.
We found that setting the player's allowsExternalPlayback field to NO would sometimes correctly stream the audio without the blank video screen (along with correct display of the now-playing information, correct response to the volume rocker, etc.). However, we found that very often it would fail to play anything at all.
Doing some runtime introspection, we found that the player would always correctly buffer from the network. But when it hit the isPlaybackLikelyToKeepUp event, it would set the player's rate field to 1, indicating that it was playing, yet more often than not it would not actually play the audio. No errors were reported, so from all we could tell, the player itself thought it was indeed playing when it was not. We found this hang-up occurred only when we had set an AirPlay device for audio output.
So we found that in certain event callbacks and other key places, if we added a simple one-liner:
if (self.avPlayer.rate == 1) { self.avPlayer.rate = 1; }
It would kick loose whatever internal hold-ups were causing the player to not actually AirPlay, and the audio would stream correctly. If the player was already playing correctly, no harm done.

PhoneGap Media Plugin - iOS - Play Sounds As Sound Effects

I'm using the latest PhoneGap (3.3.0) and the Media plugin. I'm not using it in a music app or anything; I just need to play sounds for notifications (like when a new message is received). On my iPad, the volume of alert sounds is always controlled by the hardware buttons on the side, which control the Sound Effects volume, but the Media plugin's sounds are controlled by the media volume instead. It's a major problem because users with their vibrate switch on may get an unwelcome sound effect, while others will expect the sound effects to play at the Sound Effects volume level and be surprised when they see a message they didn't hear come in. Is there a way, besides push notifications, for my app to play sounds that actually behave as sound effects?
I set playAudioWhenScreenIsLocked to false, hoping that it would make the sound play as an actual Sound Effect, but apparently it just makes the sound obey the hardware mute switch.
var snd = new Media("sounds/xylo.wav");
snd.play({ playAudioWhenScreenIsLocked : false });
Thanks

Detect mute event and status on iPhone and other iOS devices in PhoneGap app?

There seem to be a few threads floating around this topic but no definitive answer: if a user loads the app with sound enabled but later mutes his/her iPhone, how can we detect this in PhoneGap? Is there a callback for this event? The docs don't seem to list anything.
A second, related question: how to detect the status of the mute button? If someone has mute enabled, how do you detect this to avoid playing audio? The media.play() method only seems to have an option concerning whether to play audio when the screen is locked.
Thanks!
I wanted our app not to play sounds when the iPhone is muted.
After hours of searching, I decided to try the following parameter, and it works as expected:
myMedia.play({ playAudioWhenScreenIsLocked : false });
The documentation doesn't say that this parameter makes the sound not play when the iPhone is muted, but that is how it behaves.
I'm using PhoneGap 2.6.0, and the docs say:
Pass in this option to the play method to specify whether you want to
play the audio of the media file when the screen is locked (this
defaults to true if not set). If this is set to true, it will ignore
the state of the hardware mute button.
Badly documented?

AVComposition breaks on AirPlay

I have a video composition which I'd like to play over AirPlay (without mirroring). The app works as expected when using normal AirPlay mirroring, but I'd like to get the speed, reliability, and resolution bump you get from using AirPlay video instead.
The problem is that when I set
player.usesAirPlayVideoWhileAirPlayScreenIsActive = YES;
...the player goes blank.
Notes:
Since I don't create separate windows for each display, they are both trying to use the same AVPlayer.
My AVVideoComposition contains different files and adds opacity ramps between them.
This unanswered question suggests that the problem is more likely due to the fact that I'm playing an AVComposition than to the use of a shared player: AVComposition doesn't play via Airplay Video
Two questions:
Do I have to get rid of the player on the iPad?
Can an AVVideoComposition ever be played over AirPlay?
I can't make comments, so I had to post this as an answer, although it might not fully respond to the questions.
I had a similar issue, and in the end I found out that when AVPlayer plays an AVComposition, it simply doesn't display anything on the external display. That's why I had to do it myself by listening for UIScreen connection notifications.
I have to say that it all worked pretty well. I first check whether there is more than one screen; if there is, I simply move the AVPlayer to that screen, while displaying a simple message on the device's screen that content is being played on... plus the name of the AirPlay device. This way I can put whatever I want on the external display, and it is not very complicated. The same applies when I receive UIScreenDidConnectNotification.
That was fine until I noticed that the composition plays really choppily on the external display, even if it consists of only one video without any complex edits or overlays. The same video plays perfectly if I save it to the Camera Roll or if I use MPMoviePlayerController.
I've tried many things, like lowering resolutions, lowering renderScale, and so on, but with no success.
One thing that bothers me more is how Apple actually does this in iMovie: if you have AirPlay enabled and you play a project (note that it's still not rendered, so it must use a composition to display it), then right after you tap the play button it opens a player that plays the content really smoothly on the external monitor. If, however, you activate AirPlay from the player, it closes and starts rendering the project. After that it plays it, I think by using MPMoviePlayerController.
I'm still trying to find a solution and will post back if I have any success.
So, for the two questions:
I don't see why you would have to get rid of it.
Yes, it can be played, but with a different technique and, obviously, with issues.
In the app's .plist, create a new item called:
Required background modes
and add a new array element called:
App plays audio or streams audio/video using AirPlay
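For reference, Xcode's human-readable names above correspond to the following raw Info.plist XML (the UIBackgroundModes key with the audio value):

```xml
<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>
```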
Not sure if you have already tried this, but you don't mention it in your post.
Cheers!

Playing HTML5 video backwards on iOS

I'm trying to play an HTML5 video in reverse on an iPad (The video needs to switch between forward and reverse arbitrarily based on user input).
The HTML5 <video> element includes a property called playbackRate which allows video to be played at a faster or slower rate, or in reverse. According to Apple's documentation, this property is not supported on iOS.
Playing in reverse can be faked without the use of playbackRate by setting the currentTime property multiple times per second (e.g. 10-30 updates per second). This approach works in desktop Safari, but seeking seems to be capped on iOS devices at around 1 update per second - too slow in my case.
Is there any way to play an HTML5 video backward on an iOS device (namely an iPad)?
(I'm testing on an iPad 2 running 4.3.1)
As Mihai suggested, use the two versions and update the seek location when the user changes the playback direction.
Layer the videos in DIVs on top of one another, and when the playback direction is flipped, toggle the DIV visibility (and pause playback of the one being hidden).
So this is the timeline:
1. User clicks the playback toggle.
2. Pause the displayed video.
3. Get the seek location of the displayed video.
4. Subtract that value from the video duration.
5. Seek to this value in the non-displayed video.
6. Toggle the displayed video DIVs.
7. Begin playback of the newly displayed video.
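Sketched in JavaScript, the timeline above looks roughly like this (the helper names and the assumption of two layered video elements are mine):

```javascript
// Map a playhead position in one clip to the matching position in the
// reversed copy: frame t in the forward clip sits at (duration - t)
// in the reverse clip.
function mirroredTime(duration, currentTime) {
    return duration - currentTime;
}

// One direction toggle; `shown` and `hidden` are the two layered
// <video> elements (names are illustrative).
function toggleDirection(shown, hidden) {
    shown.pause();                        // pause the displayed video
    hidden.currentTime = mirroredTime(shown.duration, shown.currentTime); // seek the hidden copy
    shown.style.display = 'none';         // swap DIV visibility
    hidden.style.display = 'block';
    hidden.play();                        // resume in the new direction
}
```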
The playbackRate attribute is now supported on iOS 6 Safari.
Why not stitch the reverse and forward versions together into one movie?
That removes the problem of unloading and loading video when the user flips the direction. With a single-movie approach, when the user flips direction, all you need to do is figure out the corresponding point in the other half of the movie and seek there.
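The seek math for the stitched clip is simple: if each half lasts D seconds (forward half in [0, D), reversed half in [D, 2D)), then a frame at time t in one half appears at 2D - t in the other. A sketch, with a helper name of my own:

```javascript
// For a stitched movie of length 2 * halfDuration (forward half followed
// by the reversed half), map a position to its counterpart in the other
// half. The mapping is its own inverse.
function flipPoint(t, halfDuration) {
    return 2 * halfDuration - t;
}
```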
My suggestion is to make a "fake" video player in your HTML code. Then capture the user's attempt to "play" the video using the Safari/iOS callback methods. Then create an MPMoviePlayerController that actually loads the video, and display it over the original position of the video in the browser, or only support full-screen playback. The MPMediaPlayback protocol supports reverse playback via the currentPlaybackRate property, so hopefully this is only a temporary fix, as I can't see the iOS version of Safari not eventually implementing this feature, since it is supported by the native player.
NOTE: I was stupid and missed a non-trivial part of the question, so the following is pretty useless for iOS purposes
After a moment of Googling, I found the following code along with explanation on how to support reverse-playback of html5 video on Webkit based browsers:
function speedup(video, direction) {
    if (direction == undefined) direction = 1; // or -1 for reverse
    if (video.playbackRate != undefined) {
        video.playbackRate = direction == 1 ? 2 : -2;
    } else { // do it manually with a timer
        video.setAttribute('data-playbackRate', setInterval((function playbackRate() {
            video.currentTime += direction;
            return playbackRate; // run once immediately, then every 500 ms via setInterval
        })(), 500));
    }
}
function playnormal(video) {
    if (video.playbackRate != undefined) {
        video.playbackRate = 1;
    } else { // do it manually
        clearInterval(video.getAttribute('data-playbackRate'));
    }
}
Source: http://net.tutsplus.com/tutorials/html-css-techniques/html5-audio-and-video-what-you-must-know/
You need to pass the HTML5 video object into the speedup function, but then again that function could possibly be reduced to the following (I haven't tested this yet; it is an iPad-specific function):
function reverseVideo(video) {
video.playbackRate = -1;
}
Feel free to play around, and search for more information on the html5 video element :)
