Audio duration sometimes changes to near-zero in Safari (macOS and iOS)

I am encountering a strange issue in Safari, both on macOS and iOS. Initially it seemed like audio would sometimes just not play, without emitting any errors. After adding a lot of logging I found that when I call audio.play() the audio.duration property accurately reflects the duration of the clip. The pause and then ended events are then emitted almost immediately, and after that the duration for the same clip is suddenly somewhere between 0.001 and 0.003 seconds.
I am retaining a reference to the audio element and reusing it to play the same audio multiple times. It almost always works the first time, but after the first playthrough the symptoms described above appear on about 50% of subsequent plays.
The code where I play the audio is below:
// In the constructor for the class managing audio playback:
this.mediaElement.addEventListener('pause', e => {
  console.log('Media paused', this.mediaElement.duration); // This shows the very short duration if the audio did not play
});

// In the function in the class that plays the media.
try {
  await this.mediaElement.play();
  console.log('Media is playing', this.mediaElement.duration); // This shows an accurate duration
  this.state = Playable.state.PLAYING;
} catch (e) {
  console.error('Playing media failed:', e);
  if (e && e.name === 'NotAllowedError') {
    ErrorHandler.playbackNotAllowed(e);
    this.state = Playable.state.PAUSED;
    return;
  } else {
    this._fail(e);
    this._fakePlay();
  }
}
As you can see, I'm not doing anything strange or complicated here. So far I haven't been able to figure out why the duration would change in this way only after playing the audio. Is this a known bug, or is there something I may be doing that could cause this behavior? The closest thing I can think of is that I will sometimes set currentTime to 0 if I need to start playing the audio from the beginning, but that should not change the duration.
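For context, the replay path that resets currentTime looks roughly like the sketch below. This is my own illustration rather than the exact code from the question; the replayFromStart name is hypothetical, and the commented-out load() call is one possible workaround (forcing Safari to re-fetch the media), not a confirmed fix.
// Hypothetical sketch of replaying the same retained audio element from the start.
async function replayFromStart(mediaElement) {
  mediaElement.currentTime = 0; // restart from the beginning; should not change duration
  // mediaElement.load();       // possible workaround: force Safari to re-fetch/re-parse the clip
  await mediaElement.play();
  console.log('duration after replay:', mediaElement.duration);
}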

Related

YouTube API: when clicking to a different time, does BUFFERING (state=3) always occur?

I am using the YouTube API and was wondering whether, when you click the YouTube player's timeline (at the bottom of the video) to skip forward or back to an earlier point in the video, the BUFFERING state (state=3) always occurs.
For example:
function onPlayerStateChange(event) {
  // The video is buffering; one cause is the user
  // clicking to progress/go back in the video.
  // Does the buffering state always happen in this case?
  if (event.data == 3) {
    // BUFFERING
  }
}
It is intended that 3 (buffering) fires via onStateChange when you press the play button or click the player's timeline. As stated here, the event fires whenever the player's state changes. The data property of the event object that the API passes to your event listener function will specify an integer that corresponds to the new player state. Below are the possible values:
-1 (unstarted)
0 (ended)
1 (playing)
2 (paused)
3 (buffering)
5 (video cued).
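For reference, a minimal sketch of subscribing to that state change with the IFrame Player API; the element id and videoId below are placeholders:
// Sketch: log whenever the player enters the BUFFERING state, e.g. after a seek.
var player = new YT.Player('player', {        // 'player' is a placeholder element id
  videoId: 'M7lc1UVf-VE',                     // placeholder video id
  events: { 'onStateChange': onPlayerStateChange }
});

function onPlayerStateChange(event) {
  if (event.data === YT.PlayerState.BUFFERING) { // YT.PlayerState.BUFFERING === 3
    console.log('Buffering, possibly because the user clicked a new position');
  }
}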

Web Audio API on iOS Safari does not play even after user interaction

I know that there is a limitation in iOS Safari where audio does not play until the user triggers an interaction, so I have placed the code inside a touchstart event handler. Unfortunately, I have tried almost every combination and I couldn't get it to play in iOS Safari.
Here are the things I have tried:
putting the audio load outside the touchstart callback
adding a gain node
using 0.01 as the start time
None of the above works in iOS Safari, but all of them play in desktop Chrome and Safari. Here is the link to the gist; you can see the versions where I made the changes (P.S. the click event is used for testing on desktop):
https://gist.github.com/angelathewebdev/32e0fbd817410db5dea1
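For what it's worth, the general touchstart-unlock pattern being described is roughly the following sketch (the names are my own illustration; the actual attempts are in the linked gist):
// Sketch: "unlock" audio by playing a short silent buffer inside the user gesture.
var ctx = new (window.AudioContext || window.webkitAudioContext)();

document.addEventListener('touchstart', function unlock() {
  var buffer = ctx.createBuffer(1, 1, 22050);  // 1 channel, 1 frame, 22.05 kHz
  var source = ctx.createBufferSource();
  source.buffer = buffer;
  source.connect(ctx.destination);
  if (source.start) { source.start(0); } else { source.noteOn(0); }
  document.removeEventListener('touchstart', unlock);
}, false);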
Sounds play only when currentTime starts to run, but scheduling sounds exactly at currentTime doesn't seem to work. They need to be a little bit into the future (ex: 10ms). You can use the following createAudioContext function to wait until the context is ready to make noise. User action doesn't seem to be required on iPhone, but no such success on iPad just yet.
function createAudioContext(callback, errback) {
  var ac = new webkitAudioContext();
  ac.createGainNode(); // .. and discard it. This gets
                       // the clock running at some point.
  var count = 0;
  function wait() {
    if (ac.currentTime === 0) {
      // Not ready yet.
      ++count;
      if (count > 600) {
        errback('timeout');
      } else {
        setTimeout(wait, 100);
      }
    } else {
      // Ready. Pass on the valid audio context.
      callback(ac);
    }
  }
  wait();
}
Subsequently, when playing a note, don't call .noteOn(ac.currentTime), but do .noteOn(ac.currentTime + 0.01) instead.
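Putting it together, usage might look like the sketch below (same legacy webkitAudioContext API as above; decodedBuffer is assumed to have been produced by decodeAudioData elsewhere):
// Sketch: play a previously decoded AudioBuffer slightly in the future.
createAudioContext(function (ac) {
  var source = ac.createBufferSource();
  source.buffer = decodedBuffer;              // assumed to exist already
  source.connect(ac.destination);
  source.noteOn(ac.currentTime + 0.01);       // a little ahead of currentTime, not exactly at it
}, function (err) {
  console.error('Audio context never became ready:', err);
});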

Is it possible to get audio from an ICY stream with percentage and seek function

I'm trying to play audio from an ICY stream. I'm able to play it with AVPlayer and some good open-source libraries, but I'm not able to control the stream. I have no idea how I can get the percentage played or how to seek to a specific time in the stream. Is that possible? Is there a good library that can help me?
At the moment I'm using AFSoundManager, but I always receive negative numbers for the percentage, and I get an invalid time when trying to seek the stream to a specified time.
That's the code that I'm using:
AFSoundManager.sharedManager().startStreamingRemoteAudioFromURL("http://www.abstractpath.com/files/audiosamples/sample.mp3") { (percentage, elapsedTime, timeRemaining, error, poppi) in
    if error == nil {
        // This block will be fired when the audio progress increases in 1%
        if elapsedTime > 0 {
            println(elapsedTime)
            self.slider.value = Float(elapsedTime * 1000)
        }
    } else {
        // Handle the error
        println(error)
    }
}
I'm able of course to get the elapsedTime but not the percentage or the remainingTime. I always get negative numbers.
This code works perfectly with remote or local audio file but not with the stream.
This isn't possible.
These streams are live. There is nothing to seek to because what you haven't heard hasn't happened yet. Even streams that playback music end-to-end are still "live" in the sense that the audio you haven't received hasn't been encoded yet. (Small codec and transit buffers aside, of course.)

Audio and Recording Reuse on iPhone with Monotouch

I just started testing this very simple audio recording application, built with MonoTouch, on actual iPhone devices today. I encountered an issue with what seems to be the reuse of the AVAudioRecorder and AVPlayer objects after their first use, and I am wondering how I might solve it.
Basic Overview
The application consists of the following three sections :
List of Recordings (TableViewController)
Recording Details (ViewController)
New Recording (ViewController)
Workflow
When creating a recording, the user would click the "Add" button from the List of Recordings area and the application pushes the New Recording View Controller.
Within the New Recording Controller, the following variables are available:
AVAudioRecorder recorder;
AVPlayer player;
Each is initialized prior to its use:
//Initialized during the ViewDidLoad event
recorder = AVAudioRecorder.Create(audioPath, audioSettings, out error);
and
//Initialized in the "Play" event
player = new AVPlayer(audioPath);
Each of these works as intended on the initial load of the New Recording Controller area; however, any further attempts do not seem to work (no audio playback).
The Details area also has a playback portion to allow the user to play back any recordings; however, much like in the New Recording Controller, playback doesn't function there either.
Disposal
They are both disposed as follows (upon exiting / leaving the View):
if (recorder != null)
{
    recorder.Dispose();
    recorder = null;
}

if (player != null)
{
    player.Dispose();
    player = null;
}
I have also attempted to remove any observers that could possibly keep any of the objects "alive", in hopes that would solve the issue, and have ensured they are each instantiated with each display of the New Recording area; however, I still receive no audio playback after the initial recording session.
I would be happy to provide more code if necessary. (This is using MonoTouch 6.0.6)
After further investigation, I determined that the issue was being caused by the AudioSession as both recording and playback were occurring within the same controller.
The two solutions that I determined were as follows:
Solution 1 (AudioSessionCategory.PlayAndRecord)
//A single declaration of this will allow both AVAudioRecorders and AVPlayers
//to perform alongside each other.
AudioSession.Category = AudioSessionCategory.PlayAndRecord;
//Upon noticing very quiet playback, I added this second line, which allowed
//playback to come through the main phone speaker
AudioSession.OverrideCategoryDefaultToSpeaker = true;
Solution 2 (AudioSessionCategory.RecordAudio & AudioSessionCategory.MediaPlayback)
void YourRecordingMethod()
{
    // This sets the session to record audio explicitly
    AudioSession.Category = AudioSessionCategory.RecordAudio;
    MyRecorder.Record();
}

void YourPlaybackMethod()
{
    // This sets the session for playback only
    AudioSession.Category = AudioSessionCategory.MediaPlayback;
    YourAudioPlayer.Play();
}
For some additional information on usage of the AudioSession, visit Apple's AudioSession Development Area.

ytplayer api event when reaching a position in a video?

Is there a way to trigger an event when a video reaches a specific time? I want a callback to fire once the video has reached a certain time, and how long it takes to get there is unpredictable, since the user can skip parts of the video or buffering might delay playback. Simply setting a timed event won't work, because the video might reach that point earlier or later.
I can query the time of the video, but what I want is to get a callback when the video has reached a certain time. Is there a way to do this?
I'm not going to write the full code, but you should set up an interval, like this:
var time = 70; // Time in seconds, e.g. this one is one minute and 10 seconds
var reached = false;
var interval = setInterval(function () {
  if (player.getCurrentTime() >= time && !reached) {
    clearInterval(interval);
    reached = true;
    timeReached();
  }
}, 1000);

function timeReached() {
  // Do what you have to
}
You can use this JavaScript wrapper for the YouTube player API.
The API provides very simple event handling, e.g.:
youtubePlayer.at('5000', function() {
alert("You're five seconds into this Youtube clip");
});
use player.getCurrentTime()!
https://developers.google.com/youtube/iframe_api_reference#Playback_status
