iOS: check if an m3u8 stream is connected

I use MPMoviePlayerController to play an m3u8 stream, but it doesn't provide a method to check whether the m3u8 link is working, so if the link is dead I have to wait while MPMoviePlayerController tries to play for a while before I can tell that it isn't working. In this situation I want to show an AlertView if the link is dead, so the user doesn't have to wait, before sending the link to MPMoviePlayerController. Is there any way to do it?

Try this once.
Normally we check this after some time, so we can set a time delay:
[self performSelector:@selector(movieTimedOut) withObject:nil afterDelay:20.f];

- (void)movieTimedOut
{
    if (!(self.loadState & MPMovieLoadStatePlayable) || !(self.loadState & MPMovieLoadStatePlaythroughOK))
    {
        // Still not playable after the delay -- show the AlertView here
    }
}
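Not part of the answer above, but another rough sketch (in Swift, assuming a plain HTTPS playlist URL) is to probe the link with a lightweight HEAD request before handing it to MPMoviePlayerController, so a dead link can trigger the AlertView immediately. The helper name and the 2xx check are my own assumptions; a reachable server only proves the link answers, not that the playlist itself plays.

import Foundation

// Hypothetical helper: issue a HEAD request against the m3u8 URL and report
// whether the server answered with a 2xx status before starting playback.
func checkStream(urlString: String, completion: @escaping (Bool) -> Void) {
    guard let url = URL(string: urlString) else {
        completion(false)
        return
    }
    var request = URLRequest(url: url, timeoutInterval: 10)
    request.httpMethod = "HEAD"   // headers only, no playlist body

    URLSession.shared.dataTask(with: request) { _, response, error in
        let status = (response as? HTTPURLResponse)?.statusCode ?? 0
        let reachable = (error == nil) && (200..<300).contains(status)
        DispatchQueue.main.async {
            completion(reachable)   // false -> show the alert instead of the player
        }
    }.resume()
}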

Related

Programmatically trigger the action that a headphone pause button would do

I am trying to find a way to pause any playing media on the device, so I was thinking of triggering the same logic that is fired when a user presses the headphone "middle button".
I managed to prevent music from resuming (after I pause it within my app, which basically starts an AVAudioSession for recording) by NOT setting the AVAudioSession active property to false and leaving it hanging, but I am pretty sure that's a bad way to do it. If I deactivate it, the music resumes. The other option I am thinking of is playing some kind of silent loop that would "imitate" the silence I need. But if what I am seeking is doable, I think it would be the best approach, since as I understood from this question it cannot be done by normal means.
func stopAudioSession() {
    let audioSession = AVAudioSession.sharedInstance()
    do {
        if audioSession.secondaryAudioShouldBeSilencedHint {
            print("someone is playing....")
        }
        try audioSession.setActive(false, options: .notifyOthersOnDeactivation)
        isSessionActive = false
    } catch let error as NSError {
        print("Unable to deactivate audio session: \(error.localizedDescription)")
        print("retrying.......")
    }
}
In this code snippet, as the function name implies, I set active to false. I tried to find other options, but I could not find another way of stopping my recording session while preventing the other app that was already playing from resuming.
Can someone guide me to which library I should look into, whether, for example, I can tap into the hardware side and trigger the button press myself, or find out which library listens for this button press event and handles the pause/play functionality?
A friend of mine who is more experienced in iOS development suggested the following workaround and it worked, so I am posting it here as it might help someone trying to achieve a similar behaviour.
In order to stop/pause what is currently being played on a user's device, you will need to add a music player into your app. Then, at the point where you need to pause/stop the current media, you just initiate the player, play, and then pause/stop it - simple :)
Like so:
import MediaPlayer

let musicPlayer = MPMusicPlayerApplicationController.applicationQueuePlayer

func stopMedia() {
    MPMediaLibrary.requestAuthorization({ (newPermissionStatus: MPMediaLibraryAuthorizationStatus) in
        self.musicPlayer.setQueue(with: .songs())
        self.musicPlayer.play()
        print("Stopping music player")
        self.musicPlayer.pause()
        print("Stopped music player")
    })
}
The part with MPMediaLibrary.requestAuthorization is needed to avoid an authorisation error when accessing the user's media library, and of course you will need to add the Privacy - Media Library Usage Description key to your Info.plist file.
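One small refinement (my own assumption, not part of the original workaround): the closure above ignores the permission result, so it may be safer to bail out unless access was actually granted, something like:

import MediaPlayer

MPMediaLibrary.requestAuthorization { status in
    // Only touch the player if the user granted media library access.
    guard status == .authorized else { return }
    self.musicPlayer.setQueue(with: .songs())
    self.musicPlayer.play()
    self.musicPlayer.pause()
}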

YouTube API: detecting advertisements

Is there a way to detect a YouTube advertisement, both while it is playing and when it ends?
Something like:
function onPlayerStateChange(event) {
    if (event.data == advertisement) {
        console.log("Advertisement is playing");
    }
    if (event.data == advertisementIsOver) {
        console.log("Advertisement has finished playing");
    }
}
I see the question here:
What is the YouTube's PlayerState during pre-roll ad?
I am wondering if there have been any updates to the YouTube API. Also, can someone provide some code for a YouTube advertisement detector? I am not sure how one reliably catches when an advertisement is playing.
A little late but better late than never...
I'm currently developing a chrome extension specifically targeted towards YouTube video playback manipulation. This is one method I've found to detect whether an ad is playing:
Note: all the class names and IDs I use are subject to change whenever the YouTube developers change them.
1) Get the html5 video element first: var mainVideo = document.getElementsByClassName("html5-main-video")[0]. There is no element id for it at the moment, but it always has the class html5-main-video.
2) Set an event handler that checks whether or not the video is an ad; it is fired when a new video is loaded and ready to play: mainVideo.oncanplay = isVideoAnAd.
3) When an ad is playing, the progress bar is yellow, or more specifically rgb(255, 204, 0), and this property is easy to compare using jQuery:
function isVideoAnAd() {
    if ($(".ytp-play-progress").css("background-color") === "rgb(255, 204, 0)") {
        return true;
    }
    return false;
}
For more reliable results you can also check the movie_player element's classList to see if it contains "ad-showing" (movie_player is the element's id).
Tip: I found all of this with inspect element
This is really the only reliable way I've found to detect ads without having to dive into the YouTube API.
You might want to check this documentation. It states that you can use onAdStarted(), which is called when playback of an advertisement starts.
Here is a related forum and tutorial which might help.

Get current duration of YouTube Live Event

Is there a way to get the current time of the recorded stream when broadcasting to YouTube Live? I want to be able to send an API request at certain points throughout a live stream to get the current minute/second of the stream. The end result I am trying to achieve is to be able to log a list of highlights: essentially, a user presses a button and it gets the current time of the stream at that moment, then the user can add a note for what happened at that time. From reading all the docs, though, I cannot find a way to get the current time of the recorded stream.
Looks like you can do this with the iFrame API's getDuration() method.
https://developers.google.com/youtube/iframe_api_reference#getDuration
Check out the special note for live events:
If the currently playing video is a live event, the getDuration() function will return the elapsed time since the live video stream began. Specifically, this is the amount of time that the video has streamed without being reset or interrupted. In addition, this duration is commonly longer than the actual event time since streaming may begin before the event's start time.
You didn't specify a language, so I'll post code examples in two different languages. Both utilize the iFrame API.
JavaScript:
window.onYouTubePlayerReady = function(playerId) {
    window.ytplayer = document.getElementById("ytPlayer");
    console.log(window.ytplayer.getDuration());
}
Objective-C (using YouTube's youtube-ios-player-helper class)
@property (weak, nonatomic) IBOutlet YTPlayerView *playerView;

// ...

- (void)viewDidLoad {
    [super viewDidLoad];
    [self.playerView loadWithVideoId:@"iGTIK_8ydoI"]; // live at the time this answer was posted
}

// ...

- (void)getDurationOfPlayingVideo {
    NSLog(@"duration: %f", [self.playerView duration]);
}
Just as a disclaimer from my personal testing: the Live Streaming API is extraordinarily temperamental and unstable, and I've found that some Live Events return a duration of 0.
This is old, but you can get liveStreamingDetails.actualStartTime through the YouTube Data API.
With the actualStartTime in hand, you can calculate how much time has elapsed.
"https://www.googleapis.com/youtube/v3/videos"
"?part=liveStreamingDetails"
"&id=$id&key=$_key"

Recording against a metronome of set length using Remote IO

I was able to create the exact functionality I wanted with AVAudioPlayer and AVAudioRecorder, but of course experienced latency problems. So after reading pretty much every article on the web and reviewing stacks of sample code, I'm still not sure how to achieve the following:
User chooses to record a sample 2 bars long (4 beats per bar) with a pre-roll/count-in
User clicks record
A metronome starts which counts in 4 beats (accent on the first beat)
The app automatically starts recording on the start of the next bar
The app automatically turns off recording at the end of the 3rd bar (the 2 bars + the pre-roll)
The user can then playback their recording or delete it and start again.
So, with AVAudioPlayer and AVAudioRecorder I simply created a 'caf' in Audacity with a metronome set at the correct BPM (the BPM is fixed for the app). I then set up and played the AVAudioPlayer and, using the audioPlayerDidFinishPlaying:successfully: delegate method, performed some logic to start the recorder, restart the player, maintain a loop count, etc., and to turn off recording and audio.
As I mentioned, I was pretty much able to achieve the user experience I am after, but the latency problems are not acceptable.
I have been working with Audio Units and Remote IO and have set up a project with a playback callback, recorder callback, etc., but now face the problem of working out how to make this work based on the description above. I am trying to work out the following things for starters:
If I create a 1 beat caf file, how could I make use of audio units and remote IO to play x amount of beats and then stop?
How could I do the pre-roll and start the recording callback after 4 beats
Can anyone give me some ideas or point me in the right direction? As I have mentioned, I have already done a stack of research, including buying the Core Audio book, reading every article on atastypixel.com, timbolstad.com, etc., and trawling through the Apple docs.
Thanks in advance for your help.
I start an NSTimer, with the interval derived from the BPM (beats per minute): 60 / BPM gives the number of seconds per beat. So if the user wants to record a 2-bar file with a count-in, you might do something like this:
// timer interval = 60 secs per minute / 100 BPM = 0.6 s per beat
timerInterval = 60.0 / 100.0;
metroTimer = [NSTimer scheduledTimerWithTimeInterval:timerInterval target:self selector:@selector(blinkMetroLight) userInfo:nil repeats:YES];
- (void)blinkMetroLight
{
    if (beatNumber == 0)
    {
        beatNumber = 1;
    }
    else if (beatNumber == 5)
    {
        // Count-in finished -- start recording
        [self audioProcessorStart];
    }
    if (beatNumber == 8)
    {
        // End of the recorded bars -- stop recording and the metronome
        [self audioProcessorStop];
        [metroTimer invalidate];
        metroTimer = nil;
    }
    beatNumber++;
}
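Since the question is specifically about Remote IO, an alternative to NSTimer (my own sketch in Swift, not part of the answer above) is to work in samples: at a known sample rate the count-in and the recording window have exact lengths in frames, and the render/recording callback can compare a running frame counter against those offsets instead of relying on timer callbacks. The type name and the bpm/sampleRate values below are placeholders.

// Sketch: convert a 4-beat count-in plus 2 recorded bars into frame offsets
// that a Remote IO render/recording callback can compare against.
struct RecordingWindow {
    let recordStartFrame: Int64
    let recordEndFrame: Int64

    init(bpm: Double, beatsPerBar: Int, barsToRecord: Int,
         countInBeats: Int, sampleRate: Double) {
        let framesPerBeat = Int64(sampleRate * 60.0 / bpm)   // e.g. 44100 * 0.6 at 100 BPM
        recordStartFrame = Int64(countInBeats) * framesPerBeat
        recordEndFrame   = recordStartFrame +
            Int64(beatsPerBar * barsToRecord) * framesPerBeat
    }

    // Called from the audio callback with the running frame count;
    // returns true while the incoming buffer should be written to the file.
    func shouldRecord(atFrame frame: Int64) -> Bool {
        return frame >= recordStartFrame && frame < recordEndFrame
    }
}

// Example: 100 BPM, 4/4, one-bar count-in, 2 bars recorded, 44.1 kHz
let window = RecordingWindow(bpm: 100, beatsPerBar: 4, barsToRecord: 2,
                             countInBeats: 4, sampleRate: 44100)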

Does receiving moviePreloadDidFinish imply a successful load of content

I am using the MPMoviePlayerController to play an audio stream. To verify that there isn't a problem with playback, I set a movie playback error timer and I implement moviePreloadDidFinish. When moviePreloadDidFinish is called, I check the loadState for MPMovieLoadStatePlaythroughOK. If it is not called and my timer expires, I assume the download has failed.
- (void)moviePreloadDidFinish:(NSNotification *)notification
{
    if (self.moviePlayer.loadState & MPMovieLoadStatePlaythroughOK) {
        NSLog(@"The movie or mp3 finished loading and will now start playing");
        // cancel movie playback error timer.
    }
}
Occasionally, I do not receive this notification, yet audio keeps playing until my movie playback error timer expires (30 seconds). Does the absence of moviePreloadDidFinish imply that the download of the audio stream is going to fail soon? If not, is there a better way to programmatically determine that there is a playback problem?
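One additional signal worth watching (a sketch of my own in Swift, not necessarily the better way being asked about) is the playback-did-finish notification: when a stream fails, its userInfo reason is typically MPMovieFinishReasonPlaybackError, which could cancel the error timer and surface the failure right away. The moviePlayer reference below stands in for the existing MPMoviePlayerController.

import MediaPlayer

// Sketch: treat a "did finish" notification with a playback-error reason
// as a failed stream instead of waiting for the 30-second timer.
NotificationCenter.default.addObserver(
    forName: .MPMoviePlayerPlaybackDidFinish,
    object: self.moviePlayer,   // the existing MPMoviePlayerController
    queue: .main) { note in
        let rawReason = note.userInfo?[MPMoviePlayerPlaybackDidFinishReasonUserInfoKey] as? Int
        if rawReason == MPMovieFinishReason.playbackError.rawValue {
            // Cancel the movie playback error timer here and show the failure UI.
            print("Stream failed to play")
        }
}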
