YouTube API detecting advertisements

Is there a way to detect a YouTube advertisement, both when it starts playing and when it ends?
Something like:
function onPlayerStateChange(event) {
  if (event.data == advertisement) {
    console.log("Advertisement is playing");
  }
  if (event.data == advertisementIsOver) {
    console.log("Advertisement has finished playing");
  }
}
I see the question here:
What is the YouTube's PlayerState during pre-roll ad?
and am wondering whether there have been any updates to the YouTube API since then. Also, can someone provide some code for a YouTube advertisement detector? I am not sure how to reliably catch when an advertisement is playing.

A little late but better late than never...
I'm currently developing a Chrome extension specifically targeted at YouTube video playback manipulation. This is one method I've found to detect whether an ad is playing.
Note: all the classes and ids I use are subject to change as YouTube developers change them.
1) Get the HTML5 video element first: var mainVideo = document.getElementsByClassName("html5-main-video")[0]. There is no element id for it at the moment, but it always has the class html5-main-video.
2) Set an event handler for when the video is ready to play. It will be fired when a new video is loaded and ready to play, and it will check whether or not the video is an ad: mainVideo.oncanplay = isVideoAnAd.
3) When an ad is playing, the progress bar is yellow, or more specifically rgb(255, 204, 0), and this property is easily comparable using jQuery:
function isVideoAnAd() {
  if ($(".ytp-play-progress").css("background-color") === "rgb(255, 204, 0)") {
    return true;
  }
  return false;
}
For more reliable results you can also check the movie_player element's classList to see if it contains("ad-showing"). By the way, movie_player is the element's id.
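For illustration, here is a minimal sketch of that classList check combined with the oncanplay handler from step 2. The class and id names are taken from this answer and may change at any time:
function isAdShowing() {
  var moviePlayer = document.getElementById("movie_player");
  // The "ad-showing" class is present on #movie_player while an ad plays
  return moviePlayer && moviePlayer.classList.contains("ad-showing");
}

var mainVideo = document.getElementsByClassName("html5-main-video")[0];
mainVideo.oncanplay = function () {
  if (isAdShowing()) {
    console.log("Advertisement is playing");
  }
};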
Tip: I found all of this with inspect element
This is really the only reliable way I've found to detect ads without having to dive into the YouTube API.

You might want to check this documentation. It states that you can use onAdStarted(), which is called when playback of an advertisement starts.
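As a rough, hypothetical sketch: the IFrame API's addEventListener() takes an event name and the name of a handler function, so subscribing might look like the following. The "onAdStarted" event name is an assumption based on the documentation referred to above and may not be available in every player:
function onAdStartedHandler() {
  console.log("Advertisement is playing");
}

var player = new YT.Player('player', {
  videoId: 'VIDEO_ID', // placeholder
  events: {
    'onReady': function (event) {
      // addEventListener is part of the IFrame API; the event name below is assumed
      event.target.addEventListener('onAdStarted', 'onAdStartedHandler');
    }
  }
});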
Here is a related forum and tutorial which might help.

Related

Programmatically trigger the action that a headphone pause button would do

I am trying to find a way to pause any playing media on the device, so I was thinking of triggering the same logic that is fired when a user presses the headphone "middle button".
I managed to prevent music from resuming (after I pause it within my app, which basically starts an AVAudioSession for recording) by NOT setting the AVAudioSession active property to false and leaving it hanging, but I am pretty sure that's a bad way to do it; if I deactivate it, the music resumes. The other option I am thinking of is playing some kind of silent loop that would "imitate" the silence I need. But I think that if what I am seeking is doable, it would be the best approach, as I understood from this question that it cannot be done using the normal means.
func stopAudioSession() {
    let audioSession = AVAudioSession.sharedInstance()
    do {
        if audioSession.secondaryAudioShouldBeSilencedHint {
            print("someone is playing....")
        }
        try audioSession.setActive(false, options: .notifyOthersOnDeactivation)
        isSessionActive = false
    } catch let error as NSError {
        print("Unable to deactivate audio session: \(error.localizedDescription)")
        print("retrying.......")
    }
}
In this code snippet, as the function name implies, I set active to false. I tried to find other options, but I could not find another way of stopping my recording session while preventing the other app that was already playing from resuming.
Can someone guide me to which library I should look into, for example whether I can tap into the hardware part and trigger it, or which library is listening to this button press event and handling the pause/play functionality?
A friend of mine who is more experienced in iOS development suggested the following workaround and it worked. I am posting it here as it might help someone trying to achieve similar behaviour.
In order to stop/pause what is currently being played on a user's device, you will need to add a music player to your app. Then, at the point where you need to pause/stop the current media, you just initiate the player, play, and then pause/stop it - simple :)
like so:
let musicPlayer = MPMusicPlayerApplicationController.applicationQueuePlayer

func stopMedia() {
    MPMediaLibrary.requestAuthorization({ (newPermissionStatus: MPMediaLibraryAuthorizationStatus) in
        self.musicPlayer.setQueue(with: .songs())
        self.musicPlayer.play()
        print("Stopping music player")
        self.musicPlayer.pause()
        print("Stopped music player")
    })
}
The part with MPMediaLibrary.requestAuthorization is needed to avoid an authorisation error when accessing the user's media library.
And of course you will need to add the Privacy - Media Library Usage Description key to your Info.plist file.

Web Audio API on iOS Safari do not play even after user interaction

I know that there is a limitation in iOS Safari where audio does not play until the user triggers an interaction, so I have placed the code inside a touchstart event. Unfortunately, I have tried almost every combination and I couldn't get it to play in iOS Safari.
Here are the things I have tried:
putting the audio load outside the touchstart callback
try adding a gain node
use 0.01 as the start time
None of the above works in iOS Safari, but they all play in desktop Chrome and Safari. Here is the link to the gist; you can see the versions where I made the changes (P.S. the click event is used for testing on desktop):
https://gist.github.com/angelathewebdev/32e0fbd817410db5dea1
Sounds play only when currentTime starts to run, but scheduling sounds exactly at currentTime doesn't seem to work. They need to be a little bit into the future (e.g. 10 ms). You can use the following createAudioContext function to wait until the context is ready to make noise. User action doesn't seem to be required on iPhone, but no such success on iPad just yet.
function createAudioContext(callback, errback) {
  var ac = new webkitAudioContext();
  ac.createGainNode(); // .. and discard it. This gets
                       // the clock running at some point.
  var count = 0;
  function wait() {
    if (ac.currentTime === 0) {
      // Not ready yet.
      ++count;
      if (count > 600) {
        errback('timeout');
      } else {
        setTimeout(wait, 100);
      }
    } else {
      // Ready. Pass on the valid audio context.
      callback(ac);
    }
  }
  wait();
}
Subsequently, when playing a note, don't call .noteOn(ac.currentTime), but do .noteOn(ac.currentTime + 0.01) instead.
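For example, a short sketch using the same legacy prefixed API as the function above. The silent buffer is just a stand-in so the snippet is self-contained; the point is the small offset added to currentTime:
createAudioContext(function (ac) {
  // A short silent buffer just to demonstrate scheduling; a real app would
  // assign a decoded AudioBuffer here instead.
  var buffer = ac.createBuffer(1, 4410, 44100);
  var source = ac.createBufferSource();
  source.buffer = buffer;
  source.connect(ac.destination);
  // Schedule slightly in the future rather than exactly at ac.currentTime
  source.noteOn(ac.currentTime + 0.01);
}, function (err) {
  console.log('Audio context never became ready: ' + err);
});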

Get current duration of YouTube Live Event

Is there a way to get the current time of the recorded stream when broadcasting to YouTube Live? I want to be able to send an API request at certain points throughout a live stream to get the current minute/second of the stream. The end result I am trying to achieve is to be able to log a list of highlights. Essentially, a user presses a button and it gets the current time of the stream at that moment, then the user can add a note for what happened at that time. From reading all the docs, though, I cannot find a way to get the current time of the recorded stream.
Looks like you can do this with the iFrame API's getDuration() method.
https://developers.google.com/youtube/iframe_api_reference#getDuration
Check out the special note for live events:
If the currently playing video is a live event, the getDuration() function will return the elapsed time since the live video stream began. Specifically, this is the amount of time that the video has streamed without being reset or interrupted. In addition, this duration is commonly longer than the actual event time since streaming may begin before the event's start time.
You didn't specify a language, so I'll post code examples in two different languages. Both utilize the iFrame API.
JavaScript:
window.onYouTubePlayerReady = function(playerId) {
  window.ytplayer = document.getElementById("ytPlayer");
  console.log(window.ytplayer.getDuration());
}
Objective-C (using YouTube's youtube-ios-player-helper class):
@property (weak, nonatomic) IBOutlet YTPlayerView *playerView;
// ...
- (void)viewDidLoad {
    [super viewDidLoad];
    [self.playerView loadWithVideoId:@"iGTIK_8ydoI"]; // live at the time this answer was posted
}
// ...
- (void)getDurationOfPlayingVideo {
    NSLog(@"duration: %f", [self.playerView duration]);
}
Just as a disclaimer from my personal testing: the Live Streaming API is extraordinarily temperamental and unstable, and I've found that some live events return a duration of 0.
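For completeness, the JavaScript snippet above uses the legacy onYouTubePlayerReady callback; with the current IFrame API constructor the same call might look like this (the player div id and video id are placeholders):
var player = new YT.Player('player', {
  videoId: 'LIVE_VIDEO_ID', // placeholder
  events: {
    'onReady': function (event) {
      // For a live event this returns the elapsed time since the stream began
      console.log(event.target.getDuration());
    }
  }
});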
This is old, but you can get liveStreamingDetails.actualStartTime through the YouTube API.
With the actualStartTime in hand, you can calculate how much time has elapsed.
"https://www.googleapis.com/youtube/v3/videos"
"?part=liveStreamingDetails"
"&id=$id&key=$_key"

OnBeforePlay .seek doesn't work on iPad

I've scoured the web, upgraded the player, rewritten it 5 times, and now completing my 5th day of failing, and still cannot accomplish what the folks at Longtail tell me will work. (Don't get me wrong, I love 'em there, but this has me ready to jump off a bridge).
I'm simply trying to load a video that will play with Flash or iOS and, upon loading it, immediately go to a specific point in the video using the .seek() method. Longtail tells me to use the onBeforePlay() function because iOS apparently doesn't respect the start value of the playlist. This code works like smoke with Flash, but ignores the seek in iOS.
Can ANYone assist me with this - it has become the most expensive script I've ever worked on and I have made zero progress at all. :( :( :( Also, I removed all the console functions and tried that, but with the same result.
Full code/player can be seen at http://www.tempurl.us/jw6e.html. You can see that with Flash, the video starts at 60 seconds, but on iOS, it starts at 0.
jwp = jwplayer('jwp').setup({
  title: 'Single File Player', width: '720', height: '240', autostart: 'false',
  listbar: {position: "right", size: 400},
  sources: [
    { file: 'http://media3.scctv.net/insight/mp4:nursing_4_clips_400.mp4/playlist.m3u8' },
    { file: 'rtmp://fms.scctv.net/insight/nursing_4_clips_400.mp4' }
  ]
});

jwp.onReady(function() {
  // Create a playlist item of the video to play
  var newItem = [
    { title: 'Title4 ACUTE_ABDO_PAIN_400',
      image: 'playlistitem.png',
      sources: [
        { file: 'http://media3.scctv.net/insight/mp4:ACUTE_ABDO_PAIN_400.mp4/playlist.m3u8' },
        { file: 'rtmp://fms.scctv.net/insight/ACUTE_ABDO_PAIN_400.mp4' }
      ]
    }
  ];
  jwp.load(newItem);
});

jwp.onBeforePlay(function() {
  // This works on PC/Mac with Flash, but does nothing on iPad/iPhone
  jwp.seek(60);
});
Simply to close the question, the bottom line on this problem was that iOS will not allow autostart - period. Knowing that, all the expected events that were not behaving as expected made sense. Once the user initiates the stream with Play, everything works as expected. In our case, this is still a problem because we want to start later in the stream, but knowing that made dealing with it more manageable.
If the problem is that "iOS will not allow autostart - period. Knowing that, all the expected events that were not behaving as expected made sense. Once the user initiates the stream with Play, everything works as expected",
then you can have a play button only for tablets and iOS devices and, on clicking that play button, call jwplayer().play(). This could be a workaround for your problem: after you have invoked jwplayer().play(), which is only possible with a touch event, the other events will work once play is triggered.
Otherwise, even if you try jwplayer().play() in onReady(), or autostart, nothing will work because, as you said, iOS will not allow autostart.
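A minimal sketch of that play-button workaround, assuming a user-agent check and a button element whose id is purely illustrative:
var isMobile = /iPad|iPhone|iPod|Android/.test(navigator.userAgent);
var playButton = document.getElementById('mobile-play-button'); // illustrative id

if (isMobile) {
  playButton.style.display = 'block';
  playButton.addEventListener('click', function () {
    // The touch/click gesture is what makes play() allowed on iOS
    jwplayer().play();
  });
}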
I've solved this problem on iOS using onBeforePlay with seek() and play(). This works on desktop Flash and on iOS. It doesn't work on Android when using the parameter androidhls: true.
jwplayer().onBeforePlay(function() { jwplayer().seek(60); });
jwplayer().play();
As Ethan JWPlayer mentioned in a comment, use the onPlay event. To prevent the "loop buffering" you mentioned, just use a flag variable:
var isFirstStart = true,
    seekValue = 60;

jwplayer().onPlay(function() {
  // exit if it's not the first playback start
  if (!isFirstStart) {
    return;
  }
  jwplayer().seek(seekValue);
  isFirstStart = false;
});

How to bind the HTML5::stalled event from soundmanager?

I'm trying to write a JavaScript app that uses the SoundManager 2 API and aims to run in all desktop and mobile browsers. On the iPad platform, SoundManager uses the HTML5 audio API since there is no Flash support. Now, when I try to play two audio files back to back, both loaded in response to a click event, an HTML5 stalled event is occasionally raised. How do I set an event handler to catch the stalled event?
Since sound objects in my app are created on the fly and I don't know how to directly access the tags that are created by SoundManager, I tried to use a delegate to handle the stalled event:
document.delegate('audio', 'stalled', function (event) {...});
It doesn't work: the event is not raised in response to stalled (I had an alert in my handler).
I also tried to use Sound::onsuspend() to listen for stalled, but onsuspend fires at the end of sound::play(). How can we distinguish stalled from other events that may raise the audio suspend? Is there any other way to access the tags that SoundManager must create in order to play HTML audio?
I solved it with the following solution. This is not documented and was found by reverse engineering.
It is all about accessing the HTML audio object, which is available under _a.
currentSound = soundManager.createSound({..});
currentSound._a.addEventListener('stalled', function() {
  if (!self.currentSound) return;
  var audio = this;
  audio.load();
  audio.play();
});
The body of the method is based on this post about html5 stalled callback in safari
I can suggest a different "fix" I use on an HTML5-based platform (Samsung smart TV):
var mySound = soundManager.createSound({..});
mySound.load();
setTimeout(function() {
  if (mySound.readyState == 1) {
    // this object is probably stalled
  }
}, 1500);
This works because in HTML5, unlike Flash, the readyState property jumps from 0 to 3 almost instantaneously, skipping 1 ('cause if the track started buffering, it's playable...).
Hope this works for you as well.
