I'm trying to use the YouTube player API in my app, but I don't know how to determine whether the video is a live stream or not. Also, does anybody know how to get the real duration of the video?
Update:
I figured out a way to determine whether content is live or not (I used my backend server to fetch the data), but I still can't get the exact duration of the live video.
If you are using the youtube-ios-player-helper YTPlayerView, the playerView:didChangeToQuality: delegate method will return kYTPlaybackQualityAuto for Live Events.
See my pull request on the repo here as well as related discussion in this issue.
The duration of the video should be returned from the duration method on the player, but I've found this to be rather unreliable, with some Live Events returning a duration of 0. Further discussion can be found in this Stack Overflow question.
This is old, but you can get liveStreamingDetails.actualStartTime through the YouTube Data API.
With actualStartTime in hand, you can calculate how much time has elapsed.
There is also an actualEndTime in liveStreamingDetails.
"https://www.googleapis.com/youtube/v3/videos?part=liveStreamingDetails&id=$id&key=$_key"
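As a minimal sketch of that approach in Python (the endpoint and the liveStreamingDetails/actualStartTime fields are from the Data API; the helper names are mine):

```python
import json
from datetime import datetime, timezone
from urllib.parse import urlencode
from urllib.request import urlopen

API_URL = "https://www.googleapis.com/youtube/v3/videos"

def fetch_live_details(video_id, api_key):
    """videos.list with part=liveStreamingDetails for a single video."""
    query = urlencode({"part": "liveStreamingDetails", "id": video_id, "key": api_key})
    with urlopen(f"{API_URL}?{query}") as resp:
        data = json.load(resp)
    # liveStreamingDetails is absent for ordinary (non-live) uploads.
    return data["items"][0].get("liveStreamingDetails") if data["items"] else None

def elapsed_seconds(actual_start_time, now=None):
    """Seconds elapsed since liveStreamingDetails.actualStartTime (RFC 3339 timestamp)."""
    start = datetime.fromisoformat(actual_start_time.replace("Z", "+00:00"))
    now = now or datetime.now(timezone.utc)
    return (now - start).total_seconds()
```

For an ongoing broadcast the elapsed time is the closest thing to a "duration" you can get; once the event is over, actualEndTime minus actualStartTime gives the final length.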
I'm working with AVPlayer + HLS for a live stream.
Sometimes the video falls behind, and I need all clients to stay in sync with the stream.
How do we know when this happens? I'd want to re-sync playback when it falls behind.
Thanks!
You can use your player's API to analyze playback statistics and events such as rebuffering.
To re-sync, use the seekToTime method, seeking the player from its current time to the newly buffered time.
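The decision logic behind that, sketched language-agnostically in Python (the player wrapper and threshold are hypothetical; with AVPlayer you would read the current time and the end of the last loaded time range, then call seekToTime):

```python
RESYNC_THRESHOLD = 3.0  # seconds behind the live edge we tolerate (assumption)

def should_resync(current_time, live_edge_time, threshold=RESYNC_THRESHOLD):
    """True when playback has drifted too far behind the newest buffered time."""
    return (live_edge_time - current_time) > threshold

def resync(player):
    """Jump a hypothetical player wrapper back to the live edge.

    `player` exposes current_time, live_edge_time and seek(t); with AVPlayer
    the seek(t) call would be seekToTime.
    """
    if should_resync(player.current_time, player.live_edge_time):
        player.seek(player.live_edge_time)
```

Checking this periodically (e.g. on a timer or on playback-stall notifications) keeps all clients within the threshold of the live edge.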
I'm trying to get a live stream working on YouTube. I want to stream 360° content with H.264 video and AAC audio. The stream is started with the YouTube Live API from my mobile app, and librtmp is used to deliver the video and audio packets. I easily get to the point where the live stream health is good and my broadcast and stream are bound successfully.
However, when I try to transition to "testing" like this:
YoutubeManager.this.youtube.liveBroadcasts().transition("testing", liveBroadcast.getId(), "status").execute();
I get stuck on the "testStarting" lifecycle status every time (100% reproducible), while I expect it to change to "testing" after a few seconds so I can then change it to "live".
I don't know what's going on: in the YouTube Live Control Room everything seems fine, but the encoder won't start.
Is this a common issue? Is there a way to access the encoder logs? If you need more information, feel free to ask.
Regards.
I found a temporary fix!
I noticed two things:
When the autostart option was on, the stream changed its state to liveStarting as soon as I stopped sending data. This suggests the encoder was trying to start but was too slow to do so before the next data packet was received (I guess).
When I tried to stream to the "Stream now" URL, as #noogui suggested, it worked! So I checked what differed between the "Stream now" and event configurations.
It turned out I just had to activate the low-latency option, as it is enabled by default in the "Stream now" configuration.
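For reference, a sketch of the request body that carries that setting when creating the broadcast (field names are from the liveBroadcasts resource; enableLowLatency was the boolean flag at the time, and newer API revisions express the same thing via latencyPreference):

```python
def broadcast_body(title, start_time_iso, low_latency=True):
    """Request body for liveBroadcasts.insert with part=snippet,contentDetails,status.

    Only the fields relevant here are shown; title and start time are placeholders.
    """
    return {
        "snippet": {"title": title, "scheduledStartTime": start_time_iso},
        "status": {"privacyStatus": "private"},
        # The option that unblocked the testStarting -> testing transition for me.
        "contentDetails": {"enableLowLatency": low_latency},
    }
```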
I consider it a temporary fix because I don't really know why the encoder isn't starting otherwise, and because it doesn't work with the autostart option... So I hope it won't break again if YouTube makes another change to their encoder.
So, if you have to work with the YouTube API, good luck, guys!
While the subscription count in
www.googleapis.com/youtube/v3/channels?part=statistics
seems to be updated instantly, the view counts only update about once a day.
A workaround that I found was to list all videos in the "uploaded" playlist with
www.googleapis.com/youtube/v3/playlistItems?part=contentDetails
and iterate through them, calling
www.googleapis.com/youtube/v3/videos?part=statistics
for each one. This seems to give the most accurate results, though it costs more than 3 quota units per uploaded video, which uses up my quota relatively fast.
Is there a faster way around the problem?
I would like to implement this on an ESP8266, so it would be preferable not to require much storage or processing power.
You can get the viewer count from liveStreamingDetails. The liveStreamingDetails object contains metadata about a live video broadcast; it is only present in a video resource if the video is an upcoming, live, or completed live broadcast. Under it you will find concurrentViewers, the number of viewers currently watching the broadcast. The property and its value are present only if the broadcast has current viewers and the broadcast owner has not hidden the view count for the video.
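For example, reading that property defensively (it is absent for non-broadcast videos, for hidden view counts, and when there are no current viewers):

```python
def concurrent_viewers(video_resource):
    """Return the live viewer count from a videos.list item, or None if unavailable."""
    details = video_resource.get("liveStreamingDetails")
    if not details:
        return None  # not an upcoming, live, or completed broadcast
    viewers = details.get("concurrentViewers")  # a string per the API; absent if hidden
    return int(viewers) if viewers is not None else None
```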
EDIT
Specific to your use case, I believe a two-part API call would help.
First call the Search endpoint to retrieve all of the channel's videos. Each Search resource has an id.videoId, which you concatenate into a videos.list call. That returns the statistics.viewCount of each video, and you add these up to get the total channel view count.
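A sketch of that two-part flow (endpoints and part/field names are from the Data API; paging and error handling omitted, and the helper names are mine). Note that videos.list accepts up to 50 comma-separated IDs per call, which keeps the quota cost well below one call per video:

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

BASE = "https://www.googleapis.com/youtube/v3"

def get_json(endpoint, **params):
    with urlopen(f"{BASE}/{endpoint}?{urlencode(params)}") as resp:
        return json.load(resp)

def channel_video_ids(channel_id, key):
    """search.list: collect the channel's video IDs (first page only shown here)."""
    data = get_json("search", part="id", channelId=channel_id,
                    type="video", maxResults=50, key=key)
    return [item["id"]["videoId"] for item in data["items"]]

def sum_view_counts(items):
    """Sum statistics.viewCount over a list of video resources."""
    return sum(int(item["statistics"]["viewCount"]) for item in items)

def total_view_count(video_ids, key):
    """videos.list in batches of up to 50 comma-joined IDs; sum the view counts."""
    total = 0
    for i in range(0, len(video_ids), 50):
        batch = ",".join(video_ids[i:i + 50])
        total += sum_view_counts(get_json("videos", part="statistics",
                                          id=batch, key=key)["items"])
    return total
```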
Hopefully this helps with your inquiry.
Happy coding!
I'm using a GitHub repo, Player, to play back video in an app. I'm trying to better understand the code and AVFoundation in general:
If I set an NSURL for an AVURLAsset with a remote video URL and hand it to the AVPlayer's AVPlayerItem, is it streaming the data from the remote URL? My guess is that this is true for the first play (and that it isn't downloaded all at once and then played; please correct me if I'm wrong).
And then if I continuously loop the video that I started playing (by calling seekToTime with kCMTimeZero once it has ended), am I causing the AVPlayer/AVAsset to re-stream/re-download the file every time it loops? Or is it cached until the AVPlayer/AVAsset is released?
If anyone could help me answer or point me to the right Apple docs, I would appreciate it! Thanks!
Another similar (?) question said AVAssetResourceDownloader, but I'm not looking to download the file to local disk (if that's what it does).
You don't download the file; you fill the AVPlayer buffer (a sort of cache).
If you seek to zero, you don't re-download the file, since you already have the buffer.
You can compare the AVPlayer buffer to YouTube's.
I have access to a proxy server, and I can find out the time a video was requested. The log has the form (time, IP, URL). I want to somehow figure out how many seconds a particular user at IP address A watched a YouTube video. Any suggestions?
If you only have access to requests, you obviously can't tell the difference between someone merely loading a video and actually watching it.
So, the best you can do is to come up with a set of heuristics that tries to 'guess' it by observing certain actions of the user. Here are a few ideas:
Does your log count the requests for the video buffer itself? If it does, you can see how much of the video was actually loaded, and the watched time can't exceed that.
If you (quite naively, I guess) assume that they're finished watching when they request another video URL, you can use this as your trigger for ending a 'video session'.
Install Wireshark or similar and start watching activity from YouTube during the video. Can you identify if there's a request when advertising is shown, or the related videos are displayed when the video finishes?
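The second idea above can be sketched over (time, IP, URL) log entries. Everything here is an assumption: the "session ends at the next video request" rule is the naive part, and the cap value is arbitrary:

```python
from collections import defaultdict

def watch_estimates(log, session_cap=600):
    """Estimate seconds watched per (ip, url) from proxy log tuples.

    `log` is a list of (unix_time, ip, url) tuples in any order. A session's
    length is the gap until the same IP's next video request, capped at
    session_cap seconds because the last gap is unbounded (the user may
    simply have closed the browser).
    """
    by_ip = defaultdict(list)
    for t, ip, url in sorted(log):
        by_ip[ip].append((t, url))
    estimates = {}
    for ip, requests in by_ip.items():
        # Gap to the next request from the same IP, capped.
        for (t, url), (t_next, _) in zip(requests, requests[1:]):
            estimates[(ip, url)] = min(t_next - t, session_cap)
        last_t, last_url = requests[-1]
        estimates[(ip, last_url)] = session_cap  # no next request: use the cap
    return estimates
```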
In all honesty, though, I think it will be virtually impossible to derive such a specific metric as seconds watched from data as limited as the point in time a video was requested. Just think of what could mess up any strategy you come up with: the user could load several videos in different tabs in a burst, or he could load a video page, pause it, and forget it for several minutes or hours before actually watching it.
In short: I don't think you'll get a reliable guess using only the data you have. But if you absolutely must try, observing network activity between the client and YouTube that only happens while a video is in the playing state (pulling ads, related videos, some sort of internal YouTube logging, etc.) is probably your best bet. Even that probably won't get you anywhere near second-level granularity, though.