For this video:
http://www.youtube.com/watch?v=3Hn3ISdjdK0
YouTube displays a duration of 14 seconds, and a call to the GData API also gives a 14-second duration.
However, using the YouTube player API's getDuration(), I sometimes get 13.28 seconds:
var videoDuration = flashPlayer.getDuration();
Why the discrepancy?
This is how I construct the flashPlayer:
elements.container.flash({
    swf: 'http://www.youtube.com/apiplayer?enablejsapi=1&version=3&start=' + settings.start,
    id: 'video_' + settings.safeID,
    height: settings.height,
    width: settings.width,
    allowScriptAccess: 'always',
    wmode: 'transparent',
    flashvars: {
        "video_id": settings.videoID,
        "playerapiid": settings.safeID
    }
});
It seems that YouTube simply rounds the duration up: it's more accurate to call a 13.28 s video 14 seconds long than 13 seconds, since it is in fact longer than 13 seconds.
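For display purposes the same rounding is easy to reproduce. A minimal Swift sketch; note that the ceil-rather-than-round behavior is an inference from the 13.28 s → 14 s observation above, not documented YouTube behavior:
import Foundation

let rawDuration = 13.28                         // seconds, as reported by getDuration()
let displayedDuration = Int(ceil(rawDuration))  // 14, matching the displayed duration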
I am trying to get the duration of an AVPlayer asset in hours, minutes, and seconds. I am able to get the time, but it seems to be in seconds and milliseconds.
This is how I get the time:
let duration : CMTime = player.currentItem!.asset.duration
let seconds : Float64 = CMTimeGetSeconds(duration)
I am then applying that to a slider using
slider.maximumValue = Float(seconds)
The outcome of this obviously gives me the duration in seconds, but I want to be able to use the duration to set the maximumValue of my slider for video clips that may be under a minute.
For example: my code above returns 30.865 for a 30-second clip. I need it to return 0.30.
This ended up working for me:
let duration : CMTime = player.currentItem!.asset.duration
let seconds = CMTimeGetSeconds(duration)   // value/timescale, i.e. plain seconds
let timeInMinutes = Float(seconds / 60)
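If the goal is an hours/minutes/seconds display rather than raw seconds, here is a minimal sketch; formattedDuration is a name of my own, and it assumes player is an AVPlayer with a loaded item:
import AVFoundation

func formattedDuration(for player: AVPlayer) -> String? {
    guard let duration = player.currentItem?.asset.duration,
          duration.isNumeric else { return nil }
    let totalSeconds = Int(CMTimeGetSeconds(duration).rounded())
    let hours = totalSeconds / 3600
    let minutes = (totalSeconds % 3600) / 60
    let seconds = totalSeconds % 60
    // e.g. 30.865 s -> "00:00:31"
    return String(format: "%02d:%02d:%02d", hours, minutes, seconds)
}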
I'm using AVPlayer to play a live stream. The stream supports one hour of catch-up, which means the user can seek back up to one hour and play from there. But I have one question: how do I know the exact position the player is playing? I need to display the current position on the player view. For example, if the user is watching content from half an hour ago, display -30:00; if the user is watching the latest content, the player should show 00:00 or Live. Thanks
Swift solution:
func getLiveDuration() -> Float {
    var result: Float = 0.0
    if let items = player.currentItem?.seekableTimeRanges, !items.isEmpty {
        let timeRange = items[items.count - 1].timeRangeValue
        let startSeconds = CMTimeGetSeconds(timeRange.start)
        let durationSeconds = CMTimeGetSeconds(timeRange.duration)
        result = Float(startSeconds + durationSeconds)
    }
    return result
}
To get the live position and seek to it, you can use the seekableTimeRanges of AVPlayerItem:
CMTimeRange seekableRange = [player.currentItem.seekableTimeRanges.lastObject CMTimeRangeValue];
CGFloat seekableStart = CMTimeGetSeconds(seekableRange.start);
CGFloat seekableDuration = CMTimeGetSeconds(seekableRange.duration);
CGFloat livePosition = seekableStart + seekableDuration;
[player seekToTime:CMTimeMakeWithSeconds(livePosition, 600)]; // CMTimeMake(livePosition, 1) would silently truncate the CGFloat to whole seconds
Also, when you have sought back in time, you can get the current playing position by calling the currentTime method:
CGFloat current = CMTimeGetSeconds([self.player.currentItem currentTime]);
CGFloat diff = livePosition - current;
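To turn that diff into the -30:00 / Live display the question asks for, here is a small Swift sketch; livePositionLabel is a made-up name and the 10-second live threshold is an arbitrary assumption:
func livePositionLabel(secondsBehindLive diff: Double) -> String {
    // Within a few seconds of the live edge, just show "Live".
    guard diff > 10 else { return "Live" }
    let total = Int(diff)
    return String(format: "-%02d:%02d", total / 60, total % 60)
}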
I know this question is old, but I had the same requirement, and I believe the existing answers don't properly address the intent of the question.
What I did for this requirement was to gather the current point in time, the starting time, and the total duration of the stream.
I'll explain something before going further: the current point in time can surpass (starting time + total duration). This is due to the way HLS is structured into TS segments. TS segments are small chunks of playable video; your seekable range might hold 5 TS segments of 10 seconds each. That doesn't mean 50 seconds is the full length of the live stream; there is roughly one more full segment (so about 60 seconds of playtime total), but it isn't categorized as seekable since you shouldn't seek into it. If you do, you'll notice rebuffering in most instances (the source may still be creating the next TS segment when you have already reached the end of playback).
What I did was check whether the current stream time is beyond the seekable range; if so, we are live on stream. If it isn't, you can easily calculate how far behind live you are by subtracting the starting time and the total duration from the current time.
if let timeRange = player.currentItem?.seekableTimeRanges.last?.timeRangeValue {
    let start = timeRange.start.seconds
    let totalDuration = timeRange.duration.seconds
    let currentTime = player.currentTime().seconds
    let secondsBehindLive = currentTime - totalDuration - start
}
The code above gives you a negative number of seconds behind "live" (or, more specifically, behind the start of the latest TS segment), and zero or a positive number when you are playing the latest TS segment.
To be honest, I don't really know when seekableTimeRanges will have more than one value; it has always been just one for the streams I have tested with. But if your streams return more than one value, you may have to decide whether to add up all the ranges' durations, which range to use as the start value, and so on. At least for my use case, this was enough.
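If you do ever see multiple ranges, one conservative option (an assumption on my part, not behavior I have observed) is to take the earliest start and sum the durations:
let ranges = player.currentItem?.seekableTimeRanges.map { $0.timeRangeValue } ?? []
let earliestStart = ranges.map { $0.start.seconds }.min() ?? 0
let totalSeekable = ranges.map { $0.duration.seconds }.reduce(0, +)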
I can see nominalFrameRate for some video tracks in the AVFoundation docs, but nothing for the current frame. How can I get the current frame number of the track as it is played in an AVPlayer? I know frame rates vary, and nominalFrameRate is always 0.0 in .m3u8 streams, but surely there must be a way to get the frame number of the currently playing track without having to multiply nominalFrameRate by currentTime?
Thanks.
For iOS 7+ you can use the currentVideoFrameRate property of AVPlayerItemTrack. It's the only consistent property I've seen for measuring FPS. The nominalFrameRate property seems to be broken in HLS streams and always returns 0.0, as you mentioned.
AVPlayerItem *item = player.currentItem; // your AVPlayer's current item
float fps = 0.00;
for (AVPlayerItemTrack *track in item.tracks) {
    if ([track.assetTrack.mediaType isEqualToString:AVMediaTypeVideo]) {
        fps = track.currentVideoFrameRate;
    }
}
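AVFoundation still exposes no frame counter, so any "current frame" is an estimate. Here is a Swift sketch built on the same property; approximateFrameNumber is a made-up name, and note this is still the fps-times-currentTime multiplication the question hoped to avoid, just with currentVideoFrameRate instead of nominalFrameRate:
import AVFoundation

func approximateFrameNumber(of player: AVPlayer) -> Int? {
    guard let item = player.currentItem else { return nil }
    // Measured frame rate of the currently playing video track.
    let fps = item.tracks
        .first { $0.assetTrack?.mediaType == .video }?
        .currentVideoFrameRate ?? 0
    guard fps > 0 else { return nil }
    return Int(CMTimeGetSeconds(item.currentTime()) * Double(fps))
}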
Until recently, one could use #t=2m0s or #t=120 to set the start time for direct links:
http://www.youtube.com/watch?v=Fk2bUvrv-Uc#t=2m30s
as well as for embed videos:
<iframe width="420" height="315"
        src="http://www.youtube.com/embed/Fk2bUvrv-Uc#t=2m30s"
        frameborder="0" allowfullscreen></iframe>
Now it seems YouTube has dropped #t start-time support and the above no longer works. How can I now link to videos with a particular start time?
Embed videos
It looks like a different parameter is used now: start=<number of seconds> (see this blog and the documentation).
Example:
<iframe width="420" height="315" frameborder="0" allowfullscreen
src="http://www.youtube.com/embed/Nc9xq-TVyHI?start=110&end=119"></iframe>
Direct links
For direct links, it is enough to simply replace # with &:
http://www.youtube.com/watch?v=Fk2bUvrv-Uc&t=2m30s
You can use &t= instead of #t=
YouTube dropped #t start-time support for embedded videos. Now we need to either use the start=N&end=M parameters (in seconds) directly, according to their new API, or write some extra code which transforms our old #t=(n)m(n)s notation into the new API-compatible style.
I was able to get the video clip to work based on the start and end, but the exact code needs to be as follows:
?start=x&end=y
where x is the start time in seconds and y is the end time in seconds.
If the '?' isn't present, it doesn't work. Also, the parameters need to be placed directly after the URL WITHOUT any space, or it doesn't work.
I had a large number of videos chapterised using this method with a jQuery plugin. I've adapted the plugin so that you can continue using the (n)m(n)s notation. You could change this fairly easily to grab the #t value from the URL instead of the data attribute I'm storing it in.
(function( $ ) {
    $.fn.videoChapters = function(iframeID) {
        if (typeof window.orig_video === 'undefined') {
            window.orig_video = [];
        }
        window.orig_video[iframeID] = $('#' + iframeID).clone();
        var chapterContainer = this;
        $(this).find('a').on('click', function(e) {
            e.preventDefault();
            $('#' + iframeID).remove();
            $(chapterContainer).prevAll('.video-iframe').first().find('h3').after(window.orig_video[iframeID]);
            var video = $('#' + iframeID);
            if (typeof window.video_source === 'undefined') {
                window.video_source = [];
            }
            if (typeof window.video_source[iframeID] === 'undefined') {
                window.video_source[iframeID] = $(video).attr('src');
            }
            /* To use it with a normal anchor, change this to something like the below;
             * of course it will vary by URL, so it'd be better to use a regex,
             * but I'll let someone else worry about that.
             *
             * var str = $(e.target).attr('href').split('#t')[1];
             * var nstr = str.split("s")[0];
             * var mstr = nstr.split("m");
             * var min = Number(mstr[0]);
             * var sec = Number(mstr[1]);
             */
            var seconds = $(e.target).data('seconds');
            /* YouTube dropped the old min/sec format; convert what's there to pure seconds.
             * String() guards against data() having parsed the attribute as a number. */
            var splsec = String(seconds).split('m');
            var min = Number(splsec[0]);
            var sec = Number(splsec[1].split('s')[0]);
            min = min * 60;
            var total = min + sec;
            var newSource = window.video_source[iframeID] + "?start=" + total + "&autoplay=1&rel=0";
            $(video).attr('src', newSource);
        });
    };
})( jQuery );
The start and end times using the iframe embed code didn't work in my test. What works for me is this code:
From:
http://www.youtube.com/embed/[VIDEOID]
To:
http://www.youtube.com/v/[VIDEOID]&start=[SECONDS]&end=[SECONDS]
<object width="640" height="385">
    <param name="movie" value="http://www.youtube.com/v/[VIDEOID]&start=100&end=500" />
    <param name="allowscriptaccess" value="always" />
    <embed src="http://www.youtube.com/v/[VIDEOID]&start=100&end=500"
           type="application/x-shockwave-flash" allowscriptaccess="always"
           width="640" height="385" />
</object>
It seems to work (again?) with #t=, at least for direct URLs.
The stated URL:
https://www.youtube.com/watch?v=Fk2bUvrv-Uc#t=2m30s
works, that is, it jumps to 2 min 30 sec.
I tested this on Firefox, Chrome, and IE11.
But it is probably better to use &t=, since & is the standard URL argument separator.
Use
/watch?v=<videoID>&t=<seconds>
with
<videoID> = the ID of the video
<seconds> = the number of seconds to skip
Examples:
https://m.youtube.com/watch?t=58&v=9W35QxCZnK4
Video starts at 58 seconds
or
https://www.youtube.com/watch?t=120&v=9W35QxCZnK4
Video starts at 2 minutes
I'm trying to figure out how to retrieve a video's frame rate via AVPlayer. AVPlayer has a rate variable, but it only returns a value between 0 and 2 (usually 1 when playing). Does anybody have an idea how to get the video frame rate?
Cheers
Use AVAssetTrack's nominalFrameRate property.
The method below gets the frame rate; here queuePlayer is an AVPlayer:
-(float)getFrameRateFromAVPlayer
{
    float fps = 0.00;
    AVAsset *videoAsset = self.queuePlayer.currentItem.asset; // asset of the currently playing item
    if (videoAsset) {
        AVAssetTrack *videoATrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] lastObject];
        if (videoATrack) {
            fps = videoATrack.nominalFrameRate;
        }
    }
    return fps;
}
Swift 4 version of the answer:
let asset = avplayer.currentItem?.asset
let tracks = asset?.tracks(withMediaType: .video)
let fps = tracks?.first?.nominalFrameRate
Remember to handle nil checking.
There seems to be a discrepancy in the nominalFrameRate returned for the same media on different versions of iOS. I have a video encoded with ffmpeg at 1 frame per second (125 frames, keyframes every 25 frames); when loaded in an app on iOS 7.x, the (nominal) frame rate is 1.0, while on iOS 8.x it is 0.99.
This seems like a very small difference; however, in my case I need to navigate precisely to a given frame in the movie, and this difference breaks such navigation (the movie is an encoding of a sequence of presentation slides). Given that I already know the frame rate of the videos my app needs to play (e.g. 1 fps), I can simply rely on that value instead of determining the frame rate dynamically via nominalFrameRate, but I wonder WHY there is such a discrepancy between iOS versions. Any ideas?
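For what it's worth, when the true rate is known up front (as in the 1 fps case above), frame-accurate seeking can bypass nominalFrameRate entirely. A sketch with hypothetical values; player is assumed to be the AVPlayer in question:
import AVFoundation

let fps: Int32 = 1            // known encode rate; deliberately NOT read from nominalFrameRate
let frameIndex: Int64 = 42    // hypothetical target frame
let target = CMTimeMake(value: frameIndex, timescale: fps)
player.seek(to: target, toleranceBefore: .zero, toleranceAfter: .zero)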
The rate value on AVPlayer is the playback speed relative to real time, e.g. 0.5 is slow motion, 2 is double speed.
As Paresh Navadiya points out, a track also has a nominalFrameRate variable; however, this seems to sometimes give strange results. The best solution I've found so far is to use the following:
CMTime frameDuration = [myAsset tracksWithMediaType:AVMediaTypeVideo][0].minFrameDuration;
float fps = frameDuration.timescale/(float)frameDuration.value;
The above gives slightly unexpected results for variable frame rate, but variable frame rate has slightly odd behavior anyway. Other than that, it matches ffmpeg -i in my tests.
EDIT ----
I've found that the above sometimes gives kCMTimeZero. The workaround I've used is to create an AVAssetReader with a track output, get the PTS of the first and second frames, and subtract the two.
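Here is a minimal Swift sketch of that workaround; measuredFrameRate is my own name, and it assumes the first two sample buffers arrive in presentation order, which may not hold for streams with B-frames:
import AVFoundation

func measuredFrameRate(of asset: AVAsset) -> Float? {
    guard let track = asset.tracks(withMediaType: .video).first,
          let reader = try? AVAssetReader(asset: asset) else { return nil }
    let output = AVAssetReaderTrackOutput(track: track, outputSettings: nil)
    reader.add(output)
    // The PTS difference between the first two frames is one frame duration.
    guard reader.startReading(),
          let first = output.copyNextSampleBuffer(),
          let second = output.copyNextSampleBuffer() else { return nil }
    let delta = CMTimeGetSeconds(CMTimeSubtract(
        CMSampleBufferGetPresentationTimeStamp(second),
        CMSampleBufferGetPresentationTimeStamp(first)))
    return delta > 0 ? Float(1.0 / delta) : nil
}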
I don't know of anything in AVPlayer that can help you calculate the frame rate.
The AVPlayer rate property is the playback rate, nothing to do with the frame rate.
The easier option is to obtain an AVAssetTrack and read its nominalFrameRate property. Just create an AVAsset and you'll get an array of tracks.
Or use AVAssetReader to read the video frame by frame, get its presentation time, and count how many frames fall in the same second, then average over a few seconds or the whole video.
This is not going to work anymore; the API has changed, and this post is old. :(
The Swift 4 answer is also cool; this answer is similar.
You get the video track from the AVPlayerItem, and you check the FPS there. :)
private var numberOfRenderingFailures = 0

func isVideoRendering() -> Bool {
    guard let currentItem = player.currentItem else { return false }
    // Check if we are playing video tracks
    let isRendering = currentItem.tracks.contains { ($0.assetTrack?.mediaType == .video) && ($0.currentVideoFrameRate > 5) }
    if isRendering {
        numberOfRenderingFailures = 0
        return true
    }
    numberOfRenderingFailures += 1
    if numberOfRenderingFailures < 5 {
        return true
    }
    return false
}