I am trying to play a video on iOS using AVPlayer that has been encoded with Zencoder. The problem I am seeing is that the duration the player item reports is rounded / imprecise. For example, the video duration might be 173.134, but the player item reports a flat 174.0. This causes problems with detecting the loaded percentage and other related things. If I use the video without encoding, everything is reported correctly and precisely.
Has anyone else ever seen this or have a solution?
I had the same problem. I just check whether the difference between the current position and the item duration is less than one second:
- (void)playing:(CMTime)time
{
    CMTime itemDuration = _player.currentItem.asset.duration;
    NSTimeInterval currentTime = CMTimeGetSeconds(time);
    NSTimeInterval duration = CMTimeGetSeconds(itemDuration);
    if (fabs(currentTime - duration) < 1) {
        // This is the end.
    }
}
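For reference, this assumes playing: is driven by a periodic time observer on the player, registered with something along these lines (the half-second interval is just an example):
__weak typeof(self) weakSelf = self;
// Keep the returned token around if you ever need to call -removeTimeObserver:.
id timeObserver = [_player addPeriodicTimeObserverForInterval:CMTimeMakeWithSeconds(0.5, NSEC_PER_SEC)
                                                        queue:dispatch_get_main_queue()
                                                   usingBlock:^(CMTime time) {
    [weakSelf playing:time];
}];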
The problem turned out to be with the source video / Zencoder. The audio track was a slightly different length than the video track, which caused problems during encoding. Cutting off the last second of the video so that the track durations matched up fixed the problem.
I'm running into some strange issues with AVQueuePlayer. I'll first explain the purpose of my implementation, then go over the issues I am hitting.
Running: iPhone5s, iOS 10.1.1, LTE connection
Video: progressively downloaded .mp4, 5 MB, 4-second duration.
The purpose of the implementation is to play a progressively downloaded video that loops seamlessly. The video won't contain any sounds.
I'm using an AVQueuePlayer (supporting iOS 9 and up) to loop videos. I set it up the way Apple recommends, which is to listen for the current player item to change and then move that item to the end of the queue.
https://developer.apple.com/library/content/samplecode/avloopplayer/Listings/Projects_VideoLooper_VideoLooper_QueuePlayerLooper_swift.html#//apple_ref/doc/uid/TP40014695-Projects_VideoLooper_VideoLooper_QueuePlayerLooper_swift-DontLinkElementID_11
I am hitting 2 issues.
Issue 1: My designer gave me a video that contains a video track and an audio track. Durations are the same. I am able to track the buffer progress by checking the current player item loadedTimeRanges. However, when the video loops, it isn't seamless. So we tried a video without the audio track and hit Issue 2.
Issue 2: Testing the same video, but this video contains only a video track. The video loops amazingly. It's seamless. However, when checking the loadedTimeRanges to track the buffer progress, the duration will remain 0 until the video has completely loaded. Then the duration will report the total duration of the video.
Is Issue 2 a bug? I also find it strange that removing the audio track creates a much more seamless loop.
I've provided my code below that is used to check seconds buffered. Note that it will return a duration of 0 if the playerItem.loadedTimeRanges.first.timeRangeValue doesn't exist. I can confirm that value does exist and the duration is properly returned when testing both issues.
public var secondsBuffered: Float64 {
    if let playerItem = self.player?.currentItem {
        if let loadedTimeRange = playerItem.loadedTimeRanges.first?.timeRangeValue {
            let duration: Float64 = CMTimeGetSeconds(loadedTimeRange.duration)
            return duration
        }
    }
    return 0
}
I have a progress view that represents the progress of an audio track as it is played. 1.0 represents 100% completion of that progress view, of course. What I need to accomplish is the progress view's progress gradually increasing throughout the duration of the audio track; it needs to reach 100% progress (progress = 1.0) at the same time the audio track finishes. I do have the duration value of the audio track as well. What I'm confused about is the math, or how to gradually increase the progress of the progress view based on the duration of the audio track.
Note: I will have an array of audio tracks that have different durations. Also, I am using AVPlayer to play these audio tracks.
Any help would be greatly appreciated. Thanks in advance.
After doing some more digging, I believe approaching it this way will work:
AVPlayerItem *currentItem = yourAVPlayer.currentItem;
CMTime duration = currentItem.duration; //total time
CMTime currentTime = currentItem.currentTime; //playing time
And then in the update method...
// CMTime values can't be divided directly; convert them to seconds first.
progressView.progress = (float)(CMTimeGetSeconds(currentTime) / CMTimeGetSeconds(duration));
Can someone please confirm whether this will work, as I am not at my workstation at the moment?
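In case it helps, here is a rough sketch of driving that update from a periodic time observer (assuming yourAVPlayer from above and a UIProgressView named progressView; the interval is just an example):
[yourAVPlayer addPeriodicTimeObserverForInterval:CMTimeMakeWithSeconds(1.0 / 30.0, NSEC_PER_SEC)
                                           queue:dispatch_get_main_queue()
                                      usingBlock:^(CMTime time) {
    // Re-read the duration on each callback and guard against an unknown/zero duration.
    NSTimeInterval duration = CMTimeGetSeconds(yourAVPlayer.currentItem.duration);
    if (duration > 0) {
        progressView.progress = (float)(CMTimeGetSeconds(time) / duration);
    }
}];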
A simpler approach is to allow the maximum and minimum values of the progress view to be set. For example, in my app, I use a UISlider to show progress. When the user selects a track to play, I set the minimum and maximum values of the slider as follows:
_slider.minimumValue = 0;
_slider.maximumValue = itemDuration;
The progress value is then just the current time of the player.
[_slider setValue:currentPlayerTime animated:YES];
You do not say if your progress view uses a UISlider, but even if it does not, you can probably use this approach.
I am attempting to stitch together video assets using AVComposition based on the code here:
https://developer.apple.com/library/mac/samplecode/AVCompositionDebugViewer/Introduction/Intro.html
On OS X it works perfectly; however, on iOS, when playing back via AVPlayer, it only works with 1 or 2 input clips. If I attempt to add a third, nothing is played back on the AVPlayerLayer. Weirdly, if I observe the AVPlayer playback time using addPeriodicTimeObserverForInterval, the video appears to be playing for the correct duration, but nothing plays back on the layer.
Does anyone have any insight into why this would be?
Turns out I was creating CMTime objects with differing timeScale values, which was causing rounding errors and creating gaps in my tracks. If a track had a gap then it would just fail to play. Ensuring that all my CMTime objects had the same timeScale made everything work perfectly.
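For anyone hitting the same thing, here is a rough sketch of a gap-free insertion pattern (the compositionVideoTrack and assets names are assumptions for the example); reusing the tracks' own CMTime durations and accumulating the insertion point with CMTimeAdd keeps every time value in a consistent scale instead of re-rounding from seconds:
CMTime insertionPoint = kCMTimeZero;
NSError *error = nil;
for (AVAsset *asset in assets) {
    AVAssetTrack *sourceTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    // Insert the clip and advance the cursor by the track's exact duration,
    // so the next clip starts exactly where this one ends (no rounding gaps).
    [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, sourceTrack.timeRange.duration)
                                   ofTrack:sourceTrack
                                    atTime:insertionPoint
                                     error:&error];
    insertionPoint = CMTimeAdd(insertionPoint, sourceTrack.timeRange.duration);
}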
I am playing an audio clip using AVAudioPlayer. I want to know if there is a way to get progress on how much time is left or has been played.
I want to animate the slider control while the audio is being played in a view. Is this possible?
You should read AVAudioPlayer's documentation. You'll find that the player has two properties that pertain to this: currentTime and duration.
audioPlayer.currentTime
audioPlayer.duration
Then, assuming that you're leaving your slider's min and max values at their default settings (0–1), you can determine progress by simply dividing currentTime by duration.
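For example, a minimal sketch (assuming an AVAudioPlayer property audioPlayer, a UISlider property slider left at its default 0–1 range, and a timer property updateTimer; the names are just for illustration):
// Start a repeating timer when playback begins, e.g. in your play action:
self.updateTimer = [NSTimer scheduledTimerWithTimeInterval:1.0 / 30.0
                                                    target:self
                                                  selector:@selector(updateSlider)
                                                  userInfo:nil
                                                   repeats:YES];

- (void)updateSlider
{
    // currentTime and duration are both NSTimeInterval values in seconds.
    if (self.audioPlayer.duration > 0) {
        [self.slider setValue:self.audioPlayer.currentTime / self.audioPlayer.duration animated:YES];
    }
}
Remember to invalidate the timer when playback stops or the view goes away.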
I am trying to synchronize several CABasicAnimations with AVAudioPlayer. The issue I have is that CABasicAnimation uses CACurrentMediaTime() as a reference point when scheduling animations while AVAudioPlayer uses deviceCurrentTime. Also for the animations, CFTimeInterval is used, while for sound it's NSTimeInterval (not sure if they're "toll free bridged" like other CF and NS types). I'm finding that the reference points are different as well.
Is there a way to ensure that the sounds and animations use the same reference point?
I don't know the "official" answer, but they are both double precision floating point numbers that measure a number of seconds from some reference time.
From the docs, it sounds like deviceCurrentTime is linked to the current audio session:
The time value, in seconds, of the audio output device. (read-only)
@property(readonly) NSTimeInterval deviceCurrentTime

Discussion
The value of this property increases monotonically while an audio player is playing or paused.
If more than one audio player is connected to the audio output device, device time continues incrementing as long as at least one of the players is playing or paused.
If the audio output device has no connected audio players that are either playing or paused, device time reverts to 0.
You should be able to start an audio output session, call CACurrentMediaTime() and then read the player's deviceCurrentTime in two sequential statements, and calculate an offset constant to convert between them. That offset would be accurate to within a few nanoseconds.
The offset would only be valid while the audio output session is active. You'd have to recalculate it each time you remove all audio players from the audio session.
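In code, the idea might look like this (sketch only; player is assumed to be an AVAudioPlayer that is already playing or paused so that deviceCurrentTime is advancing, animation is a CABasicAnimation, and delay is whatever lead time you want before the synchronized start):
// Sample both clocks back to back while the audio session is active and keep the offset.
NSTimeInterval offset = player.deviceCurrentTime - CACurrentMediaTime();

// Schedule the animation and the audio against their respective clocks using that offset.
CFTimeInterval startTime = CACurrentMediaTime() + delay;
animation.beginTime = startTime;            // CABasicAnimation timing is based on CACurrentMediaTime()
[player playAtTime:startTime + offset];     // AVAudioPlayer scheduling is based on deviceCurrentTime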
I think the official answer just changed, though currently under NDA.
See "What's New in Camera Capture", in particular the last few slides about the CMSync* functions.
https://developer.apple.com/videos/wwdc/2012/?id=520