iOS PictureInPicture with AVSampleBufferDisplayLayer seek forward disabled

I have an AVPictureInPictureController that should display a static image while playing audio. I have created an AVSampleBufferDisplayLayer whose CMSampleBuffer contains the image, and I have implemented the two necessary delegates, AVPictureInPictureSampleBufferPlaybackDelegate and AVPictureInPictureControllerDelegate.
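For context, the controller is created from the layer roughly like this (a minimal sketch, assuming iOS 15+ and that sampleBufferDisplayLayer already has the image buffer enqueued):
import AVKit

let contentSource = AVPictureInPictureController.ContentSource(
    sampleBufferDisplayLayer: sampleBufferDisplayLayer,
    playbackDelegate: self // AVPictureInPictureSampleBufferPlaybackDelegate
)
let pictureInPictureController = AVPictureInPictureController(contentSource: contentSource)
pictureInPictureController.delegate = self // AVPictureInPictureControllerDelegate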
I am setting
pictureInPictureController.requiresLinearPlayback = false
so the Picture in Picture window shows the seek back, play/pause, and seek forward buttons.
Here is my function for the time range:
func pictureInPictureControllerTimeRangeForPlayback(_ pictureInPictureController: AVPictureInPictureController) -> CMTimeRange {
    let duration = player.state.duration
    if duration == 0 {
        return CMTimeRange(start: .negativeInfinity, duration: .positiveInfinity)
    }
    return CMTimeRange(
        start: CMTime(seconds: .zero, preferredTimescale: 10_000),
        duration: CMTimeMakeWithSeconds(duration, preferredTimescale: 10_000)
    )
}
My problem is that seek forward is disabled in the Picture in Picture window. Seeking back works just fine, as does play/pause, but seek forward is disabled and I haven't figured out why. Any thoughts?
I tried calculating the CMTimeRange differently, but the behaviour is always the same.

I found this project https://github.com/getsidetrack/swiftui-pipify/blob/main/Sources/PipifyController.swift#L250 where the CMTimeRange is calculated from CACurrentMediaTime(). Although I am not sure why such a large time range is calculated there (something like one week in seconds), CACurrentMediaTime() was the right hint: the range is apparently evaluated against the layer's timebase, which follows the host clock, so it has to be anchored around CACurrentMediaTime(). A range starting at zero ends long before "now", which leaves nothing ahead of the playhead and disables seek forward.
So this is how pictureInPictureControllerTimeRangeForPlayback looks now:
func pictureInPictureControllerTimeRangeForPlayback(
    _ pictureInPictureController: AVPictureInPictureController
) -> CMTimeRange {
    if player.duration == 0 {
        return CMTimeRange(start: .negativeInfinity, duration: .positiveInfinity)
    }
    let currentTime = CMTime(
        seconds: CACurrentMediaTime(),
        preferredTimescale: 60
    )
    let currentPosition = CMTime(
        seconds: state.position,
        preferredTimescale: 60
    )
    return CMTimeRange(
        start: currentTime - currentPosition,
        duration: CMTime(
            seconds: player.duration,
            preferredTimescale: 60
        )
    )
}

Related

UISlider as audio seekbar jumping to maximum value when changed

I am using a UISlider as a seek bar for audio, and it works great for changing position in the track when it is not animated. When animated, it tracks along the bar in time with the track perfectly, but if you try to adjust it while the animation is active, it jumps to the maximum value of the bar. I assume there is a conflict between the two processes, but I'm struggling to work out a fix.
func changeProgressBar() {
    let trackLength = Float(AudioService.shared.playerItem?.duration.seconds ?? 0)
    Timer.scheduledTimer(withTimeInterval: 0.5, repeats: true) { _ in
        let currentTime = Float(AudioService.shared.player?.currentTime().seconds ?? 0)
        let sliderPosition = currentTime / (trackLength / 100)
        self.progressBar.setValue(sliderPosition, animated: true)
        print("the current time is", currentTime)
        print("the slider position is", sliderPosition)
    }
}
@IBAction func progressBarValueChanged(_ sender: UISlider) {
    let trackLength = AudioService.shared.playerItem?.duration.seconds ?? 0
    let sliderValueFloat = progressBar.value * 100.00
    let sliderValueDouble = Double(sliderValueFloat)
    let targetTime = (trackLength / 100 * sliderValueDouble)
    let targetTimeActual = CMTimeMake(value: Int64(targetTime), timescale: 1)
    AudioService.shared.player!.seek(to: targetTimeActual)
}
I have buttons that add or subtract 30 seconds to skip forward or back in the track and they work fine even when the animation is active.
@IBAction func plus30Secs(_ sender: UIButton) {
    let currentTime = Float(AudioService.shared.player?.currentTime().seconds ?? 0)
    let seekTime = currentTime + 30
    let seekTimeActual = CMTimeMake(value: Int64(seekTime), timescale: 1)
    AudioService.shared.player!.seek(to: seekTimeActual)
}
@IBAction func minus30Secs(_ sender: UIButton) {
    let currentTime = Float(AudioService.shared.player?.currentTime().seconds ?? 0)
    let seekTime = currentTime - 30
    let seekTimeActual = CMTimeMake(value: Int64(seekTime), timescale: 1)
    AudioService.shared.player!.seek(to: seekTimeActual)
}
OK, I have fixed it.
The issue was that I had progressBar.maximumValue = 100, meaning that my progressBar.value * 100.00 was giving a value 100 times what it should have been, and as such a value beyond the end of the track. I removed the * 100.00 and now it works great.
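For reference, the same sync logic can be written without the percentage scaling at all. This is a sketch reusing the question's AudioService and progressBar; the isTracking guard is an extra safeguard against the timer fighting the user's drag, not part of the original fix:
func changeProgressBar() {
    // Use seconds directly as the slider's scale.
    progressBar.maximumValue = Float(AudioService.shared.playerItem?.duration.seconds ?? 0)
    Timer.scheduledTimer(withTimeInterval: 0.5, repeats: true) { _ in
        // Skip animated updates while the user is dragging the thumb.
        guard !self.progressBar.isTracking else { return }
        let currentTime = Float(AudioService.shared.player?.currentTime().seconds ?? 0)
        self.progressBar.setValue(currentTime, animated: true)
    }
}

@IBAction func progressBarValueChanged(_ sender: UISlider) {
    // The slider value is already in seconds.
    AudioService.shared.player?.seek(to: CMTimeMakeWithSeconds(Double(sender.value), preferredTimescale: 1))
}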

Swift: Trying to control time in AVAudioPlayerNode using UISlider

I'm using an AVAudioPlayerNode attached to an AVAudioEngine to play a sound.
To get the current time of the player, I'm doing this:
extension AVAudioPlayerNode {
    var currentTime: TimeInterval {
        if let nodeTime = self.lastRenderTime, let playerTime = self.playerTime(forNodeTime: nodeTime) {
            return Double(playerTime.sampleTime) / playerTime.sampleRate
        }
        return 0
    }
}
I have a slider that indicates the current time of the audio. When the user changes the slider value, on the .ended event I have to set the player's current time to the one indicated by the slider.
To do so:
extension AVAudioPlayerNode {
    func seekTo(value: Float, audioFile: AVAudioFile, duration: Float) {
        if let nodetime = self.lastRenderTime {
            let playerTime: AVAudioTime = self.playerTime(forNodeTime: nodetime)!
            let sampleRate = self.outputFormat(forBus: 0).sampleRate
            let newsampletime = AVAudioFramePosition(Int(sampleRate * Double(value)))
            let length = duration - value
            let framestoplay = AVAudioFrameCount(Float(playerTime.sampleRate) * length)
            self.stop()
            if framestoplay > 1000 {
                self.scheduleSegment(audioFile, startingFrame: newsampletime, frameCount: framestoplay, at: nil, completionHandler: nil)
            }
        }
        self.play()
    }
}
However, my seekTo function is not working correctly (printing currentTime before and after the call always shows a negative value, around -0.02). What am I doing wrong, and is there a simpler way to change the currentTime of the player?
I ran into the same issue. Apparently framestoplay was always 0, which happened because of the sample rate: the value of playerTime.sampleRate was always 0 in my case.
So
let framestoplay = AVAudioFrameCount(Float(playerTime.sampleRate) * length)
must be replaced with
let framestoplay = AVAudioFrameCount(Float(sampleRate) * length)
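Folded back into the original extension, the corrected seek looks roughly like this (a sketch under the same assumptions as the question's code; only the sample-rate source changes):
extension AVAudioPlayerNode {
    func seekTo(value: Float, audioFile: AVAudioFile, duration: Float) {
        // Use the output format's sample rate; playerTime.sampleRate can be 0 here.
        let sampleRate = self.outputFormat(forBus: 0).sampleRate
        let newsampletime = AVAudioFramePosition(sampleRate * Double(value))
        let framestoplay = AVAudioFrameCount(Float(sampleRate) * (duration - value))
        self.stop()
        if framestoplay > 1000 {
            self.scheduleSegment(audioFile, startingFrame: newsampletime, frameCount: framestoplay, at: nil, completionHandler: nil)
        }
        self.play()
    }
}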

Split CMTimeRange into multiple CMTimeRange chunks

Let's assume I have a CMTimeRange constructed from start time zero and a duration of 40 seconds.
I want to split this CMTimeRange into multiple chunks by an X-second divider, so that the total duration of the chunks equals the original duration, each start time is the end time of the previous chunk, and the last chunk holds the leftover seconds.
For example, for a video of 40 seconds and a divider of 15 seconds per chunk:
First CMTimeRange - start time: 0, duration: 15 seconds.
Second CMTimeRange - start time: 15, duration: 15 seconds.
Third CMTimeRange - start time: 30, duration: 10 seconds. (leftovers)
What I've tried:
I tried using CMTimeSubtract on the total duration and applying it to the result recursively until the CMTime is invalid, but it doesn't seem to work.
Any help will be highly appreciated.
Best regards, Roi
Starting at range.start, create ranges of the given length until range.end is reached:
func splitIntoChunks(range: CMTimeRange, length: CMTime) -> [CMTimeRange] {
    var chunks: [CMTimeRange] = []
    var from = range.start
    while from < range.end {
        chunks.append(CMTimeRange(start: from, duration: length).intersection(range))
        from = from + length
    }
    return chunks
}
intersection is used here to prune the last chunk to the original range.
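Applied to the 40-second example from the question:
let range = CMTimeRange(start: .zero, duration: CMTime(seconds: 40, preferredTimescale: 600))
let chunks = splitIntoChunks(range: range, length: CMTime(seconds: 15, preferredTimescale: 600))
for chunk in chunks {
    print(chunk.start.seconds, chunk.duration.seconds)
}
// 0.0 15.0
// 15.0 15.0
// 30.0 10.0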
Alternative solution:
func splitIntoChunks(range: CMTimeRange, length: CMTime) -> [CMTimeRange] {
    return stride(from: range.start.seconds, to: range.end.seconds, by: length.seconds).map {
        CMTimeRange(start: CMTime(seconds: $0, preferredTimescale: length.timescale), duration: length)
            .intersection(range)
    }
}
With a custom extension to make CMTime adopt the Strideable protocol
extension CMTime: Strideable {
    public func distance(to other: CMTime) -> TimeInterval {
        return other - self
    }
    public func advanced(by n: TimeInterval) -> CMTime {
        return self + n
    }
}
this can be further simplified to
func splitIntoChunks(range: CMTimeRange, length: CMTime) -> [CMTimeRange] {
    return stride(from: range.start, to: range.end, by: length.seconds).map {
        CMTimeRange(start: $0, duration: length).intersection(range)
    }
}
In any case, you might want to add a check
precondition(length.seconds > 0, "length must be positive")
to your function, in order to detect invalid calls during development.
I too needed to stride CMTime, to deal with AVCaptureDevice exposure durations and showing these to users.
It turns out Martin's answer doesn't work anymore with the changes in Swift 4.x / Xcode 10. Here's my version of the CMTime conformance to Strideable:
extension CMTime: Strideable {
    public func distance(to other: CMTime) -> TimeInterval {
        return TimeInterval((Double(other.value) / Double(other.timescale)) - (Double(self.value) / Double(self.timescale)))
    }
    public func advanced(by n: TimeInterval) -> CMTime {
        var retval = self
        retval.value += CMTimeValue(n * TimeInterval(self.timescale))
        return retval
    }
}
I played around with it in a playground and it seems to work.
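With that conformance in place, striding over CMTime works as expected; for example (the step and bounds here are illustrative, not from the original answer):
// Step through a range of exposure durations in 1/480 s increments.
let start = CMTime(value: 1, timescale: 480)
let end = CMTime(value: 10, timescale: 480)
for t in stride(from: start, to: end, by: 1.0 / 480.0) {
    print(t.seconds)
}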

AVPlayer seekToTime with UIPanGestureRecognizer

I'm trying to use seekToTime with a pan gesture recognizer, but it's not seeking as expected.
let totalTime = self.avPlayer.currentItem!.duration
print("time: \(CMTimeGetSeconds(totalTime))")
self.avPlayer.pause()
let touchDelta = swipeGesture.translationInView(self.view).x / CGFloat(CMTimeGetSeconds(totalTime))
let currentTime = CMTimeGetSeconds((avPlayer.currentItem?.currentTime())!) + Float64(touchDelta)
print(currentTime)
if currentTime >= 0 && currentTime <= CMTimeGetSeconds(totalTime) {
    let newTime = CMTimeMakeWithSeconds(currentTime, Int32(NSEC_PER_SEC))
    print(newTime)
    self.avPlayer.seekToTime(newTime)
}
What am I doing wrong here?
Think about what's happening in this line here:
let touchDelta = swipeGesture.translationInView(self.view).x / CGFloat(CMTimeGetSeconds(totalTime))
You're dividing pixels (the translation along the x-axis) by time. This really isn't a "delta" or absolute difference; it's a ratio of sorts, but not a ratio that has any meaning. Then you're getting your new currentTime by just adding this ratio to the previous currentTime, so you're adding pixels per second to seconds, which doesn't give a logical or useful number.
What we need to do is take the x-axis translation from the gesture and apply a scale (which is a ratio) to it in order to get a useful number of seconds to advance/rewind the AVPlayer. The x-axis translation is in pixels, so we need a scale that describes seconds per pixel and multiply the two to get our number of seconds. The proper scale is the ratio between the total number of seconds in the video and the total number of pixels that the user can move through in the gesture. Multiplying pixels by (seconds divided by pixels) gives us a number in seconds. In pseudocode:
scale = totalSeconds / totalPixels
timeDelta = translation * scale
currentTime = oldTime + timeDelta
So I would rewrite your code like this:
let totalTime = self.avPlayer.currentItem!.duration
print("time: \(CMTimeGetSeconds(totalTime))")
self.avPlayer.pause()
// BEGIN NEW CODE
let touchDelta = swipeGesture.translationInView(self.view).x
let scale = CGFloat(CMTimeGetSeconds(totalTime)) / self.view.bounds.width
let timeDelta = touchDelta * scale
let currentTime = CMTimeGetSeconds((avPlayer.currentItem?.currentTime())!) + Float64(timeDelta)
// END NEW CODE
print(currentTime)
if currentTime >= 0 && currentTime <= CMTimeGetSeconds(totalTime) {
    let newTime = CMTimeMakeWithSeconds(currentTime, Int32(NSEC_PER_SEC))
    print(newTime)
    self.avPlayer.seekToTime(newTime)
}
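One more detail worth knowing: translationInView reports the total translation since the gesture began, so adding it to the live currentTime on every change event applies the delta cumulatively. A common remedy (my own suggestion, not part of the original answer) is to record the playback time when the gesture begins and seek relative to that:
// gestureStartTime is a hypothetical property on the view controller.
if swipeGesture.state == .Began {
    gestureStartTime = CMTimeGetSeconds((avPlayer.currentItem?.currentTime())!)
}
let currentTime = gestureStartTime + Float64(timeDelta)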
I had the same issue, so I created a UISlider and gave it the action method below.
The AVPlayer is declared as var playerVal = AVPlayer().
@IBAction func sliderAction(sender: UISlider) {
    playerVal.pause()
    displayLink.invalidate()
    let newTime: CMTime = CMTimeMakeWithSeconds(Double(self.getAudioDuration() as! NSNumber) * Double(sender.value), playerVal.currentTime().timescale)
    playerVal.seekToTime(newTime)
    updateTime()
    playerVal.play()
    deepLink()
}
And the other method is:
func updateTime() {
    let currentTime = Float(CMTimeGetSeconds(playerItem1.currentTime()))
    // Split into whole minutes and remaining seconds.
    let minutes = floor(currentTime / 60)
    let seconds = currentTime - minutes * 60
    let maxTime = Float(self.getAudioDuration() as! NSNumber)
    let maxminutes = floor(maxTime / 60)
    let maxseconds = maxTime - maxminutes * 60
    startValue.text = NSString(format: "%.0f:%02.0f", minutes, seconds) as String
    stopValue.text = NSString(format: "%.0f:%02.0f", maxminutes, maxseconds) as String
}
To keep the slider progressing automatically while the audio plays, I used a CADisplayLink, declared as var displayLink = CADisplayLink(). The code is:
func deepLink() {
    displayLink = CADisplayLink(target: self, selector: "updateSliderProgress")
    displayLink.addToRunLoop(NSRunLoop.currentRunLoop(), forMode: NSDefaultRunLoopMode)
}
func updateSliderProgress() {
    let progress = Float(CMTimeGetSeconds(playerVal.currentTime())) / Float(self.getAudioDuration() as! NSNumber)
    sliderView.setValue(Float(progress), animated: false)
}
I hope this gives you the idea.

How to show AVplayer current play duration in UISlider

I am using a custom UISlider to implement scrubbing while playing a video in AVPlayer. I'm trying to figure out the best way to show the current play time and the remaining duration at the respective ends of the UISlider, as is usually shown in MPMoviePlayerController.
Any help is appreciated.
Swift 4.x
let player = AVPlayer()
// targetProgress is assumed to be a property holding the last progress pushed to the slider.
player.addPeriodicTimeObserver(forInterval: CMTime(value: 1, timescale: 1), queue: .main, using: { time in
    if let duration = player.currentItem?.duration {
        let duration = CMTimeGetSeconds(duration), time = CMTimeGetSeconds(time)
        let progress = (time / duration)
        if progress > targetProgress {
            print(progress)
            // Update slider value
        }
    }
})
or
extension AVPlayer {
    func addProgressObserver(action: @escaping ((Double) -> Void)) -> Any {
        return self.addPeriodicTimeObserver(forInterval: CMTime(value: 1, timescale: 1), queue: .main, using: { time in
            if let duration = self.currentItem?.duration {
                let duration = CMTimeGetSeconds(duration), time = CMTimeGetSeconds(time)
                let progress = (time / duration)
                action(progress)
            }
        })
    }
}
Use:
let player = AVPlayer()
player.addProgressObserver { progress in
    // Update slider value
}
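One detail to keep in mind: addPeriodicTimeObserver (and therefore addProgressObserver) returns a token that should be retained and passed to removeTimeObserver(_:) when you no longer need updates, otherwise the observer keeps firing:
let token = player.addProgressObserver { progress in
    // Update slider value
}
// Later, e.g. in deinit or viewDidDisappear:
player.removeTimeObserver(token)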
Put a UILabel at each end. Update them using -[AVPlayer addPeriodicTimeObserverForInterval:queue:usingBlock:]. Compute the time remaining using -[AVPlayer currentTime] and -[AVPlayerItem duration].
[slider setMaximumValue:CMTimeGetSeconds(playerItem.duration)];
// use NSTimer to run a repeating update method
- (void)updateValueSlide {
    [slider setValue:CMTimeGetSeconds(player.currentTime)];
}
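The same label approach in Swift, as a sketch (elapsedLabel, remainingLabel, and slider are hypothetical outlets; the half-second interval is just a common choice):
let interval = CMTime(value: 1, timescale: 2) // fire twice per second
let token = player.addPeriodicTimeObserver(forInterval: interval, queue: .main) { time in
    guard let duration = player.currentItem?.duration, duration.isNumeric, duration.seconds > 0 else { return }
    let elapsed = time.seconds
    let remaining = max(duration.seconds - elapsed, 0)
    elapsedLabel.text = String(format: "%ld:%02ld", Int(elapsed) / 60, Int(elapsed) % 60)
    remainingLabel.text = String(format: "-%ld:%02ld", Int(remaining) / 60, Int(remaining) % 60)
    slider.value = Float(elapsed / duration.seconds)
}
// Keep `token` and pass it to player.removeTimeObserver(_:) when done.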
