Back button starts a UIViewPropertyAnimator? (iOS)

I have a ViewController with a camera for recording videos. On top there is a spinning circle to indicate that the video is being recorded. This is set up like so:
class CameraViewController: UIViewController, UIImagePickerControllerDelegate, UINavigationControllerDelegate {
private var animator: UIViewPropertyAnimator?
@objc func handleTap(_ gesture: UITapGestureRecognizer) {
if animator == nil {
createAnimation()
}
startRecording()
}
private func createAnimation() {
animator = UIViewPropertyAnimator.runningPropertyAnimator(withDuration: 4, delay: 0, options: [.curveLinear,.allowUserInteraction], animations: {
UIView.animateKeyframes(withDuration: 4, delay: 0, animations: {
UIView.addKeyframe(withRelativeStartTime: 0, relativeDuration: 1.0 / 3.0) {
self.recordingSpinner.transform = .init(rotationAngle: .pi * 2 * 1 / 3)
}
UIView.addKeyframe(withRelativeStartTime: 1.0 / 3.0, relativeDuration: 1.0 / 3.0) {
self.recordingSpinner.transform = .init(rotationAngle: .pi * 2 * 2 / 3)
}
UIView.addKeyframe(withRelativeStartTime: 2.0 / 3.0, relativeDuration: 1.0 / 3.0) {
self.recordingSpinner.transform = .identity
}
})
}, completion: { [weak self] _ in
self?.createAnimation()
})
}
func startRecording() {
if movieOutput.isRecording == false {
animator?.startAnimation()
let connection = movieOutput.connection(with: AVMediaType.video)
if (connection?.isVideoOrientationSupported)! {
connection?.videoOrientation = currentVideoOrientation()
}
if (connection?.isVideoStabilizationSupported)! {
connection?.preferredVideoStabilizationMode = AVCaptureVideoStabilizationMode.auto
}
let device = activeInput.device
if (device.isSmoothAutoFocusSupported) {
do {
try device.lockForConfiguration()
device.isSmoothAutoFocusEnabled = false
device.unlockForConfiguration()
} catch {
print("Error setting configuration: \(error)")
}
}
let outputFileName = NSUUID().uuidString
let outputFilePath = (NSTemporaryDirectory() as NSString).appendingPathComponent((outputFileName as NSString).appendingPathExtension("mov")!)
movieOutput.startRecording(to: URL(fileURLWithPath: outputFilePath), recordingDelegate: self)
}
else {
stopRecording()
}
}
func stopRecording() {
if movieOutput.isRecording == true {
animator?.pauseAnimation()
movieOutput.stopRecording()
}
}
@IBAction func unwindToCamera(sender: UIStoryboardSegue) {
}
...
}
extension CameraViewController: AVCaptureFileOutputRecordingDelegate{
func fileOutput(_ output: AVCaptureFileOutput, didFinishRecordingTo outputFileURL: URL, from connections: [AVCaptureConnection], error: Error?) {
if (error != nil) {
print("Error recording movie: \(error!.localizedDescription)")
} else {
self.footageURL = outputFileURL as URL
//print(self.videoRecorded!)
self.performSegue(withIdentifier: "TrimFootage_Segue", sender: nil)
}
}
override func prepare(for segue: UIStoryboardSegue, sender: Any?){
if segue.identifier == "TrimFootage_Segue" {
let controller = segue.destination as! TrimFootageViewController
controller.footageURL = self.footageURL
}
}
}
So it creates an animator if one doesn't exist and then calls startRecording, which starts the animation. Then stopRecording stops it. When the video finishes recording to an output file, it segues to a new view controller. When I press back on that view controller it uses an unwind segue, unwindToCameraWithSender:
When I unwind and come back to the camera, the video is not recording, but the animation is playing. What could have caused this animation to start again? How can I prevent this?

I think the animation only being paused is the reason. In the stopRecording() method try
animator?.stopAnimation(true)
instead of
animator?.pauseAnimation()
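For example, a minimal sketch of stopRecording() with that change (clearing the animator afterwards is my own assumption, so the next tap rebuilds it via createAnimation()):
func stopRecording() {
    if movieOutput.isRecording {
        // Tear the animation down completely instead of leaving it paused.
        animator?.stopAnimation(true)
        // Assumption: reset so the next handleTap() recreates the animator.
        animator = nil
        movieOutput.stopRecording()
    }
}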

So one thing I did to work around this (but not fix it) is to use a UIDynamicAnimator, like so:
@objc func handleTap(_ gesture: UITapGestureRecognizer) {
startRecording()
if let rotate = rotate{
animator.removeBehavior(rotate)
self.rotate = nil
} else {
rotate = UIDynamicItemBehavior(items: [self.recordingSpinner])
rotate?.allowsRotation = true
rotate?.angularResistance = 0
rotate?.addAngularVelocity(1, for: self.recordingSpinner)
animator.addBehavior(rotate!)
}
}
taken from this answer: "Proper way to stop an infinitely rotating image? and how does one implement removeAllAnimations?"
Interestingly, it seems not to start the rotation when I perform the segue, though I'm not sure why. If anyone has thoughts on why, I would love to hear them.

Related

How can I access the length of the video that is CURRENTLY being recorded

I have an iOS camera app where, when the user starts to hold a button, I want to start recording and, within that longTap method, be able to know how long the recording is CURRENTLY... BEFORE it has ended. I want to know how long the video is as of now, or, put differently, how long the user has been recording for.
My main problem is how to know, within the long-press method, how long the user has been recording, so that if the recording has reached X, it can do Y.
I have currently tried:
The following, in the fileOutput method for video recording (called after the user has let go of the button):
{
let videoRecorded = outputURL! as URL
let determineAsset = AVAsset(url: videoRecorded)
let determineCmTime = CMTime(value: determineAsset.duration.value, timescale: 600)
let secondsBruh = CMTimeGetSeconds(determineCmTime)
print(secondsBruh, "<--- seconds br8h")
if doNotRunPlayback == true {
print("DO NIT RUN PLAYBACK WAS TRUE")
} else {
print("here--- ", secondsBruh)
if secondsBruh <= 0.35 {
print("iiiiiiiiii")
isThisOverSeconds = true
photoOutput?.capturePhoto(with: AVCapturePhotoSettings(), delegate: self) //, put in the
} else {
isSettingThumbnail = false
playRecordedVideo(videoURL: videoRecorded)
}
}
}
In didStartRecording I get a thumbnail for the video. You will see a num variable, but disregard it... it will always yield zero given that this is the start.
func fileOutput(_ output: AVCaptureFileOutput, didStartRecordingTo fileURL: URL, from connections: [AVCaptureConnection]) {
print("U IN THIS DIDSTARRECORD???")
isSettingThumbnail = true
photoOutput?.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
let testUrl = fileURL as URL!
let testAsset = AVAsset(url: testUrl!)
let deCmTime = CMTime(value: testAsset.duration.value, timescale: 600)
let seconds = CMTimeGetSeconds(deCmTime)
print(seconds, "<--- ok this si seconds")
num = Int(seconds)
print("723648732648732658723465872:::::", Int(seconds))
print("OUT THIS DIDSTART RECORD")
}
Long-tap method:
if sender.state == .ended {
print("UIGestureRecognizerStateEnded")
if num == 0 {
print("5555555555555")
photoOutput?.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
} else {
print("num was greater than 0")
}
stopRecording()
print("didLongTapEndedend")
} else if sender.state == .began {
print("UIGestureRecognizerStateBegan.")
startCapture()
print("didLongTapBeganend")
}
What I have, however, is very buggy. It's pretty much unusable, definitely unreleasable.
Thanks for any help.
Use the UIControlEvents.touchDown & UIControlEvents.touchUpInside events.
Try this.
import UIKit
import PlaygroundSupport
class MyViewController : UIViewController {
var timer:Timer!
override func loadView() {
let view = UIView()
let btn = UIButton.init(frame: CGRect(x: 150, y: 200, width: 200, height: 20))
btn.setTitle("Record/Capture", for: UIControlState.normal)
btn.clipsToBounds = true
btn.backgroundColor = UIColor.red
view.addSubview(btn)
self.view = view
btn.addTarget(self, action: #selector(touchDown(_:)), for: UIControlEvents.touchDown)
btn.addTarget(self, action: #selector(touchUpInside(_:)), for: UIControlEvents.touchUpInside)
}
@objc func touchDown(_ sender: UIButton) {
timer = Timer.scheduledTimer(withTimeInterval: TimeInterval.init(3), repeats: false, block: { (timer) in
self.startRecording()
})
}
@objc func touchUpInside(_ sender: UIButton) {
if timer.isValid {
self.capturePhoto()
} else {
self.stopRecording()
}
timer.invalidate()
}
func startRecording() {
// Recording
print("Start Recording")
timer.invalidate()
}
func stopRecording() {
// stopRecording
print("Stop Recording")
}
func capturePhoto() {
print("Capture Photo")
}
}
// Present the view controller in the Live View window
PlaygroundPage.current.liveView = MyViewController()
Don’t look at the video. Look at the clock.
You know what time the recording started, because you started it. You know what time it is now. Subtract.
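A minimal sketch of that idea, assuming you add a recordingStartDate property yourself (the property name and where you set it are assumptions, not part of the original code):
// Assumed property on the view controller.
var recordingStartDate: Date?

// Set it when recording actually starts, e.g. right after calling movieOutput.startRecording(...):
recordingStartDate = Date()

// Inside the long-press handler, whenever you need the current length:
if let start = recordingStartDate {
    let elapsedSeconds = Date().timeIntervalSince(start) // how long the user has been recording so far
    if elapsedSeconds >= 0.35 {
        // the recording has reached X, so do Y
    }
}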

Metronome iOS Swift beat visuals lag

I'm trying to create a metronome app by implementing the sample code provided by Apple. Everything works fine, but I'm seeing a delay in the beat visuals; they're not properly synchronized with the player time. Here is the sample code provided by Apple:
let secondsPerBeat = 60.0 / tempoBPM
let samplesPerBeat = Float(secondsPerBeat * Float(bufferSampleRate))
let beatSampleTime: AVAudioFramePosition = AVAudioFramePosition(nextBeatSampleTime)
let playerBeatTime: AVAudioTime = AVAudioTime(sampleTime: AVAudioFramePosition(beatSampleTime), atRate: bufferSampleRate)
// This time is relative to the player's start time.
player.scheduleBuffer(soundBuffer[bufferNumber]!, at: playerBeatTime, options: AVAudioPlayerNodeBufferOptions(rawValue: 0), completionHandler: {
self.syncQueue!.sync() {
self.beatsScheduled -= 1
self.bufferNumber ^= 1
self.scheduleBeats()
}
})
beatsScheduled += 1
if (!playerStarted) {
// We defer the starting of the player so that the first beat will play precisely
// at player time 0. Having scheduled the first beat, we need the player to be running
// in order for nodeTimeForPlayerTime to return a non-nil value.
player.play()
playerStarted = true
}
let callbackBeat = beatNumber
beatNumber += 1
// Calculate the beat time for animating the UI based on the player beat time.
let nodeBeatTime: AVAudioTime = player.nodeTime(forPlayerTime: playerBeatTime)!
let output: AVAudioIONode = engine.outputNode
let latencyHostTicks: UInt64 = AVAudioTime.hostTime(forSeconds: output.presentationLatency)
// Calculate the final dispatch time which will update the UI at particular intervals.
let dispatchTime = DispatchTime(uptimeNanoseconds: nodeBeatTime.hostTime + latencyHostTicks)
// Visuals.
DispatchQueue.global(qos: .userInitiated).asyncAfter(deadline: dispatchTime) {
if (self.isPlaying) {
// send current call back beat.
self.delegate!.metronomeTicking!(self, bar: (callbackBeat / 4) + 1, beat: (callbackBeat % 4) + 1)
}
}
}
// My view controller class where I'm showing the beat number
class ViewController: UIViewController, UIGestureRecognizerDelegate, Metronomedelegate {
@IBOutlet var rhythmlabel: UILabel!
//view did load method
override func viewDidLoad() {
}
//delegate method for getting the beat value from metronome engine and showing in the UI label.
func metronomeTicking(_ metronome: Metronome, bar: Int, beat: Int) {
DispatchQueue.main.async {
print("Playing Beat \(beat)")
//show beat in label
self.rhythmlabel.text = "\(beat)"
}
}
}
I think you are making this more complex than it needs to be. All you really need is to set a DispatchTime when you start the metronome, fire a function call whenever that DispatchTime is reached, update the dispatch time based on the desired frequency, and loop as long as the metronome is enabled.
I prepared a project for you which implements this method so you can play with and use as you see fit: https://github.com/ekscrypto/Swift-Tutorial-Metronome
Good luck!
Metronome.swift
import Foundation
import AVFoundation
class Metronome {
var bpm: Float = 60.0 { didSet {
bpm = min(300.0,max(30.0,bpm))
}}
var enabled: Bool = false { didSet {
if enabled {
start()
} else {
stop()
}
}}
var onTick: ((_ nextTick: DispatchTime) -> Void)?
var nextTick: DispatchTime = DispatchTime.distantFuture
let player: AVAudioPlayer = {
do {
let soundURL = Bundle.main.url(forResource: "metronome", withExtension: "wav")!
let soundFile = try AVAudioFile(forReading: soundURL)
let player = try AVAudioPlayer(contentsOf: soundURL)
return player
} catch {
print("Oops, unable to initialize metronome audio buffer: \(error)")
return AVAudioPlayer()
}
}()
private func start() {
print("Starting metronome, BPM: \(bpm)")
player.prepareToPlay()
nextTick = DispatchTime.now()
tick()
}
private func stop() {
player.stop()
print("Stopping metronome")
}
private func tick() {
guard
enabled,
nextTick <= DispatchTime.now()
else { return }
let interval: TimeInterval = 60.0 / TimeInterval(bpm)
nextTick = nextTick + interval
DispatchQueue.main.asyncAfter(deadline: nextTick) { [weak self] in
self?.tick()
}
player.play(atTime: interval)
onTick?(nextTick)
}
}
ViewController.swift
import UIKit
class ViewController: UIViewController {
@IBOutlet weak var bpmLabel: UILabel!
@IBOutlet weak var tickLabel: UILabel!
let myMetronome = Metronome()
override func viewWillAppear(_ animated: Bool) {
super.viewWillAppear(animated)
myMetronome.onTick = { (nextTick) in
self.animateTick()
}
updateBpm()
}
private func animateTick() {
tickLabel.alpha = 1.0
UIView.animate(withDuration: 0.35) {
self.tickLabel.alpha = 0.0
}
}
@IBAction func startMetronome(_: Any?) {
myMetronome.enabled = true
}
@IBAction func stopMetronome(_: Any?) {
myMetronome.enabled = false
}
@IBAction func increaseBpm(_: Any?) {
myMetronome.bpm += 1.0
updateBpm()
}
@IBAction func decreaseBpm(_: Any?) {
myMetronome.bpm -= 1.0
updateBpm()
}
private func updateBpm() {
let metronomeBpm = Int(myMetronome.bpm)
bpmLabel.text = "\(metronomeBpm)"
}
}
Note: There seems to be a pre-loading issue: prepareToPlay() doesn't fully load the audio file before playing, which causes a timing issue with the first playback of the tick audio file. That issue is left to the reader to figure out. Since the original question was about synchronization, that is what the code above demonstrates.
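One possible workaround (an assumption on my part, not something from the linked project) is to prepare the player as soon as the Metronome is created instead of waiting for start():
init() {
    // Assumption: loading the audio into the player's buffers up front may avoid
    // the preparation cost on the very first tick.
    player.prepareToPlay()
}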

Swift 4 - UISlider with AVAudioPlayer

I am using AVAudioPlayer to play audio, and now I am trying to use a UISlider as the audio timeline:
@IBOutlet var audioSlider: UISlider!
override func viewDidLoad() {
audioSlider.setValue(0, animated: false)
timer = Timer.scheduledTimer(timeInterval: 0.1, target: self, selector: (#selector(PlayerController.updateTimer)), userInfo: nil, repeats: true)
}
@objc func updateTimer() {
if(audio != nil)
{
audioSlider.setValue(Float(Int((self.audio?.currentTime)!)), animated: false)
}
}
But I don't think it's working; my UISlider jumps from the start to the end right away. I expected a smooth transition with my UISlider.
Here is how I am playing audio:
@IBAction func playButtonPressed(_ sender: Any) {
if(playButton.titleLabel?.text == "Play")
{
postPlay(postid: self.postid) { results in
DispatchQueue.main.async {
do {
try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback)
try AVAudioSession.sharedInstance().setActive(true)
}
catch {
}
self.audio?.delegate = self
self.audio?.numberOfLoops = 0
self.playButton.titleLabel?.text = "Stop"
self.audio?.play()
self.playCount = self.playCount + 1
self.plays.text = "Total Plays: " + self.playCount.description
}
}
}
else
{
playButton.titleLabel?.text = "Resume"
audio?.stop()
}
}
By default, a UISlider expects a value between 0.0 and 1.0. To fit into that range, divide the current playback time by the total duration.
For Example
audioSlider.setValue(Float(self.audio!.currentTime / self.audio!.duration), animated: false)
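Putting that into the timer callback from the question might look like this (the guard against a missing player or zero duration is my own addition):
@objc func updateTimer() {
    guard let audio = self.audio, audio.duration > 0 else { return }
    // Normalize the current playback time into the slider's default 0.0...1.0 range.
    audioSlider.setValue(Float(audio.currentTime / audio.duration), animated: false)
}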

Jumpy UISlider when scrubbing - Using UISlider with AVPlayer

I am using AVPlayer and am trying to set up a slider to allow scrubbing of audio files. I'm having a problem with the slider jumping all over the place when it's selected. It then goes back to the original position for a second before going back to the location it was dragged to.
You can't see my cursor on the GIF, but the smooth, elongated drags are me moving the knob; the quick whips are the slider misbehaving.
I've spent hours googling and combing through Stack Overflow and can't figure out what I'm doing wrong here; a lot of similar questions are quite old and in Objective-C.
This is the section of code I think is responsible for the problem; it handles the event of the slider being moved. I've tried it without the if statement as well and didn't see a different result.
@IBAction func horizontalSliderActioned(_ sender: Any) {
horizontalSlider.isContinuous = true
if self.horizontalSlider.isTouchInside {
audioPlayer?.pause()
let seconds : Int64 = Int64(horizontalSlider.value)
let preferredTimeScale : Int32 = 1
let seekTime : CMTime = CMTimeMake(seconds, preferredTimeScale)
audioPlayerItem?.seek(to: seekTime)
audioPlayer?.play()
} else {
let duration : CMTime = (self.audioPlayer?.currentItem!.asset.duration)!
let seconds : Float64 = CMTimeGetSeconds(duration)
self.horizontalSlider.value = Float(seconds)
}
}
I will include my entire class below for reference.
import UIKit
import Parse
import AVFoundation
import AVKit
class PlayerViewController: UIViewController, AVAudioPlayerDelegate {
@IBOutlet var horizontalSlider: UISlider!
var selectedAudio: String!
var audioPlayer: AVPlayer?
var audioPlayerItem: AVPlayerItem?
var timer: Timer?
func getAudio() {
let query = PFQuery(className: "Part")
query.whereKey("objectId", equalTo: selectedAudio)
query.getFirstObjectInBackground { (object, error) in
if error != nil || object == nil {
print("The getFirstObject request failed.")
} else {
print("There is an object now get the Audio. ")
let audioFileURL = (object?.object(forKey: "partAudio") as! PFFile).url
self.audioPlayerItem = AVPlayerItem(url: NSURL(string: audioFileURL!) as! URL)
self.audioPlayer = AVPlayer(playerItem: self.audioPlayerItem)
let playerLayer = AVPlayerLayer(player: self.audioPlayer)
playerLayer.frame = CGRect(x: 0, y: 0, width: 10, height: 10)
self.view.layer.addSublayer(playerLayer)
let duration : CMTime = (self.audioPlayer?.currentItem!.asset.duration)!
let seconds : Float64 = CMTimeGetSeconds(duration)
let maxTime : Float = Float(seconds)
self.horizontalSlider.maximumValue = maxTime
self.audioPlayer?.play()
self.timer = Timer.scheduledTimer(timeInterval: 1.0, target: self, selector: #selector(PlayerViewController.audioSliderUpdate), userInfo: nil, repeats: true)
}
}
}
@IBOutlet var playerButton: UIButton!
func playerButtonTapped() {
if audioPlayer?.rate == 0 {
audioPlayer?.play()
self.playerButton.setImage(UIImage(named: "play"), for: UIControlState.normal)
} else {
audioPlayer?.pause()
self.playerButton.setImage(UIImage(named: "pause"), for: UIControlState.normal)
}
}
override func viewDidLoad() {
super.viewDidLoad()
horizontalSlider.minimumValue = 0
horizontalSlider.value = 0
self.playerButton.addTarget(self, action: #selector(PlayerViewController.playerButtonTapped), for: UIControlEvents.touchUpInside)
getAudio()
}
override func viewWillAppear(_ animated: Bool) {
super.viewWillAppear(animated)
NotificationCenter.default.addObserver(self, selector: #selector(PlayerViewController.finishedPlaying), name: NSNotification.Name.AVPlayerItemDidPlayToEndTime, object: self.audioPlayerItem)
}
override func viewWillDisappear(_ animated: Bool) {
super.viewWillDisappear(animated)
// remove the timer
self.timer?.invalidate()
// remove the observer when leaving page
NotificationCenter.default.removeObserver(audioPlayer?.currentItem! as Any)
}
func finishedPlaying() {
// need option to play next track
self.playerButton.setImage(UIImage(named: "play"), for: UIControlState.normal)
let seconds : Int64 = 0
let preferredTimeScale : Int32 = 1
let seekTime : CMTime = CMTimeMake(seconds, preferredTimeScale)
audioPlayerItem!.seek(to: seekTime)
}
@IBAction func horizontalSliderActioned(_ sender: Any) {
horizontalSlider.isContinuous = true
if self.horizontalSlider.isTouchInside {
audioPlayer?.pause()
let seconds : Int64 = Int64(horizontalSlider.value)
let preferredTimeScale : Int32 = 1
let seekTime : CMTime = CMTimeMake(seconds, preferredTimeScale)
audioPlayerItem?.seek(to: seekTime)
audioPlayer?.play()
} else {
let duration : CMTime = (self.audioPlayer?.currentItem!.asset.duration)!
let seconds : Float64 = CMTimeGetSeconds(duration)
self.horizontalSlider.value = Float(seconds)
}
}
func audioSliderUpdate() {
let currentTime : CMTime = (self.audioPlayerItem?.currentTime())!
let seconds : Float64 = CMTimeGetSeconds(currentTime)
let time : Float = Float(seconds)
self.horizontalSlider.value = time
}
}
Swift 5, Xcode 11
I faced the same issue; it was apparently the periodic time observer that was returning an incorrect time, which caused the lag or jumps in the slider. I solved it by removing the periodic time observer while the slider is changing and adding it back when the seek completion handler is called.
@objc func sliderValueChanged(_ playbackSlider: UISlider, event: UIEvent) {
let seconds : Float = Float(playbackSlider.value)
let targetTime:CMTime = CMTimeMake(value: Int64(seconds), timescale: 1)
if let touchEvent = event.allTouches?.first {
switch touchEvent.phase {
case .began:
// handle drag began
//Remove observer when dragging is in progress
self.removePeriodicTimeObserver()
break
case .moved:
// handle drag moved
break
case .ended:
// handle drag ended
//Add Observer back when seeking got completed
player.seek(to: targetTime, toleranceBefore: .zero, toleranceAfter: .zero) { [weak self] (value) in
self?.addTimeObserver()
}
break
default:
break
}
}
}
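The removePeriodicTimeObserver() and addTimeObserver() helpers referenced above aren't shown in the answer; a minimal sketch of what they might look like (the timeObserverToken property, the playbackSlider outlet, and the 0.5-second interval are all assumptions):
private var timeObserverToken: Any?

func addTimeObserver() {
    // Fire twice a second on the main queue and push the current time into the slider.
    let interval = CMTime(seconds: 0.5, preferredTimescale: CMTimeScale(NSEC_PER_SEC))
    timeObserverToken = player.addPeriodicTimeObserver(forInterval: interval, queue: .main) { [weak self] time in
        self?.playbackSlider.value = Float(CMTimeGetSeconds(time))
    }
}

func removePeriodicTimeObserver() {
    if let token = timeObserverToken {
        player.removeTimeObserver(token)
        timeObserverToken = nil
    }
}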
You need to remove observers and invalidate timers as soon as the user selects the thumb on the slider, and add them back again when dragging is done.
To do that, add targets like this where you load your player:
mySlider.addTarget(self,
action: #selector(PlayerViewController.mySliderBeganTracking(_:)),
for: .touchDown)
mySlider.addTarget(self,
action: #selector(PlayerViewController.mySliderEndedTracking(_:)),
for: .touchUpInside)
mySlider.addTarget(self,
action: #selector(PlayerViewController.mySliderEndedTracking(_:)),
for: .touchUpOutside)
Remove observers and invalidate timers in mySliderBeganTracking, then add the observers back in mySliderEndedTracking.
For better control over what happens in your player, write two functions, addObservers and removeObservers, and call them when needed; a sketch of the two tracking handlers follows.
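Here is what those two handlers might look like under that scheme (addObservers()/removeObservers() are assumed to wrap whatever observer and timer bookkeeping your player already does):
@objc func mySliderBeganTracking(_ slider: UISlider) {
    // Stop UI-driven updates so they don't fight the user's drag.
    timer?.invalidate()
    removeObservers()
}

@objc func mySliderEndedTracking(_ slider: UISlider) {
    // Seek to the chosen position, then resume updates once the seek completes.
    let seekTime = CMTime(seconds: Double(slider.value), preferredTimescale: 1)
    audioPlayer?.seek(to: seekTime) { [weak self] _ in
        self?.addObservers()
    }
}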
Make sure to do the following:
isContinuous for the slider is NOT set to false.
Pause the player before seeking.
Seek to the position and use the completion handler to resume playing.
Example code:
@objc func sliderValueChanged(sender: UISlider, event: UIEvent) {
let roundedValue = sender.value.rounded()
guard let touchEvent = event.allTouches?.first else { return }
switch touchEvent.phase {
case .began:
PlayerManager.shared.pause()
case .moved:
print("Slider moved")
case .ended:
PlayerManager.shared.seek(to: roundedValue, playAfterSeeking: true)
default: ()
}
}
And here is the function for seeking:
func seek(to: Float, playAfterSeeking: Bool) {
player?.seek(to: CMTime(value: CMTimeValue(to), timescale: 1), completionHandler: { [weak self] (status) in
if playAfterSeeking {
self?.play()
}
})
}
Try using the time slider value like below:
@IBAction func timeSliderDidChange(_ sender: UISlider) {
AVPlayerManager.sharedInstance.currentTime = Double(sender.value)
}
var currentTime: Double {
get {
return CMTimeGetSeconds(player.currentTime())
}
set {
if self.player.status == .readyToPlay {
let newTime = CMTimeMakeWithSeconds(newValue, 1)
player.seek(to: newTime, toleranceBefore: kCMTimeZero, toleranceAfter: kCMTimeZero) { ( _ ) in
self.updatePlayerInfo()
}
}
}
}
Pass the value of the slider when the user releases it, and don't update the slider from the current playback time while user interaction is happening on the slider.
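One simple way to honor that last point in the question's audioSliderUpdate() callback is to check UISlider's isTracking flag (this rewrite is my own suggestion):
func audioSliderUpdate() {
    // Skip timer-driven updates while the user's finger is on the slider.
    guard !horizontalSlider.isTracking else { return }
    guard let currentTime = audioPlayerItem?.currentTime() else { return }
    horizontalSlider.value = Float(CMTimeGetSeconds(currentTime))
}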
This is a temporary solution for me. I observed that the rebound happens only once, so I keep an int value, isSeekInProgress:
When sliderDidFinish is called, set isSeekInProgress = 0.
In the callback for the AVPlayer time change:
if (self.isSeekInProgress > 1) {
float sliderValue = 1.f / (self.slider.maximumValue - self.slider.minimumValue) * progress;
// if (sliderValue > self.slider.value ) {
self.slider.value = sliderValue;
}else {
self.isSeekInProgress += 1;
}

AVPlayer progress with UISlider Swift

I'm trying to use a slider to control audio and everything works fine, but when I try to make the slider value equal to the player's current time it crashes.
However, if I print something inside the updateSlider function, it appears and works fine.
override func viewDidLoad()
{
songTitle.text = mainSongTitle
background.image = image
mainImageView.image = image
downloadFileFromURL(url: URL(string: previewURL)!)
var time = Timer.scheduledTimer(timeInterval: 0.1, target: self, selector: #selector(self.updateSlider), userInfo: nil, repeats: true)
}
func downloadFileFromURL(url : URL)
{
var downloadTask = URLSessionDownloadTask()
downloadTask = URLSession.shared.downloadTask(with: url, completionHandler:
{
customURL , response , error in
self.play(url: customURL!)
self.slider.maximumValue = Float(player.duration)
})
downloadTask.resume()
}
func play(url : URL)
{
do
{
player = try AVAudioPlayer(contentsOf: url)
player.prepareToPlay()
player.play()
}
catch
{
print(error)
}
}
@IBAction func PlayPressed(_ sender: Any)
{
if player.isPlaying
{
player.pause()
}
else
{
player.play()
}
}
@IBAction func ChangerTimePlayer(_ sender: Any)
{
player.stop()
player.currentTime = TimeInterval(slider.value)
player.prepareToPlay()
player.play()
}
func updateSlider()
{
slider.value = Float(player.currentTime)
}
Maybe you should do something like:
func updateSlider(){
slider.value = Float(player.currentTime/player.duration)
}
Your download completion task needs to be called on the main thread to safely use UIKit (your slider):
DispatchQueue.main.async {
self.play(url: customURL!)
self.slider.maximumValue = Float(player.duration)
}
