I have always worked in Objective-C, but I want to get my head around Swift. I am nearly there with my code, but I just don't know how to loop my wav; it stops after playing once. I have found some instructions, but I haven't found the solution for my code yet. I hope someone knows the answer and can help me. So the question is: what do I have to do to make my wav loop when pressing @IBAction func playButtonTapped? I will give all my code, just to be sure. Thanks in advance :-)
class ViewController: UIViewController {
@IBOutlet var startAllButton: UIButton!
var audioEngine = AVAudioEngine()
var playerNode = AVAudioPlayerNode()
let timeShift = AVAudioUnitTimePitch()
let pedoMeter = CMPedometer()
let bpm: Float = 110
var avgStarted: Bool = false
var steps: Int = 0
var timer = Timer()
var adjustedBpm: Float = 110
var timerCount = 10 {
didSet {
if timerCount == 0 {
stopCountingSteps()
}
}
}
var lastTap: Date? = nil
@IBOutlet weak var tempoTap: UIButton!
@IBOutlet weak var slider: UISlider!
@IBOutlet weak var label: UILabel!
@IBOutlet weak var playButton: UIButton!
@IBOutlet weak var timerLabel: UILabel!
@IBOutlet weak var stepCountLabel: UILabel!
@IBOutlet weak var avgLabel: UIButton!
override func viewDidLoad() {
super.viewDidLoad()
setup()
}
func setup() {
label.text = "110"
audioEngine.attach(playerNode)
audioEngine.attach(timeShift)
audioEngine.connect(playerNode, to: timeShift, format: nil)
audioEngine.connect(timeShift, to: audioEngine.mainMixerNode, format: nil)
audioEngine.prepare()
timerLabel.text = ""
stepCountLabel.text = ""
do {
try audioEngine.start()
} catch {
print("Could not start audio engine")
}
}
@IBAction func sliderValueChanged(_ sender: UISlider) {
label.text = String(format: "%.f", sender.value)
adjustedBpm = sender.value
timeShift.rate = adjustedBpm/bpm
}
@IBAction func playButtonTapped(_ sender: Any) {
let url = Bundle.main.url(forResource: "25loop110", withExtension: "wav")
if let url = url {
do {
let audioFile = try AVAudioFile(forReading: url)
timeShift.rate = adjustedBpm/bpm
playerNode.scheduleFile(audioFile, at: nil, completionHandler: nil)
} catch {
print("could not load audio file")
}
} else {
print("could not load audio file")
}
playerNode.play()
}
The problem is these lines:
let audioFile = try AVAudioFile(forReading: url)
playerNode.scheduleFile(audioFile, at: nil, completionHandler: nil)
Where's the loop? Nowhere; that code plays the file once.
You cannot loop a file with AVAudioEngine; you loop with a buffer. You read the file into an AVAudioPCMBuffer and call scheduleBuffer(buffer, at: nil, options: .loops).
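A minimal sketch of the playButtonTapped above rewritten that way, keeping the question's file name, node setup, and tempo math:

@IBAction func playButtonTapped(_ sender: Any) {
    guard let url = Bundle.main.url(forResource: "25loop110", withExtension: "wav") else {
        print("could not find audio file")
        return
    }
    do {
        let audioFile = try AVAudioFile(forReading: url)
        // Read the whole file into a PCM buffer sized to its frame count.
        guard let buffer = AVAudioPCMBuffer(pcmFormat: audioFile.processingFormat,
                                            frameCapacity: AVAudioFrameCount(audioFile.length)) else { return }
        try audioFile.read(into: buffer)
        timeShift.rate = adjustedBpm / bpm
        // .loops replays this buffer until the node is stopped or something else is scheduled.
        playerNode.scheduleBuffer(buffer, at: nil, options: .loops, completionHandler: nil)
        playerNode.play()
    } catch {
        print("could not load audio file: \(error)")
    }
}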
Here's the code I have so far. Right now, when a user inputs any letter, my labels display nothing; what I would like to figure out is how to turn that nothing, "", into a 0. I tried doing an if statement on my label.text assignments, but that didn't pan out. What would be a better way of getting my desired result?
import UIKit
class ViewController: UIViewController {
@IBOutlet weak var game1: UITextField!
@IBOutlet weak var game2: UITextField!
@IBOutlet weak var game3: UITextField!
@IBOutlet weak var series: UILabel!
@IBOutlet weak var average: UILabel!
@IBOutlet weak var high: UILabel!
override func viewDidLoad() {
super.viewDidLoad()
// Do any additional setup after loading the view.
}
@IBAction func calculate(_ sender: Any) {
self.view.endEditing(true)
guard
let text1 = game1.text,
let text2 = game2.text,
let text3 = game3.text
else { return }
guard
let game1Results = Int(text1),
let game2Results = Int(text2),
let game3Results = Int(text3)
else { return }
let gameResultsArray = [game1Results, game2Results, game3Results]
let sumArray = gameResultsArray.reduce(0, +)
let positiveArray = gameResultsArray.filter { $0 > 0 }
var avgArrayValue = 0
if positiveArray.count == 0
{
avgArrayValue = 0
}else {
avgArrayValue = sumArray / positiveArray.count
}
series.text = "\(sumArray)"
average.text = "\(avgArrayValue)"
if let maximumVal = gameResultsArray.max() {
high.text = String(maximumVal)
}
}
}
Here is what you need: convert the String to an Int and fall back to a default of 0 with the nil-coalescing operator, instead of using guard let ... else { return }.
Instead of this:
guard let game1Results = Int(text1) else { return }
Use this:
let game1Results = Int(text1) ?? 0
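Applied to the calculate action above, the second guard then disappears entirely. A sketch (empty or non-numeric input simply becomes 0):

@IBAction func calculate(_ sender: Any) {
    self.view.endEditing(true)
    // Int(...) returns nil for "" or letters; ?? 0 substitutes the default.
    let game1Results = Int(game1.text ?? "") ?? 0
    let game2Results = Int(game2.text ?? "") ?? 0
    let game3Results = Int(game3.text ?? "") ?? 0
    let gameResultsArray = [game1Results, game2Results, game3Results]
    // ... the rest of the calculation stays the same
}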
I want to create a simple AVAudioPlayer setup but am unsure what to fix. I have a UISlider and two labels that all correspond to the AVAudioPlayer.
Example:
I am having a bunch of trouble trying to properly implement the slider and the labels' updateTime function. I tried my best... how can I do this? :)
@IBOutlet weak var playbackSlider: UISlider!
@IBOutlet weak var playbackTimeLabelFront: UILabel!
@IBOutlet weak var playbackTimeLabelBack: UILabel!
var timer: Timer!
var isPlaying = false
var audioRecorder: AVAudioRecorder!
var audioPlayer : AVAudioPlayer!
var recordingSession: AVAudioSession!
func loadRecordingUI() {
do {
audioPlayer = try AVAudioPlayer(contentsOf: getFileUrl())
audioPlayer!.delegate = self
audioPlayer!.prepareToPlay()
print("Audio Success")
} catch {
print("audioPlayer error: \(error.localizedDescription)")
}
}
@IBAction func playTapped(_ sender: Any) {
if isPlaying {
audioPlayer!.pause()
isPlaying = false
} else {
audioPlayer!.play()
isPlaying = true
timer = Timer.scheduledTimer(timeInterval: 1.0, target: self, selector: #selector(updateTime), userInfo: nil, repeats: true)
}
}
@objc func updateTime() {
let currentTime = Int(audioPlayer.currentTime)
let minutes = currentTime/60
let seconds = currentTime - minutes * 60
playbackTimeLabelFront.text = ??
}
Here is working code:
@IBOutlet weak var slider: UISlider!
@IBOutlet weak var lblTotalDuration: UILabel!
@IBOutlet weak var lblCurrentDuration: UILabel!
var timer: Timer!
var totalLengthOfAudio = ""
var audioPlayer: AVAudioPlayer!
// Pass your Audiofile path here
func prepareAudio(path:String) {
let mp3URL = NSURL(string: path)
do {
do {
try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback)
} catch _ { }
do {
try AVAudioSession.sharedInstance().setActive(true)
} catch _ { }
audioPlayer = try AVAudioPlayer(contentsOf: mp3URL as! URL)
audioPlayer.delegate = self
slider.maximumValue = CFloat(audioPlayer.duration)
slider.minimumValue = CFloat(0.0)
slider.value = CFloat(0.0)
audioPlayer.prepareToPlay()
// Total Audio Duration
let time = calculateTimeFromNSTimeInterval(audioPlayer.duration)
totalLengthOfAudio = "\(time.minute):\(time.second)"
lblTotalDuration.text = totalLengthOfAudio
lblCurrentDuration.text = "00:00"
audioPlayer.play()
startTimer()
} catch let error as NSError {
print(error.localizedDescription)
}
}
Timer and Update Label
func startTimer(){
// Assign to the property (not a local) so stopTimer() can invalidate it.
timer = Timer.scheduledTimer(timeInterval: 1.0, target: self, selector: #selector(PlayerViewController.update(_:)), userInfo: nil, repeats: true)
timer.fire()
}
func stopTimer(){
timer.invalidate()
}
@objc func update(_ timer: Timer){
if !audioPlayer.isPlaying{
return
}
let time = calculateTimeFromNSTimeInterval(audioPlayer.currentTime)
DispatchQueue.main.async{
self.lblCurrentDuration.text = "\(time.minute):\(time.second)"
self.slider.value = CFloat(self.audioPlayer.currentTime)
}
}
Calculate Song Length
func calculateTimeFromNSTimeInterval(_ duration:TimeInterval) ->(minute:String, second:String){
let minute_ = abs(Int((duration/60).truncatingRemainder(dividingBy: 60)))
let second_ = abs(Int(duration.truncatingRemainder(dividingBy: 60)))
// var hour = hour_ > 9 ? "\(hour_)" : "0\(hour_)"
let minute = minute_ > 9 ? "\(minute_)" : "0\(minute_)"
let second = second_ > 9 ? "\(second_)" : "0\(second_)"
return (minute,second)
}
When the user accepts a call from CallKit, I switch the root view controller on the accept-button tap, but the root controller object is always found nil and the application crashes.
Case: this is happening when the application is running in the background and the phone is locked.
func provider(_ provider: CXProvider, perform action: CXAnswerCallAction) {
endCallTimer()
guard let call = callManager.callWithUUID(uuid: action.callUUID) else {
action.fail()
return
}
let mainStoryboard: UIStoryboard = UIStoryboard(name: "Main", bundle: nil)
let incomingCall = mainStoryboard.instantiateViewController(withIdentifier: "CallConnectedVC") as! IncomingController
incomingCall.connectToCalling(duration:self.callDict["duration"] as! Int, consumerId: "\(self.callDict["consumerId"] as! Int)", categoryName: self.callDict["categoryTopic"] as! String, categoryImage: "", callId: self.callDict["callId"] as! String)
let nav = UINavigationController(rootViewController: incomingCall)
UIApplication.shared.keyWindow?.makeKeyAndVisible()
UIApplication.shared.keyWindow?.rootViewController = nav
configureAudioSession()
call.answer()
action.fulfill()
}
IncomingController code -
import UIKit
import TwilioVideo
import AVFoundation
import SDWebImage
import FirebaseAnalytics
class IncomingController: UIViewController {
var camera: TVICameraCapturer?
@IBOutlet weak var img_user: UIImageView!
@IBOutlet weak var lblName: UILabel!
@IBOutlet weak var lbltopic: UILabel!
@IBOutlet weak var lblTimer: UILabel!
@IBOutlet weak var btn_speaker : UIButton!
@IBOutlet weak var btn_video: UIButton!
@IBOutlet weak var btn_mute: UIButton!
@IBOutlet weak var btn_extendcall: UIButton!
@IBOutlet weak var btn_endcall: UIButton!
//Video
@IBOutlet weak var btn_toogleMic : UIButton!
@IBOutlet weak var btnConnectAudio : UIButton!
@IBOutlet weak var btnFlipCamera : UIButton!
@IBOutlet weak var view_video: UIView!
@IBOutlet weak var lblviedoName: UILabel!
@IBOutlet weak var lblvideoTimer: UILabel!
@IBOutlet weak var lblvideoTopic: UILabel!
@IBOutlet weak var provider_previewView: TVIVideoView!
@IBOutlet weak var provider_remoteView: TVIVideoView!
@IBOutlet weak var provider_previewViewShadow: UIView!
@IBOutlet weak var btnView : UIView!
@IBOutlet weak var constrain_btnView: NSLayoutConstraint!
//UpdatedVideo
@IBOutlet weak var viewDisableVideo: UIView!
@IBOutlet weak var lblDisableViedoName: UILabel!
@IBOutlet weak var lblDisableVideoTimer: UILabel!
@IBOutlet weak var lblDisableTopic: UILabel!
@IBOutlet weak var lblDisableText: UILabel!
@IBOutlet weak var imgUserDisable: UIImageView!
var NotificationDict = NSMutableDictionary()
var Calltimer: Timer? = Timer()
var sec = 60
var min = 6
var audioDevice: TVIDefaultAudioDevice = TVIDefaultAudioDevice(block: {
do {
try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord, mode: AVAudioSessionModeVoiceChat, options: .mixWithOthers)
try AVAudioSession.sharedInstance().setPreferredSampleRate(48000)
try AVAudioSession.sharedInstance().setPreferredIOBufferDuration(0.01)
} catch {
print(error)
}
})
override func viewDidLoad() {
super.viewDidLoad()
// Do any additional setup after loading the view.
}
override func viewWillAppear(_ animated: Bool) {
super.viewWillAppear(animated)
TwilioVideo.audioDevice = self.audioDevice
self.prepareLocalMedia()
connectToCalling(duration:NotificationDict["duration"] as! Int, consumerId: "\(NotificationDict["consumerId"] as! Int)", categoryName: NotificationDict["categoryTopic"] as! String, categoryImage: "", callId: NotificationDict["callId"] as! String)
}
//MARK:- Speaker Method
func setAudioOutputSpeaker(_ enabled: Bool) {
let session = AVAudioSession.sharedInstance()
try? session.setCategory(AVAudioSessionCategoryPlayAndRecord)
try? session.setMode(AVAudioSessionModeVoiceChat)
if enabled {
try? session.overrideOutputAudioPort(AVAudioSessionPortOverride.speaker)
} else {
try? session.overrideOutputAudioPort(AVAudioSessionPortOverride.none)
}
try? session.setActive(true)
}
override func didReceiveMemoryWarning() {
super.didReceiveMemoryWarning()
// Dispose of any resources that can be recreated.
}
//MARK:- Custom Method
func connectToCalling(duration:Int, consumerId:String, categoryName:String, categoryImage:String, callId:String)
{
let Url = "\(Constant.tokenUrl)"
let Param = ["duration":duration, "consumerId":consumerId, "categoryName":categoryName, "categoryImage":categoryImage, "callId":callId] as [String:Any]
ApiResponse.onResponseKeyPost(url: Url, parms: Param as NSDictionary, completion: { (dict, errr) in
print("dict response", dict)
if(errr == ""){
OperationQueue.main.addOperation {
accessToken = dict["providerToken"] as! String
let connectOptions = TVIConnectOptions.init(token: accessToken) { (builder) in
if let videoTrack = ProvideCallObject.localVideoTrack {
builder.videoTracks = [videoTrack]
}
// We will share a local audio track only if ExampleAVAudioEngineDevice is selected.
if let audioTrack = ProvideCallObject.localAudioTrack {
builder.audioTracks = [audioTrack]
}
userDef.set("\(dict["roomId"]!)", forKey: "roomId")
builder.roomName = "\(dict["roomId"]!)"
}
ProvideCallObject.Provideroom = TwilioVideo.connect(with: connectOptions, delegate: self)
}
}
})
}
@objc func updateCountDown() {
if btn_endcall.isUserInteractionEnabled == false
{
btn_endcall.isUserInteractionEnabled = true
}
if sec == 0 {
if min == 0
{
sec = 0
min = 0
goToFeedback()
Calltimer?.invalidate()
}else
{
sec = 59
min = min - 1
if min == 0 {
SystemSoundID.playFileNamed(fileName: "60 Seconds v2", withExtenstion: "m4a")
}
}
}else
{
var timeString = ""
sec = sec - 1
if min < 10
{
timeString = timeString + "0" + String(min)
if min == 2 && sec == 0{
SystemSoundID.playFileNamed(fileName: "2 Mins v2", withExtenstion: "m4a")
}
}
else
{
timeString = timeString + String(min)
}
if sec < 10
{
timeString = timeString + ":0" + String(sec)
}
else
{
timeString = timeString + ":" + String(sec)
}
lblTimer.text = "\(timeString) Free Minutes"
lblvideoTimer.text = "\(timeString) Free Minutes"
lblDisableVideoTimer.text = "\(timeString) remaining"
}
}
func prepareLocalMedia() {
if (ProvideCallObject.localAudioTrack == nil) {
ProvideCallObject.localAudioTrack = TVILocalAudioTrack.init(options: nil, enabled: true, name: "Microphone")
if (ProvideCallObject.localAudioTrack == nil) {
print("Failed to create audio track")
}
}
if (ProvideCallObject.localVideoTrack == nil) {
self.startPreview()
}
changeButtonImage(isVideoEnable: true)
}
//Video
func setupRemoteVideoView() {
self.provider_previewViewShadow.frame = CGRect(x: self.view_video.bounds.width - 132, y: self.view_video.bounds.height - 239, width: 112, height: 149)
self.provider_previewView.frame = self.provider_previewViewShadow.bounds
self.provider_remoteView.bringSubview(toFront: self.provider_previewViewShadow)
self.provider_remoteView.isHidden = false
}
// MARK: Private
func startPreview() {
if PlatformUtils.isSimulator {
return
}
camera = TVICameraCapturer(source: .frontCamera, delegate: self)
ProvideCallObject.localVideoTrack = TVILocalVideoTrack.init(capturer: camera!, enabled: true, constraints: nil, name: "Camera")
if (ProvideCallObject.localVideoTrack == nil) {
print("Failed to create video track")
} else {
ProvideCallObject.localVideoTrack!.addRenderer(self.provider_previewView)
let tap = UITapGestureRecognizer(target: self, action: #selector(IncomingController.flipCamera))
self.provider_previewView.addGestureRecognizer(tap)
}
}
@objc func flipCamera() {
if (self.camera?.source == .frontCamera) {
self.camera?.selectSource(.backCameraWide)
} else {
self.camera?.selectSource(.frontCamera)
}
}
func cleanupRemoteParticipant() {
if ((ProvideCallObject.remoteParticipant) != nil) {
if ((ProvideCallObject.remoteParticipant?.videoTracks.count)! > 0) {
let remoteVideoTrack = ProvideCallObject.remoteParticipant?.remoteVideoTracks[0].remoteTrack
remoteVideoTrack?.removeRenderer(self.provider_remoteView!)
self.provider_remoteView?.isHidden = true
}
}
ProvideCallObject.remoteParticipant = nil
}
}
// MARK:- TVIRoomDelegate
extension IncomingController : TVIRoomDelegate {
func callDetails(room_sid:String,callID:String) {
let params = ["room_sid":room_sid,"callId": callID,"isCallEnd":0] as [String : Any]
ApiResponse.onResponsePost(url: Constant.callDetailsTwilio, parms: params as NSDictionary) { (response, error) in
}
}
func didConnect(to room: TVIRoom) {
ProvideCallObject.localParticipant = ProvideCallObject.Provideroom?.localParticipant
ProvideCallObject.localParticipant?.delegate = self
Calltimer = Timer.scheduledTimer(timeInterval: 1.0, target: self, selector: #selector(updateCountDown), userInfo: nil, repeats: true)
NotificationDict["roomsid"] = room.sid
btn_video.isUserInteractionEnabled = true
if (room.remoteParticipants.count > 0) {
ProvideCallObject.remoteParticipant = room.remoteParticipants[0]
ProvideCallObject.remoteParticipant?.delegate = self
self.callDetails(room_sid: room.sid, callID: NotificationDict["callId"] as! String)
}
if !isHeadPhoneAvailabel(){
self.setAudioOutputSpeaker(true)
}
}
func room(_ room: TVIRoom, didDisconnectWithError error: Error?) {
self.cleanupRemoteParticipant()
}
func room(_ room: TVIRoom, didFailToConnectWithError error: Error) {
}
func room(_ room: TVIRoom, participantDidConnect participant: TVIRemoteParticipant) {
if (ProvideCallObject.remoteParticipant == nil) {
ProvideCallObject.remoteParticipant = participant
ProvideCallObject.remoteParticipant?.delegate = self
}
// print("Participant \(participant.identity) connected with \(participant.remoteAudioTracks.count) audio and \(participant.remoteVideoTracks.count) video tracks")
}
func room(_ room: TVIRoom, participantDidDisconnect participant: TVIRemoteParticipant) {
if (ProvideCallObject.remoteParticipant == participant) {
cleanupRemoteParticipant()
}
goToFeedback()
}
}
// MARK: TVIRemoteParticipantDelegate
extension IncomingController : TVILocalParticipantDelegate {
func localParticipant(_ participant: TVILocalParticipant, publishedVideoTrack: TVILocalVideoTrackPublication) {
ProvideCallObject.localVideoTrack = publishedVideoTrack.videoTrack as? TVILocalVideoTrack
}
}
// MARK: TVIRemoteParticipantDelegate
extension IncomingController : TVIRemoteParticipantDelegate {
func subscribed(to videoTrack: TVIRemoteVideoTrack,
publication: TVIRemoteVideoTrackPublication,
for participant: TVIRemoteParticipant) {
if (ProvideCallObject.remoteParticipant == participant) {
setupRemoteVideoView()
videoTrack.addRenderer(self.provider_remoteView!)
}
}
func remoteParticipant(_ participant: TVIRemoteParticipant,
enabledVideoTrack publication: TVIRemoteVideoTrackPublication) {
// print("Participant \(participant.identity) enabled \(publication.trackName) video track")
self.viewDisableVideo.isHidden = true
changeButtonImage(isVideoEnable: true)
}
func remoteParticipant(_ participant: TVIRemoteParticipant,
disabledVideoTrack publication: TVIRemoteVideoTrackPublication) {
self.viewDisableVideo.isHidden = false
changeButtonImage(isVideoEnable: false)
// print("Participant \(participant.identity) disabled \(publication.trackName) video track")
}
}
extension IncomingController : TVICameraCapturerDelegate {
func cameraCapturer(_ capturer: TVICameraCapturer, didStartWith source: TVICameraCaptureSource) {
// Layout the camera preview with dimensions appropriate for our orientation.
self.provider_previewView.shouldMirror = (source == .frontCamera)
}
}
// MARK: TVIVideoViewDelegate
extension IncomingController : TVIVideoViewDelegate {
func videoView(_ view: TVIVideoView, videoDimensionsDidChange dimensions: CMVideoDimensions) {
self.view.setNeedsLayout()
}
}
So, is it possible to switch or change the root view controller in this case?
Thanks
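One thing worth checking: when the phone is locked, UIApplication.shared.keyWindow can be nil, so every chained call on it silently does nothing and the root never changes. A minimal sketch of a safer swap, assuming the standard AppDelegate window property; switchRoot is a hypothetical helper, not part of CallKit:

func switchRoot(to controller: UIViewController) {
    let swap = {
        // keyWindow may be nil while locked/backgrounded; the delegate's window is not.
        guard let window = (UIApplication.shared.delegate as? AppDelegate)?.window
            ?? UIApplication.shared.keyWindow else { return }
        window.rootViewController = UINavigationController(rootViewController: controller)
        window.makeKeyAndVisible()
    }
    if UIApplication.shared.applicationState == .active {
        swap()
    } else {
        // Defer the UI work until the app is foregrounded (e.g. the device is unlocked).
        var token: NSObjectProtocol?
        token = NotificationCenter.default.addObserver(forName: .UIApplicationDidBecomeActive,
                                                       object: nil, queue: .main) { _ in
            swap()
            if let token = token { NotificationCenter.default.removeObserver(token) }
        }
    }
}

You would call switchRoot(to: incomingCall) from the CXAnswerCallAction handler instead of touching keyWindow directly.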
I am currently creating a feed with a table view that plays videos from the internet and then saves them in the cache; this is working fine. The problem is that after I scroll up and down a couple of times to test that it works on other cells, the UIView where I am adding the videos goes black and doesn't play the video (video preview). The weird part is that it works great on the simulator, where this never happens, but it does on my phone. I appreciate your help.
This is the code for my tableViewCell. I check if the post has a video; if so, I cache it from the URL and assign it to the AVPlayer:
import UIKit
import AVFoundation
import Alamofire
import Firebase
import AVKit
class PostCell: UITableViewCell {
@IBOutlet weak var postImageView: UIImageView!
@IBOutlet weak var numberOfLikes: UIButton!
@IBOutlet weak var usernameLbl: UILabel!
@IBOutlet weak var likesBtn: UIButton!
@IBOutlet weak var commentsBubbleBtn: UIButton!
@IBOutlet weak var addCommentBtn: UIButton!
@IBOutlet weak var dotsBtn: UIButton!
@IBOutlet weak var profilePic: UIImageView!
@IBOutlet weak var postCaptionLbl: UILabel!
@IBOutlet weak var timeAgoLbl: UILabel!
@IBOutlet weak var videoView: UIView!
@IBOutlet weak var soundBtn: UIButton!
var userUIDtoSend = ""
var audio = true
var like = Bool()
var postID = ""
var player: AVPlayer!
var playerLayer: AVPlayerLayer?
var playerCache = [String: AVPlayer]()
var videoIsPlaying = false
var url : URL?
func updateUI(postImageView: String, caption: String, type: String, profilePic: String, username: String, postUID: String) {
postCaptionLbl.text = caption
timeAgoLbl.text = "1h ago"
self.postID = postUID
soundBtn.layer.borderWidth = 2.0
soundBtn.layer.cornerRadius = 4.0
soundBtn.layer.borderColor = soundBtn.tintColor.cgColor
soundBtn.layer.masksToBounds = true
self.profilePic.downloadImage(from: profilePic)
self.profilePic.sd_setImage(with: URL(string: profilePic))
self.usernameLbl.text = username
let tapLikes = UITapGestureRecognizer()
tapLikes.addTarget(self, action: #selector(likeBtnWasPressed))
likesBtn.addGestureRecognizer(tapLikes)
likesBtn.isUserInteractionEnabled = true
checkLikes()
if type == "video" {
// print("Video is PLAYING")
self.videoIsPlaying = true
self.soundBtn.isHidden = false
let tapRec = UITapGestureRecognizer()
tapRec.addTarget(self, action: #selector(self.audioToggle))
self.soundBtn.addGestureRecognizer(tapRec)
self.soundBtn.isUserInteractionEnabled = true
self.videoView.isHidden = false
self.postImageView.isHidden = true
CacheManager.shared.getFileWith(stringUrl: postImageView) { result in
switch result {
case .success(let url2):
//self.url = URL(fileURLWithPath: "\(url2)")
self.url = URL(string: "\(url2)")
self.player = AVPlayer(url: self.url!)
//
//self.player?.automaticallyWaitsToMinimizeStalling = true
self.playerLayer = AVPlayerLayer(player: self.player)
self.playerLayer?.frame = self.videoView.bounds
// //self.playerLayer?.videoGravity = AVLayerVideoGravity.resizeAspect
self.player?.play()
self.videoView.layer.addSublayer(self.playerLayer!)
self.player!.isMuted = self.audio
NotificationCenter.default.addObserver(self, selector: #selector(self.playerItemDidReachEnd), name: NSNotification.Name.AVPlayerItemDidPlayToEndTime, object: self.player!.currentItem)
print("\n\n\n\n")
print("error in the player:", self.player.error ?? "no error")
// print("Post tile: ", self.postCaptionLbl.text ?? "")
print("ready for display: " ,self.playerLayer!.isReadyForDisplay)
case .failure(let error):
// handle errror
print("this is the error from video: ", error)
self.url = URL(string: postImageView)!
self.player = AVPlayer(url: self.url!)
self.playerLayer?.videoGravity = AVLayerVideoGravity.resizeAspect
self.player?.automaticallyWaitsToMinimizeStalling = false
self.player?.isMuted = self.audio
self.playerLayer = AVPlayerLayer(player: self.player)
self.playerLayer?.frame = self.videoView.frame
self.videoView.layer.addSublayer(self.playerLayer!)
self.player?.play()
NotificationCenter.default.addObserver(self, selector: #selector(self.playerItemDidReachEnd), name: NSNotification.Name.AVPlayerItemDidPlayToEndTime, object: self.player?.currentItem)
print("this is player cache", self.playerCache)
}
}
} else {
self.videoIsPlaying = false
NotificationCenter.default.removeObserver(self)
// print("DISPLAYING IMAGE")
self.soundBtn.isHidden = true
self.videoView.isHidden = true
self.postImageView.isHidden = false
player?.pause()
self.postImageView.sd_setShowActivityIndicatorView(true)
self.postImageView.sd_setIndicatorStyle(.gray)
self.postImageView.sd_setImage(with: URL(string: postImageView))
}
}
func pauseVideo() {
if self.player != nil {
self.player!.seek(to: kCMTimeZero)
self.player!.pause()
}
}
override func prepareForReuse() {
super.prepareForReuse()
playerLayer?.removeFromSuperlayer()
player?.pause()
}
@objc fileprivate func playerItemDidReachEnd(_ notification: Notification) {
if self.player != nil {
self.player!.seek(to: kCMTimeZero)
self.player!.play()
}
}
@objc func audioToggle(){
print("tapped")
if self.player?.isMuted == true {
self.player?.isMuted = false
self.soundBtn.setTitle("Mute", for: .normal)
} else {
self.player?.isMuted = true
self.soundBtn.setTitle("Sound", for: .normal)
}
}
}
I tried it in a collection view in another view controller and the same thing happens. It seems something is wrong with the AVPlayer when it runs on an actual device.
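Two things often cause exactly this on a device (a sketch of both, under the assumption that stale layers and layout timing are the culprits, not a confirmed diagnosis): the AVPlayerLayer frame is computed in updateUI before the cell has its final size, and reused cells keep old layers and players alive. Keeping the frame in sync and fully tearing down on reuse looks like this in PostCell:

override func layoutSubviews() {
    super.layoutSubviews()
    // The cell's final size isn't known when updateUI runs; resize the layer here.
    playerLayer?.frame = videoView.bounds
}

override func prepareForReuse() {
    super.prepareForReuse()
    // Extend the existing implementation: drop the observer and release the player
    // so a recycled cell never shows a stale, black layer.
    NotificationCenter.default.removeObserver(self)
    player?.pause()
    playerLayer?.removeFromSuperlayer()
    playerLayer = nil
    player = nil
}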
How can I get AVPlayer to work like AVAudioPlayer? What I mean is, I cannot seem to get duration to work at all. Here is what I need to convert to AVPlayer:
import UIKit
import AVKit
import AVFoundation
class ModalViewController: UIViewController {
var audioPlayer = AVAudioPlayer()
let varSend = VarSend.sharedInstance
var timer:NSTimer!
var toggleState = 2
@IBOutlet weak var slider: UISlider!
@IBOutlet weak var sermonImage: UIImageView!
@IBOutlet weak var sermont: UILabel!
@IBOutlet weak var sermond: UILabel!
@IBOutlet weak var sermonE: UILabel!
@IBOutlet weak var sermonL: UILabel!
@IBOutlet weak var play: UIButton!
@IBOutlet var layer: UIView!
override func viewDidLoad() {
super.viewDidLoad()
let url = varSend.url
print("Setting up.")
do {
let data1 = NSData(contentsOfURL: NSURL(string:url)!)
audioPlayer = try AVAudioPlayer(data: data1!)
audioPlayer.prepareToPlay()
audioPlayer.volume = 1.0
audioPlayer.play()
} catch {
print("Error getting the audio file")
}
slider.maximumValue = Float(audioPlayer.duration)
timer = NSTimer.scheduledTimerWithTimeInterval(0.1, target: self, selector: Selector("updateSlider"), userInfo: nil, repeats: true)
slider.setThumbImage(UIImage(named: "circle"), forState: .Normal)
slider.setThumbImage(UIImage(named: "circle"), forState: .Highlighted)
let title = varSend.sermonName
self.sermont.text = title
let date = varSend.sermonDate
self.sermond.text = date
let image = varSend.sermonPic
ImageLoader.sharedLoader.imageForUrl(image, completionHandler:{(image: UIImage?, url: String) in
self.sermonImage.image = image!
})
}
@IBAction func ChangeAudioTime(sender: AnyObject) {
audioPlayer.stop()
audioPlayer.currentTime = NSTimeInterval(slider.value)
audioPlayer.prepareToPlay()
audioPlayer.volume = 1.0
audioPlayer.play()
}
func updateSlider() {
slider.value = Float(audioPlayer.currentTime)
let currentTime = Int(audioPlayer.currentTime)
let minutes = currentTime / 60
let seconds = currentTime - minutes * 60
let con = Int(audioPlayer.duration)
let currentItem = con - currentTime
let minutesU = Int(currentItem / 60)
let secondsU = Int(currentItem % 60)
sermonE.text = NSString(format: "%02d:%02d", minutes,seconds) as String
let timeLeft = NSString(format: "%02d:%02d", minutesU,secondsU) as String
sermonL.text = "-\(timeLeft)"
if currentItem == 1 {
//audioPlayer.pause()
toggleState = 1
print(toggleState)
play.setImage(UIImage(named:"play.png"),forState:UIControlState.Normal)
}
}
@IBAction func playPauseButton(sender: AnyObject) {
let playBtn = sender as! UIButton
if toggleState == 1 {
audioPlayer.play()
toggleState = 2
playBtn.setImage(UIImage(named:"pause.png"),forState:UIControlState.Normal)
} else {
audioPlayer.pause()
toggleState = 1
playBtn.setImage(UIImage(named:"play.png"),forState:UIControlState.Normal)
}
}
@IBAction func play(sender: AnyObject) {
audioPlayer.play()
}
@IBAction func pause(sender: AnyObject) {
audioPlayer.pause()
}
override func didReceiveMemoryWarning() {
super.didReceiveMemoryWarning()
}
@IBAction func close(sender: AnyObject) {
self.dismissViewControllerAnimated(true, completion: nil)
audioPlayer.stop()
}
}
And here is what I have tried already:
import UIKit
import Alamofire
import SwiftyJSON
import AVKit
import AVFoundation
import CoreMedia
class test: UIViewController {
var audioPlayer:AVPlayer!
var playerItem:AVPlayerItem!
var timer:NSTimer!
@IBOutlet weak var sermonE: UILabel!
@IBOutlet weak var sermonL: UILabel!
@IBOutlet weak var slider: UISlider!
override func viewDidLoad() {
super.viewDidLoad()
//let playerItem:AVPlayerItem!;
let playerURL = "example.com"
let steamingURL:NSURL = NSURL(string:playerURL)!
audioPlayer = AVPlayer(URL: steamingURL)
timer = NSTimer.scheduledTimerWithTimeInterval(0.1, target: self, selector: Selector("updateSlider"), userInfo: nil, repeats: true)
}
func updateSlider() {
let item = audioPlayer?.currentItem
let durationInSeconds = CMTimeGetSeconds(item!.duration)
print(durationInSeconds)
}
@IBAction func ChangeAudioTime(sender: AnyObject) {
}
override func didReceiveMemoryWarning() {
super.didReceiveMemoryWarning()
// Dispose of any resources that can be recreated.
}
@IBAction func Play(sender: AnyObject) {
audioPlayer.play()
}
}
I have been searching for days and Apple's docs are very hard to make out.
I even tried
self.player.currentItem.asset.duration
from: How to get Duration from AVPlayer (Not AVAudioPlayer)?
Any help would be much appreciated.
Swift 2.0
From the documentation for loadedTimeRanges: "The array contains NSValue objects containing a CMTimeRange value indicating the time ranges for which the player item has media data readily available. The time ranges returned may be discontinuous."
So I call myplayer.getCurrentTrackDuration every second, and I noticed that when streaming I get the correct final duration after 3-4 seconds.
extension AVPlayer {
//Run this every 1 second of streaming (or use KVO).
//In an HTTP stream the duration is going to keep increasing, and will probably finalize near 7% of the total duration of the song.
func getCurrentTrackDuration () -> Float64 {
guard let currentItem = self.currentItem else { return 0.0 }
guard currentItem.loadedTimeRanges.count > 0 else { return 0.0 }
let timeInSecond = CMTimeGetSeconds((currentItem.loadedTimeRanges[0].CMTimeRangeValue).duration);
return timeInSecond >= 0.0 ? timeInSecond : 0.0
}
}
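A sketch of how the extension might be polled from the question's test controller (Swift 2 style to match the code above; slider and sermonE are the question's own outlets):

func updateSlider() {
    let duration = audioPlayer.getCurrentTrackDuration()
    guard duration > 0 else { return } // nothing buffered yet
    slider.maximumValue = Float(duration)
    let seconds = CMTimeGetSeconds(audioPlayer.currentTime())
    slider.value = Float(seconds)
    sermonE.text = NSString(format: "%02d:%02d", Int(seconds) / 60, Int(seconds) % 60) as String
}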