I need to create an audio player for a streamed URL (m3u8 format). I have created a music player using AVPlayer, but I need to show a visualizer for the streamed song. I have tried different solutions but haven't found any working example.
I have created a visualizer using AVAudioPlayer (averagePower), but it doesn't support streamed URLs.
Any help showing a visualizer for AVPlayer? Thanks in advance.
I have also tried using MYAudioTapProcessor, which most people suggest, but for a streamed URL the tracks array always comes back empty.
I added MYAudioTapProcessor.h and MYAudioTapProcessor.m to the project.
//Initialization of player
let playerItem = AVPlayerItem(url: URL(string: "https://bitdash-a.akamaihd.net/content/sintel/hls/playlist.m3u8")!)
let audioPlayer: AVPlayer = AVPlayer(playerItem: playerItem)

//Added periodic time observer
audioPlayer.addPeriodicTimeObserver(forInterval: CMTimeMakeWithSeconds(1, 1), queue: DispatchQueue.main) { _ in
    if audioPlayer.currentItem?.status == .readyToPlay {
        if let playerItem: AVPlayerItem = audioPlayer.currentItem {
            print(playerItem.asset.tracks.count)
            if !playerItem.asset.tracks.isEmpty {
                self.tapProcessor = MYAudioTapProcessor(avPlayerItem: playerItem)
                playerItem.audioMix = self.tapProcessor.audioMix
                self.tapProcessor.delegate = self
            }
        }
    }
}
//Delegate callback method for MYAudioTapProcessor
func audioTabProcessor(_ audioTabProcessor: MYAudioTapProcessor!, hasNewLeftChannelValue leftChannelValue: Float, rightChannelValue: Float) {
    print("volume: \(leftChannelValue) : \(rightChannelValue)")
    volumeSlider.value = leftChannelValue
}
I also tried adding a "tracks" observer:
playerItem.addObserver(self, forKeyPath: "tracks", options: .new, context: nil)
Now if I play an mp3 file, the callback method is called, but for m3u8 it is never called. The reason the m3u8 URL fails is that its tracks array count is always zero, whereas for mp3 files the tracks array has one item.
You cannot get tracks for HLS via AVPlayer. You should use progressive download or a local file to get audio tracks while playing media.
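If you do fall back to a local or progressively downloaded file, you don't even need the tap processor for level metering: AVAudioEngine can play the file and expose the samples via a tap. This is only a minimal sketch of that local-file approach; localFileURL is a placeholder and the RMS-to-dB conversion is simplified:

import AVFoundation

let engine = AVAudioEngine()
let playerNode = AVAudioPlayerNode()

func playWithMetering(localFileURL: URL) throws {
    let file = try AVAudioFile(forReading: localFileURL)
    engine.attach(playerNode)
    engine.connect(playerNode, to: engine.mainMixerNode, format: file.processingFormat)

    // Tap the mixer output and derive an averagePower-style level per buffer
    engine.mainMixerNode.installTap(onBus: 0, bufferSize: 1024, format: nil) { buffer, _ in
        guard let samples = buffer.floatChannelData?[0] else { return }
        let frames = Int(buffer.frameLength)
        var sum: Float = 0
        for i in 0..<frames { sum += samples[i] * samples[i] }
        let rms = sqrt(sum / Float(max(frames, 1)))
        let decibels = 20 * log10(max(rms, .leastNonzeroMagnitude))
        DispatchQueue.main.async {
            print("level: \(decibels) dB") // drive your visualizer here
        }
    }

    try engine.start()
    playerNode.scheduleFile(file, at: nil, completionHandler: nil)
    playerNode.play()
}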
I use AVPlayer to play a simple MP4 video. As I need to know when the video finishes, I add an observer for the notification AVPlayerItemDidPlayToEndTime.
This works fine if the video is played on the device, but the notification is never posted if we stream the video from a device to an Apple TV (4th gen) with tvOS 12.2 via AirPlay.
Strangely, this problem only occurs with MP4 videos, not with HLS streams (m3u8), which makes me wonder if this might be a tvOS (or AirPlay) bug.
I created a small POC to isolate the problem, and this is what I do:
let bunny = "https://www.sample-videos.com/video123/mp4/720/big_buck_bunny_720p_1mb.mp4"

// create a new player with every call to startVideo() for demo purposes
func startVideo() {
    // remove layer of current player from playerContainer as
    // a new one will be created below
    if let sublayers = playerContainer.layer.sublayers {
        for layer in sublayers {
            layer.removeFromSuperlayer()
        }
    }

    if let videoUrl = URL(string: bunny) {
        // create player with sample video URL
        self.player = AVPlayer(url: videoUrl)
        guard let player = self.player else { return }

        // add player layer to playerContainer
        let playerLayer = AVPlayerLayer(player: player)
        playerLayer.frame = playerContainer.bounds
        playerLayer.videoGravity = .resizeAspectFill
        playerContainer.layer.addSublayer(playerLayer)

        // add block-based observer to listen for AVPlayerItemDidPlayToEndTime event
        let _ = NotificationCenter.default.addObserver(forName: NSNotification.Name.AVPlayerItemDidPlayToEndTime,
                                                       object: player.currentItem,
                                                       queue: .main) { notification in
            print("Item did finish playing! \(notification)")
        }

        player.play()
    }
}
Just to clarify again: this works fine if one of the following conditions is true:
The video is played on the device, and no AirPlay is involved.
The video is streamed to an AppleTV 3rd gen.
An HLS instead of an MP4 video is used.
I also tried the following things, but the result was the same:
Not setting currentItem as the object when adding the observer.
Not specifying the queue when adding the observer.
Using the selector-based instead of the block-based method to add the observer (see the sketch below).
Any help is appreciated. This might be a duplicate of this question, but as there are no answers there, and the linked question is about HLS rather than MP4, I decided to post this question anyway.
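For reference, here is a minimal sketch of the selector-based variant from the list above (the handler name is illustrative); it behaved the same way for me:

NotificationCenter.default.addObserver(self,
                                       selector: #selector(itemDidFinishPlaying(_:)),
                                       name: .AVPlayerItemDidPlayToEndTime,
                                       object: player.currentItem)

@objc func itemDidFinishPlaying(_ notification: Notification) {
    print("Item did finish playing! \(notification)")
}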
I'm trying to get my audio file's duration to display in my app, but my code always returns 0. I hope there is a method that notifies me when the AVPlayer has data from the file, so I can get the duration after that. Any suggestions?
func loadAudioUrl() {
    guard let url = URL(string: sampleShortAudioUrl) else { return }
    audioPlayer = AVPlayer(url: url)
    audioPlayer?.play()
    if let duration = audioPlayer?.currentItem?.duration {
        print(duration)
    }
}
You can get the duration, but you need to wait because the content is still loading; your code assumes it is loaded instantly.
You need to use AVPlayerItem with AVPlayer.
When the AVPlayerItem status is ready to play, you can ask for the duration. A complete code example is available right from Apple here:
AVPlayerItem
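As a minimal sketch of that idea (reusing sampleShortAudioUrl from the question; error handling kept deliberately simple), you can load the asset's duration asynchronously and read it once it is available:

func loadAudioUrl() {
    guard let url = URL(string: sampleShortAudioUrl) else { return }
    let asset = AVURLAsset(url: url)

    // ask the asset to load its duration, then read it when it's ready
    asset.loadValuesAsynchronously(forKeys: ["duration"]) {
        var error: NSError?
        let status = asset.statusOfValue(forKey: "duration", error: &error)
        DispatchQueue.main.async {
            if status == .loaded {
                print("duration: \(CMTimeGetSeconds(asset.duration)) seconds")
            } else {
                print("could not load duration: \(String(describing: error))")
            }
        }
    }

    audioPlayer = AVPlayer(playerItem: AVPlayerItem(asset: asset))
    audioPlayer?.play()
}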
You can't get the duration from a remote audio URL; you must store its duration remotely in your database and grab it while listening or downloading.
I'm working on implementing a video player in Swift that will detect if a video has stopped playing, and then play the second one. When the second one has stopped playing, the first video should play again.
Here's where I set up the player, assets, and player items:
//Create URLs
let movieOneURL: URL = URL(fileURLWithPath: movieOnePath)
let movieTwoURL: URL = URL(fileURLWithPath: movieTwoPath)
//Create Assets
let assetOne = AVAsset(url: movieOneURL)
let assetTwo = AVAsset(url: movieTwoURL)
//Create Player Items
avPlayerItemOne = AVPlayerItem(asset: assetOne)
avPlayerItemTwo = AVPlayerItem(asset: assetTwo)
avplayer = AVPlayer(playerItem: avPlayerItemOne)
let avPlayerLayer = AVPlayerLayer(player: avplayer)
avPlayerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
avPlayerLayer.frame = UIScreen.main.bounds
movieView.layer.addSublayer(avPlayerLayer)
//Config player
avplayer.seek(to: kCMTimeZero)
avplayer.volume = 0.0
And here's where I set up a notification to detect if the player reached the end of the video file:
NotificationCenter.default.addObserver(self, selector: #selector(self.playerItemDidReachEnd), name: NSNotification.Name.AVPlayerItemDidPlayToEndTime, object: avplayer.currentItem)
...which calls this selector:
func playerItemDidReachEnd(_ notification: Notification) {
    // avplayer.seek(to: kCMTimeZero)
    changePlayerAsset()
    // avplayer.play()
}
...which will then switch out the asset:
func changePlayerAsset() {
    if avplayer.currentItem == avPlayerItemOne {
        avplayer.replaceCurrentItem(with: avPlayerItemTwo)
        avplayer.play()
    } else if avplayer.currentItem == avPlayerItemTwo {
        avplayer.replaceCurrentItem(with: avPlayerItemOne)
        avplayer.play()
    }
}
This works perfectly the first time through - when the first movie has finished playing, the next one will then start playing.
The problem I'm having is that my notification observer only seems to fire once, at the end of the first video; the notification isn't posted at all when the second video stops playing.
Anyone have an idea why that would be the case?
The reason your notification handler isn’t getting called for the second item is this bit, at the end of where you register the notification handler: object: avplayer.currentItem. The handler gets called once, when that item finishes playing, but then when the next item finishes, the notification gets posted with a different object—the other item—which doesn’t match what you registered for, and so your handler doesn’t get called. If you change object to nil when you register the handler, it’ll get called when any item finishes, which is closer to what you’re after.
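Concretely, the registration from the question becomes:

NotificationCenter.default.addObserver(self,
                                       selector: #selector(self.playerItemDidReachEnd),
                                       name: NSNotification.Name.AVPlayerItemDidPlayToEndTime,
                                       object: nil) // nil: fire when any item reaches its end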
That said, this isn’t a great way to do what you want—manually swapping out items is likely to incur the cost and delay of loading each item each time it’s about to play. You’d be much better off using the built-in functionality for playing videos in sequence and looping them, namely AVQueuePlayer and AVPlayerLooper. There’s an example of how to use both in the answer to this question.
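Note that AVPlayerLooper is designed around looping a single template item; for alternating between two items, one sketch (untested, reusing avPlayerItemOne and avPlayerItemTwo from the question) is to let an AVQueuePlayer advance through the queue and re-enqueue each item as it finishes:

let queuePlayer = AVQueuePlayer(items: [avPlayerItemOne, avPlayerItemTwo])

// object is nil, so this fires for any item that finishes
NotificationCenter.default.addObserver(forName: .AVPlayerItemDidPlayToEndTime,
                                       object: nil,
                                       queue: .main) { notification in
    guard let finishedItem = notification.object as? AVPlayerItem else { return }
    // rewind the finished item and put it back at the end of the queue,
    // so the two videos keep alternating indefinitely
    finishedItem.seek(to: kCMTimeZero)
    if queuePlayer.canInsert(finishedItem, after: nil) {
        queuePlayer.insert(finishedItem, after: nil)
    }
}

queuePlayer.play()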
I'm trying to download a file using Alamofire, and I've done that with progress reporting. What I'm trying to do now is build a kind of local stream from a video URL.
Alamofire stores the downloaded file at a destination:
let destination = Alamofire.Request.suggestedDownloadDestination(directory: .DocumentDirectory, domain: .LocalDomainMask)
This destination is not valid for playback in my case, e.g.:
if totalBytesRead >= 51831 {
    let player = AVPlayer(URL: NSURL(fileURLWithPath: "\(destination)"))
    let playerController = AVPlayerViewController()
    playerController.player = player
    self.presentViewController(playerController, animated: true) {
        player.play()
    }
}
Is there another way to do this ? OR can I store downloaded bytes to an array/buffer ?
Thanks
You don't need Alamofire in order to stream content, which in your case is a video.
You can actually utilize existing Apple classes such as:
AVPlayerItem
AVPlayerLayer
AVPlayer
Using those 3 classes you can create your own class like VideoPlayer that will use them for streaming.
Here's a code snippet for the constructor to get you started:
playerItem = AVPlayerItem(URL: videoURL)
player = AVPlayer(playerItem: self.playerItem!)
playerLayer = AVPlayerLayer(player: self.player)
playerLayer?.frame = newFrame
playerLayer?.videoGravity = AVLayerVideoGravityResizeAspect
Now, you can use this class inside another View Controller:
self.containerView.layer.addSublayer(self.videoPlayer.playerLayer)
This will add the video player class you created and now you only need to play the video:
// Of type AVPlayer
player.play()
That was the easy part. If you want, you will have to implement your own buffering & rate handling, and of course implement observers to watch the rate and buffer:
// Observe the Rate of the player. when the video playing or paused
player.addObserver(theSelf, forKeyPath: "rate", options: .New, context: nil)
// Observe when the Video player Buffer is empty
playerItem.addObserver(theSelf, forKeyPath: "playbackBufferEmpty", options: .New, context: nil)
// Observe the loaded Buffer of the video
playerItem.addObserver(theSelf, forKeyPath: "loadedTimeRanges", options: NSKeyValueObservingOptions.New, context: nil)
Also, don't forget to remove them when you are done with them.
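For reference, a minimal sketch of handling those KVO callbacks (written in Swift 3 syntax, unlike the Swift 2 snippet above; the reactions are placeholders):

// inside the class that added the observers above
override func observeValue(forKeyPath keyPath: String?, of object: Any?,
                           change: [NSKeyValueChangeKey: Any]?,
                           context: UnsafeMutableRawPointer?) {
    switch keyPath {
    case "rate"?:
        // rate == 0 means paused or stalled, > 0 means playing
        let rate = (object as? AVPlayer)?.rate ?? 0
        print(rate > 0 ? "playing" : "paused")
    case "playbackBufferEmpty"?:
        // the buffer ran dry: a good moment to show a spinner
        print("buffer empty, waiting for data")
    case "loadedTimeRanges"?:
        // inspect how much is buffered ahead of the playhead
        if let item = object as? AVPlayerItem,
           let range = item.loadedTimeRanges.first?.timeRangeValue {
            print("buffered up to \(CMTimeGetSeconds(CMTimeRangeGetEnd(range))) s")
        }
    default:
        super.observeValue(forKeyPath: keyPath, of: object, change: change, context: context)
    }
}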
EDIT:
As asked: if you need to switch to a better quality at run time, you will first need to measure the user's internet speed. Once you have solved that, you can stream the high-quality video by, for example, adding another VideoPlayer layer on top of the current one, but only do it once the new high-quality video's buffer holds at least 5 seconds or so, so the user won't notice the change.
Good luck.
I'm trying to allow users to cycle through videos, changing the AVPlayer URL on the fly without refreshing the view. However, right now I'm instantiating a new AVPlayer object every time a video is played (resulting in audio being played over one another), which I feel isn't the best way to do this. Is there a more efficient way, similar to changing the image in an imageView?
This is the code where I play the clip:
player = AVPlayer(URL: fileURL)
playerLayer = AVPlayerLayer(player: player)
playerLayer!.frame = self.view.bounds
self.view.layer.addSublayer(playerLayer!)
player!.play()
Do not use an AVPlayer.
Instead use an AVQueuePlayer, which allows you to insert and remove items from a queue.
//create local player in setup methods
self.localPlayer = AVQueuePlayer()
To add items, you can simply use:
//optional: clear current queue if playing straight away
self.localPlayer.removeAllItems()

//get url of track
let url: URL? = URL(string: "http://urlOfItem")

if url != nil {
    let playerItem = AVPlayerItem(url: url!)

    //you can use the after property to insert
    //it at a specific location or leave it nil
    self.localPlayer.insert(playerItem, after: nil)
    self.localPlayer.play()
}
AVQueuePlayer supports all of the functionality of the AVPlayer but has the added functionality of adding and removing items from a queue.
Use AVPlayerItem to add and remove outputs on an AVPlayer object.
Instead of adding a video to the AVPlayer when you create it, create an empty AVPlayer instance, and then use the addOutput method of the AVPlayerItem class to add the video.
To remove the video and add a new one, use the removeOutput method of the AVPlayerItem class to remove the old video, and then the addOutput method again to insert the new one.
Sample code is available from Apple's developer site at:
https://developer.apple.com/library/prerelease/content/samplecode/AVBasicVideoOutput/Introduction/Intro.html
It provides the same thing I would, were I to post code of my own.
Create an AVPlayer instance globally, then overwrite it when you want to play a new video from a new URL.
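Along those lines, you can keep one player and one layer alive and just swap the current item; a minimal sketch inside a view controller (the helper names are illustrative):

// one player and one layer, created once and reused
let player = AVPlayer()
var playerLayer: AVPlayerLayer?

func setupPlayerLayerIfNeeded() {
    guard playerLayer == nil else { return }
    let layer = AVPlayerLayer(player: player)
    layer.frame = view.bounds
    view.layer.addSublayer(layer)
    playerLayer = layer
}

// call this whenever the user cycles to another video;
// reusing the player means no overlapping audio
func play(url: URL) {
    setupPlayerLayerIfNeeded()
    player.replaceCurrentItem(with: AVPlayerItem(url: url))
    player.play()
}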
I am able to accomplish what you are looking for by doing this...
I have a tableView of song names, for which the mp3 files are stored on Parse.com. In didSelectRowAtIndexPath I do...
override func tableView(tableView: UITableView, didSelectRowAtIndexPath indexPath: NSIndexPath) {
    SelectedSongNumber = indexPath.row
    grabSong()
}

func grabSong() {
    let songQuery = PFQuery(className: "Songs")
    songQuery.getObjectInBackgroundWithId(iDArray[SelectedSongNumber], block: {
        (object: PFObject?, error: NSError?) -> Void in
        if let audioFile = object?["SongFile"] as? PFFile {
            let audioFileUrlString: String = audioFile.url!
            let audioFileUrl = NSURL(string: audioFileUrlString)!
            myAVPlayer = AVPlayer(URL: audioFileUrl)
            myAVPlayer.play()
            currentUser?.setObject(audioFileUrlString, forKey: "CurrentSongURL")
            currentUser?.saveInBackground()
        }
    })
}
When I run this, I select a row and the song starts playing. If I then wait a few seconds and select a different row, the AVPlayer plays the song from the newly selected cell and does NOT play one song over the other. My AVPlayer is declared as a public variable for all classes to see.