Is it possible to play a video file while it is downloading with Alamofire?

I am trying to download a file using Alamofire, and I have that working with progress reporting. What I am trying to do now is build a kind of local stream from a video URL.
Alamofire stores the downloaded file at a destination:
let destination = Alamofire.Request.suggestedDownloadDestination(directory: .DocumentDirectory, domain: .LocalDomainMask)
This destination is not playable in my case, e.g.:
if totalBytesRead >= 51831 {
    let player = AVPlayer(URL: NSURL(fileURLWithPath: "\(destination)"))
    let playerController = AVPlayerViewController()
    playerController.player = player
    self.presentViewController(playerController, animated: true) {
        player.play()
    }
}
Is there another way to do this? Or can I store the downloaded bytes in an array/buffer?
Thanks

You don't need Alamofire in order to stream content, which in your case is a video.
You can actually utilize Apple's existing classes, such as:
AVPlayerItem
AVPlayerLayer
AVPlayer
Using those three classes you can create your own class, say VideoPlayer, that uses them for streaming.
Here's a code snippet for the initializer to get you started:
playerItem = AVPlayerItem(URL: videoURL)
player = AVPlayer(playerItem: self.playerItem!)
playerLayer = AVPlayerLayer(player: self.player)
playerLayer?.frame = newFrame
playerLayer?.videoGravity = AVLayerVideoGravityResizeAspect
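To make that concrete, here is a fuller sketch of such a VideoPlayer class in current Swift syntax (the class name and the frame parameter are placeholders of this answer, not an Apple API):

import UIKit
import AVFoundation

/// Thin wrapper around AVPlayerItem / AVPlayer / AVPlayerLayer for streaming.
final class VideoPlayer {
    let playerItem: AVPlayerItem
    let player: AVPlayer
    let playerLayer: AVPlayerLayer

    init(videoURL: URL, frame: CGRect) {
        playerItem = AVPlayerItem(url: videoURL)
        player = AVPlayer(playerItem: playerItem)
        playerLayer = AVPlayerLayer(player: player)
        playerLayer.frame = frame
        playerLayer.videoGravity = .resizeAspect
    }

    func play() { player.play() }
    func pause() { player.pause() }
}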
Now, you can use this class inside another View Controller:
self.containerView.layer.addSublayer(self.videoPlayer.playerLayer)
This adds the layer of the video player class you created; now you only need to play the video:
// Of type AVPlayer
player.play()
That was the easy part. If you want more control, you will have to implement your own buffering and rate handling, and of course add observers to watch the rate and buffer:
// Observe the Rate of the player. when the video playing or paused
player.addObserver(theSelf, forKeyPath: "rate", options: .New, context: nil)
// Observe when the Video player Buffer is empty
playerItem.addObserver(theSelf, forKeyPath: "playbackBufferEmpty", options: .New, context: nil)
// Observe the loaded Buffer of the video
playerItem.addObserver(theSelf, forKeyPath: "loadedTimeRanges", options: NSKeyValueObservingOptions.New, context: nil)
Also, don't forget to remove them when you are done with them.
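For completeness, a minimal sketch of handling and removing those observers, assuming they live in an NSObject subclass (like the VideoPlayer above) that holds player and playerItem:

override func observeValue(forKeyPath keyPath: String?, of object: Any?,
                           change: [NSKeyValueChangeKey: Any]?,
                           context: UnsafeMutableRawPointer?) {
    switch keyPath {
    case "rate"?:
        // rate == 0 means paused or stalled; anything else means playing
        print("rate is now \(player.rate)")
    case "playbackBufferEmpty"?:
        // The buffer ran dry; show a spinner, drop quality, etc.
        print("playback buffer is empty")
    case "loadedTimeRanges"?:
        // Inspect playerItem.loadedTimeRanges to see how much is buffered
        print("loaded time ranges changed")
    default:
        super.observeValue(forKeyPath: keyPath, of: object, change: change, context: context)
    }
}

deinit {
    player.removeObserver(self, forKeyPath: "rate")
    playerItem.removeObserver(self, forKeyPath: "playbackBufferEmpty")
    playerItem.removeObserver(self, forKeyPath: "loadedTimeRanges")
}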
EDIT:
As asked: if you need to switch to a better quality at run time, you will first need to measure the user's internet speed. Once you have solved that, you can stream the high-quality video by, for example, adding another VideoPlayer layer on top of the current one, but only once the new high-quality video has at least 5 seconds or so buffered, so the user won't notice the change.
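As a rough sketch of that buffer check, the buffered duration can be derived from loadedTimeRanges (highQualityItem, highQualityPlayer and containerView are placeholder names):

func bufferedSeconds(of item: AVPlayerItem) -> Double {
    // loadedTimeRanges is an array of NSValue-wrapped CMTimeRange values;
    // the end of the first range tells us how far from time zero we have data
    guard let range = item.loadedTimeRanges.first?.timeRangeValue else { return 0 }
    return CMTimeGetSeconds(CMTimeRangeGetEnd(range))
}

// Only swap in the high-quality layer once at least ~5 seconds are buffered
if bufferedSeconds(of: highQualityItem) >= 5 {
    containerView.layer.addSublayer(highQualityPlayer.playerLayer)
}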
Good luck.

Related

Notification AVPlayerItemDidPlayToEndTime not fired when streaming via AirPlay

I use AVPlayer to play a simple MP4 video. As I need to know when the video finishes, I add an observer for the notification AVPlayerItemDidPlayToEndTime.
This works fine if the video is played on the device, but the notification is never posted if we stream the video from a device to an Apple TV (4th gen) with tvOS 12.2 via AirPlay.
Strangely, this problem only occurs with MP4 videos, not with HLS streams (m3u8), which makes me wonder if this might be a tvOS (or AirPlay) bug.
I created a small POC to isolate the problem, and this is what I do:
let bunny = "https://www.sample-videos.com/video123/mp4/720/big_buck_bunny_720p_1mb.mp4"

// create a new player with every call to startVideo() for demo purposes
func startVideo() {
    // remove layer of current player from playerContainer as
    // a new one will be created below
    if let sublayers = playerContainer.layer.sublayers {
        for layer in sublayers {
            layer.removeFromSuperlayer()
        }
    }
    if let videoUrl = URL(string: bunny) {
        // create player with sample video URL
        self.player = AVPlayer(url: videoUrl)
        guard let player = self.player else { return }
        // add player layer to playerContainer
        let playerLayer = AVPlayerLayer(player: player)
        playerLayer.frame = playerContainer.bounds
        playerLayer.videoGravity = .resizeAspectFill
        playerContainer.layer.addSublayer(playerLayer)
        // add block-based observer to listen for AVPlayerItemDidPlayToEndTime event
        let _ = NotificationCenter.default.addObserver(forName: NSNotification.Name.AVPlayerItemDidPlayToEndTime,
                                                       object: player.currentItem,
                                                       queue: .main) { notification in
            print("Item did finish playing! \(notification)")
        }
        player.play()
    }
}
Just to clarify again, this works fine if one of the following conditions is true:
The video is played on the device, and no AirPlay is involved.
The video is streamed to an AppleTV 3rd gen.
An HLS instead of an MP4 video is used.
I also tried the following things, but the result was the same:
Don't set currentItem as the object when adding the observer.
Don't specify the queue when adding the observer.
Use the selector-based instead of the block-based method to add the observer (see the snippet below).
Any help is appreciated. This might be a duplicate of this question, but as there are no answers there, and the linked question is about HLS and not MP4, I decided to post this question anyway.
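For reference, the selector-based variant I tried looks like this (itemDidFinishPlaying is just my handler name):

NotificationCenter.default.addObserver(self,
                                       selector: #selector(itemDidFinishPlaying(_:)),
                                       name: NSNotification.Name.AVPlayerItemDidPlayToEndTime,
                                       object: player.currentItem)

@objc func itemDidFinishPlaying(_ notification: Notification) {
    print("Item did finish playing! \(notification)")
}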

How to change the playing speed of AVPlayerItem in Swift?

I am using a library named Jukebox to play audio files, and I want to make it play faster or slower.
It inherits from AVPlayerItem, but I can't find out how.
Can anyone help me?
I think you can use the AVPlayer rate property:
var rate: Float { get set }
For instance:
let playerItem = AVPlayerItem(url: yourUrl)
let player = AVPlayer(playerItem: playerItem)
// This will make playback faster or slower accordingly
player.rate = Float(rateValue)
Note that neither initializer is failable, so no if let is needed. Also, setting rate on a paused player starts playback, and a rate of 0.0 is equivalent to pause().

AVPlayer with Visualizer

I need to create an audio player for a streamed URL (m3u8 format). I have created a music player using AVPlayer, but I need to show a visualizer for the streamed song. I have tried different solutions but haven't found a working example.
I created a visualizer using AVAudioPlayer (averagePower), but it doesn't support streamed URLs.
Any help showing a visualizer for AVPlayer? Thanks in advance.
I have also tried MYAudioTapProcessor, which most people suggest, but for a streamed URL the tracks always come back empty.
I added MYAudioTapProcessor.h and MYAudioTapProcessor.m to the project.
//Initialization of player
let playerItem = AVPlayerItem(url: URL(string: "https://bitdash-a.akamaihd.net/content/sintel/hls/playlist.m3u8")!)
let audioPlayer: AVPlayer = AVPlayer(playerItem: playerItem)

//Added periodic time observer
let _ = audioPlayer.addPeriodicTimeObserver(forInterval: CMTimeMakeWithSeconds(1, preferredTimescale: 1), queue: DispatchQueue.main) { _ in
    if audioPlayer.currentItem?.status == .readyToPlay {
        if let playerItem: AVPlayerItem = audioPlayer.currentItem {
            print(playerItem.asset.tracks.count)
            if !playerItem.asset.tracks.isEmpty {
                self.tapProcessor = MYAudioTapProcessor(avPlayerItem: playerItem)
                playerItem.audioMix = self.tapProcessor.audioMix
                self.tapProcessor.delegate = self
            }
        }
    }
}
//Delegate callback method for MYAudioTapProcessor
func audioTabProcessor(_ audioTabProcessor: MYAudioTapProcessor!, hasNewLeftChannelValue leftChannelValue: Float, rightChannelValue: Float) {
    print("volume: \(leftChannelValue) : \(rightChannelValue)")
    volumeSlider.value = leftChannelValue
}
I also tried adding an observer for the "tracks" key path:
playerItem.addObserver(self, forKeyPath: "tracks", options: NSKeyValueObservingOptions.new, context: nil)
Now if I play an mp3 file, the callback method is called, but for an m3u8 stream it never is. The reason the m3u8 URL fails is that its tracks array count is always zero, whereas for mp3 files the tracks array has one item.
You cannot get tracks for HLS via AVPlayer. You should use progressive download or a local file to get at the audio tracks while playing media, as sketched below.
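A minimal sketch of the local-file route, assuming the media has already been downloaded to a local fileURL, reusing the tapProcessor property from the question, and assuming an optional player property to retain the player:

let asset = AVURLAsset(url: fileURL)
// For a local (or progressively downloaded) file, the tracks become
// available once loaded, unlike with an HLS stream.
asset.loadValuesAsynchronously(forKeys: ["tracks"]) {
    DispatchQueue.main.async {
        guard asset.statusOfValue(forKey: "tracks", error: nil) == .loaded,
              !asset.tracks.isEmpty else { return }
        let item = AVPlayerItem(asset: asset)
        self.tapProcessor = MYAudioTapProcessor(avPlayerItem: item)
        item.audioMix = self.tapProcessor.audioMix
        self.tapProcessor.delegate = self
        self.player = AVPlayer(playerItem: item)
        self.player?.play()
    }
}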

Using NotificationCenter to alternate video files in AVPlayer

I'm working on implementing a video player in Swift that will detect if a video has stopped playing, and then play the second one. When the second one has stopped playing, the first video should play again.
Here's where I set up the player, assets, and player items:
//Create URLs
let movieOneURL: URL = URL(fileURLWithPath: movieOnePath)
let movieTwoURL: URL = URL(fileURLWithPath: movieTwoPath)
//Create Assets
let assetOne = AVAsset(url: movieOneURL)
let assetTwo = AVAsset(url: movieTwoURL)
//Create Player Items
avPlayerItemOne = AVPlayerItem(asset: assetOne)
avPlayerItemTwo = AVPlayerItem(asset: assetTwo)
avplayer = AVPlayer(playerItem: avPlayerItemOne)
let avPlayerLayer = AVPlayerLayer(player: avplayer)
avPlayerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
avPlayerLayer.frame = UIScreen.main.bounds
movieView.layer.addSublayer(avPlayerLayer)
//Config player
avplayer.seek(to: kCMTimeZero)
avplayer.volume = 0.0
And here's where I set up a notification to detect if the player reached the end of the video file:
NotificationCenter.default.addObserver(self, selector: #selector(self.playerItemDidReachEnd), name: NSNotification.Name.AVPlayerItemDidPlayToEndTime, object: avplayer.currentItem)
...which calls this selector:
func playerItemDidReachEnd(_ notification: Notification) {
    // avplayer.seek(to: kCMTimeZero)
    changePlayerAsset()
    // avplayer.play()
}
...which will then switch out the asset:
func changePlayerAsset() {
    if avplayer.currentItem == avPlayerItemOne {
        avplayer.replaceCurrentItem(with: avPlayerItemTwo)
        avplayer.play()
    } else if avplayer.currentItem == avPlayerItemTwo {
        avplayer.replaceCurrentItem(with: avPlayerItemOne)
        avplayer.play()
    }
}
This works perfectly the first time through - when the first movie has finished playing, the next one will then start playing.
The problem I'm having is that my notification observer only seems to fire once, at the end of the first video; the notification isn't fired at all when the second video stops playing.
Does anyone have an idea why that would be the case?
The reason your notification handler isn’t getting called for the second item is this bit, at the end of where you register the notification handler: object: avplayer.currentItem. The handler gets called once, when that item finishes playing, but then when the next item finishes, the notification gets posted with a different object—the other item—which doesn’t match what you registered for, and so your handler doesn’t get called. If you change object to nil when you register the handler, it’ll get called when any item finishes, which is closer to what you’re after.
That said, this isn’t a great way to do what you want—manually swapping out items is likely to incur the cost and delay of loading each item each time it’s about to play. You’d be much better off using the built-in functionality for playing videos in sequence and looping them, namely AVQueuePlayer and AVPlayerLooper. There’s an example of how to use both in the answer to this question.
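Since AVPlayerLooper loops a single item, here is a minimal sketch of the two-video alternation using AVQueuePlayer alone, in current Swift syntax (movieOneURL, movieTwoURL and movieView come from the question's code; re-enqueueing the finished item is one way to keep the loop going, not the only one):

let queuePlayer = AVQueuePlayer(items: [AVPlayerItem(url: movieOneURL),
                                        AVPlayerItem(url: movieTwoURL)])

let playerLayer = AVPlayerLayer(player: queuePlayer)
playerLayer.videoGravity = .resizeAspectFill
playerLayer.frame = UIScreen.main.bounds
movieView.layer.addSublayer(playerLayer)

// object: nil so the handler fires for every item, not just the first one
NotificationCenter.default.addObserver(forName: NSNotification.Name.AVPlayerItemDidPlayToEndTime,
                                       object: nil,
                                       queue: .main) { notification in
    guard let finishedItem = notification.object as? AVPlayerItem else { return }
    // Rewind the finished item and append it to the back of the queue
    // so the two videos keep alternating indefinitely.
    finishedItem.seek(to: .zero, completionHandler: nil)
    if queuePlayer.canInsert(finishedItem, after: nil) {
        // Passing nil as "after" appends to the end of the queue
        queuePlayer.insert(finishedItem, after: nil)
    }
}

queuePlayer.play()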

AVPlayer not playing video from server?

In my application I have to stream videos from a server. For that I've used the code below:
- (void)playingSong:(NSURL *)url {
    AVAsset *asset = [AVAsset assetWithURL:url];
    duration = asset.duration;
    playerItem = [AVPlayerItem playerItemWithAsset:asset];
    player = [AVPlayer playerWithPlayerItem:playerItem];
    [player play];
}
All of these (duration, playerItem, player) are global variables.
It plays all videos when the network is good, but it cannot play large videos when the network is slow.
That is, it doesn't play large videos, but it does play small ones.
I'm using an HTTP server, not HTTPS.
For example, a 3-minute video plays, but a 1-hour video doesn't.
Why so?
It seems like you have to download the whole video before you can begin playback. It can also be caused by your server rather than AVPlayer.
When you serve videos on a site using plain HTTP – known as progressive download – the position of the header becomes very important. Either the header is placed at the beginning of the file or it's placed at the end of the file. In the case of the latter, you'll have to download the whole thing before you can begin playback, because without the header the player can't start decoding.
Have a look at this guide if your problem is caused by the video source.
Have a look at this thread and change your implementation accordingly.
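If the header position is indeed the culprit, remuxing the file so that the header (the moov atom) comes first usually fixes progressive download; for example, with ffmpeg, assuming you control the files on the server:

ffmpeg -i input.mp4 -c copy -movflags +faststart output.mp4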
Download the video locally first and then play it in AVPlayer:
DispatchQueue.global(qos: .background).async {
    do {
        let data = try Data(contentsOf: url)
        // store "data" in the Documents folder
        let documents = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
        let fileUrl = documents.appendingPathComponent(url.lastPathComponent)
        try data.write(to: fileUrl)
        DispatchQueue.main.async {
            let asset = AVAsset(url: fileUrl)
            let item = AVPlayerItem(asset: asset)
            let player = AVPlayer(playerItem: item)
            let layer = AVPlayerLayer(player: player)
            layer.frame = self.view.bounds
            self.view.layer.addSublayer(layer)
            player.play()
        }
    } catch {
        print(error.localizedDescription)
    }
}
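One caveat: Data(contentsOf:) blocks its queue and holds the entire file in memory, so for large videos a URLSession download task that streams to disk is safer. A minimal sketch (the destination file name is arbitrary):

URLSession.shared.downloadTask(with: url) { tempURL, _, error in
    guard let tempURL = tempURL, error == nil else { return }
    do {
        let documents = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
        let fileURL = documents.appendingPathComponent("video.mp4")
        // Replace any previous copy, then move the finished download into place
        try? FileManager.default.removeItem(at: fileURL)
        try FileManager.default.moveItem(at: tempURL, to: fileURL)
        DispatchQueue.main.async {
            let player = AVPlayer(url: fileURL)
            let layer = AVPlayerLayer(player: player)
            layer.frame = self.view.bounds
            self.view.layer.addSublayer(layer)
            player.play()
        }
    } catch {
        print(error.localizedDescription)
    }
}.resume()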
