AVPlayer not playing video from server? - ios

In my application I have to stream videos from a server. For that I've used the code below:
- (void)playingSong:(NSURL *)url {
    AVAsset *asset = [AVAsset assetWithURL:url];
    duration = asset.duration;
    playerItem = [AVPlayerItem playerItemWithAsset:asset];
    player = [AVPlayer playerWithPlayerItem:playerItem];
    [player play];
}
All of these are global variables.
It plays every video when the network is good, but it can't play large videos when the network is slow. In other words, small videos play and large ones don't.
I'm using an HTTP server, not HTTPS.
For example: a 3-minute video plays, but a 1-hour video doesn't.
Why is that?

It seems you have to download the whole video before playback can begin. The cause can also be your server rather than AVPlayer.
When you serve videos over plain HTTP – known as progressive download – the position of the header becomes very important. The header is placed either at the beginning of the file or at the end. In the latter case, you'll have to download the whole thing before you can begin playback, because without the header the player can't start decoding.
Have a look at this guide if your problem is caused by the video source.
Have a look at this thread and change your implementation accordingly.
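One way to confirm that the player is stuck buffering (rather than failing outright) is to watch the item's buffering properties. A minimal sketch, assuming `item` is the AVPlayerItem from the question's code (the question is Objective-C, but the same KVO keys apply there):

```swift
import AVFoundation

// Keep the observations alive for as long as the item plays.
var bufferObservers: [NSKeyValueObservation] = []

func observeBuffering(on item: AVPlayerItem) {
    // Fires when playback stalls because no more media data is buffered.
    bufferObservers.append(item.observe(\.isPlaybackBufferEmpty) { item, _ in
        if item.isPlaybackBufferEmpty {
            print("Buffer empty - the download is not keeping up")
        }
    })
    // Fires when enough data is buffered to keep playing without stalling.
    bufferObservers.append(item.observe(\.isPlaybackLikelyToKeepUp) { item, _ in
        if item.isPlaybackLikelyToKeepUp {
            print("Enough buffered - playback should continue")
        }
    })
}
```

If the buffer never fills for large files, that points to the header (moov atom) sitting at the end of the MP4; remuxing the file server-side or serving HLS instead is the usual fix.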

Download the video locally first, then play it in AVPlayer.
DispatchQueue.global(qos: .background).async {
    do {
        // Note: Data(contentsOf:) blocks this background thread until the
        // whole file has downloaded, and holds it all in memory.
        let data = try Data(contentsOf: url)
        // Write the data to disk so AVPlayer can play it from a file URL.
        let fileUrl = FileManager.default.temporaryDirectory
            .appendingPathComponent("video.mp4")
        try data.write(to: fileUrl)
        DispatchQueue.main.async {
            let asset = AVAsset(url: fileUrl)
            let item = AVPlayerItem(asset: asset)
            let player = AVPlayer(playerItem: item)
            let layer = AVPlayerLayer(player: player)
            layer.frame = self.view.bounds
            self.view.layer.addSublayer(layer)
            player.play()
        }
    } catch {
        print(error.localizedDescription)
    }
}
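Since Data(contentsOf:) blocks its thread and keeps the whole file in memory, a URLSession download task is a safer way to get a long video onto disk first. A hedged sketch along the same lines (the function and file names are illustrative):

```swift
import AVFoundation
import UIKit

func downloadAndPlay(_ url: URL, in view: UIView) {
    // URLSession streams the response straight to a temporary file on disk,
    // so the whole video never has to fit in memory at once.
    let task = URLSession.shared.downloadTask(with: url) { tempUrl, _, error in
        guard let tempUrl = tempUrl, error == nil else { return }
        // The temp file is deleted when this handler returns, so move it first.
        let dest = FileManager.default.temporaryDirectory
            .appendingPathComponent(url.lastPathComponent)
        try? FileManager.default.removeItem(at: dest)
        try? FileManager.default.moveItem(at: tempUrl, to: dest)
        DispatchQueue.main.async {
            let player = AVPlayer(url: dest)
            let layer = AVPlayerLayer(player: player)
            layer.frame = view.bounds
            view.layer.addSublayer(layer)
            player.play()
        }
    }
    task.resume()
}
```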

Related

Azure AMS : How to get Sidecar WebVTT for showing Captions/Subtitles in iOS native player?

We have successfully configured subtitles/captions in Azure Media Player, which plays media on the web side.
But how do we configure the same for playing media managed by AMS in the native iOS AVPlayer? We know that captions/subtitles can be played in the native iOS player with a sidecar WebVTT file, but is the "transcript.vtt" file generated by AMS the sidecar WebVTT file?
If not, how do we generate the sidecar WebVTT file?
We have implemented the code below, with the media file being accessed from the AMS link and a locally downloaded transcript.vtt file, but it fails.
[EDITED : 20200413]
However, when we have a local media file and a local transcript.vtt file, or when we access the media file directly in the media storage account (https://mediastorageaccount.blob.core.windows.net/container/file.mp4), it works fine. But when we access the encoded file from the link generated by the AMS Transform (https://mediaservice-inct.streaming.media.azure.net/788888-6666-4444-aaaa-823422j218/file.ism/manifest(format=m3u8-cmaf)), it fails.
What is wrong here?
func playVideo() {
    let strUrl = "https://mediaservice-inct.streaming.media.azure.net/79510-6eb-340-a90-824218/German-FAST_Lesson-2-Dialog.ism/manifest(format=m3u8-cmaf)"
    localVideoAsset = AVURLAsset(url: URL(string: strUrl)!)

    // We have to add tracks to the AVMutableComposition as below.
    // First, add the video track.
    let videoTrack = videoPlusSubtitles.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)
    guard localVideoAsset!.tracks.count > 0 else {
        // error msg
        return
    }
    try? videoTrack?.insertTimeRange(CMTimeRangeMake(start: CMTime.zero, duration: localVideoAsset!.duration),
                                     of: localVideoAsset!.tracks(withMediaType: .video)[0],
                                     at: seconds)

    // Then add the subtitle track to the AVMutableComposition.
    if isEnglishSubtitle {
        setSubtitleTrack(subtitle: "transcript")
    } else {
        setSubtitleTrack(subtitle: "transcript_tr")
    }

    // After setting the video and subtitle tracks, hand the composition to the player.
    player = AVPlayer(playerItem: AVPlayerItem(asset: videoPlusSubtitles))
    playerLayer.removeFromSuperlayer()
    playerLayer = AVPlayerLayer(player: player)
    playerLayer.frame = self.videoView.bounds
    playerLayer.videoGravity = .resizeAspect
    self.videoView.layer.addSublayer(playerLayer)
    player.play()
}

func setSubtitleTrack(subtitle: String) {
    print(subtitle)
    print(seconds)
    // If a previous subtitle track exists, remove it first.
    if subtitleTrack != nil {
        videoPlusSubtitles.removeTrack(subtitleTrack!)
    }
    // Load the subtitle file from the bundle.
    let subtitleAsset = AVURLAsset(url: Bundle.main.url(forResource: subtitle, withExtension: "vtt")!)
    // And add a new text track here.
    subtitleTrack = videoPlusSubtitles.addMutableTrack(withMediaType: .text, preferredTrackID: kCMPersistentTrackID_Invalid)
    guard subtitleAsset.tracks.count > 0 else {
        // error msg
        return
    }
    try? subtitleTrack?.insertTimeRange(CMTimeRangeMake(start: CMTime.zero, duration: localVideoAsset!.duration),
                                        of: subtitleAsset.tracks(withMediaType: .text)[0],
                                        at: seconds)
}
I suspect the issue is not caused by the AMS stream. To double-check, you may want to try another HLS stream (e.g. one of the sample HLS streams provided by Apple). Apple has specific requirements for playing VTT in AVPlayer. I've included an Apple doc link with a lot of streaming examples, plus other links that may be helpful:
https://developer.apple.com/streaming/examples/
How to add external .vtt subtitle file to AVPlayerViewController in tvOS
AVUrlAsset and WebVTTs

how to change the playing speed of AVPlayerItem in swift?

I am using a library named Jukebox to play audio files, and I want to make it play faster or slower.
It inherits from AVPlayerItem, but I can't find out how.
Can anyone help me?
I think you can use the AVPlayer rate property below:
var rate: Float { get set }
For instance:
let playerItem = AVPlayerItem(url: yourUrl)
let player = AVPlayer(playerItem: playerItem)
// This will make playback faster or slower accordingly
player.rate = Float(rateValue)
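Two details worth knowing about rate: assigning it to a paused player also starts playback, and on iOS 10+ there is a one-call API for starting at a given speed. A small sketch (the URL is a placeholder):

```swift
import AVFoundation

// Placeholder URL: substitute your own media URL and desired speed.
let yourUrl = URL(string: "https://example.com/audio.mp3")!
let player = AVPlayer(playerItem: AVPlayerItem(url: yourUrl))

// playImmediately(atRate:) (iOS 10+) starts playback at the given rate in one
// call, without waiting for buffering to catch up; assigning `rate` on a
// paused player likewise starts playback at that speed.
player.playImmediately(atRate: 1.5)   // 1.0 = normal, 0.5 = half, 2.0 = double
```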

iOS play video from data URI

I am attempting to play a video by using a data URI (data:video/mp4;base64,AAAAHGZ0eXBtcDQyAAAAAG1wNDJpc29......). Here is my code thus far:
func videoDataWasLoaded(data: NSData) {
    let moviePlayer = MPMoviePlayerController()
    let base64 = data.base64EncodedStringWithOptions(NSDataBase64EncodingOptions(rawValue: 0))
    let dataUri = NSURL(string: "data:video/mp4;base64,\(base64)")
    moviePlayer.contentURL = dataUri
    moviePlayer.play()
}
I have confirmed that the video plays by writing the data (NSData) to a tmp file and then using that for the contentURL. However, writing to disk is slow, and I figured that the data URI approach would be faster especially since my movie files are small (around 5 seconds each).
UPDATE: This question is not so much concerned about which method (AVPlayer, MPMoviePlayerController) is used to play the video. Rather, it is concerned with the possibility of playing a video from a data URI. Here is a link which describes what I am wanting to do in terms of HTML5.
This code plays a movie from a URL ... assuming that is your question?
let videoURL = self.currentSlide.aURL
self.playerItem = AVPlayerItem(URL: videoURL)
self.player = AVPlayer(playerItem: self.playerItem)
self.playerLayer = AVPlayerLayer(player: self.player)
self.streamPlayer = AVPlayerViewController()
self.streamPlayer.player = self.player
self.streamPlayer.view.frame = CGRect(x: 128, y: 222, width: 512, height: 256)
self.presentViewController(self.streamPlayer, animated: true) {
    self.streamPlayer.player!.play()
}
But sorry, that is to play a URL; you want a URI. I thought you had mis-typed your question, my error. I looked up URI this time :|
The answer must surely lie in converting your video source to a playable stream, such as an M3U8. Here is an excellent post on the subject, it seems.
http://stackoverflow.com/questions/6592485/http-live-streaming
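As far as I know, AVFoundation does not accept data: URLs directly, so the practical route is the one the asker already found: decode the base64 payload and hand the player a file URL. A hedged sketch in modern Swift (function and file names are illustrative):

```swift
import AVFoundation
import UIKit

func playDataUri(_ dataUri: String, in view: UIView) {
    // Strip the "data:video/mp4;base64," prefix and decode the payload.
    guard let comma = dataUri.range(of: ","),
          let data = Data(base64Encoded: String(dataUri[comma.upperBound...]))
    else { return }
    // AVPlayer cannot read data: URLs, so write the bytes to a temp file.
    let fileUrl = FileManager.default.temporaryDirectory
        .appendingPathComponent("clip.mp4")
    guard (try? data.write(to: fileUrl)) != nil else { return }
    let player = AVPlayer(url: fileUrl)
    let layer = AVPlayerLayer(player: player)
    layer.frame = view.bounds
    view.layer.addSublayer(layer)
    player.play()
}
```

For 5-second clips the write is cheap; using the temporary directory also means the system reclaims the files for you.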

How to play RTMP video streaming in ios app?

Hi, I'm developing a broadcast app using the VideoCore library. How can I play that streaming video in an iOS app? I tried MPMoviePlayer, but it doesn't support RTMP streams. Are there any third-party libraries with RTMP-capable players? Please help.
If you already have the RTMP live stream ready and playing as HLS, then you can simply append .m3u8 after the stream name and change the rtmp link to http. For example, if you have an RTMP link like this:
rtmp://XY.Y.ZX.Z/hls/chid
you just have to make the URL like this:
http://XY.Y.ZX.Z/hls/chid.m3u8
and it will play smoothly in iOS. I have tried the following code and it works fine.
func setPlayer() {
    // An RTMP URL such as rtmp://XY.Y.ZX.Z/hls/chid can be rewritten as
    // http://XY.Y.ZX.Z/hls/chid.m3u8 and it will play normally.
    let videoURL = URL(string: "http://XY.Y.ZX.Z/hls/chid.m3u8")
    let playerItem = AVPlayerItem(url: videoURL!)
    let adID = AVMetadataItem.identifier(forKey: "X-TITLE", keySpace: .hlsDateRange)
    let metadataCollector = AVPlayerItemMetadataCollector(identifiers: [adID!.rawValue], classifyingLabels: nil)
    //metadataCollector.setDelegate(self, queue: DispatchQueue.main)
    playerItem.add(metadataCollector)
    let player = AVPlayer(playerItem: playerItem)
    let playerLayer = AVPlayerLayer(player: player)
    playerLayer.frame = self.view.bounds
    self.view.layer.addSublayer(playerLayer)
    self.player = player
    player.play()
}
But it may be slow and laggy because of a high-resolution video stream upload. If you lower the resolution when uploading the stream, it will work smoothly on low-bandwidth networks as well.

AVPlayer "freezes" the app at the start of buffering an audio stream

I am using a subclass of AVQueuePlayer and when I add new AVPlayerItem with a streaming URL the app freezes for about a second or two. By freezing I mean that it doesn't respond to touches on the UI. Also, if I have a song playing already and then add another one to the queue, AVQueuePlayer automatically starts preloading the song while it is still streaming the first one. This makes the app not respond to touches on the UI for two seconds just like when adding the first song but the song is still playing. So that means AVQueuePlayer is doing something in main thread that is causing the apparent "freeze".
I am using insertItem:afterItem: to add my AVPlayerItem. I tested and made sure that this was the method that was causing the delay. Maybe it could be something that AVPlayerItem does when it gets activated by AVQueuePlayer at the moment of adding it to the queue.
I must point out that I am using the Dropbox API v1 beta to get the streaming URL via this method call:
[[self restClient] loadStreamableURLForFile:metadata.path];
Then when I receive the stream URL I send it to AVQueuePlayer as follows:
[self.player insertItem:[AVPlayerItem playerItemWithURL:url] afterItem:nil];
So my question is: How do I avoid this?
Should I do the preloading of an audio stream on my own without the help of AVPlayer? If so, how do I do this?
Thanks.
Don't use playerItemWithURL; it's synchronous.
When you receive the response with the URL, try this:
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];
NSArray *keys = @[@"playable"];
[asset loadValuesAsynchronouslyForKeys:keys completionHandler:^{
    [self.player insertItem:[AVPlayerItem playerItemWithAsset:asset] afterItem:nil];
}];
Bump, since this is a highly rated question and similar questions online either have outdated answers or aren't great. The whole idea is pretty straightforward with AVKit and AVFoundation, which means no more depending on third-party libraries. The only issue is that it took some tinkering to put the pieces together.

AVFoundation's AVPlayer initialization with a URL is apparently not thread safe, or rather it's not meant to be. No matter how you initialize it on a background thread, the player attributes are going to be loaded on the main queue, causing freezes in the UI, especially in UITableViews and UICollectionViews.

To solve this, Apple provides AVAsset, which takes a URL and assists in loading media attributes like tracks, playability, duration, etc., and can do so asynchronously, with the best part being that the loading process is cancellable (unlike other dispatch-queue background tasks, where ending a task may not be that straightforward). This means there is no need to worry about lingering zombie threads in the background as you scroll fast through a table view or collection view, ultimately piling up memory with a whole bunch of unused objects. The cancellable feature lets us cancel any lingering AVAsset async load, if one is in progress, during cell dequeue. The async loading process is started with the loadValuesAsynchronously method and can be cancelled at any later time (if still in progress).

Don't forget to handle errors properly using the results of loadValuesAsynchronously. In Swift (3/4), here's how you would load a video asynchronously and handle situations where the async process fails (due to slow networks, etc.):
TL;DR
TO PLAY A VIDEO
let asset = AVAsset(url: URL(string: self.YOUR_URL_STRING)!)
let keys: [String] = ["playable"]
var player: AVPlayer!

asset.loadValuesAsynchronously(forKeys: keys, completionHandler: {
    var error: NSError? = nil
    let status = asset.statusOfValue(forKey: "playable", error: &error)
    switch status {
    case .loaded:
        DispatchQueue.main.async {
            let item = AVPlayerItem(asset: asset)
            self.player = AVPlayer(playerItem: item)
            let playerLayer = AVPlayerLayer(player: self.player)
            playerLayer.videoGravity = AVLayerVideoGravity.resizeAspectFill
            playerLayer.frame = self.YOUR_VIDEOS_UIVIEW.bounds
            self.YOUR_VIDEOS_UIVIEW.layer.addSublayer(playerLayer)
            self.player.isMuted = true
            self.player.play()
        }
    case .failed:
        DispatchQueue.main.async {
            // do something: show an alert, put a placeholder image, etc.
        }
    case .cancelled:
        DispatchQueue.main.async {
            // do something: show an alert, put a placeholder image, etc.
        }
    default:
        break
    }
})
NOTE:
Based on what your app wants to achieve, you may still have to do some tinkering to get smoother scrolling in a UITableView or UICollectionView. You may also need to implement some KVO on the AVPlayerItem properties for it to work, and there are plenty of posts here on SO that discuss AVPlayerItem KVO in detail.
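For the KVO mentioned above, a minimal status observer might look like this (block-based KVO, Swift 4+; the function name is illustrative):

```swift
import AVFoundation

// Hold the observation for as long as the item is in use.
var statusObserver: NSKeyValueObservation?

func watchStatus(of item: AVPlayerItem, player: AVPlayer) {
    statusObserver = item.observe(\.status, options: [.new]) { item, _ in
        switch item.status {
        case .readyToPlay:
            player.play()    // safe to start playback now
        case .failed:
            print(item.error?.localizedDescription ?? "playback failed")
        default:
            break            // .unknown: still loading
        }
    }
}
```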
TO LOOP THROUGH ASSETS (video loops/GIFs)
To loop a video, you can use the same method above together with AVPlayerLooper. Here's sample code to loop a video (or perhaps a short GIF-style clip). Note the use of the duration key, which is required for the video loop.
let asset = AVAsset(url: URL(string: self.YOUR_URL_STRING)!)
let keys: [String] = ["playable", "duration"]
var player: AVPlayer!
var playerLooper: AVPlayerLooper!

asset.loadValuesAsynchronously(forKeys: keys, completionHandler: {
    var error: NSError? = nil
    let status = asset.statusOfValue(forKey: "duration", error: &error)
    switch status {
    case .loaded:
        DispatchQueue.main.async {
            let playerItem = AVPlayerItem(asset: asset)
            self.player = AVQueuePlayer()
            let playerLayer = AVPlayerLayer(player: self.player)
            // Define the time range for the loop using asset.duration
            let duration = playerItem.asset.duration
            let start = CMTime(seconds: duration.seconds * 0, preferredTimescale: duration.timescale)
            let end = CMTime(seconds: duration.seconds * 1, preferredTimescale: duration.timescale)
            let timeRange = CMTimeRange(start: start, end: end)
            self.playerLooper = AVPlayerLooper(player: self.player as! AVQueuePlayer, templateItem: playerItem, timeRange: timeRange)
            playerLayer.videoGravity = AVLayerVideoGravity.resizeAspectFill
            playerLayer.frame = self.YOUR_VIDEOS_UIVIEW.bounds
            self.YOUR_VIDEOS_UIVIEW.layer.addSublayer(playerLayer)
            self.player.isMuted = true
            self.player.play()
        }
    case .failed:
        DispatchQueue.main.async {
            // do something: show an alert, put a placeholder image, etc.
        }
    case .cancelled:
        DispatchQueue.main.async {
            // do something: show an alert, put a placeholder image, etc.
        }
    default:
        break
    }
})
EDIT: As per the documentation, AVPlayerLooper requires the duration property of the asset to be fully loaded in order to loop videos. Also, the timeRange: timeRange with the start and end time range in the AVPlayerLooper initialization is really optional if you want an infinite loop. I have also realized since posting this answer that AVPlayerLooper is only about 70-80% accurate at looping videos, especially if your AVAsset needs to stream the video from a URL. To solve this, there is a totally different (yet simple) approach to looping a video:
// This will loop the video, since this is a GIF-style clip
let interval = CMTime(value: 1, timescale: 2)
self.timeObserverToken = self.player?.addPeriodicTimeObserver(forInterval: interval, queue: DispatchQueue.main, using: { progressTime in
    if let totalDuration = self.player?.currentItem?.duration {
        if progressTime == totalDuration {
            self.player?.seek(to: CMTime.zero)
            self.player?.play()
        }
    }
})
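An alternative to polling with a periodic observer is to react to the item's end-of-playback notification and seek back to zero. A sketch, assuming `player` is the AVPlayer from the snippet above:

```swift
import AVFoundation

// Loop by reacting to the "did play to end" notification instead of polling.
// Keep the returned token alive; remove it when tearing the player down.
let endObserver = NotificationCenter.default.addObserver(
    forName: .AVPlayerItemDidPlayToEndTime,
    object: player.currentItem,
    queue: .main
) { _ in
    player.seek(to: .zero)   // jump back to the start...
    player.play()            // ...and resume playback
}
```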
Gigisommo's answer for Swift 3, including the feedback from the comments:
let asset = AVAsset(url: url)
let keys: [String] = ["playable"]
asset.loadValuesAsynchronously(forKeys: keys) {
    DispatchQueue.main.async {
        let item = AVPlayerItem(asset: asset)
        self.playerCtrl.player = AVPlayer(playerItem: item)
    }
}