I am following Apple's documentation on caching HLS (.m3u8) video.
https://developer.apple.com/library/archive/documentation/AudioVideo/Conceptual/MediaPlaybackGuide/Contents/Resources/en.lproj/HTTPLiveStreaming/HTTPLiveStreaming.html
Under "Playing Offline Assets", the documentation instructs you to use the AVAssetDownloadTask's asset to start playback while the download is in progress:
func downloadAndPlayAsset(_ asset: AVURLAsset) {
    // Create new AVAssetDownloadTask for the desired asset
    // Passing a nil options value indicates the highest available bitrate should be downloaded
    let downloadTask = downloadSession.makeAssetDownloadTask(asset: asset,
                                                             assetTitle: assetTitle,
                                                             assetArtworkData: nil,
                                                             options: nil)!
    // Start task
    downloadTask.resume()

    // Create standard playback items and begin playback
    let playerItem = AVPlayerItem(asset: downloadTask.urlAsset)
    player = AVPlayer(playerItem: playerItem)
    player.play()
}
The issue is that the same asset is downloaded twice.
Right after the AVPlayer is initialized, it starts buffering the asset. Initially, I assumed the buffered data would be used to build the cache, but the AVAssetDownloadTask doesn't start downloading data for caching until the AVPlayer finishes playing the asset. The buffered data is essentially discarded.
I used KVO on currentItem.loadedTimeRanges to check the state of the buffer:
playerTimeRangesObserver = currentPlayer.observe(\.currentItem?.loadedTimeRanges, options: [.new, .old]) { (player, change) in
    // Inspect the first loaded time range to see how far the buffer extends.
    let time = self.currentPlayer.currentItem?.loadedTimeRanges.first
    if let t = time {
        print(t.timeRangeValue.duration.seconds)
    }
}
Below is the method I use to check the download progress of the AVAssetDownloadTask:
/// Method to adopt to subscribe to progress updates of an AVAssetDownloadTask.
func urlSession(_ session: URLSession, assetDownloadTask: AVAssetDownloadTask, didLoad timeRange: CMTimeRange, totalTimeRangesLoaded loadedTimeRanges: [NSValue], timeRangeExpectedToLoad: CMTimeRange) {
    // This delegate callback should be used to provide download progress for your AVAssetDownloadTask.
    guard let asset = activeDownloadsMap[assetDownloadTask] else { return }

    var percentComplete = 0.0
    for value in loadedTimeRanges {
        let loadedTimeRange: CMTimeRange = value.timeRangeValue
        percentComplete += loadedTimeRange.duration.seconds / timeRangeExpectedToLoad.duration.seconds
    }
    print("PercentComplete for \(asset.stream.name) = \(percentComplete)")
}
Is this the right behaviour or am I doing something wrong?
I want the AVPlayer to play from the video data that the AVAssetDownloadTask is already caching, while the download is still in progress.
Your AVAssetDownloadTask is most likely configured to download different HLS variants than the ones your AVPlayerItem is requesting.
If some data has already been downloaded by the AVAssetDownloadTask, your AVPlayerItem will subsequently use it.
But if some data has already been downloaded by the AVPlayerItem, your AVAssetDownloadTask may ignore it, because it needs to satisfy the requirements of your download configuration.
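One way to narrow that gap is to pin the download task to a particular variant through the options dictionary instead of passing nil. This is a minimal sketch, assuming you know the bitrate of the variant you expect playback to select; downloadSession and assetTitle are the same placeholders as in the question's code, and the bitrate value is an invented example:

import AVFoundation

func downloadAsset(_ asset: AVURLAsset) {
    // Request at least a specific bitrate instead of "highest available",
    // so the cached rendition is more likely to match what playback selects.
    // 265_000 bits per second is an assumed example; use a bitrate that
    // actually appears in your master playlist's EXT-X-STREAM-INF tags.
    let options = [AVAssetDownloadTaskMinimumRequiredMediaBitrateKey: 265_000]

    let downloadTask = downloadSession.makeAssetDownloadTask(asset: asset,
                                                             assetTitle: assetTitle,
                                                             assetArtworkData: nil,
                                                             options: options)
    downloadTask?.resume()
}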
I am trying to build a music app in Swift. I have a remote URL for a song, and I want to play the song on a button click. I also want a slider or progress bar that tracks the playback progress.
This is a sample audio file I want to stream.
https://audio-ssl.itunes.apple.com/itunes-assets/AudioPreview113/v4/99/c4/84/99c48467-71dd-0a95-8388-3c5d4d433ee2/mzaf_6642611679343132363.std.aac.p.m4a
Have a look at Apple's AVFoundation framework. There are several classes you can use, such as AVPlayer, AVAudioPlayer, and AVQueuePlayer; it all depends on your requirements.
Here's an example to get you started. (You need to enable "Allow Outgoing Connections" in Xcode under Signing & Capabilities for the audio to play.)
For the slider, find out the duration of the audio and set the slider's maxValue to that duration. Using addPeriodicTimeObserver, the slider is updated in real time with slider.doubleValue = progressTime.seconds.
You can also be notified when the audio has ended via the AVPlayerItemDidPlayToEndTime notification; a sketch of that follows the code below.
import AVFoundation

var player: AVPlayer! = nil // (needs to be outside of the class)

@IBOutlet weak var slider: NSSliderCell!

let urlToPlay = URL(string: "https://audio-ssl.itunes.apple.com/itunes-assets/AudioPreview113/v4/99/c4/84/99c48467-71dd-0a95-8388-3c5d4d433ee2/mzaf_6642611679343132363.std.aac.p.m4a")
let asset = AVAsset(url: urlToPlay!)
let playerItem = AVPlayerItem(asset: asset)

if player == nil {
    player = AVPlayer(playerItem: playerItem)
    player.play()
}

// Time observer to monitor the current play time and update the slider.
player.addPeriodicTimeObserver(forInterval: CMTime(value: 1, timescale: 2),
                               queue: DispatchQueue.global(), using: { [weak self] (progressTime) in
    DispatchQueue.main.async {
        guard let self = self else { return }
        self.slider.maxValue = player.currentItem!.asset.duration.seconds
        self.slider.doubleValue = progressTime.seconds
    }
})
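As mentioned above, the end of playback can be observed with a notification. A minimal sketch, assuming the same player variable as in the snippet above; playerDidFinish is a hypothetical handler name, not a framework callback:

// Fires when the current AVPlayerItem plays through to its end time.
NotificationCenter.default.addObserver(self,
                                       selector: #selector(playerDidFinish(_:)),
                                       name: .AVPlayerItemDidPlayToEndTime,
                                       object: player.currentItem)

@objc func playerDidFinish(_ notification: Notification) {
    // Reset the slider, toggle the play button, etc.
    print("Audio finished playing")
}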
I'm building a video list view (using a collection view) like the TikTok app. I'm adding an AVPlayerLayer on an image view for each cell item and playing an AVPlayer on it, and it takes a noticeable amount of time to load the video layer. Can anyone suggest how to fetch video data for the player before navigating to the video page, so the page feels smoother?
Please check the code below; what am I doing wrong?
func setupVideoFor(url: String, completion: @escaping COMPLETION_HANDLER = { _ in }) {
    if self.videoCache.object(forKey: url as NSString) != nil {
        return
    }
    guard let URL = URL(string: url) else {
        return
    }
    didVideoStartPlay = completion

    let asset = AVURLAsset(url: URL)
    let requestedKeys = ["playable"]
    asset.loadValuesAsynchronously(forKeys: requestedKeys) { [weak self] in
        guard let strongSelf = self else {
            return
        }
        /**
         Check whether the asset loaded successfully. If it didn't, don't create the
         AVPlayer and AVPlayerItem, and return without caching the video container,
         so that the asset can be downloaded again when needed.
         */
        var error: NSError? = nil
        let status = asset.statusOfValue(forKey: "playable", error: &error)
        switch status {
        case .loaded:
            break
        case .failed, .cancelled:
            print("Failed to load asset successfully")
            return
        default:
            print("Unknown state of asset")
            return
        }

        let player = AVPlayer()
        let item = AVPlayerItem(asset: asset)
        DispatchQueue.main.async {
            let videoContainer = VideoContainer(player: player, item: item, url: url)
            strongSelf.videoCache.setObject(videoContainer, forKey: url as NSString)
            videoContainer.player.replaceCurrentItem(with: videoContainer.playerItem)
            /**
             Try to play the video again in case playVideo was called before the
             asset was obtained, in which case the video would not have started.
             */
            if strongSelf.videoURL == url, let layer = strongSelf.currentLayer {
                strongSelf.duration = asset.duration
                strongSelf.playVideo(withLayer: layer, url: url)
            }
        }
    }
}
It depends on a few factors...
AVPlayer is a way to control what happens to an AVPlayerItem, and AVPlayerLayer is just the display layer for that.
You want to look into AVPlayerItem. You can initialize a number of AVPlayerItem objects without passing them to the AVPlayer, and observe each one's status property (with KVO) to know when it is ready to play. You could do this before showing any video layer at all, then pass the ready AVPlayerItem objects to the AVPlayer, which can give the perception of faster video start-up; see the sketch below.
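A minimal sketch of that pre-warming idea, following the suggestion above; VideoPreloader, preloadedItems, and the observation storage are names invented for this example, not framework API:

import AVFoundation

final class VideoPreloader {
    private(set) var preloadedItems: [URL: AVPlayerItem] = [:]
    private var observations: [NSKeyValueObservation] = []

    /// Create AVPlayerItems ahead of time, before any AVPlayerLayer exists.
    func preload(urls: [URL]) {
        for url in urls {
            let item = AVPlayerItem(url: url)
            preloadedItems[url] = item

            // KVO on status reports when the item becomes ready to play.
            let observation = item.observe(\.status, options: [.new]) { item, _ in
                if item.status == .readyToPlay {
                    print("Ready: \(url.lastPathComponent)")
                }
            }
            observations.append(observation)
        }
    }
}

// Later, when a cell becomes visible, hand the already-prepared item to a player:
// player.replaceCurrentItem(with: preloader.preloadedItems[url])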
Also, you might consider looking at your video's HLS manifest. You can check for errors in the manifest itself with mediastreamvalidator, which can be found (along with Apple's other HLS tools) here: https://developer.apple.com/documentation/http_live_streaming/about_apple_s_http_live_streaming_tools
This tool inspects how the playlist is set up and reports any number of errors, including ones that affect performance. For example, if the initial bitrate (what the player tries to play before it knows anything about network conditions, etc.) is set too high, that can lead to long loading times.
I'm using an AVPlayer to play a remote progressive download (i.e. non-HLS) video. But, I can't figure out how to control its buffering behavior.
I would like to pre-fetch 2 seconds of the video before it's ready to play, and also to stop buffering when the video is paused.
Here's my setup:
let asset = AVURLAsset(url: url)
let playerItem = AVPlayerItem(asset: asset)
let player = AVPlayer()
I tried the following, without success:
// Attempt 1: set the buffer target before attaching the item (doesn't start buffering)
playerItem.preferredForwardBufferDuration = 2.0

// Attempt 2: set the buffer target and attach the item (doesn't stop buffering when paused)
playerItem.preferredForwardBufferDuration = 2.0
player.replaceCurrentItem(with: playerItem)
I tried player.automaticallyWaitsToMinimizeStalling = true in both cases, and in combination with various player.pause() and player.rate = 0 calls; none of it works.
A potential approach is to observe loadedTimeRanges until the first 2 seconds have loaded, then set the player's current item to nil:
let c = playerItem.publisher(for: \.loadedTimeRanges, options: .new)
    .compactMap { $0.first as? CMTimeRange }
    .sink {
        if $0.duration.seconds - $0.start.seconds > 2 {
            player.replaceCurrentItem(with: nil)
        }
    }
This would work for pre-buffer, but it doesn't work for pausing, because it makes the video blank instead of paused. (And at this point, I feel I'm attempting to reimplement/interfere with some core buffering functionality)
I have an audio data stream coming in from an HTTP response. I receive packets of bytes using the URLSessionDataDelegate method:
urlSession(_ session: URLSession, dataTask: URLSessionDataTask, didReceive data: Data)
I have successfully played the audio by appending all the data packets into a single Data object and using AVAudioPlayer with its init(data:) initializer.
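For reference, that buffer-everything-first approach looks roughly like this; a minimal sketch in which receivedData stands in for the accumulated packets:

import AVFoundation

// receivedData is the Data object accumulated from the didReceive callbacks.
func playBufferedAudio(_ receivedData: Data) {
    do {
        // Note: keep a strong reference to the player (e.g. in a property),
        // otherwise it is deallocated and playback stops immediately.
        let audioPlayer = try AVAudioPlayer(data: receivedData)
        audioPlayer.play()
    } catch {
        print("Could not create AVAudioPlayer: \(error)")
    }
}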
What I really want to do is start audio playback while data is still coming in - streaming audio effectively. I haven't seen any answers that seem elegant for this use-case.
Options I've seen are:
Using the AudioToolbox: Audio File Stream Services & Audio Queues
Using the NSStream API, writing to a file and playing audio from that file concurrently
How would I achieve audio streaming playback from the Data packets coming in?
The easiest way is to use AVPlayer from the AVFoundation framework. Instantiate an AVPlayerItem with your URL and pass it to the player. The following code will do it for you:
let urlString = "your url string"
guard let url = URL(string: urlString) else {
    return
}
let playerItem = AVPlayerItem(url: url)
player = AVPlayer(playerItem: playerItem)
player.play()
Consider AVPlayer for your requirement, something like this:
import AVKit
import AVFoundation

var player: AVPlayer?

func audioPlayer() {
    do {
        try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback)
        player = AVPlayer(url: URL(string: "your url")!)

        // This part shows a player screen; if you don't want to show one,
        // comment out from here...
        let controller = AVPlayerViewController()
        controller.player = player
        controller.showsPlaybackControls = false
        self.addChildViewController(controller)
        let screenSize = UIScreen.main.bounds.size
        let videoFrame = CGRect(x: 0, y: 130, width: screenSize.width, height: (screenSize.height - 130) / 2)
        controller.view.frame = videoFrame
        self.view.addSubview(controller.view)
        // ...till here

        player?.play()
    } catch {
        print("Failed to configure the audio session: \(error)")
    }
}
For more, please read this: https://developer.apple.com/documentation/avfoundation/avplayer
Our app lets users record a video, after which the app adds subtitles and exports the edited video.
The goal is to replay the video immediately, but the video only appears after playback finishes (and only the audio plays, which turns out to be a separate issue).
Here's what happens now: we show a preview so the user can see what they are recording in real time. After the user is done recording, we want to play the video back for review. Unfortunately, no video appears and only audio plays back; an image showing some frame of the video appears once the audio finishes.
Why is this happening?
func exportDidFinish(exporter: AVAssetExportSession) {
    println("Finished exporting video")

    // Save video to photo album
    let assetLibrary = ALAssetsLibrary()
    assetLibrary.writeVideoAtPathToSavedPhotosAlbum(exporter.outputURL, completionBlock: { (url: NSURL!, error: NSError!) in
        println("Saved video to album \(exporter.outputURL)")
        self.playPreview(exporter.outputURL)
        if (error != nil) {
            println("Error saving video")
        }
    })
}

func playPreview(videoUrl: NSURL) {
    let asset = AVAsset.assetWithURL(videoUrl) as! AVAsset
    let playerItem = AVPlayerItem(asset: asset)
    player = AVPlayer(playerItem: playerItem)
    let playerLayer = AVPlayerLayer(player: player)
    playerLayer.frame = view.frame
    view.layer.addSublayer(playerLayer)
    player.play()
}
Perhaps this can help:
let assetLibrary = ALAssetsLibrary()
assetLibrary.writeVideoAtPathToSavedPhotosAlbum(exporter.outputURL, completionBlock: { (url: NSURL!, error: NSError!) in
    if (error != nil) {
        println("Error saving video")
    } else {
        println("Saved video to album \(url)")
        self.playPreview(url)
    }
})
That is, pass the url delivered to the completionBlock on to playPreview, not the outputURL that comes from the AVAssetExportSession. Perhaps...!
The answer was that we had an incorrectly composed video in the first place, as described here: AVAssetExportSession export fails non-deterministically with error: "Operation Stopped, NSLocalizedFailureReason=The video could not be composed".
The other part of the question (audio playing long before images/video appears) was answered here: Long delay before seeing video when AVPlayer created in exportAsynchronouslyWithCompletionHandler
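In short, the fix described there is to create the player and its layer on the main thread, because export completion handlers arrive on a background queue. A minimal sketch in current Swift under that assumption, reusing the question's playPreview:

exporter.exportAsynchronously {
    // The completion handler runs on a background queue; AVPlayer and
    // AVPlayerLayer setup touch UI state, so hop to the main thread first.
    DispatchQueue.main.async {
        self.playPreview(exporter.outputURL!)
    }
}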
Hope these help someone avoid the suffering we endured! :)