Our app lets users record a video, after which the app adds subtitles and exports the edited video.
The goal is to replay the video immediately, but the video only appears after playback finishes (and only audio plays back, which is a separate issue).
Here's what happens now: we show a preview so users can see what they are recording in real time. After the user finishes recording, we want to play the video back for review. Unfortunately, no video appears and only audio plays back. An image showing some frame of the video appears once the audio has finished playing.
Why is this happening?
func exportDidFinish(exporter: AVAssetExportSession) {
    println("Finished exporting video")

    // Save video to photo album
    let assetLibrary = ALAssetsLibrary()
    assetLibrary.writeVideoAtPathToSavedPhotosAlbum(exporter.outputURL, completionBlock: {(url: NSURL!, error: NSError!) in
        println("Saved video to album \(exporter.outputURL)")
        self.playPreview(exporter.outputURL)

        if (error != nil) {
            println("Error saving video")
        }
    })
}
func playPreview(videoUrl: NSURL) {
    let asset = AVAsset.assetWithURL(videoUrl) as? AVAsset
    let playerItem = AVPlayerItem(asset: asset)
    player = AVPlayer(playerItem: playerItem)
    let playerLayer = AVPlayerLayer(player: player)
    playerLayer.frame = view.frame
    view.layer.addSublayer(playerLayer)
    player.play()
}
Perhaps this can help:
let assetLibrary = ALAssetsLibrary()
assetLibrary.writeVideoAtPathToSavedPhotosAlbum(exporter.outputURL, completionBlock: {(url: NSURL!, error: NSError!) in
    if (error != nil) {
        println("Error saving video")
    } else {
        println("Saved video to album \(url)")
        self.playPreview(url)
    }
})
Send "url" to "playPreview" leaving "completionBlock" and not that which comes from "AVAssetExportSession"
Perhaps...!
The answer was we had an incorrectly composed video in the first place, as described here: AVAssetExportSession export fails non-deterministically with error: "Operation Stopped, NSLocalizedFailureReason=The video could not be composed.".
The other part of the question (audio playing long before images/video appears) was answered here: Long delay before seeing video when AVPlayer created in exportAsynchronouslyWithCompletionHandler
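For reference, the shape of the fix for that second issue: both the export completion handler and the assets-library save callback can run off the main thread, while AVPlayerLayer (like all UI work) must be touched on the main queue. A minimal sketch, in current Swift syntax, using the playPreview method from the question:

// Export/save completion handlers run on arbitrary queues;
// hop to the main queue before creating the player layer.
DispatchQueue.main.async {
    self.playPreview(url)
}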
Hope these help someone avoid the suffering we endured! :)
I'm building a video list view (using a collection view) like the TikTok app. I'm adding an AVPlayerLayer to an image view for each cell item and playing an AVPlayer on it, and it takes a while to load the video layer. Can anyone suggest how to fetch the video data for the player before navigating to the video page, to make the video page smoother?
Please check the code below; what am I doing wrong in it?
func setupVideoFor(url: String, completion: @escaping COMPLETION_HANDLER = { _ in }) {
    if self.videoCache.object(forKey: url as NSString) != nil {
        return
    }
    guard let URL = URL(string: url) else {
        return
    }
    didVideoStartPlay = completion

    let asset = AVURLAsset(url: URL)
    let requestedKeys = ["playable"]
    asset.loadValuesAsynchronously(forKeys: requestedKeys) { [weak self] in
        guard let strongSelf = self else {
            return
        }
        /**
         Check whether the asset loaded successfully; if not, don't create the
         AVPlayer and AVPlayerItem, and return without caching the video container,
         so that the asset can be downloaded again when needed.
         */
        var error: NSError? = nil
        let status = asset.statusOfValue(forKey: "playable", error: &error)
        switch status {
        case .loaded:
            break
        case .failed, .cancelled:
            print("Failed to load asset successfully")
            return
        default:
            print("Unknown state of asset")
            return
        }
        let player = AVPlayer()
        let item = AVPlayerItem(asset: asset)
        DispatchQueue.main.async {
            let videoContainer = VideoContainer(player: player, item: item, url: url)
            strongSelf.videoCache.setObject(videoContainer, forKey: url as NSString)
            videoContainer.player.replaceCurrentItem(with: videoContainer.playerItem)
            /**
             Try to play the video again for the case where playVideo was called
             before the asset was obtained, so the video didn't run earlier.
             */
            if strongSelf.videoURL == url, let layer = strongSelf.currentLayer {
                strongSelf.duration = asset.duration
                strongSelf.playVideo(withLayer: layer, url: url)
            }
        }
    }
}
It depends on a few factors...
AVPlayer is a way to control what happens to an AVPlayerItem, and AVPlayerLayer is just the display layer for that.
You want to look into AVPlayerItem. You can initialize a number of AVPlayerItem objects without passing them to an AVPlayer, and observe each one's status property (with KVO) to know when it is ready to play. You could do this before showing any video layer at all, then hand the ready AVPlayerItem objects to the AVPlayer, which can create the perception of faster video start.
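As a minimal sketch of that idea (all names here are illustrative, not from the question's code); one caveat is that in practice an AVPlayerItem often won't advance to .readyToPlay until it is attached to an AVPlayer, so this sketch pairs each preloaded item with an off-screen player:

import AVFoundation

final class PlayerPreloader {
    private var players: [URL: AVPlayer] = [:]
    private var observations: [NSKeyValueObservation] = []

    // Start loading a video before its layer is on screen; the callback fires
    // once the item reports .readyToPlay.
    func preload(_ url: URL, whenReady: @escaping (AVPlayer) -> Void) {
        let player = AVPlayer(playerItem: AVPlayerItem(url: url))
        players[url] = player

        let observation = player.currentItem!.observe(\.status, options: [.new]) { [weak self] item, _ in
            if item.status == .readyToPlay, let ready = self?.players[url] {
                whenReady(ready)
            }
        }
        observations.append(observation)
    }
}

When a cell scrolls on screen, you can hand the ready player to the cell's AVPlayerLayer and call play() with little or no spinner time.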
Also, you might consider looking at your video's HLS manifest. You can check the manifest itself for errors with mediastreamvalidator, which can be found (along with other tools) here: https://developer.apple.com/documentation/http_live_streaming/about_apple_s_http_live_streaming_tools
This tool inspects how the playlist is set up and reports any number of errors, including ones that affect performance. For example, if the initial bitrate (what the player will try to play before it learns anything about network conditions, etc.) is set too high, this can lead to long loading times.
I am following Apple's documentation on caching HLS (.m3u8) video.
https://developer.apple.com/library/archive/documentation/AudioVideo/Conceptual/MediaPlaybackGuide/Contents/Resources/en.lproj/HTTPLiveStreaming/HTTPLiveStreaming.html
Under "Playing Offline Assets" in the documentation, you are instructed to use the AVAssetDownloadTask's asset to start playback while the download is in progress.
func downloadAndPlayAsset(_ asset: AVURLAsset) {
    // Create new AVAssetDownloadTask for the desired asset
    // Passing a nil options value indicates the highest available bitrate should be downloaded
    let downloadTask = downloadSession.makeAssetDownloadTask(asset: asset,
                                                             assetTitle: assetTitle,
                                                             assetArtworkData: nil,
                                                             options: nil)!
    // Start task
    downloadTask.resume()

    // Create standard playback items and begin playback
    let playerItem = AVPlayerItem(asset: downloadTask.urlAsset)
    player = AVPlayer(playerItem: playerItem)
    player.play()
}
The issue is that the same asset is downloaded twice.
Right after AVPlayer is initialized, it starts to buffer the asset. Initially I assumed the buffered data would be used to build the cache, but AVAssetDownloadTask doesn't start downloading the data for caching until AVPlayer finishes playing the asset. The buffered data is essentially discarded.
I used KVO on currentItem.loadedTimeRanges to check state of buffer.
playerTimeRangesObserver = currentPlayer.observe(\.currentItem?.loadedTimeRanges, options: [.new, .old]) { (player, change) in
    let time = self.currentPlayer.currentItem?.loadedTimeRanges.first
    if let t = time {
        print(t.timeRangeValue.duration.seconds)
    }
}
Below is the method I use to check the download status of the AVAssetDownloadTask.
/// Method to adopt to subscribe to progress updates of an AVAssetDownloadTask.
func urlSession(_ session: URLSession, assetDownloadTask: AVAssetDownloadTask, didLoad timeRange: CMTimeRange, totalTimeRangesLoaded loadedTimeRanges: [NSValue], timeRangeExpectedToLoad: CMTimeRange) {
    // This delegate callback should be used to provide download progress for your AVAssetDownloadTask.
    guard let asset = activeDownloadsMap[assetDownloadTask] else { return }
    var percentComplete = 0.0
    for value in loadedTimeRanges {
        let loadedTimeRange: CMTimeRange = value.timeRangeValue
        percentComplete += loadedTimeRange.duration.seconds / timeRangeExpectedToLoad.duration.seconds
    }
    print("PercentComplete for \(asset.stream.name) = \(percentComplete)")
}
Is this the right behaviour or am I doing something wrong?
I want to be able to use the video data that is being cached (AVAssetDownloadTask downloading is in progress) to play in AVPlayer.
Your AVAssetDownloadTask must be downloading different HLS variants than the ones your AVPlayerItem is requesting.
If you already have some data downloaded by AVAssetDownloadTask, your AVPlayerItem will subsequently use it.
But if you already have some data downloaded by AVPlayerItem, your AVAssetDownloadTask may ignore it, as it needs to satisfy the requirements of your download configuration.
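If you want the two to agree, one option (a hedged sketch, not the only approach) is to pin the download to a specific variant through the task's options dictionary, so the download task and the player that plays downloadTask.urlAsset fetch the same rendition. The bitrate value below is illustrative; pick one matching a variant that actually exists in your master playlist:

import AVFoundation

func makePinnedDownloadTask(session: AVAssetDownloadURLSession,
                            asset: AVURLAsset,
                            title: String) -> AVAssetDownloadTask? {
    // Pin the download to the cheapest variant at or above ~265 kbps
    // instead of letting the task pick freely.
    let options = [AVAssetDownloadTaskMinimumRequiredMediaBitrateKey: 265_000]
    return session.makeAssetDownloadTask(asset: asset,
                                         assetTitle: title,
                                         assetArtworkData: nil,
                                         options: options)
}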
I'm using AVPlayer to play a video, but the player reports an error, even though the same URL already plays on Android devices and in the Safari web browser. If this URL is replaced by another URL, it works fine.
This is the error.
player.error==========>>>>>>>>>>Optional(Error Domain=AVFoundationErrorDomain Code=-11848 "Cannot Open" UserInfo={NSUnderlyingError=0x156d78f30 {Error Domain=NSOSStatusErrorDomain Code=-12925 "(null)"}, NSLocalizedFailureReason=The media cannot be used on this device., NSLocalizedDescription=Cannot Open})
override func viewDidLoad() {
    super.viewDidLoad()
    // Do any additional setup after loading the view.
    let videoUrl = "http://telvuehls_t03007-i.akamaihd.net/hls/live/217085/T03007-calkins/playlist.m3u8"
    let playerItem = AVPlayerItem(URL: NSURL(string: videoUrl as String)!)
    let playerObj = AVPlayer(playerItem: playerItem)
    self.player = playerObj
    if playerItem.error == nil {
        playerObj.play()
    } else {
        print("player.error==========>>>>>>>>>>\(playerItem.error)")
    }
}
Please refer to the following link for the formats AVPlayer supports: Supportable Formats.
I think the format you are trying to play cannot be opened by AVPlayer, which is exactly what the error says. As far as I know, AVPlayer needs the media delivered in chunks to play it.
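A related note on the code above: reading playerItem.error synchronously can miss failures that arrive later, so the more robust pattern is to observe the item's status. A small sketch in current Swift syntax (the URL is a placeholder):

import AVFoundation

let item = AVPlayerItem(url: URL(string: "https://example.com/playlist.m3u8")!)
let player = AVPlayer(playerItem: item)

// Keep this observation alive (e.g. in a property) for as long as the item lives.
let statusObservation = item.observe(\.status, options: [.new]) { item, _ in
    switch item.status {
    case .readyToPlay:
        player.play()
    case .failed:
        print("player item failed: \(String(describing: item.error))")
    default:
        break
    }
}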
I have a Parse class "Response", with one of the fields being of type File. I'm uploading files to this column for each row manually, by selecting the cell and clicking "upload a file".
Now I need to get this file (which, as I understand it, should be of type PFFile) and play it (it's a video file) in my iOS app.
Please help!
Assuming you just want to stream the video file and not actually download it (which may take a while), you simply need to fetch the PFObject in your "Response" class. Once you have the object, you can get a reference to the PFFile where the video is saved and access its URL property:
// define these as class properties:
var player: AVPlayer!
var playerLayer: AVPlayerLayer!

// write all of the below somewhere in your ViewController, e.g. in viewDidLoad:
var videoUrl: String!
let query = PFQuery(className: "Response")
query.getObjectInBackgroundWithId("objectId123") {
    (object: PFObject?, error: NSError?) in
    if (error == nil && object != nil) {
        let videoFile = object!["keyForVideoPFFile"] as! PFFile
        videoUrl = videoFile.url
        self.setupVideoPlayerWithURL(NSURL(string: videoUrl)!)
    }
}
In the above code, you feed the video's URL to an AVPlayer object, which will stream the video. Note that you need to import AVFoundation to use AVPlayer and AVPlayerLayer. The function for setting up the player is as follows:
func setupVideoPlayerWithURL(url: NSURL) {
    player = AVPlayer(URL: url)
    playerLayer = AVPlayerLayer(player: self.player)
    playerLayer.videoGravity = AVLayerVideoGravityResizeAspect
    playerLayer.frame = self.view.frame // take up entire screen
    self.view.layer.addSublayer(self.playerLayer)
    player.play()
}
I would recommend checking out Apple's docs on AVPlayer and AVPlayerLayer to learn more about video playback.
AVPlayer: https://developer.apple.com/library/prerelease/ios/documentation/AVFoundation/Reference/AVPlayer_Class/index.html
AVPlayerLayer: https://developer.apple.com/library/prerelease/ios/documentation/AVFoundation/Reference/AVPlayerLayer_Class/index.html#//apple_ref/occ/cl/AVPlayerLayer
I have built an app that records an audio clip, and saves it to a file called xxx.m4a.
Once the recording is made, I can find this xxx.m4a file on my system and play it back - and it plays back fine. The problem isn't recording this audio track.
My problem is playing this audio track back from within the app.
// to demonstrate how I am building the recordedFileURL
let currentFileName = "xxx.m4a"
let dirPaths = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)
let docsDir: AnyObject = dirPaths[0]
let recordedFilePath = docsDir.stringByAppendingPathComponent(currentFileName)
var recordedFileURL = NSURL(fileURLWithPath: recordedFilePath)

// quick check for my own sanity
var checkIfExists = NSFileManager.defaultManager()
if checkIfExists.fileExistsAtPath(recordedFilePath) {
    println("audio file exists!")
} else {
    println("audio file does not exist")
}

// play the recorded audio
var error: NSError?
let player = AVAudioPlayer(contentsOfURL: recordedFileURL!, error: &error)
if player == nil {
    println("error creating avaudioplayer")
    if let e = error {
        println(e.localizedDescription)
    }
}
println("about to play")
player.delegate = self
player.prepareToPlay()
player.volume = 1.0
player.play()
println("told to play")

// -------
// further on in this class I have the following delegate methods:
func audioPlayerDidFinishPlaying(player: AVAudioPlayer!, successfully flag: Bool) {
    println("finished playing (successfully? \(flag))")
}

func audioPlayerDecodeErrorDidOccur(player: AVAudioPlayer!, error: NSError!) {
    println("audioPlayerDecodeErrorDidOccur: \(error.localizedDescription)")
}
My console output when I run this code is like this:
audio file exists!
about to play
told to play
I don't get anything logged to the console from either audioPlayerDidFinishPlaying or audioPlayerDecodeErrorDidOccur.
Can someone explain why my audio clip isn't playing?
Many thanks.
Your playing code is fine; your only problem is the scope of your AVAudioPlayer object. Basically, you need to keep a strong reference to the player for the whole time it's playing; otherwise it will be destroyed by ARC, which assumes you don't need it any more.
Normally you'd make the AVAudioPlayer object a property of whatever class your code is in, rather than a local variable in a method.
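A minimal sketch of that fix in current Swift syntax (class and property names are illustrative):

import UIKit
import AVFoundation

class RecordingViewController: UIViewController, AVAudioPlayerDelegate {
    // Strong reference: the player lives as long as the controller,
    // not just until the end of the method that created it.
    var player: AVAudioPlayer?

    func playRecording(at url: URL) {
        do {
            let newPlayer = try AVAudioPlayer(contentsOf: url)
            newPlayer.delegate = self
            newPlayer.prepareToPlay()
            newPlayer.volume = 1.0
            player = newPlayer // keep it alive while it plays
            newPlayer.play()
        } catch {
            print("error creating AVAudioPlayer: \(error.localizedDescription)")
        }
    }
}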
Where you want playback through the speakers, add the following code before you initialize your AVAudioPlayer:
let recordingSession = AVAudioSession.sharedInstance()
do {
    try recordingSession.setCategory(AVAudioSessionCategoryPlayback)
} catch {
    // handle or at least log the error here
}
And when you are just recording with AVAudioRecorder, use this before initializing it:
do {
    recordingSession = AVAudioSession.sharedInstance()
    try recordingSession.setCategory(AVAudioSessionCategoryRecord)
} catch {
    // handle or at least log the error here
}