I have a video chat messaging app using AVFoundation and Firebase to record, store, and play back one-minute videos.
Everything works as intended, but there is a...
Delay in playing a fetched video
Delay in uploading a recorded video, which is especially long
Ideally...
It'd be nice to "pre-load" a fetched video so it plays immediately on command, but that doesn't seem possible with AVPlayer, where the loading and playback appear to happen only when .play() is invoked.
Would simultaneously uploading while the recording is still taking place even be possible? Or does Firebase Storage work such that once the network call to upload the video is triggered, the app can enter the background and still complete it?
I am admittedly a complete beginner at managing video, and I haven't found any concrete guides on eliminating or reducing these delays for a better UX (e.g. how Instagram plays and uploads a video story). Any help would be appreciated.
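On the preloading point: AVPlayer appears to start buffering as soon as an item is attached, not only when .play() is called, so one possible preload shape is sketched below. The preloadedItem property and the two-method split are illustrative assumptions, not a tested recipe:

var preloadedItem: AVPlayerItem?

func preload(url: URL) {
    // Attaching the item kicks off buffering before the user ever taps play.
    let item = AVPlayerItem(asset: AVAsset(url: url))
    avPlayer.replaceCurrentItem(with: item)
    preloadedItem = item
}

func playOnCommand() {
    // By the time the user taps, the buffer is usually warm, so this starts quickly.
    avPlayer.play()
}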
func playVideo(with outputFileURL: URL) {
    DispatchQueue.main.async {
        self.setView(view: self.progressBar, hidden: true)
        self.progressBar.progress = 0

        // Build a player item from the local file and hand it to the player.
        let asset = AVAsset(url: outputFileURL)
        let item = AVPlayerItem(asset: asset)
        self.avPlayer.replaceCurrentItem(with: item)

        // Note: this adds a new layer on every call; see the sketch below.
        let previewLayer = AVPlayerLayer(player: self.avPlayer)
        previewLayer.frame = self.view.bounds
        previewLayer.videoGravity = AVLayerVideoGravity.resizeAspectFill
        self.previewView.layer.addSublayer(previewLayer)
        self.view.layoutIfNeeded()

        self.avPlayer.play()
    }
}
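One extra source of delay and glitching in the snippet above: each call adds a fresh AVPlayerLayer on top of the previous ones. A common pattern is to create the layer once and reuse it; a minimal sketch, assuming a playerLayer property that is not in the original code:

var playerLayer: AVPlayerLayer?

func attachPlayerLayerIfNeeded() {
    guard playerLayer == nil else { return }
    let layer = AVPlayerLayer(player: avPlayer)
    layer.frame = previewView.bounds
    layer.videoGravity = .resizeAspectFill
    previewView.layer.addSublayer(layer)
    playerLayer = layer // reused on every subsequent playback
}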
func uploadVideo(_ url: URL) {
    let filename = "x"
    let ref = Storage.storage().reference().child("videos").child("xyz").child(filename)
    let uploadTask = ref.putFile(from: url, metadata: nil, completion: { (_, err) in
        if let err = err {
            print("Failed to upload movie:", err)
            return
        }
        // Fetch the permanent download URL once the upload has finished.
        ref.downloadURL(completion: { (downloadUrl, err) in
            if let err = err {
                print("Failed to get download url:", err)
                return
            }
            guard let downloadUrl = downloadUrl else { return }
            if let thumbnailImage = self.thumbnailImageForFileUrl(url) {
                self.uploadToFirebaseStorageUsingImage(thumbnailImage, completion: { (imageUrl) in
                    print("saved video url: \(downloadUrl) and saved image url: \(imageUrl)")
                })
            }
        })
    })
    uploadTask.observe(.progress) { (snapshot) in
        print("In Progress")
    }
    uploadTask.observe(.success) { (snapshot) in
        print("Done")
    }
}
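As a side note, the .progress observer already receives everything needed to drive the progress bar: the snapshot carries a Progress object. A small sketch, reusing the progressBar outlet from the playback code above:

uploadTask.observe(.progress) { snapshot in
    // fractionCompleted runs from 0.0 to 1.0 as bytes are sent.
    guard let fraction = snapshot.progress?.fractionCompleted else { return }
    DispatchQueue.main.async {
        self.progressBar.progress = Float(fraction)
    }
}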
func thumbnailImageForFileUrl(_ fileUrl: URL) -> UIImage? {
    let asset = AVAsset(url: fileUrl)
    let imageGenerator = AVAssetImageGenerator(asset: asset)
    imageGenerator.appliesPreferredTrackTransform = true
    do {
        let thumbnailCGImage = try imageGenerator.copyCGImage(at: CMTimeMake(value: 2, timescale: 60), actualTime: nil)
        return UIImage(cgImage: thumbnailCGImage)
    } catch let err {
        print(err)
    }
    return nil
}
As a temporary workaround, I looked into compressing the file.
The upload is much faster, but the quality is worse, and ideally quality should not have to be compromised.
If anyone has a better solution, I would really appreciate it and would love to hear it!
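On the background part of the question: a Firebase Storage upload runs inside the app's process, so it is not guaranteed to keep going once the app is suspended. A common mitigation is to ask iOS for extra background execution time around the upload; a sketch, assuming UIKit and the uploadVideo function above, with task bookkeeping simplified:

var backgroundTask: UIBackgroundTaskIdentifier = .invalid

func uploadWithBackgroundTime(_ url: URL) {
    // Request extra time so a backgrounded app can keep uploading briefly.
    backgroundTask = UIApplication.shared.beginBackgroundTask { [weak self] in
        guard let self = self else { return }
        // Expiration handler: give the time back before iOS terminates the app.
        UIApplication.shared.endBackgroundTask(self.backgroundTask)
        self.backgroundTask = .invalid
    }
    uploadVideo(url) // end the background task in the upload's completion
}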
Related
I am having a weird situation and have no clue how to handle it. I am downloading videos from Firebase Storage and caching them on the device for future use; meanwhile, a background thread is already doing its job. I pass a video URL to a function to play the video. The issue is that sometimes AVPlayer plays the right video and sometimes it picks up some other video URL from the cache.
You can find the code below:
func cacheVideo(for exercise: Exercise) {
    print(exercise.imageFileName)
    guard let filePath = filePathURL(for: exercise.imageFileName) else { return }
    if fileManager.fileExists(atPath: filePath.path) {
        // print("already exists")
    } else {
        exercise.loadRealURL { (url) in
            print(url)
            self.getFileWith(with: url, saveTo: filePath)
        }
    }
}
Writing the file here:
func getFileWith(with url: URL, saveTo saveFilePathURL: URL) {
    DispatchQueue.global(qos: .background).async {
        print(saveFilePathURL.path)
        if let videoData = NSData(contentsOf: url) {
            videoData.write(to: saveFilePathURL, atomically: true)
            DispatchQueue.main.async {
                // print("downloaded")
            }
        } else {
            DispatchQueue.main.async {
                let error = NSError(domain: "SomeErrorDomain", code: -2001 /* some error code */, userInfo: ["description": "Can't download video"])
                print(error.debugDescription)
            }
        }
    }
}
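As an aside, NSData(contentsOf:) blocks its thread for the whole download; Apple's documentation warns against it for network URLs (the warning is quoted further down this page). A download task hands back a temporary file that can be moved into place instead; a sketch with the same save-to signature:

func downloadFile(from url: URL, saveTo saveFilePathURL: URL) {
    let task = URLSession.shared.downloadTask(with: url) { tempURL, _, error in
        guard let tempURL = tempURL, error == nil else {
            print("Can't download video: \(String(describing: error))")
            return
        }
        do {
            // Move the finished download into the cache location.
            try FileManager.default.moveItem(at: tempURL, to: saveFilePathURL)
        } catch {
            print(error.localizedDescription)
        }
    }
    task.resume()
}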
Now playing the video using this:
func startPlayingVideoOnDemand(url: URL) {
    activityIndicatorView.startAnimating()
    activityIndicatorView.isHidden = false
    print(url)

    let cachingPlayerItem = CachingPlayerItem(url: url)
    cachingPlayerItem.delegate = self
    cachingPlayerItem.download()
    // cachingPlayerItem.preferredPeakBitRate = 0

    let avasset = AVAsset(url: url)
    let playerItem = AVPlayerItem(asset: avasset)
    let player = AVPlayer(playerItem: playerItem)
    player.automaticallyWaitsToMinimizeStalling = false
    initializeVideoLayer(for: player)
}
Any suggestions would be highly appreciated.
This was solved: the data model I was using to download a bunch of video files was being accessed on a background thread, while at the same time I was assigning the playback URL to that same data model object in order to fetch the video and play it in AVPlayer. That was the issue, and it was resolved by simply adding a new attribute to the data model for the URL to play right away.
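For anyone hitting the same race: the shape of the fix is simply to stop sharing one mutable URL field between the background cache writer and the player. A minimal sketch, with illustrative property names:

class Exercise {
    var imageFileName = ""
    var remoteURL: URL?    // written by the background caching code
    var playbackURL: URL?  // the new attribute, read only when playing
}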
When I use the code below and pull my HTTPS video link from Firebase over Wi-Fi, everything is smooth: the video immediately plays with zero issues. When I use this same code over cellular, everything moves extremely slowly; the video pauses and takes forever to load.
If it plays from file, whether I'm on cellular or Wi-Fi shouldn't matter. What is the issue here?
DataModel:
class Video {
    var httpsStr: String?
    var videoURL: URL?

    convenience init(dict: [String: Any]) {
        self.init()
        if let httpsStr = dict["httpsStr"] as? String {
            self.httpsStr = httpsStr
            let url = URL(string: httpsStr)!
            let assetKeys = ["playable", "duration"]
            let asset = AVURLAsset(url: url)
            asset.loadValuesAsynchronously(forKeys: assetKeys, completionHandler: {
                DispatchQueue.main.async {
                    self.videoURL = asset.url
                    // save videoURL to FileManager to play video from disk
                }
            })
        }
    }
}
Firebase Pull:
ref.observeSingleEvent(of: .value) { (snapshot) in
    guard let dict = snapshot.value as? [String: Any] else { return }
    let video = Video(dict: dict)
    self.video = video
    DispatchQueue.main.asyncAfter(deadline: .now() + 2, execute: {
        self.playVideo()
    })
}
Play Video:
func playVideo() {
    // init AVPlayer ...
    guard let videoURL = self.video.videoURL else { return }
    let lastPathComponent = videoURL.lastPathComponent
    let file = FileManager...appendingPathComponent(lastPathComponent)
    if FileManager.default.fileExists(atPath: file.path) {
        let asset = AVAsset(url: file)
        play(asset)
    } else {
        let asset = AVAsset(url: videoURL)
        play(asset)
    }
}

func play(_ asset: AVAsset) {
    self.playerItem = AVPlayerItem(asset: asset)
    self.player?.automaticallyWaitsToMinimizeStalling = false // I also set this to true
    self.playerItem?.preferredForwardBufferDuration = TimeInterval(1)
    self.player?.replaceCurrentItem(with: playerItem!)
    // play video
}
I followed this answer and now everything seems to work smoothly on cellular data. I needed to include the tracks property in the assetKeys.
You create an asset from a URL using AVURLAsset. Creating the asset, however, does not necessarily mean that it's ready for use. To be used, an asset must have loaded its tracks.
class Video {
    var httpsStr: String?
    var videoURL: URL?

    convenience init(dict: [String: Any]) {
        self.init()
        if let httpsStr = dict["httpsStr"] as? String {
            self.httpsStr = httpsStr
            let url = URL(string: httpsStr)!
            let assetKeys = ["playable", "duration", "tracks"] // <----- "tracks" added here
            let asset = AVURLAsset(url: url)
            asset.loadValuesAsynchronously(forKeys: assetKeys, completionHandler: {
                var error: NSError? = nil
                let status = asset.statusOfValue(forKey: "tracks", error: &error)
                switch status {
                case .loaded:
                    // Successfully loaded, continue processing
                    DispatchQueue.main.async {
                        self.videoURL = asset.url
                        // save videoURL to FileManager to play video from disk
                    }
                case .failed:
                    // Examine NSError pointer to determine failure
                    print("Error", error?.localizedDescription as Any)
                default:
                    // Handle all other cases
                    print("default")
                }
            })
        }
    }
}
I am using the method below to generate a video thumbnail from a remote server URL. If my URL is not a Firebase URL, e.g.
https://sample-videos.com/video123/mp4/720/big_buck_bunny_720p_1mb.mp4
then I am able to generate the thumbnail, but if my URL is a Firebase URL like the one below
https://firebasestorage.googleapis.com/v0/b/shaberi-a249e.appspot.com/o/message-videos%2F8EDAC3FC-D754-4165-990A-97F6ECE120A6.mp4?alt=media&token=b3271370-a408-467d-abbc-7df2beef45c7
then the video thumbnail is not generated.
Method for getting the video thumbnail:
func createThumbnailOfVideoFromRemoteUrl(url: String) -> UIImage? {
    let asset = AVAsset(url: URL(string: url)!)
    let assetImgGenerate = AVAssetImageGenerator(asset: asset)
    assetImgGenerate.appliesPreferredTrackTransform = true
    // Can set this to improve performance if target size is known beforehand
    // assetImgGenerate.maximumSize = CGSize(width, height)
    let time = CMTimeMakeWithSeconds(1.0, 100)
    do {
        let img = try assetImgGenerate.copyCGImage(at: time, actualTime: nil)
        let thumbnail = UIImage(cgImage: img)
        return thumbnail
    } catch {
        print(error.localizedDescription)
        return nil
    }
}
Please let me know what the issue is.
Please try the code below; it works for me. Generating the image asynchronously avoids the synchronous copyCGImage(at:) call, which has to block while the remote asset loads and can fail on network URLs, whereas generateCGImagesAsynchronously(forTimes:) waits for the data to arrive before extracting the frame.
func createThumbnailOfVideoFromRemoteUrl(url: String) {
    let asset = AVAsset(url: URL(string: url)!)
    // let durationSeconds = CMTimeGetSeconds(asset.duration)
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true
    let time = CMTimeMakeWithSeconds(3.0, preferredTimescale: 600)
    // var thumbnailImage: CGImage
    generator.generateCGImagesAsynchronously(forTimes: [NSValue(time: time)]) { (time, thumbnail, cmtime, result, error) in
        if let thumbnail = thumbnail {
            DispatchQueue.main.async {
                self.profileImgView.image = UIImage(cgImage: thumbnail)
            }
        }
    }
}
self.createThumbnailOfVideoFromRemoteUrl(url: "https://firebasestorage.googleapis.com/v0/b/shaberi-a249e.appspot.com/o/message-videos%2F8EDAC3FC-D754-4165-990A-97F6ECE120A6.mp4?alt=media&token=b3271370-a408-467d-abbc-7df2beef45c7")
I'm currently displaying a video in my app and I want the user to be able to save it to the device's photo gallery/camera roll.
Here is what I'm doing, but the video is not saved in the album :/
func downloadVideo(videoImageUrl: String)
{
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), {
        // All stuff here
        print("downloadVideo");
        let url = NSURL(string: videoImageUrl);
        let urlData = NSData(contentsOfURL: url!);
        if (urlData != nil)
        {
            let documentsPath = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)[0];
            let fileName = videoImageUrl; //.stringByDeletingPathExtension
            let filePath = "\(documentsPath)/\(fileName)";
            // saving is done on main thread
            dispatch_async(dispatch_get_main_queue(), { () -> Void in
                urlData?.writeToFile(filePath, atomically: true);
                print("videoSaved");
            })
        }
    })
}
I've also looked into this:
let url: NSURL = NSURL(string: fileURL)!
PHPhotoLibrary.sharedPhotoLibrary().performChanges({
    let assetChangeRequest = PHAssetChangeRequest.creationRequestForAssetFromVideoAtFileURL(url)
    let assetPlaceHolder = assetChangeRequest!.placeholderForCreatedAsset
    let albumChangeRequest = PHAssetCollectionChangeRequest(forAssetCollection: self.assetCollection)
    albumChangeRequest!.addAssets([assetPlaceHolder!])
}, completionHandler: saveVideoCallBack)
But I get the error "Unable to create data from file (null)". My assetChangeRequest is nil. I don't understand, as my URL is valid, and when I open it in a browser it downloads a QuickTime file.
If anyone can help me, it would be appreciated! I'm using Swift and targeting iOS 8.0 minimum.
Update
I wanted to update the answer for Swift 3 using URLSession and discovered that the answer already exists in a related topic here. Use it.
Original Answer
The code below saves a video file to the Camera Roll. I reused your code with a minor change: I removed let fileName = videoImageUrl; because it leads to an incorrect file path.
I tested this code and it saved the asset to the Camera Roll. You asked what to place into creationRequestForAssetFromVideoAtFileURL: put a link to the downloaded video file, as in the example below.
let videoImageUrl = "http://www.sample-videos.com/video/mp4/720/big_buck_bunny_720p_1mb.mp4"

DispatchQueue.global(qos: .background).async {
    if let url = URL(string: videoImageUrl),
        let urlData = NSData(contentsOf: url) {
        let documentsPath = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)[0]
        let filePath = "\(documentsPath)/tempFile.mp4"
        DispatchQueue.main.async {
            urlData.write(toFile: filePath, atomically: true)
            PHPhotoLibrary.shared().performChanges({
                PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: URL(fileURLWithPath: filePath))
            }) { completed, error in
                if completed {
                    print("Video is saved!")
                }
            }
        }
    }
}
Swift 3 version of the code from #Nimble:
DispatchQueue.global(qos: .background).async {
    if let url = URL(string: urlString),
        let urlData = NSData(contentsOf: url)
    {
        let documentsPath = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)[0]
        let filePath = "\(documentsPath)/tempFile.mp4"
        DispatchQueue.main.async {
            urlData.write(toFile: filePath, atomically: true)
            PHPhotoLibrary.shared().performChanges({
                PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: URL(fileURLWithPath: filePath))
            }) { completed, error in
                if completed {
                    print("Video is saved!")
                }
            }
        }
    }
}
PHPhotoLibrary.shared().performChanges({
    PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: video.url!)
}) { saved, error in
    if saved {
        print("Save status SUCCESS")
    }
}
Following #Nimble's and #Yuval Tal's solutions, it is much preferable to use the URLSession dataTask(with:completionHandler:) method to download the file before writing it, as stated in the warning section of the NSData(contentsOf:) Apple documentation:
Important
Don't use this synchronous initializer to request network-based URLs. For network-based URLs, this method can block the current thread for tens of seconds on a slow network, resulting in a poor user experience, and in iOS, may cause your app to be terminated. Instead, for non-file URLs, consider using the dataTask(with:completionHandler:) method of the URLSession.
A correct implementation could be:
let defaultSession = URLSession(configuration: .default)
var dataTask: URLSessionDataTask? = nil

func downloadAndSaveVideoToGallery(videoURL: String, id: String = "default") {
    DispatchQueue.global(qos: .background).async {
        if let url = URL(string: videoURL) {
            let filePath = FileManager.default.temporaryDirectory.appendingPathComponent("\(id).mp4")
            print("work started")
            self.dataTask = self.defaultSession.dataTask(with: url, completionHandler: { [weak self] data, res, err in
                DispatchQueue.main.async {
                    do {
                        try data?.write(to: filePath)
                        PHPhotoLibrary.shared().performChanges({
                            PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: filePath)
                        }) { completed, error in
                            if completed {
                                print("Saved to gallery!")
                            } else if let error = error {
                                print(error.localizedDescription)
                            }
                        }
                    } catch {
                        print(error.localizedDescription)
                    }
                }
                self?.dataTask = nil
            })
            self.dataTask?.resume()
        }
    }
}
One more advantage is that you can pause, resume, and terminate your download by calling the corresponding method on the dataTask: URLSessionDataTask offers .resume(), .suspend(), and .cancel().
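For example:

dataTask?.suspend() // pause the transfer
dataTask?.resume()  // continue it
dataTask?.cancel()  // abandon it entirely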
I am trying to compress video taken with the user's camera from UIImagePickerController (not an existing video, but one taken on the fly) to upload to my server, and to take a small amount of time to do so, so a smaller size is ideal instead of the 30-45 MB produced by newer-quality cameras.
Here is the code to do the compression in Swift for iOS 8, and it compresses wonderfully: I go from 35 MB down to 2.1 MB easily.
func convertVideo(inputUrl: NSURL, outputURL: NSURL)
{
    // setup video writer
    var videoAsset = AVURLAsset(URL: inputUrl, options: nil) as AVAsset
    var videoTrack = videoAsset.tracksWithMediaType(AVMediaTypeVideo)[0] as AVAssetTrack
    var videoSize = videoTrack.naturalSize
    var videoWriterCompressionSettings = Dictionary(dictionaryLiteral: (AVVideoAverageBitRateKey, NSNumber(integer: 960000)))
    var videoWriterSettings = Dictionary(dictionaryLiteral: (AVVideoCodecKey, AVVideoCodecH264),
                                         (AVVideoCompressionPropertiesKey, videoWriterCompressionSettings),
                                         (AVVideoWidthKey, videoSize.width),
                                         (AVVideoHeightKey, videoSize.height))
    var videoWriterInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: videoWriterSettings)
    videoWriterInput.expectsMediaDataInRealTime = true
    videoWriterInput.transform = videoTrack.preferredTransform
    var videoWriter = AVAssetWriter(URL: outputURL, fileType: AVFileTypeQuickTimeMovie, error: nil)
    videoWriter.addInput(videoWriterInput)
    var videoReaderSettings: [String: AnyObject] = [kCVPixelBufferPixelFormatTypeKey: kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange]
    var videoReaderOutput = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: videoReaderSettings)
    var videoReader = AVAssetReader(asset: videoAsset, error: nil)
    videoReader.addOutput(videoReaderOutput)

    // setup audio writer
    var audioWriterInput = AVAssetWriterInput(mediaType: AVMediaTypeAudio, outputSettings: nil)
    audioWriterInput.expectsMediaDataInRealTime = false
    videoWriter.addInput(audioWriterInput)

    // setup audio reader
    var audioTrack = videoAsset.tracksWithMediaType(AVMediaTypeAudio)[0] as AVAssetTrack
    var audioReaderOutput = AVAssetReaderTrackOutput(track: audioTrack, outputSettings: nil) as AVAssetReaderOutput
    var audioReader = AVAssetReader(asset: videoAsset, error: nil)
    audioReader.addOutput(audioReaderOutput)

    videoWriter.startWriting()

    // start writing from video reader
    videoReader.startReading()
    videoWriter.startSessionAtSourceTime(kCMTimeZero)
    // dispatch_queue_t processingQueue = dispatch_queue_create("processingQueue", nil)
    var queue = dispatch_queue_create("processingQueue", nil)
    videoWriterInput.requestMediaDataWhenReadyOnQueue(queue, usingBlock: { () -> Void in
        println("Export starting")
        while videoWriterInput.readyForMoreMediaData
        {
            var sampleBuffer: CMSampleBufferRef!
            sampleBuffer = videoReaderOutput.copyNextSampleBuffer()
            if (videoReader.status == AVAssetReaderStatus.Reading && sampleBuffer != nil)
            {
                videoWriterInput.appendSampleBuffer(sampleBuffer)
            }
            else
            {
                videoWriterInput.markAsFinished()
                if videoReader.status == AVAssetReaderStatus.Completed
                {
                    if audioReader.status == AVAssetReaderStatus.Reading || audioReader.status == AVAssetReaderStatus.Completed
                    {
                    }
                    else
                    {
                        audioReader.startReading()
                        videoWriter.startSessionAtSourceTime(kCMTimeZero)
                        var queue2 = dispatch_queue_create("processingQueue2", nil)
                        audioWriterInput.requestMediaDataWhenReadyOnQueue(queue2, usingBlock: { () -> Void in
                            while audioWriterInput.readyForMoreMediaData
                            {
                                var sampleBuffer: CMSampleBufferRef!
                                sampleBuffer = audioReaderOutput.copyNextSampleBuffer()
                                println(sampleBuffer == nil)
                                if (audioReader.status == AVAssetReaderStatus.Reading && sampleBuffer != nil)
                                {
                                    audioWriterInput.appendSampleBuffer(sampleBuffer)
                                }
                                else
                                {
                                    audioWriterInput.markAsFinished()
                                    if (audioReader.status == AVAssetReaderStatus.Completed)
                                    {
                                        videoWriter.finishWritingWithCompletionHandler({ () -> Void in
                                            println("Finished writing video asset.")
                                            self.videoUrl = outputURL
                                            var data = NSData(contentsOfURL: outputURL)!
                                            println("Byte Size After Compression: \(data.length / 1048576) mb")
                                            println(videoAsset.playable)
                                            // Networking().uploadVideo(data, fileName: "Test2")
                                            self.dismissViewControllerAnimated(true, completion: nil)
                                        })
                                        break
                                    }
                                }
                            }
                        })
                        break
                    }
                }
            } // second if
        } // first while
    }) // first block
    // return
}
Here is the code for my UIImagePickerController that calls the compress method:
func imagePickerController(picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [NSObject: AnyObject])
{
    // Extract the media type from selection
    let type = info[UIImagePickerControllerMediaType] as String
    if (type == kUTTypeMovie)
    {
        self.videoUrl = info[UIImagePickerControllerMediaURL] as? NSURL
        var uploadUrl = NSURL.fileURLWithPath(NSTemporaryDirectory().stringByAppendingPathComponent("captured").stringByAppendingString(".mov"))
        var data = NSData(contentsOfURL: self.videoUrl!)!
        println("Size Before Compression: \(data.length / 1048576) mb")
        self.convertVideo(self.videoUrl!, outputURL: uploadUrl!)
        // Get the video from the info and set it appropriately.
        /*self.dismissViewControllerAnimated(true, completion: { () -> Void in
            //self.next.enabled = true
        })*/
    }
}
As I mentioned above, this works as far as file size reduction, but when I get the file back (it is still of type .mov), QuickTime cannot play it. QuickTime does try to convert it initially but fails halfway through (1-2 seconds after opening the file). I've even tested the video file in AVPlayerController, but it doesn't give any info about the movie; it's just a play button without any loading and without any length, just "--" where the time usually is in the player, i.e. a corrupt file that won't play.
I'm sure it has something to do with the settings for writing the asset out, whether it is the video writing or the audio writing; I'm not sure at all. It could even be the reading of the asset that is causing the corruption. I've tried changing the variables around and setting different keys for reading and writing, but I haven't found the right combination, and it sucks that I can compress but get a corrupt file out of it. I'm not sure at all, and any help would be appreciated. Pleeeeeeeeease.
This answer has been completely rewritten and annotated to support Swift 4.0. Keep in mind that changing the AVFileType and presetName values allows you to tweak the final output in terms of size and quality.
import AVFoundation

extension ViewController: AVCaptureFileOutputRecordingDelegate {
    // Delegate function has been updated
    func fileOutput(_ output: AVCaptureFileOutput, didFinishRecordingTo outputFileURL: URL, from connections: [AVCaptureConnection], error: Error?) {
        // This code just exists for getting the before size. You can remove it from production code
        do {
            let data = try Data(contentsOf: outputFileURL)
            print("File size before compression: \(Double(data.count) / 1048576) mb")
        } catch {
            print("Error: \(error)")
        }

        // This line creates a generic filename based on UUID, but you may want to use your own
        // The extension must match with the AVFileType enum
        let path = NSTemporaryDirectory() + UUID().uuidString + ".m4v"
        let outputURL = URL(fileURLWithPath: path)
        // Compress the file that was just recorded (outputFileURL), not the destination
        let urlAsset = AVURLAsset(url: outputFileURL)
        // You can change the presetName value to obtain different results
        if let exportSession = AVAssetExportSession(asset: urlAsset,
                                                    presetName: AVAssetExportPresetMediumQuality) {
            exportSession.outputURL = outputURL
            // Changing the AVFileType enum gives you different options with
            // varying size and quality. Just ensure that the file extension
            // aligns with your choice
            exportSession.outputFileType = AVFileType.m4v
            exportSession.exportAsynchronously {
                switch exportSession.status {
                case .unknown: break
                case .waiting: break
                case .exporting: break
                case .completed:
                    // This code only exists to provide the file size after compression. Should remove this from production code
                    do {
                        let data = try Data(contentsOf: outputURL)
                        print("File size after compression: \(Double(data.count) / 1048576) mb")
                    } catch {
                        print("Error: \(error)")
                    }
                case .failed: break
                case .cancelled: break
                }
            }
        }
    }
}
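If you are unsure which presetName values apply to a given asset, AVAssetExportSession can list the compatible ones; for example:

let presets = AVAssetExportSession.exportPresets(compatibleWith: urlAsset)
print(presets) // e.g. ["AVAssetExportPresetLowQuality", "AVAssetExportPresetMediumQuality", ...]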
Below is the original answer as written for Swift 3.0:
extension ViewController: AVCaptureFileOutputRecordingDelegate {
    func capture(_ captureOutput: AVCaptureFileOutput!, didFinishRecordingToOutputFileAt outputFileURL: URL!, fromConnections connections: [Any]!, error: Error!) {
        guard let data = NSData(contentsOf: outputFileURL as URL) else {
            return
        }
        print("File size before compression: \(Double(data.length / 1048576)) mb")
        let compressedURL = NSURL.fileURL(withPath: NSTemporaryDirectory() + NSUUID().uuidString + ".m4v")
        compressVideo(inputURL: outputFileURL as URL, outputURL: compressedURL) { (exportSession) in
            guard let session = exportSession else {
                return
            }
            switch session.status {
            case .unknown:
                break
            case .waiting:
                break
            case .exporting:
                break
            case .completed:
                guard let compressedData = NSData(contentsOf: compressedURL) else {
                    return
                }
                print("File size after compression: \(Double(compressedData.length / 1048576)) mb")
            case .failed:
                break
            case .cancelled:
                break
            }
        }
    }
    func compressVideo(inputURL: URL, outputURL: URL, handler: @escaping (_ exportSession: AVAssetExportSession?) -> Void) {
        let urlAsset = AVURLAsset(url: inputURL, options: nil)
        guard let exportSession = AVAssetExportSession(asset: urlAsset, presetName: AVAssetExportPresetMediumQuality) else {
            handler(nil)
            return
        }
        exportSession.outputURL = outputURL
        exportSession.outputFileType = AVFileTypeQuickTimeMovie
        exportSession.shouldOptimizeForNetworkUse = true
        exportSession.exportAsynchronously { () -> Void in
            handler(exportSession)
        }
    }
}
Figured it out!
OK, so there were two problems. The first problem was with the videoWriter.finishWritingWithCompletionHandler function call: when this completion block gets executed, it DOES NOT mean that the video writer has finished writing to the output URL. So I had to check whether the status was completed before I uploaded the actual video file. It's kind of a hack, but this is what I did:
videoWriter.finishWritingWithCompletionHandler({ () -> Void in
    while true
    {
        if videoWriter.status == .Completed
        {
            var data = NSData(contentsOfURL: outputURL)!
            println("Finished: Byte Size After Compression: \(data.length / 1048576) mb")
            Networking().uploadVideo(data, fileName: "Video")
            self.dismissViewControllerAnimated(true, completion: nil)
            break
        }
    }
})
The second problem I was having was a Failed status, and that was because I kept writing to the same temp path, as shown in the code for the UIImagePickerController didFinishPickingMediaWithInfo method in my question. So I just used the current date in the file name so it would be unique.
var uploadUrl = NSURL.fileURLWithPath(NSTemporaryDirectory().stringByAppendingPathComponent("\(NSDate())").stringByAppendingString(".mov"))
[EDIT]: BETTER SOLUTION
OK, so after a lot of experimenting, and months later, I've found a damn good and much simpler solution for getting a video down from 45 MB to 1.42 MB with pretty good quality.
Below is the function to call instead of the original convertVideo function. Note that I had to write my own completion handler parameter, which is called after the asynchronous export has finished. I just called it handler.
func compressVideo(inputURL: NSURL, outputURL: NSURL, handler: (session: AVAssetExportSession) -> Void)
{
    var urlAsset = AVURLAsset(URL: inputURL, options: nil)
    var exportSession = AVAssetExportSession(asset: urlAsset, presetName: AVAssetExportPresetMediumQuality)
    exportSession.outputURL = outputURL
    exportSession.outputFileType = AVFileTypeQuickTimeMovie
    exportSession.shouldOptimizeForNetworkUse = true
    exportSession.exportAsynchronouslyWithCompletionHandler { () -> Void in
        handler(session: exportSession)
    }
}
And here is the code in the UIImagePickerController didFinishPickingMediaWithInfo function:
self.compressVideo(inputURL!, outputURL: uploadUrl!, handler: { (handler) -> Void in
    if handler.status == AVAssetExportSessionStatus.Completed
    {
        var data = NSData(contentsOfURL: uploadUrl!)
        println("File size after compression: \(Double(data!.length / 1048576)) mb")
        self.picker.dismissViewControllerAnimated(true, completion: nil)
    }
    else if handler.status == AVAssetExportSessionStatus.Failed
    {
        let alert = UIAlertView(title: "Uh oh", message: "There was a problem compressing the video; maybe you can try again later. Error: \(handler.error.localizedDescription)", delegate: nil, cancelButtonTitle: "Okay")
        alert.show()
    }
})
Your conversion method is asynchronous, yet it doesn't have a completion block. So how can your code know when the file is ready? Maybe you're using the file before it has been completely written.
The conversion itself also looks strange: audio and video are usually written in parallel, not in series.
Your miraculous compression ratio might indicate that you've written out fewer frames than you actually think.
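Concretely, that means threading a completion handler through the conversion, in the same spirit as the compressVideo function above; a sketch in the question's Swift era, with the reader/writer setup elided:

func convertVideo(inputUrl: NSURL, outputURL: NSURL, completion: (Bool) -> Void)
{
    // ... same reader/writer setup as before ...
    videoWriter.finishWritingWithCompletionHandler { () -> Void in
        // Only touch the output file once this fires and the status is good.
        completion(videoWriter.status == .Completed)
    }
}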