How to compress a video in Swift? - iOS

TLDR: Skip to the updates. I am looking for a way to compress or lower the quality of video output, preferably after creation rather than during it, but if compressing during creation is the only way then so be it.
Also, if you know of any good CocoaPods that can accomplish this, that would be great.
Update 3:
I am looking for a function which can output the compressed URL, and I should be able to control the compression quality...
Update 2:
After trying to make the function work in its current state, it does not work, yielding nil. I think this is a result of the following:
let outputURL = urlToCompress
assetWriter = try AVAssetWriter(outputURL: outputURL, fileType: AVFileType.mov)
I am trying to compress video in Swift. So far, all the solutions I have found are for use during creation. I am wondering whether there is a way to compress after creation, using only the video URL.
If not, how can I make a compression function that compresses the video and returns the compressed URL?
Code I have been working with:
func compressVideo(videoURL: URL) -> URL {
    let data = NSData(contentsOf: videoURL as URL)!
    print("File size before compression: \(Double(data.length / 1048576)) mb")
    let compressedURL = NSURL.fileURL(withPath: NSTemporaryDirectory() + NSUUID().uuidString + ".mov")
    compressVideoHelperMethod(inputURL: videoURL, outputURL: compressedURL) { (exportSession) in
    }
    return compressedURL
}

func compressVideoHelperMethod(inputURL: URL, outputURL: URL, handler: @escaping (_ exportSession: AVAssetExportSession?) -> Void) {
    let urlAsset = AVURLAsset(url: inputURL, options: nil)
    guard let exportSession = AVAssetExportSession(asset: urlAsset, presetName: AVAssetExportPresetMediumQuality) else {
        handler(nil)
        return
    }
    exportSession.outputURL = outputURL
    exportSession.outputFileType = AVFileType.mov
    exportSession.shouldOptimizeForNetworkUse = true
    exportSession.exportAsynchronously { () -> Void in
        handler(exportSession)
    }
}
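Note that, as written, compressVideo returns compressedURL immediately, while exportAsynchronously only finishes later, so the caller gets a URL that may not have been written yet. A completion-based variant along these lines (a sketch reusing the helper above) would hand back the URL only once the export has finished:

func compressVideo(videoURL: URL, completion: @escaping (URL?) -> Void) {
    // Sketch only: report the URL through a completion handler instead of returning it,
    // so the caller only sees the URL after the export has actually finished.
    let compressedURL = URL(fileURLWithPath: NSTemporaryDirectory() + UUID().uuidString + ".mov")
    compressVideoHelperMethod(inputURL: videoURL, outputURL: compressedURL) { exportSession in
        if exportSession?.status == .completed {
            completion(compressedURL)   // export finished; the file is now safe to use
        } else {
            completion(nil)             // failed or cancelled
        }
    }
}

A caller would then upload or play the file inside the completion closure rather than right after the call.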
Update:
So I have found the code below. I have yet to test it, but I don't know how I can make it so that I can choose the compression quality:
var assetWriter: AVAssetWriter?
var assetReader: AVAssetReader?
let bitrate: NSNumber = NSNumber(value: 250000)

func compressFile(urlToCompress: URL, outputURL: URL, completion: @escaping (URL) -> Void) {
    //video file to make the asset
    var audioFinished = false
    var videoFinished = false
    let asset = AVAsset(url: urlToCompress)
    let duration = asset.duration
    let durationTime = CMTimeGetSeconds(duration)
    print("Video Actual Duration -- \(durationTime)")

    //create asset reader
    do {
        assetReader = try AVAssetReader(asset: asset)
    } catch {
        assetReader = nil
    }
    guard let reader = assetReader else {
        fatalError("Could not initalize asset reader probably failed its try catch")
    }

    let videoTrack = asset.tracks(withMediaType: AVMediaType.video).first!
    let audioTrack = asset.tracks(withMediaType: AVMediaType.audio).first!
    let videoReaderSettings: [String: Any] = [(kCVPixelBufferPixelFormatTypeKey as String?)!: kCVPixelFormatType_32ARGB]

    // ADJUST BIT RATE OF VIDEO HERE
    if #available(iOS 11.0, *) {
        let videoSettings: [String: Any] = [
            AVVideoCompressionPropertiesKey: [AVVideoAverageBitRateKey: self.bitrate],
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoHeightKey: videoTrack.naturalSize.height,
            AVVideoWidthKey: videoTrack.naturalSize.width
        ]
    } else {
        // Fallback on earlier versions
    }

    let assetReaderVideoOutput = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: videoReaderSettings)
    let assetReaderAudioOutput = AVAssetReaderTrackOutput(track: audioTrack, outputSettings: nil)

    if reader.canAdd(assetReaderVideoOutput) {
        reader.add(assetReaderVideoOutput)
    } else {
        fatalError("Couldn't add video output reader")
    }
    if reader.canAdd(assetReaderAudioOutput) {
        reader.add(assetReaderAudioOutput)
    } else {
        fatalError("Couldn't add audio output reader")
    }

    let audioInput = AVAssetWriterInput(mediaType: AVMediaType.audio, outputSettings: nil)
    let videoInput = AVAssetWriterInput(mediaType: AVMediaType.video, outputSettings: videoReaderSettings)
    videoInput.transform = videoTrack.preferredTransform

    //we need to add samples to the video input
    let videoInputQueue = DispatchQueue(label: "videoQueue")
    let audioInputQueue = DispatchQueue(label: "audioQueue")

    do {
        assetWriter = try AVAssetWriter(outputURL: outputURL, fileType: AVFileType.mov)
    } catch {
        assetWriter = nil
    }
    guard let writer = assetWriter else {
        fatalError("assetWriter was nil")
    }

    writer.shouldOptimizeForNetworkUse = true
    writer.add(videoInput)
    writer.add(audioInput)

    writer.startWriting()
    reader.startReading()
    writer.startSession(atSourceTime: CMTime.zero)

    let closeWriter: () -> Void = {
        if audioFinished && videoFinished {
            self.assetWriter?.finishWriting(completionHandler: {
                print("------ Finish Video Compressing")
                completion((self.assetWriter?.outputURL)!)
            })
            self.assetReader?.cancelReading()
        }
    }

    audioInput.requestMediaDataWhenReady(on: audioInputQueue) {
        while audioInput.isReadyForMoreMediaData {
            let sample = assetReaderAudioOutput.copyNextSampleBuffer()
            if sample != nil {
                audioInput.append(sample!)
            } else {
                audioInput.markAsFinished()
                DispatchQueue.main.async {
                    audioFinished = true
                    closeWriter()
                }
                break
            }
        }
    }

    videoInput.requestMediaDataWhenReady(on: videoInputQueue) {
        //request data here
        while videoInput.isReadyForMoreMediaData {
            let sample = assetReaderVideoOutput.copyNextSampleBuffer()
            if sample != nil {
                let timeStamp = CMSampleBufferGetPresentationTimeStamp(sample!)
                let timeSecond = CMTimeGetSeconds(timeStamp)
                let per = timeSecond / durationTime
                print("Duration --- \(per)")
                videoInput.append(sample!)
            } else {
                videoInput.markAsFinished()
                DispatchQueue.main.async {
                    videoFinished = true
                    closeWriter()
                }
                break
            }
        }
    }
}
How can I change this to be able to set the quality? I am looking for a compression factor of about 0.6.
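Something like the following is roughly what I have in mind (an untested sketch inside compressFile, treating 0.6 as 60% of the source track's bitrate via its estimatedDataRate property):

// Untested sketch: derive the writer's target bitrate from the source track,
// so a factor of 0.6 means 60% of the original average bitrate.
let compressionFactor: Float = 0.6
let targetBitrate = Int(videoTrack.estimatedDataRate * compressionFactor)

let videoSettings: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecType.h264,
    AVVideoWidthKey: videoTrack.naturalSize.width,
    AVVideoHeightKey: videoTrack.naturalSize.height,
    AVVideoCompressionPropertiesKey: [AVVideoAverageBitRateKey: targetBitrate]
]
// Pass videoSettings (not videoReaderSettings) into the writer input:
let videoInput = AVAssetWriterInput(mediaType: AVMediaType.video, outputSettings: videoSettings)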
I am now playing around with the following code; the issue is that it keeps printing the error branch (it does not seem to work):
func convertVideoToLowQuailty(withInputURL inputURL: URL?, outputURL: URL?, handler: @escaping (AVAssetExportSession?) -> Void) {
    do {
        if let outputURL = outputURL {
            try FileManager.default.removeItem(at: outputURL)
        }
    } catch {
    }
    var asset: AVURLAsset? = nil
    if let inputURL = inputURL {
        asset = AVURLAsset(url: inputURL, options: nil)
    }
    var exportSession: AVAssetExportSession? = nil
    if let asset = asset {
        exportSession = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetMediumQuality)
    }
    exportSession?.outputURL = outputURL
    exportSession?.outputFileType = .mov
    exportSession?.exportAsynchronously(completionHandler: {
        handler(exportSession)
    })
}

func compressVideo(videoURL: URL) -> URL {
    var outputURL = URL(fileURLWithPath: "/Users/alexramirezblonski/Desktop/output.mov")
    convertVideoToLowQuailty(withInputURL: videoURL, outputURL: outputURL, handler: { exportSession in
        print("fdshljfhdlasjkfdhsfsdljk")
        if exportSession?.status == .completed {
            print("completed\n", exportSession!.outputURL!)
            outputURL = exportSession!.outputURL!
        } else {
            print("error\n")
            outputURL = exportSession!.outputURL! // this needs to be fixed and may cause errors
        }
    })
    return outputURL
}

I have looked at your code. You are currently compressing the video with the medium-quality preset, which produces output around the same size as the original video. So you have to change the presetName in the export session initialization as follows:
exportSession = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetLowQuality)
Passing AVAssetExportPresetLowQuality instead of AVAssetExportPresetMediumQuality should compress the video as you expect.
The following export presets are available for compressing video:
1. Available from iOS 11.0
AVAssetExportPresetHEVCHighestQuality
AVAssetExportPresetHEVC1920x1080
AVAssetExportPresetHEVC3840x2160
2. Available from iOS 4.0
AVAssetExportPresetLowQuality
AVAssetExportPresetMediumQuality
AVAssetExportPresetHighestQuality
AVAssetExportPreset640x480
AVAssetExportPreset960x540
AVAssetExportPreset1280x720
AVAssetExportPreset1920x1080
AVAssetExportPreset3840x2160
AVAssetExportPresetAppleM4A
You can use any of the above presets to compress your video based on your requirements.
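If you are unsure which presets will actually work for a given file, you can ask AVFoundation at runtime. A small illustrative snippet (yourVideoURL is a placeholder):

import AVFoundation

// List the presets that AVAssetExportSession can apply to this particular asset.
let asset = AVURLAsset(url: yourVideoURL) // placeholder URL
let compatiblePresets = AVAssetExportSession.exportPresets(compatibleWith: asset)
print(compatiblePresets)

// Only create the session with a preset that is known to be compatible.
if compatiblePresets.contains(AVAssetExportPresetLowQuality),
   let exportSession = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetLowQuality) {
    // configure outputURL / outputFileType and call exportAsynchronously as shown above
}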
I hope this will help you.

If you want more customizable compression filters using AVAssetWriter, consider this library that I wrote. You can compress the video with general quality settings or with more detailed filters like bitrate, fps, scale, and more.
FYVideoCompressor().compressVideo(yourVideoPath, quality: .lowQuality) { result in
    switch result {
    case .success(let compressedVideoURL):
        // use compressedVideoURL
        break
    case .failure(let error):
        // handle error
        break
    }
}
or with more custom configuration:
let config = FYVideoCompressor.CompressionConfig(videoBitrate: 1000_000,
                                                 videomaxKeyFrameInterval: 10,
                                                 fps: 24,
                                                 audioSampleRate: 44100,
                                                 audioBitrate: 128_000,
                                                 fileType: .mp4,
                                                 scale: CGSize(width: 640, height: 480))
FYVideoCompressor().compressVideo(yourVideoPath, config: config) { result in
    switch result {
    case .success(let compressedVideoURL):
        // use compressedVideoURL
        break
    case .failure(let error):
        // handle error
        break
    }
}
More: Batch compression is now supported.

Related

How to get video and audio from MKV? (Swift)

I want to convert MKV (Matroska) to MP4 in Swift.
When I add a file in MKV format, my code breaks at line 8. What should I do to fix that?
This is my code:
let composition = AVMutableComposition()
do {
    let sourceUrl = Bundle.main.url(forResource: "sample", withExtension: "mov")!
    let asset = AVURLAsset(url: sourceUrl)
    // line 8, breaks here:
    guard let videoAssetTrack = asset.tracks(withMediaType: AVMediaType.video).first else { return }
    guard let audioCompositionTrack = composition.addMutableTrack(withMediaType: AVMediaType.video, preferredTrackID: kCMPersistentTrackID_Invalid) else { return }
    try audioCompositionTrack.insertTimeRange(videoAssetTrack.timeRange, of: videoAssetTrack, at: CMTime.zero)
} catch {
    print(error)
}
// Create an export session
let exportSession = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetPassthrough)!
exportSession.outputFileType = AVFileType.mp4
exportSession.outputURL = browseURL
// Export file
exportSession.exportAsynchronously {
    guard case exportSession.status = AVAssetExportSession.Status.completed else { return }
    DispatchQueue.main.async {
        // Present a UIActivityViewController to share audio file
        print("completed")
    }
}
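As a diagnostic, one can check whether AVFoundation can read any tracks from the file at all before building the composition. A minimal sketch (assuming the file has an .mkv extension; AVFoundation has no native Matroska support, so an MKV URL will typically report no tracks, which is why the guard at line 8 bails out):

import AVFoundation

// Minimal diagnostic sketch: see what AVFoundation can actually read from the source file.
let sourceUrl = Bundle.main.url(forResource: "sample", withExtension: "mkv")!
let asset = AVURLAsset(url: sourceUrl)

print("readable:", asset.isReadable, "playable:", asset.isPlayable)
print("video tracks:", asset.tracks(withMediaType: .video).count)
print("audio tracks:", asset.tracks(withMediaType: .audio).count)

// If both counts are 0, the container itself is not supported, so the file
// has to be remuxed into a supported container (e.g. with FFmpeg) before
// AVFoundation can read or export it.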

kCMSampleBufferAttachmentKey_TrimDurationAtStart crash

I have the following function that takes the URL of an existing video file and compresses it into an output file. I am using AVAssetReader and AVAssetWriter.
var assetReader: AVAssetReader?
var assetWriter: AVAssetWriter?

func compressFile(urlToCompress: URL, outputURL: URL, completion: @escaping (URL) -> Void) {
    //video file to make the asset
    var audioFinished = false
    var videoFinished = false
    let asset = AVAsset(url: urlToCompress)

    //create asset reader
    do {
        assetReader = try AVAssetReader(asset: asset)
    } catch {
        assetReader = nil
    }
    guard let reader = assetReader else {
        fatalError("Could not initalize asset reader probably failed its try catch")
    }

    let videoTrack = asset.tracks(withMediaType: AVMediaType.video).first!
    let audioTrack = asset.tracks(withMediaType: AVMediaType.audio).first!
    let videoReaderSettings: [String: Any] = [kCVPixelBufferPixelFormatTypeKey as String!: kCVPixelFormatType_32ARGB]

    // ADJUST BIT RATE OF VIDEO HERE
    let width = UIScreen.main.bounds.width - 20
    let scale = width / videoTrack.naturalSize.width
    let height = Int(videoTrack.naturalSize.height * scale)
    let videoSettings: [String: Any] = [
        AVVideoCompressionPropertiesKey: [AVVideoAverageBitRateKey: 4000000],
        AVVideoCodecKey: AVVideoCodecH264,
        AVVideoHeightKey: height,
        AVVideoWidthKey: Int(width)
    ]

    let assetReaderVideoOutput = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: videoReaderSettings)
    let assetReaderAudioOutput = AVAssetReaderTrackOutput(track: audioTrack, outputSettings: nil)

    if reader.canAdd(assetReaderVideoOutput) {
        reader.add(assetReaderVideoOutput)
    } else {
        fatalError("Couldn't add video output reader")
    }
    if reader.canAdd(assetReaderAudioOutput) {
        reader.add(assetReaderAudioOutput)
    } else {
        fatalError("Couldn't add audio output reader")
    }

    let audioInput = AVAssetWriterInput(mediaType: AVMediaType.audio, outputSettings: nil)
    let videoInput = AVAssetWriterInput(mediaType: AVMediaType.video, outputSettings: videoSettings)
    videoInput.transform = videoTrack.preferredTransform

    //we need to add samples to the video input
    let videoInputQueue = DispatchQueue(label: "videoQueue")
    let audioInputQueue = DispatchQueue(label: "audioQueue")

    do {
        assetWriter = try AVAssetWriter(outputURL: outputURL, fileType: AVFileType.mov)
    } catch {
        assetWriter = nil
    }
    guard let writer = assetWriter else {
        fatalError("assetWriter was nil")
    }

    writer.shouldOptimizeForNetworkUse = true
    writer.add(videoInput)
    writer.add(audioInput)

    writer.startWriting()
    reader.startReading()
    writer.startSession(atSourceTime: CMTime.zero)

    let closeWriter: () -> Void = {
        if audioFinished && videoFinished {
            self.assetWriter?.finishWriting(completionHandler: {
                self.checkFileSize(sizeUrl: (self.assetWriter?.outputURL)!, message: "The file size of the compressed file is: ")
                completion((self.assetWriter?.outputURL)!)
            })
            self.assetReader?.cancelReading()
        }
    }

    audioInput.requestMediaDataWhenReady(on: audioInputQueue) {
        while audioInput.isReadyForMoreMediaData {
            let sample = assetReaderAudioOutput.copyNextSampleBuffer()
            if sample != nil {
                audioInput.append(sample!)
            } else {
                audioInput.markAsFinished()
                DispatchQueue.main.async {
                    audioFinished = true
                    closeWriter()
                }
                break
            }
        }
    }

    videoInput.requestMediaDataWhenReady(on: videoInputQueue) {
        //request data here
        while videoInput.isReadyForMoreMediaData {
            let sample = assetReaderVideoOutput.copyNextSampleBuffer()
            if sample != nil {
                videoInput.append(sample!)
            } else {
                videoInput.markAsFinished()
                DispatchQueue.main.async {
                    videoFinished = true
                    closeWriter()
                }
                break
            }
        }
    }
}

func checkFileSize(sizeUrl: URL, message: String) {
    let data = NSData(contentsOf: sizeUrl)!
    print(message, (Double(data.length) / 1048576.0), " mb")
}
For some reason I am getting the error below. My question is: how do I set this key, and more importantly, why should I set it and to what value? And what is encoder delay? Thank you.
*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[AVAssetWriterInput appendSampleBuffer:] Cannot append sample buffer: First input buffer must have an appropriate kCMSampleBufferAttachmentKey_TrimDurationAtStart since the codec has encoder delay'
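A commonly suggested workaround, sketched here under the assumption of AAC audio whose encoder adds roughly 1024 priming samples, is to attach the trim duration to the first audio sample buffer before appending it:

import AVFoundation
import CoreMedia

// Sketch: mark the first audio buffer with the duration the decoder should trim
// to compensate for the codec's encoder delay (priming samples).
var isFirstAudioBuffer = true

func appendAudio(_ sample: CMSampleBuffer, to audioInput: AVAssetWriterInput) {
    if isFirstAudioBuffer {
        // 1024 samples at 44.1 kHz is the usual AAC priming duration; adjust the
        // timescale to the actual sample rate of the source track if it differs.
        let trimDuration = CMTimeMake(value: 1024, timescale: 44100)
        if let dict = CMTimeCopyAsDictionary(trimDuration, allocator: kCFAllocatorDefault) {
            CMSetAttachment(sample,
                            key: kCMSampleBufferAttachmentKey_TrimDurationAtStart,
                            value: dict,
                            attachmentMode: kCMAttachmentMode_ShouldPropagate)
        }
        isFirstAudioBuffer = false
    }
    audioInput.append(sample)
}

Encoder delay refers to the priming samples an audio encoder inserts at the start of a stream; the trim attachment tells downstream consumers to drop that stretch on playback so the audio stays in sync.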

Is there any way to compress video quickly?

I am using AVAssetWriter for compressing video, but it takes 20 to 25 seconds to compress a video of about 160 MB down to 12 MB. I want a fixed bitrate of 1600 kbps and the video's natural frame size. Is there any way to compress more quickly?
func compressFile(urlToCompress: URL, outputURL: URL, completion: @escaping (URL) -> Void) {
    //video file to make the asset
    var audioFinished = false
    var videoFinished = false
    let asset = AVAsset(url: urlToCompress)

    //create asset reader
    do {
        assetReader = try AVAssetReader(asset: asset)
    } catch {
        assetReader = nil
    }
    guard let reader = assetReader else {
        fatalError("Could not initalize asset reader probably failed its try catch")
    }

    let videoTrack = asset.tracks(withMediaType: AVMediaType.video).first!
    let audioTrack = asset.tracks(withMediaType: AVMediaType.audio).first!
    let videoReaderSettings: [String: Any] = [(kCVPixelBufferPixelFormatTypeKey as String?)!: kCVPixelFormatType_32ARGB]

    // MARK: ADJUST BIT RATE OF VIDEO HERE
    let videoSettings: [String: Any] = [
        AVVideoCompressionPropertiesKey: [AVVideoAverageBitRateKey: self.bitrate] as Any,
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoHeightKey: videoTrack.naturalSize.height, //352,
        AVVideoWidthKey: videoTrack.naturalSize.width    //640
    ]

    let assetReaderVideoOutput = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: videoReaderSettings)
    let assetReaderAudioOutput = AVAssetReaderTrackOutput(track: audioTrack, outputSettings: nil)

    if reader.canAdd(assetReaderVideoOutput) {
        reader.add(assetReaderVideoOutput)
    } else {
        fatalError("Couldn't add video output reader")
    }
    if reader.canAdd(assetReaderAudioOutput) {
        reader.add(assetReaderAudioOutput)
    } else {
        fatalError("Couldn't add audio output reader")
    }

    let audioInput = AVAssetWriterInput(mediaType: AVMediaType.audio, outputSettings: nil)
    let videoInput = AVAssetWriterInput(mediaType: AVMediaType.video, outputSettings: videoSettings)
    videoInput.transform = videoTrack.preferredTransform

    //we need to add samples to the video input
    let videoInputQueue = DispatchQueue(label: "videoQueue")
    let audioInputQueue = DispatchQueue(label: "audioQueue")

    do {
        assetWriter = try AVAssetWriter(outputURL: outputURL, fileType: AVFileType.mov)
    } catch {
        assetWriter = nil
    }
    guard let writer = assetWriter else {
        fatalError("assetWriter was nil")
    }

    writer.shouldOptimizeForNetworkUse = true
    writer.add(videoInput)
    writer.add(audioInput)

    writer.startWriting()
    reader.startReading()
    writer.startSession(atSourceTime: CMTime.zero)

    let closeWriter: () -> Void = {
        if audioFinished && videoFinished {
            self.assetWriter?.finishWriting(completionHandler: {
                self.checkFileSize(sizeUrl: (self.assetWriter?.outputURL)!, message: "The file size of the compressed file is: ")
                completion((self.assetWriter?.outputURL)!)
            })
            self.assetReader?.cancelReading()
        }
    }

    audioInput.requestMediaDataWhenReady(on: audioInputQueue) {
        while audioInput.isReadyForMoreMediaData {
            let sample = assetReaderAudioOutput.copyNextSampleBuffer()
            if sample != nil {
                audioInput.append(sample!)
            } else {
                audioInput.markAsFinished()
                DispatchQueue.main.async {
                    audioFinished = true
                    closeWriter()
                }
                break
            }
        }
    }

    videoInput.requestMediaDataWhenReady(on: videoInputQueue) {
        //request data here
        while videoInput.isReadyForMoreMediaData {
            let sample = assetReaderVideoOutput.copyNextSampleBuffer()
            if sample != nil {
                videoInput.append(sample!)
            } else {
                videoInput.markAsFinished()
                DispatchQueue.main.async {
                    videoFinished = true
                    closeWriter()
                }
                break
            }
        }
    }
}
Edit: Using the above method, the video is converted at a different bitrate, not at the defined bitrate. Why?
Any direction will be appreciated.
Thanks in advance.
This is a non-trivial problem, and there is no "simple" answer to how to encode faster. If you are unfamiliar with this area, you are better off picking bitrates and codecs that simply encode faster; you have to assume the library is doing its job. So, there are a few things you can do:
Make the source file smaller, or chunk it.
Run it on a cloud service which has beefier hardware.
Choose another codec and a lower combination of bitrate, width, and height (https://developer.apple.com/documentation/avfoundation/avvideocodectype); see the sketch after this list.
You could potentially spin up multiple threads and encode faster this way, but I doubt it would gain you much.
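To illustrate the third point (a rough sketch only; the numbers are placeholders around your 1600 kbps target, the speed gain is approximate, and videoTrack is the track from the function above): smaller output dimensions and a simpler H.264 profile generally give the encoder less work to do.

// Sketch: trade resolution and profile complexity for encoding speed.
let scaleFactor: CGFloat = 0.5 // assumption: halving the output resolution is acceptable
let videoSettings: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecType.h264,
    AVVideoWidthKey: Int(videoTrack.naturalSize.width * scaleFactor),
    AVVideoHeightKey: Int(videoTrack.naturalSize.height * scaleFactor),
    AVVideoCompressionPropertiesKey: [
        AVVideoAverageBitRateKey: 1_600_000, // the 1600 kbps target from the question
        AVVideoProfileLevelKey: AVVideoProfileLevelH264BaselineAutoLevel // baseline uses fewer encoding tools than high profile
    ]
]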

Issue in merging/mixing two audio files using AVMutableCompositionTrack

It crashes when I try to play the merged audio.
I am trying to merge two audio files. They merge successfully, but when I play the merged audio with AVAudioPlayer it crashes. There is also a problem with formats: the merged audio only saves correctly in .m4a format; if I save it with a .wav extension, it crashes.
func merge(audio1: NSURL, audio2: NSURL) {
    let finalURL = getMergeFileURL()
    let preferredTimeScale: Int32 = 100

    //This object will be edited to include both audio files
    let composition = AVMutableComposition()

    let compositionAudioTrack1: AVMutableCompositionTrack = composition.addMutableTrack(withMediaType: AVMediaTypeAudio, preferredTrackID: CMPersistentTrackID())
    //let url1 = audio1
    let avAsset1 = AVURLAsset(url: audio1 as URL, options: nil)
    let tracks1 = avAsset1.tracks(withMediaType: AVMediaTypeAudio)
    let assetTrack1: AVAssetTrack = tracks1[0]
    let duration1: CMTime = CMTimeMakeWithSeconds(30.0, preferredTimeScale)
    let startCMTime = CMTimeMakeWithSeconds(Double(30.0), preferredTimeScale)
    let timeRange1 = CMTimeRangeMake(startCMTime, duration1)

    let compositionAudioTrack2: AVMutableCompositionTrack = composition.addMutableTrack(withMediaType: AVMediaTypeAudio, preferredTrackID: CMPersistentTrackID())
    //let url2 = audio2
    let avAsset2 = AVURLAsset(url: audio2 as URL, options: nil)
    let tracks2 = avAsset2.tracks(withMediaType: AVMediaTypeAudio)
    let assetTrack2: AVAssetTrack = tracks2[0]
    let duration2: CMTime = CMTimeMakeWithSeconds(30.0, preferredTimeScale)
    let startCMTime2 = CMTimeMakeWithSeconds(Double(30.0), preferredTimeScale)
    let timeRange2 = CMTimeRangeMake(startCMTime, duration1)

    //Insert the tracks into the composition
    do {
        try compositionAudioTrack1.insertTimeRange(timeRange1, of: assetTrack1, at: kCMTimeZero)
        try compositionAudioTrack2.insertTimeRange(timeRange2, of: assetTrack2, at: duration1)
    } catch {
        print(error)
    }

    //Perform the merge
    let assetExport = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetAppleM4A)
    assetExport!.outputFileType = AVFileTypeAppleM4A
    assetExport!.outputURL = finalURL as URL // final url is the url of that merged file
    assetExport!.exportAsynchronously(completionHandler: {
        switch assetExport!.status {
        case AVAssetExportSessionStatus.failed:
            print("failed \(assetExport!.error)")
        case AVAssetExportSessionStatus.cancelled:
            print("cancelled \(assetExport!.error)")
        default:
            print("complete")
        }
    })
}
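One thing worth ruling out (a sketch, assuming the player is created right after calling merge): exportAsynchronously returns immediately, so AVAudioPlayer should only be created inside the completion handler once the status is .completed, and from the .m4a URL the session actually wrote; renaming the extension to .wav does not convert the data, since AVAssetExportPresetAppleM4A always produces an M4A file.

// Sketch: only hand the merged file to AVAudioPlayer after the export reports .completed.
assetExport!.exportAsynchronously(completionHandler: {
    DispatchQueue.main.async {
        guard assetExport!.status == .completed else {
            print("export not finished: \(String(describing: assetExport!.error))")
            return
        }
        do {
            // finalURL must keep the .m4a extension that matches AVFileTypeAppleM4A;
            // keep a strong reference to the player in real code so it isn't deallocated.
            let player = try AVAudioPlayer(contentsOf: finalURL as URL)
            player.play()
        } catch {
            print("player error: \(error)")
        }
    }
})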

iOS Video Compression Swift iOS 8 corrupt video file

I am trying to compress video taken with the user's camera from UIImagePickerController (not an existing video, but one captured on the fly) to upload to my server, and the upload should take a small amount of time, so a smaller size is ideal instead of the 30-45 MB produced by newer, higher-quality cameras.
Here is the code to do the compression in Swift for iOS 8, and it compresses wonderfully: I go from 35 MB down to 2.1 MB easily.
func convertVideo(inputUrl: NSURL, outputURL: NSURL)
{
    //setup video writer
    var videoAsset = AVURLAsset(URL: inputUrl, options: nil) as AVAsset
    var videoTrack = videoAsset.tracksWithMediaType(AVMediaTypeVideo)[0] as AVAssetTrack
    var videoSize = videoTrack.naturalSize
    var videoWriterCompressionSettings = Dictionary(dictionaryLiteral: (AVVideoAverageBitRateKey, NSNumber(integer: 960000)))
    var videoWriterSettings = Dictionary(dictionaryLiteral: (AVVideoCodecKey, AVVideoCodecH264),
                                         (AVVideoCompressionPropertiesKey, videoWriterCompressionSettings),
                                         (AVVideoWidthKey, videoSize.width),
                                         (AVVideoHeightKey, videoSize.height))
    var videoWriterInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: videoWriterSettings)
    videoWriterInput.expectsMediaDataInRealTime = true
    videoWriterInput.transform = videoTrack.preferredTransform
    var videoWriter = AVAssetWriter(URL: outputURL, fileType: AVFileTypeQuickTimeMovie, error: nil)
    videoWriter.addInput(videoWriterInput)
    var videoReaderSettings: [String: AnyObject] = [kCVPixelBufferPixelFormatTypeKey: kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange]
    var videoReaderOutput = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: videoReaderSettings)
    var videoReader = AVAssetReader(asset: videoAsset, error: nil)
    videoReader.addOutput(videoReaderOutput)

    //setup audio writer
    var audioWriterInput = AVAssetWriterInput(mediaType: AVMediaTypeAudio, outputSettings: nil)
    audioWriterInput.expectsMediaDataInRealTime = false
    videoWriter.addInput(audioWriterInput)

    //setup audio reader
    var audioTrack = videoAsset.tracksWithMediaType(AVMediaTypeAudio)[0] as AVAssetTrack
    var audioReaderOutput = AVAssetReaderTrackOutput(track: audioTrack, outputSettings: nil) as AVAssetReaderOutput
    var audioReader = AVAssetReader(asset: videoAsset, error: nil)
    audioReader.addOutput(audioReaderOutput)

    videoWriter.startWriting()

    //start writing from video reader
    videoReader.startReading()
    videoWriter.startSessionAtSourceTime(kCMTimeZero)
    //dispatch_queue_t processingQueue = dispatch_queue_create("processingQueue", nil)
    var queue = dispatch_queue_create("processingQueue", nil)

    videoWriterInput.requestMediaDataWhenReadyOnQueue(queue, usingBlock: { () -> Void in
        println("Export starting")
        while videoWriterInput.readyForMoreMediaData
        {
            var sampleBuffer: CMSampleBufferRef!
            sampleBuffer = videoReaderOutput.copyNextSampleBuffer()
            if (videoReader.status == AVAssetReaderStatus.Reading && sampleBuffer != nil)
            {
                videoWriterInput.appendSampleBuffer(sampleBuffer)
            }
            else
            {
                videoWriterInput.markAsFinished()
                if videoReader.status == AVAssetReaderStatus.Completed
                {
                    if audioReader.status == AVAssetReaderStatus.Reading || audioReader.status == AVAssetReaderStatus.Completed
                    {
                    }
                    else {
                        audioReader.startReading()
                        videoWriter.startSessionAtSourceTime(kCMTimeZero)
                        var queue2 = dispatch_queue_create("processingQueue2", nil)
                        audioWriterInput.requestMediaDataWhenReadyOnQueue(queue2, usingBlock: { () -> Void in
                            while audioWriterInput.readyForMoreMediaData
                            {
                                var sampleBuffer: CMSampleBufferRef!
                                sampleBuffer = audioReaderOutput.copyNextSampleBuffer()
                                println(sampleBuffer == nil)
                                if (audioReader.status == AVAssetReaderStatus.Reading && sampleBuffer != nil)
                                {
                                    audioWriterInput.appendSampleBuffer(sampleBuffer)
                                }
                                else
                                {
                                    audioWriterInput.markAsFinished()
                                    if (audioReader.status == AVAssetReaderStatus.Completed)
                                    {
                                        videoWriter.finishWritingWithCompletionHandler({ () -> Void in
                                            println("Finished writing video asset.")
                                            self.videoUrl = outputURL
                                            var data = NSData(contentsOfURL: outputURL)!
                                            println("Byte Size After Compression: \(data.length / 1048576) mb")
                                            println(videoAsset.playable)
                                            //Networking().uploadVideo(data, fileName: "Test2")
                                            self.dismissViewControllerAnimated(true, completion: nil)
                                        })
                                        break
                                    }
                                }
                            }
                        })
                        break
                    }
                }
            } // Second if
        } // first while
    }) // first block
    // return
}
Here is the code for my UIImagePickerController that calls the compress method
func imagePickerController(picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [NSObject: AnyObject])
{
    // Extract the media type from selection
    let type = info[UIImagePickerControllerMediaType] as String
    if (type == kUTTypeMovie)
    {
        self.videoUrl = info[UIImagePickerControllerMediaURL] as? NSURL
        var uploadUrl = NSURL.fileURLWithPath(NSTemporaryDirectory().stringByAppendingPathComponent("captured").stringByAppendingString(".mov"))
        var data = NSData(contentsOfURL: self.videoUrl!)!
        println("Size Before Compression: \(data.length / 1048576) mb")
        self.convertVideo(self.videoUrl!, outputURL: uploadUrl!)
        // Get the video from the info and set it appropriately.
        /*self.dismissViewControllerAnimated(true, completion: { () -> Void in
            //self.next.enabled = true
        })*/
    }
}
As I mentioned above, this works as far as file size reduction goes, but when I get the file back (it is still of type .mov), QuickTime cannot play it. QuickTime does try to convert it initially but fails halfway through (1-2 seconds after opening the file). I've even tested the video file in AVPlayerController, but it doesn't give any info about the movie; it's just a play button without any loading and without any length, just "--" where the time usually appears in the player. In other words, a corrupt file that won't play.
I'm sure it has something to do with the settings for writing the asset out, whether it is the video writing or the audio writing, I'm not sure. It could even be the reading of the asset that is causing the corruption. I've tried changing the variables around and setting different keys for reading and writing, but I haven't found the right combination, and it's frustrating that I can compress but get a corrupt file out of it. I'm not sure at all, and any help would be appreciated.
This answer has been completely rewritten and annotated to support Swift 4.0. Keep in mind that changing the AVFileType and presetName values allows you to tweak the final output in terms of size and quality.
import AVFoundation

extension ViewController: AVCaptureFileOutputRecordingDelegate {
    // Delegate function has been updated
    func fileOutput(_ output: AVCaptureFileOutput, didFinishRecordingTo outputFileURL: URL, from connections: [AVCaptureConnection], error: Error?) {
        // This code just exists for getting the before size. You can remove it from production code
        do {
            let data = try Data(contentsOf: outputFileURL)
            print("File size before compression: \(Double(data.count / 1048576)) mb")
        } catch {
            print("Error: \(error)")
        }

        // This line creates a generic filename based on UUID, but you may want to use your own
        // The extension must match with the AVFileType enum
        let path = NSTemporaryDirectory() + UUID().uuidString + ".m4v"
        let outputURL = URL.init(fileURLWithPath: path)
        // The asset to compress is the file that was just recorded
        let urlAsset = AVURLAsset(url: outputFileURL)

        // You can change the presetName value to obtain different results
        if let exportSession = AVAssetExportSession(asset: urlAsset,
                                                    presetName: AVAssetExportPresetMediumQuality) {
            exportSession.outputURL = outputURL
            // Changing the AVFileType enum gives you different options with
            // varying size and quality. Just ensure that the file extension
            // aligns with your choice
            exportSession.outputFileType = AVFileType.mov
            exportSession.exportAsynchronously {
                switch exportSession.status {
                case .unknown: break
                case .waiting: break
                case .exporting: break
                case .completed:
                    // This code only exists to provide the file size after compression. Should remove this from production code
                    do {
                        let data = try Data(contentsOf: outputURL)
                        print("File size after compression: \(Double(data.count / 1048576)) mb")
                    } catch {
                        print("Error: \(error)")
                    }
                case .failed: break
                case .cancelled: break
                }
            }
        }
    }
}
Below is the original answer as written for Swift 3.0:
extension ViewController: AVCaptureFileOutputRecordingDelegate {
    func capture(_ captureOutput: AVCaptureFileOutput!, didFinishRecordingToOutputFileAt outputFileURL: URL!, fromConnections connections: [Any]!, error: Error!) {
        guard let data = NSData(contentsOf: outputFileURL as URL) else {
            return
        }
        print("File size before compression: \(Double(data.length / 1048576)) mb")
        let compressedURL = NSURL.fileURL(withPath: NSTemporaryDirectory() + NSUUID().uuidString + ".m4v")
        compressVideo(inputURL: outputFileURL as URL, outputURL: compressedURL) { (exportSession) in
            guard let session = exportSession else {
                return
            }
            switch session.status {
            case .unknown:
                break
            case .waiting:
                break
            case .exporting:
                break
            case .completed:
                guard let compressedData = NSData(contentsOf: compressedURL) else {
                    return
                }
                print("File size after compression: \(Double(compressedData.length / 1048576)) mb")
            case .failed:
                break
            case .cancelled:
                break
            }
        }
    }

    func compressVideo(inputURL: URL, outputURL: URL, handler: @escaping (_ exportSession: AVAssetExportSession?) -> Void) {
        let urlAsset = AVURLAsset(url: inputURL, options: nil)
        guard let exportSession = AVAssetExportSession(asset: urlAsset, presetName: AVAssetExportPresetMediumQuality) else {
            handler(nil)
            return
        }
        exportSession.outputURL = outputURL
        exportSession.outputFileType = AVFileTypeQuickTimeMovie
        exportSession.shouldOptimizeForNetworkUse = true
        exportSession.exportAsynchronously { () -> Void in
            handler(exportSession)
        }
    }
}
Figured it out!
OK, so there were two problems. The first problem was with the videoWriter.finishWritingWithCompletionHandler function call. When this completion block gets executed, it does NOT mean that the video writer has finished writing to the output URL. So I had to check whether the status was completed before I uploaded the actual video file. It's kind of a hack, but this is what I did:
videoWriter.finishWritingWithCompletionHandler({ () -> Void in
    while true
    {
        if videoWriter.status == .Completed
        {
            var data = NSData(contentsOfURL: outputURL)!
            println("Finished: Byte Size After Compression: \(data.length / 1048576) mb")
            Networking().uploadVideo(data, fileName: "Video")
            self.dismissViewControllerAnimated(true, completion: nil)
            break
        }
    }
})
The second problem I was having was a Failed status, and that was because I kept writing to the same temp path, as shown in the code for the UIImagePickerController didFinishPickingMediaWithInfo method in my question. So I just used the current date in the file path so it would be unique.
var uploadUrl = NSURL.fileURLWithPath(NSTemporaryDirectory().stringByAppendingPathComponent("\(NSDate())").stringByAppendingString(".mov"))
[EDIT]: BETTER SOLUTION
OK, so after a lot of experimenting, and months later, I've found a much better and simpler solution for getting a video down from 45 MB to 1.42 MB with pretty good quality.
Below is the function to call instead of the original convertVideo function. Note that I had to write my own completion handler parameter, which is called after the asynchronous export has finished. I just called it handler.
func compressVideo(inputURL: NSURL, outputURL: NSURL, handler: (session: AVAssetExportSession) -> Void)
{
    var urlAsset = AVURLAsset(URL: inputURL, options: nil)
    var exportSession = AVAssetExportSession(asset: urlAsset, presetName: AVAssetExportPresetMediumQuality)
    exportSession.outputURL = outputURL
    exportSession.outputFileType = AVFileTypeQuickTimeMovie
    exportSession.shouldOptimizeForNetworkUse = true
    exportSession.exportAsynchronouslyWithCompletionHandler { () -> Void in
        handler(session: exportSession)
    }
}
And here is the code in the UIImagePickerController didFinishPickingMediaWithInfo function.
self.compressVideo(inputURL!, outputURL: uploadUrl!, handler: { (handler) -> Void in
    if handler.status == AVAssetExportSessionStatus.Completed
    {
        var data = NSData(contentsOfURL: uploadUrl!)
        println("File size after compression: \(Double(data!.length / 1048576)) mb")
        self.picker.dismissViewControllerAnimated(true, completion: nil)
    }
    else if handler.status == AVAssetExportSessionStatus.Failed
    {
        let alert = UIAlertView(title: "Uh oh", message: " There was a problem compressing the video maybe you can try again later. Error: \(handler.error.localizedDescription)", delegate: nil, cancelButtonTitle: "Okay")
        alert.show()
    }
})
Your conversion method is asynchronous, yet doesn't have a completion block. So how can your code know when the file is ready? Maybe you're using the file before it has been completely written.
The conversion itself also looks strange - audio and video are usually written in parallel, not in series.
Your miraculous compression ratio might indicate that you've written out fewer frames than you actually think.
