iOS: Convert AVI to MP4 format programmatically

I have a query in my application: I want to convert my AVI-formatted video to MP4 format. Is there any way to do this programmatically?
Any code snippet would be appreciated.

You need to use AVAssetExportSession to convert videos to .mp4 format; the method below converts .avi videos to .mp4.
Check the line exportSession.outputFileType = AVFileTypeMPEG4; — it specifies the output format of the video.
Here inputURL is the URL of the video to be converted and outputURL is the final destination of the video.
One more thing: don't forget to give the outputURL video file an .mp4 extension.
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *filePath = [documentsDirectory stringByAppendingPathComponent:@"MyVideo.mp4"];
NSURL *outputURL = [NSURL fileURLWithPath:filePath];

[self convertVideoToLowQuailtyWithInputURL:localUrl outputURL:outputURL handler:^(AVAssetExportSession *exportSession)
{
    if (exportSession.status == AVAssetExportSessionStatusCompleted) {
        // Video conversion completed
    }
}];
- (void)convertVideoToLowQuailtyWithInputURL:(NSURL *)inputURL outputURL:(NSURL *)outputURL handler:(void (^)(AVAssetExportSession *))handler {
    [[NSFileManager defaultManager] removeItemAtURL:outputURL error:nil];
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:inputURL options:nil];
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetPassthrough];
    exportSession.outputURL = outputURL;
    exportSession.outputFileType = AVFileTypeMPEG4;
    [exportSession exportAsynchronouslyWithCompletionHandler:^(void) {
        handler(exportSession);
    }];
}

Swift 4.2
An updated version of the accepted answer in Swift 4.2.
let paths = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)
let documentsDirectory = paths[0]
// Build the file URL directly; the original round-tripped through
// absoluteString, which produces a "file://..." string that breaks
// URL(fileURLWithPath:).
let outputURL = URL(fileURLWithPath: documentsDirectory).appendingPathComponent("MyVideo.mp4")

convertVideoToLowQuailty(withInputURL: inputUrl, outputURL: outputURL, handler: { exportSession in
    if exportSession?.status == .completed {
        // Video conversion completed
    }
})
func convertVideoToLowQuailty(withInputURL inputURL: URL?, outputURL: URL?, handler: @escaping (AVAssetExportSession?) -> Void) {
    if let anURL = outputURL {
        try? FileManager.default.removeItem(at: anURL)
    }
    var asset: AVURLAsset? = nil
    if let anURL = inputURL {
        asset = AVURLAsset(url: anURL, options: nil)
    }
    var exportSession: AVAssetExportSession? = nil
    if let anAsset = asset {
        exportSession = AVAssetExportSession(asset: anAsset, presetName: AVAssetExportPresetPassthrough)
    }
    exportSession?.outputURL = outputURL
    exportSession?.outputFileType = .mp4
    exportSession?.exportAsynchronously(completionHandler: {
        handler(exportSession)
    })
}
Took help from an online conversion tool.
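Note that AVAssetExportPresetPassthrough keeps the original sample data, so the export can fail if the AVI carries codecs that the MP4 container cannot hold. A minimal sketch, assuming the same inputUrl as above (the fallback preset is my assumption, not part of the original answer), that checks compatibility up front:

import AVFoundation

// Sketch: verify the preset/asset/file-type combination before exporting.
// Fall back to a re-encoding preset when passthrough is impossible.
let asset = AVURLAsset(url: inputUrl)
AVAssetExportSession.determineCompatibility(ofExportPreset: AVAssetExportPresetPassthrough,
                                            with: asset,
                                            outputFileType: .mp4) { compatible in
    let preset = compatible ? AVAssetExportPresetPassthrough : AVAssetExportPresetHighestQuality
    // ...then build the AVAssetExportSession with `preset` exactly as shown above.
}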

Related

How to reduce the iTunes music size in Swift?

I am working with accessing the iTunes library music. I stored a selected song in my Documents directory. The song's size is 13 MB, and I need to reduce it so that I can easily send it to the server. How do I do it? Here is my overall code:
@IBAction func AddSongs(sender: UIButton) {
    displayMediaPickerAndPlayItem()
}

func displayMediaPickerAndPlayItem() {
    mediaPicker = MPMediaPickerController(mediaTypes: .AnyAudio)
    if let picker = mediaPicker {
        print("Successfully instantiated a media picker")
        picker.delegate = self
        view.addSubview(picker.view)
        presentViewController(picker, animated: true, completion: nil)
    } else {
        print("Could not instantiate a media picker")
    }
}
Inside the MPMediaPickerController delegate method mediaPicker(_:didPickMediaItems:):
let item: MPMediaItem = mediaItemCollection.items[0]
print(mediaItemCollection)
print("mediaItemCollection = \(mediaItemCollection)")
let url = item.valueForProperty(MPMediaItemPropertyAssetURL) as! NSURL
FinalAudioName = item.valueForProperty(MPMediaItemPropertyTitle) as! NSString as String
// let songAsset = AVURLAsset.init(URL: url, options: nil)
// print("songAsset = \(songAsset)")
export(url, completionHandler: { _, _ in })
func export(assetURL: NSURL, completionHandler: (NSURL?, ErrorType?) -> ()) {
    let asset = AVURLAsset(URL: assetURL)
    guard let exporter = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetAppleM4A) else {
        completionHandler(nil, ExportError.unableToCreateExporter)
        return
    }
    let fileURL = NSURL(fileURLWithPath: NSTemporaryDirectory())
        .URLByAppendingPathComponent(NSUUID().UUIDString)!
        .URLByAppendingPathExtension("m4a")
    exporter.outputURL = fileURL
    exporter.outputFileType = "com.apple.m4a-audio"
    exporter.exportAsynchronouslyWithCompletionHandler {
        if exporter.status == .Completed {
            completionHandler(fileURL, nil)
        } else {
            completionHandler(nil, exporter.error)
        }
    }
}
If you want to compress your video for sharing, you should look into AVAssetExportSession. Change AVAssetExportPresetAppleM4A to AVAssetExportPresetLowQuality. You can do something like this:
- (void)convertVideoToLowQuailtyWithInputURL:(NSURL *)inputURL
                                   outputURL:(NSURL *)outputURL
                                     handler:(void (^)(AVAssetExportSession *))handler
{
    [[NSFileManager defaultManager] removeItemAtURL:outputURL error:nil];
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:inputURL options:nil];
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetLowQuality];
    exportSession.outputURL = outputURL;
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;
    [exportSession exportAsynchronouslyWithCompletionHandler:^(void)
    {
        handler(exportSession);
        [exportSession release]; // manual retain-release; omit this line under ARC
    }];
}
You can also set AVFormatIDKey to kAudioFormatMPEG4AAC in your recordSettings.
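For reference, a minimal sketch of such recordSettings for an AVAudioRecorder; everything except AVFormatIDKey is an illustrative assumption:

import AVFoundation

// Sketch: AAC record settings. Sample rate, channels and quality are placeholders.
let recordSettings: [String: Any] = [
    AVFormatIDKey: kAudioFormatMPEG4AAC,
    AVSampleRateKey: 44_100.0,
    AVNumberOfChannelsKey: 2,
    AVEncoderAudioQualityKey: AVAudioQuality.medium.rawValue
]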

Issue playing slow-mo AVAsset in AVPlayer

I'm trying to play a slow-motion video (filmed by the user's iPhone) in an AVPlayer.
I am retrieving the AVAsset with a request on a PHAsset from a picker:
[manager requestAVAssetForVideo:PHAsset
                        options:videoRequestOptions
                  resultHandler:^(AVAsset *avasset, AVAudioMix *audioMix, NSDictionary *info) {}];
The problem is once it plays, I get this error:
-[AVComposition URL]: unrecognized selector sent to instance 0x138d17f40
However, if I set this option on the manager request, it will play as a normal-speed video at 120/240 fps with no crashes:
videoRequestOptions.version = PHVideoRequestOptionsVersionOriginal;
What's going on? The default version property is PHVideoRequestOptionsVersionCurrent, which incorporates slow motion, user edits, trims, etc.
I would like to play that video version. Thanks.
So it turns out that slow-motion videos are passed as an AVComposition.
You can export that into a video file/URL and then handle it like any other video.
Solution here: https://overflow.buffer.com/2016/02/29/slow-motion-video-ios/
// Output URL
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = paths.firstObject;
NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"mergeSlowMoVideo-%d.mov", arc4random() % 1000]];
NSURL *url = [NSURL fileURLWithPath:myPathDocs];

// Begin slow-mo video export
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetHighestQuality];
exporter.outputURL = url;
exporter.outputFileType = AVFileTypeQuickTimeMovie;
exporter.shouldOptimizeForNetworkUse = YES;

[exporter exportAsynchronouslyWithCompletionHandler:^{
    dispatch_async(dispatch_get_main_queue(), ^{
        if (exporter.status == AVAssetExportSessionStatusCompleted) {
            NSURL *URL = exporter.outputURL;
            NSData *videoData = [NSData dataWithContentsOfURL:URL];
            // Upload
            [self uploadSelectedVideo:video data:videoData];
        }
    });
}];
For those coming here looking for a Swift answer, this is the Swift translation, which I use in my project, where I need the URL of the slow-motion video to play it with AVPlayerViewController:
else if asset is AVComposition {
    // Slow-motion assets are passed as an AVComposition
    let paths = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)
    let documentsDirectory: NSString? = paths.first as NSString?
    if documentsDirectory != nil {
        let random = Int(arc4random() % 1000)
        let pathToAppend = String(format: "mergeSlowMoVideo-%d.mov", random)
        let myPathDocs = documentsDirectory!.strings(byAppendingPaths: [pathToAppend])
        let myPath = myPathDocs.first
        if myPath != nil {
            let url = URL(fileURLWithPath: myPath!)
            let exporter = AVAssetExportSession(asset: asset!, presetName: AVAssetExportPresetHighestQuality)
            if exporter != nil {
                exporter!.outputURL = url
                exporter!.outputFileType = AVFileTypeQuickTimeMovie
                exporter!.shouldOptimizeForNetworkUse = true
                exporter!.exportAsynchronously(completionHandler: {
                    AsyncUtil.asyncMain {
                        let url = exporter!.outputURL
                        if url != nil {
                            let player = AVPlayer(url: url!)
                            let playerViewController = AVPlayerViewController()
                            playerViewController.player = player
                            playerViewController.modalTransitionStyle = .crossDissolve
                            view.present(playerViewController, animated: true) {
                                playerViewController.player!.play()
                            }
                        }
                    }
                })
            }
        }
    }
}
Playing Slo-mo & Library Video in Swift 4 and above with a Custom View
var vcPlayer = AVPlayerViewController()
var player = AVPlayer()

func playallVideo(_ customView: UIView, asset: PHAsset) {
    guard asset.mediaType == .video else {
        print("Not a valid video media type")
        return
    }
    let options = PHVideoRequestOptions()
    options.isNetworkAccessAllowed = true
    PHCachingImageManager().requestPlayerItem(forVideo: asset, options: options) { (playerItem, info) in
        DispatchQueue.main.async {
            self.player = AVPlayer(playerItem: playerItem)
            self.vcPlayer.player = self.player
            self.vcPlayer.view.frame = customView.bounds
            self.vcPlayer.videoGravity = .resizeAspectFill
            self.vcPlayer.showsPlaybackControls = true
            //self.vcPlayer.allowsPictureInPicturePlayback = true
            self.playerView.addSubview(self.vcPlayer.view)
            self.player.play()
        }
    }
}
/********** Function Call **********/
self.playallVideo(self.playerView /* your custom view */, asset: currentAssetArr[currentIndex] /* current PHAsset fetched from the library */)
:) enjoy
Looking for an answer in Swift?
Here's how I did it.
Creating a "slow motion" video in iOS with Swift is not easy; I came across many "slow motion" examples that turned out not to work, or whose code was deprecated. So I finally figured out a way to make slow motion in Swift.
This code can be used for 120 fps and greater too, and you can make the audio slow motion in the same way.
Here is the code snippet I created for achieving slow motion.
Give me an upvote if this code works.
func slowMotion(pathUrl: URL) {
    let videoAsset = AVURLAsset.init(url: pathUrl, options: nil)
    let currentAsset = AVAsset.init(url: pathUrl)
    let vdoTrack = currentAsset.tracks(withMediaType: .video)[0]
    let mixComposition = AVMutableComposition()
    let compositionVideoTrack = mixComposition.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)

    do {
        try compositionVideoTrack?.insertTimeRange(
            CMTimeRangeMake(start: .zero, duration: videoAsset.duration),
            of: videoAsset.tracks(withMediaType: .video)[0],
            at: .zero)
    } catch {
        // handle error
        return
    }

    var duration: CMTime = .zero
    duration = CMTimeAdd(duration, currentAsset.duration)

    //MARK: This constant (videoScaleFactor) is what achieves the slow motion: it stretches the video's time range, which produces the slow-motion effect.
    // Just increase the videoScaleFactor value to play the video more slowly.
    let videoScaleFactor = 2.0
    let videoDuration = videoAsset.duration

    compositionVideoTrack?.scaleTimeRange(
        CMTimeRangeMake(start: .zero, duration: videoDuration),
        toDuration: CMTimeMake(value: videoDuration.value * Int64(videoScaleFactor), timescale: videoDuration.timescale))
    compositionVideoTrack?.preferredTransform = vdoTrack.preferredTransform

    let dirPaths = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).map(\.path)
    let docsDir = dirPaths[0]
    let outputFilePath = URL(fileURLWithPath: docsDir).appendingPathComponent("slowMotion\(UUID().uuidString).mp4").path
    if FileManager.default.fileExists(atPath: outputFilePath) {
        do {
            try FileManager.default.removeItem(atPath: outputFilePath)
        } catch {
        }
    }
    let filePath = URL(fileURLWithPath: outputFilePath)

    let assetExport = AVAssetExportSession(
        asset: mixComposition,
        presetName: AVAssetExportPresetHighestQuality)
    assetExport?.outputURL = filePath
    assetExport?.outputFileType = .mp4

    assetExport?.exportAsynchronously(completionHandler: {
        switch assetExport?.status {
        case .failed:
            print("asset output media url = \(String(describing: assetExport?.outputURL))")
            print("Export session failed with error: \(String(describing: assetExport?.error))")
            DispatchQueue.main.async(execute: {
                // completion(nil);
            })
        case .completed:
            print("Successful")
            let outputURL = assetExport!.outputURL
            print("url path = \(String(describing: outputURL))")
            PHPhotoLibrary.shared().performChanges({
                PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: outputURL!)
            }) { saved, error in
                if saved {
                    print("video successfully saved in Photos gallery")
                }
                if (error != nil) {
                    print("error in saving video \(String(describing: error?.localizedDescription))")
                }
            }
            DispatchQueue.main.async(execute: {
                // completion(_filePath);
            })
        default:
            break
        }
    })
}
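Usage is a single call with a local file URL; the path below is a hypothetical example:

// Hypothetical usage: slow down a clip sitting in the temporary directory.
let clipURL = FileManager.default.temporaryDirectory.appendingPathComponent("clip.mov")
slowMotion(pathUrl: clipURL)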

iOS: Remove audio from video

I have a requirement to sometimes send a muted video to the end user, so I need to remove the audio track from the video.
The video is passed through a single process to mute it.
// Remove audio from video
- (void)RemoveAudioFromVideo:(NSString *)VideoLocalPath {
    NSString *initPath1 = VideoLocalPath;
    AVMutableComposition *composition = [AVMutableComposition composition];
    NSString *inputVideoPath = initPath1;
    AVURLAsset *sourceAsset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:inputVideoPath] options:nil];
    AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    BOOL ok = NO;
    AVAssetTrack *sourceVideoTrack = [[sourceAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    CMTimeRange x = CMTimeRangeMake(kCMTimeZero, [sourceAsset duration]);
    ok = [compositionVideoTrack insertTimeRange:x ofTrack:sourceVideoTrack atTime:kCMTimeZero error:nil];
    if ([[NSFileManager defaultManager] fileExistsAtPath:initPath1]) {
        [[NSFileManager defaultManager] removeItemAtPath:initPath1 error:nil];
    }
    NSURL *url = [[NSURL alloc] initFileURLWithPath:initPath1];
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetHighestQuality];
    exporter.outputURL = url;
    NSLog(@";%@", [exporter supportedFileTypes]);
    exporter.outputFileType = @"com.apple.quicktime-movie";
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        [self savefinalVideoFileToDocuments:exporter.outputURL];
    }];
}
// Write final video
- (void)savefinalVideoFileToDocuments:(NSURL *)url {
    NSString *storePath = [[self applicationCacheDirectory] stringByAppendingPathComponent:@"Videos"];
    NSData *movieData = [NSData dataWithContentsOfURL:url];
    [movieData writeToFile:storePath atomically:YES];
}

// Directory path
- (NSString *)applicationCacheDirectory {
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = ([paths count] > 0) ? [paths objectAtIndex:0] : nil;
    return documentsDirectory;
}
In Swift 3,
func removeAudioFromVideo(_ videoPath: String) {
    let initPath1: String = videoPath
    let composition = AVMutableComposition()
    let inputVideoPath: String = initPath1
    let sourceAsset = AVURLAsset(url: URL(fileURLWithPath: inputVideoPath), options: nil)
    let compositionVideoTrack: AVMutableCompositionTrack? = composition.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
    let sourceVideoTrack: AVAssetTrack? = sourceAsset.tracks(withMediaType: AVMediaTypeVideo)[0]
    let x: CMTimeRange = CMTimeRangeMake(kCMTimeZero, sourceAsset.duration)
    _ = try? compositionVideoTrack!.insertTimeRange(x, of: sourceVideoTrack!, at: kCMTimeZero)
    if FileManager.default.fileExists(atPath: initPath1) {
        try? FileManager.default.removeItem(atPath: initPath1)
    }
    let url = URL(fileURLWithPath: initPath1)
    let exporter = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetHighestQuality)
    exporter?.outputURL = url
    exporter?.outputFileType = "com.apple.quicktime-movie"
    exporter?.exportAsynchronously(completionHandler: { () -> Void in
        self.saveFinalVideoFile(toDocuments: exporter!.outputURL!)
    })
}

func saveFinalVideoFile(toDocuments url: URL) {
    let fileURL = try! FileManager.default.url(for: .documentDirectory, in: .userDomainMask, appropriateFor: nil, create: false).appendingPathComponent("Videos")
    let movieData = try? Data(contentsOf: url)
    try? movieData?.write(to: fileURL, options: .atomic)
}
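A hedged usage sketch; the path is a hypothetical example, and note that the method deletes the input file and writes the muted export back to that same path:

// Hypothetical usage: the file at videoPath is replaced by the muted export.
let videoPath = NSTemporaryDirectory() + "input.mov"
removeAudioFromVideo(videoPath)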

Split Video in Multiple Partition

I'm developing an iOS app in Objective-C.
As per my requirement, I want to split a video into multiple parts.
Suppose I have a video of 50 seconds and I want to divide it into 5 parts of 10 seconds each.
Please advise me if you have any idea about it.
Swift 5 version of @MilanPatel's answer, which I found very useful:
func splitSecondVideo() {
    guard did < splitdivide else { return }
    guard let videourl = URL(string: "YOUR_VIDEO_URL_HERE") else { return }
    let asset = AVURLAsset(url: videourl, options: nil)
    let exportSession = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetHighestQuality)
    let paths = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).map(\.path)
    let documentsDirectory = paths[0]
    var myPathDocs: String?
    var starttime: CMTime
    var duration: CMTime
    myPathDocs = URL(fileURLWithPath: documentsDirectory).appendingPathComponent("SplitFinalVideo\(did).mov").path
    let endt = CMTimeGetSeconds(asset.duration)
    print("All Duration : \(endt)")
    let divide = CMTimeGetSeconds(asset.duration) / splitdivide
    print("All Duration : \(divide)")
    starttime = CMTimeMakeWithSeconds(Float64(divide * did), preferredTimescale: 1)
    duration = CMTimeMakeWithSeconds(Float64(divide), preferredTimescale: 1)
    let fileManager = FileManager()
    if fileManager.fileExists(atPath: myPathDocs ?? "") == true {
        do {
            try fileManager.removeItem(atPath: myPathDocs ?? "")
        } catch {
        }
    }
    exportSession?.outputURL = URL(fileURLWithPath: myPathDocs ?? "")
    exportSession?.shouldOptimizeForNetworkUse = true
    exportSession?.outputFileType = .mov
    // Trim to this segment's time range
    let secondrange = CMTimeRangeMake(start: starttime, duration: duration)
    exportSession?.timeRange = secondrange
    exportSession?.exportAsynchronously(completionHandler: { [self] in
        exportDidFinish(exportSession)
        did += 1
        self.splitSecondVideo()
    })
}

func exportDidFinish(_ session: AVAssetExportSession?) {
    if session?.status == .completed {
        let outputURL = session?.outputURL
        print("Before Exported")
        saveVideoAtPath(with: outputURL)
    }
}
Another update:
session?.status was always coming back as failed. To solve this issue,
I added:
asset.resourceLoader.setDelegate(self, queue: .main)
and conformed to the AVAssetResourceLoaderDelegate protocol, and it started working. Thanks to this answer by @panychyk.dima.
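A minimal sketch of that conformance; the enclosing type name is hypothetical, and returning false is one common minimal implementation that declines custom loading and lets AVFoundation load the resource itself:

extension VideoSplitViewController: AVAssetResourceLoaderDelegate {
    // Sketch: don't handle the request ourselves; AVFoundation falls back to its own loading.
    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        return false
    }
}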
Nice question. The solution is here:
- (void)splitSecondVideo
{
    if (did < splitdivide) {
        AVURLAsset *asset = [AVURLAsset URLAssetWithURL:_videourl options:nil];
        AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetHighestQuality];
        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *documentsDirectory = [paths objectAtIndex:0];
        NSString *myPathDocs;
        CMTime starttime;
        CMTime duration;
        myPathDocs = [documentsDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"SplitFinalVideo%d.mov", did]];
        double endt = CMTimeGetSeconds([asset duration]);
        NSLog(@"All Duration : %f", endt);
        double divide = CMTimeGetSeconds([asset duration]) / splitdivide;
        NSLog(@"All Duration : %f", divide);
        starttime = CMTimeMakeWithSeconds(divide * did, 1);
        duration = CMTimeMakeWithSeconds(divide, 1);
        NSFileManager *fileManager = [[NSFileManager alloc] init];
        NSError *error;
        if ([fileManager fileExistsAtPath:myPathDocs] == YES) {
            [fileManager removeItemAtPath:myPathDocs error:&error];
        }
        exportSession.outputURL = [NSURL fileURLWithPath:myPathDocs];
        exportSession.shouldOptimizeForNetworkUse = YES;
        exportSession.outputFileType = AVFileTypeQuickTimeMovie;
        // Trim to this segment's time range
        CMTimeRange secondrange = CMTimeRangeMake(starttime, duration);
        exportSession.timeRange = secondrange;
        [exportSession exportAsynchronouslyWithCompletionHandler:^(void)
        {
            [self exportDidFinish:exportSession];
            did++;
            [self splitSecondVideo];
        }];
    }
}

- (void)exportDidFinish:(AVAssetExportSession *)session {
    if (session.status == AVAssetExportSessionStatusCompleted) {
        NSURL *outputURL = session.outputURL;
        NSLog(@"Before Exported");
        [self SaveVideoAtPathWithURL:outputURL];
    }
}
Here did = 0 in the viewDidLoad method, and splitdivide is the number of parts you want to split the video into; in your question, splitdivide = 5.
Note: did and splitdivide are both integer values.
Hope this is helpful... Enjoy...
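For the Swift version above, a sketch of those two declarations; I use Double so the divisions against CMTimeGetSeconds compile, whereas the Objective-C answer uses plain ints:

// State assumed by splitSecondVideo(); splitdivide = 5 per the question.
var did: Double = 0
let splitdivide: Double = 5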

How to do Slow Motion video in iOS

I have to apply "slow motion" to a video file, along with its audio, in between some frames, and I need to store the ramped video as a new video.
Ref: http://www.youtube.com/watch?v=BJ3_xMGzauk (watch from 0 to 10s)
From my analysis, I've found that the AVFoundation framework can be helpful.
Ref:
http://developer.apple.com/library/ios/#documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/00_Introduction.html
Copied and pasted from the above link:
"
Editing
AV Foundation uses compositions to create new assets from existing pieces of media (typically, one or more video and audio tracks). You use a mutable composition to add and remove tracks, and adjust their temporal orderings. You can also set the relative volumes and ramping of audio tracks; and set the opacity, and opacity ramps, of video tracks. A composition is an assemblage of pieces of media held in memory. When you export a composition using an export session, it's collapsed to a file.
On iOS 4.1 and later, you can also create an asset from media such as sample buffers or still images using an asset writer.
"
Questions:
Can I do "slow motion" on the video/audio file using the AVFoundation framework? Or is there any other package available? If I want to handle audio and video separately, please guide me on how to do it.
Update: Code for the AV export session:
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *outputURL = paths[0];
NSFileManager *manager = [NSFileManager defaultManager];
[manager createDirectoryAtPath:outputURL withIntermediateDirectories:YES attributes:nil error:nil];
outputURL = [outputURL stringByAppendingPathComponent:@"output.mp4"];

// Remove existing file
[manager removeItemAtPath:outputURL error:nil];

AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:self.inputAsset presetName:AVAssetExportPresetLowQuality];
exportSession.outputURL = [NSURL fileURLWithPath:outputURL]; // output path
exportSession.outputFileType = AVFileTypeQuickTimeMovie;

[exportSession exportAsynchronouslyWithCompletionHandler:^(void) {
    if (exportSession.status == AVAssetExportSessionStatusCompleted) {
        [self writeVideoToPhotoLibrary:[NSURL fileURLWithPath:outputURL]];
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        [library writeVideoAtPathToSavedPhotosAlbum:[NSURL fileURLWithPath:outputURL] completionBlock:^(NSURL *assetURL, NSError *error) {
            if (error) {
                NSLog(@"Video could not be saved");
            }
        }];
    } else {
        NSLog(@"error: %@", [exportSession error]);
    }
}];
You could scale video using the AVFoundation and CoreMedia frameworks.
Take a look at the AVMutableCompositionTrack method:
- (void)scaleTimeRange:(CMTimeRange)timeRange toDuration:(CMTime)duration;
Sample:
AVURLAsset *videoAsset = nil; // self.inputAsset;

// Create mutable composition
AVMutableComposition *mixComposition = [AVMutableComposition composition];
AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                                preferredTrackID:kCMPersistentTrackID_Invalid];
NSError *videoInsertError = nil;
BOOL videoInsertResult = [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                                         ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                                                          atTime:kCMTimeZero
                                                           error:&videoInsertError];
if (!videoInsertResult || nil != videoInsertError) {
    // handle error
    return;
}

// Slow down the whole video by 2.0
double videoScaleFactor = 2.0;
CMTime videoDuration = videoAsset.duration;
[compositionVideoTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, videoDuration)
                           toDuration:CMTimeMake(videoDuration.value * videoScaleFactor, videoDuration.timescale)];

// Export
AVAssetExportSession *assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                      presetName:AVAssetExportPresetLowQuality];
(The audio track from videoAsset should probably also be added to mixComposition and scaled the same way; the answer below shows exactly that.)
Slower + faster, with or without an audio track
I have tried this and was able to slow down the asset.
compositionVideoTrack?.scaleTimeRange(timeRange, toDuration: scaledVideoDuration) did the trick.
I made a class which will help you generate a slower video from an AVAsset.
A plus point is that you can also make it faster, and another plus point is that it handles the audio too.
Here is my custom class sample:
import UIKit
import AVFoundation

enum SpeedoMode {
    case Slower
    case Faster
}

class VSVideoSpeeder: NSObject {

    /// Singleton instance of `VSVideoSpeeder`
    static var shared: VSVideoSpeeder = {
        return VSVideoSpeeder()
    }()

    /// Range is b/w 1x, 2x and 3x. Nothing will happen if the scale is out of range. Exporter will be nil if the url is invalid or an asset instance can't be made.
    func scaleAsset(fromURL url: URL, by scale: Int64, withMode mode: SpeedoMode, completion: @escaping (_ exporter: AVAssetExportSession?) -> Void) {

        /// Check the valid scale
        if scale < 1 || scale > 3 {
            /// Can not proceed, invalid range
            completion(nil)
            return
        }

        /// Asset
        let asset = AVAsset(url: url)

        /// Video Tracks
        let videoTracks = asset.tracks(withMediaType: AVMediaType.video)
        if videoTracks.count == 0 {
            /// Can not find any video track
            completion(nil)
            return
        }

        /// Get the scaled video duration
        let scaledVideoDuration = (mode == .Faster) ? CMTimeMake(asset.duration.value / scale, asset.duration.timescale) : CMTimeMake(asset.duration.value * scale, asset.duration.timescale)
        let timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration)

        /// Video track
        let videoTrack = videoTracks.first!

        let mixComposition = AVMutableComposition()
        let compositionVideoTrack = mixComposition.addMutableTrack(withMediaType: AVMediaType.video, preferredTrackID: kCMPersistentTrackID_Invalid)

        /// Audio Tracks
        let audioTracks = asset.tracks(withMediaType: AVMediaType.audio)
        if audioTracks.count > 0 {
            /// Use audio if the video contains an audio track
            let compositionAudioTrack = mixComposition.addMutableTrack(withMediaType: AVMediaType.audio, preferredTrackID: kCMPersistentTrackID_Invalid)

            /// Audio track
            let audioTrack = audioTracks.first!
            do {
                try compositionAudioTrack?.insertTimeRange(timeRange, of: audioTrack, at: kCMTimeZero)
                compositionAudioTrack?.scaleTimeRange(timeRange, toDuration: scaledVideoDuration)
            } catch _ {
                /// Ignore audio error
            }
        }

        do {
            try compositionVideoTrack?.insertTimeRange(timeRange, of: videoTrack, at: kCMTimeZero)
            compositionVideoTrack?.scaleTimeRange(timeRange, toDuration: scaledVideoDuration)

            /// Keep original transformation
            compositionVideoTrack?.preferredTransform = videoTrack.preferredTransform

            /// Initialize Exporter now
            let outputFileURL = URL(fileURLWithPath: "/Users/thetiger/Desktop/scaledVideo.mov")
            /// Note: please use a directory path if you are testing on a device.
            if FileManager.default.fileExists(atPath: outputFileURL.absoluteString) {
                try FileManager.default.removeItem(at: outputFileURL)
            }
            let exporter = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality)
            exporter?.outputURL = outputFileURL
            exporter?.outputFileType = AVFileType.mov
            exporter?.shouldOptimizeForNetworkUse = true
            exporter?.exportAsynchronously(completionHandler: {
                completion(exporter)
            })
        } catch let error {
            print(error.localizedDescription)
            completion(nil)
            return
        }
    }
}
I took 1x, 2x and 3x as valid scales. The class contains proper validation and handling. Below is a sample of how to use this function.
let url = Bundle.main.url(forResource: "1", withExtension: "mp4")!
VSVideoSpeeder.shared.scaleAsset(fromURL: url, by: 3, withMode: SpeedoMode.Slower) { (exporter) in
    if let exporter = exporter {
        switch exporter.status {
        case .failed: do {
            print(exporter.error?.localizedDescription ?? "Error in exporting..")
        }
        case .completed: do {
            print("Scaled video has been generated successfully!")
        }
        case .unknown: break
        case .waiting: break
        case .exporting: break
        case .cancelled: break
        }
    } else {
        /// Error
        print("Exporter is not initialized.")
    }
}
This line handles the audio scaling:
compositionAudioTrack?.scaleTimeRange(timeRange, toDuration: scaledVideoDuration)
I have achieved adding slow motion to a video, including the audio, with the proper output orientation.
- (void)SlowMotion:(NSURL *)URl
{
    AVURLAsset *videoAsset = [AVURLAsset URLAssetWithURL:URl options:nil]; // self.inputAsset;
    AVAsset *currentAsset = [AVAsset assetWithURL:URl];
    AVAssetTrack *vdoTrack = [[currentAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

    // Create mutable composition
    AVMutableComposition *mixComposition = [AVMutableComposition composition];
    AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

    NSError *videoInsertError = nil;
    BOOL videoInsertResult = [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                                             ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                                                              atTime:kCMTimeZero
                                                               error:&videoInsertError];
    if (!videoInsertResult || nil != videoInsertError) {
        // handle error
        return;
    }

    NSError *audioInsertError = nil;
    BOOL audioInsertResult = [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                                             ofTrack:[[currentAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                                                              atTime:kCMTimeZero
                                                               error:&audioInsertError];
    if (!audioInsertResult || nil != audioInsertError) {
        // handle error
        return;
    }

    CMTime duration = kCMTimeZero;
    duration = CMTimeAdd(duration, currentAsset.duration);

    // Slow down the whole video by 2.0
    double videoScaleFactor = 2.0;
    CMTime videoDuration = videoAsset.duration;
    [compositionVideoTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, videoDuration)
                               toDuration:CMTimeMake(videoDuration.value * videoScaleFactor, videoDuration.timescale)];
    [compositionAudioTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, videoDuration)
                               toDuration:CMTimeMake(videoDuration.value * videoScaleFactor, videoDuration.timescale)];
    [compositionVideoTrack setPreferredTransform:vdoTrack.preferredTransform];

    NSArray *dirPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *docsDir = [dirPaths objectAtIndex:0];
    NSString *outputFilePath = [docsDir stringByAppendingPathComponent:[NSString stringWithFormat:@"slowMotion.mov"]];
    if ([[NSFileManager defaultManager] fileExistsAtPath:outputFilePath])
        [[NSFileManager defaultManager] removeItemAtPath:outputFilePath error:nil];
    NSURL *_filePath = [NSURL fileURLWithPath:outputFilePath];

    // Export
    AVAssetExportSession *assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                          presetName:AVAssetExportPresetLowQuality];
    assetExport.outputURL = _filePath;
    assetExport.outputFileType = AVFileTypeQuickTimeMovie;
    assetExport.shouldOptimizeForNetworkUse = YES;
    [assetExport exportAsynchronouslyWithCompletionHandler:^
    {
        switch ([assetExport status]) {
            case AVAssetExportSessionStatusFailed:
            {
                NSLog(@"Export session failed with error: %@", [assetExport error]);
                dispatch_async(dispatch_get_main_queue(), ^{
                    // completion(nil);
                });
            }
                break;
            case AVAssetExportSessionStatusCompleted:
            {
                NSLog(@"Successful");
                NSURL *outputURL = assetExport.outputURL;
                ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
                if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:outputURL]) {
                    [self writeExportedVideoToAssetsLibrary:outputURL];
                }
                dispatch_async(dispatch_get_main_queue(), ^{
                    // completion(_filePath);
                });
            }
                break;
            default:
                break;
        }
    }];
}
- (void)writeExportedVideoToAssetsLibrary:(NSURL *)url {
    NSURL *exportURL = url;
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:exportURL]) {
        [library writeVideoAtPathToSavedPhotosAlbum:exportURL completionBlock:^(NSURL *assetURL, NSError *error) {
            dispatch_async(dispatch_get_main_queue(), ^{
                if (error) {
                    UIAlertView *alertView = [[UIAlertView alloc] initWithTitle:[error localizedDescription]
                                                                        message:[error localizedRecoverySuggestion]
                                                                       delegate:nil
                                                              cancelButtonTitle:@"OK"
                                                              otherButtonTitles:nil];
                    [alertView show];
                }
                if (!error) {
                    // [activityView setHidden:YES];
                    UIAlertView *alertView = [[UIAlertView alloc] initWithTitle:@"Success"
                                                                        message:@"Video added to gallery successfully"
                                                                       delegate:nil
                                                              cancelButtonTitle:@"OK"
                                                              otherButtonTitles:nil];
                    [alertView show];
                }
#if !TARGET_IPHONE_SIMULATOR
                [[NSFileManager defaultManager] removeItemAtURL:exportURL error:nil];
#endif
            });
        }];
    } else {
        NSLog(@"Video could not be exported to the assets library.");
    }
}
I would extract all frames from the initial video using ffmpeg and then collect them together using AVAssetWriter, but with a lower frame rate. To get more fluid slow motion, you may need to apply some blur effect, or even generate intermediate frames between the existing ones, blended from two frames. A rough sketch of that AVAssetWriter route follows.
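As an untested sketch of that AVAssetWriter route (my own illustration, not the answerer's code): AVAssetReader decodes each frame and AVAssetWriter re-appends it with a scaled presentation timestamp, which stretches the clip without ffmpeg. The output settings and the 2.0 factor are assumptions:

import AVFoundation

func retime(asset: AVAsset, to outputURL: URL, slowFactor: Float64 = 2.0) throws {
    let track = asset.tracks(withMediaType: .video)[0]

    // Decode frames as BGRA pixel buffers.
    let reader = try AVAssetReader(asset: asset)
    let readerOutput = AVAssetReaderTrackOutput(
        track: track,
        outputSettings: [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA])
    reader.add(readerOutput)

    // Re-encode to H.264 at the track's natural size.
    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)
    let writerInput = AVAssetWriterInput(mediaType: .video, outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: track.naturalSize.width,
        AVVideoHeightKey: track.naturalSize.height])
    writerInput.expectsMediaDataInRealTime = false
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: writerInput,
                                                       sourcePixelBufferAttributes: nil)
    writer.add(writerInput)

    reader.startReading()
    writer.startWriting()
    writer.startSession(atSourceTime: .zero)

    while let sample = readerOutput.copyNextSampleBuffer() {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sample) else { continue }
        // Scaling each timestamp stretches the clip, i.e. lowers the effective frame rate.
        let pts = CMTimeMultiplyByFloat64(CMSampleBufferGetPresentationTimeStamp(sample),
                                          multiplier: slowFactor)
        while !writerInput.isReadyForMoreMediaData { usleep(10_000) }
        if !adaptor.append(pixelBuffer, withPresentationTime: pts) { break }
    }

    writerInput.markAsFinished()
    writer.finishWriting { print("Re-timed export finished") }
}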
An example in Swift.
Part I:
var asset: AVAsset?

func configureAssets() {
    let videoAsset = AVURLAsset(url: Bundle.main.url(forResource: "sample", withExtension: "m4v")!)
    let audioAsset = AVURLAsset(url: Bundle.main.url(forResource: "sample", withExtension: "m4a")!)
    // let audioAsset2 = AVURLAsset(url: Bundle.main.url(forResource: "audio2", withExtension: "m4a")!)
    let comp = AVMutableComposition()
    let videoAssetSourceTrack = videoAsset.tracks(withMediaType: AVMediaTypeVideo).first! as AVAssetTrack
    let audioAssetSourceTrack = videoAsset.tracks(withMediaType: AVMediaTypeAudio).first! as AVAssetTrack
    // let audioAssetSourceTrack2 = audioAsset2.tracks(withMediaType: AVMediaTypeAudio).first! as AVAssetTrack
    let videoCompositionTrack = comp.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
    let audioCompositionTrack = comp.addMutableTrack(withMediaType: AVMediaTypeAudio, preferredTrackID: kCMPersistentTrackID_Invalid)
    do {
        try videoCompositionTrack.insertTimeRange(
            CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(9, 600)),
            of: videoAssetSourceTrack,
            at: kCMTimeZero)
        try audioCompositionTrack.insertTimeRange(
            CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(9, 600)),
            of: audioAssetSourceTrack,
            at: kCMTimeZero)
        //
        // try audioCompositionTrack.insertTimeRange(
        //     CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(3, 600)),
        //     of: audioAssetSourceTrack2,
        //     at: CMTimeMakeWithSeconds(7, 600))

        let videoScaleFactor = Int64(2.0)
        let videoDuration: CMTime = videoAsset.duration

        videoCompositionTrack.scaleTimeRange(CMTimeRangeMake(kCMTimeZero, videoDuration), toDuration: CMTimeMake(videoDuration.value * videoScaleFactor, videoDuration.timescale))
        audioCompositionTrack.scaleTimeRange(CMTimeRangeMake(kCMTimeZero, videoDuration), toDuration: CMTimeMake(videoDuration.value * videoScaleFactor, videoDuration.timescale))
        videoCompositionTrack.preferredTransform = videoAssetSourceTrack.preferredTransform
    } catch { print(error) }

    asset = comp
}
Part II:
func createFileFromAsset(_ asset: AVAsset) {
    let documentsDirectory = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0] as URL
    let filePath = documentsDirectory.appendingPathComponent("rendered-audio.m4v")
    deleteFile(filePath)

    if let exportSession = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetLowQuality) {
        exportSession.canPerformMultiplePassesOverSourceMediaData = true
        exportSession.outputURL = filePath
        exportSession.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration)
        exportSession.outputFileType = AVFileTypeQuickTimeMovie
        exportSession.exportAsynchronously {
            print("finished: \(filePath) : \(exportSession.status.rawValue)")
        }
    }
}

func deleteFile(_ filePath: URL) {
    guard FileManager.default.fileExists(atPath: filePath.path) else {
        return
    }
    do {
        try FileManager.default.removeItem(atPath: filePath.path)
    } catch {
        fatalError("Unable to delete file: \(error) : \(#function).")
    }
}
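A brief usage sketch, assuming both functions live on the same object:

// Build the slowed-down composition, then export it to Documents.
configureAssets()
if let composition = asset {
    createFileFromAsset(composition)
}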
Swift 5
Here is @TheTiger's code converted to Swift 5:
import UIKit
import AVFoundation

enum SpeedoMode {
    case Slower
    case Faster
}

class VSVideoSpeeder: NSObject {

    /// Singleton instance of `VSVideoSpeeder`
    static var shared: VSVideoSpeeder = {
        return VSVideoSpeeder()
    }()

    /// Range is b/w 1x, 2x and 3x. Nothing will happen if the scale is out of range. Exporter will be nil if the url is invalid or an asset instance can't be made.
    func scaleAsset(fromURL url: URL, by scale: Int64, withMode mode: SpeedoMode, completion: @escaping (_ exporter: AVAssetExportSession?) -> Void) {

        /// Check the valid scale
        if scale < 1 || scale > 3 {
            /// Can not proceed, invalid range
            completion(nil)
            return
        }

        /// Asset
        let asset = AVAsset(url: url)

        /// Video Tracks
        let videoTracks = asset.tracks(withMediaType: AVMediaType.video)
        if videoTracks.count == 0 {
            /// Can not find any video track
            completion(nil)
            return
        }

        /// Get the scaled video duration
        let scaledVideoDuration = (mode == .Faster) ? CMTimeMake(value: asset.duration.value / scale, timescale: asset.duration.timescale) : CMTimeMake(value: asset.duration.value * scale, timescale: asset.duration.timescale)
        let timeRange = CMTimeRangeMake(start: CMTime.zero, duration: asset.duration)

        /// Video track
        let videoTrack = videoTracks.first!

        let mixComposition = AVMutableComposition()
        let compositionVideoTrack = mixComposition.addMutableTrack(withMediaType: AVMediaType.video, preferredTrackID: kCMPersistentTrackID_Invalid)

        /// Audio Tracks
        let audioTracks = asset.tracks(withMediaType: AVMediaType.audio)
        if audioTracks.count > 0 {
            /// Use audio if the video contains an audio track
            let compositionAudioTrack = mixComposition.addMutableTrack(withMediaType: AVMediaType.audio, preferredTrackID: kCMPersistentTrackID_Invalid)

            /// Audio track
            let audioTrack = audioTracks.first!
            do {
                try compositionAudioTrack?.insertTimeRange(timeRange, of: audioTrack, at: CMTime.zero)
                compositionAudioTrack?.scaleTimeRange(timeRange, toDuration: scaledVideoDuration)
            } catch _ {
                /// Ignore audio error
            }
        }

        do {
            try compositionVideoTrack?.insertTimeRange(timeRange, of: videoTrack, at: CMTime.zero)
            compositionVideoTrack?.scaleTimeRange(timeRange, toDuration: scaledVideoDuration)

            /// Keep original transformation
            compositionVideoTrack?.preferredTransform = videoTrack.preferredTransform

            /// Initialize Exporter now
            let outputFileURL = URL(fileURLWithPath: "/Users/thetiger/Desktop/scaledVideo.mov")
            /// Note: please use a directory path if you are testing on a device.
            if FileManager.default.fileExists(atPath: outputFileURL.absoluteString) {
                try FileManager.default.removeItem(at: outputFileURL)
            }
            let exporter = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality)
            exporter?.outputURL = outputFileURL
            exporter?.outputFileType = AVFileType.mov
            exporter?.shouldOptimizeForNetworkUse = true
            exporter?.exportAsynchronously(completionHandler: {
                completion(exporter)
            })
        } catch let error {
            print(error.localizedDescription)
            completion(nil)
            return
        }
    }
}
With the same use case:
let url = Bundle.main.url(forResource: "1", withExtension: "mp4")!
VSVideoSpeeder.shared.scaleAsset(fromURL: url, by: 3, withMode: SpeedoMode.Slower) { (exporter) in
    if let exporter = exporter {
        switch exporter.status {
        case .failed: do {
            print(exporter.error?.localizedDescription ?? "Error in exporting..")
        }
        case .completed: do {
            print("Scaled video has been generated successfully!")
        }
        case .unknown: break
        case .waiting: break
        case .exporting: break
        case .cancelled: break
        }
    } else {
        /// Error
        print("Exporter is not initialized.")
    }
}
Creating "Slow motion" video in iOS swift is not easy, that I came across many "slow motion" that came to know not working or some of the codes in them are depreciated. And so I finally figured a way to make slow motion in Swift.
note: This code can be used for 120fps are greater than that too.
You can make audio in slow motion in the same way I did
Here is the "code snippet I created for achieving slow motion"
func slowMotion(pathUrl: URL) {
let videoAsset = AVURLAsset.init(url: pathUrl, options: nil)
let currentAsset = AVAsset.init(url: pathUrl)
let vdoTrack = currentAsset.tracks(withMediaType: .video)[0]
let mixComposition = AVMutableComposition()
let compositionVideoTrack = mixComposition.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)
let videoInsertError: Error? = nil
var videoInsertResult = false
do {
try compositionVideoTrack?.insertTimeRange(
CMTimeRangeMake(start: .zero, duration: videoAsset.duration),
of: videoAsset.tracks(withMediaType: .video)[0],
at: .zero)
videoInsertResult = true
} catch let videoInsertError {
}
if !videoInsertResult || videoInsertError != nil {
//handle error
return
}
var duration: CMTime = .zero
duration = CMTimeAdd(duration, currentAsset.duration)
//MARK: You see this constant (videoScaleFactor) this helps in achieving the slow motion that you wanted. This increases the time scale of the video that makes slow motion
// just increase the videoScaleFactor value in order to play video in higher frames rates(more slowly)
let videoScaleFactor = 2.0
let videoDuration = videoAsset.duration
compositionVideoTrack?.scaleTimeRange(
CMTimeRangeMake(start: .zero, duration: videoDuration),
toDuration: CMTimeMake(value: videoDuration.value * Int64(videoScaleFactor), timescale: videoDuration.timescale))
compositionVideoTrack?.preferredTransform = vdoTrack.preferredTransform
let dirPaths = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).map(\.path)
let docsDir = dirPaths[0]
let outputFilePath = URL(fileURLWithPath: docsDir).appendingPathComponent("slowMotion\(UUID().uuidString).mp4").path
if FileManager.default.fileExists(atPath: outputFilePath) {
do {
try FileManager.default.removeItem(atPath: outputFilePath)
} catch {
}
}
let filePath = URL(fileURLWithPath: outputFilePath)
let assetExport = AVAssetExportSession(
asset: mixComposition,
presetName: AVAssetExportPresetHighestQuality)
assetExport?.outputURL = filePath
assetExport?.outputFileType = .mp4
assetExport?.exportAsynchronously(completionHandler: {
switch assetExport?.status {
case .failed:
print("asset output media url = \(String(describing: assetExport?.outputURL))")
print("Export session faiied with error: \(String(describing: assetExport?.error))")
DispatchQueue.main.async(execute: {
// completion(nil);
})
case .completed:
print("Successful")
let outputURL = assetExport!.outputURL
print("url path = \(String(describing: outputURL))")
PHPhotoLibrary.shared().performChanges({
PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: outputURL!)
}) { saved, error in
if saved {
print("video successfully saved in photos gallery view video in photos gallery")
}
if (error != nil) {
print("error in saing video \(String(describing: error?.localizedDescription))")
}
}
DispatchQueue.main.async(execute: {
// completion(_filePath);
})
case .none:
break
case .unknown:
break
case .waiting:
break
case .exporting:
break
case .cancelled:
break
case .some(_):
break
}
})
}
