AVMutableAudioMix multiple volume changes to single track - iOS

I'm working on an app that merges multiple video clips into one final video. I would like to give users the ability to mute individual clips if desired (so, only parts of the final merged video would be muted). I have wrapped the AVAssets in a class called "Video" that has a "shouldMute" property.
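(The wrapper class itself isn't shown, so here is a minimal sketch of its assumed shape; it needs Equatable conformance, or reference identity, for the clips.first / clips.last comparisons in the code below.)

import AVFoundation

// Assumed shape of the wrapper; not part of the original question.
class Video: Equatable {
    let asset: AVAsset
    var shouldMute: Bool

    init(asset: AVAsset, shouldMute: Bool = false) {
        self.asset = asset
        self.shouldMute = shouldMute
    }

    // Reference identity is enough for the first/last checks.
    static func == (lhs: Video, rhs: Video) -> Bool {
        return lhs === rhs
    }
}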
My problem is that when I set the volume of one of the AVAssetTracks to zero, it stays muted for the remainder of the final video. Here is my code:
var completeDuration: CMTime = CMTimeMake(0, 1)
var insertTime = kCMTimeZero
var layerInstructions = [AVVideoCompositionLayerInstruction]()
let mixComposition = AVMutableComposition()
let audioMix = AVMutableAudioMix()
let videoTrack = mixComposition.addMutableTrack(withMediaType: AVMediaType.video,
                                                preferredTrackID: kCMPersistentTrackID_Invalid)
let audioTrack = mixComposition.addMutableTrack(withMediaType: .audio,
                                                preferredTrackID: kCMPersistentTrackID_Invalid)

// iterate through video assets and merge together
for (i, video) in clips.enumerated() {
    let videoAsset = video.asset
    let clipDuration = videoAsset.duration
    do {
        if video == clips.first {
            insertTime = kCMTimeZero
        } else {
            insertTime = completeDuration
        }
        if let videoAssetTrack = videoAsset.tracks(withMediaType: .video).first {
            try videoTrack?.insertTimeRange(CMTimeRangeMake(kCMTimeZero, clipDuration), of: videoAssetTrack, at: insertTime)
            completeDuration = CMTimeAdd(completeDuration, clipDuration)
        }
        if let audioAssetTrack = videoAsset.tracks(withMediaType: .audio).first {
            try audioTrack?.insertTimeRange(CMTimeRangeMake(kCMTimeZero, clipDuration), of: audioAssetTrack, at: insertTime)
            if video.shouldMute {
                let audioMixInputParams = AVMutableAudioMixInputParameters()
                audioMixInputParams.trackID = audioTrack!.trackID
                audioMixInputParams.setVolume(0.0, at: insertTime)
                audioMix.inputParameters.append(audioMixInputParams)
            }
        }
    } catch let error as NSError {
        print("error: \(error)")
    }

    let videoInstruction = videoCompositionInstructionForTrack(track: videoTrack!, video: video)
    if video != clips.last {
        videoInstruction.setOpacity(0.0, at: completeDuration)
    }
    layerInstructions.append(videoInstruction)
} // end of video asset iteration
If I add another setVolume:atTime instruction to increase the volume back to 1.0 at the end of the clip, then the first volume instruction is completely ignored and the whole video plays at full volume.
In other words, this isn't working:
if video.shouldMute {
    let audioMixInputParams = AVMutableAudioMixInputParameters()
    audioMixInputParams.trackID = audioTrack!.trackID
    audioMixInputParams.setVolume(0.0, at: insertTime)
    audioMixInputParams.setVolume(1.0, at: completeDuration)
    audioMix.inputParameters.append(audioMixInputParams)
}
I have set the audioMix on both my AVPlayerItem and AVAssetExportSession. What am I doing wrong? What can I do to allow users to mute the time ranges of individual clips before merging into the final video?

Apparently I was going about this wrong. As you can see above, my composition has two AVMutableCompositionTracks: a video track and an audio track. Even though I inserted the time ranges of a series of other tracks into those two tracks, there are still ultimately only two tracks. So I only needed one AVMutableAudioMixInputParameters object to associate with my one audio track.
I initialized a single AVMutableAudioMixInputParameters object and then, after inserting the time range of each clip, checked whether it should be muted and set a volume ramp for the clip's time range (the time range relative to the entire audio track). Here's what that looks like, inside my clip iteration:
if let audioAssetTrack = videoAsset.tracks(withMediaType: .audio).first {
    try audioTrack?.insertTimeRange(CMTimeRangeMake(kCMTimeZero, clipDuration), of: audioAssetTrack, at: insertTime)
    if video.shouldMute {
        audioMixInputParams.setVolumeRamp(fromStartVolume: 0.0, toEndVolume: 0.0, timeRange: CMTimeRangeMake(insertTime, clipDuration))
    } else {
        audioMixInputParams.setVolumeRamp(fromStartVolume: 1.0, toEndVolume: 1.0, timeRange: CMTimeRangeMake(insertTime, clipDuration))
    }
}
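For completeness, here is a minimal sketch of the surrounding setup, using the same variable names as above (the export preset is an assumption): the single parameters object is created once before the loop, tied to the composition's audio track, and the finished mix is then attached to both the player item and the export session.

// Created once, before the clip iteration. Initializing with the track
// records its trackID, so it doesn't need to be set manually.
let audioMixInputParams = AVMutableAudioMixInputParameters(track: audioTrack)

// ... clip iteration inserts time ranges and sets volume ramps, as shown above ...

// After the loop, hand the single parameters object to the mix.
audioMix.inputParameters = [audioMixInputParams]

// Apply the mix for playback...
let playerItem = AVPlayerItem(asset: mixComposition)
playerItem.audioMix = audioMix

// ...and for export.
let exporter = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality)
exporter?.audioMix = audioMix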

Related

Swift - Converted Audio URL to Video URL Doesn't Play in Photos Library

I have an audio URL (.m4a) that I create using AVAudioRecorder. I want to share that audio on Instagram, so I convert the audio to a video. The issue is that after the conversion, when I save the video URL to the Files app using a UIActivityViewController, I can replay the video, see the duration (e.g. 7 seconds), and hear the audio with no problem; a black screen with a sound icon appears.
But when I save the video to the Photos Library using the UIActivityViewController, the video shows the 7 seconds, but nothing plays: the video is all gray and the sound icon doesn't show.
Why is the video successfully saving/playing in the Files app but saving and not playing in the Photos Library?
let asset: AVURLAsset = AVURLAsset(url: audioURL)
let mixComposition = AVMutableComposition()
guard let compositionTrack = mixComposition.addMutableTrack(withMediaType: .audio, preferredTrackID: CMPersistentTrackID()) else { return }
let track = asset.tracks(withMediaType: .audio)
guard let assetTrack = track.first else { return }
do {
    try compositionTrack.insertTimeRange(CMTimeRangeMake(start: .zero, duration: assetTrack.timeRange.duration), of: assetTrack, at: .zero)
} catch {
    print(error.localizedDescription)
}
guard let exporter = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetPassthrough) else { return }
let dirPath = NSTemporaryDirectory().appending("\(UUID().uuidString).mov")
let outputFileURL = URL(fileURLWithPath: dirPath)
exporter.outputFileType = .mov
exporter.outputURL = outputFileURL
exporter.shouldOptimizeForNetworkUse = true
exporter.exportAsynchronously {
    switch exporter.status {
    case .completed:
        guard let videoURL = exporter.outputURL else { return }
        // present UIActivityViewController to save videoURL and then save it to the Photos Library via 'Save Video'
    default:
        break
    }
}
As Lance rightfully pointed out, the issue is that while a file in the .mov or .mp4 format was exported, there was no video in it; it was just audio playing.
On reading a bit more: .mp4, for example, is just a digital multimedia container format that can very well hold only audio, so it's possible to save an audio file as a .mp4 / .mov.
What was needed was to add an empty video track to the AVMutableComposition. Lance already posted a great solution that works perfectly well and is more self-contained than the alternative solution I propose, which relies on having a blank 1-second video.
Overview of how it works
1. Get a blank video file that is 1 second long, in the resolution you want, for example 1920 x 1080.
2. Retrieve the video track from this video asset.
3. Retrieve the audio track from your audio file.
4. Create an AVMutableComposition, which will be used to merge the audio and video tracks.
5. Configure an AVMutableCompositionTrack with the audio track and add that to the main AVMutableComposition.
6. Configure an AVMutableVideoComposition with the video track.
7. Use an AVAssetExportSession to export the final video with the AVMutableComposition and the AVMutableVideoComposition.
The code
In most of the code below you will see multiple guard statements. You could combine them into a single guard, but with tasks like this it's useful to know exactly where a failure occurred, since an export can fail for several reasons.
Configuring the audio track
private func configureAudioTrack(_ audioURL: URL,
                                 inComposition composition: AVMutableComposition) -> AVMutableCompositionTrack?
{
    // Initialize an AVURLAsset with your audio file
    let audioAsset: AVURLAsset = AVURLAsset(url: audioURL)
    let trackTimeRange = CMTimeRange(start: .zero,
                                     duration: audioAsset.duration)
    // Get the audio track from the audio asset
    guard let sourceAudioTrack = audioAsset.tracks(withMediaType: .audio).first
    else
    {
        manageError(nil, withMessage: "Error retrieving audio track from source file")
        return nil
    }
    // Insert a new audio track into the AVMutableComposition
    guard let audioTrack = composition.addMutableTrack(withMediaType: .audio,
                                                       preferredTrackID: CMPersistentTrackID())
    else
    {
        // manage your error
        return nil
    }
    do {
        // Insert the contents of the audio source into the new audio track
        try audioTrack.insertTimeRange(trackTimeRange,
                                       of: sourceAudioTrack,
                                       at: .zero)
    }
    catch {
        // manage your error
    }
    return audioTrack
}
Configuring the video track
private func configureVideoTrack(inComposition composition: AVMutableComposition) -> AVMutableCompositionTrack?
{
    // Initialize a video asset with the empty video file
    guard let blankMoviePathURL = Bundle.main.url(forResource: "blank",
                                                  withExtension: "mp4")
    else
    {
        // manage errors
        return nil
    }
    let videoAsset = AVURLAsset(url: blankMoviePathURL)
    // Get the video track from the empty video
    guard let sourceVideoTrack = videoAsset.tracks(withMediaType: .video).first
    else
    {
        // manage errors
        return nil
    }
    // Insert a new video track into the AVMutableComposition
    guard let videoTrack = composition.addMutableTrack(withMediaType: .video,
                                                       preferredTrackID: kCMPersistentTrackID_Invalid)
    else
    {
        // manage errors
        return nil
    }
    let trackTimeRange = CMTimeRange(start: .zero,
                                     duration: composition.duration)
    do {
        // Insert the contents of the video source into the new video track
        try videoTrack.insertTimeRange(trackTimeRange,
                                       of: sourceVideoTrack,
                                       at: .zero)
    }
    catch {
        // manage errors
    }
    return videoTrack
}
Configure the video composition
// Configure the video properties like resolution and fps
private func createVideoComposition(with videoCompositionTrack: AVMutableCompositionTrack) -> AVMutableVideoComposition
{
    let videoComposition = AVMutableVideoComposition()
    // Set the fps (25 frames per second)
    videoComposition.frameDuration = CMTime(value: 1,
                                            timescale: 25)
    // Video dimensions
    videoComposition.renderSize = CGSize(width: 1920, height: 1080)
    // Specify the duration of the video composition
    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRange(start: .zero, duration: .indefinite)
    // Add the video composition track to a new layer
    let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videoCompositionTrack)
    let transform = videoCompositionTrack.preferredTransform
    layerInstruction.setTransform(transform, at: .zero)
    // Apply the layer configuration instructions
    instruction.layerInstructions = [layerInstruction]
    videoComposition.instructions = [instruction]
    return videoComposition
}
Configure the AVAssetExportSession
private func configureAVAssetExportSession(with composition: AVMutableComposition,
                                           videoComposition: AVMutableVideoComposition) -> AVAssetExportSession?
{
    // Configure export session
    guard let exporter = AVAssetExportSession(asset: composition,
                                              presetName: AVAssetExportPresetHighestQuality)
    else
    {
        // Manage your errors
        return nil
    }
    // Configure where the exported file will be stored
    let documentsURL = FileManager.default.urls(for: .documentDirectory,
                                                in: .userDomainMask)[0]
    let fileName = "\(UUID().uuidString).mov"
    let outputFileURL = documentsURL.appendingPathComponent(fileName)
    // Apply exporter settings
    exporter.videoComposition = videoComposition
    exporter.outputFileType = .mov
    exporter.outputURL = outputFileURL
    exporter.shouldOptimizeForNetworkUse = true
    return exporter
}
One important thing to note here is to set the exporter's preset to a movie preset like AVAssetExportPresetHighestQuality or AVAssetExportPresetLowQuality, i.e. something other than AVAssetExportPresetPassthrough, which, per the documentation, is:
A preset to export the asset in its current format, unless otherwise
prohibited.
With passthrough you would therefore still get an audio-only mp4 or mov file, since the current format of the composition is audio. I did not test this exhaustively, but a few tests confirmed it.
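If in doubt, AVFoundation can list the presets that are compatible with a given asset; a quick sanity check like this (not part of the original answer) can help when choosing:

// Ask AVFoundation which export presets apply to the composition.
let compatiblePresets = AVAssetExportSession.exportPresets(compatibleWith: composition)
print("Compatible export presets: \(compatiblePresets)")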
Finally, you can bring it all the above functions together like so:
func generateMovie(with audioURL: URL)
{
    delegate?.audioMovieExporterDidStart(self)
    let composition = AVMutableComposition()
    // Configure the audio and video tracks in the new composition
    guard let _ = configureAudioTrack(audioURL, inComposition: composition),
          let videoCompositionTrack = configureVideoTrack(inComposition: composition)
    else
    {
        // manage error
        return
    }
    let videoComposition = createVideoComposition(with: videoCompositionTrack)
    if let exporter = configureAVAssetExportSession(with: composition,
                                                    videoComposition: videoComposition)
    {
        exporter.exportAsynchronously
        {
            switch exporter.status {
            case .completed:
                guard let videoURL = exporter.outputURL
                else
                {
                    // manage errors
                    return
                }
                // notify someone the video is ready at videoURL
            default:
                // manage error
                break
            }
        }
    }
}
Final Thoughts
You could test drive a working sample here
I converted this into a simple library, available at the same link, in case you wish to use it; you can configure the orientation and fps, and even set a background color for the video.
If you just want the blank videos, you can get them from here
So it seems that although the code from my question did convert the audio file to a video file, there still wasn't a video track. I know this for a fact because after I got the exporter's videoURL from my question, I tried to add a watermark to it, and the watermark code kept crashing on
let videoTrack = asset.tracks(withMediaType: AVMediaType.video)[0]
Basically, the code from my question converts audio to video but doesn't add a video track.
What I assume is happening is that when the Files app reads the file, it knows that it's a .mov or .mp4 file and plays the audio track even if the video track is missing.
Conversely, when the Photos app reads the file, it also knows that it's a .mov or .mp4 file, but if there isn't a video track, it won't play anything.
I had to combine these 2 answers to get the audio to play as a video in the Photos app.
1st - I added my app icon (you can add any image) as 1 image to an array of images to make a video track, using the code from How do I export UIImage array as a movie? answered by @scootermg.
The code from @scootermg's answer is conveniently in 1 file at this GitHub link by @dldnh. In his code, in the ImageAnimator class, in the render function, instead of saving to the Library I returned the videoWriter's output URL in the completionHandler.
2nd - I combined the app icon video that I just made with the audio URL from my question, using the code from Swift Merge audio and video files into one video answered by @TungFam.
In the mixComposition from @TungFam's answer I used the audio URL's asset duration for the length of the video.
do {
    try mutableCompositionVideoTrack[0].insertTimeRange(CMTimeRangeMake(start: .zero,
                                                                        duration: aAudioAssetTrack.timeRange.duration),
                                                        of: aVideoAssetTrack,
                                                        at: .zero)
    try mutableCompositionAudioTrack[0].insertTimeRange(CMTimeRangeMake(start: .zero,
                                                                        duration: aAudioAssetTrack.timeRange.duration),
                                                        of: aAudioAssetTrack,
                                                        at: .zero)
    if let aAudioOfVideoAssetTrack = aAudioOfVideoAssetTrack {
        try mutableCompositionAudioOfVideoTrack[0].insertTimeRange(CMTimeRangeMake(start: .zero,
                                                                                   duration: aAudioAssetTrack.timeRange.duration),
                                                                   of: aAudioOfVideoAssetTrack,
                                                                   at: .zero)
    }
} catch {
    print(error.localizedDescription)
}

Swift 4 insertTimeRange adds video tracks but when playing does not show except 1st one

The following function, func mergeVideos(), reads prerecorded video files and creates a new movie whose tracks are built from those files. dump(mixComposition.tracks) returns the following for two video file tracks:
▿ 2 elements
  - <AVMutableCompositionTrack: 0x14661a30 trackID = 1, mediaType = vide, editCount = 1> #0
    - super: AVCompositionTrack
      - super: AVAssetTrack
        - super: NSObject
  - <AVMutableCompositionTrack: 0x14592bc0 trackID = 2, mediaType = vide, editCount = 1> #1
    - super: AVCompositionTrack
      - super: AVAssetTrack
        - super: NSObject
The problem is that the newly created video's playing duration is only as long as the first video's duration. The insert times and durations of the added videos are as follows, and there is no problem with them:
▿ 2 elements
- "1.mp4"
- "video.mp4"
First video's insert and duration.
time= (0.0, 5.2316666666666665)
Second video's insert and duration.
time= (5.2316666666666665, 5.4366666666666665)
After hours of searching we could not find a similar problem, so we decided to ask.
func mergeVideos() {
    let mixComposition = AVMutableComposition.init()
    var timeRange: CMTimeRange!
    var insertTime = kCMTimeZero
    for k in 0..<videoListOnDisk.count {
        let url = URL(fileURLWithPath: NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)[0]).appendingPathComponent(videoListOnDisk[k])
        let videoAsset = AVURLAsset(url: url)
        let track = mixComposition.addMutableTrack(withMediaType: AVMediaType.video,
                                                   preferredTrackID: Int32(kCMPersistentTrackID_Invalid))
        timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
        do {
            try track?.insertTimeRange(timeRange, of: videoAsset.tracks(withMediaType: .video)[0], at: insertTime)
        } catch let error as NSError {
            print("error when adding video to mix = \(error)")
        }
        insertTime = CMTimeAdd(insertTime, videoAsset.duration)
    }
    dump(mixComposition.tracks)

    let exporter = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality)
    exporter!.outputURL = fileURL(combinedVideoFileName)
    exporter!.outputFileType = AVFileType.mp4
    exporter!.shouldOptimizeForNetworkUse = false
    exporter!.exportAsynchronously() {
        DispatchQueue.main.async(execute: { () -> Void in
            print("I am done with exporting \(exporter?.status.rawValue)")
        })
    }
}
The problem is the repetition of this line:
let track = mixComposition.addMutableTrack...
Put that before the loop, so that you create only one video track and insert all of the video clips into that one track, as in the sketch below.
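Applied to the function above, the relevant portion would look roughly like this (a sketch reusing the question's own variable names):

let mixComposition = AVMutableComposition.init()
var insertTime = kCMTimeZero
// Create the single video track once, before the loop.
let track = mixComposition.addMutableTrack(withMediaType: AVMediaType.video,
                                           preferredTrackID: Int32(kCMPersistentTrackID_Invalid))
for k in 0..<videoListOnDisk.count {
    let url = URL(fileURLWithPath: NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)[0]).appendingPathComponent(videoListOnDisk[k])
    let videoAsset = AVURLAsset(url: url)
    do {
        // Append each clip to the same track so playback continues past the first clip.
        try track?.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration),
                                   of: videoAsset.tracks(withMediaType: .video)[0],
                                   at: insertTime)
    } catch let error as NSError {
        print("error when adding video to mix = \(error)")
    }
    insertTime = CMTimeAdd(insertTime, videoAsset.duration)
}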

Swift AVFoundation stitching multiple videos together and keep preferred transform

I'm trying to stitch multiple video clips together. If I stitch each AVAsset into one AVMutableCompositionTrack it works, but appending another asset recorded with the front-facing camera's mirroring enabled loses the transform on the first asset. Can I somehow use multiple AVMutableCompositionTracks of type video in one AVMutableComposition?
// create mix composition
let mixComposition = AVMutableComposition()
// insert video track
let videoCompositionTrack = mixComposition.addMutableTrack(withMediaType: AVMediaType.video, preferredTrackID: kCMPersistentTrackID_Invalid)
// keep track of total duration
var totalDuration = kCMTimeZero
// for each video clip add to mutable composition and transform each video layer
for (index, videoClip) in videoClips.enumerated() {
    if let videoAsset = videoClip.asset, let videoAssetTrack = videoAsset.tracks(withMediaType: AVMediaType.video).first {
        // insert current video track to composition
        try videoCompositionTrack!.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration), of: videoAssetTrack, at: totalDuration)
        videoCompositionTrack?.preferredTransform = videoAssetTrack.preferredTransform
        // shift duration to next
        totalDuration = CMTimeAdd(totalDuration, videoAsset.duration)
    }
}
// Use AVAssetExportSession to export video
let assetExport = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPreset1920x1080)
assetExport?.outputFileType = AVFileType.mp4
// get needed save url to save the video to recommended url
let movieDestinationUrl = self.getRecommendedSaveUrl()
// setting up asset export session
assetExport?.outputURL = movieDestinationUrl
assetExport?.shouldOptimizeForNetworkUse = true
// export video to file system async
assetExport?.exportAsynchronously(completionHandler: {
    assetExport?.cancelExport()
    switch assetExport!.status {
    case AVAssetExportSessionStatus.failed:
        break
    case AVAssetExportSessionStatus.cancelled:
        break
    default:
        DispatchQueue.main.async {
            completion?(movieDestinationUrl, nil)
        }
    }
    if ((assetExport?.error) != nil) {
        AppDelegate.logger.error("Could not create user video: \((assetExport?.error)!)")
        DispatchQueue.main.async {
            completion?(nil, assetExport?.error)
        }
    }
})
I'm trying to use something like the following, with multiple AVMutableCompositionTracks and different CGAffineTransform objects.
// create mix composition
let mixComposition = AVMutableComposition()
// keep track of total duration
var totalDuration = kCMTimeZero
// for each video clip add to mutable composition and transform each video layer
for (index, videoClip) in videoClips.enumerated() {
    if let videoAsset = videoClip.asset, let videoAssetTrack = videoAsset.tracks(withMediaType: AVMediaType.video).first {
        // insert video track
        let videoCompositionTrack = mixComposition.addMutableTrack(withMediaType: AVMediaType.video, preferredTrackID: CMPersistentTrackID(index))
        // insert current video track to composition
        try videoCompositionTrack!.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration), of: videoAssetTrack, at: totalDuration)
        videoCompositionTrack?.preferredTransform = videoAssetTrack.preferredTransform
        // shift duration to next
        totalDuration = CMTimeAdd(totalDuration, videoAsset.duration)
    }
}
// Use AVAssetExportSession to export video
let assetExport = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPreset1920x1080)
assetExport?.outputFileType = AVFileType.mp4
// get needed save url to save the video to recommended url
let movieDestinationUrl = self.getRecommendedSaveUrl()
// setting up asset export session
assetExport?.outputURL = movieDestinationUrl
assetExport?.shouldOptimizeForNetworkUse = true
// export video to file system async
assetExport?.exportAsynchronously(completionHandler: {
    assetExport?.cancelExport()
    switch assetExport!.status {
    case AVAssetExportSessionStatus.failed:
        break
    case AVAssetExportSessionStatus.cancelled:
        break
    default:
        DispatchQueue.main.async {
            completion?(movieDestinationUrl, nil)
        }
    }
    if ((assetExport?.error) != nil) {
        AppDelegate.logger.error("Could not create user video: \((assetExport?.error)!)")
        DispatchQueue.main.async {
            completion?(nil, assetExport?.error)
        }
    }
})
In the case above I'm not able to get any usable video; it is much shorter than it should be. I'm trying to avoid using any AVMutableVideoCompositionInstruction because it takes too long to process, but it would still be an option if it worked for any resolution, and especially with mirroring support.
// create mix composition
let mixComposition = AVMutableComposition()
// keep track of total duration
var totalDuration = kCMTimeZero
// keeps all layer transformations for each video asset
var videoCompositionLayerInstructions = [AVMutableVideoCompositionLayerInstruction]()
// for each video clip add to mutable composition and transform each video layer
for (index, videoClip) in videoClips.enumerated() {
    if let videoAsset = videoClip.asset {
        // use first video asset track for settings like height and width
        let videoAssetTrack = videoAsset.tracks(withMediaType: AVMediaType.video).first!
        // insert video track
        let videoCompositionTrack = mixComposition.addMutableTrack(withMediaType: AVMediaType.video, preferredTrackID: CMPersistentTrackID(index))
        // insert current video track to composition
        try videoCompositionTrack!.insertTimeRange(CMTimeRangeMake(totalDuration, videoAssetTrack.timeRange.duration), of: videoAssetTrack, at: totalDuration)
        videoCompositionTrack?.preferredTransform = videoAssetTrack.preferredTransform
        let videoCompositionLayerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videoAssetTrack)
        videoCompositionLayerInstruction.setTransform((videoCompositionTrack?.preferredTransform)!, at: totalDuration)
        videoCompositionLayerInstruction.setOpacity(0.0, at: videoAsset.duration)
        // apply instruction
        videoCompositionLayerInstructions.append(videoCompositionLayerInstruction)
        // shift duration to next
        totalDuration = CMTimeAdd(totalDuration, videoAsset.duration)
    }
}
let videoCompositionInstruction = AVMutableVideoCompositionInstruction()
videoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, totalDuration)
videoCompositionInstruction.layerInstructions = videoCompositionLayerInstructions

let mainComposition = AVMutableVideoComposition()
mainComposition.renderSize = CGSize(width: 1080, height: 1920)
mainComposition.frameDuration = CMTimeMake(1, 30)
mainComposition.instructions = [videoCompositionInstruction]

// Use AVAssetExportSession to export video
let assetExport = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPreset1920x1080)
assetExport?.outputFileType = AVFileType.mp4
// get needed save url to save the video to recommended url
let movieDestinationUrl = self.getRecommendedSaveUrl()
// setting up asset export session
assetExport?.outputURL = movieDestinationUrl
assetExport?.shouldOptimizeForNetworkUse = true
assetExport?.videoComposition = mainComposition
Does anybody have an idea how to implement this functionality?
Note: I don't need to care about audio at all.

AVMutableComposition - Only Playing First Track (Swift)

I have an array of [AVAsset], and I am trying to combine all of those Assets into a single Asset so that I can play back the video seamlessly (I tried using an AVQueuePlayer, but that does not play back the assets seamlessly).
Below is what I have so far, but when I try to play the final composition, it only plays the first track, even though it shows that it has all tracks and the total duration equals all of the tracks together.
Am I missing a step, even though it appears that all the tracks are in the composition? Perhaps I need to handle the AVPlayer differently if the AVPlayerItem has multiple tracks?
let playerLayer: AVPlayerLayer = AVPlayerLayer()
lazy var videoPlayer: AVPlayer = AVPlayer()
var videoClips = [AVAsset]()
let videoComposition = AVMutableComposition()
var playerItem: AVPlayerItem!
var lastTime: CMTime = kCMTimeZero

for clipIndex in videoClips {
    let videoCompositionTrack = videoComposition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: Int32(kCMPersistentTrackID_Invalid))
    do {
        try videoCompositionTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, clipIndex.duration),
                                                  ofTrack: clipIndex.tracksWithMediaType(AVMediaTypeVideo)[0],
                                                  atTime: lastTime)
        lastTime = CMTimeAdd(lastTime, clipIndex.duration)
    } catch {
        print("Failed to insert track")
    }
}
print("VideoComposition Tracks: \(videoComposition.tracks.count)") // Shows multiple tracks

playerItem = AVPlayerItem(asset: videoComposition)
print("PlayerItem Duration: \(playerItem.duration.seconds)") // Shows the duration of all tracks together
print("PlayerItem Tracks: \(playerItem.tracks.count)") // Shows same number of tracks as the VideoComposition track count

videoPlayer = AVPlayer(playerItem: playerItem)
playerLayer.player = videoPlayer
videoPlayer.volume = 0.0
videoPlayer.play() // Only plays the first track
I was able to figure out an answer to the most important question. In order to play all of the clips together, they need to be in the same track. To do this, move the following line outside of (before) the for loop:
let videoCompositionTrack = videoComposition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: Int32(kCMPersistentTrackID_Invalid))
Here is the full, corrected code:
let playerLayer: AVPlayerLayer = AVPlayerLayer()
lazy var videoPlayer: AVPlayer = AVPlayer()
var videoClips = [AVAsset]()
let videoComposition = AVMutableComposition()
var playerItem: AVPlayerItem!
var lastTime: CMTime = kCMTimeZero

let videoCompositionTrack = videoComposition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: Int32(kCMPersistentTrackID_Invalid))

for clipIndex in videoClips {
    do {
        try videoCompositionTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, clipIndex.duration),
                                                  ofTrack: clipIndex.tracksWithMediaType(AVMediaTypeVideo)[0],
                                                  atTime: lastTime)
        lastTime = CMTimeAdd(lastTime, clipIndex.duration)
    } catch {
        print("Failed to insert track")
    }
}
print("VideoComposition Tracks: \(videoComposition.tracks.count)") // Now shows a single track

playerItem = AVPlayerItem(asset: videoComposition)
print("PlayerItem Duration: \(playerItem.duration.seconds)") // Shows the duration of all clips together
print("PlayerItem Tracks: \(playerItem.tracks.count)") // Shows same number of tracks as the VideoComposition track count

videoPlayer = AVPlayer(playerItem: playerItem)
playerLayer.player = videoPlayer
videoPlayer.volume = 0.0
videoPlayer.play() // Does play all clips sequentially
EDIT: I mentioned earlier that I was still wondering how to play multiple tracks in one asset. That's not how it works, I understand now. A good resource:
https://developer.apple.com/library/mac/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/04_MediaCapture.html

Merging two or more videos in iOS

How to combine video clips with different orientation using AVFoundation
I went with the answer above and it is working well, but I am facing a problem: the audio of the video is being removed. All of my videos have sound, yet after merging, the exported video is mute. Can anyone help? Thanks in advance.
I was also facing the same problem, but I found the solution.
Swift 4.2 version.
// Merge all videos.
func mergeAllVideos(completionHandler: @escaping (Bool) -> Void) {
    mixComposition = AVMutableComposition.init()
    // To capture video.
    let compositionVideoTrack = mixComposition?.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)
    // To capture audio.
    let compositionAudioTrack = mixComposition?.addMutableTrack(withMediaType: .audio, preferredTrackID: kCMPersistentTrackID_Invalid)
    var nextClipStartTime: CMTime = CMTime.zero
    // Iterate over the video array.
    for file_url in Constants.videoFileNameArr {
        // Do the merging here.
        let videoAsset = AVURLAsset.init(url: file_url)
        let timeRangeInAsset = CMTimeRangeMake(start: CMTime.zero, duration: videoAsset.duration)
        do {
            // Merge video.
            try compositionVideoTrack?.insertTimeRange(CMTimeRange(start: CMTime.zero, duration: videoAsset.duration), of: videoAsset.tracks(withMediaType: .video)[0], at: nextClipStartTime)
            // Merge audio.
            try compositionAudioTrack?.insertTimeRange(CMTimeRange(start: CMTime.zero, duration: videoAsset.duration), of: videoAsset.tracks(withMediaType: .audio)[0], at: nextClipStartTime)
        } catch {
            print(error)
        }
        // Increment the time at which the next clip is added.
        nextClipStartTime = CMTimeAdd(nextClipStartTime, timeRangeInAsset.duration)
    }
    // Add rotation to make it portrait.
    let rotationTransform = CGAffineTransform(rotationAngle: CGFloat(Double.pi / 2))
    compositionVideoTrack!.preferredTransform = rotationTransform
    // Save the final file.
    self.saveFinalFile(mixComposition!) { isDone in
        completionHandler(true)
    }
}
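saveFinalFile isn't shown in the answer; here is a minimal sketch of what it could look like, assuming an AVAssetExportSession writing a .mov file to the temporary directory (the signature is inferred from the call site above):

import AVFoundation

func saveFinalFile(_ composition: AVMutableComposition,
                   completionHandler: @escaping (Bool) -> Void) {
    // Create an export session; the preset choice is an assumption.
    guard let exporter = AVAssetExportSession(asset: composition,
                                              presetName: AVAssetExportPresetHighestQuality) else {
        completionHandler(false)
        return
    }
    // Write to a unique file in the temporary directory (hypothetical location).
    let outputURL = URL(fileURLWithPath: NSTemporaryDirectory())
        .appendingPathComponent("\(UUID().uuidString).mov")
    exporter.outputURL = outputURL
    exporter.outputFileType = .mov
    exporter.shouldOptimizeForNetworkUse = true
    exporter.exportAsynchronously {
        completionHandler(exporter.status == .completed)
    }
}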
