I'm trying to combine one video track and two audio tracks with an AVMutableComposition. It mostly works (the result is a video with two audio tracks), but there is a strange problem: when I play the resulting video in the player, it plays for about 18 seconds and then restarts from the beginning!
After that it plays fine.
Why could this happen?
videoAsset, audioAsset, and audioAsset2 are plain AVAssets initialized with URLs.
// Creating the AVMutableComposition
let mixComposition = AVMutableComposition()

let videoTrack = mixComposition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: Int32(kCMPersistentTrackID_Invalid))
try! videoTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration),
                                ofTrack: videoAsset.tracksWithMediaType(AVMediaTypeVideo)[0],
                                atTime: kCMTimeZero)

// The first audio track is very short; it's just an overlay for the first few seconds.
let audioTrack = mixComposition.addMutableTrackWithMediaType(AVMediaTypeAudio, preferredTrackID: 0)
try! audioTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, audioAsset.duration),
                                ofTrack: audioAsset.tracksWithMediaType(AVMediaTypeAudio)[0],
                                atTime: kCMTimeZero)

// This one is the main audio track for the video. It is slightly longer than the
// video, so videoAsset.duration is used as the insertion duration.
let audioTrack2 = mixComposition.addMutableTrackWithMediaType(AVMediaTypeAudio, preferredTrackID: 1)
try! audioTrack2.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration),
                                 ofTrack: audioAsset2.tracksWithMediaType(AVMediaTypeAudio)[0],
                                 atTime: kCMTimeZero)
// Player
self.videoitem = AVPlayerItem(asset: mixComposition)
player = AVPlayer(playerItem: self.videoitem)

let playerLayer = AVPlayerLayer(player: player)
playerLayer.backgroundColor = UIColor.blackColor().CGColor
playerLayer.frame = CGRectMake(0, 0, screenWidth, screenHeight)
self.view.layer.addSublayer(playerLayer)

NSNotificationCenter.defaultCenter().addObserver(self,
                                                 selector: "playerItemDidReachEnd:",
                                                 name: AVPlayerItemDidPlayToEndTimeNotification,
                                                 object: player.currentItem)
player.volume = 0.7
player.play()
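The string selector above points at a handler that isn't shown in the question. For reference, a typical implementation looks something like this (a hypothetical sketch in the same Swift 2 syntax, not the asker's actual code):

func playerItemDidReachEnd(notification: NSNotification) {
    // Rewind the finished item so it can be played again from the start.
    if let item = notification.object as? AVPlayerItem {
        item.seekToTime(kCMTimeZero)
    }
}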
Related
I am trying to merge two videos together in AVFoundation.
I am using AVMutableComposition, and I add both tracks to the composition. The resulting video has the first video with its audio, followed by the second video's audio but no video.
How can I get the audio and video of both tracks?
Thank you
let composition = AVMutableComposition()
let audioTrack: AVMutableCompositionTrack = composition.addMutableTrack(withMediaType: AVMediaType.audio, preferredTrackID: kCMPersistentTrackID_Invalid)!
let videoTrack: AVMutableCompositionTrack = composition.addMutableTrack(withMediaType: AVMediaType.video, preferredTrackID: kCMPersistentTrackID_Invalid)!
let audioTrack2: AVMutableCompositionTrack = composition.addMutableTrack(withMediaType: AVMediaType.audio, preferredTrackID: kCMPersistentTrackID_Invalid)!
let videoTrack2: AVMutableCompositionTrack = composition.addMutableTrack(withMediaType: AVMediaType.video, preferredTrackID: kCMPersistentTrackID_Invalid)!

var outputURL = documentDirectory.appendingPathComponent("output-temp")
do {
    // `try` (not `try!`) so failures actually reach the catch below.
    try audioTrack.insertTimeRange(CMTimeRangeFromTimeToTime(start: startTime, end: endTime), of: asset.tracks(withMediaType: AVMediaType.audio)[0], at: CMTime.zero)
    try videoTrack.insertTimeRange(CMTimeRangeFromTimeToTime(start: startTime, end: endTime), of: asset.tracks(withMediaType: AVMediaType.video)[0], at: CMTime.zero)
    // Passing CMTime.invalid appends the range at the end of the track.
    try audioTrack2.insertTimeRange(CMTimeRangeFromTimeToTime(start: startTime, end: asset2.duration), of: asset2.tracks(withMediaType: AVMediaType.audio)[0], at: CMTime.invalid)
    try videoTrack2.insertTimeRange(CMTimeRangeFromTimeToTime(start: startTime, end: asset2.duration), of: asset2.tracks(withMediaType: AVMediaType.video)[0], at: CMTime.invalid)

    try manager.createDirectory(at: outputURL, withIntermediateDirectories: true, attributes: nil)
    let id = "id-\(Int.random(in: 0...199))"
    let mediaType = "mp4"
    outputURL = outputURL.appendingPathComponent("preVideo-\(id).\(mediaType)")
} catch let error {
    print(error)
}
The problem is that you are adding a second video track to the composition. You need to insert both videos into the same video track. Just delete your let videoTrack2 and go from there.
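A minimal sketch of that approach, assuming each source asset has at least one video and one audio track (appendAssets is a hypothetical helper, not part of the question's code):

import AVFoundation

// Append each asset's media to one shared video track and one shared
// audio track, back to back.
func appendAssets(_ assets: [AVAsset], into composition: AVMutableComposition) throws {
    let videoTrack = composition.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)!
    let audioTrack = composition.addMutableTrack(withMediaType: .audio, preferredTrackID: kCMPersistentTrackID_Invalid)!
    var cursor = CMTime.zero
    for asset in assets {
        let range = CMTimeRange(start: .zero, duration: asset.duration)
        try videoTrack.insertTimeRange(range, of: asset.tracks(withMediaType: .video)[0], at: cursor)
        try audioTrack.insertTimeRange(range, of: asset.tracks(withMediaType: .audio)[0], at: cursor)
        // Move the insertion point to the end of the clip just added.
        cursor = CMTimeAdd(cursor, asset.duration)
    }
}

Applied to the question's code, that means keeping audioTrack and videoTrack, dropping audioTrack2 and videoTrack2, and inserting asset2's tracks into the first pair at the end of asset's time range.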
2019-04-10 10:49:51.590008+0500 VTKaraokeView[869:1039603] *** Terminating app due to uncaught exception 'NSRangeException', reason: '*** -[__NSArray0 objectAtIndex:]: index 0 beyond bounds for empty NSArray'
Hi iOS gurus! I'm merging video and audio files (an .mp4 and an .mp3 file).
BACKGROUND & PROBLEM STATEMENT:
I'm working on a karaoke-style app: I record video with background music, then MERGE the recorded video and the background music into a new newVideo.mp4 file, and play newVideo.mp4 in an AVPlayerViewController. This works perfectly, BUT when my recorded video exceeds roughly 10 seconds, I get the exception above. The exception occurs on this line: let aAudioOfVideoTrack : AVAssetTrack = aVideoAsset.tracks(withMediaType: AVMediaTypeAudio)[0]
func mergeFilesWithUrl(videoUrl: URL, audioUrl: URL)
{
    let savePathUrl: NSURL = NSURL(fileURLWithPath: NSHomeDirectory() + "/Documents/newVideo.mp4")
    do { // delete the old video
        try FileManager.default.removeItem(at: savePathUrl as URL)
    } catch {
        print(error.localizedDescription)
    }

    let mixComposition: AVMutableComposition = AVMutableComposition()
    var mutableCompositionVideoTrack: [AVMutableCompositionTrack] = []
    var mutableCompositionAudioTrack: [AVMutableCompositionTrack] = []
    var mutableCompositionAudioOfVideoTrack: [AVMutableCompositionTrack] = []
    let totalVideoCompositionInstruction: AVMutableVideoCompositionInstruction = AVMutableVideoCompositionInstruction()

    // Start the merge.
    let aVideoAsset: AVAsset = AVAsset(url: videoUrl)
    let aAudioAsset: AVAsset = AVAsset(url: audioUrl)

    mutableCompositionVideoTrack.append(mixComposition.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid))
    mutableCompositionAudioTrack.append(mixComposition.addMutableTrack(withMediaType: AVMediaTypeAudio, preferredTrackID: kCMPersistentTrackID_Invalid))
    mutableCompositionAudioOfVideoTrack.append(mixComposition.addMutableTrack(withMediaType: AVMediaTypeAudio, preferredTrackID: kCMPersistentTrackID_Invalid))

    let aAudioOfVideoTrack: AVAssetTrack = aVideoAsset.tracks(withMediaType: AVMediaTypeAudio)[0] // HERE I get the error: index 0 beyond bounds for empty array
    let aVideoAssetTrack: AVAssetTrack = aVideoAsset.tracks(withMediaType: AVMediaTypeVideo)[0]
    let aAudioAssetTrack: AVAssetTrack = aAudioAsset.tracks(withMediaType: AVMediaTypeAudio)[0]

    do {
        try mutableCompositionAudioOfVideoTrack[0].insertTimeRange(CMTimeRangeMake(kCMTimeZero, aVideoAssetTrack.timeRange.duration), of: aAudioOfVideoTrack, at: kCMTimeZero)
        try mutableCompositionVideoTrack[0].insertTimeRange(CMTimeRangeMake(kCMTimeZero, aVideoAssetTrack.timeRange.duration), of: aVideoAssetTrack, at: kCMTimeZero)
        // My audio file is longer than the video file, so I use the video asset's
        // duration instead of the audio asset's duration.
        try mutableCompositionAudioTrack[0].insertTimeRange(CMTimeRangeMake(kCMTimeZero, aVideoAssetTrack.timeRange.duration), of: aAudioAssetTrack, at: kCMTimeZero)
        // Use this instead of the line above if your audio file and video file
        // have the same playing duration:
        // try mutableCompositionAudioTrack[0].insertTimeRange(CMTimeRangeMake(kCMTimeZero, aVideoAssetTrack.timeRange.duration), ofTrack: aAudioAssetTrack, atTime: kCMTimeZero)
    } catch {
        print(error)
    }

    totalVideoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, aVideoAssetTrack.timeRange.duration)

    let mutableVideoComposition: AVMutableVideoComposition = AVMutableVideoComposition()
    mutableVideoComposition.frameDuration = CMTimeMake(1, 30)
    mutableVideoComposition.renderSize = CGSize(width: 1280, height: 720)

    finalPath = savePathUrl.absoluteString

    let assetExport: AVAssetExportSession = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality)!
    assetExport.outputFileType = AVFileTypeMPEG4
    assetExport.outputURL = savePathUrl as URL
    assetExport.shouldOptimizeForNetworkUse = true
    assetExport.exportAsynchronously { () -> Void in
        switch assetExport.status {
        case AVAssetExportSessionStatus.completed:
            print("success")
        case AVAssetExportSessionStatus.failed:
            print("failed \(assetExport.error)")
        case AVAssetExportSessionStatus.cancelled:
            print("cancelled \(assetExport.error)")
        default:
            print("complete")
        }
    }
}
I spent almost a day solving this, and here is the solution.
I got a lot of help from iOS 8 iPad AVCaptureMovieFileOutput drops / loses / never gets audio track after 13 - 14 seconds of recording.
Just add this line and it works like a charm:
avCaptureMovieFileOutput.movieFragmentInterval = kCMTimeInvalid
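For context, that property is set on the capture output before recording starts. A minimal sketch, assuming a standard AVCaptureSession setup (input wiring elided):

import AVFoundation

let session = AVCaptureSession()
// ... add camera and microphone inputs to the session here ...
let movieFileOutput = AVCaptureMovieFileOutput()
// By default a movie fragment is written roughly every 10 seconds, and the
// linked question reports recordings longer than that losing their audio
// track. Setting the interval to an invalid time disables fragment writing.
movieFileOutput.movieFragmentInterval = kCMTimeInvalid
if session.canAddOutput(movieFileOutput) {
    session.addOutput(movieFileOutput)
}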
I am working on an Apple TV application. I have one URL for a video and one URL for the video's subtitles.
I want to play the video with its subtitles. Does anybody have a solution for this problem?
If you do, please let me know.
Video URL: "https://sample/hls_playlist.m3u8"
Subtitle URL: "https://samplesubtitle=AKIAJS5BLYTDP3J5UOFQ&Expires=1507918258&Signature=r%2FcV7UVSejOBYDKFHwMYtrvXUmM%3D"
This is my code:
func playVideo(_ videoURL: String) {
    // 1 - Load the video asset
    let videoAsset = AVURLAsset.init(url: URL(string: videoURL)!) as AVURLAsset

    // 2 - Create an AVMutableComposition object. This object will hold the
    //     AVMutableCompositionTrack instances.
    let mixComposition = AVMutableComposition.init()

    // 3 - Video track
    let videoTrack: AVMutableCompositionTrack? = mixComposition.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
    try? videoTrack?.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration), of: videoAsset.tracks(withMediaType: AVMediaTypeVideo)[0], at: kCMTimeZero)

    // 4 - Subtitle track
    let subtitleAsset = AVURLAsset.init(url: URL(string: self.subtitle!)!)
    let subtitleTrack: AVMutableCompositionTrack? = mixComposition.addMutableTrack(withMediaType: AVMediaTypeText, preferredTrackID: kCMPersistentTrackID_Invalid)
    try? subtitleTrack?.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration), of: subtitleAsset.tracks(withMediaType: AVMediaTypeText)[0], at: kCMTimeZero)

    self.playerItem = AVPlayerItem.init(asset: mixComposition)
    self.playerObj = AVPlayer.init(playerItem: self.playerItem)
    self.playerController?.player = self.playerObj
    // Observe the player item's status so playback readiness can be detected.
    self.playerItem.addObserver(self, forKeyPath: #keyPath(AVPlayerItem.status), options: NSKeyValueObservingOptions.new, context: nil)
    self.playerController?.player!.play()

    NotificationCenter.default.addObserver(self, selector: #selector(PlayerViewController.playerItemDidReachEnd(_:)), name: NSNotification.Name.AVPlayerItemDidPlayToEndTime, object: self.playerObj.currentItem)
}
The code below exports a video using AVMutableComposition. But in the exported video, if you want an image to display for three seconds after the source video finishes, is there a way to do that with AVMutableCompositionTrack, or do you need to add an image layer and animate its appearance after the video ends?
Eventually, the goal is to merge an arbitrary number of images and videos into one master video.
Unfortunately, during testing AVVideoCompositionCoreAnimationTool severely slowed down the export (from under 1 second to 10-20 seconds), so the goal is to avoid AVVideoCompositionCoreAnimationTool if possible.
// Create the composition object
let composition = AVMutableComposition()
let compositionVideoTrack = composition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: CMPersistentTrackID(kCMPersistentTrackID_Invalid))
let compositionAudioTrack = composition.addMutableTrackWithMediaType(AVMediaTypeAudio, preferredTrackID: CMPersistentTrackID(kCMPersistentTrackID_Invalid))
var insertTime = kCMTimeZero

// Extract the tracks from the source video
let videoURL = NSURL(fileURLWithPath: videoPath)
let videoAsset = AVURLAsset(URL: videoURL, options: nil)
let sourceVideoTrack = videoAsset.tracksWithMediaType(AVMediaTypeVideo)[0]
let sourceAudioTrack = videoAsset.tracksWithMediaType(AVMediaTypeAudio)[0]

do {
    try compositionVideoTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration), ofTrack: sourceVideoTrack, atTime: kCMTimeZero)
    try compositionAudioTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration), ofTrack: sourceAudioTrack, atTime: kCMTimeZero)
} catch {
    print("Error with insertTimeRange while exporting video: \(error)")
}

// Export the composition to a video file
let outputURL = getFilePath(getUniqueFilename(gMP4File))
let exporter = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetHighestQuality)
exporter!.outputURL = NSURL(fileURLWithPath: outputURL)
exporter!.outputFileType = AVFileTypeMPEG4
exporter!.exportAsynchronouslyWithCompletionHandler({
    self.exportDidFinish(exporter!)
})
After consulting others on SO and performing more web research, it seems this is not possible. Merging an image with a video into a master video that is playable outside the app seems to require AVVideoCompositionCoreAnimationTool.
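For reference, here is roughly what the AVVideoCompositionCoreAnimationTool route looks like, with the export-time cost described above. This is an untested sketch in current Swift: composition, image (a UIImage), and videoDuration are assumed to already exist, and the composition must be about 3 seconds longer than the video (for example via insertEmptyTimeRange) so the exporter has frames to render after the video ends.

let videoComposition = AVMutableVideoComposition(propertiesOf: composition)

let parentLayer = CALayer()
let videoLayer = CALayer()
let imageLayer = CALayer()
parentLayer.frame = CGRect(origin: .zero, size: videoComposition.renderSize)
videoLayer.frame = parentLayer.frame
imageLayer.frame = parentLayer.frame
imageLayer.contents = image.cgImage
imageLayer.opacity = 0 // hidden while the video plays

// Hold the image's opacity at 1 for 3 seconds starting when the video ends.
let showImage = CABasicAnimation(keyPath: "opacity")
showImage.fromValue = 1.0
showImage.toValue = 1.0
showImage.beginTime = AVCoreAnimationBeginTimeAtZero + CMTimeGetSeconds(videoDuration)
showImage.duration = 3.0
showImage.isRemovedOnCompletion = false
showImage.fillMode = .forwards
imageLayer.add(showImage, forKey: "showImage")

parentLayer.addSublayer(videoLayer)
parentLayer.addSublayer(imageLayer)
videoComposition.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videoLayer, in: parentLayer)
// Hand videoComposition to the AVAssetExportSession's videoComposition property.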
How to combine video clips with different orientation using AVFoundation
I have gone with the above answer and it works well. But I am facing a problem: the audio of the videos is being removed. All of my videos have sound, but after merging, the exported video is mute. Can anyone help? Thanks in advance.
I was also facing the same problem, but I got the solution.
Here is a Swift 4.2 version.
// Merge all videos.
func mergeAllVideos(completionHandler: @escaping (Bool) -> Void) {
    mixComposition = AVMutableComposition.init()

    // Track to hold the video.
    let compositionVideoTrack = mixComposition?.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)
    // Track to hold the audio.
    let compositionAudioTrack = mixComposition?.addMutableTrack(withMediaType: .audio, preferredTrackID: kCMPersistentTrackID_Invalid)

    var nextClipStartTime: CMTime = CMTime.zero

    // Iterate over the video array.
    for file_url in Constants.videoFileNameArr {
        let videoAsset = AVURLAsset.init(url: file_url)
        let timeRangeInAsset = CMTimeRangeMake(start: CMTime.zero, duration: videoAsset.duration)
        do {
            // Merge the video.
            try compositionVideoTrack?.insertTimeRange(CMTimeRange(start: CMTime.zero, duration: videoAsset.duration), of: videoAsset.tracks(withMediaType: .video)[0], at: nextClipStartTime)
            // Merge the audio.
            try compositionAudioTrack?.insertTimeRange(CMTimeRange(start: CMTime.zero, duration: videoAsset.duration), of: videoAsset.tracks(withMediaType: .audio)[0], at: nextClipStartTime)
        } catch {
            print(error)
        }
        // Advance the insertion point past the clip just added.
        nextClipStartTime = CMTimeAdd(nextClipStartTime, timeRangeInAsset.duration)
    }

    // Add a rotation to make the composition portrait.
    let rotationTransform = CGAffineTransform(rotationAngle: CGFloat(Double.pi / 2))
    compositionVideoTrack!.preferredTransform = rotationTransform

    // Save the final file.
    self.saveFinalFile(mixComposition!) { isDone in
        completionHandler(true)
    }
}
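A hypothetical call site for the function above; saveFinalFile and Constants.videoFileNameArr come from the answer's own code:

mergeAllVideos { success in
    // Runs once saveFinalFile has written the merged movie.
    print(success ? "Merge finished" : "Merge failed")
}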