iOS AVPlayer can't play 240 fps video

I recorded a 240 fps video after changing the AVCaptureDeviceFormat. If I save that video to the photo library, the slow-mo effect is there. But if I play that file from the documents directory using an AVPlayer, I can't see the slow-mo effect.
Code to play the video:
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:[AVAsset assetWithURL:[NSURL fileURLWithPath:fullPath]]];
AVPlayer *feedVideoPlayer = [AVPlayer playerWithPlayerItem:playerItem];
AVPlayerViewController *playerController = [[AVPlayerViewController alloc] init];
playerController.view.frame = CGRectMake(0, 0, videoPreviewView.frame.size.width, videoPreviewView.frame.size.height);
playerController.player = feedVideoPlayer;

It's a bit annoying, but I believe you'll need to re-create the video in an AVComposition if you don't want to lose quality. I'd love to know if there is another way, but this is what I've come up with. You can technically export the video via AVAssetExportSession, but using the PassThrough preset will produce the same video file, which won't be slow motion; you'd have to transcode it, which loses quality (AFAIK; see "Issue playing slow-mo AVAsset in AVPlayer" for that solution).
The first thing you'll need to do is grab the source media's original time mapping objects. You can do that like so:
let options = PHVideoRequestOptions()
options.version = PHVideoRequestOptionsVersion.current
options.deliveryMode = .highQualityFormat
PHImageManager.default().requestAVAsset(forVideo: phAsset, options: options, resultHandler: { (avAsset, mix, info) in
guard let avAsset = avAsset else { return }
let originalTimeMaps = avAsset.tracks(withMediaType: AVMediaTypeVideo)
.first?
.segments
.map { $0.timeMapping } ?? []
})
Once you have timeMappings of the original media (the one sitting in your documents directory), you can pass in the URL of that media and the original CMTimeMapping objects that you would like to recreate. Then create a new AVComposition that is ready to play in an AVPlayer. You'll need a class similar to this:
class CompositionMapper {
let url: URL
let timeMappings: [CMTimeMapping]
init(for url: URL, with timeMappings: [CMTimeMapping]) {
self.url = url
self.timeMappings = timeMappings
}
init(with asset: AVAsset, and timeMappings: [CMTimeMapping]) {
guard let asset = asset as? AVURLAsset else {
print("cannot get a base URL from this asset.")
fatalError()
}
self.timeMappings = timeMappings
self.url = asset.url
}
func compose() -> AVComposition {
let composition = AVMutableComposition(urlAssetInitializationOptions: [AVURLAssetPreferPreciseDurationAndTimingKey: true])
let emptyTrack = composition.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
let audioTrack = composition.addMutableTrack(withMediaType: AVMediaTypeAudio, preferredTrackID: kCMPersistentTrackID_Invalid)
let asset = AVAsset(url: url)
guard let videoAssetTrack = asset.tracks(withMediaType: AVMediaTypeVideo).first else { return composition }
var segments: [AVCompositionTrackSegment] = []
for map in timeMappings {
let segment = AVCompositionTrackSegment(url: url, trackID: kCMPersistentTrackID_Invalid, sourceTimeRange: map.source, targetTimeRange: map.target)
segments.append(segment)
}
emptyTrack.preferredTransform = videoAssetTrack.preferredTransform
emptyTrack.segments = segments
// mirror the segments onto the audio track only if the source actually has audio
if let _ = asset.tracks(withMediaType: AVMediaTypeAudio).first {
audioTrack.segments = segments
}
return composition.copy() as! AVComposition
}
}
You can then use the compose() function of your CompositionMapper class to give you an AVComposition that is ready to play in an AVPlayer, which should respect the CMTimeMapping objects that you've passed in.
let compositionMapper = CompositionMapper(for: someAVAssetURL, with: originalTimeMaps)
let mappedComposition = compositionMapper.compose()
let playerItem = AVPlayerItem(asset: mappedComposition)
let player = AVPlayer(playerItem: playerItem)
playerItem.audioTimePitchAlgorithm = AVAudioTimePitchAlgorithmVarispeed
Let me know if you need help converting this to Objective-C, but it should be relatively straightforward.

Related

Swift - Converted Audio URL to Video URL Doesn't Play in Photos Library

I have an audio URL (.m4a) that I create using AVAudioRecorder. I want to share that audio on Instagram, so I convert the audio to a video. The issue: after the conversion, when I save the video URL to the Files app using a UIActivityViewController, I can replay the video, see the duration (e.g. 7 seconds), and hear the audio with no problem. A black screen with a sound icon appears.
But when I save the video to the Photos Library using the UIActivityViewController, the video shows the 7 seconds but nothing plays, the video is all gray, and the sound icon doesn't show.
Why is the video successfully saving/playing in the Files app but saving and not playing in the Photos Library?
let asset: AVURLAsset = AVURLAsset(url: audioURL)
let mixComposition = AVMutableComposition()
guard let compositionTrack = mixComposition.addMutableTrack(withMediaType: .audio, preferredTrackID: CMPersistentTrackID()) else { return }
let track = asset.tracks(withMediaType: .audio)
guard let assetTrack = track.first else { return }
do {
try compositionTrack.insertTimeRange(CMTimeRangeMake(start: .zero, duration: assetTrack.timeRange.duration), of: assetTrack, at: .zero)
} catch {
print(error.localizedDescription)
}
guard let exporter = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetPassthrough) else { return }
let dirPath = NSTemporaryDirectory().appending("\(UUID().uuidString).mov")
let outputFileURL = URL(fileURLWithPath: dirPath)
exporter.outputFileType = .mov
exporter.outputURL = outputFileURL
exporter.shouldOptimizeForNetworkUse = true
exporter.exportAsynchronously {
switch exporter.status {
case .completed:
guard let videoURL = exporter.outputURL else { return }
// present UIActivityViewController to save videoURL and then save it to the Photos Library via 'Save Video'
default:
break
}
}
As Lance rightly pointed out, the issue is that while a file was exported in the .mov or .mp4 format, it contained no video; it was just audio.
On reading a bit more: .mp4, for example, is just a digital multimedia container format, which can perfectly well hold only audio, so it's possible to save an audio file as a .mp4 / .mov.
What was needed was to add an empty video track to the AVMutableComposition. Lance already posted a great solution that works perfectly well and is more self-contained than the alternative I propose here, which relies on having a blank one-second video.
Overview of how it works
1. Get a blank video file that is one second long, in the resolution you want (for example 1920 x 1080)
2. Retrieve the video track from this video asset
3. Retrieve the audio track from your audio file
4. Create an AVMutableComposition, which will be used to merge the audio and video tracks
5. Configure an AVMutableCompositionTrack with the audio track and add it to the main AVMutableComposition
6. Configure an AVMutableVideoComposition with the video track
7. Use an AVAssetExportSession to export the final video with the AVMutableComposition and the AVMutableVideoComposition
The code
In most of the code below you will see multiple guard statements. You could collapse them into a single guard, but with this kind of task it's useful to know exactly where a failure occurred, since there are several reasons an export can fail.
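For instance, a small error type (hypothetical, not part of the original code) makes it easy to report which step failed; each guard below could record or throw the matching case instead of a bare comment:

enum AudioMovieExportError: Error {
    case missingSourceAudioTrack
    case couldNotAddCompositionTrack
    case missingBlankVideo
    case exportSessionCreationFailed
}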
Configuring the audio track
private func configureAudioTrack(_ audioURL: URL,
inComposition composition: AVMutableComposition) -> AVMutableCompositionTrack?
{
// Initialize an AVURLAsset with your audio file
let audioAsset: AVURLAsset = AVURLAsset(url: audioURL)
let trackTimeRange = CMTimeRange(start: .zero,
duration: audioAsset.duration)
// Get the audio track from the audio asset
guard let sourceAudioTrack = audioAsset.tracks(withMediaType: .audio).first
else
{
manageError(nil, withMessage: "Error retrieving audio track from source file")
return nil
}
// Insert a new audio track into the AVMutableComposition
guard let audioTrack = composition.addMutableTrack(withMediaType: .audio,
preferredTrackID: kCMPersistentTrackID_Invalid)
else
{
// manage your error
return nil
}
do {
// Insert the contents of the audio source into the new audio track
try audioTrack.insertTimeRange(trackTimeRange,
of: sourceAudioTrack,
at: .zero)
}
catch {
// manage your error
}
return audioTrack
}
Configuring the video track
private func configureVideoTrack(inComposition composition: AVMutableComposition) -> AVMutableCompositionTrack?
{
// Initialize a video asset with the empty video file
guard let blankMoviePathURL = Bundle.main.url(forResource: "blank",
withExtension: "mp4")
else
{
// manage errors
return nil
}
// AVAsset(url:) isn't failable, so create the asset outside the guard
let videoAsset = AVAsset(url: blankMoviePathURL)
// Get the video track from the empty video
guard let sourceVideoTrack = videoAsset.tracks(withMediaType: .video).first
else
{
// manage errors
return nil
}
// Insert a new video track to the AVMutableComposition
guard let videoTrack = composition.addMutableTrack(withMediaType: .video,
preferredTrackID: kCMPersistentTrackID_Invalid)
else
{
// manage errors
return nil
}
let trackTimeRange = CMTimeRange(start: .zero,
duration: composition.duration)
do {
// Insert the contents of the video source into the new video track
try videoTrack.insertTimeRange(trackTimeRange,
of: sourceVideoTrack,
at: .zero)
}
catch {
// manage errors
}
return videoTrack
}
Configure the video composition
// Configure the video properties like resolution and fps
private func createVideoComposition(with videoCompositionTrack: AVMutableCompositionTrack) -> AVMutableVideoComposition
{
let videoComposition = AVMutableVideoComposition()
// Set the fps
videoComposition.frameDuration = CMTime(value: 1,
timescale: 25)
// Video dimensions
videoComposition.renderSize = CGSize(width: 1920, height: 1080)
// Specify the duration of the video composition
let instruction = AVMutableVideoCompositionInstruction()
// Cover the track's full time range; an indefinite duration would make the export fail
instruction.timeRange = videoCompositionTrack.timeRange
// Add the video composition track to a new layer
let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videoCompositionTrack)
let transform = videoCompositionTrack.preferredTransform
layerInstruction.setTransform(transform, at: .zero)
// Apply the layer configuration instructions
instruction.layerInstructions = [layerInstruction]
videoComposition.instructions = [instruction]
return videoComposition
}
Configure the AVAssetExportSession
private func configureAVAssetExportSession(with composition: AVMutableComposition,
videoComposition: AVMutableVideoComposition) -> AVAssetExportSession?
{
// Configure export session
guard let exporter = AVAssetExportSession(asset: composition,
presetName: AVAssetExportPresetHighestQuality)
else
{
// Manage your errors
return nil
}
// Configure where the exported file will be stored
let documentsURL = FileManager.default.urls(for: .documentDirectory,
in: .userDomainMask)[0]
let fileName = "\(UUID().uuidString).mov"
let dirPath = documentsURL.appendingPathComponent(fileName)
let outputFileURL = dirPath
// Apply exporter settings
exporter.videoComposition = videoComposition
exporter.outputFileType = .mov
exporter.outputURL = outputFileURL
exporter.shouldOptimizeForNetworkUse = true
return exporter
}
One important thing to note here is to set the exporter's preset to a movie preset like AVAssetExportPresetHighestQuality or AVAssetExportPresetLowQuality, i.e. something other than AVAssetExportPresetPassthrough, which as per the documentation is

A preset to export the asset in its current format, unless otherwise
prohibited.

With passthrough you would still get an audio-only mp4 or mov file, since the current format of the composition is audio. I did not test this extensively, but this is what a few tests showed.
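If you're unsure whether a given preset can produce the output type you want, you can also ask AVAssetExportSession up front. A minimal sketch (composition stands for the AVMutableComposition built above):

AVAssetExportSession.determineCompatibility(ofExportPreset: AVAssetExportPresetHighestQuality,
                                            with: composition,
                                            outputFileType: .mov) { isCompatible in
    // If this prints false, pick a different preset before exporting
    print("Preset can produce a .mov: \(isCompatible)")
}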
Finally, you can bring all the above functions together like so:
func generateMovie(with audioURL: URL)
{
delegate?.audioMovieExporterDidStart(self)
let composition = AVMutableComposition()
// Configure the audio and video tracks in the new composition
guard let _ = configureAudioTrack(audioURL, inComposition: composition),
let videoCompositionTrack = configureVideoTrack(inComposition: composition)
else
{
// manage error
return
}
let videoComposition = createVideoComposition(with: videoCompositionTrack)
if let exporter = configureAVAssetExportSession(with: composition,
videoComposition: videoComposition)
{
exporter.exportAsynchronously
{
switch exporter.status {
case .completed:
guard let videoURL = exporter.outputURL
else
{
// manage errors
return
}
// notify someone the video is ready at videoURL
default:
// manage the error (exporter.error has details)
break
}
}
}
}
Final Thoughts
You could test drive a working sample here
I converted this into a simple library if you wish to use it, where you can configure the orientation and fps and even set a background color for the video (available at the same link)
If you just want the blank videos, you can get them from here
So it seems that although the code from my question did convert the audio file to a video file, there still wasn't a video track. I know this for a fact because after I got the exporter's videoURL from my question, I tried to add a watermark to it, and in the watermark code it kept crashing on
let videoTrack = asset.tracks(withMediaType: AVMediaType.video)[0]
Basically the code from my question converts audio to a video file but doesn't add a video track.
What I assume is happening is when the Files app reads the file, it knows that it's a .mov or .mp4 file and then it'll play the audio track even if the video track is missing.
Conversely, when the Photos app reads the file, it also knows that it's a .mov or .mp4 file, but if there isn't a video track, it won't play anything.
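A quick way to verify this diagnosis is to inspect the exported file's tracks before sharing it (exportedVideoURL here is a stand-in for your exporter's output URL):

let asset = AVAsset(url: exportedVideoURL)
// Photos needs an actual video track; Files will play an audio-only .mov/.mp4
let hasVideoTrack = !asset.tracks(withMediaType: .video).isEmpty
let hasAudioTrack = !asset.tracks(withMediaType: .audio).isEmpty
print("video: \(hasVideoTrack), audio: \(hasAudioTrack)")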
I had to combine these 2 answers to get the audio to play as a video in the Photos app.
First, I added my app icon (you can use any image) as a single image in an array of images to make a video track, using the code from "How do I export UIImage array as a movie?" answered by @scootermg.
The code from @scootermg's answer is conveniently in one file, on GitHub by @dldnh. In his code, in the ImageAnimator class's render function, instead of saving to the library I returned the videoWriter's output URL in the completionHandler.
Second, I combined the app-icon video I had just made with the audio URL from my question, using the code from "Swift Merge audio and video files into one video" answered by @TungFam.
In the mixComposition from @TungFam's answer I used the audio asset's duration for the length of the video.
do {
try mutableCompositionVideoTrack[0].insertTimeRange(CMTimeRangeMake(start: .zero,
duration: aAudioAssetTrack.timeRange.duration),
of: aVideoAssetTrack,
at: .zero)
try mutableCompositionAudioTrack[0].insertTimeRange(CMTimeRangeMake(start: .zero,
duration: aAudioAssetTrack.timeRange.duration),
of: aAudioAssetTrack,
at: .zero)
if let aAudioOfVideoAssetTrack = aAudioOfVideoAssetTrack {
try mutableCompositionAudioOfVideoTrack[0].insertTimeRange(CMTimeRangeMake(start: .zero,
duration: aAudioAssetTrack.timeRange.duration),
of: aAudioOfVideoAssetTrack,
at: .zero)
}
} catch {
print(error.localizedDescription)
}

Setting multiple volumes for individual video tracks using AudioMixInputParameters in AVFoundation is not working in Swift iOS

I am working on a video-based application in Swift. As per the requirements, I have to select multiple videos from the device gallery, set up a different CIFilter effect and volume for each video asset, and then merge all the videos and save the final video. When the final video plays, the sound volume should change from clip to clip accordingly.
I have already merged all the selected video assets into one, each with a different CIFilter effect, but my problem is that when I try to set the volume for each video clip it doesn't work: I get the default volume in my final video. Here is my code:
func addFilerEffectAndVolumeToIndividualVideoClip(_ assetURL: URL, video: VideoFileModel, completion : ((_ session: AVAssetExportSession?, _ outputURL : URL?) -> ())?){
let videoFilteredAsset = AVAsset(url: assetURL)
print(videoFilteredAsset)
createVideoComposition(myAsset: videoFilteredAsset, videos: video)
let documentDirectory = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)[0]
let url = URL(fileURLWithPath: documentDirectory).appendingPathComponent("\(video.fileID)_\("FilterVideo").mov")
let filePath = url.path
let fileManager = FileManager.default
do {
if fileManager.fileExists(atPath: filePath) {
print("FILE AVAILABLE")
try fileManager.removeItem(atPath:filePath)
} else {
print("FILE NOT AVAILABLE")
}
} catch _ {
}
let composition: AVMutableComposition = AVMutableComposition()
let compositionVideo: AVMutableCompositionTrack = composition.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: CMPersistentTrackID())
let compositionAudioVideo: AVMutableCompositionTrack = composition.addMutableTrack(withMediaType: AVMediaTypeAudio, preferredTrackID: CMPersistentTrackID())
//Add video to the final record
do {
try compositionVideo.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoFilteredAsset.duration), of: videoFilteredAsset.tracks(withMediaType: AVMediaTypeVideo)[0], at: kCMTimeZero)
} catch _ {
}
//Extract audio from the video and the music
let audioMix: AVMutableAudioMix = AVMutableAudioMix()
var audioMixParam: [AVMutableAudioMixInputParameters] = []
let assetVideoTrack: AVAssetTrack = videoFilteredAsset.tracks(withMediaType: AVMediaTypeAudio)[0]
let videoParam: AVMutableAudioMixInputParameters = AVMutableAudioMixInputParameters(track: assetVideoTrack)
videoParam.trackID = compositionAudioVideo.trackID
//Set final volume of the audio record and the music
videoParam.setVolume(video.videoClipVolume, at: kCMTimeZero)
//Add setting
audioMixParam.append(videoParam)
//Add audio on final record
do {
try compositionAudioVideo.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoFilteredAsset.duration), of: assetVideoTrack, at: kCMTimeZero)
} catch _ {
assertionFailure()
}
//Fading volume out for background music
let durationInSeconds = CMTimeGetSeconds(videoFilteredAsset.duration)
let firstSecond = CMTimeRangeMake(CMTimeMakeWithSeconds(0, 1), CMTimeMakeWithSeconds(1, 1))
let lastSecond = CMTimeRangeMake(CMTimeMakeWithSeconds(durationInSeconds-1, 1), CMTimeMakeWithSeconds(1, 1))
videoParam.setVolumeRamp(fromStartVolume: 0, toEndVolume: video.videoClipVolume, timeRange: firstSecond)
videoParam.setVolumeRamp(fromStartVolume: video.videoClipVolume, toEndVolume: 0, timeRange: lastSecond)
//Add parameter
audioMix.inputParameters = audioMixParam
// Export part, left for facility
let exporter = AVAssetExportSession(asset: videoFilteredAsset, presetName: AVAssetExportPresetHighestQuality)!
exporter.videoComposition = videoFilterComposition
exporter.outputURL = url
exporter.outputFileType = AVFileTypeQuickTimeMovie
exporter.audioMix = audioMix
exporter.exportAsynchronously(completionHandler: { () -> Void in
completion!(exporter, url)
})
}
After that, I use another method to merge all the video clips with AVAssetExportSession; there I am not setting any AudioMixInputParameters.
Note: when I set the volume in the final merging method using AVAssetExportSession's AudioMixInputParameters, the volume changes for the whole video.
My question: is it possible to set a different volume for each video clip? Please advise. Thank you!
Here is the working solution for my question:
func addVolumeToIndividualVideoClip(_ assetURL: URL, video: VideoFileModel, completion : ((_ session: AVAssetExportSession?, _ outputURL : URL?) -> ())?){
//Create Asset from Url
let filteredVideoAsset: AVAsset = AVAsset(url: assetURL)
video.fileID = String(video.videoID)
//Get the path of App Document Directory
let documentDirectory = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)[0]
let url = URL(fileURLWithPath: documentDirectory).appendingPathComponent("\(video.fileID)_\("FilterVideo").mov")
let filePath = url.path
let fileManager = FileManager.default
do {
if fileManager.fileExists(atPath: filePath) {
print("FILE AVAILABLE")
try fileManager.removeItem(atPath:filePath)
} else {
print("FILE NOT AVAILABLE")
}
} catch _ {
}
let composition: AVMutableComposition = AVMutableComposition()
let compositionVideo: AVMutableCompositionTrack = composition.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: CMPersistentTrackID())
let compositionAudioVideo: AVMutableCompositionTrack = composition.addMutableTrack(withMediaType: AVMediaTypeAudio, preferredTrackID: CMPersistentTrackID())
//Add video to the final record
do {
try compositionVideo.insertTimeRange(CMTimeRangeMake(kCMTimeZero, filteredVideoAsset.duration), of: filteredVideoAsset.tracks(withMediaType: AVMediaTypeVideo)[0], at: kCMTimeZero)
} catch _ {
}
//Extract audio from the video and the music
let audioMix: AVMutableAudioMix = AVMutableAudioMix()
var audioMixParam: [AVMutableAudioMixInputParameters] = []
let assetVideoTrack: AVAssetTrack = filteredVideoAsset.tracks(withMediaType: AVMediaTypeAudio)[0]
let videoParam: AVMutableAudioMixInputParameters = AVMutableAudioMixInputParameters(track: assetVideoTrack)
videoParam.trackID = compositionAudioVideo.trackID
//Set final volume of the audio record and the music
videoParam.setVolume(video.videoVolume, at: kCMTimeZero)
//Add setting
audioMixParam.append(videoParam)
//Add audio on final record
//First: the audio of the record and Second: the music
do {
try compositionAudioVideo.insertTimeRange(CMTimeRangeMake(kCMTimeZero, filteredVideoAsset.duration), of: assetVideoTrack, at: kCMTimeZero)
} catch _ {
assertionFailure()
}
//Fading volume out for background music
let durationInSeconds = CMTimeGetSeconds(filteredVideoAsset.duration)
let firstSecond = CMTimeRangeMake(CMTimeMakeWithSeconds(0, 1), CMTimeMakeWithSeconds(1, 1))
let lastSecond = CMTimeRangeMake(CMTimeMakeWithSeconds(durationInSeconds-1, 1), CMTimeMakeWithSeconds(1, 1))
videoParam.setVolumeRamp(fromStartVolume: 0, toEndVolume: video.videoVolume, timeRange: firstSecond)
videoParam.setVolumeRamp(fromStartVolume: video.videoVolume, toEndVolume: 0, timeRange: lastSecond)
//Add parameter
audioMix.inputParameters = audioMixParam
//Remove the previous temp video if exist
let filemgr = FileManager.default
do {
if filemgr.fileExists(atPath: "\(video.fileID)_\("FilterVideo").mov") {
try filemgr.removeItem(atPath: "\(video.fileID)_\("FilterVideo").mov")
} else {
}
} catch _ {
}
//Export the final record
let exporter: AVAssetExportSession = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetHighestQuality)!
exporter.outputURL = url
exporter.outputFileType = AVFileTypeMPEG4
exporter.audioMix = audioMix
exporter.exportAsynchronously(completionHandler: { () -> Void in
completion!(exporter, url)
// self.saveVideoToLibrary(from: filePath)
})
}
I found that exporting an asset with the AVAssetExportPresetPassthrough preset doesn't have any impact on the output volume. When I tried AVAssetExportPresetLowQuality, the volume change was applied successfully.
I wish this were better documented somewhere :(
The working code:
// Assume we have:
let composition: AVMutableComposition
var inputParameters = [AVAudioMixInputParameters]()
// We add a track
let trackComposition = composition.addMutableTrack(...)
// Configure volume for this track
let inputParameter = AVMutableAudioMixInputParameters(track: trackComposition)
inputParameter.setVolume(desiredVolume, at: startTime)
// It works even without setting the `trackID`
// inputParameter.trackID = trackComposition.trackID
inputParameters.append(inputParameter)
// Apply gathered `inputParameters` before exporting
let audioMix = AVMutableAudioMix()
audioMix.inputParameters = inputParameters
// I found it's not working, if using `AVAssetExportPresetPassthrough`,
// so try `AVAssetExportPresetLowQuality` first
let export = AVAssetExportSession(..., presetName: AVAssetExportPresetLowQuality)
export.audioMix = audioMix
I tested this with multiple asset-track insertions into the same composition track, setting a different volume for each insertion. It seems to work.
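For illustration, a sketch of that scenario, assuming assetA and assetB are audio-bearing AVAssets: both are inserted into the same composition track, and the mix sets a different volume from each insertion's start time.

let composition = AVMutableComposition()
let track = composition.addMutableTrack(withMediaType: .audio,
                                        preferredTrackID: kCMPersistentTrackID_Invalid)!
do {
    try track.insertTimeRange(CMTimeRange(start: .zero, duration: assetA.duration),
                              of: assetA.tracks(withMediaType: .audio)[0], at: .zero)
    try track.insertTimeRange(CMTimeRange(start: .zero, duration: assetB.duration),
                              of: assetB.tracks(withMediaType: .audio)[0], at: assetA.duration)
} catch {
    print(error)
}
let params = AVMutableAudioMixInputParameters(track: track)
params.setVolume(0.2, at: .zero)           // first clip at 20%
params.setVolume(1.0, at: assetA.duration) // second clip at full volume
let audioMix = AVMutableAudioMix()
audioMix.inputParameters = [params]
// hand audioMix to the export session (with a non-passthrough preset) as above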

Unable to combine multiple audio files into one in a keyboard extension running on an iOS device

The keyboard extension I built uses audio files to play audio feedback when keys are pressed. At some point, the user has the ability to combine multiple audio files into a single audio file. Combining multiple audio files works in the simulator but does not work on a device.
func createSound(myNotes: [String], outputFile: String) {
// CMTime struct represents a length of time that is stored as rational number
var startTime: CMTime = kCMTimeZero
// AVMutableComposition creates new composition
let composition: AVMutableComposition = AVMutableComposition()
// AVMutableCompositionTrack - A mutable track in composition that you use to insert, remove, and scale track segments
if let compositionAudioTrack: AVMutableCompositionTrack = composition.addMutableTrack(withMediaType: AVMediaType.audio, preferredTrackID: kCMPersistentTrackID_Invalid) {
for url in allFilesForCharacters() {
let avAsset: AVURLAsset = AVURLAsset(url: url)
let timeRange: CMTimeRange = CMTimeRangeMake(kCMTimeZero, avAsset.duration)
let audioTrack: AVAssetTrack = avAsset.tracks(withMediaType: AVMediaType.audio)[0]
try! compositionAudioTrack.insertTimeRange(timeRange, of: audioTrack, at: startTime)
startTime = CMTimeAdd(startTime, timeRange.duration)
}
}
let exportPath: String = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0].path+"/"+outputFile+".m4a"
try? FileManager.default.removeItem(atPath: exportPath)
if let export: AVAssetExportSession = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetAppleM4A) {
export.outputURL = URL(fileURLWithPath: exportPath)
export.outputFileType = AVFileType.m4a
export.exportAsynchronously {
if export.status == AVAssetExportSessionStatus.completed {
NSLog("All done");
if let data = try? Data(contentsOf: export.outputURL!) {
let board = UIPasteboard.general
board.setData(data, forPasteboardType: kUTTypeMPEG4Audio as String)
}
}
else {
print(export.error?.localizedDescription ?? "")
}
}
}
}
So I was able to solve the issue after realizing that the app had an "Allow Full Access" switch in its settings. After turning it on, everything worked as expected. The app couldn't carry out its functionality because the system was blocking it from accessing the device's shared data (such as the general pasteboard).
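If you want to detect this at runtime instead of failing silently, UIInputViewController exposes hasFullAccess (iOS 11+); a minimal sketch inside the keyboard's view controller:

import UIKit

class KeyboardViewController: UIInputViewController {
    func combineSoundsIfAllowed() {
        // Without full access the extension can't use the general
        // pasteboard or other shared resources, so bail out early
        guard hasFullAccess else {
            // Ask the user to enable "Allow Full Access" in Settings
            return
        }
        // run the AVAssetExportSession / UIPasteboard code from above here
    }
}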

AVMutableVideoComposition request sourceImage is sometimes empty

This is sort of an extension of this question of mine, but I think it is different enough to merit its own question:
I am filtering videos of various sizes, scales, etc. by feeding them into an AVMutableVideoComposition.
This is the code that I currently have:
private func filterVideo(with filter: Filter?) {
if let player = playerLayer?.player, let playerItem = player.currentItem {
let composition = AVMutableComposition()
let videoAssetTrack = playerItem.asset.tracks(withMediaType: .video).first
let videoCompositionTrack = composition.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)
try? videoCompositionTrack?.insertTimeRange(CMTimeRange(start: kCMTimeZero, duration: playerItem.asset.duration), of: videoAssetTrack!, at: kCMTimeZero)
let videoComposition = AVMutableVideoComposition(asset: composition, applyingCIFiltersWithHandler: { (request) in
print(request.sourceImage.pixelBuffer) // Sometimes => nil
if let filter = filter {
if let filteredImage = filter.filterImage(request.sourceImage) {
request.finish(with: filteredImage, context: nil)
} else {
request.finish(with: RenderError.couldNotFilter)
}
} else {
request.finish(with: request.sourceImage, context: nil)
}
})
playerItem.videoComposition = videoComposition
}
}
filter is an instance of my custom Filter class, which has functions to filter a UIImage or CIImage.
The problem is that some videos get messed up, and it is exactly those problematic videos for which filteredImage is nil as well. This suggests that some source images are empty: their pixelBuffers are nil. Notably, the pixelBuffer is nil before I even feed the image into the filter.
Why is this happening, and how can I fix it?

iOS audio conversion from AVMutableComposition to AVAudioPlayer

I've been searching for a solution to this but so far I haven't found one.
Right now I'm creating an audio file with AVMutableComposition and writing it to disk with AVAssetExportSession so I can load it with AVAudioPlayer and play it at a certain time. Writing it to disk takes too much time, which slows things down and in some cases makes my app very slow.
Creating the audio takes very little time, so I was wondering if, instead of exporting the file to disk, I could export it as NSData or something similar so I can use it with AVAudioPlayer without writing to and reading from disk.
Code:
let soundPath = "temp"
let filepath = NSURL(fileURLWithPath: NSBundle.mainBundle().pathForResource(soundPath, ofType: "wav")!)
var songAsset = AVURLAsset(URL: filepath, options: nil)
var track = composition.addMutableTrackWithMediaType(AVMediaTypeAudio, preferredTrackID: CMPersistentTrackID())
var source: AnyObject = songAsset.tracksWithMediaType(AVMediaTypeAudio)[0]
var startTime = CMTimeMakeWithSeconds(0, 44100)
var time_insert = CMTimeMakeWithSeconds(atTime, 44100)
var trackDuration = songAsset.duration
var longestTime = CMTimeMake(882000, 44100)
var timeRange = CMTimeRangeMake(startTime, longestTime)
var trackMix = AVMutableAudioMixInputParameters(track: track)
trackMix.setVolume(volume, atTime: startTime)
audioMixParams.insert(trackMix, atIndex: audioMixParams.count)
track.insertTimeRange(timeRange, ofTrack: source as AVAssetTrack, atTime: time_insert, error: nil)
This happens several times in a loop, generating a whole track.
Finally I use
var exporter = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetAppleM4A)
exporter.audioMix = audioMix
exporter.outputFileType = "com.apple.m4a-audio"
exporter.outputURL = NSURL(fileURLWithPath: fileName)
exporter.exportAsynchronouslyWithCompletionHandler({
switch exporter.status{
case AVAssetExportSessionStatus.Failed:
println("failed \(exporter.error)")
case AVAssetExportSessionStatus.Cancelled:
println("cancelled \(exporter.error)")
default:
println("complete")
callback()
}
})
to export the file.
After that, it calls a callback that loads the sound and schedules it to loop at a given time.
The reason I don't loop with the normal loop property is that each file might not have the same length as loopDuration.
callback: {
self.fillSoundArray()
self.fillSoundArray()
self.fillSoundArray()
}
private func fillSoundArray() {
var currentloop = self.current_loop++
let documentsPath = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)[0] as NSString
var fileName = documentsPath + "/temp.mp4"
var tempAudio = AVAudioPlayer(contentsOfURL: NSURL(fileURLWithPath: fileName), error: nil)
if (self.start_time == 0) {
self.start_time = tempAudio.deviceCurrentTime
}
queue.insert(tempAudio, atIndex: queue.count)
tempAudio.prepareToPlay()
tempAudio.delegate = self
tempAudio.playAtTime(start_time + currentloop*self.loopDuration)
}
Use AVPlayerItem & AVPlayer to make a quick preview of your audio composition
AVPlayerItem* previewPlayerItem = [[AVPlayerItem alloc] initWithAsset:mutableCompositionMain
automaticallyLoadedAssetKeys:@[@"tracks", @"duration"]];
previewPlayerItem.videoComposition = mutableVideoComposition;
AVPlayer* previewPlayer = [[AVPlayer alloc] initWithPlayerItem:previewPlayerItem];
[[NSNotificationCenter defaultCenter] addObserver:self
selector:@selector(dismissPreviewPlayer:)
name:AVPlayerItemDidPlayToEndTimeNotification
object:[previewPlayer currentItem]];
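A Swift sketch of the same idea for the audio case in the question (composition and audioMix stand for your AVMutableComposition and AVMutableAudioMix): play the composition directly instead of exporting it to disk first.

let item = AVPlayerItem(asset: composition)
item.audioMix = audioMix // per-track volumes apply without any export
let player = AVPlayer(playerItem: item)
let endObserver = NotificationCenter.default.addObserver(
    forName: .AVPlayerItemDidPlayToEndTime,
    object: item,
    queue: .main) { _ in
    // restart playback or queue the next loop here
}
player.play()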
