Setting Track ID - iOS

I am having trouble setting the track ID when I set up the input parameters for my audioMix. How do I set the trackID? I have also tried params.trackID(track2.trackID), but that gives me this error: CMPersistantTrackID -> $T4 is not identical to CMPersistantTrackID. I am trying to translate this line: [audioInputParams setTrackID:[track trackID]]; from https://developer.apple.com/library/ios/qa/qa1716/_index.html
Error when I run the code below:
Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '-[__NSArrayM trackID]: unrecognized selector sent to instance 0x7f98c34cbfc0'
Code:
let type = AVMediaTypeAudio
let asset1 = AVURLAsset(URL: beatLocationURL, options: nil)
let arr2 = asset1.tracksWithMediaType(type)
let track2 = arr2.last as AVAssetTrack
let asset = AVURLAsset(URL: vocalURL, options:nil)
let arr3 = asset.tracksWithMediaType(type)
let track3 = arr3.last as AVAssetTrack
var trackParams = NSMutableArray()
let params = AVMutableAudioMixInputParameters(track:track2)
params.setVolume(0.0, atTime:kCMTimeZero)
params.trackID = track2.trackID <--- this line
trackParams.addObject(params)
let params1 = AVMutableAudioMixInputParameters(track:track3)
params1.setVolume(1.0, atTime: kCMTimeZero)
params1.trackID = track3.trackID <-- this line also
trackParams.addObject(params1)
let mix = AVMutableAudioMix()
mix.inputParameters = [trackParams]
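The -[__NSArrayM trackID] crash is most likely caused by the last line: trackParams is already an NSMutableArray, so wrapping it in another array literal makes inputParameters an array of arrays, and AVFoundation ends up sending trackID to the inner array. A minimal sketch of a fix, reusing the track2/track3 setup above:

```swift
// Sketch: pass the parameters objects in a flat array.
// trackID is a settable CMPersistentTrackID (Int32) property, so plain
// assignment is the right translation of
// [audioInputParams setTrackID:[track trackID]].
let params = AVMutableAudioMixInputParameters(track: track2)
params.setVolume(0.0, atTime: kCMTimeZero)
params.trackID = track2.trackID

let params1 = AVMutableAudioMixInputParameters(track: track3)
params1.setVolume(1.0, atTime: kCMTimeZero)
params1.trackID = track3.trackID

let mix = AVMutableAudioMix()
mix.inputParameters = [params, params1] // flat array, no NSMutableArray wrapper
```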

Related

AudioKit Apply Gain to Mono PCM Buffer

I have a mono audio file, which I am opening and trying to apply gain to with the following code:
let inputFile = try! AVAudioFile(forReading: url)
let settings = inputFile.fileFormat.settings
let outputFile = try! AVAudioFile(forWriting: outputURL, settings: settings)
let sourceBuffer = try! AVAudioPCMBuffer(file: inputFile)
let engine = AudioEngine()
let player = AudioPlayer()
let compressor = Compressor(player)
compressor.masterGain = AUValue(gain)
engine.output = compressor
compressor.start()
do {
try engine.start()
player.start()
player.scheduleBuffer(sourceBuffer!, at: nil, options: [], completionHandler: nil)
try engine.renderToFile(outputFile, duration: inputFile.duration)
} catch {
completion(.failure(error))
}
player.scheduleBuffer crashes with the following exception:
*** Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'required condition is false: _outputFormat.channelCount == buffer.format.channelCount'
terminating with uncaught exception of type NSException
But how do I set the correct number of channels?
I already tried Settings.audioFormat = inputFile.fileFormat and Settings.audioFormat = sourceBuffer!.format before initializing the AudioEngine, with the same result.
I appreciate any help. Thanks.
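One likely cause, offered as an assumption since the AudioKit wiring above hides it: the player node's output format defaults to the hardware's stereo format, while the scheduled buffer is mono. With a plain AVAudioEngine the usual fix is to connect the player using the buffer's own format; in AudioKit, setting Settings.channelCount = 1 before creating the engine may have the same effect. A sketch with plain AVFoundation:

```swift
import AVFoundation

// Sketch (plain AVAudioEngine, not AudioKit): connect the player node
// with the buffer's own mono format so the node's output channel count
// matches the buffer's channel count.
let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
engine.attach(player)
engine.connect(player, to: engine.mainMixerNode, format: sourceBuffer!.format)

try engine.start()
player.scheduleBuffer(sourceBuffer!, at: nil, options: [], completionHandler: nil)
player.play()
```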

How to create anAudioSampleBuffer for CMSampleBufferGetFormatDescription in iOS Swift

I have been working on video compression in iOS Swift, following this SO answer. It works fine until I change the file type in this piece of code to .mp4:
let videoWriter = try! AVAssetWriter(outputURL: outputURL as URL, fileType: AVFileType.mov)
There are reasons I need the output in .mp4 format, so when I make that change the app crashes and gives me this error:
2020-04-27 18:20:52.573614+0500 BrightCaster[7847:1513728] *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[AVAssetWriter addInput:] In order to perform passthrough to file type public.mpeg-4, please provide a format hint in the AVAssetWriterInput initializer'
*** First throw call stack:
(0x1b331d5f0 0x1b303fbcc 0x1bd53b2b0 0x102383c0c 0x102382164 0x1021897cc 0x1b6ca73bc 0x1b6caba7c 0x1b6daec94 0x1b7835080 0x1b7834d30 0x1e9d077b4 0x1b786a764 0x1b783eb68 0x1b783f070 0x1e9d468f4 0x1b783f1c0 0x1e9d468f4 0x1b9e21d9c 0x105173730 0x105181710 0x1b329b748 0x1b329661c 0x1b3295c34 0x1bd3df38c 0x1b73c822c 0x10230f8a0 0x1b311d800)
libc++abi.dylib: terminating with uncaught exception of type NSException
So I searched on SO and found this question relevant to my problem, but when I try to add its answer to my function, it gives me the error anAudioSampleBuffer not defined. As I am totally new to the audio/video domain, I am unable to understand why, or how to resolve it.
The piece of code from answer that I am adding with my function is below.
//setup audio writer
//let formatDesc = CMSampleBufferGetFormatDescription(anAudioSampleBuffer)
//let audioWriterInput = AVAssetWriterInput(mediaType: AVMediaType.audio, outputSettings: nil, sourceFormatHint: formatDesc)
let audioWriterInput = AVAssetWriterInput(mediaType: AVMediaType.audio, outputSettings: nil)
audioWriterInput.expectsMediaDataInRealTime = false
videoWriter.add(audioWriterInput)
The commented-out part is not working. Any help would be appreciated. Thanks.
Whole function for conversion is following
func convertVideoToLowQuailtyWithInputURL(inputURL: URL, outputURL: URL, completion: @escaping (Bool , _ url: String) -> Void) {
let videoAsset = AVURLAsset(url: inputURL as URL, options: nil)
let videoTrack = videoAsset.tracks(withMediaType: AVMediaType.video)[0]
let videoSize = videoTrack.naturalSize
let videoWriterCompressionSettings = [
AVVideoAverageBitRateKey : Int(125000)
]
let videoWriterSettings:[String : AnyObject] = [
AVVideoCodecKey : AVVideoCodecH264 as AnyObject,
AVVideoCompressionPropertiesKey : videoWriterCompressionSettings as AnyObject,
AVVideoWidthKey : Int(videoSize.width) as AnyObject,
AVVideoHeightKey : Int(videoSize.height) as AnyObject
]
let videoWriterInput = AVAssetWriterInput(mediaType: AVMediaType.video, outputSettings: videoWriterSettings)
videoWriterInput.expectsMediaDataInRealTime = true
videoWriterInput.transform = videoTrack.preferredTransform
let videoWriter = try! AVAssetWriter(outputURL: outputURL as URL, fileType: AVFileType.mov) // for now its converting in .mov I THINK SO.
videoWriter.add(videoWriterInput)
//setup video reader
let videoReaderSettings:[String : AnyObject] = [
kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange) as AnyObject
]
let videoReaderOutput = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: videoReaderSettings)
var videoReader: AVAssetReader!
do{
videoReader = try AVAssetReader(asset: videoAsset)
}
catch {
print("video reader error: \(error)")
completion(false, "")
}
videoReader.add(videoReaderOutput)
//setup audio writer
//let formatDesc = CMSampleBufferGetFormatDescription(anAudioSampleBuffer) // this line gives the 'anAudioSampleBuffer not defined' error
//let audioWriterInput = AVAssetWriterInput(mediaType: AVMediaType.audio, outputSettings: nil, sourceFormatHint: formatDesc)
let audioWriterInput = AVAssetWriterInput(mediaType: AVMediaType.audio, outputSettings: nil)
audioWriterInput.expectsMediaDataInRealTime = false
videoWriter.add(audioWriterInput)
//setup audio reader
let audioTrack = videoAsset.tracks(withMediaType: AVMediaType.audio)[0]
let audioReaderOutput = AVAssetReaderTrackOutput(track: audioTrack, outputSettings: nil)
let audioReader = try! AVAssetReader(asset: videoAsset)
audioReader.add(audioReaderOutput)
videoWriter.startWriting()
//start writing from video reader
videoReader.startReading()
videoWriter.startSession(atSourceTime: CMTime.zero)
let processingQueue = DispatchQueue(label: "processingQueue1")
videoWriterInput.requestMediaDataWhenReady(on: processingQueue, using: {() -> Void in
while videoWriterInput.isReadyForMoreMediaData {
let sampleBuffer:CMSampleBuffer? = videoReaderOutput.copyNextSampleBuffer();
if videoReader.status == .reading && sampleBuffer != nil {
videoWriterInput.append(sampleBuffer!)
}
else {
videoWriterInput.markAsFinished()
if videoReader.status == .completed {
//start writing from audio reader
audioReader.startReading()
videoWriter.startSession(atSourceTime: CMTime.zero)
let processingQueue = DispatchQueue(label: "processingQueue2")
audioWriterInput.requestMediaDataWhenReady(on: processingQueue, using: {() -> Void in
while audioWriterInput.isReadyForMoreMediaData {
let sampleBuffer:CMSampleBuffer? = audioReaderOutput.copyNextSampleBuffer()
if audioReader.status == .reading && sampleBuffer != nil {
audioWriterInput.append(sampleBuffer!)
}
else {
audioWriterInput.markAsFinished()
if audioReader.status == .completed {
videoWriter.finishWriting(completionHandler: {() -> Void in
completion(true, "\(videoWriter.outputURL)")
})
}
}
}
})
}
}
}
})
}
You can output as mp4, passing audio through (no transcode) by providing that format hint like so:
let audioTrack = videoAsset.tracks(withMediaType: AVMediaType.audio)[0]
let audioWriterInput = AVAssetWriterInput(mediaType: AVMediaType.audio, outputSettings: nil, sourceFormatHint: audioTrack.formatDescriptions[0] as! CMFormatDescription)
Note the new position of audioTrack definition.
I imagine both of Apple's .mov and .mp4 implementations need to know the compressed audio format to write the file, but I guess .mov is ok with inferring that information after initialisation, whereas .mp4 is not. Maybe it's another AVFoundation Surprise!
In your case I saw that it would be tiresome to rework the code to get the audio format from the first sample buffer, but then I remembered that the format is available from the input audio track.

Display local PDF file using UIDocumentInteractionControllerDelegate

I want to display a local PDF file using UIDocumentInteractionController.
Here is my code :
pole = Pole.getWithMinor(minorBeacon)
var address = pole.pdf
if pole != nil {
var urlpath = NSBundle.mainBundle().pathForResource(pole.pdf, ofType: "pdf")
let url : NSURL! = NSURL(string: urlpath!)
//pdfView.loadRequest(NSURLRequest(URL: url))
println(url)
let docController = UIDocumentInteractionController(URL : url)
docController.UTI = "com.adobe.pdf"
docController.delegate = self
docController.presentPreviewAnimated(true)
the error is :
2015-06-02 15:57:08.390 LePetitPoucet[4232:2026014] *** Assertion failure in -[UIDocumentInteractionController setURL:], /SourceCache/UIKit/UIKit-3318.93/UIDocumentInteractionController.m:1024
2015-06-02 15:57:08.391 LePetitPoucet[4232:2026014] *** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'UIDocumentInteractionController: invalid scheme (null). Only the file scheme is supported.'
When I create a web view, the PDF is displayed fine, but I want to use the native PDF viewer instead. Thanks for your help!
This one always gets me too.
let url : NSURL! = NSURL(string: urlpath!) is wrong: NSURL(string:) expects a string with a URL scheme, but pathForResource returns a plain file path, which is why the assertion reports an invalid (null) scheme.
you want
let url : NSURL! = NSURL.fileURLWithPath(urlpath!)
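Putting that fix in context, a minimal sketch of the presenting code using the question's own names (the optional binding is an addition that also avoids force-unwrapping when the resource is missing):

```swift
if let urlpath = NSBundle.mainBundle().pathForResource(pole.pdf, ofType: "pdf") {
    let url = NSURL.fileURLWithPath(urlpath) // file:// scheme, as the assertion requires
    let docController = UIDocumentInteractionController(URL: url)
    docController.UTI = "com.adobe.pdf"
    docController.delegate = self
    docController.presentPreviewAnimated(true)
}
```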

Invalid File Output AVAssetExport

When I run my app and tap the save button to save the two files mixed together, my app crashes saying invalid file output. I don't see why this error shows up, because the two files being mixed are mp3 and the output file is mp3.
Code:
@IBAction func mixButton(sender: AnyObject!) {
let oldAsset = self.player.currentItem.asset
let type = AVMediaTypeAudio
let audioFile = NSBundle.mainBundle().URLForResource("file1", withExtension: "mp3")
let asset1 = AVURLAsset(URL: audioFile, options: nil)
let arr2 = asset1.tracksWithMediaType(type)
let track2 = arr2.last as AVAssetTrack
let duration : CMTime = track2.timeRange.duration
let comp = AVMutableComposition()
let comptrack = comp.addMutableTrackWithMediaType(type,
preferredTrackID: Int32(kCMPersistentTrackID_Invalid))
comptrack.insertTimeRange(CMTimeRangeMake(CMTimeMakeWithSeconds(0,600), CMTimeMakeWithSeconds(5,600)), ofTrack:track2, atTime:CMTimeMakeWithSeconds(0,600), error:nil)
comptrack.insertTimeRange(CMTimeRangeMake(CMTimeSubtract(duration, CMTimeMakeWithSeconds(5,600)), CMTimeMakeWithSeconds(5,600)), ofTrack:track2, atTime:CMTimeMakeWithSeconds(5,600), error:nil)
let type3 = AVMediaTypeAudio
let s = NSBundle.mainBundle().URLForResource("file2", withExtension:"mp3")
let asset = AVURLAsset(URL:s, options:nil)
let arr3 = asset.tracksWithMediaType(type3)
let track3 = arr3.last as AVAssetTrack
let comptrack3 = comp.addMutableTrackWithMediaType(type3, preferredTrackID:Int32(kCMPersistentTrackID_Invalid))
comptrack3.insertTimeRange(CMTimeRangeMake(CMTimeMakeWithSeconds(0,600), CMTimeMakeWithSeconds(10,600)), ofTrack:track3, atTime:CMTimeMakeWithSeconds(0,600), error:nil)
let params = AVMutableAudioMixInputParameters(track:comptrack3)
params.setVolume(1, atTime:CMTimeMakeWithSeconds(0,600))
params.setVolumeRampFromStartVolume(1, toEndVolume:0, timeRange:CMTimeRangeMake(CMTimeMakeWithSeconds(7,600), CMTimeMakeWithSeconds(3,600)))
let mix = AVMutableAudioMix()
mix.inputParameters = [params]
let item = AVPlayerItem(asset:comp)
item.audioMix = mix
mixedFile = comp //global variable for mixed file
}
}
@IBAction func saveButton(sender: AnyObject) {
let documentsPath = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)[0] as String
let savedFileTest = documentsPath + "/myfile.mp3"
if (NSFileManager.defaultManager().fileExistsAtPath(savedFileTest)) {
NSFileManager.defaultManager().removeItemAtPath(savedFileTest, error: nil)
}
let url = NSURL.fileURLWithPath(savedFileTest)
let exporter = AVAssetExportSession(asset: mixedFile, presetName: AVAssetExportPresetHighestQuality)
exporter.outputURL = url
exporter.outputFileType = AVFileTypeMPEGLayer3
exporter.exportAsynchronouslyWithCompletionHandler({
switch exporter.status{
case AVAssetExportSessionStatus.Failed:
println("failed \(exporter.error)")
case AVAssetExportSessionStatus.Cancelled:
println("cancelled \(exporter.error)")
default:
println("complete")
}
The output file is not mp3. You can say mp3 but that doesn't make it one. I don't think Apple framework code can save as mp3. It can read it, but due to various licensing issues it can't write it.
Do it as an m4a, like this (I have starred the lines I changed from your original code):
let savedFileTest = documentsPath + "/myfile.m4a" // *
if (NSFileManager.defaultManager().fileExistsAtPath(savedFileTest)) {
NSFileManager.defaultManager().removeItemAtPath(savedFileTest, error: nil)
}
let url = NSURL.fileURLWithPath(savedFileTest)
let exporter = AVAssetExportSession(
asset: mixedFile, presetName: AVAssetExportPresetAppleM4A) // *
exporter.outputURL = url
exporter.outputFileType = AVFileTypeAppleM4A // *
By the way, you're saving the wrong thing (the unmixed comp rather than the mixed item).
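To actually save the mixed result rather than the bare composition, the same AVMutableAudioMix can be attached to the export session via its audioMix property — a sketch, assuming the comp and mix variables from the mixButton code above:

```swift
let exporter = AVAssetExportSession(asset: comp, presetName: AVAssetExportPresetAppleM4A)
exporter.audioMix = mix // carry the volume ramp into the exported file
exporter.outputURL = url
exporter.outputFileType = AVFileTypeAppleM4A
exporter.exportAsynchronouslyWithCompletionHandler({
    println("export finished")
})
```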
You can also export it as AVFileTypeQuickTimeMovie and rename the result to *.mp3.
When initializing the export session, the presetName argument must be AVAssetExportPresetPassthrough; otherwise the mp3 data will not be exported correctly.
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetPassthrough];
Set the output type:
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
exportSession.shouldOptimizeForNetworkUse = true;
Export it, then change the file name extension to "mp3".

Save AVPlayerItem to documents directory

How do I go about saving an AVPlayerItem? I looked online and really couldn't find anything. The AVPlayerItem contains two audio files in an asset. How do I save this to the user's documents folder? My code is in Swift, but answers in Objective-C are welcome too.
Code:
let type = AVMediaTypeAudio
let audioFile = NSBundle.mainBundle().URLForResource("school", withExtension: "mp3")
let asset1 = AVURLAsset(URL: audioFile, options: nil)
let arr2 = asset1.tracksWithMediaType(type)
let track2 = arr2.last as AVAssetTrack
let duration : CMTime = track2.timeRange.duration
let comp = AVMutableComposition()
let comptrack = comp.addMutableTrackWithMediaType(type,
preferredTrackID: Int32(kCMPersistentTrackID_Invalid))
comptrack.insertTimeRange(CMTimeRangeMake(CMTimeMakeWithSeconds(0,600), CMTimeMakeWithSeconds(5,600)), ofTrack:track2, atTime:CMTimeMakeWithSeconds(0,600), error:nil)
comptrack.insertTimeRange(CMTimeRangeMake(CMTimeSubtract(duration, CMTimeMakeWithSeconds(5,600)), CMTimeMakeWithSeconds(5,600)), ofTrack:track2, atTime:CMTimeMakeWithSeconds(5,600), error:nil)
let type3 = AVMediaTypeAudio
let s = NSBundle.mainBundle().URLForResource("file2", withExtension:"m4a")
let asset = AVURLAsset(URL:s, options:nil)
let arr3 = asset.tracksWithMediaType(type3)
let track3 = arr3.last as AVAssetTrack
let comptrack3 = comp.addMutableTrackWithMediaType(type3, preferredTrackID:Int32(kCMPersistentTrackID_Invalid))
comptrack3.insertTimeRange(CMTimeRangeMake(CMTimeMakeWithSeconds(0,600), CMTimeMakeWithSeconds(10,600)), ofTrack:track3, atTime:CMTimeMakeWithSeconds(0,600), error:nil)
let params = AVMutableAudioMixInputParameters(track:comptrack3)
params.setVolume(1, atTime:CMTimeMakeWithSeconds(0,600))
params.setVolumeRampFromStartVolume(1, toEndVolume:0, timeRange:CMTimeRangeMake(CMTimeMakeWithSeconds(7,600), CMTimeMakeWithSeconds(3,600)))
let mix = AVMutableAudioMix()
mix.inputParameters = [params]
let item = AVPlayerItem(asset:comp)
item.audioMix = mix
It's hard to understand from your question what you're trying to do, but I have a vague suspicion that you need to read up on the AVAssetExportSession class.
https://developer.apple.com/library/ios/Documentation/AVFoundation/Reference/AVAssetExportSession_Class/index.html#//apple_ref/occ/cl/AVAssetExportSession
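An AVPlayerItem itself can't be written to disk, but its underlying asset can. A minimal sketch under that reading, exporting the comp composition above (with the audio mix applied) into the documents folder:

```swift
let documentsPath = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)[0] as String
let outputURL = NSURL.fileURLWithPath(documentsPath + "/mixed.m4a")
let exporter = AVAssetExportSession(asset: comp, presetName: AVAssetExportPresetAppleM4A)
exporter.audioMix = mix // the same mix attached to the player item
exporter.outputURL = outputURL
exporter.outputFileType = AVFileTypeAppleM4A
exporter.exportAsynchronouslyWithCompletionHandler({
    if exporter.status == AVAssetExportSessionStatus.Completed {
        println("saved to \(outputURL)")
    }
})
```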
