I'm trying to create an MTAudioProcessingTap based on the tutorial from this blog entry - http://chritto.wordpress.com/2013/01/07/processing-avplayers-audio-with-mtaudioprocessingtap/
The issue is working with various audio formats. I've been able to successfully create a tap with M4As, but it is not working with MP3s. Even stranger: the code works in the Simulator with both formats, but not on the device (only the M4A works). I'm getting OSStatus error 50 in my process block, and if I attempt to use the AudioBufferList data, I get a bad access. The tap setup and callbacks I'm using are below. The process block seems to be the culprit (I think), but I don't know why.
Update - It seems to work very sporadically the first time after a bit of a break, but only the first time. I get the feeling there is some sort of lock on my audio file. Does anyone know what I should be doing in the unprepare block for cleanup?
Unprepare block -
void unprepare(MTAudioProcessingTapRef tap)
{
    NSLog(@"Unpreparing the Audio Tap Processor");
}
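As an aside on the cleanup question: unprepare is the mirror of prepare, so anything allocated in prepare (scratch buffers, files, converters) belongs there. A minimal sketch, assuming init had stored a heap-allocated context struct in *tapStorageOut rather than clientInfo (which is not what the code below does):
typedef struct {
    float *scratch;   // hypothetical buffer allocated in prepare
} TapContext;

void unprepare(MTAudioProcessingTapRef tap)
{
    NSLog(@"Unpreparing the Audio Tap Processor");
    // Free whatever prepare allocated; the context pointer was stored in init.
    TapContext *context = (TapContext *)MTAudioProcessingTapGetStorage(tap);
    if (context->scratch) {
        free(context->scratch);
        context->scratch = NULL;
    }
}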
Process block (will get OSStatus error 50) -
void process(MTAudioProcessingTapRef tap, CMItemCount numberFrames,
             MTAudioProcessingTapFlags flags, AudioBufferList *bufferListInOut,
             CMItemCount *numberFramesOut, MTAudioProcessingTapFlags *flagsOut)
{
    OSStatus err = MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut,
                                                      flagsOut, NULL, numberFramesOut);
    if (err) NSLog(@"Error from GetSourceAudio: %d", (int)err);
}
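For reference, a defensive variant of the process block that only touches the buffers when GetSourceAudio succeeds, which avoids the bad access when the pull fails; the float sample handling is an assumption based on the format reported to the prepare callback, not something from the original post:
void process(MTAudioProcessingTapRef tap, CMItemCount numberFrames,
             MTAudioProcessingTapFlags flags, AudioBufferList *bufferListInOut,
             CMItemCount *numberFramesOut, MTAudioProcessingTapFlags *flagsOut)
{
    OSStatus err = MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut,
                                                      flagsOut, NULL, numberFramesOut);
    if (err != noErr) {
        NSLog(@"Error from GetSourceAudio: %d", (int)err);
        return; // don't touch bufferListInOut if the pull failed
    }

    for (UInt32 i = 0; i < bufferListInOut->mNumberBuffers; i++) {
        AudioBuffer buffer = bufferListInOut->mBuffers[i];
        float *samples = (float *)buffer.mData;                 // assumes float samples per prepare's ASBD
        UInt32 count = buffer.mDataByteSize / sizeof(float);

        // Example use of the data: compute a simple peak level for this buffer.
        float peak = 0.0f;
        for (UInt32 n = 0; n < count; n++) {
            float v = fabsf(samples[n]);
            if (v > peak) peak = v;
        }
    }
}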
Tap Setup -
NSURL *assetURL = [[NSBundle mainBundle] URLForResource:@"DLP" withExtension:@"mp3"];
assert(assetURL);
// Create the AVAsset
AVAsset *asset = [AVAsset assetWithURL:assetURL];
assert(asset);
// Create the AVPlayerItem
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
assert(playerItem);
assert([asset tracks]);
assert([[asset tracks] count]);
self.player = [AVPlayer playerWithPlayerItem:playerItem];
assert(self.player);
// Continuing on from where we created the AVAsset...
AVAssetTrack *audioTrack = [[asset tracks] objectAtIndex:0];
AVMutableAudioMixInputParameters *inputParams = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:audioTrack];
// Create a processing tap for the input parameters
MTAudioProcessingTapCallbacks callbacks;
callbacks.version = kMTAudioProcessingTapCallbacksVersion_0;
callbacks.clientInfo = (__bridge void *)(self);
callbacks.init = init;
callbacks.prepare = prepare;
callbacks.process = process;
callbacks.unprepare = unprepare;
callbacks.finalize = finalize;
MTAudioProcessingTapRef tap;
// The create function makes a copy of our callbacks struct
OSStatus err = MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks,
kMTAudioProcessingTapCreationFlag_PostEffects, &tap);
if (err || !tap) {
NSLog(@"Unable to create the Audio Processing Tap");
return;
}
assert(tap);
// Assign the tap to the input parameters
inputParams.audioTapProcessor = tap;
// Create a new AVAudioMix and assign it to our AVPlayerItem
AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
audioMix.inputParameters = @[inputParams];
playerItem.audioMix = audioMix;
// Finally, send the player (created above with the playerItem) the play message...
[self.player play];
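One cleanup detail worth noting (an observation, not confirmed as the cause of the error above): MTAudioProcessingTapCreate follows the Core Foundation Create rule, and audioTapProcessor holds its own reference, so the local tap reference can be released once it has been handed over, and the mix can be detached on teardown:
// After assigning the tap to the input parameters:
inputParams.audioTapProcessor = tap;
CFRelease(tap);               // the input parameters retain the tap now

// When tearing the player down, detach the mix so the tap gets unprepared/finalized:
playerItem.audioMix = nil;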
This was apparently a bug in iOS 6.0 (which my device was still on); the Simulator was on 6.1.
After upgrading the device to 6.1.2 the errors disappeared.
Related
I want to send an external video from my iOS device.
This video is being received from a live stream: an RTSP server or an HLS URL (not from the iPhone camera).
Currently I can stream my camera video from the iPhone using VideoCore (internally using CameraSource and MicSource), but now the video I want to stream comes from a URL, similar to Periscope streaming video from a GoPro cam.
Problem 1: I don't know how to extract audio and video from an RTSP URL.
Problem 2: I don't know how to create a CameraSource or MicSource from this extracted video and audio.
Do you know where to find an example, or could you help me with this technical challenge?
I found a first approach for the first problem:
AVPlayerItem *item = [AVPlayerItem playerItemWithURL:URL];
AVAsset *asset = [item asset];

[asset loadValuesAsynchronouslyForKeys:@[@"tracks"] completionHandler:^{
    if ([asset statusOfValueForKey:@"tracks" error:nil] == AVKeyValueStatusLoaded) {
        NSArray *videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
        NSArray *audioTracks = [asset tracksWithMediaType:AVMediaTypeAudio];

        // VIDEO
        // videoOutput is an AVPlayerItemVideoOutput * property
        [item addOutput:self.videoOutput];

        // AUDIO
        AVMutableAudioMixInputParameters *inputParams = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:[audioTracks objectAtIndex:0]];
        MTAudioProcessingTapCallbacks callbacks;
        callbacks.version = kMTAudioProcessingTapCallbacksVersion_0;
        callbacks.clientInfo = (__bridge void *)self;
        callbacks.init = tap_InitCallback;
        callbacks.finalize = tap_FinalizeCallback;
        callbacks.prepare = tap_PrepareCallback;
        callbacks.unprepare = tap_UnprepareCallback;
        callbacks.process = tap_ProcessCallback;

        MTAudioProcessingTapRef tap;
        OSStatus err = MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks,
                                                  kMTAudioProcessingTapCreationFlag_PostEffects, &tap);

        inputParams.audioTapProcessor = tap;
        AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
        audioMix.inputParameters = @[inputParams];
        item.audioMix = audioMix;
    }
}];
Then create a CADisplayLink whose callback calls displayPixelBuffer: at every vsync:
self.displayLink = [CADisplayLink displayLinkWithTarget:self selector:@selector(displayLinkCallback:)];
[[self displayLink] addToRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
[[self displayLink] setPaused:YES];
and in this method get the pixel buffer and send it to the output.
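A rough sketch of that callback, assuming a videoOutput (AVPlayerItemVideoOutput) property and a hypothetical displayPixelBuffer: method; the exact plumbing to the encoder is up to you:
- (void)displayLinkCallback:(CADisplayLink *)sender
{
    // Ask the video output for the frame that should be on screen now.
    CMTime itemTime = [self.videoOutput itemTimeForHostTime:CACurrentMediaTime()];
    if ([self.videoOutput hasNewPixelBufferForItemTime:itemTime]) {
        CVPixelBufferRef pixelBuffer = [self.videoOutput copyPixelBufferForItemTime:itemTime
                                                                 itemTimeForDisplay:NULL];
        if (pixelBuffer) {
            [self displayPixelBuffer:pixelBuffer]; // hypothetical: hand the frame to your encoder
            CVPixelBufferRelease(pixelBuffer);     // copy... follows the Create rule
        }
    }
}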
For audio, do similar work in the prepare callback using an AURenderCallbackStruct.
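The full AudioUnit/AURenderCallbackStruct wiring is too long to sketch here, but the prepare callback is where the stream's format arrives, so at minimum it is worth capturing it for later use (the static variable is a placeholder for illustration only):
static AudioStreamBasicDescription tapFormat;   // hypothetical storage for the tap's format

void tap_PrepareCallback(MTAudioProcessingTapRef tap, CMItemCount maxFrames,
                         const AudioStreamBasicDescription *processingFormat)
{
    // Remember the format the tap will deliver; an AudioUnit render callback
    // (AURenderCallbackStruct) set up here would be configured against this ASBD.
    tapFormat = *processingFormat;
    NSLog(@"Tap prepared: %f Hz, %u channels, maxFrames %ld",
          processingFormat->mSampleRate,
          (unsigned)processingFormat->mChannelsPerFrame,
          (long)maxFrames);
}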
I am playing HLS streams using AVPlayer, and I also need to record these streams as the user presses a record button.
The approach I am using is to record audio and video separately, then merge the files at the end to make the final video. This works with remote mp4 files.
But now, for the HLS (.m3u8) files, I am able to record the video using AVAssetWriter but I'm having problems with the audio recording.
I am using MTAudioProcessingTap to process the raw audio data and write it to a file. I followed this article. I am able to record remote mp4 audio, but it's not working with HLS streams.
Initially I wasn't able to extract the audio tracks from the stream using AVAssetTrack *audioTrack = [asset tracksWithMediaType:AVMediaTypeAudio][0];
But I was able to extract the audio tracks using KVO to initialize the MTAudioProcessingTap.
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
    AVPlayer *player = (AVPlayer *)object;
    if (player.status == AVPlayerStatusReadyToPlay)
    {
        NSLog(@"Ready to play");
        self.previousAudioTrackID = 0;

        __weak typeof (self) weakself = self;
        timeObserverForTrack = [player addPeriodicTimeObserverForInterval:CMTimeMakeWithSeconds(1, 100) queue:nil usingBlock:^(CMTime time)
        {
            @try {
                for (AVPlayerItemTrack *track in [weakself.avPlayer.currentItem tracks]) {
                    if ([track.assetTrack.mediaType isEqualToString:AVMediaTypeAudio])
                        weakself.currentAudioPlayerItemTrack = track;
                }

                AVAssetTrack *audioAssetTrack = weakself.currentAudioPlayerItemTrack.assetTrack;
                weakself.currentAudioTrackID = audioAssetTrack.trackID;

                if (weakself.previousAudioTrackID != weakself.currentAudioTrackID) {
                    NSLog(@":::::::::::::::::::::::::: Audio track changed : %d", weakself.currentAudioTrackID);
                    weakself.previousAudioTrackID = weakself.currentAudioTrackID;
                    weakself.audioTrack = audioAssetTrack;
                    /// Use this audio track to initialize MTAudioProcessingTap
                }
            }
            @catch (NSException *exception) {
                NSLog(@"Exception Trap ::::: Audio tracks not found!");
            }
        }];
    }
}
I am also keeping track of trackID to check if track is changed.
This is how I initialize the MTAudioProcessingTap.
- (void)beginRecordingAudioFromTrack:(AVAssetTrack *)audioTrack
{
    // Configure an MTAudioProcessingTap to handle things.
    MTAudioProcessingTapRef tap;
    MTAudioProcessingTapCallbacks callbacks;
    callbacks.version = kMTAudioProcessingTapCallbacksVersion_0;
    callbacks.clientInfo = (__bridge void *)(self);
    callbacks.init = init;
    callbacks.prepare = prepare;
    callbacks.process = process;
    callbacks.unprepare = unprepare;
    callbacks.finalize = finalize;

    OSStatus err = MTAudioProcessingTapCreate(
        kCFAllocatorDefault,
        &callbacks,
        kMTAudioProcessingTapCreationFlag_PostEffects,
        &tap
    );

    if (err) {
        NSLog(@"Unable to create the Audio Processing Tap %d", (int)err);
        NSError *error = [NSError errorWithDomain:NSOSStatusErrorDomain
                                             code:err
                                         userInfo:nil];
        NSLog(@"Error: %@", [error description]);
        return;
    }

    // Create an AudioMix and assign it to our currently playing "item", which
    // is just the stream itself.
    AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
    AVMutableAudioMixInputParameters *inputParams = [AVMutableAudioMixInputParameters
                                                     audioMixInputParametersWithTrack:audioTrack];
    inputParams.audioTapProcessor = tap;
    audioMix.inputParameters = @[inputParams];
    _audioPlayer.currentItem.audioMix = audioMix;
}
But now, with this audio track, the MTAudioProcessingTap callbacks prepare and process are never called.
Is the problem with the audioTrack I am getting through KVO?
I would really appreciate it if someone could help me with this, or tell me whether I am using the right approach to record HLS streams.
I found a solution for this and am using it in my app. I wanted to post it earlier but didn't get the time.
To work with HLS you should know what these streams actually are. For that, please see the documentation on Apple's website:
HLS Apple
Here are the steps I am following.
1. First get the m3u8 and parse it.
You can parse it using this helpful kit, M3U8Kit.
Using this kit you can get the M3U8MediaPlaylist or the M3U8MasterPlaylist (if it is a master playlist);
if you get the master playlist you can also parse it to get the M3U8MediaPlaylist.
- (void)parseM3u8
{
    NSString *plainString = [self.url m3u8PlanString];
    BOOL isMasterPlaylist = [plainString isMasterPlaylist];

    NSError *error;
    NSURL *baseURL;
    if (isMasterPlaylist)
    {
        M3U8MasterPlaylist *masterList = [[M3U8MasterPlaylist alloc] initWithContentOfURL:self.url error:&error];
        self.masterPlaylist = masterList;

        M3U8ExtXStreamInfList *xStreamInfList = masterList.xStreamList;
        M3U8ExtXStreamInf *streamInfo = [xStreamInfList extXStreamInfAtIndex:0];

        NSString *URI = streamInfo.URI;
        NSRange range = [URI rangeOfString:@"dailymotion.com"];
        NSString *baseURLString = [URI substringToIndex:(range.location + range.length)];
        baseURL = [NSURL URLWithString:baseURLString];
        plainString = [[NSURL URLWithString:URI] m3u8PlanString];
    }

    M3U8MediaPlaylist *mediaPlaylist = [[M3U8MediaPlaylist alloc] initWithContent:plainString baseURL:baseURL];
    self.mediaPlaylist = mediaPlaylist;

    M3U8SegmentInfoList *segmentInfoList = mediaPlaylist.segmentList;
    NSMutableArray *segmentUrls = [[NSMutableArray alloc] init];
    for (int i = 0; i < segmentInfoList.count; i++)
    {
        M3U8SegmentInfo *segmentInfo = [segmentInfoList segmentInfoAtIndex:i];
        NSString *segmentURI = segmentInfo.URI;
        NSURL *mediaURL = [baseURL URLByAppendingPathComponent:segmentURI];
        [segmentUrls addObject:mediaURL];
        if (!self.segmentDuration)
            self.segmentDuration = segmentInfo.duration;
    }
    self.segmentFilesURLs = segmentUrls;
}
You can see that you will get the links to the .ts files from the parsed m3u8.
2. Now download all the .ts files into a local folder.
3. Merge these .ts files into one mp4 file and export.
You can do that using this wonderful C library
TS2MP4
and then you can delete the .ts files, or keep them if you need them.
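For step 2, a minimal download sketch using NSURLSession (the segments folder name and the downloadSegments method name are assumptions, not part of the original answer):
- (void)downloadSegments
{
    NSURL *folder = [[[NSFileManager defaultManager] URLsForDirectory:NSCachesDirectory
                                                            inDomains:NSUserDomainMask] firstObject];
    folder = [folder URLByAppendingPathComponent:@"ts-segments" isDirectory:YES];
    [[NSFileManager defaultManager] createDirectoryAtURL:folder withIntermediateDirectories:YES
                                               attributes:nil error:nil];

    for (NSURL *segmentURL in self.segmentFilesURLs) {
        NSURLSessionDownloadTask *task =
        [[NSURLSession sharedSession] downloadTaskWithURL:segmentURL
                                        completionHandler:^(NSURL *location, NSURLResponse *response, NSError *error) {
            if (error) { NSLog(@"Segment download failed: %@", error); return; }
            // Move the temporary file into the local segments folder, keeping its name.
            NSURL *destination = [folder URLByAppendingPathComponent:segmentURL.lastPathComponent];
            [[NSFileManager defaultManager] moveItemAtURL:location toURL:destination error:nil];
        }];
        [task resume];
    }
}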
This is not a good approach. What you can do is parse the M3U8 link, then try to download the segment files (.ts). If you can get these files, you can merge them to generate an mp4 file.
This question seems to have been asked a few times over the last few years, but none of those have an answer. I am trying to process PCM data from HLS, and I have to use AVPlayer.
This post taps local files:
https://chritto.wordpress.com/2013/01/07/processing-avplayers-audio-with-mtaudioprocessingtap/
and this tap works with remote files, but not with .m3u8 HLS files:
http://venodesigns.net/2014/01/08/recording-live-audio-streams-on-ios/
I can play the first two tracks in the playlist, but it doesn't start the needed callbacks to get the PCM. When the file is local or remote (not a stream) I can still get the PCM; it is the HLS that is not working, and I need HLS working.
Here is my code:
//avplayer tap try
- (void)viewDidLoad {
    [super viewDidLoad];

    NSURL *testUrl = [NSURL URLWithString:@"http://playlists.ihrhls.com/c5/1469/playlist.m3u8"];
    AVPlayerItem *item = [AVPlayerItem playerItemWithURL:testUrl];
    self.player = [AVPlayer playerWithPlayerItem:item];

    // Watch the status property - when this is good to go, we can access the
    // underlying AVAssetTrack we need.
    [item addObserver:self forKeyPath:@"status" options:0 context:nil];
}
- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context
{
    if (![keyPath isEqualToString:@"status"])
        return;

    AVPlayerItem *item = (AVPlayerItem *)object;
    if (item.status != AVPlayerItemStatusReadyToPlay)
        return;

    NSArray *tracks = [self.player.currentItem tracks];
    for (AVPlayerItemTrack *track in tracks) {
        if ([track.assetTrack.mediaType isEqualToString:AVMediaTypeAudio]) {
            NSLog(@"GOT DAT FUCKER");
            [self beginRecordingAudioFromTrack:track.assetTrack];
            [self.player play];
        }
    }
}
- (void)beginRecordingAudioFromTrack:(AVAssetTrack *)audioTrack
{
    // Configure an MTAudioProcessingTap to handle things.
    MTAudioProcessingTapRef tap;
    MTAudioProcessingTapCallbacks callbacks;
    callbacks.version = kMTAudioProcessingTapCallbacksVersion_0;
    callbacks.clientInfo = (__bridge void *)(self);
    callbacks.init = init;
    callbacks.prepare = prepare;
    callbacks.process = process;
    callbacks.unprepare = unprepare;
    callbacks.finalize = finalize;

    OSStatus err = MTAudioProcessingTapCreate(
        kCFAllocatorDefault,
        &callbacks,
        kMTAudioProcessingTapCreationFlag_PostEffects,
        &tap
    );

    if (err) {
        NSLog(@"Unable to create the Audio Processing Tap %d", (int)err);
        return;
    }

    // Create an AudioMix and assign it to our currently playing "item", which
    // is just the stream itself.
    AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
    AVMutableAudioMixInputParameters *inputParams = [AVMutableAudioMixInputParameters
                                                     audioMixInputParametersWithTrack:audioTrack];
    inputParams.audioTapProcessor = tap;
    audioMix.inputParameters = @[inputParams];
    self.player.currentItem.audioMix = audioMix;
}
void process(MTAudioProcessingTapRef tap, CMItemCount numberFrames,
             MTAudioProcessingTapFlags flags, AudioBufferList *bufferListInOut,
             CMItemCount *numberFramesOut, MTAudioProcessingTapFlags *flagsOut)
{
    OSStatus err = MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut,
                                                      flagsOut, NULL, numberFramesOut);
    if (err) NSLog(@"Error from GetSourceAudio: %d", (int)err);
    NSLog(@"Process");
}

void init(MTAudioProcessingTapRef tap, void *clientInfo, void **tapStorageOut)
{
    NSLog(@"Initialising the Audio Tap Processor");
    *tapStorageOut = clientInfo;
}

void finalize(MTAudioProcessingTapRef tap)
{
    NSLog(@"Finalizing the Audio Tap Processor");
}

void prepare(MTAudioProcessingTapRef tap, CMItemCount maxFrames, const AudioStreamBasicDescription *processingFormat)
{
    NSLog(@"Preparing the Audio Tap Processor");
}

void unprepare(MTAudioProcessingTapRef tap)
{
    NSLog(@"Unpreparing the Audio Tap Processor");
}
init is called, but prepare and process have to be called as well.
How can I do this?
I would suggest you use the FFmpeg library to process HLS streams. This is a little harder but gives more flexibility. I did an HLS player for Android a few years ago (used this project), and I believe the same applies to iOS.
I recommend using Novocaine:
Really fast audio in iOS and Mac OS X using Audio Units is hard, and will leave you scarred and bloody. What used to take days can now be done with just a few lines of code.
I have made a video player that analyzes the realtime audio and video tracks of the video that is currently playing. The videos are stored on the iOS device (in the app's Documents directory).
This all works fine. I use MTAudioProcessingTap in order to get all the audio samples and do some FFT, and I analyze the video by just copying the pixel buffers from the currently played CMTime (the AVPlayer currentTime property). As I said, this works fine.
But now I want to support AirPlay. AirPlay itself is not difficult, but my taps stop working as soon as AirPlay is toggled and the video is playing on the ATV. Somehow, the MTAudioProcessingTap won't process and the pixel buffers are all empty... I can't get to the data.
Is there any way to get to this data?
In order to get the pixel buffers, I just fire an event every few milliseconds and retrieve the player's currentTime. Then:
CVPixelBufferRef imageBuffer = [videoOutput copyPixelBufferForItemTime:time itemTimeForDisplay:nil];
CVPixelBufferLockBaseAddress(imageBuffer, 0);

uint8_t *tempAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
size_t height = CVPixelBufferGetHeight(imageBuffer);

CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
CVPixelBufferRelease(imageBuffer);  // copyPixelBufferForItemTime follows the Create rule, so release it
Where tempAddress is my pixel buffer, and videoOutput is an instance of AVPlayerItemVideoOutput.
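For completeness, a sketch of how such a videoOutput might be created and attached to your AVPlayerItem (the BGRA pixel format is an assumption; use whatever your analysis needs):
NSDictionary *attributes = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
AVPlayerItemVideoOutput *videoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:attributes];
[playerItem addOutput:videoOutput];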
For audio, I use:
AVMutableAudioMixInputParameters *inputParams = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:audioTrack];

// Create a processing tap for the input parameters
MTAudioProcessingTapCallbacks callbacks;
callbacks.version = kMTAudioProcessingTapCallbacksVersion_0;
callbacks.clientInfo = (__bridge void *)(self);
callbacks.init = init;
callbacks.prepare = prepare;
callbacks.process = process;
callbacks.unprepare = unprepare;
callbacks.finalize = finalize;

MTAudioProcessingTapRef tap;
OSStatus err = MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks,
                                          kMTAudioProcessingTapCreationFlag_PostEffects, &tap);
if (err || !tap) {
    NSLog(@"Unable to create the Audio Processing Tap");
    return;
}

inputParams.audioTapProcessor = tap;

// Create a new AVAudioMix and assign it to our AVPlayerItem
AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
audioMix.inputParameters = @[inputParams];
playerItem.audioMix = audioMix;
Regards,
Niek
Unfortunately, in my experience, it's not possible to get information about the audio/video during AirPlay: the playback is being done on the Apple TV, so the iOS device doesn't have any of that information.
I had the same issue with getting SMPTE subtitle data out of the timedMetadata, which stops getting reported during AirPlay.
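If it helps, one hedged workaround is to detect external playback and suspend the analysis while it is active; this does not recover the sample data, it just avoids acting on empty buffers. A sketch using AVPlayer's externalPlaybackActive property (the displayLink and analysisEnabled properties here are hypothetical):
[self.player addObserver:self forKeyPath:@"externalPlaybackActive"
                 options:NSKeyValueObservingOptionNew context:nil];

// ...

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context
{
    if ([keyPath isEqualToString:@"externalPlaybackActive"]) {
        BOOL onAirPlay = self.player.isExternalPlaybackActive;
        self.displayLink.paused = onAirPlay;   // hypothetical: stop pulling pixel buffers
        self.analysisEnabled = !onAirPlay;     // hypothetical flag checked before using tap data
    }
}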
Here is the solution:
This is how I implement AirPlay. I use this code only for audio in my app; I don't know if you can improve it for video, but you can try ;)
In AppDelegate.m:
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {

    [RADStyle applyStyle];
    [radiosound superclass];
    [self downloadZip];

    NSError *sessionError = nil;
    [[AVAudioSession sharedInstance] setDelegate:self];
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:&sessionError];
    [[AVAudioSession sharedInstance] setActive:YES error:nil];

    UInt32 sessionCategory = kAudioSessionCategory_MediaPlayback;
    AudioSessionSetProperty(kAudioSessionProperty_AudioCategory, sizeof(sessionCategory), &sessionCategory);
    UInt32 audioRouteOverride = kAudioSessionOverrideAudioRoute_Speaker;
    AudioSessionSetProperty(kAudioSessionProperty_OverrideAudioRoute, sizeof(audioRouteOverride), &audioRouteOverride);

    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];

    return YES;
}
And if you use AirPlay, it is nice to implement the lock screen controls: artwork, stop/play, title, etc.
In the DetailViewController of your player, use this code:
- (BOOL)canBecomeFirstResponder {
    return YES;
}

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    [self becomeFirstResponder];

    NSData *imageData = [[NSData alloc] initWithContentsOfURL:[NSURL URLWithString:(self.saved)[@"image"]]];
    if (imageData == nil) {
        MPNowPlayingInfoCenter *infoCenter = [MPNowPlayingInfoCenter defaultCenter];
        MPMediaItemArtwork *albumArt = [[MPMediaItemArtwork alloc] initWithImage:[UIImage imageNamed:@"lockScreen.png"]];
        infoCenter.nowPlayingInfo = @{MPMediaItemPropertyTitle: saved[@"web"], MPMediaItemPropertyArtist: saved[@"title"], MPMediaItemPropertyArtwork: albumArt};
    } else {
        MPNowPlayingInfoCenter *infoCenter = [MPNowPlayingInfoCenter defaultCenter];
        MPMediaItemArtwork *albumArt = [[MPMediaItemArtwork alloc] initWithImage:[UIImage imageWithData:imageData]];
        infoCenter.nowPlayingInfo = @{MPMediaItemPropertyTitle: saved[@"link"], MPMediaItemPropertyArtist: saved[@"title"], MPMediaItemPropertyArtwork: albumArt};
    }
}
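The stop/play commands mentioned above arrive as remote control events, so something along these lines would be needed as well (the togglePlayPause method is a hypothetical name, not from the code above):
- (void)remoteControlReceivedWithEvent:(UIEvent *)event {
    if (event.type != UIEventTypeRemoteControl) return;
    switch (event.subtype) {
        case UIEventSubtypeRemoteControlPlay:
        case UIEventSubtypeRemoteControlPause:
        case UIEventSubtypeRemoteControlTogglePlayPause:
            [self togglePlayPause];   // hypothetical: start or stop the AVPlayer
            break;
        default:
            break;
    }
}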
Hope this code can help you ;)
In my iPad/iPhone app I'm playing back a video using AVPlayer. The video file has a stereo audio track, but I need to play back only one channel of this track in mono. The deployment target is iOS 6. How can I achieve this? Thanks a lot for your help.
I now finally found an answer to this question - at least for deployment on iOS 6. You can easily add an MTAudioProcessingTap to your existing AVPlayer item and copy the selected channel's samples to the other channel during your process callback function. Here is a great tutorial explaining the basics: http://chritto.wordpress.com/2013/01/07/processing-avplayers-audio-with-mtaudioprocessingtap/
This is my code so far, mostly copied from the link above.
During AVPlayer setup I assign callback functions for audio processing:
MTAudioProcessingTapCallbacks callbacks;
callbacks.version = kMTAudioProcessingTapCallbacksVersion_0;
callbacks.clientInfo = (__bridge void *)(self);
callbacks.init = init;
callbacks.prepare = prepare;
callbacks.process = process;
callbacks.unprepare = unprepare;
callbacks.finalize = finalize;

MTAudioProcessingTapRef tap;
// The create function makes a copy of our callbacks struct
OSStatus err = MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks,
                                          kMTAudioProcessingTapCreationFlag_PostEffects, &tap);
if (err || !tap) {
    NSLog(@"Unable to create the Audio Processing Tap");
    return;
}
assert(tap);

// Assign the tap to the input parameters
audioInputParam.audioTapProcessor = tap;

// Create a new AVAudioMix and assign it to our AVPlayerItem
AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
audioMix.inputParameters = @[audioInputParam];
playerItem.audioMix = audioMix;
Here are the audio processing functions (actually process is the only one needed):
#pragma mark Audio Processing

void init(MTAudioProcessingTapRef tap, void *clientInfo, void **tapStorageOut) {
    NSLog(@"Initialising the Audio Tap Processor");
    *tapStorageOut = clientInfo;
}

void finalize(MTAudioProcessingTapRef tap) {
    NSLog(@"Finalizing the Audio Tap Processor");
}

void prepare(MTAudioProcessingTapRef tap, CMItemCount maxFrames, const AudioStreamBasicDescription *processingFormat) {
    NSLog(@"Preparing the Audio Tap Processor");
}

void unprepare(MTAudioProcessingTapRef tap) {
    NSLog(@"Unpreparing the Audio Tap Processor");
}

void process(MTAudioProcessingTapRef tap, CMItemCount numberFrames,
             MTAudioProcessingTapFlags flags, AudioBufferList *bufferListInOut,
             CMItemCount *numberFramesOut, MTAudioProcessingTapFlags *flagsOut) {

    OSStatus err = MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut, flagsOut, NULL, numberFramesOut);
    if (err) NSLog(@"Error from GetSourceAudio: %d", (int)err);

    // The tap storage holds the view controller (see init above), so we can read
    // which channel the user selected.
    SIVSViewController *self = (__bridge SIVSViewController *)MTAudioProcessingTapGetStorage(tap);

    if (self.selectedChannel) {
        int channel = self.selectedChannel;
        if (channel == 0) {
            // Point the right buffer at the left channel's data.
            bufferListInOut->mBuffers[1].mData = bufferListInOut->mBuffers[0].mData;
        } else {
            // Point the left buffer at the right channel's data.
            bufferListInOut->mBuffers[0].mData = bufferListInOut->mBuffers[1].mData;
        }
    }
}
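A side note on the pointer assignment above: it makes both AudioBuffers reference the same memory, which works here, but if that ever causes trouble a hedged alternative is to copy the samples instead (assuming both buffers have the same mDataByteSize, as they do for non-interleaved stereo):
// Inside process, after MTAudioProcessingTapGetSourceAudio succeeded:
AudioBuffer *source = &bufferListInOut->mBuffers[channel == 0 ? 0 : 1];
AudioBuffer *target = &bufferListInOut->mBuffers[channel == 0 ? 1 : 0];
memcpy(target->mData, source->mData, source->mDataByteSize);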