iOS: Audio and Video tap from AirPlay

I have made a video player that analyzes the real-time audio and video tracks of the video that is currently playing. The videos are stored on the iOS device (in the app's Documents directory).
This all works fine. I use MTAudioProcessingTap to get all the audio samples and do some FFT, and I analyze the video by simply copying the pixel buffers at the currently played CMTime (the AVPlayer currentTime property). As I said, this works fine.
But now I want to support AirPlay. AirPlay itself is not difficult, but my taps stop working as soon as AirPlay is toggled and the video is playing on the Apple TV. Somehow, the MTAudioProcessingTap won't process and the pixel buffers are all empty... I can't get to the data.
Is there any way to get to this data?
In order to get the pixel buffers, I just fire an event every few milliseconds and retrieve the player's currentTime. Then:
CVPixelBufferRef imageBuffer = [videoOutput copyPixelBufferForItemTime:time itemTimeForDisplay:nil];
CVPixelBufferLockBaseAddress(imageBuffer,0);
uint8_t *tempAddress = (uint8_t *) CVPixelBufferGetBaseAddress(imageBuffer);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
size_t height = CVPixelBufferGetHeight(imageBuffer);
CVPixelBufferUnlockBaseAddress(imageBuffer,0);
Here tempAddress is my pixel buffer, and videoOutput is an instance of AVPlayerItemVideoOutput.
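For context, a minimal sketch of how such a videoOutput might be created and attached before the polling starts (the pixel format choice and the playerItem variable are assumptions, not part of the original question):

// Hypothetical setup, assuming an existing AVPlayerItem *playerItem.
// BGRA keeps the buffers easy to inspect on the CPU.
NSDictionary *attributes = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
AVPlayerItemVideoOutput *videoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:attributes];
[playerItem addOutput:videoOutput];

// When polling (as in the snippet above), checking for a fresh buffer first avoids needless copies:
CMTime time = player.currentTime;
if ([videoOutput hasNewPixelBufferForItemTime:time]) {
    CVPixelBufferRef imageBuffer = [videoOutput copyPixelBufferForItemTime:time itemTimeForDisplay:NULL];
    // ... analyze, then release the copied buffer.
    if (imageBuffer) CVPixelBufferRelease(imageBuffer);
}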
For audio, I use:
AVMutableAudioMixInputParameters *inputParams = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:audioTrack];
// Create a processing tap for the input parameters
MTAudioProcessingTapCallbacks callbacks;
callbacks.version = kMTAudioProcessingTapCallbacksVersion_0;
callbacks.clientInfo = (__bridge void *)(self);
callbacks.init = init;
callbacks.prepare = prepare;
callbacks.process = process;
callbacks.unprepare = unprepare;
callbacks.finalize = finalize;
MTAudioProcessingTapRef tap;
OSStatus err = MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks,
kMTAudioProcessingTapCreationFlag_PostEffects, &tap);
if (err || !tap) {
NSLog(#"Unable to create the Audio Processing Tap");
return;
}
inputParams.audioTapProcessor = tap;
// Create a new AVAudioMix and assign it to our AVPlayerItem
AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
audioMix.inputParameters = @[inputParams];
playerItem.audioMix = audioMix;
Regards,
Niek

Unfortunately, in my experience, it's not possible to get at the audio/video data during AirPlay: the playback is being done on the Apple TV, so the iOS device doesn't have any of that information.
I had the same issue with getting SMPTE subtitle data out of the timedMetadata, which stops being reported during AirPlay.
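One hedged workaround (not from the original answer): you can at least detect when playback moves to an external screen and pause the local analysis, for example by key-value observing AVPlayer's externalPlaybackActive property:

// Sketch only: watch for AirPlay (external) playback taking over.
[self.player addObserver:self
              forKeyPath:@"externalPlaybackActive"
                 options:NSKeyValueObservingOptionNew
                 context:nil];

// In observeValueForKeyPath:... check self.player.externalPlaybackActive and
// enable/disable the tap-based analysis accordingly, since no frames or
// samples will reach the device while the Apple TV is doing the playback.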

Here is a possible solution:
This is how I implement AirPlay. I use this code only for audio in my app; I don't know if you can improve it for video, but you can try ;)
In AppDelegate.m:
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    // App-specific setup
    [RADStyle applyStyle];
    [radiosound superclass];
    [self downloadZip];

    // Configure the audio session for playback so AirPlay routes are available
    NSError *sessionError = nil;
    [[AVAudioSession sharedInstance] setDelegate:self];
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:&sessionError];
    [[AVAudioSession sharedInstance] setActive:YES error:nil];

    UInt32 sessionCategory = kAudioSessionCategory_MediaPlayback;
    AudioSessionSetProperty(kAudioSessionProperty_AudioCategory, sizeof(sessionCategory), &sessionCategory);
    UInt32 audioRouteOverride = kAudioSessionOverrideAudioRoute_Speaker;
    AudioSessionSetProperty(kAudioSessionProperty_OverrideAudioRoute, sizeof(audioRouteOverride), &audioRouteOverride);

    // Needed for remote control / lock screen events
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    return YES;
}
And if you use AirPlay, it is nice to implement the lock screen controls: artwork, play/stop, title, etc.
In the DetailViewController of your player, use this code:
- (BOOL)canBecomeFirstResponder {
    return YES;
}

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    [self becomeFirstResponder];

    NSData *imageData = [[NSData alloc] initWithContentsOfURL:[NSURL URLWithString:(self.saved)[@"image"]]];
    if (imageData == nil) {
        MPNowPlayingInfoCenter *infoCenter = [MPNowPlayingInfoCenter defaultCenter];
        MPMediaItemArtwork *albumArt = [[MPMediaItemArtwork alloc] initWithImage:[UIImage imageNamed:@"lockScreen.png"]];
        infoCenter.nowPlayingInfo = @{MPMediaItemPropertyTitle: saved[@"web"], MPMediaItemPropertyArtist: saved[@"title"], MPMediaItemPropertyArtwork: albumArt};
    } else {
        MPNowPlayingInfoCenter *infoCenter = [MPNowPlayingInfoCenter defaultCenter];
        MPMediaItemArtwork *albumArt = [[MPMediaItemArtwork alloc] initWithImage:[UIImage imageWithData:imageData]];
        infoCenter.nowPlayingInfo = @{MPMediaItemPropertyTitle: saved[@"link"], MPMediaItemPropertyArtist: saved[@"title"], MPMediaItemPropertyArtwork: albumArt};
    }
}
Hope this code can help you ;)

Related

How to stream (RTMP) video from iOS Device (not from its own camera)

I want to send an external video from my iOS device.
This video is being received from a live stream: an RTSP server or an HLS URL (not from the iPhone camera).
Currently I can stream my camera video from the iPhone using VideoCore (which internally uses CameraSource and MicSource), but now the video I want to stream comes from a URL, similar to Periscope streaming video from a GoPro cam.
Problem 1: I don't know how to extract audio and video from an RTSP URL.
Problem 2: I don't know how to create a CameraSource or MicSource from this extracted video and audio.
Do you know where to find an example, or could you help me with this technical challenge?
I found a first approach for the first problem:
AVPlayerItem *item = [AVPlayerItem playerItemWithURL:URL];
AVAsset *asset = [item asset];
[asset loadValuesAsynchronouslyForKeys:@[@"tracks"] completionHandler:^{
    if ([asset statusOfValueForKey:@"tracks" error:nil] == AVKeyValueStatusLoaded) {
        NSArray *videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
        NSArray *audioTracks = [asset tracksWithMediaType:AVMediaTypeAudio];

        // VIDEO
        // videoOutput is an AVPlayerItemVideoOutput * property
        [item addOutput:self.videoOutput];

        // AUDIO
        AVMutableAudioMixInputParameters *inputParams = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:[audioTracks objectAtIndex:0]];
        MTAudioProcessingTapCallbacks callbacks;
        callbacks.version = kMTAudioProcessingTapCallbacksVersion_0;
        callbacks.clientInfo = (__bridge void *)self;
        callbacks.init = tap_InitCallback;
        callbacks.finalize = tap_FinalizeCallback;
        callbacks.prepare = tap_PrepareCallback;
        callbacks.unprepare = tap_UnprepareCallback;
        callbacks.process = tap_ProcessCallback;

        MTAudioProcessingTapRef tap;
        OSStatus err = MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks,
                                                  kMTAudioProcessingTapCreationFlag_PostEffects, &tap);
        inputParams.audioTapProcessor = tap;

        AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
        audioMix.inputParameters = @[inputParams];
        item.audioMix = audioMix;
    }
}];
Then create a callback with CADisplayLink, which will call displayPixelBuffer: at every vsync:
self.displayLink = [CADisplayLink displayLinkWithTarget:self selector:@selector(displayLinkCallback:)];
[[self displayLink] addToRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
[[self displayLink] setPaused:YES];
and in this method, get the pixel buffer and send it to the output (a minimal sketch follows).
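A minimal sketch of such a display-link callback, assuming the videoOutput property attached in the block above (method names other than the AVFoundation calls are assumptions):

- (void)displayLinkCallback:(CADisplayLink *)sender
{
    // Ask the output which item time corresponds to "now".
    CMTime outputTime = [self.videoOutput itemTimeForHostTime:CACurrentMediaTime()];
    if (![self.videoOutput hasNewPixelBufferForItemTime:outputTime]) {
        return;
    }
    CVPixelBufferRef pixelBuffer = [self.videoOutput copyPixelBufferForItemTime:outputTime itemTimeForDisplay:NULL];
    if (pixelBuffer) {
        [self displayPixelBuffer:pixelBuffer];   // hypothetical: hand the frame to the streaming output
        CVPixelBufferRelease(pixelBuffer);
    }
}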
For audio, do similar tasks in the prepare callback, using an AURenderCallbackStruct (a sketch of the tap callbacks follows).
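For reference, a hedged sketch of the tap's prepare and process callbacks themselves; how the pulled samples are then fed into VideoCore's audio pipeline is not shown here and would be an assumption:

static void tap_PrepareCallback(MTAudioProcessingTapRef tap,
                                CMItemCount maxFrames,
                                const AudioStreamBasicDescription *processingFormat)
{
    // Remember the processing format (sample rate, channels) if the
    // downstream consumer needs it; stash it via the tap storage if required.
}

static void tap_ProcessCallback(MTAudioProcessingTapRef tap,
                                CMItemCount numberFrames,
                                MTAudioProcessingTapFlags flags,
                                AudioBufferList *bufferListInOut,
                                CMItemCount *numberFramesOut,
                                MTAudioProcessingTapFlags *flagsOut)
{
    // Pull the decoded PCM for this slice; afterwards bufferListInOut holds
    // the raw samples, which can be copied to a ring buffer or an encoder.
    OSStatus status = MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut,
                                                         flagsOut, NULL, numberFramesOut);
    if (status != noErr) {
        // Log and bail; leave the buffers untouched.
    }
}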

Extract/Record Audio from HLS stream (video) while playing iOS

I am playing HLS streams using AVPlayer, and I also need to record these streams as the user presses a record button.
The approach I am using is to record audio and video separately, then at the end merge these files to make the final video. It is successful with remote MP4 files.
But now, for the HLS (.m3u8) files, I am able to record the video using AVAssetWriter, but I am having problems with the audio recording.
I am using MTAudioProcessingTap to process the raw audio data and write it to a file. I followed this article. I am able to record remote MP4 audio, but it's not working with HLS streams.
Initially I wasn't able to extract the audio tracks from the stream using AVAssetTrack *audioTrack = [asset tracksWithMediaType:AVMediaTypeAudio][0];
But I was able to extract the audioTracks using KVO to initialize the MTAudioProcessingTap.
-(void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context{
    AVPlayer *player = (AVPlayer *)object;
    if (player.status == AVPlayerStatusReadyToPlay)
    {
        NSLog(@"Ready to play");
        self.previousAudioTrackID = 0;
        __weak typeof (self) weakself = self;
        timeObserverForTrack = [player addPeriodicTimeObserverForInterval:CMTimeMakeWithSeconds(1, 100) queue:nil usingBlock:^(CMTime time)
        {
            @try {
                for (AVPlayerItemTrack *track in [weakself.avPlayer.currentItem tracks]) {
                    if ([track.assetTrack.mediaType isEqualToString:AVMediaTypeAudio])
                        weakself.currentAudioPlayerItemTrack = track;
                }
                AVAssetTrack *audioAssetTrack = weakself.currentAudioPlayerItemTrack.assetTrack;
                weakself.currentAudioTrackID = audioAssetTrack.trackID;
                if (weakself.previousAudioTrackID != weakself.currentAudioTrackID) {
                    NSLog(@":::::::::::::::::::::::::: Audio track changed : %d", weakself.currentAudioTrackID);
                    weakself.previousAudioTrackID = weakself.currentAudioTrackID;
                    weakself.audioTrack = audioAssetTrack;
                    /// Use this audio track to initialize MTAudioProcessingTap
                }
            }
            @catch (NSException *exception) {
                NSLog(@"Exception Trap ::::: Audio tracks not found!");
            }
        }];
    }
}
I am also keeping track of the trackID to check whether the track has changed.
This is how I initialize the MTAudioProcessingTap:
-(void)beginRecordingAudioFromTrack:(AVAssetTrack *)audioTrack{
// Configure an MTAudioProcessingTap to handle things.
MTAudioProcessingTapRef tap;
MTAudioProcessingTapCallbacks callbacks;
callbacks.version = kMTAudioProcessingTapCallbacksVersion_0;
callbacks.clientInfo = (__bridge void *)(self);
callbacks.init = init;
callbacks.prepare = prepare;
callbacks.process = process;
callbacks.unprepare = unprepare;
callbacks.finalize = finalize;
OSStatus err = MTAudioProcessingTapCreate(
kCFAllocatorDefault,
&callbacks,
kMTAudioProcessingTapCreationFlag_PostEffects,
&tap
);
if (err) {
    NSLog(@"Unable to create the Audio Processing Tap %d", (int)err);
    NSError *error = [NSError errorWithDomain:NSOSStatusErrorDomain
                                         code:err
                                     userInfo:nil];
    NSLog(@"Error: %@", [error description]);
    return;
}
// Create an AudioMix and assign it to our currently playing "item", which
// is just the stream itself.
AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
AVMutableAudioMixInputParameters *inputParams = [AVMutableAudioMixInputParameters
audioMixInputParametersWithTrack:audioTrack];
inputParams.audioTapProcessor = tap;
audioMix.inputParameters = @[inputParams];
_audioPlayer.currentItem.audioMix = audioMix;
}
But now, with this audio track, the MTAudioProcessingTap callbacks "prepare" and "process" are never called.
Is the problem with the audioTrack I am getting through KVO?
I would really appreciate it if someone could help me with this, or tell me whether I am using the right approach to record HLS streams.
I found a solution for this and am using it in my app. I wanted to post it earlier but didn't get the time.
To work with HLS you should have some knowledge of what the streams actually are. For that, please see Apple's documentation:
HLS Apple
Here are the steps I am following.
1. First get the m3u8 and parse it.
You can parse it using this helpful kit, M3U8Kit.
Using this kit you can get the M3U8MediaPlaylist or the M3U8MasterPlaylist (if it is a master playlist);
if you get the master playlist, you can also parse it to get an M3U8MediaPlaylist.
- (void)parseM3u8
{
NSString *plainString = [self.url m3u8PlanString];
BOOL isMasterPlaylist = [plainString isMasterPlaylist];
NSError *error;
NSURL *baseURL;
if(isMasterPlaylist)
{
M3U8MasterPlaylist *masterList = [[M3U8MasterPlaylist alloc] initWithContentOfURL:self.url error:&error];
self.masterPlaylist = masterList;
M3U8ExtXStreamInfList *xStreamInfList = masterList.xStreamList;
M3U8ExtXStreamInf *StreamInfo = [xStreamInfList extXStreamInfAtIndex:0];
NSString *URI = StreamInfo.URI;
NSRange range = [URI rangeOfString:@"dailymotion.com"];
NSString *baseURLString = [URI substringToIndex:(range.location+range.length)];
baseURL = [NSURL URLWithString:baseURLString];
plainString = [[NSURL URLWithString:URI] m3u8PlanString];
}
M3U8MediaPlaylist *mediaPlaylist = [[M3U8MediaPlaylist alloc] initWithContent:plainString baseURL:baseURL];
self.mediaPlaylist = mediaPlaylist;
M3U8SegmentInfoList *segmentInfoList = mediaPlaylist.segmentList;
NSMutableArray *segmentUrls = [[NSMutableArray alloc] init];
for (int i = 0; i < segmentInfoList.count; i++)
{
M3U8SegmentInfo *segmentInfo = [segmentInfoList segmentInfoAtIndex:i];
NSString *segmentURI = segmentInfo.URI;
NSURL *mediaURL = [baseURL URLByAppendingPathComponent:segmentURI];
[segmentUrls addObject:mediaURL];
if(!self.segmentDuration)
self.segmentDuration = segmentInfo.duration;
}
self.segmentFilesURLs = segmentUrls;
}
You can see that you will get the links to the .ts files by parsing the m3u8.
2. Now download all the .ts files into a local folder (a download sketch follows below).
3. Merge these .ts files into one MP4 file and export it.
You can do that using this wonderful C library
TS2MP4
and then you can delete the .ts files or keep them if you need them.
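For step 2, a minimal download sketch using NSURLSession, assuming the self.segmentFilesURLs array produced by the parsing code above (error handling and ordering are simplified):

- (void)downloadSegmentsToFolder:(NSURL *)folderURL
{
    NSURLSession *session = [NSURLSession sharedSession];
    for (NSURL *segmentURL in self.segmentFilesURLs) {
        NSURLSessionDownloadTask *task =
            [session downloadTaskWithURL:segmentURL
                       completionHandler:^(NSURL *location, NSURLResponse *response, NSError *error) {
                if (error || !location) {
                    NSLog(@"Failed to download %@: %@", segmentURL, error);
                    return;
                }
                // Keep the original .ts file name so the segments can be merged in order.
                NSURL *destination = [folderURL URLByAppendingPathComponent:segmentURL.lastPathComponent];
                [[NSFileManager defaultManager] moveItemAtURL:location toURL:destination error:NULL];
            }];
        [task resume];
    }
    // In a real implementation, wait for all tasks (e.g. with a dispatch_group)
    // before merging the .ts files into an MP4.
}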
This is not a good approach. What you can do is parse the M3U8 link, then try to download the segment files (.ts). If you can get these files, you can merge them to generate an MP4 file.

Decode audio samples from hls stream on ios?

I am trying to decode audio samples from a remote HLS (m3u8) stream on an iOS device for further processing of the data, e.g. recording to a file.
As reference stream, http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8 is used.
By using an AVURLAsset in combination with the AVPlayer I am able to show the video as a preview on a CALayer.
I can also get the raw video data (CVPixelBuffer) by using AVPlayerItemVideoOutput. The audio is audible over the speaker of the iOS device as well.
This is the code I am using at the moment for the AVURLAsset and AVPlayer:
NSURL *url = [NSURL URLWithString:@"http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8"];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
NSString *tracksKey = @"tracks";
[asset loadValuesAsynchronouslyForKeys:@[tracksKey] completionHandler: ^{
    dispatch_async(dispatch_get_main_queue(), ^{
        NSError *error = nil;
        AVKeyValueStatus status = [asset statusOfValueForKey:tracksKey error:&error];
        if (status == AVKeyValueStatusLoaded) {
            NSDictionary *settings = @{
                (id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA),
                @"IOSurfaceOpenGLESTextureCompatibility": @YES,
                @"IOSurfaceOpenGLESFBOCompatibility": @YES,
            };
            AVPlayerItemVideoOutput *output = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:settings];
            AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
            [playerItem addOutput:output];
            AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
            [player setVolume:0.0]; // no preview audio
            self.playerItem = playerItem;
            self.player = player;
            self.playerItemVideoOutput = output;

            AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
            [self.preview.layer addSublayer:playerLayer];
            [playerLayer setFrame:self.preview.bounds];
            [playerLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
            [self setPlayerLayer:playerLayer];

            [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(playerItemNewAccessLogEntry:) name:AVPlayerItemNewAccessLogEntryNotification object:self.playerItem];
            [_player addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:&PlayerStatusContext];
            [_playerItem addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:&PlayerItemStatusContext];
            [_playerItem addObserver:self forKeyPath:@"tracks" options:0 context:nil];
        }
    });
}];
-(void) observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
if (self.player.status == AVPlayerStatusReadyToPlay && context == &PlayerStatusContext) {
[self.player play];
}
}
To get the raw video data of the HLS stream I use:
CVPixelBufferRef buffer = [self.playerItemVideoOutput copyPixelBufferForItemTime:self.playerItem.currentTime itemTimeForDisplay:nil];
if (!buffer) {
return;
}
CMSampleBufferRef newSampleBuffer = NULL;
CMSampleTimingInfo timingInfo = kCMTimingInfoInvalid;
timingInfo.duration = CMTimeMake(33, 1000);
int64_t ts = timestamp * 1000.0;
timingInfo.decodeTimeStamp = CMTimeMake(ts, 1000);
timingInfo.presentationTimeStamp = timingInfo.decodeTimeStamp;
CMVideoFormatDescriptionRef videoInfo = NULL;
CMVideoFormatDescriptionCreateForImageBuffer(
NULL, buffer, &videoInfo);
CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault,
buffer,
true,
NULL,
NULL,
videoInfo,
&timingInfo,
&newSampleBuffer);
// do something here with sample buffer...
CFRelease(buffer);
CFRelease(videoInfo);
CFRelease(newSampleBuffer);
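As a hedged illustration of the "do something here with sample buffer" step above (before the CFRelease calls, and not part of the original question), the rewrapped frame could be handed to an AVAssetWriterInput configured elsewhere; videoWriterInput and assetWriter are assumed properties:

// Hypothetical: append the frame to an asset writer that was started elsewhere.
if (self.videoWriterInput.isReadyForMoreMediaData) {
    if (![self.videoWriterInput appendSampleBuffer:newSampleBuffer]) {
        NSLog(@"Failed to append sample buffer: %@", self.assetWriter.error);
    }
}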
Now I would like to get access to the raw audio data as well, but had no luck so far.
I tried to use MTAudioProcessingTap as described here:
http://venodesigns.net/2014/01/08/recording-live-audio-streams-on-ios/
Unfortunately I could not get this to work properly. I succeeded in getting access to the underlying assetTrack of the AVPlayerItem, but the callback methods "prepare" and "process" of the MTAudioProcessingTap never get called. I am not sure if I am on the right track here.
AVPlayer is playing the audio of the stream via the speaker, so internally the audio seems to be available as raw audio data. Is it possible to get access to the raw audio data? If it is not possible with AVPlayer, are there any other approaches?
If possible, I would not like to use ffmpeg, because the hardware decoder of the iOS device should be used for the decoding of the stream.

HTTP live stream AVAsset

I am implementing an HTTP live streaming player on OS X using AVPlayer.
I am able to stream it properly, seek, get the duration, etc.
Now I want to take screenshots and process the frames from it using OpenCV.
I went for using AVAssetImageGenerator, but there are no audio and video tracks on the AVAsset which is associated with player.currentItem.
The tracks are appearing in player.currentItem.tracks.
So I am not able to use AVAssetImageGenerator. Can anybody help to find a solution to extract screenshots and individual frames in such a scenario?
Please find the code below showing how I am initiating the HTTP live stream.
Thanks in advance.
NSURL *url = [NSURL URLWithString:@"http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8"];
playeritem = [AVPlayerItem playerItemWithURL:url];
[playeritem addObserver:self forKeyPath:@"status" options:0 context:AVSPPlayerStatusContext];
[self setPlayer:[AVPlayer playerWithPlayerItem:playeritem]];
[self addObserver:self forKeyPath:@"player.rate" options:NSKeyValueObservingOptionNew context:AVSPPlayerRateContext];
[self addObserver:self forKeyPath:@"player.currentItem.status" options:NSKeyValueObservingOptionNew context:AVSPPlayerItemStatusContext];
AVPlayerLayer *newPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:[self player]];
[newPlayerLayer setFrame:[[[self playerView] layer] bounds]];
[newPlayerLayer setAutoresizingMask:kCALayerWidthSizable | kCALayerHeightSizable];
[newPlayerLayer setHidden:YES];
[[[self playerView] layer] addSublayer:newPlayerLayer];
[self setPlayerLayer:newPlayerLayer];
[self addObserver:self forKeyPath:@"playerLayer.readyForDisplay" options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew context:AVSPPlayerLayerReadyForDisplay];
[self.player play];
The following is how I am checking whether a video track is present in the asset:
case AVPlayerItemStatusReadyToPlay:
[self setTimeObserverToken:[[self player] addPeriodicTimeObserverForInterval:CMTimeMake(1, 10) queue:dispatch_get_main_queue() usingBlock:^(CMTime time) {
[[self timeSlider] setDoubleValue:CMTimeGetSeconds(time)];
NSLog(#"%f,%f,%f",[self currentTime],[self duration],[[self player] rate]);
AVPlayerItem *item = playeritem;
if(item.status == AVPlayerItemStatusReadyToPlay)
{
AVAsset *asset = (AVAsset *)item.asset;
long audiotracks = [[asset tracks] count];
long videotracks = [[asset availableMediaCharacteristicsWithMediaSelectionOptions]count];
NSLog(#"Track info Audio = %ld,Video=%ld",audiotracks,videotracks);
}
}]];
AVPlayerItem *item = self.player.currentItem;
if(item.status != AVPlayerItemStatusReadyToPlay)
return;
AVURLAsset *asset = (AVURLAsset *)item.asset;
long audiotracks = [[asset tracksWithMediaType:AVMediaTypeAudio]count];
long videotracks = [[asset tracksWithMediaType:AVMediaTypeVideo]count];
NSLog(#"Track info Audio = %ld,Video=%ld",audiotracks,videotracks);
This is an older question, but in case someone needs help with it, I have an answer.
AVURLAsset *asset = /* Your Asset here! */;
AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
generator.requestedTimeToleranceAfter = kCMTimeZero;
generator.requestedTimeToleranceBefore = kCMTimeZero;
for (Float64 i = 0; i < CMTimeGetSeconds(asset.duration) * /* Put the FPS of the source video here */ ; i++){
@autoreleasepool {
CMTime time = CMTimeMake(i, /* Put the FPS of the source video here */);
NSError *err;
CMTime actualTime;
CGImageRef image = [generator copyCGImageAtTime:time actualTime:&actualTime error:&err];
// Do what you want with the image, for example save it as UIImage
UIImage *generatedImage = [[UIImage alloc] initWithCGImage:image];
CGImageRelease(image);
}
}
You can easily get the FPS of a Video by using this code:
float fps=0.00;
if (asset) {
AVAssetTrack * videoATrack = [asset tracksWithMediaType:AVMediaTypeVideo][0];
if(videoATrack)
{
fps = [videoATrack nominalFrameRate];
}
}
Hope that helps someone who is asking how to get all frames from a video, or just some specific frames (with CMTime, for example). Please bear in mind that saving all frames to an array can have a heavy impact on memory!
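If memory is a concern, a hedged variant of the loop body above writes each frame to disk inside the autorelease pool instead of keeping it in an array (the file naming is an assumption):

// Inside the @autoreleasepool of the loop above:
UIImage *generatedImage = [[UIImage alloc] initWithCGImage:image];
CGImageRelease(image);
NSData *pngData = UIImagePNGRepresentation(generatedImage);
NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:
                     [NSString stringWithFormat:@"frame_%06d.png", (int)i]];
[pngData writeToFile:path atomically:YES];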

Issue with MTAudioProcessingTap on device

I'm trying to create an MTAudioProcessingTap based on the tutorial from this blog entry - http://chritto.wordpress.com/2013/01/07/processing-avplayers-audio-with-mtaudioprocessingtap/
The issue is working with various audio formats. I've been able to successfully create a tap with M4As, but it is not working with MP3s. Even stranger to me: the code works in the simulator with both formats, but not on the device (only M4A works). I'm getting OSStatus error code 50 in my process block, and if I attempt to use the AudioBufferList data, I get a bad access. The tap setup and callbacks I'm using are below. The process block seems to be the culprit (I think), but I do not know why.
Update - It seems like it very sporadically works the first time after a bit of a break, but only the first time. I get the feeling there is some sort of lock on my audio file. Does anyone know what I should be doing in the unprepare block for cleanup?
Unprepare block -
void unprepare(MTAudioProcessingTapRef tap)
{
NSLog(#"Unpreparing the Audio Tap Processor");
}
Process block (will get OSStatus error 50) -
void process(MTAudioProcessingTapRef tap, CMItemCount numberFrames,
MTAudioProcessingTapFlags flags, AudioBufferList *bufferListInOut,
CMItemCount *numberFramesOut, MTAudioProcessingTapFlags *flagsOut)
{
OSStatus err = MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut,
flagsOut, NULL, numberFramesOut);
if (err) NSLog(#"Error from GetSourceAudio: %ld", err);
}
Tap Setup -
NSURL *assetURL = [[NSBundle mainBundle] URLForResource:@"DLP" withExtension:@"mp3"];
assert(assetURL);
// Create the AVAsset
AVAsset *asset = [AVAsset assetWithURL:assetURL];
assert(asset);
// Create the AVPlayerItem
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
assert(playerItem);
assert([asset tracks]);
assert([[asset tracks] count]);
self.player = [AVPlayer playerWithPlayerItem:playerItem];
assert(self.player);
// Continuing on from where we created the AVAsset...
AVAssetTrack *audioTrack = [[asset tracks] objectAtIndex:0];
AVMutableAudioMixInputParameters *inputParams = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:audioTrack];
// Create a processing tap for the input parameters
MTAudioProcessingTapCallbacks callbacks;
callbacks.version = kMTAudioProcessingTapCallbacksVersion_0;
callbacks.clientInfo = (__bridge void *)(self);
callbacks.init = init;
callbacks.prepare = prepare;
callbacks.process = process;
callbacks.unprepare = unprepare;
callbacks.finalize = finalize;
MTAudioProcessingTapRef tap;
// The create function makes a copy of our callbacks struct
OSStatus err = MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks,
kMTAudioProcessingTapCreationFlag_PostEffects, &tap);
if (err || !tap) {
NSLog(#"Unable to create the Audio Processing Tap");
return;
}
assert(tap);
// Assign the tap to the input parameters
inputParams.audioTapProcessor = tap;
// Create a new AVAudioMix and assign it to our AVPlayerItem
AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
audioMix.inputParameters = @[inputParams];
playerItem.audioMix = audioMix;
// And then we create the AVPlayer with the playerItem, and send it the play message...
[self.player play];
This was apparently a bug in iOS 6.0 (which my device was still on); the simulator was on 6.1.
Upgrading the device to 6.1.2 made the errors disappear.
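As for the earlier cleanup question: there is no documented file lock to release in unprepare; typically you only free what you allocated yourself. A hedged sketch, assuming a context struct allocated in the init callback and stored via the tap's storage pointer:

typedef struct {
    float *scratchBuffer;   // example: scratch memory allocated in prepare
} TapContext;

void init(MTAudioProcessingTapRef tap, void *clientInfo, void **tapStorageOut)
{
    // The pointer stored here is retrievable later with MTAudioProcessingTapGetStorage().
    *tapStorageOut = calloc(1, sizeof(TapContext));
}

void unprepare(MTAudioProcessingTapRef tap)
{
    NSLog(@"Unpreparing the Audio Tap Processor");
    TapContext *context = (TapContext *)MTAudioProcessingTapGetStorage(tap);
    free(context->scratchBuffer);   // release anything allocated in prepare
    context->scratchBuffer = NULL;
}

void finalize(MTAudioProcessingTapRef tap)
{
    free(MTAudioProcessingTapGetStorage(tap));  // release the context itself
}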
