iOS: Get Audio from AVAudioRecorder

Disclaimer: I'm new to AVAudioRecorder.
What I'm doing: I'm working on an app that uses the iPhone microphone to record sound. After the sound is recorded, I need to convert it (it should be an AVAsset, right?) into NSData to send to our backend.
What's the issue: I am not sure how to "get" the audio that the AVAudioRecorder records. AVAudioRecorder has a delegate method called - (void)audioRecorderDidFinishRecording:(AVAudioRecorder *)aRecorder successfully:(BOOL)flag. I would have expected the actual AVAsset containing the audio to be passed from this delegate method, but it isn't. What it does give me is the aRecorder object, which has a .url property. When I NSLog the url from the passed aRecorder, it shows up. In fact, I can NSLog the length of the file in the code below:
- (void)audioRecorderDidFinishRecording:(AVAudioRecorder *)aRecorder successfully:(BOOL)flag
{
    DLog(@"audioRecorderDidFinishRecording:successfully: %@", aRecorder);
    AVURLAsset *audioAsset = [AVURLAsset URLAssetWithURL:aRecorder.url options:nil];
    CMTime audioDuration = audioAsset.duration;
    float audioDurationSeconds = CMTimeGetSeconds(audioDuration);
    NSLog(@"asset length = %f", audioDurationSeconds); // Logs 7.051 seconds, so I know it's "there".
    self.audioURL = aRecorder.url;
}
Problem: When I pass self.audioURL to the next view controller's self.mediaURL and try to grab the file from the Assets Library (similarly to how I did before), the asset is not returned (even though when I po self.mediaURL it does log the correct url):
if (self.mediaURL) {
    ALAssetsLibrary *assetLibrary = [[ALAssetsLibrary alloc] init];
    [assetLibrary assetForURL:self.mediaURL resultBlock:^(ALAsset *asset) {
        if (asset) {
            // This block does NOT get called...
            ALAssetRepresentation *rep = [asset defaultRepresentation];
            Byte *buffer = (Byte *)malloc((long)rep.size);
            NSUInteger buffered = [rep getBytes:buffer fromOffset:0 length:(long)rep.size error:nil];
            NSMutableData *body = [NSMutableData dataWithBytesNoCopy:buffer length:buffered freeWhenDone:YES];
            [dataToSendToServer setObject:body forKey:@"audioData"];
        }
    } failureBlock:^(NSError *error) {
        NSLog(@"FAILED TO ACCESS AUDIO FROM URL: %@!", self.mediaURL);
    }];
}
else {
    NSLog(@"NO AUDIO DATA!");
}
Because I am new to AVAudioRecorder, perhaps I am just not designing this flow correctly. Could anyone help me out with getting the actual audio data?
Thanks!

AVAudioRecorder records to a file, not to the Asset Library.
So you can simply read the data from that file.
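For example, a minimal sketch of the finish-recording delegate loading the recording straight into NSData (the upload call itself is left out, and the property names are taken from the question):
- (void)audioRecorderDidFinishRecording:(AVAudioRecorder *)aRecorder successfully:(BOOL)flag
{
    NSError *error = nil;
    // aRecorder.url points at an ordinary file on disk, so no Assets Library round trip is needed.
    NSData *audioData = [NSData dataWithContentsOfURL:aRecorder.url
                                              options:NSDataReadingMappedIfSafe
                                                error:&error];
    if (audioData) {
        // Hand audioData to whatever sends it to the backend.
    } else {
        NSLog(@"Could not read the recording: %@", error);
    }
}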

Related

Extract/Record Audio from HLS stream (video) while playing iOS

I am playing HLS streams using AVPlayer, and I also need to record these streams when the user presses a record button.
The approach I am using is to record audio and video separately, then merge these files at the end to make the final video. This works with remote mp4 files.
But for HLS (.m3u8) files I am able to record the video using AVAssetWriter, but I am having problems with the audio recording.
I am using MTAudioProcessingTap to process the raw audio data and write it to a file. I followed this article. I am able to record remote mp4 audio, but it's not working with HLS streams.
Initially I wasn't able to extract the audio tracks from the stream using AVAssetTrack *audioTrack = [asset tracksWithMediaType:AVMediaTypeAudio][0];
But I was able to extract the audioTracks using KVO to initialize the MTAudioProcessingTap.
-(void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
    AVPlayer *player = (AVPlayer *)object;
    if (player.status == AVPlayerStatusReadyToPlay)
    {
        NSLog(@"Ready to play");
        self.previousAudioTrackID = 0;
        __weak typeof(self) weakself = self;
        timeObserverForTrack = [player addPeriodicTimeObserverForInterval:CMTimeMakeWithSeconds(1, 100) queue:nil usingBlock:^(CMTime time)
        {
            @try {
                for (AVPlayerItemTrack *track in [weakself.avPlayer.currentItem tracks]) {
                    if ([track.assetTrack.mediaType isEqualToString:AVMediaTypeAudio])
                        weakself.currentAudioPlayerItemTrack = track;
                }
                AVAssetTrack *audioAssetTrack = weakself.currentAudioPlayerItemTrack.assetTrack;
                weakself.currentAudioTrackID = audioAssetTrack.trackID;
                if (weakself.previousAudioTrackID != weakself.currentAudioTrackID) {
                    NSLog(@":::::::::::::::::::::::::: Audio track changed : %d", weakself.currentAudioTrackID);
                    weakself.previousAudioTrackID = weakself.currentAudioTrackID;
                    weakself.audioTrack = audioAssetTrack;
                    /// Use this audio track to initialize MTAudioProcessingTap
                }
            }
            @catch (NSException *exception) {
                NSLog(@"Exception Trap ::::: Audio tracks not found!");
            }
        }];
    }
}
I am also keeping track of the trackID to check whether the track has changed.
This is how I initialize the MTAudioProcessingTap.
-(void)beginRecordingAudioFromTrack:(AVAssetTrack *)audioTrack
{
    // Configure an MTAudioProcessingTap to handle things.
    MTAudioProcessingTapRef tap;
    MTAudioProcessingTapCallbacks callbacks;
    callbacks.version = kMTAudioProcessingTapCallbacksVersion_0;
    callbacks.clientInfo = (__bridge void *)(self);
    callbacks.init = init;
    callbacks.prepare = prepare;
    callbacks.process = process;
    callbacks.unprepare = unprepare;
    callbacks.finalize = finalize;

    OSStatus err = MTAudioProcessingTapCreate(
        kCFAllocatorDefault,
        &callbacks,
        kMTAudioProcessingTapCreationFlag_PostEffects,
        &tap
    );
    if (err) {
        NSLog(@"Unable to create the Audio Processing Tap %d", (int)err);
        NSError *error = [NSError errorWithDomain:NSOSStatusErrorDomain
                                             code:err
                                         userInfo:nil];
        NSLog(@"Error: %@", [error description]);
        return;
    }

    // Create an AudioMix and assign it to our currently playing "item", which
    // is just the stream itself.
    AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
    AVMutableAudioMixInputParameters *inputParams = [AVMutableAudioMixInputParameters
                                                     audioMixInputParametersWithTrack:audioTrack];
    inputParams.audioTapProcessor = tap;
    audioMix.inputParameters = @[inputParams];
    _audioPlayer.currentItem.audioMix = audioMix;
}
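For context, the init, prepare, process, unprepare and finalize symbols assigned above are plain C callback functions that are not shown in the question; a minimal sketch of their shape, following the MTAudioProcessingTap callback signatures (the bodies here are illustrative only), looks like this:
void init(MTAudioProcessingTapRef tap, void *clientInfo, void **tapStorageOut)
{
    *tapStorageOut = clientInfo; // stash self (or a context struct) for the later callbacks
}

void finalize(MTAudioProcessingTapRef tap) {}

void prepare(MTAudioProcessingTapRef tap, CMItemCount maxFrames, const AudioStreamBasicDescription *processingFormat)
{
    // Typically where the output audio file is created using processingFormat.
}

void unprepare(MTAudioProcessingTapRef tap) {}

void process(MTAudioProcessingTapRef tap, CMItemCount numberFrames, MTAudioProcessingTapFlags flags,
             AudioBufferList *bufferListInOut, CMItemCount *numberFramesOut, MTAudioProcessingTapFlags *flagsOut)
{
    // Pull the source audio into bufferListInOut, then write it to the file being recorded.
    MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut, flagsOut, NULL, numberFramesOut);
}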
But now, with this audio track, the MTAudioProcessingTap callbacks "prepare" and "process" are never called.
Is the problem with the audioTrack I am getting through KVO?
I would really appreciate it if someone could help me with this, or tell me whether I am using the right approach to record HLS streams.
I found a solution for this and am using it in my app. I wanted to post it earlier but didn't get the time.
To work with HLS you should have some knowledge of what these streams actually are. For that, please see Apple's documentation:
HLS Apple
Here are the steps I am following:
1. First, get the m3u8 and parse it.
You can parse it using this helpful kit: M3U8Kit.
Using this kit you can get the M3U8MediaPlaylist or the M3U8MasterPlaylist (if it is a master playlist);
if you get the master playlist you can also parse it to get the M3U8MediaPlaylist.
- (void)parseM3u8
{
    NSString *plainString = [self.url m3u8PlanString];
    BOOL isMasterPlaylist = [plainString isMasterPlaylist];
    NSError *error;
    NSURL *baseURL;
    if (isMasterPlaylist)
    {
        M3U8MasterPlaylist *masterList = [[M3U8MasterPlaylist alloc] initWithContentOfURL:self.url error:&error];
        self.masterPlaylist = masterList;
        M3U8ExtXStreamInfList *xStreamInfList = masterList.xStreamList;
        M3U8ExtXStreamInf *streamInfo = [xStreamInfList extXStreamInfAtIndex:0];
        NSString *URI = streamInfo.URI;
        NSRange range = [URI rangeOfString:@"dailymotion.com"];
        NSString *baseURLString = [URI substringToIndex:(range.location + range.length)];
        baseURL = [NSURL URLWithString:baseURLString];
        plainString = [[NSURL URLWithString:URI] m3u8PlanString];
    }
    M3U8MediaPlaylist *mediaPlaylist = [[M3U8MediaPlaylist alloc] initWithContent:plainString baseURL:baseURL];
    self.mediaPlaylist = mediaPlaylist;
    M3U8SegmentInfoList *segmentInfoList = mediaPlaylist.segmentList;
    NSMutableArray *segmentUrls = [[NSMutableArray alloc] init];
    for (int i = 0; i < segmentInfoList.count; i++)
    {
        M3U8SegmentInfo *segmentInfo = [segmentInfoList segmentInfoAtIndex:i];
        NSString *segmentURI = segmentInfo.URI;
        NSURL *mediaURL = [baseURL URLByAppendingPathComponent:segmentURI];
        [segmentUrls addObject:mediaURL];
        if (!self.segmentDuration)
            self.segmentDuration = segmentInfo.duration;
    }
    self.segmentFilesURLs = segmentUrls;
}
You can see that parsing the m3u8 gives you the links to the .ts segment files.
2. Now download all the .ts files into a local folder (a download sketch follows these steps).
3. Merge these .ts files into one mp4 file and export it.
You can do that using this wonderful C library:
TS2MP4
Then you can delete the .ts files, or keep them if you need them.
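For the download step referenced above, any HTTP client will do; here is a hedged sketch with NSURLSession, assuming self.segmentFilesURLs from the parsing code and a hypothetical downloadSegmentsToFolder: method. Completion order is not guaranteed here, so a real implementation would index or chain the downloads before merging:
- (void)downloadSegmentsToFolder:(NSString *)folderPath
{
    NSURLSession *session = [NSURLSession sharedSession];
    for (NSURL *segmentURL in self.segmentFilesURLs) {
        NSString *destinationPath = [folderPath stringByAppendingPathComponent:segmentURL.lastPathComponent];
        NSURLSessionDownloadTask *task =
            [session downloadTaskWithURL:segmentURL
                       completionHandler:^(NSURL *location, NSURLResponse *response, NSError *error) {
                // Move the temporary download into the local folder, keeping the segment's file name.
                if (!error) {
                    [[NSFileManager defaultManager] moveItemAtURL:location
                                                            toURL:[NSURL fileURLWithPath:destinationPath]
                                                            error:nil];
                }
            }];
        [task resume];
    }
}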
Recording through the audio tap is not a good approach. What you can do instead is parse the M3U8 link, then try to download the segment files (.ts). If you can get these files, you can merge them to generate the mp4 file.

How to access NSData/NSURL of slow motion videos using PhotoKit

Working with the new Photos framework, I can access the NSData of PHAssets using requestImageDataForAsset. I can also access the file URL using the PHImageFileURLKey of the returned info NSDictionary.
[[PHImageManager defaultManager] requestImageDataForAsset:asset options:nil resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
    // imageData contains the correct data for images and videos
    NSLog(@"info - %@", info);
    NSURL *fileURL = [info objectForKey:@"PHImageFileURLKey"];
}];
This works fine for images and normal videos.
However, when the asset is a PHAssetMediaSubtypeVideoHighFrameRate (slow-motion video), the returned data corresponds to a JPG file containing the first frame of the video (the NSData, the dataUTI and the info dictionary all point to the same jpg file). As an example, this is the URL and the dataUTI returned for a slow-motion video:
PHImageFileURLKey =
"file:///var/mobile/Media/PhotoData/Metadata/DCIM/100APPLE/IMG_0642.JPG";
PHImageFileUTIKey = "public.jpeg";
Why is this happening?
How can I access the NSData/NSURL of the slow-motion video instead of this JPG preview?
After going nuts and testing every single option I found the problem.
What is responsible for returning JPG images for slow-motion videos is the default PHImageRequestOptionsVersionCurrent value of the PHImageRequestOptions.version property.
Simply assigning the version to PHImageRequestOptionsVersionUnadjusted or PHImageRequestOptionsVersionOriginal will return the original slow-motion video.
PHImageRequestOptions * imageRequestOptions = [[PHImageRequestOptions alloc] init];
imageRequestOptions.version = PHImageRequestOptionsVersionUnadjusted;
// or
imageRequestOptions.version = PHImageRequestOptionsVersionOriginal;
I consider this unexpected behaviour, since I would not expect the "current" version of a slow-motion video to be a still image (maybe a video with the slow-motion effect applied, but not a photo).
Hope this is useful to someone.
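Putting it together, a minimal sketch of the request from the question with the unadjusted version set (asset is the PHAsset being requested):
PHImageRequestOptions *imageRequestOptions = [[PHImageRequestOptions alloc] init];
imageRequestOptions.version = PHImageRequestOptionsVersionUnadjusted;
[[PHImageManager defaultManager] requestImageDataForAsset:asset
                                                  options:imageRequestOptions
                                            resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
    // For a slow-motion asset this now returns the original movie data
    // rather than the JPG preview of its first frame.
    NSLog(@"UTI: %@, bytes: %lu", dataUTI, (unsigned long)imageData.length);
}];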
It is important to note that slow-motion videos are of type AVComposition, not AVURLAsset. An AVComposition object combines media data from multiple sources together.
Exporting a slow motion video
To achieve this, I basically went through a three-step process:
Create an output URL for the video
Configure an export session
Export the video and grab the URL!
PHVideoRequestOptions *options = [PHVideoRequestOptions new];
options.networkAccessAllowed = YES;
[[PHImageManager defaultManager] requestAVAssetForVideo:asset options:options resultHandler:^(AVAsset *asset, AVAudioMix *audioMix, NSDictionary *info) {
    if (([asset isKindOfClass:[AVComposition class]] && ((AVComposition *)asset).tracks.count == 2)) {
        // Slow motion videos. See here: https://overflow.buffer.com/2016/02/29/slow-motion-video-ios/
        // Output URL of the slow motion file.
        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *documentsDirectory = paths.firstObject;
        NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"mergeSlowMoVideo-%d.mov", arc4random() % 1000]];
        NSURL *url = [NSURL fileURLWithPath:myPathDocs];
        // Begin slow mo video export
        AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetHighestQuality];
        exporter.outputURL = url;
        exporter.outputFileType = AVFileTypeQuickTimeMovie;
        exporter.shouldOptimizeForNetworkUse = YES;
        [exporter exportAsynchronouslyWithCompletionHandler:^{
            dispatch_async(dispatch_get_main_queue(), ^{
                if (exporter.status == AVAssetExportSessionStatusCompleted) {
                    NSURL *URL = exporter.outputURL;
                    self.filePath = URL.absoluteString;
                    // NSData *videoData = [NSData dataWithContentsOfURL:URL];
                    //
                    // // Upload
                    // [self uploadSelectedVideo:video data:videoData];
                }
            });
        }];
    }
}];
Please see this wonderful blog for slow motion videos in iOS.
The following code snippet is for Swift 3/4:
PHImageManager.default().requestAVAsset(forVideo: asset,
                                        options: nil,
                                        resultHandler: { (asset, _, _) in
    // AVAsset has two subclasses: AVComposition and AVURLAsset
    // AVComposition for slow-mo videos
    // AVURLAsset for normal videos
    // For slow-motion video, check for AVComposition and
    // create an exporter to write the video to a local file path,
    // then use that file to play/upload.
    if asset!.isKind(of: AVComposition.self) {
        let avCompositionAsset = asset as! AVComposition
        if avCompositionAsset.tracks.count > 1 {
            let exporter = AVAssetExportSession(asset: avCompositionAsset, presetName: AVAssetExportPresetHighestQuality)
            exporter!.outputURL = self.fetchOutputURL()
            exporter!.outputFileType = AVFileTypeMPEG4
            exporter!.shouldOptimizeForNetworkUse = true
            exporter!.exportAsynchronously {
                DispatchQueue.main.sync {
                    // Use this url for uploading or playing the video
                    let url = exporter!.outputURL
                }
            }
        }
    } else {
        // Normal videos are stored as AVURLAsset
        let url = (asset as! AVURLAsset).url
    }
})
// Fetch local path
func fetchOutputURL() -> URL {
    let documentDirectory = getDocumentsDirectory() as NSString
    let path = documentDirectory.appendingPathComponent("test.mp4")
    return URL(fileURLWithPath: path)
}
//video slo-mo
PHVideoRequestOptions *options = [[PHVideoRequestOptions alloc] init];
options.version = PHVideoRequestOptionsVersionOriginal;
Request AVAsset from PHImageManager
[[PHImageManager defaultManager] requestAVAssetForVideo:videoAsset options:options resultHandler:^(AVAsset *asset, AVAudioMix *audioMix, NSDictionary *info)
{
    if ([asset isKindOfClass:[AVURLAsset class]])
    {
        // use the URL to get the file content
        NSURL *URL = [(AVURLAsset *)asset URL];
        NSData *videoData = [NSData dataWithContentsOfURL:URL];
        NSNumber *fileSizeValue = nil;
        [URL getResourceValue:&fileSizeValue forKey:NSURLFileSizeKey error:nil];
    }
}];

iOS: How to trim silence from start and end of .aif audio recording?

My app includes the ability for the user to record a brief message; I'd like to trim off any silence (or, to be more precise, any audio whose volume falls below a given threshold) from the beginning and end of the recording.
I'm recording the audio with an AVAudioRecorder, and saving it to an .aif file. I've seen some mention elsewhere of methods by which I could have it wait to start recording until the audio level reaches a threshold; that'd get me halfway there, but won't help with trimming silence off the end.
If there's a simple way to do this, I'll be eternally grateful!
Thanks.
This project takes audio from the microphone, triggers on loud noise and untriggers when quiet. It also trims and fades in/fades out around the ends.
https://github.com/fulldecent/FDSoundActivatedRecorder
Relevant code you are seeking:
- (NSString *)recordedFilePath
{
    // Prepare output
    NSString *trimmedAudioFileBaseName = [NSString stringWithFormat:@"recordingConverted%x.caf", arc4random()];
    NSString *trimmedAudioFilePath = [NSTemporaryDirectory() stringByAppendingPathComponent:trimmedAudioFileBaseName];
    NSFileManager *fileManager = [NSFileManager defaultManager];
    if ([fileManager fileExistsAtPath:trimmedAudioFilePath]) {
        NSError *error;
        if ([fileManager removeItemAtPath:trimmedAudioFilePath error:&error] == NO) {
            NSLog(@"removeItemAtPath %@ error:%@", trimmedAudioFilePath, error);
        }
    }
    NSLog(@"Saving to %@", trimmedAudioFilePath);

    AVAsset *avAsset = [AVAsset assetWithURL:self.audioRecorder.url];
    NSArray *tracks = [avAsset tracksWithMediaType:AVMediaTypeAudio];
    AVAssetTrack *track = [tracks objectAtIndex:0];
    AVAssetExportSession *exportSession = [AVAssetExportSession
                                           exportSessionWithAsset:avAsset
                                           presetName:AVAssetExportPresetAppleM4A];

    // create trim time range
    CMTime startTime = CMTimeMake(self.recordingBeginTime*SAVING_SAMPLES_PER_SECOND, SAVING_SAMPLES_PER_SECOND);
    CMTimeRange exportTimeRange = CMTimeRangeFromTimeToTime(startTime, kCMTimePositiveInfinity);

    // create fade in time range
    CMTime startFadeInTime = startTime;
    CMTime endFadeInTime = CMTimeMake(self.recordingBeginTime*SAVING_SAMPLES_PER_SECOND + RISE_TRIGGER_INTERVALS*INTERVAL_SECONDS*SAVING_SAMPLES_PER_SECOND, SAVING_SAMPLES_PER_SECOND);
    CMTimeRange fadeInTimeRange = CMTimeRangeFromTimeToTime(startFadeInTime, endFadeInTime);

    // setup audio mix
    AVMutableAudioMix *exportAudioMix = [AVMutableAudioMix audioMix];
    AVMutableAudioMixInputParameters *exportAudioMixInputParameters =
        [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:track];
    [exportAudioMixInputParameters setVolumeRampFromStartVolume:0.0 toEndVolume:1.0
                                                      timeRange:fadeInTimeRange];
    exportAudioMix.inputParameters = [NSArray arrayWithObject:exportAudioMixInputParameters];

    // configure export session output with all our parameters
    exportSession.outputURL = [NSURL fileURLWithPath:trimmedAudioFilePath];
    exportSession.outputFileType = AVFileTypeAppleM4A;
    exportSession.timeRange = exportTimeRange;
    exportSession.audioMix = exportAudioMix;

    // MAKE THE EXPORT SYNCHRONOUS
    dispatch_semaphore_t semaphore = dispatch_semaphore_create(0);
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        dispatch_semaphore_signal(semaphore);
    }];
    dispatch_semaphore_wait(semaphore, DISPATCH_TIME_FOREVER);

    if (AVAssetExportSessionStatusCompleted == exportSession.status) {
        NSLog(@"AVAssetExportSessionStatusCompleted");
        return trimmedAudioFilePath;
    } else if (AVAssetExportSessionStatusFailed == exportSession.status) {
        // a failure may happen because of an event out of your control
        // for example, an interruption like a phone call coming in
        // make sure to handle this case appropriately
        NSLog(@"AVAssetExportSessionStatusFailed %@", exportSession.error.localizedDescription);
    } else {
        NSLog(@"Export Session Status: %ld", (long)exportSession.status);
    }
    return nil;
}
I'm recording the audio with an AVAudioRecorder, and saving it to an .aif file. I've seen some mention elsewhere of methods by which I could have it wait to start recording until the audio level reaches a threshold; that'd get me halfway there
Without adequate buffering, that would truncate the start.
I don't know of an easy way. You would have to write a new audio file after recording and analyzing it for the desired start and end points. Modifying the existing file would be straightforward if you knew the AIFF format well (not many people do) and had an easy way to read the file's sample data.
The analysis stage is pretty easy for a basic implementation -- evaluate the average power of the sample data until your threshold is exceeded; repeat in reverse for the end.
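One possible sketch of that analysis (not from any of the projects mentioned here; the threshold value and time scale are illustrative): read the recording back as linear PCM with AVAssetReader and return the time of the first buffer whose average amplitude exceeds the threshold; running the same scan while tracking the last loud buffer gives the end point.
static CMTime firstLoudTime(NSURL *fileURL, float threshold)
{
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:fileURL options:nil];
    AVAssetTrack *track = [[asset tracksWithMediaType:AVMediaTypeAudio] firstObject];
    // Decode to interleaved 16-bit linear PCM so the samples are easy to scan.
    NSDictionary *settings = @{ AVFormatIDKey: @(kAudioFormatLinearPCM),
                                AVLinearPCMBitDepthKey: @16,
                                AVLinearPCMIsFloatKey: @NO,
                                AVLinearPCMIsBigEndianKey: @NO,
                                AVLinearPCMIsNonInterleaved: @NO };
    AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:asset error:nil];
    AVAssetReaderTrackOutput *output =
        [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:track outputSettings:settings];
    [reader addOutput:output];
    [reader startReading];

    CMSampleBufferRef sample;
    while ((sample = [output copyNextSampleBuffer])) {
        CMBlockBufferRef block = CMSampleBufferGetDataBuffer(sample);
        size_t length = CMBlockBufferGetDataLength(block);
        SInt16 *samples = (SInt16 *)malloc(length);
        CMBlockBufferCopyDataBytes(block, 0, length, samples);

        // Average absolute amplitude of this buffer, normalised to 0.0-1.0.
        size_t count = length / sizeof(SInt16);
        double sum = 0;
        for (size_t i = 0; i < count; i++) sum += fabs(samples[i] / 32768.0);
        double average = count ? sum / count : 0;

        CMTime time = CMSampleBufferGetPresentationTimeStamp(sample);
        free(samples);
        CFRelease(sample);
        if (average > threshold) return time; // first "loud" buffer
    }
    return kCMTimeZero; // threshold never exceeded
}
The two times found this way can then be used as the timeRange of an AVAssetExportSession, exactly as in the code above.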

How to find the photo gallery path URL in iOS

I'm making an iOS app where the user will be able to record a video and, as soon as he finishes recording, send that video to a server.
So far so good!
But finding the recorded video is proving to be a headache! How can I find the proper URL for the recorded video?
During my tests I saved a video in the Supporting Files folder, and my code is the following:
NSString *url = [[NSBundle mainBundle] pathForResource:@"trailer_iphone" ofType:@".m4v"];
NSData *data = [NSData dataWithContentsOfFile:url];
So what could I do to replace that URL with the one in the photo gallery?
The Apple Developer web site shows that we can use an AVAsset to access the photo gallery, so I copied the example into my code and I'm trying to figure this out!
//Copied from Apple website
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
// Enumerate just the photos and videos group by using ALAssetsGroupSavedPhotos.
[library enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
    // Within the group enumeration block, filter to enumerate just videos.
    [group setAssetsFilter:[ALAssetsFilter allVideos]];
    // For this example, we're only interested in the first item.
    [group enumerateAssetsAtIndexes:[NSIndexSet indexSetWithIndex:0]
                            options:0
                         usingBlock:^(ALAsset *alAsset, NSUInteger index, BOOL *innerStop) {
        // The end of the enumeration is signaled by asset == nil.
        if (alAsset) {
            ALAssetRepresentation *representation = [alAsset defaultRepresentation];
            NSURL *url = [representation url];
            AVAsset *avAsset = [AVURLAsset URLAssetWithURL:url options:nil];
            // Do something interesting with the AV asset.
            //My last code
            //NSString *url = [[NSBundle mainBundle] pathForResource:@"trailer_iphone" ofType:@".m4v"];
            NSData *data = [NSData dataWithContentsOfURL:url];
Use UISaveVideoAtPathToSavedPhotosAlbum (Official Documentation).
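A minimal sketch of that call (the filePath variable and the completion selector here are illustrative, not part of the answer above):
if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(filePath)) {
    // Save the recorded movie at filePath into the photo gallery and get told when it finishes.
    UISaveVideoAtPathToSavedPhotosAlbum(filePath, self,
        @selector(video:didFinishSavingWithError:contextInfo:), NULL);
}

- (void)video:(NSString *)videoPath didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo
{
    NSLog(@"Saved %@ (error: %@)", videoPath, error);
}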
To record a video and send the recorded video file to a server, you can use the AVFoundation framework as follows: AVFoundation Video Recording
AVCaptureMovieFileOutput *movieFileOutput;
AVCaptureMovieFileOutput implements the complete file recording interface declared by AVCaptureFileOutput for writing media data to QuickTime movie files.
To start a recording, you have to call the code below, passing the output file path as a parameter along with the delegate.
//Define the output file path.
NSString *filePath = [NSTemporaryDirectory() stringByAppendingPathComponent:[@"recordVideo" stringByAppendingPathExtension:@"mov"]];
//Start recording
[movieFileOutput startRecordingToOutputFileURL:[NSURL fileURLWithPath:filePath] recordingDelegate:self];
After some time, when you want to stop recording, call:
//Stop a recording
[movieFileOutput stopRecording];
After recording stops, the AVCaptureFileOutputRecordingDelegate delegate method is called; there you can get the recorded file path.
#pragma mark - AVCaptureFileOutputRecordingDelegate Methods.
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
    NSLog(@"Did finish recording, error %@ | path %@ | connections %@", error, [outputFileURL absoluteString], connections);
    //call the upload video API
    [self VideoUploadApi];
}
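Since outputFileURL handed to the delegate above points at an ordinary file on disk, a short sketch of turning it into data for the upload (the upload helper named here is hypothetical):
//Inside the delegate method: read the recorded movie's bytes and hand them to the upload code.
NSData *videoData = [NSData dataWithContentsOfURL:outputFileURL];
//[self uploadVideoData:videoData]; // hypothetical upload helper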

Playing audio file while I download it

I am trying to play a partially downloaded file (I want the audio to start as soon as there is enough downloaded data).
I have tried the following:
self.mutableData = nil;
self.mutableData = [NSMutableData data];
self.mutableData.length = 0;

GTMHTTPFetcher *fetcher =
    [[self driveService].fetcherService fetcherWithURLString:song.filePath];

[fetcher setReceivedDataBlock:^(NSData *dataReceivedSoFar) {
    [self.mutableData appendData:dataReceivedSoFar];
    if (!self.audioPlayer.playing && self.mutableData.length > 1000000)
    {
        self.audioPlayer = nil;
        self.audioPlayer = [[AVAudioPlayer alloc] initWithData:self.mutableData error:nil];
        [self.audioPlayer play];
    }
}];

[fetcher beginFetchWithCompletionHandler:^(NSData *data, NSError *error) {
    if (error == nil)
    {
        NSLog(@"NSURLResponse: %@", fetcher.response);
    }
    else
    {
        NSLog(@"An error occurred: %@", error);
    }
}];
But it plays the first 1-3 seconds and then the app either crashes, or it plays those 1-3 seconds over and over again until the download finishes, and only then plays the whole song.
Can I somehow achieve this with AVPlayer, or fix this problem?
AVAudioPlayer cannot play files that are open for writing / downloading, but AVPlayer can. Switch to AVPlayer.
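A minimal sketch of that switch, assuming song.filePath is the remote URL string from the question and self.player is a strong AVPlayer property you add:
//AVPlayer buffers from the network itself, so there is no need to collect bytes manually.
self.player = [AVPlayer playerWithURL:[NSURL URLWithString:song.filePath]];
[self.player play]; //playback starts once enough data has buffered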
To stream audio, you should use AVPlayer or Audio Queue Services.
initWithData isn't intended for this purpose. You can use initWithData after you have downloaded the entire file.
