Can't get AVAudioPlayer to Play [duplicate] - ios

I'm hearing some conflicting reports about this. What I'm trying to do is stream an mp3 file from a URL. I've done hours of research, but I cannot find any good guides on how to do this, or even what kind of audio player I should use.
Some friends tell me that AVPlayer can stream mp3, but the Apple documentation says it can't. I've pored over Matt Gallagher's audio streamer (http://www.cocoawithlove.com/2008/09/streaming-and-playing-live-mp3-stream.html), but that code was written a good while ago, and I'm new enough to this that it's hard to work through the autoreleases and retains and all that.
The audio I'm trying to stream is a fairly large mp3 file from a libsyn server, with a URL of the format:
http://traffic.libsyn.com/podcastname/episode.mp3
All I need to do is grab it and start playing, with the ability to pause and scrub. So first things first: CAN AVPlayer stream MP3s? And if so, does anybody have any guides or code they can point me to? And if not, is there any kind of audio player class that can stream audio?
I've tried creating an AVPlayerItem, initialized with the URL, then adding it to an AVPlayer, but I'm getting a ton of Error Loading... and Symbol Not Found... errors. I'd appreciate any information on this, thank you!

Try this:
- (void)playselectedsong {
    AVPlayer *player = [[AVPlayer alloc] initWithURL:[NSURL URLWithString:urlString]];
    self.songPlayer = player;

    // Get notified when the current item finishes playing
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(playerItemDidReachEnd:)
                                                 name:AVPlayerItemDidPlayToEndTimeNotification
                                               object:[self.songPlayer currentItem]];

    // Observe the player's status so play is only called once it's ready
    [self.songPlayer addObserver:self forKeyPath:@"status" options:0 context:nil];

    [NSTimer scheduledTimerWithTimeInterval:0.1 target:self selector:@selector(updateProgress:) userInfo:nil repeats:YES];
}

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if (object == self.songPlayer && [keyPath isEqualToString:@"status"]) {
        if (self.songPlayer.status == AVPlayerStatusFailed) {
            NSLog(@"AVPlayer Failed");
        } else if (self.songPlayer.status == AVPlayerStatusReadyToPlay) {
            NSLog(@"AVPlayerStatusReadyToPlay");
            [self.songPlayer play];
        } else if (self.songPlayer.status == AVPlayerStatusUnknown) { // AVPlayerStatusUnknown, not the AVPlayerItemStatus constant
            NSLog(@"AVPlayer Unknown");
        }
    }
}

- (void)playerItemDidReachEnd:(NSNotification *)notification {
    // code here to play next sound file
}
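One caveat the answer doesn't show: both observers should be removed when the player goes away, or the next KVO notification will hit a deallocated object and crash. A minimal sketch, assuming the setup above:
- (void)dealloc {
    // Balance the observers registered in playselectedsong
    [[NSNotificationCenter defaultCenter] removeObserver:self
                                                    name:AVPlayerItemDidPlayToEndTimeNotification
                                                  object:[self.songPlayer currentItem]];
    [self.songPlayer removeObserver:self forKeyPath:@"status"];
}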

You can also try my open source Audjustable library, which supports HTTP streaming. It's based on Matt's AudioStreamer but has been tidied, optimised, and updated to support multiple data sources (non-HTTP) and gapless playback.
https://github.com/tumtumtum/audjustable.

In addition to Sumit Mundra's answer, which helped me a lot, I found that this technique doesn't actually stream MP3 files from a remote server. When I implemented it, the file downloaded synchronously, blocking my UI, before playing. The approach that worked very well for me was to point to an M3U file instead. This is just a text file with an .m3u extension that contains a link to the original MP3. Point Sumit's code at that file, and you have a stream that starts playing immediately.
This is the place I found that information: http://www.soundabout.net/streammp3.htm
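A minimal sketch of that setup (the playlist URL below is a hypothetical stand-in for your own server):
// episode.m3u on the server is a plain text file whose only line is:
// http://traffic.libsyn.com/podcastname/episode.mp3
NSURL *playlistURL = [NSURL URLWithString:@"http://example.com/episode.m3u"]; // hypothetical URL
AVPlayer *streamPlayer = [[AVPlayer alloc] initWithURL:playlistURL];
[streamPlayer play]; // starts once enough data has buffered, instead of after a full download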

Matt Gallagher's AudioStreamer was updated 2 months ago: https://github.com/mattgallagher/AudioStreamer/commits/master
But for what you're looking for, check out the sample code StitchedStreamPlayer: http://developer.apple.com/library/ios/#samplecode/StitchedStreamPlayer/Introduction/Intro.html#//apple_ref/doc/uid/DTS40010092
It uses an AVPlayer object, and if you look at the method - (IBAction)loadMovieButtonPressed:(id)sender you should be able to follow how it sets up the AVPlayer object.

Aaron's post about using an m3u file instead of an mp3 worked for me. I also found that AVPlayer was picky about the m3u syntax. For example, when I tried the following, I was unable to get a valid duration (it was always indefinite), and relative paths didn't work:
#EXTM3U
#EXTINF:71
https://test-domain.com/90c9a240-51b3-11e9-bb69-c1300ce2348f.mp3
However, after updating the m3u file to the following, both issues were resolved:
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:70
#EXT-X-MEDIA-SEQUENCE:1
#EXT-X-PLAYLIST-TYPE:VOD
#EXTINF:70.000,
8577d650-51b3-11e9-8e69-4f2b085e94aa.mp3
#EXT-X-ENDLIST

Related

AVPlayer seeking an HLS Audio Stream

I'm using AVPlayer to stream an HLS stream. The issue I'm having is that I'm unable to seek to a time.
The seeking logic works fine with Apple's example HLS stream. It also works fine with the non-HLS stream I have, but we wanted to use HLS on this project.
So my belief is that the problem is more likely with our stream and how AVPlayer handles it.
The playlist.m3u8
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=61378,CODECS="mp4a.40.34"
chunklist_w1105403169.m3u8
The chunklist_w1105403169.m3u8
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-ALLOW-CACHE:NO
#EXT-X-TARGETDURATION:11
#EXT-X-MEDIA-SEQUENCE:5015
#EXTINF:10.028,
media_w1105403169_5015.mp3
#EXTINF:10.036,
media_w1105403169_5016.mp3
#EXTINF:10.027,
media_w1105403169_5017.mp3
I have checked the stream using mediastreamvalidator and I do get:
Warning: (0:-16230) #EXT-X-ALLOW-CACHE should only be in master playlist
but it's my understanding that Apple ignores this flag.
I've no control over the stream but can request changes.
A cut down version of my implementation:
Setup of the player
AVPlayer *player = [[AVPlayer alloc] initWithURL:url];
Seeking back by x seconds
- (void)trySeekBackBy:(NSTimeInterval)seconds {
    // A timescale of 1 rounds to whole seconds; a larger timescale
    // (e.g. NSEC_PER_SEC) gives sub-second precision if needed.
    CMTime minus = CMTimeMakeWithSeconds(seconds, 1);
    CMTime time = CMTimeSubtract(self.currentTime, minus);
    // Check that the time is within the seekable range.
    if ([self canSeekToTime:time]) {
        // I normally just call seekToTime:
        [self seekToTime:time toleranceBefore:kCMTimeZero toleranceAfter:kCMTimeZero];
    }
    // else work down until we find a range we can seek to
}
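The canSeekToTime: helper isn't shown in the question; a plausible implementation (my sketch, not the asker's code) checks the current item's seekableTimeRanges:
- (BOOL)canSeekToTime:(CMTime)time {
    // Accept the target time only if one of the item's seekable
    // ranges actually contains it.
    for (NSValue *value in self.currentItem.seekableTimeRanges) {
        CMTimeRange range = [value CMTimeRangeValue];
        if (CMTimeRangeContainsTime(range, time)) {
            return YES;
        }
    }
    return NO;
}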

AVPlayer HLS live stream level meter (Display FFT Data)

I'm using AVPlayer for a radio app using HTTP Live Streaming. Now I want to implement a level meter for that audio stream. The very best would be a level meter showing the different frequencies, but a simple left/right solution would be a great starting point.
I found several examples using AVAudioPlayer. But I cannot find a solution for getting the required information out of AVPlayer.
Can someone think of a solution for my problem?
EDIT: I want to create something like this (but nicer): [level-meter screenshot omitted]
EDIT II
One suggestion was to use MTAudioProcessingTap to get the raw audio data. The examples I could find use the [[[_player currentItem] asset] tracks] array, which in my case is an empty array. Another suggestion was to use [[_player currentItem] audioMix], which is null for me.
EDIT III
After all these years, there still doesn't seem to be a solution. I did make progress, though, so I'm sharing it.
During setup, I'm adding a key-value observer to the playerItem:
[[[self player] currentItem] addObserver:self forKeyPath:@"tracks" options:kNilOptions context:NULL];

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if ([keyPath isEqualToString:@"tracks"] && [[object tracks] count] > 0) {
        for (AVPlayerItemTrack *itemTrack in [object tracks]) {
            AVAssetTrack *track = [itemTrack assetTrack];
            if ([[track mediaType] isEqualToString:AVMediaTypeAudio]) {
                [self addAudioProcessingTap:track];
                break;
            }
        }
    }
}
- (void)addAudioProcessingTap:(AVAssetTrack *)track {
    MTAudioProcessingTapRef tap;
    MTAudioProcessingTapCallbacks callbacks;
    callbacks.version = kMTAudioProcessingTapCallbacksVersion_0;
    callbacks.clientInfo = (__bridge void *)(self);
    callbacks.init = init;
    callbacks.prepare = prepare;
    callbacks.process = process;
    callbacks.unprepare = unprepare;
    callbacks.finalize = finalise;
    // more tap setup... (MTAudioProcessingTapCreate fills in tap here)

    AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
    AVMutableAudioMixInputParameters *inputParams =
        [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:track];
    [inputParams setAudioTapProcessor:tap];
    [audioMix setInputParameters:@[inputParams]];
    [[[self player] currentItem] setAudioMix:audioMix];
}
So far so good. This all works: I can find the right track and set up the inputParams, audioMix, and so on.
But unfortunately the only callback that ever gets called is the init callback. None of the others fire at any point.
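For reference, the callbacks above are plain C functions. A minimal process callback, assuming the tap ever fires (which is exactly the problem here), would look something like this:
static void process(MTAudioProcessingTapRef tap, CMItemCount numberFrames,
                    MTAudioProcessingTapFlags flags,
                    AudioBufferList *bufferListInOut,
                    CMItemCount *numberFramesOut,
                    MTAudioProcessingTapFlags *flagsOut) {
    // Pull the source audio into bufferListInOut; the samples could then
    // be fed to an FFT or RMS calculation for the level meter.
    OSStatus status = MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut,
                                                         flagsOut, NULL, numberFramesOut);
    if (status != noErr) {
        NSLog(@"MTAudioProcessingTapGetSourceAudio failed: %d", (int)status);
    }
}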
I tried different (kinds of) stream sources, one of them an official Apple HLS stream: http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8
Sadly, using an HLS stream with AVFoundation doesn't give you any control over the audio tracks. I ran into the same problem trying to mute an HLS stream, which turned out to be impossible.
The only way you could read audio data would be to tap into the AVAudioSession.
EDIT
You can access the AVAudioSession like this:
[AVAudioSession sharedInstance]
Here's the documentation for AVAudioSession
Measuring audio using AVPlayer looks to be an issue that is still ongoing. That being said, I believe the solution can be reached by combining AVPlayer with AVAudioRecorder.
While the two classes have seemingly contradictory purposes, there is a workaround that allows AVAudioRecorder to access AVPlayer's audio output.
Player / Recorder
As described in this Stack Overflow answer, recording the audio of an AVPlayer is possible if you access the audio route change using kAudioSessionProperty_AudioRouteChange.
Note that the audio recording must be started after accessing the audio route change. Use the linked answer as a reference - it includes more details and the necessary code.
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Once you have access to the AVPlayer's audio route and are recording, the measuring is relatively straightforward.
Audio Levels
In my answer to a Stack Overflow question about measuring microphone input, I describe the steps necessary to access the audio level measurements. Using AVAudioRecorder to monitor volume changes is more complex than one would think, so I included a GitHub project that acts as a template for monitoring audio changes while recording.
~~~~~~~~~~~~~~~~~~~~~~~~~~ Please Note ~~~~~~~~~~~~~~~~~~~~~~~~~~
This combination during an HLS live stream is not something that I have tested. This answer is strictly theoretical, so it may take a sound understanding of both classes to work out completely.

AVAudioPlayer not working with .pls shoutcast file?

Good Day,
I am working on a radio app that fetches a Shoutcast streaming .pls file and plays it with the help of the AVFoundation framework.
This job is easily done with AVPlayer, but the problem is that I cannot find any good solution to make it work with a volume slider; the AVPlayer class does not have a volume property.
So now I am trying to get it working with AVAudioPlayer, which has a volume property. Here is my code:
NSString *resourcePatch = @"http://vibesradio.org:8002/listen.pls";
NSData *_objectData = [NSData dataWithContentsOfURL:[NSURL URLWithString:resourcePatch]];
NSError *error;

vPlayer = [[AVAudioPlayer alloc] initWithData:_objectData error:&error];
vPlayer.numberOfLoops = 0;
vPlayer.volume = 1.0f;

if (vPlayer == nil) {
    NSLog(@"%@", [error description]);
} else {
    [vPlayer play];
}
This code works with .mp3 files uploaded to my server, but it does not work with the .pls files generated by Shoutcast.
Is there any way to make AVAudioPlayer work with .pls files, or to implement a volume slider with AVPlayer?
Using AVAudioPlayer to stream from the network is not a good idea. See what Apple's documentation says about AVAudioPlayer:
An instance of the AVAudioPlayer class, called an audio player, provides playback of audio data from a file or memory. Apple recommends that you use this class for audio playback unless you are playing audio captured from a network stream or require very low I/O latency.
To change AVPlayer's volume, check this question: Adjusting the volume of a playing AVPlayer
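Roughly, the technique from that linked question applies an AVAudioMix to the player item. A sketch (my adaptation, assuming a playerItem variable for the current item; note that for HTTP streams the asset's track list may be empty, in which case this won't help, and that newer iOS versions expose a volume property on AVPlayer itself):
AVAsset *asset = playerItem.asset;
NSMutableArray *allAudioParams = [NSMutableArray array];
for (AVAssetTrack *track in [asset tracksWithMediaType:AVMediaTypeAudio]) {
    AVMutableAudioMixInputParameters *params =
        [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:track];
    [params setVolume:0.5 atTime:kCMTimeZero]; // 50% volume
    [allAudioParams addObject:params];
}
AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
[audioMix setInputParameters:allAudioParams];
[playerItem setAudioMix:audioMix];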

Looking for simple code to play streaming audio by HTTP from m3u

I'm looking for sample code for iOS (I guess using AVMediaPlayer or AVPlayer) to play streaming audio from a URL (our current server URL is http://server.local:8008/ourradio.aac.m3u).
The audio stream should also keep playing when the application is in background mode.
M3U is a playlist format: a plain text file containing the locations of music files, most notably MP3 files. Read the Wikipedia article about M3U. Then play each MP3 using this, if you really want it on an iPhone:
AVPlayer *musicPlayer = [AVPlayer playerWithURL:musicLinkFromM3uFile];
[musicPlayer play];
where musicLinkFromM3uFile is the location of the MP3 file read from the m3u file.
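A rough sketch of reading the playlist (assuming a simple one-URL-per-line m3u, where #-prefixed lines are comments):
// Download the playlist text and pull out the first non-comment line.
// Synchronous loading is fine for illustration; do it off the main thread in practice.
NSURL *m3uURL = [NSURL URLWithString:@"http://server.local:8008/ourradio.aac.m3u"];
NSString *playlist = [NSString stringWithContentsOfURL:m3uURL
                                              encoding:NSUTF8StringEncoding
                                                 error:NULL];
NSURL *musicLinkFromM3uFile = nil;
for (NSString *line in [playlist componentsSeparatedByCharactersInSet:
                        [NSCharacterSet newlineCharacterSet]]) {
    NSString *trimmed = [line stringByTrimmingCharactersInSet:
                         [NSCharacterSet whitespaceCharacterSet]];
    if ([trimmed length] > 0 && ![trimmed hasPrefix:@"#"]) {
        musicLinkFromM3uFile = [NSURL URLWithString:trimmed];
        break;
    }
}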
EDIT: And to be able to continue playing in the background, you will need to set up an audio session with the category kAudioSessionCategory_MediaPlayback. To do that, add the following lines to application:didFinishLaunchingWithOptions: in the app delegate:
UInt32 sessionCategory = kAudioSessionCategory_MediaPlayback;
AudioSessionSetProperty(kAudioSessionProperty_AudioCategory, sizeof(sessionCategory), &sessionCategory);
You will also need to set UIBackgroundModes in your Info.plist to audio.
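The C Audio Session API shown above has long been deprecated; the same setup via AVAudioSession (the Objective-C counterpart) would be:
NSError *sessionError = nil;
// AVAudioSessionCategoryPlayback corresponds to kAudioSessionCategory_MediaPlayback
// and allows background playback when combined with the UIBackgroundModes
// "audio" entry in Info.plist.
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback
                                       error:&sessionError];
[[AVAudioSession sharedInstance] setActive:YES error:&sessionError];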
NSString *urlAddress = @"http://www.mysite.com/test.mp3";
urlStream = [NSURL URLWithString:urlAddress];
self.player = [AVPlayer playerWithURL:urlStream];
[self.player play];

AVPlayer how to manage different URL

I'm trying to build a small radio app, and I have a list of URLs that I pass to AVPlayer, but I can't understand how to manage the different URLs.
For example, if I first play this URL: http://www.example.com/file.mp3
and then call http://www.example.net/file2.mp3
it works fine, but when I select http://www.example.org/file.mp3.m3u it doesn't load that URL and AVPlayer won't play.
This is the code I use:
urlStream = [NSURL URLWithString:mp3URL];
appDelegate = [[UIApplication sharedApplication] delegate];
playerItem = [AVPlayerItem playerItemWithURL:urlStream];
[playerItem addObserver:self forKeyPath:@"playbackBufferEmpty" options:NSKeyValueObservingOptionNew context:nil];
[playerItem addObserver:self forKeyPath:@"playbackLikelyToKeepUp" options:NSKeyValueObservingOptionNew context:nil];
[appDelegate.player replaceCurrentItemWithPlayerItem:playerItem];
I use replaceCurrentItemWithPlayerItem:playerItem because with initWithPlayerItem: I just couldn't stop the previous stream when choosing another one; the only way to stop the playing stream and start a new one is replaceCurrentItemWithPlayerItem:.
In the Apple documentation I read that replaceCurrentItemWithPlayerItem must have the same "compositor" as the items it replaces: what's a compositor?
What I can see that differs between the first two streams and the third (in the example above) is the file extension.
Any suggestion where to look for would be greatly appreciated.
The problem appears not to be the replacing of items, but the file format of the particular files. Try to open the m3u file on its own first; it will probably fail as well.
The .m3u URL indicates that this is not an mp3 file but an m3u file, that is, a list of URLs to media files. AVPlayer and AVPlayerItem are capable of playing m3u8 files, which are m3u files that are UTF-8 encoded.
Have you tried opening those URLs in Mobile Safari? If they don't work there, the format probably isn't supported by AVPlayer either.
In any case, if you could provide the actual problematic URL, one could check the actual file format.
By "compositor", the Quartz compositor is meant; forget about that, it's not relevant here.
[Edit: maybe it is relevant - I'm dumbfounded after your comment on this answer ...]
