AVPlayer seeking an HLS Audio Stream - iOS

I'm using AVPlayer to play an HLS stream. The issue I'm having is that I'm unable to seek to a time.
The seeking logic works fine with Apple's example HLS stream. It also works fine with the non-HLS stream I have, but we want to use HLS on this project.
So my belief is that the problem lies with our stream and how AVPlayer handles it.
The playlist.m3u8
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=61378,CODECS="mp4a.40.34"
chunklist_w1105403169.m3u8
The chunklist_w1105403169.m3u8
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-ALLOW-CACHE:NO
#EXT-X-TARGETDURATION:11
#EXT-X-MEDIA-SEQUENCE:5015
#EXTINF:10.028,
media_w1105403169_5015.mp3
#EXTINF:10.036,
media_w1105403169_5016.mp3
#EXTINF:10.027,
media_w1105403169_5017.mp3
I have checked the stream using mediastreamvalidator and I do get:
Warning: (0:-16230) #EXT-X-ALLOW-CACHE should only be in master playlist
but it's my understanding that Apple ignores this flag.
I've no control over the stream but can request changes.
A cut-down version of my implementation:
Setup of the player
AVPlayer *player = [[AVPlayer alloc] initWithURL:url];
Seeking back by x seconds
- (void)trySeekBackBy:(NSTimeInterval)seconds {
    CMTime minus = CMTimeMakeWithSeconds(seconds, 1);
    CMTime time = CMTimeSubtract(self.currentTime, minus);
    // Check that the time is within the seekable range.
    if ([self canSeekToTime:time]) {
        // I normally just call seekToTime:
        [self seekToTime:time toleranceBefore:kCMTimeZero toleranceAfter:kCMTimeZero];
    }
    // else work down until we can find a range we can seek to
}
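For context, canSeekToTime: is a helper of mine, not an AVPlayer method. A minimal sketch of it, assuming it simply walks the current item's seekableTimeRanges:
- (BOOL)canSeekToTime:(CMTime)time {
    for (NSValue *value in self.currentItem.seekableTimeRanges) {
        CMTimeRange range = [value CMTimeRangeValue];
        if (CMTimeRangeContainsTime(range, time)) {
            return YES;
        }
    }
    return NO;
}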

Related

How can I find out the current TS segment during an HLS (m3u8) playback in iOS?

An HLS (m3u8) file references MPEG-TS files. During its playback in iOS's AVPlayer, how can I determine the currently playing MPEG-TS URI?
If you're looking for a reference to the URI of the TS segment currently being downloaded, it's not available. You can get the URI of the stream for the current bit rate by looking at the current AVPlayerItem's -accessLog.
E.g.:
[[[player currentItem] accessLog] events]
It's an NSArray of AVPlayerItemAccessLogEvent objects.
But it's not going to give you the URI of the TS segment per se. You may just have to calculate the current segment from where the playhead currently is in relation to the duration and the segment size.
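For illustration, a minimal sketch of reading the access log, assuming player is your AVPlayer (the URI here identifies the playlist in use, not an individual segment):
AVPlayerItemAccessLog *accessLog = [[player currentItem] accessLog];
AVPlayerItemAccessLogEvent *lastEvent = [[accessLog events] lastObject];
// URI of the stream being played at the current bit rate.
NSString *streamURI = [lastEvent URI];
// Bit rate advertised by the playlist for this stream.
double bitrate = [lastEvent indicatedBitrate];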

AVPlayer HLS live stream level meter (Display FFT Data)

I'm using AVPlayer for a radio app using HTTP Live Streaming. Now I want to implement a level meter for that audio stream. Ideally I'd like a level meter showing the different frequencies, but a simple left/right solution would be a great starting point.
I found several examples using AVAudioPlayer, but I cannot find a solution for getting the required information out of AVPlayer.
Can someone think of a solution for my problem?
EDIT: I want to create something like this (but nicer)
EDIT II
One suggestion was to use MTAudioProcessingTap to get the raw audio data. The examples I could find use the [[[_player currentItem] asset] tracks] array, which in my case is an empty array. Another suggestion was to use [[_player currentItem] audioMix], which is null for me.
EDIT III
Years later, there still doesn't seem to be a solution. I did make some progress, though, so I'm sharing it.
During setup, I'm adding a key-value observer to the playerItem:
[[[self player] currentItem] addObserver:self forKeyPath:@"tracks" options:kNilOptions context:NULL];
//////////////////////////////////////////////////////
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if ([keyPath isEqualToString:@"tracks"] && [[object tracks] count] > 0) {
        // Find the first audio track and attach the processing tap to it.
        for (AVPlayerItemTrack *itemTrack in [object tracks]) {
            AVAssetTrack *track = [itemTrack assetTrack];
            if ([[track mediaType] isEqualToString:AVMediaTypeAudio]) {
                [self addAudioProcessingTap:track];
                break;
            }
        }
    }
}
- (void)addAudioProcessingTap:(AVAssetTrack *)track {
    MTAudioProcessingTapRef tap;
    MTAudioProcessingTapCallbacks callbacks;
    callbacks.version = kMTAudioProcessingTapCallbacksVersion_0;
    callbacks.clientInfo = (__bridge void *)(self);
    callbacks.init = init;
    callbacks.prepare = prepare;
    callbacks.process = process;
    callbacks.unprepare = unprepare;
    callbacks.finalize = finalise;
    // Create the tap itself, plus more tap setup...
    MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks,
                               kMTAudioProcessingTapCreationFlag_PostEffects, &tap);
    AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
    AVMutableAudioMixInputParameters *inputParams = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:track];
    [inputParams setAudioTapProcessor:tap];
    [audioMix setInputParameters:@[inputParams]];
    [[[self player] currentItem] setAudioMix:audioMix];
}
So far so good. This all works: I can find the right track and set up the inputParams, the audioMix, and so on.
But unfortunately the only callback that ever gets called is the init callback. None of the others fire at any point.
I tried different (kinds of) stream sources, one of them an official Apple HLS stream: http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8
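For reference, a minimal pass-through sketch of what the C callbacks above might look like (the names match the struct assignments in addAudioProcessingTap:; in my case only init ever runs):
#import <MediaToolbox/MediaToolbox.h>

void init(MTAudioProcessingTapRef tap, void *clientInfo, void **tapStorageOut) {
    // Stash the client object so process() can reach it later.
    *tapStorageOut = clientInfo;
}

void prepare(MTAudioProcessingTapRef tap, CMItemCount maxFrames,
             const AudioStreamBasicDescription *processingFormat) {}

void process(MTAudioProcessingTapRef tap, CMItemCount numberFrames,
             MTAudioProcessingTapFlags flags, AudioBufferList *bufferListInOut,
             CMItemCount *numberFramesOut, MTAudioProcessingTapFlags *flagsOut) {
    // Pull the source audio through the tap; afterwards bufferListInOut
    // holds raw PCM that could feed an FFT or a level meter.
    MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut,
                                       flagsOut, NULL, numberFramesOut);
}

void unprepare(MTAudioProcessingTapRef tap) {}

void finalise(MTAudioProcessingTapRef tap) {}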
Sadly, using an HLS stream with AVFoundation doesn't give you any control over the audio tracks. I ran into the same problem trying to mute an HLS stream, which turned out to be impossible.
The only way you could read audio data would be to tap into the AVAudioSession.
EDIT
You can access the AVAudioSession like this:
[AVAudioSession sharedInstance]
Here's the documentation for AVAudioSession
Measuring audio using AVPlayer looks to be an issue that is still ongoing. That being said, I believe the solution can be reached by combining AVPlayer with AVAudioRecorder.
While the two classes have seemingly contradictory purposes, there is a workaround that allows AVAudioRecorder to access AVPlayer's audio output.
Player / Recorder
As described in this Stack Overflow answer, recording the audio of an AVPlayer is possible if you access the audio route change using kAudioSessionProperty_AudioRouteChange.
Notice that the audio recording must be started after accessing the audio route change. Use the linked answer as a reference; it includes more details and the necessary code.
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Once you have access to the AVPlayer's audio route and are recording, the measuring is relatively straightforward.
Audio Levels
In my answer to a stack question regarding measuring microphone input I describe the steps necessary to access the audio level measurements. Using AVAudioRecorder to monitor volume changes is more complex than one would think, so I included a GitHub project that acts as a template for monitoring audio changes while recording.
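The metering calls themselves are simple once the recorder is running; a minimal sketch, assuming recorder is an AVAudioRecorder set up as in that template:
recorder.meteringEnabled = YES;   // must be set before metering is queried
[recorder record];
// Poll periodically (e.g. from an NSTimer or CADisplayLink):
[recorder updateMeters];
float average = [recorder averagePowerForChannel:0]; // dBFS, -160.0 to 0.0
float peak    = [recorder peakPowerForChannel:0];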
~~~~~~~~~~~~~~~~~~~~~~~~~~ Please Note ~~~~~~~~~~~~~~~~~~~~~~~~~~
This combination during an HLS live stream is not something that I have tested. This answer is strictly theoretical, so it may take a sound understanding of both classes to work out completely.

AVAudioPlayer not working with .pls shoutcast file?

Good Day,
I am working on a radio app that takes a Shoutcast streaming .pls file and plays it with the help of the AVFoundation framework.
This job is easily done with AVPlayer, but the problem is that I cannot code or find any good solution for getting it working with a volume slider; the AVPlayer class does not have a volume property.
So now I am trying to get it working with AVAudioPlayer, which has a volume property. Here is my code:
NSString *resourcePatch = @"http://vibesradio.org:8002/listen.pls";
NSData *_objectData = [NSData dataWithContentsOfURL:[NSURL URLWithString:resourcePatch]];
NSError *error;
vPlayer = [[AVAudioPlayer alloc] initWithData:_objectData error:&error];
vPlayer.numberOfLoops = 0;
vPlayer.volume = 1.0f;
if (vPlayer == nil)
    NSLog(@"%@", [error description]);
else
    [vPlayer play];
This code works with .mp3 files uploaded to my server, but it does not work with the .pls files generated by Shoutcast.
Is there any way to make AVAudioPlayer work with .pls files, or to implement a volume slider with AVPlayer?
Using AVAudioPlayer to stream from the network is not a good idea. See what Apple's documentation says about AVAudioPlayer:
An instance of the AVAudioPlayer class, called an audio player,
provides playback of audio data from a file or memory.
Apple recommends that you use this class for audio playback unless you
are playing audio captured from a network stream or require very low
I/O latency.
To change an AVPlayer's volume, check this question: Adjusting the volume of a playing AVPlayer
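(Worth noting: since iOS 7, AVPlayer itself also has a volume property, so a slider can drive it directly; a minimal sketch, assuming player is the AVPlayer instance:)
- (IBAction)volumeSliderChanged:(UISlider *)sender {
    player.volume = sender.value; // 0.0 (silent) to 1.0 (full volume)
}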

Looking for simple code to play streaming audio by HTTP from m3u

I'm looking for sample code for iOS (I guess using AVMediaPlayer or AVPlayer) to play streaming audio from a URL (our current server URL is http://server.local:8008/ourradio.aac.m3u).
The audio stream should also keep playing when the application is in background mode.
M3U is a playlist format: a plain text file containing the locations of music files, most notably MP3 files. Read the Wikipedia article about M3U. Then play each MP3 using this, if you really want to do it on an iPhone:
AVPlayer *musicPlayer = [AVPlayer playerWithURL:musicLinkFromM3uFile];
[musicPlayer play];
where musicLinkFromM3uFile is the location of the MP3 file read from the m3u file.
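A minimal sketch of pulling that first stream URL out of the m3u, assuming the playlist is a plain list of URLs with # lines as comments (fetched synchronously here for brevity):
NSURL *playlistURL = [NSURL URLWithString:@"http://server.local:8008/ourradio.aac.m3u"];
NSString *playlist = [NSString stringWithContentsOfURL:playlistURL
                                              encoding:NSUTF8StringEncoding
                                                 error:NULL];
NSURL *musicLinkFromM3uFile = nil;
for (NSString *line in [playlist componentsSeparatedByCharactersInSet:[NSCharacterSet newlineCharacterSet]]) {
    NSString *trimmed = [line stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceCharacterSet]];
    if ([trimmed length] > 0 && ![trimmed hasPrefix:@"#"]) {
        // First non-comment line is the stream location.
        musicLinkFromM3uFile = [NSURL URLWithString:trimmed];
        break;
    }
}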
EDIT: And to be able to continue playing in the background you will need to set up an audio session with the category kAudioSessionCategory_MediaPlayback. To do that, add the following lines to application:didFinishLaunchingWithOptions: in your app delegate:
// The session must be initialized before any properties are set.
AudioSessionInitialize(NULL, NULL, NULL, NULL);
UInt32 sessionCategory = kAudioSessionCategory_MediaPlayback;
AudioSessionSetProperty(kAudioSessionProperty_AudioCategory, sizeof(sessionCategory), &sessionCategory);
AudioSessionSetActive(true);
You will also need to set UIBackgroundModes in your Info.plist to audio.
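(The C-based AudioSession API shown above has since been deprecated; the AVAudioSession equivalent of the same setup would be:)
NSError *error = nil;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&error];
[[AVAudioSession sharedInstance] setActive:YES error:&error];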
NSString *urlAddress = @"http://www.mysite.com/test.mp3";
urlStream = [NSURL URLWithString:urlAddress];
self.player = [AVPlayer playerWithURL:urlStream];
[player play];

AVFoundation: Video to OpenGL texture working - How to play and sync audio?

I've managed to load a video-track of a movie frame by frame into an OpenGL texture with AVFoundation. I followed the steps described in the answer here: iOS4: how do I use video file as an OpenGL texture?
and took some code from the GLVideoFrame sample from WWDC2010 which can be downloaded here.
How do I play the audio track of the movie in sync with the video? I think it would be best not to play it in a separate player, but to use the audio track of the same AVAsset.
AVAssetTrack* audioTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
I retrieve a video frame and its timestamp in the CADisplayLink callback via
CMSampleBufferRef sampleBuffer = [self.readerOutput copyNextSampleBuffer];
CMTime timestamp = CMSampleBufferGetPresentationTimeStamp( sampleBuffer );
where readerOutput is of type AVAssetReaderTrackOutput *.
How do I get the corresponding audio samples, and how do I play them?
Edit:
I've looked around a bit and I think the best approach would be to use an AudioQueue from the AudioToolbox framework, following the approach described here: AVAssetReader and Audio Queue streaming problem
There is also an audio player in AVFoundation: AVAudioPlayer. But I don't know exactly how I should pass data to its initWithData: initializer, which expects NSData. Furthermore, I don't think it's the best choice for my case, because as I understand it a new AVAudioPlayer instance would have to be created for every new chunk of audio samples.
Any other suggestions?
What's the best way to play the raw audio samples which I get from the AVAssetReaderTrackOutput?
You want to do an AV composition. You can merge multiple media sources, synchronized temporally, into one output.
http://developer.apple.com/library/ios/#DOCUMENTATION/AVFoundation/Reference/AVComposition_Class/Reference/Reference.html
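As a rough sketch of that idea, assuming asset is the AVAsset you already loaded for the video texture, the video and audio tracks can be merged into a single composition and played (or read) in sync:
AVMutableComposition *composition = [AVMutableComposition composition];
CMTimeRange fullRange = CMTimeRangeMake(kCMTimeZero, asset.duration);

AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
AVMutableCompositionTrack *compVideo = [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                 preferredTrackID:kCMPersistentTrackID_Invalid];
[compVideo insertTimeRange:fullRange ofTrack:videoTrack atTime:kCMTimeZero error:NULL];

AVAssetTrack *audioTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
AVMutableCompositionTrack *compAudio = [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                 preferredTrackID:kCMPersistentTrackID_Invalid];
[compAudio insertTimeRange:fullRange ofTrack:audioTrack atTime:kCMTimeZero error:NULL];

// The composition is itself an AVAsset, so it can drive an AVPlayer
// (or an AVAssetReader) with audio and video kept in sync.
AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:composition];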
