AVPlayerItem gets a NaN duration - iOS

I'm streaming an .mp3 file from a URL.
I'm using AVPlayer, and when I try to get the total time to build a progress bar, the duration is always NaN.
NSError *setCategoryError = nil;
if ([[AVAudioSession sharedInstance] isOtherAudioPlaying]) { // mix sound effects with music already playing
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategorySoloAmbient error:&setCategoryError];
} else {
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryAmbient error:&setCategoryError];
}
if (setCategoryError) {
    NSLog(@"Error setting category! %ld", (long)[setCategoryError code]);
}

NSURL *url = [NSURL URLWithString:@"http://..//46698"];
AVPlayer *player = [AVPlayer playerWithURL:url];
songPlayer = player;
[songPlayer addObserver:self forKeyPath:@"status" options:0 context:nil];
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if (object == songPlayer && [keyPath isEqualToString:@"status"]) {
        if (songPlayer.status == AVPlayerStatusFailed) {
            NSLog(@"AVPlayer Failed");
        } else if (songPlayer.status == AVPlayerStatusReadyToPlay) {
            NSLog(@"AVPlayerStatusReadyToPlay");
            [songPlayer play];
            [songPlayer addPeriodicTimeObserverForInterval:CMTimeMake(1, 1) queue:dispatch_get_main_queue() usingBlock:^(CMTime time) {
                CMTime aux = [songPlayer currentTime];
                AVPlayerItem *item = [songPlayer currentItem];
                CMTime dur = [item duration];
                NSLog(@"%f/%f", CMTimeGetSeconds(aux), CMTimeGetSeconds(dur));
            }];
        } else if (songPlayer.status == AVPlayerStatusUnknown) {
            NSLog(@"AVPlayer Unknown");
        }
    }
}
I've tried everything:
[item duration]; /// Fail
[[item asset] duration]; /// Fail
but nothing works.
Does anyone know why?

The value of the duration property will be reported as kCMTimeIndefinite until the duration of the underlying asset has been loaded. There are two ways to ensure that the value of duration is accessed only after it becomes available:
1. Wait until the status of the AVPlayerItem is AVPlayerItemStatusReadyToPlay.
2. Register for key-value observation of the duration property, requesting the initial value. If the initial value is reported as kCMTimeIndefinite, the AVPlayerItem will notify you of the availability of the item's duration via key-value observing as soon as its value becomes known.
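A minimal sketch of the second approach (assuming the item is stored in a playerItem property; the "duration" key path and KVO options are standard):
// Ask for the initial value and for subsequent changes of the item's duration.
[self.playerItem addObserver:self
                  forKeyPath:@"duration"
                     options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew
                     context:nil];

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if ([keyPath isEqualToString:@"duration"]) {
        CMTime duration = [(AVPlayerItem *)object duration];
        if (CMTIME_IS_INDEFINITE(duration)) {
            return; // not loaded yet; another KVO callback arrives once it is known
        }
        NSLog(@"duration: %f seconds", CMTimeGetSeconds(duration));
        // safe to size the progress bar here
    }
}
Remember to remove the observer before releasing or replacing the item.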

For Swift:
if let currentItem = player.currentItem, currentItem.status == .readyToPlay {
    print(currentItem.duration.seconds) // it's not NaN once the item is ready
}

I had this problem on iOS 12 (on iOS 13 everything works as expected): the current item's duration is always indefinite. I solved it by using player.currentItem?.asset.duration. Something like this:
private var currentItemDuration: CMTime? {
    if #available(iOS 13.0, *) {
        return player?.currentItem?.duration
    } else {
        return player?.currentItem?.asset.duration
    }
}
See this answer for macOS: https://stackoverflow.com/a/52668213/7132300 It looks like it's also valid for iOS 12.

@voromax is correct. I added the asset to the playerItem without loading the duration first, and the duration was NaN:
let asset = AVAsset(url: videoUrl)
self.playerItem = AVPlayerItem(asset: asset)
When I called asset.loadValuesAsynchronously(forKeys:) first, there was no more NaN and I got the correct duration:
let assetKeys = ["playable", "duration"]
let asset = AVAsset(url: url)
asset.loadValuesAsynchronously(forKeys: assetKeys, completionHandler: {
    DispatchQueue.main.async { [weak self] in
        self?.playerItem = AVPlayerItem(asset: asset, automaticallyLoadedAssetKeys: assetKeys)
    }
})

You can use the asset property; it will give you the duration.
self.player.currentItem?.asset.duration.seconds ?? 0

I had the same problem but I was able to get the duration with a different method. Please see my answer here: https://stackoverflow.com/a/38406295/3629481

Related

Getting averagePowerForChannel in AVPlayer

How can I get averagePowerForChannel in AVPlayer in order to build an audio visualization in my music app?
I've already done the visualization part, but I'm stuck on its engine (real-time volume per channel).
I know that with AVAudioPlayer this can be done easily using the meteringEnabled property, but for various reasons AVPlayer is a must in my app.
I'm actually thinking of using an AVAudioPlayer alongside the AVPlayer to get the desired result, but that sounds like a messy workaround.
How would that affect performance and stability?
Thanks in advance.
I've had an issue with AVPlayer visualisation for about two years. In my case it involves HLS live streaming; for that case you won't get it running, as far as I know.
EDIT: This will not give you access to the averagePowerForChannel: method, but you will get access to the raw data, and with, for example, an FFT you can derive the information you want.
I got it working with local playback, though. You basically wait for the player's current item to have a track up and running. At that point you need to patch an MTAudioProcessingTap into the audio mix.
The processing tap will run callbacks you specify, in which you can process the raw audio data as you need.
Here is a quick example (sorry for having it in ObjC, though):
#import <AVFoundation/AVFoundation.h>
#import <MediaToolbox/MediaToolbox.h>
void init(MTAudioProcessingTapRef tap, void *clientInfo, void **tapStorageOut) {};
void finalize(MTAudioProcessingTapRef tap) {};
void prepare(MTAudioProcessingTapRef tap, CMItemCount maxFrames, const AudioStreamBasicDescription *processingFormat) {};
void unprepare(MTAudioProcessingTapRef tap) {};
void process(MTAudioProcessingTapRef tap, CMItemCount numberFrames, MTAudioProcessingTapFlags flags, AudioBufferList *bufferListInOut, CMItemCount *numberFramesOut, MTAudioProcessingTapFlags *flagsOut) {};
- (void)play {
    // player and item setup ...

    [[[self player] currentItem] addObserver:self forKeyPath:@"tracks" options:kNilOptions context:NULL];
}
//////////////////////////////////////////////////////
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if ([keyPath isEqualToString:@"tracks"] && [[object tracks] count] > 0) {
        for (AVPlayerItemTrack *itemTrack in [object tracks]) {
            AVAssetTrack *track = [itemTrack assetTrack];

            if ([[track mediaType] isEqualToString:AVMediaTypeAudio]) {
                [self addAudioProcessingTap:track];
                break;
            }
        }
    }
}
- (void)addAudioProcessingTap:(AVAssetTrack *)track {
    MTAudioProcessingTapRef tap;
    MTAudioProcessingTapCallbacks callbacks;
    callbacks.version = kMTAudioProcessingTapCallbacksVersion_0;
    callbacks.clientInfo = (__bridge void *)(self);
    callbacks.init = init;
    callbacks.prepare = prepare;
    callbacks.process = process;
    callbacks.unprepare = unprepare;
    callbacks.finalize = finalize;

    OSStatus err = MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks, kMTAudioProcessingTapCreationFlag_PostEffects, &tap);
    if (err) {
        NSLog(@"error: %@", [NSError errorWithDomain:NSOSStatusErrorDomain code:err userInfo:nil]);
        return;
    }

    AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
    AVMutableAudioMixInputParameters *inputParams = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:track];
    [inputParams setAudioTapProcessor:tap];
    [audioMix setInputParameters:@[inputParams]];
    [[[self player] currentItem] setAudioMix:audioMix];
}
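To get something like averagePowerForChannel out of this, you can flesh out the empty process stub above. The following is only a sketch: it assumes the tap delivers non-interleaved 32-bit float samples (check the AudioStreamBasicDescription passed to the prepare callback before relying on that) and simply logs a per-buffer RMS value:
void process(MTAudioProcessingTapRef tap, CMItemCount numberFrames, MTAudioProcessingTapFlags flags, AudioBufferList *bufferListInOut, CMItemCount *numberFramesOut, MTAudioProcessingTapFlags *flagsOut) {
    // Pull the audio that would otherwise go straight to the output.
    OSStatus err = MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut, flagsOut, NULL, numberFramesOut);
    if (err != noErr) {
        return;
    }

    for (UInt32 i = 0; i < bufferListInOut->mNumberBuffers; i++) {
        AudioBuffer buffer = bufferListInOut->mBuffers[i];
        float *samples = (float *)buffer.mData;            // assumes float, non-interleaved
        UInt32 count = buffer.mDataByteSize / sizeof(float);

        float sum = 0.0f;
        for (UInt32 n = 0; n < count; n++) {
            sum += samples[n] * samples[n];
        }
        float rms = (count > 0) ? sqrtf(sum / count) : 0.0f;

        // Hand the value to your visualizer, e.g. via the object you stashed
        // in clientInfo/tapStorageOut during the init callback.
        NSLog(@"buffer %u average power (RMS): %f", (unsigned int)i, rms);
    }
}
Keep in mind the process callback runs on a real-time audio thread, so in a real app you would hand the value off to the main thread rather than logging there.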
There is some discussion going on over on my question from over two years ago, so make sure to check it out as well.
You will need an audio processor class in combination with AVFoundation to visualize audio samples, as well as to apply a Core Audio audio unit effect (a band-pass filter) to the audio data. You can find a sample by Apple here.
Essentially you will need to add an observer to your AVPlayerItem like the one below:
// Notifications
let playerItem: AVPlayerItem! = videoPlayer.currentItem
playerItem.addObserver(self, forKeyPath: "tracks", options: NSKeyValueObservingOptions.New, context: nil);
NSNotificationCenter.defaultCenter().addObserverForName(AVPlayerItemDidPlayToEndTimeNotification, object: videoPlayer.currentItem, queue: NSOperationQueue.mainQueue(), usingBlock: { (notif: NSNotification) -> Void in
    self.videoPlayer.seekToTime(kCMTimeZero)
    self.videoPlayer.play()
    print("replay")
})
Then handle the observed change in the overridden method below:
override func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?, change: [String : AnyObject]?, context: UnsafeMutablePointer<Void>) {
    if (object === videoPlayer.currentItem && keyPath == "tracks") {
        if let playerItem: AVPlayerItem = videoPlayer.currentItem {
            if let tracks = playerItem.asset.tracks as? [AVAssetTrack] {
                tapProcessor = MYAudioTapProcessor(AVPlayerItem: playerItem)
                playerItem.audioMix = tapProcessor.audioMix
                tapProcessor.delegate = self
            }
        }
    }
}
Here's a link to a sample project on GitHub

iOS 8.4 specific: AVPlayer not playing both video & audio and no errors

EDIT: Tested in 8.3 simulator too, same issue.
I have an app that works perfectly fine on iOS 9.0 onwards (all versions). However, specifically on iOS 8.4, the AVPlayer doesn't play anything: no audio, no video.
This happens on both iPad and iPhone.
I have added observers for the status and rate key paths, and according to the logger those callbacks do get called as if the AVPlayer were playing. However, on both the actual device and the simulator there is no video and no audio.
I have checked the AVPlayer's error property too, and it is null throughout.
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if (object == ((AppDelegate*)[[UIApplication sharedApplication] delegate]).avplayer) {
        if ([keyPath isEqualToString:@"status"]) {
            NSLog(@"Status changed: %@, %@", change, ((AppDelegate*)[[UIApplication sharedApplication] delegate]).avplayer.error.localizedDescription);
        }
        else if ([keyPath isEqualToString:@"rate"] && ((AppDelegate*)[[UIApplication sharedApplication] delegate]).avplayer.status == AVPlayerStatusReadyToPlay) {
            NSLog(@"Rate changed: %@", change);
        }
    }
}
Output:
2016-03-13 15:01:47.152 XXXXX[47910:2113657] Status changed: {
kind = 1;
new = 1;
}, (null)
2016-03-13 15:01:47.153 XXXXX[47910:2113567] Rate changed: {
kind = 1;
new = 1;
}
2016-03-13 15:01:47.160 XXXXX[47910:2113567] Rate changed: {
kind = 1;
new = 1;
}
I fixed the issue - though I am not sure why this was happening in iOS 8 but not in iOS 9.
The completion handler of loadValuesAsynchronouslyForKeys: was not returning on the main thread. I checked this by logging [NSThread isMainThread] and saw that it was false.
I fixed it by switching to the main thread before calling replaceCurrentItemWithPlayerItem:.
AVAsset *asset = [AVAsset assetWithURL:url];
[asset loadValuesAsynchronouslyForKeys:@[@"duration"] completionHandler:^{
    NSLog(@"Main: %d", [NSThread isMainThread]);
    dispatch_async(dispatch_get_main_queue(), ^{
        AVPlayerItem *newItem = [[AVPlayerItem alloc] initWithAsset:asset];
        [((AppDelegate*)[[UIApplication sharedApplication] delegate]).avplayer replaceCurrentItemWithPlayerItem:newItem];
        [self addObserversToPlayer];
        [((AppDelegate*)[[UIApplication sharedApplication] delegate]).avplayer play];
    });
}];

AVPlayer/AVAudioMix fade-in effect clicks in the beginning

I'm trying to implement a fade-in effect based on AVPlayer + AVAudioMix + AVAudioMixInputParameters. It basically works except when playing the audio for the first time after starting my app there is a click in the beginning. Subsequent plays work perfect though, but the first-time glitch is pretty stable and reproducible.
My Play button is enabled only after the AVPlayerItem's status is set to ready, so it's impossible to fire a play method while the player is not ready. In fact it doesn't matter how long I wait after loading the audio file and constructing all the objects.
This happens on OS X, I haven't tested it on iOS (yet).
Note that for this test you need an audio file that starts with sound and not silence. Here is my stripped down code without the GUI part (testFadeIn is the entry point):
static AVPlayer* player;
static void* PlayerItemStatusObserverContext = &PlayerItemStatusObserverContext;

- (void)testFadeIn
{
    AVURLAsset* asset = [AVURLAsset.alloc initWithURL:[NSURL fileURLWithPath:@"Helicopter.m4a"] options:@{AVURLAssetPreferPreciseDurationAndTimingKey: @YES}];
    AVPlayerItem* item = [AVPlayerItem playerItemWithAsset:asset];
    player = [AVPlayer playerWithPlayerItem:item];
    [item addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew context:PlayerItemStatusObserverContext];
}

- (void)observeValueForKeyPath:(NSString*)keyPath ofObject:(id)object change:(NSDictionary*)change context:(void*)context
{
    if (context == PlayerItemStatusObserverContext)
    {
        AVPlayerStatus status = (AVPlayerStatus)[[change objectForKey:NSKeyValueChangeNewKey] integerValue];
        if (status == AVPlayerStatusReadyToPlay)
        {
            [self applyFadeIn];
            [self performSelector:@selector(play:) withObject:nil afterDelay:1.0];
        }
    }
}

- (void)applyFadeIn
{
    assert(player.currentItem.tracks.firstObject);

    AVMutableAudioMixInputParameters* fadeIn = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:player.currentItem.tracks.firstObject];
    [fadeIn setVolume:0 atTime:kCMTimeZero];
    [fadeIn setVolume:1 atTime:CMTimeMake(2, 1)];

    NSMutableArray* paramsArray = [NSMutableArray new];
    [paramsArray addObject:fadeIn];

    AVMutableAudioMix* audioMix = [AVMutableAudioMix audioMix];
    audioMix.inputParameters = paramsArray;
    player.currentItem.audioMix = audioMix;
}

- (void)play:(id)unused
{
    [player play];
}
Click! What is wrong with this?
Edit:
An obvious workaround that I use at the moment is: when the player reports it's ready, I do a short 100ms playback with volume=0, then restore currentTime and volume and only then I report to the main app that the player is ready. This way there are no clicks. Interestingly, anything less than 100ms still gives the click.
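For reference, a minimal sketch of that workaround (the method name is mine; the AVPlayer calls are standard):
- (void)primePlayerThenNotifyReady
{
    // Play ~100 ms at zero volume, then rewind and restore the volume.
    player.volume = 0.0;
    [player play];
    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.1 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
        [player pause];
        [player seekToTime:kCMTimeZero completionHandler:^(BOOL finished) {
            player.volume = 1.0;
            // only now report "ready" to the rest of the app
        }];
    });
}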
This seems like an issue with something that's being cached by AVFoundation after the first playback. It's neither the tracks, as they are available when I set the fade in params, nor the seek status.

Decode audio samples from hls stream on ios?

I am trying to decode audio samples from a remote HLS (m3u8) stream on an iOS device for further processing of the data, e.g. record to a file.
As reference stream http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8 is used.
By using an AVURLAsset in combination with the AVPlayer I am able to show the video as preview on a CALayer.
I can also get the raw video data (CVPixelBuffer) by using AVPlayerItemVideoOutput. The audio is audible over the speaker of the iOS device as well.
This is the code I am using at the moment for the AVURLAsset and AVPlayer:
NSURL* url = [NSURL URLWithString:@"http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8"];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];

NSString *tracksKey = @"tracks";
[asset loadValuesAsynchronouslyForKeys:@[tracksKey] completionHandler: ^{
    dispatch_async(dispatch_get_main_queue(), ^{
        NSError* error = nil;
        AVKeyValueStatus status = [asset statusOfValueForKey:tracksKey error:&error];
        if (status == AVKeyValueStatusLoaded) {
            NSDictionary *settings = @
            {
                (id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA),
                @"IOSurfaceOpenGLESTextureCompatibility": @YES,
                @"IOSurfaceOpenGLESFBOCompatibility": @YES,
            };
            AVPlayerItemVideoOutput* output = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:settings];
            AVPlayerItem* playerItem = [AVPlayerItem playerItemWithAsset:asset];
            [playerItem addOutput:output];

            AVPlayer* player = [AVPlayer playerWithPlayerItem:playerItem];
            [player setVolume: 0.0]; // no preview audio

            self.playerItem = playerItem;
            self.player = player;
            self.playerItemVideoOutput = output;

            AVPlayerLayer* playerLayer = [AVPlayerLayer playerLayerWithPlayer: player];
            [self.preview.layer addSublayer: playerLayer];
            [playerLayer setFrame: self.preview.bounds];
            [playerLayer setVideoGravity: AVLayerVideoGravityResizeAspectFill];
            [self setPlayerLayer: playerLayer];

            [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(playerItemNewAccessLogEntry:) name:AVPlayerItemNewAccessLogEntryNotification object:self.playerItem];

            [_player addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:&PlayerStatusContext];
            [_playerItem addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:&PlayerItemStatusContext];
            [_playerItem addObserver:self forKeyPath:@"tracks" options:0 context:nil];
        }
    });
}];
-(void) observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
    if (self.player.status == AVPlayerStatusReadyToPlay && context == &PlayerStatusContext) {
        [self.player play];
    }
}
To get the raw video data of the HLS stream I use:
CVPixelBufferRef buffer = [self.playerItemVideoOutput copyPixelBufferForItemTime:self.playerItem.currentTime itemTimeForDisplay:nil];
if (!buffer) {
    return;
}

CMSampleBufferRef newSampleBuffer = NULL;
CMSampleTimingInfo timingInfo = kCMTimingInfoInvalid;
timingInfo.duration = CMTimeMake(33, 1000);
int64_t ts = timestamp * 1000.0; // `timestamp` comes from the surrounding code (not shown here)
timingInfo.decodeTimeStamp = CMTimeMake(ts, 1000);
timingInfo.presentationTimeStamp = timingInfo.decodeTimeStamp;

CMVideoFormatDescriptionRef videoInfo = NULL;
CMVideoFormatDescriptionCreateForImageBuffer(NULL, buffer, &videoInfo);

CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault,
                                   buffer,
                                   true,
                                   NULL,
                                   NULL,
                                   videoInfo,
                                   &timingInfo,
                                   &newSampleBuffer);

// do something here with sample buffer...

CFRelease(buffer);
CFRelease(newSampleBuffer);
Now I would like to get access to the raw audio data as well, but have had no luck so far.
I tried to use MTAudioProcessingTap as described here:
http://venodesigns.net/2014/01/08/recording-live-audio-streams-on-ios/
Unfortunately I could not get this to work properly. I succeeded in getting access to the underlying assetTrack of the AVPlayerItem, but the callback methods "prepare" and "process" of the MTAudioProcessingTap are never called. I am not sure if I am on the right track here.
AVPlayer is playing the audio of the stream via the speaker, so internally the audio seems to be available as raw audio data. Is it possible to get access to the raw audio data? If it is not possible with AVPlayer, are there any other approaches?
If possible, I would not like to use ffmpeg, because the hardware decoder of the iOS device should be used for the decoding of the stream.

iOS 7 AVPlayer AVPlayerItem duration incorrect in iOS 7

I have the following code in my app:
NSURL *url = [NSURL fileURLWithPath: [self.DocDir stringByAppendingPathComponent: self.FileName] isDirectory: NO];
self.avPlayer = [AVPlayer playerWithURL: url];
Float64 duration = CMTimeGetSeconds(self.avPlayer.currentItem.duration);
This worked fine with iOS 6 but with iOS 7 for some reason it returns NaN. When inspecting self.avPlayer.currentItem.duration the CMTime object has 0's with a flag of 17.
Interestingly the player works fine, just the duration is wrong.
Has anyone else experienced the same issues? I am importing the following:
#import <Foundation/Foundation.h>
#import <AVFoundation/AVFoundation.h>
#import <MediaPlayer/MediaPlayer.h>
#import <CoreMedia/CoreMedia.h>
#import <AVFoundation/AVAsset.h>
After playing around with different ways of initializing the objects I arrived at a working solution:
AVURLAsset *asset = [AVURLAsset assetWithURL: url];
Float64 duration = CMTimeGetSeconds(asset.duration);
AVPlayerItem *item = [AVPlayerItem playerItemWithAsset: asset];
self.avPlayer = [[AVPlayer alloc] initWithPlayerItem: item];
It appears the duration value isn't always immediately available from an AVPlayerItem but it seems to work fine with an AVAsset immediately.
In iOS 7, for an AVPlayerItem that has already been created, you can also get the duration from the underlying asset:
CMTimeGetSeconds([[[[self player] currentItem] asset] duration]);
instead of getting it directly from the AVPlayerItem, which gives you a NaN:
CMTimeGetSeconds([[[self player] currentItem] duration]);
The recommended way of doing this, as described in the manual, is to observe the player item's status:
[self.avPlayer.currentItem addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew|NSKeyValueObservingOptionInitial context:nil];
Then, inside observeValueForKeyPath:ofObject:change:context:
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
    // TODO: use either keyPath or context to differentiate between value changes
    if (self.avPlayer.currentItem.status == AVPlayerStatusReadyToPlay) {
        Float64 duration = CMTimeGetSeconds(self.avPlayer.currentItem.duration);
        // ...
    }
}
Also, make sure that you remove the observer when you change the player item:
if (self.avPlayer.currentItem) {
    [self.avPlayer.currentItem removeObserver:self forKeyPath:@"status"];
}
Btw, you can also observe the duration property directly; however, it's been my personal experience that the results aren't as reliable as they should be ;-)
Swift version
You can get the duration from the AVAsset, which is a property of AVPlayerItem:
func getVideoDuration(from player: AVPlayer) -> Double? {
    guard let duration = player.currentItem?.asset.duration else { return nil }
    let durationSeconds = CMTimeGetSeconds(duration)
    return durationSeconds
}
or by creating an AVAsset from scratch:
func getVideoDuration(for videoUrl: URL) -> Double {
    let asset = AVAsset(url: videoUrl)
    let duration = asset.duration
    let durationSeconds = CMTimeGetSeconds(duration)
    return durationSeconds
}
