iOS AVPlayer slow down

I am using AVPlayer to play an online video in my project. The video plays well. Now I want to reduce/increase the fps of the video. Below is the code I am using:
self.asset = [AVAsset assetWithURL:self.videoUrl];

// the video player
self.player = [AVPlayer playerWithURL:self.videoUrl];
self.player.actionAtItemEnd = AVPlayerActionAtItemEndNone;

self.playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
self.playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;

[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(playerItemDidReachEnd:)
                                             name:AVPlayerItemDidPlayToEndTimeNotification
                                           object:[self.player currentItem]];

self.playerLayer.frame = CGRectMake(0, 0, self.view.frame.size.width, self.myPlayerView.frame.size.height);
[self.myPlayerView.layer addSublayer:self.playerLayer];

- (void)playerItemDidReachEnd:(NSNotification *)notification {
    AVPlayerItem *p = [notification object];
    [p seekToTime:kCMTimeZero];
}
Now how should I reduce/increase the fps for the online video?

You can do something like this to read the video track's frame rate:

- (float)getFrameRateFromAVPlayer
{
    float fps = 0.0f;
    AVAsset *videoAsset = self.queuePlayer.currentItem.asset;
    if (videoAsset) {
        AVAssetTrack *videoATrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] lastObject];
        if (videoATrack) {
            fps = videoATrack.nominalFrameRate;
        }
    }
    return fps;
}
OR
AVPlayerItem *item = self.player.currentItem; // your current item
float fps = 0.0f;
for (AVPlayerItemTrack *track in item.tracks) {
    if ([track.assetTrack.mediaType isEqualToString:AVMediaTypeVideo]) {
        fps = track.currentVideoFrameRate;
    }
}
Hope this will help :)

AVPlayer allows you to set the current playback rate. It accepts a range of values to control the current AVPlayerItem, such as slow forward, fast forward, or reverse playback with negative rates. As stated in the documentation, you should check whether the current item supports those playback modes (canPlaySlowForward, canPlayFastForward, canPlayReverse, and so on) before setting the rate.
Please check it out. The link for your reference: https://developer.apple.com/library/ios/documentation/AVFoundation/Reference/AVPlayer_Class/index.html#//apple_ref/occ/instp/AVPlayer/rate
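For illustration, a minimal sketch (assuming self.player is the AVPlayer from the question); note that this changes playback speed, not the encoded frame rate of the stream:

AVPlayerItem *currentItem = self.player.currentItem;

// Check the item's capabilities before applying a non-standard rate.
if (currentItem.canPlaySlowForward) {
    self.player.rate = 0.5f;   // half speed
}
if (currentItem.canPlayFastForward) {
    self.player.rate = 2.0f;   // double speed
}
// self.player.rate = 1.0f;   // normal speed; 0.0f pauses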

Related

Replaying AVPlayerItem / AVPlayer without re-downloading

I have an AVPlayer class all set up that streams an audio file. It's a bit long, so I can't post the whole thing here. What I am stuck on is how to allow the user to replay the audio file after they have finished listening to it once. When it finishes the first time, I correctly receive a notification AVPlayerItemDidPlayToEndTimeNotification. When I go to replay it, I immediately receive the same notification, which blocks me from replaying it.
How can I reset this such that the AVPlayerItem doesn't think that it has already played the audio file? I could deallocate everything and set it up again, but I believe that would force the user to download the audio file again, which is pointless and slow.
Here are some parts of the class that I think are relevant. The output that I get when attempting to replay the file looks like this. The first two lines are exactly what I would expect, but the third is a surprise.
is playing
no timer
audio player has finished playing audio
- (id)initWithURL:(NSString *)urlString
{
    self = [super init];
    if (self) {
        self.isPlaying = NO;
        self.verbose = YES;
        if (self.verbose) NSLog(@"url: %@", urlString);

        NSURL *url = [NSURL URLWithString:urlString];
        self.playerItem = [AVPlayerItem playerItemWithURL:url];
        self.player = [[AVPlayer alloc] initWithPlayerItem:self.playerItem];

        [self determineAudioPlayTime:self.playerItem];
        self.lengthOfAudioInSeconds = @0.0f;

        [self.player addObserver:self forKeyPath:@"status" options:0 context:nil];
        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(itemDidFinishPlaying:)
                                                     name:AVPlayerItemDidPlayToEndTimeNotification
                                                   object:self.playerItem];
    }
    return self;
}
// This is what gets called when the user clicks the play button after they have
// listened to the file and the AVPlayerItemDidPlayToEndTimeNotification has been received.
- (void)playAgain {
    [self.playerItem seekToTime:kCMTimeZero];
    [self toggleState];
}
- (void)toggleState {
    self.isPlaying = !self.isPlaying;
    if (self.isPlaying) {
        if (self.verbose) NSLog(@"is playing");
        [self.player play];
        if (!timer) {
            NSLog(@"no timer");
            CMTime audioTimer = CMTimeMake(0, 1);
            [self.player seekToTime:audioTimer];
            timer = [NSTimer scheduledTimerWithTimeInterval:1.0
                                                     target:self
                                                   selector:@selector(updateProgress)
                                                   userInfo:nil
                                                    repeats:YES];
        }
    } else {
        if (self.verbose) NSLog(@"paused");
        [self.player pause];
    }
}
- (void)itemDidFinishPlaying:(NSNotification *)notification {
    if (self.verbose) NSLog(@"audio player has finished playing audio");
    [[NSNotificationCenter defaultCenter] postNotificationName:@"audioFinished" object:self];
    [timer invalidate];
    timer = nil;
    self.totalSecondsPlayed = [NSNumber numberWithInt:0];
    self.isPlaying = NO;
}
You can call the seekToTime method when your player receives the AVPlayerItemDidPlayToEndTimeNotification:
func itemDidFinishPlaying() {
    self.player.seek(to: CMTime.zero)
    self.player.play()
}
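For reference, here is an Objective-C sketch of the same idea, matching the property names used in the question's class; seekToTime:completionHandler: is used so play is only called once the rewind has finished:

[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(itemDidFinishPlaying:)
                                             name:AVPlayerItemDidPlayToEndTimeNotification
                                           object:self.playerItem];

- (void)itemDidFinishPlaying:(NSNotification *)notification
{
    // Rewind the existing item instead of recreating it, so the already
    // buffered audio is reused rather than downloaded again.
    [self.player seekToTime:kCMTimeZero completionHandler:^(BOOL finished) {
        if (finished) {
            [self.player play];
        }
    }];
}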
Apple recommends using an AVQueuePlayer with an AVPlayerLooper.
Here's Apple's (slightly revised) sample code:
AVQueuePlayer *queuePlayer = [[AVQueuePlayer alloc] init];
AVAsset *asset = // AVAsset with its 'duration' property value loaded
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];

// Create a new player looper with the queue player and template item
self.playerLooper = [AVPlayerLooper playerLooperWithPlayer:queuePlayer
                                              templateItem:playerItem];
// Begin looping playback
[queuePlayer play];
The AVPlayerLooper does all the event listening and replaying for you, and the queue player is used to create what Apple calls a "treadmill pattern". This pattern essentially chains multiple instances of the same AVPlayerItem in a queue player and moves each finished item back to the beginning of the queue.
The advantage of this approach is that it enables the framework to preroll the next asset (which is the same asset in this case, but its start still needs prerolling) before it arrives, reducing latency between the asset's end and looped start.
This is described in greater detail at ~15:00 in the video here: https://developer.apple.com/videos/play/wwdc2016/503/
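If the loop later needs to be stopped (for example when the view disappears), the looper can be told to stop scheduling new iterations:

// Stop adding new loop iterations; the current pass plays through to its end.
[self.playerLooper disableLooping];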

Decode audio samples from hls stream on ios?

I am trying to decode audio samples from a remote HLS (m3u8) stream on an iOS device for further processing of the data, e.g. record to a file.
The reference stream http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8 is used.
By using an AVURLAsset in combination with an AVPlayer I am able to show the video as a preview on a CALayer.
I can also get the raw video data (CVPixelBuffer) by using AVPlayerItemVideoOutput. The audio is audible over the speaker of the iOS device as well.
This is the code I am using at the moment for the AVURLAsset and AVPlayer:
NSURL *url = [NSURL URLWithString:@"http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8"];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
NSString *tracksKey = @"tracks";

[asset loadValuesAsynchronouslyForKeys:@[tracksKey] completionHandler:^{
    dispatch_async(dispatch_get_main_queue(), ^{
        NSError *error = nil;
        AVKeyValueStatus status = [asset statusOfValueForKey:tracksKey error:&error];
        if (status == AVKeyValueStatusLoaded) {
            NSDictionary *settings = @{
                (id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA),
                @"IOSurfaceOpenGLESTextureCompatibility": @YES,
                @"IOSurfaceOpenGLESFBOCompatibility": @YES,
            };
            AVPlayerItemVideoOutput *output = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:settings];
            AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
            [playerItem addOutput:output];

            AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
            [player setVolume:0.0]; // no preview audio
            self.playerItem = playerItem;
            self.player = player;
            self.playerItemVideoOutput = output;

            AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
            [self.preview.layer addSublayer:playerLayer];
            [playerLayer setFrame:self.preview.bounds];
            [playerLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
            [self setPlayerLayer:playerLayer];

            [[NSNotificationCenter defaultCenter] addObserver:self
                                                     selector:@selector(playerItemNewAccessLogEntry:)
                                                         name:AVPlayerItemNewAccessLogEntryNotification
                                                       object:self.playerItem];
            [_player addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:&PlayerStatusContext];
            [_playerItem addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:&PlayerItemStatusContext];
            [_playerItem addObserver:self forKeyPath:@"tracks" options:0 context:nil];
        }
    });
}];

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
    if (self.player.status == AVPlayerStatusReadyToPlay && context == &PlayerStatusContext) {
        [self.player play];
    }
}
To get the raw video data of the HLS stream I use:
CVPixelBufferRef buffer = [self.playerItemVideoOutput copyPixelBufferForItemTime:self.playerItem.currentTime itemTimeForDisplay:nil];
if (!buffer) {
    return;
}

CMSampleBufferRef newSampleBuffer = NULL;
CMSampleTimingInfo timingInfo = kCMTimingInfoInvalid;
timingInfo.duration = CMTimeMake(33, 1000);
// 'timestamp' (the current playback time in seconds) is defined elsewhere in the class
int64_t ts = timestamp * 1000.0;
timingInfo.decodeTimeStamp = CMTimeMake(ts, 1000);
timingInfo.presentationTimeStamp = timingInfo.decodeTimeStamp;

CMVideoFormatDescriptionRef videoInfo = NULL;
CMVideoFormatDescriptionCreateForImageBuffer(NULL, buffer, &videoInfo);

CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault,
                                   buffer,
                                   true,
                                   NULL,
                                   NULL,
                                   videoInfo,
                                   &timingInfo,
                                   &newSampleBuffer);

// do something here with sample buffer...
CFRelease(buffer);
CFRelease(newSampleBuffer);
Now I would like to get access to the raw audio data as well, but have had no luck so far.
I tried to use MTAudioProcessingTap as described here:
http://venodesigns.net/2014/01/08/recording-live-audio-streams-on-ios/
Unfortunately I could not get this to work properly. I succeeded in getting access to the underlying assetTrack of the AVPlayerItem, but the "prepare" and "process" callbacks of the MTAudioProcessingTap are never called. I am not sure if I am on the right track here.
AVPlayer is playing the audio of the stream through the speaker, so internally the audio seems to be available as raw audio data. Is it possible to get access to that raw audio data? If it is not possible with AVPlayer, are there other approaches?
If possible, I would prefer not to use ffmpeg, because the hardware decoder of the iOS device should be used for decoding the stream.
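For reference, this is a minimal sketch of the standard MTAudioProcessingTap wiring for an AVPlayerItem (the file/progressive-download case; with HLS the asset often exposes no audio tracks up front, which may be exactly why the callbacks above never fire):

#import <AVFoundation/AVFoundation.h>
#import <MediaToolbox/MediaToolbox.h>

// Process callback: pulls decoded PCM from the tap into bufferListInOut.
static void tapProcess(MTAudioProcessingTapRef tap, CMItemCount numberFrames,
                       MTAudioProcessingTapFlags flags,
                       AudioBufferList *bufferListInOut,
                       CMItemCount *numberFramesOut,
                       MTAudioProcessingTapFlags *flagsOut)
{
    OSStatus status = MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut,
                                                         flagsOut, NULL, numberFramesOut);
    if (status == noErr) {
        // bufferListInOut now holds raw PCM samples: record or analyze them here.
    }
}

// Hypothetical helper; call it once the item's asset actually exposes an audio track.
- (void)attachAudioTapToItem:(AVPlayerItem *)item
{
    AVAssetTrack *audioTrack = [[item.asset tracksWithMediaType:AVMediaTypeAudio] firstObject];
    if (!audioTrack) return; // typical for HLS: no asset-level tracks

    MTAudioProcessingTapCallbacks callbacks = {0};
    callbacks.version = kMTAudioProcessingTapCallbacksVersion_0;
    callbacks.clientInfo = (__bridge void *)self;
    callbacks.process = tapProcess;

    MTAudioProcessingTapRef tap = NULL;
    OSStatus err = MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks,
                                              kMTAudioProcessingTapCreationFlag_PostEffects, &tap);
    if (err != noErr || tap == NULL) return;

    AVMutableAudioMixInputParameters *params =
        [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:audioTrack];
    params.audioTapProcessor = tap;

    AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
    audioMix.inputParameters = @[params];
    item.audioMix = audioMix;
    CFRelease(tap); // the audio mix retains the tap
}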

forwardPlaybackEndTime does not work

AVPlayerItem has this property forwardPlaybackEndTime:

The value indicates the time at which playback should end when the playback rate is positive (see AVPlayer's rate property). The default value is kCMTimeInvalid, which indicates that no end time for forward playback is specified. In this case, the effective end time for forward playback is the item's duration.

But I don't know why it does not work. I tried to set it in AVPlayerItemStatusReadyToPlay, in the duration-available callback, and so on, but it has no effect; the item just plays to the end.
I think forwardPlaybackEndTime is meant to restrict the playhead, right?
In my app, I want to play only the first half of the movie.
My code looks like this
- (void)playURL:(NSURL *)URL
{
    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithURL:URL];
    if (self.avPlayer) {
        if (self.avPlayer.currentItem && self.avPlayer.currentItem != playerItem) {
            [self.avPlayer replaceCurrentItemWithPlayerItem:playerItem];
        }
    } else {
        [self setupAVPlayerWithPlayerItem:playerItem];
    }
    playerItem.forwardPlaybackEndTime = CMTimeMake(5, 1);

    // Play
    [self.avPlayer play];
}
How to make forwardPlaybackEndTime work?
Try this:
playerItem.forwardPlaybackEndTime = CMTimeMake(5, 1);
Here 5 is the time (in seconds) up to which the AVPlayerItem will play.
Set the following on your AVPlayer:
self.avPlayer.actionAtItemEnd = AVPlayerActionAtItemEndNone;
Then register for the notification:
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(playerItemDidPlayToEndTime:) name:AVPlayerItemDidPlayToEndTimeNotification object:nil];
with the handler method, obviously:
- (void)playerItemDidPlayToEndTime:(NSNotification *)notification
{
    // do something here
}
Then set your forwardPlaybackEndTime:
self.avPlayer.currentItem.forwardPlaybackEndTime = CMTimeAdd(self.avPlayer.currentItem.currentTime, CMTimeMake(5, 1));
and start your AVPlayer playing:
self.avPlayer.rate = 1.0;
The notification will be triggered and your track will keep playing. In your handler you can stop it, do a seekToTime:, or whatever else you need.
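For example, a small sketch of that handler (assuming self.avPlayer, as above):

- (void)playerItemDidPlayToEndTime:(NSNotification *)notification
{
    // Stop at the boundary and rewind; adjust to whatever your app needs.
    [self.avPlayer pause];
    [self.avPlayer seekToTime:kCMTimeZero];
}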
Alternatively you can just set a boundary observer
NSArray *array = [NSArray arrayWithObject:[NSValue valueWithCMTime:CMTimeMakeWithSeconds(5.0, 1)]];
__weak OTHD_AVPlayer *weakself = self;
self.observer_End = [self addBoundaryTimeObserverForTimes:array queue:NULL usingBlock:^{
    if (weakself.rate >= 0.0) [weakself endBoundaryHit];
}];
I have checked the code below; it runs, and streaming stops at the specified time:
- (void)playURL
{
    NSURL *url = [NSURL URLWithString:@"http://clips.vorwaerts-gmbh.de/VfE_html5.mp4"];
    self.playerItem = [AVPlayerItem playerItemWithURL:url];
    self.playerItem.forwardPlaybackEndTime = CMTimeMake(10, 1);
    self.avPlayer = [AVPlayer playerWithPlayerItem:self.playerItem];
    [videoView setPlayer:self.avPlayer];

    // Play
    [self.avPlayer play];
}
I hope this will help you.
Also please check these tutorials: AVFoundation Framework
I think that you have to update the avPlayer's current playerItem.
So instead of:
playerItem.forwardPlaybackEndTime = CMTimeMake(5, 1);
it should be:
self.avPlayer.currentItem.forwardPlaybackEndTime = CMTimeMake(5, 1);

HTTP live stream AVAsset

I am implementing an HTTP Live Streaming player on OS X using AVPlayer.
I am able to stream it properly, seek, get the duration and timing, etc.
Now I want to take screenshots and process the frames from it using OpenCV.
I went with AVAssetImageGenerator, but there are no audio or video tracks on the AVAsset associated with player.currentItem.
The tracks do appear in player.currentItem.tracks.
So I am not able to use AVAssetImageGenerator. Can anybody help find a solution to extract screenshots and individual frames in such a scenario?
Please find below the code showing how I am initiating the HTTP live stream.
Thanks in advance.
NSURL *url = [NSURL URLWithString:@"http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8"];
playeritem = [AVPlayerItem playerItemWithURL:url];
[playeritem addObserver:self forKeyPath:@"status" options:0 context:AVSPPlayerStatusContext];

[self setPlayer:[AVPlayer playerWithPlayerItem:playeritem]];
[self addObserver:self forKeyPath:@"player.rate" options:NSKeyValueObservingOptionNew context:AVSPPlayerRateContext];
[self addObserver:self forKeyPath:@"player.currentItem.status" options:NSKeyValueObservingOptionNew context:AVSPPlayerItemStatusContext];

AVPlayerLayer *newPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:[self player]];
[newPlayerLayer setFrame:[[[self playerView] layer] bounds]];
[newPlayerLayer setAutoresizingMask:kCALayerWidthSizable | kCALayerHeightSizable];
[newPlayerLayer setHidden:YES];
[[[self playerView] layer] addSublayer:newPlayerLayer];
[self setPlayerLayer:newPlayerLayer];
[self addObserver:self forKeyPath:@"playerLayer.readyForDisplay" options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew context:AVSPPlayerLayerReadyForDisplay];

[self.player play];
The following is how I am checking whether a video track is present in the asset:
case AVPlayerItemStatusReadyToPlay:
    [self setTimeObserverToken:[[self player] addPeriodicTimeObserverForInterval:CMTimeMake(1, 10) queue:dispatch_get_main_queue() usingBlock:^(CMTime time) {
        [[self timeSlider] setDoubleValue:CMTimeGetSeconds(time)];
        NSLog(@"%f,%f,%f", [self currentTime], [self duration], [[self player] rate]);

        AVPlayerItem *item = playeritem;
        if (item.status == AVPlayerItemStatusReadyToPlay) {
            AVAsset *asset = (AVAsset *)item.asset;
            long audiotracks = [[asset tracks] count];
            long videotracks = [[asset availableMediaCharacteristicsWithMediaSelectionOptions] count];
            NSLog(@"Track info Audio = %ld, Video = %ld", audiotracks, videotracks);
        }
    }]];
AVPlayerItem *item = self.player.currentItem;
if (item.status != AVPlayerItemStatusReadyToPlay)
    return;

AVURLAsset *asset = (AVURLAsset *)item.asset;
long audiotracks = [[asset tracksWithMediaType:AVMediaTypeAudio] count];
long videotracks = [[asset tracksWithMediaType:AVMediaTypeVideo] count];
NSLog(@"Track info Audio = %ld, Video = %ld", audiotracks, videotracks);
This is an older question, but in case someone needs help with it, I have an answer.
AVURLAsset *asset = /* Your Asset here! */;
AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
generator.requestedTimeToleranceAfter = kCMTimeZero;
generator.requestedTimeToleranceBefore = kCMTimeZero;

for (Float64 i = 0; i < CMTimeGetSeconds(asset.duration) * /* Put the FPS of the source video here */; i++) {
    @autoreleasepool {
        CMTime time = CMTimeMake(i, /* Put the FPS of the source video here */);
        NSError *err;
        CMTime actualTime;
        CGImageRef image = [generator copyCGImageAtTime:time actualTime:&actualTime error:&err];
        // Do what you want with the image, for example save it as UIImage
        UIImage *generatedImage = [[UIImage alloc] initWithCGImage:image];
        CGImageRelease(image);
    }
}
You can easily get the FPS of a Video by using this code:
float fps = 0.0f;
if (asset) {
    AVAssetTrack *videoATrack = [asset tracksWithMediaType:AVMediaTypeVideo][0];
    if (videoATrack) {
        fps = [videoATrack nominalFrameRate];
    }
}
Hope that helps anyone asking how to get all frames from a video, or just specific frames (with CMTime, for example). Please bear in mind that saving all frames to an array can have a heavy memory impact!
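Note that AVAssetImageGenerator generally cannot produce images from an HLS stream, because the streamed AVAsset exposes no regular tracks (which is exactly what the question observes). For the live-stream case, one option is to attach an AVPlayerItemVideoOutput to the player item (as in the earlier HLS question) and copy the current pixel buffer on demand. A minimal sketch, with illustrative property names (self.player, self.videoOutput):

// Assumes self.videoOutput (an AVPlayerItemVideoOutput) was added to the player item
// before playback started.
- (CGImageRef)copyCurrentFrameImage
{
    CMTime now = self.player.currentItem.currentTime;
    if (![self.videoOutput hasNewPixelBufferForItemTime:now]) return NULL;

    CVPixelBufferRef pixelBuffer = [self.videoOutput copyPixelBufferForItemTime:now
                                                              itemTimeForDisplay:nil];
    if (!pixelBuffer) return NULL;

    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    CIContext *context = [CIContext contextWithOptions:nil];
    CGRect rect = CGRectMake(0, 0,
                             CVPixelBufferGetWidth(pixelBuffer),
                             CVPixelBufferGetHeight(pixelBuffer));
    CGImageRef image = [context createCGImage:ciImage fromRect:rect]; // caller must release
    CVPixelBufferRelease(pixelBuffer);
    return image;
}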

MPMoviePlayerController interrupts AirPlay iPod audio, but it works fine on the device

I'm using MPMoviePlayerController to play a video clip that loops. In the app delegate, I set my AVAudioSession's category to AVAudioSessionCategoryAmbient so that my video clip doesn't interrupt iPod audio.
This works great when playing iPod audio on the device, but when using AirPlay, the audio is briefly interrupted every time my looping video starts over, which is pretty frequent (and annoying).
In AppDelegate.m:
AVAudioSession *audioSession = [AVAudioSession sharedInstance];
[audioSession setCategory:AVAudioSessionCategoryAmbient error:nil];
In my video view controller:
self.videoPlayer = [[MPMoviePlayerController alloc] initWithContentURL:url];
self.videoPlayer.useApplicationAudioSession = YES;
self.videoPlayer.controlStyle = MPMovieControlStyleNone;
self.videoPlayer.repeatMode = MPMovieRepeatModeOne;
I've scoured the net and can't seem to find answers. Any advice would be great. Thanks!
Problem solved! I just had to use AVPlayer instead of MPMoviePlayerController.
self.avPlayer = [AVPlayer playerWithURL:url];
self.avPlayer.actionAtItemEnd = AVPlayerActionAtItemEndNone;
self.avPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:self.avPlayer];
self.avPlayerLayer.frame = self.view.bounds;
[self.view.layer addSublayer:self.avPlayerLayer];
[self.avPlayer play];
And to make it loop:
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(movieDidFinish:)
                                             name:AVPlayerItemDidPlayToEndTimeNotification
                                           object:[self.avPlayer currentItem]];
...
- (void)movieDidFinish:(NSNotification *)notification {
    [self.avPlayer seekToTime:kCMTimeZero];
}
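If the looping clip still needs to mix with iPod/AirPlay audio, the ambient audio session from the original setup is still required; a minimal sketch:

NSError *sessionError = nil;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryAmbient error:&sessionError];
[[AVAudioSession sharedInstance] setActive:YES error:&sessionError];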
