I want to use AVURLAsset to play a video file that lives on a server rather than a local file. I have read that AVURLAsset can't be used directly for remote files.
I read another Stack Overflow question,
AVURLAsset cannot load with remote file
which describes a method for using AVURLAsset with remote files, but I am not able to understand it fully. My observer is never called. Can someone please help me? I don't actually want to use AVPlayer to play the video: I am grabbing frames from the AVAsset and rendering them as textures in OpenGL, so I need to do this via AVURLAsset only.
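The method the linked answer describes is, roughly, asynchronous key loading on the asset. Here is a minimal sketch of that pattern (the URL is taken from the code below; this is standard AVFoundation usage, not the linked answer's exact code):
// For a remote file the URL must be built with URLWithString:,
// not fileURLWithPath: (which treats the string as a local path).
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:
    [NSURL URLWithString:@"http://gamooz.com/wildlife.mp4"] options:nil];
[asset loadValuesAsynchronouslyForKeys:@[@"tracks"] completionHandler:^{
    NSError *error = nil;
    if ([asset statusOfValueForKey:@"tracks" error:&error] == AVKeyValueStatusLoaded) {
        // The tracks are loaded; the asset is now safe to use for frame grabbing.
    }
}];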
Here is my code:
-(void)startPlayer
{
    NSURL *url = [NSURL fileURLWithPath:@"http://gamooz.com/wildlife.mp4"];
    pItem = [AVPlayerItem playerItemWithURL:url];
    player = [AVPlayer playerWithPlayerItem:pItem];
    [player play];
    [pItem addObserver:self forKeyPath:@"status" options:0 context:nil];
}
-(void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
    NSLog(@"heyy");
    if ([keyPath isEqualToString:@"status"])
    {
        AVPlayerItem *pItemTemp = (AVPlayerItem *)object;
        if (pItemTemp.status == AVPlayerItemStatusReadyToPlay)
        {
            // now I can use the player item's asset
            asset = (AVURLAsset *)pItemTemp.asset;
        }
    }
}
But the observer never gets called. Why is that?
I also moved the check into another method, to poll whether the player item is ready to play:
-(void)checkForPlayer
{
    if (pItem.status == AVPlayerItemStatusReadyToPlay)
    {
        asset = (AVURLAsset *)pItem.asset;
    }
}
The status never becomes ready to play.
First of all, thank you for your help.
The Questions:
1. I need to test a lot of videos, so I wrote two methods to check them.
2. When a video's status changes during playback I set avPlayer = nil and avPlayerItem = nil, and I also added @autoreleasepool {}, but memory still slowly increases. I used Instruments but did not find any memory leaks, and I have looked at a lot of questions without finding a solution.
3. Here is my code:
-(void)CheckVideoURlCanPlay {
    @autoreleasepool {
        VideoPlayDataObj *videoPlay = [self.needCheckArr objectAtIndex:0];
        NSString *str = videoPlay.videoURL;
        NSURL *url = [NSURL URLWithString:str];
        self.playItem = [AVPlayerItem playerItemWithURL:url];
        self.player = [AVPlayer playerWithPlayerItem:self.playItem];
        self.playLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
        self.playLayer.videoGravity = AVLayerVideoGravityResizeAspect;
        self.playLayer.frame = CGRectMake(100, 80, 100, 100);
        [self.view.layer addSublayer:self.playLayer];
        [self.player play];
        [self.playItem addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:nil];
    }
}
-(void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary<NSString *,id> *)change context:(void *)context {
    @autoreleasepool {
        if ([keyPath isEqualToString:@"status"]) {
            if (self.player.currentItem.status == AVPlayerItemStatusReadyToPlay) {
                @autoreleasepool {
                    [self.player.currentItem.asset cancelLoading];
                    [self.player.currentItem cancelPendingSeeks];
                    [self.player cancelPendingPrerolls];
                    [self.playItem removeObserver:self forKeyPath:@"status"];
                    self.playItem = nil;
                    self.player = nil;
                }
                [self.needCheckArr removeObjectAtIndex:0];
                [NSThread sleepForTimeInterval:5];
                [self CheckVideoURlCanPlay];
            }
        }
    }
}
4. I also tried releasing the current view, but that does not release everything. If you check a large number of videos the memory usage gets high, and eventually the app is killed.
5. I wonder whether loading a video produces dirty memory?
My app's memory at launch is 13 MB. After testing some videos and popping the view, memory stays elevated (see first memory screenshot); after testing some videos again and popping the view, it is higher still (see second memory screenshot).
6. Finally, I hope you can help me. Thank you very much.
It does not look like you are ever removing the AVPlayerLayer from the view:
[self.playLayer removeFromSuperlayer];
Also, I cannot see the point of @autoreleasepool here, especially if you have ARC turned on (which you almost certainly do!). @autoreleasepool is usually only useful when ARC is turned off, or when you are churning through a lot of memory (as in, many megabytes) in a single main-loop invocation and need to control when it gets cleaned up.
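For example, the ready-to-play cleanup in the question's observer could tear down the layer as well; a sketch using the question's own property names:
[self.playItem removeObserver:self forKeyPath:@"status"];
[self.playLayer removeFromSuperlayer]; // remove the layer added in CheckVideoURlCanPlay
self.playLayer = nil;
self.playItem = nil;
self.player = nil;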
I want to play a vod.m3u8 file stored inside the app's home directory. The vod.m3u8 file references index0.ts, index1.ts, and so on, and the .ts files are physically present in the directory.
#EXTM3U
#EXT-X-VERSION:0
#EXT-X-TARGETDURATION:1
#EXT-X-PLAYLIST-TYPE:VOD
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:11,
index0.ts
#EXTINF:10,
index1.ts
#EXTINF:5,
index2.ts
#EXTINF:5,
index4.ts
#EXTINF:10,
index5.ts
#EXTINF:10,
index6.ts
#EXTINF:5,
index7.ts
#EXT-X-ENDLIST
The following is my code to play the video:
-(void)playLocalVideo
{
    NSString *path = [[NSBundle mainBundle] pathForResource:@"vod"
                                                     ofType:@"m3u8"];
    NSURL *url = [[NSURL alloc] initFileURLWithPath:path];
    AVAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
    anItem = [AVPlayerItem playerItemWithAsset:asset];
    player = [AVPlayer playerWithPlayerItem:anItem];
    AVPlayerLayer *layer = [AVPlayerLayer playerLayerWithPlayer:player];
    layer.frame = self.view.layer.bounds;
    UIView *view = [[UIView alloc] initWithFrame:CGRectMake(0, 300, 320, 250)];
    [view.layer addSublayer:layer];
    [self.view addSubview:view];
    [player addObserver:self forKeyPath:@"status" options:0 context:nil];
}
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context
{
    if (object == player && [keyPath isEqualToString:@"status"]) {
        if (player.status == AVPlayerStatusFailed) {
            NSLog(@"AVPlayer Failed");
        } else if (player.status == AVPlayerStatusReadyToPlay) {
            NSLog(@"AVPlayer Ready to Play");
            [player play];
        } else if (player.status == AVPlayerStatusUnknown) {
            NSLog(@"AVPlayer Unknown");
        }
    }
}
The AVPlayer status comes back as ready to play, but it never plays the video. If I play an mp4 file it works; the vod.m3u8 file does not.
Any help is appreciated.
The issue is probably in the values in your M3U8 files.
Your target duration should describe the maximum playback length of your .TS segments. Your EXTINF entries say the segments are up to 11 seconds, so you should set #EXT-X-TARGETDURATION:11 instead of 1.
You probably need a real value for your EXT-X-VERSION. See https://developer.apple.com/library/archive/qa/qa1752/_index.html. Set a minimum value of 2.
An HLS package usually starts with a master manifest that points to variant playlists such as your VOD.m3u8, so you might try creating a master.m3u8 that points to your VOD.m3u8 rather than trying to play your variant playlist directly.
In general, you probably want to start with a working HLS sample and then edit it down to match the properties of your segments, rather than trying to build from scratch and guess at which values are important.
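Putting those fixes together, the playlist header would look something like this (a sketch: the segment entries stay as they are, and the BANDWIDTH value in the optional master playlist is a placeholder):
#EXTM3U
#EXT-X-VERSION:2
#EXT-X-TARGETDURATION:11
#EXT-X-PLAYLIST-TYPE:VOD
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:11,
index0.ts
# ...remaining segments unchanged...
#EXT-X-ENDLIST
And a minimal master.m3u8 that points at the variant playlist:
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=500000
vod.m3u8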
I'm trying to implement a fade-in effect based on AVPlayer + AVAudioMix + AVAudioMixInputParameters. It basically works, except that when playing the audio for the first time after starting my app there is a click at the beginning. Subsequent plays work perfectly, but the first-time glitch is stable and reproducible.
My Play button is enabled only after the AVPlayerItem's status is set to ready, so it's impossible to fire a play method while the player is not ready. In fact it doesn't matter how long I wait after loading the audio file and constructing all the objects.
This happens on OS X, I haven't tested it on iOS (yet).
Note that for this test you need an audio file that starts with sound and not silence. Here is my stripped down code without the GUI part (testFadeIn is the entry point):
static AVPlayer* player;
static void* PlayerItemStatusObserverContext = &PlayerItemStatusObserverContext;
- (void)testFadeIn
{
    AVURLAsset* asset = [AVURLAsset.alloc initWithURL:[NSURL fileURLWithPath:@"Helicopter.m4a"] options:@{AVURLAssetPreferPreciseDurationAndTimingKey: @YES}];
    AVPlayerItem* item = [AVPlayerItem playerItemWithAsset:asset];
    player = [AVPlayer playerWithPlayerItem:item];
    [item addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew context:PlayerItemStatusObserverContext];
}
- (void)observeValueForKeyPath:(NSString*)keyPath ofObject:(id)object change:(NSDictionary*)change context:(void*)context
{
    if (context == PlayerItemStatusObserverContext)
    {
        AVPlayerItemStatus status = (AVPlayerItemStatus)[[change objectForKey:NSKeyValueChangeNewKey] integerValue];
        if (status == AVPlayerItemStatusReadyToPlay)
        {
            [self applyFadeIn];
            [self performSelector:@selector(play:) withObject:nil afterDelay:1.0];
        }
    }
}
- (void)applyFadeIn
{
    assert(player.currentItem.tracks.firstObject);
    AVMutableAudioMixInputParameters* fadeIn = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:player.currentItem.tracks.firstObject];
    [fadeIn setVolume:0 atTime:kCMTimeZero];
    [fadeIn setVolume:1 atTime:CMTimeMake(2, 1)];
    NSMutableArray* paramsArray = [NSMutableArray new];
    [paramsArray addObject:fadeIn];
    AVMutableAudioMix* audioMix = [AVMutableAudioMix audioMix];
    audioMix.inputParameters = paramsArray;
    player.currentItem.audioMix = audioMix;
}

- (void)play:(id)unused
{
    [player play];
}
Click! What is wrong with this?
Edit:
An obvious workaround that I use at the moment: when the player reports it's ready, I do a short 100 ms playback with volume = 0, then restore currentTime and volume, and only then report to the main app that the player is ready. This way there are no clicks. Interestingly, anything less than 100 ms still gives the click.
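A minimal sketch of that workaround (the method name and dispatch-based timing are mine, not the original code):
- (void)warmUpThenReportReady
{
    player.volume = 0;
    [player play];
    // Empirically, anything shorter than ~100 ms still clicks.
    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.1 * NSEC_PER_SEC)),
                   dispatch_get_main_queue(), ^{
        [player pause];
        [player seekToTime:kCMTimeZero];
        player.volume = 1;
        // ...now report to the main app that the player is ready...
    });
}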
This seems like an issue with something that's being cached by AVFoundation after the first playback. It's neither the tracks, as they are available when I set the fade in params, nor the seek status.
I am trying to decode audio samples from a remote HLS (m3u8) stream on an iOS device for further processing, e.g. recording to a file.
The reference stream http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8 is used.
By using an AVURLAsset in combination with the AVPlayer I am able to show the video as preview on a CALayer.
I can also get the raw video data (CVPixelBuffer) by using AVPlayerItemVideoOutput. The audio is audible over the iOS device's speaker as well.
This is the code I am using at the moment for the AVURLAsset and AVPlayer:
NSURL* url = [NSURL URLWithString:@"http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8"];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
NSString *tracksKey = @"tracks";
[asset loadValuesAsynchronouslyForKeys:@[tracksKey] completionHandler: ^{
    dispatch_async(dispatch_get_main_queue(), ^{
        NSError* error = nil;
        AVKeyValueStatus status = [asset statusOfValueForKey:tracksKey error:&error];
        if (status == AVKeyValueStatusLoaded) {
            NSDictionary *settings = @{
                (id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA),
                @"IOSurfaceOpenGLESTextureCompatibility": @YES,
                @"IOSurfaceOpenGLESFBOCompatibility": @YES,
            };
            AVPlayerItemVideoOutput* output = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:settings];
            AVPlayerItem* playerItem = [AVPlayerItem playerItemWithAsset:asset];
            [playerItem addOutput:output];
            AVPlayer* player = [AVPlayer playerWithPlayerItem:playerItem];
            [player setVolume: 0.0]; // no preview audio
            self.playerItem = playerItem;
            self.player = player;
            self.playerItemVideoOutput = output;
            AVPlayerLayer* playerLayer = [AVPlayerLayer playerLayerWithPlayer: player];
            [self.preview.layer addSublayer: playerLayer];
            [playerLayer setFrame: self.preview.bounds];
            [playerLayer setVideoGravity: AVLayerVideoGravityResizeAspectFill];
            [self setPlayerLayer: playerLayer];
            [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(playerItemNewAccessLogEntry:) name:AVPlayerItemNewAccessLogEntryNotification object:self.playerItem];
            [_player addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:&PlayerStatusContext];
            [_playerItem addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:&PlayerItemStatusContext];
            [_playerItem addObserver:self forKeyPath:@"tracks" options:0 context:nil];
        }
    });
}];
-(void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
    if (self.player.status == AVPlayerStatusReadyToPlay && context == &PlayerStatusContext) {
        [self.player play];
    }
}
To get the raw video data of the HLS stream I use:
CVPixelBufferRef buffer = [self.playerItemVideoOutput copyPixelBufferForItemTime:self.playerItem.currentTime itemTimeForDisplay:nil];
if (!buffer) {
    return;
}

CMSampleBufferRef newSampleBuffer = NULL;
CMSampleTimingInfo timingInfo = kCMTimingInfoInvalid;
timingInfo.duration = CMTimeMake(33, 1000);
int64_t ts = timestamp * 1000.0; // 'timestamp' comes from the surrounding code
timingInfo.decodeTimeStamp = CMTimeMake(ts, 1000);
timingInfo.presentationTimeStamp = timingInfo.decodeTimeStamp;

CMVideoFormatDescriptionRef videoInfo = NULL;
CMVideoFormatDescriptionCreateForImageBuffer(NULL, buffer, &videoInfo);

CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault,
                                   buffer,
                                   true,
                                   NULL,
                                   NULL,
                                   videoInfo,
                                   &timingInfo,
                                   &newSampleBuffer);

// do something here with sample buffer...

CFRelease(videoInfo); // release the format description created above
CFRelease(buffer);
CFRelease(newSampleBuffer);
Now I would like to get access to the raw audio data as well, but have had no luck so far.
I tried to use MTAudioProcessingTap as described here:
http://venodesigns.net/2014/01/08/recording-live-audio-streams-on-ios/
Unfortunately I could not get this to work properly. I succeeded in getting access to the underlying assetTrack of the AVPlayerItem, but the "prepare" and "process" callbacks of the MTAudioProcessingTap are never called. I am not sure if I am on the right track here.
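For reference, the tap wiring from that article looks roughly like the sketch below (variable names are mine). One thing worth checking: with an HLS stream the asset's audio tracks can come back empty, which would leave the audio mix without an input track and the callbacks silent.
#import <MediaToolbox/MediaToolbox.h>

// Minimal tap callbacks; process pulls the decoded PCM through the tap.
static void TapInit(MTAudioProcessingTapRef tap, void *clientInfo, void **tapStorageOut) { *tapStorageOut = clientInfo; }
static void TapFinalize(MTAudioProcessingTapRef tap) {}
static void TapPrepare(MTAudioProcessingTapRef tap, CMItemCount maxFrames, const AudioStreamBasicDescription *format) {}
static void TapUnprepare(MTAudioProcessingTapRef tap) {}
static void TapProcess(MTAudioProcessingTapRef tap, CMItemCount numberFrames, MTAudioProcessingTapFlags flags,
                       AudioBufferList *bufferListInOut, CMItemCount *numberFramesOut, MTAudioProcessingTapFlags *flagsOut)
{
    // The raw PCM ends up in bufferListInOut; record or inspect it here.
    MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut, flagsOut, NULL, numberFramesOut);
}

// Attach the tap to the item's first audio track:
AVAssetTrack *audioTrack = [[playerItem.asset tracksWithMediaType:AVMediaTypeAudio] firstObject];
MTAudioProcessingTapCallbacks callbacks = {
    .version = kMTAudioProcessingTapCallbacksVersion_0,
    .clientInfo = NULL,
    .init = TapInit, .finalize = TapFinalize,
    .prepare = TapPrepare, .unprepare = TapUnprepare, .process = TapProcess,
};
MTAudioProcessingTapRef tap = NULL;
if (MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks,
                               kMTAudioProcessingTapCreationFlag_PostEffects, &tap) == noErr) {
    AVMutableAudioMixInputParameters *params =
        [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:audioTrack];
    params.audioTapProcessor = tap;
    AVMutableAudioMix *mix = [AVMutableAudioMix audioMix];
    mix.inputParameters = @[params];
    playerItem.audioMix = mix;
    CFRelease(tap);
}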
AVPlayer is playing the audio of the stream via the speaker, so internally the audio seems to be available as raw audio data. Is it possible to get access to the raw audio data? If it is not possible with AVPlayer, are there any other approaches?
If possible, I would not like to use ffmpeg, because the hardware decoder of the iOS device should be used for the decoding of the stream.
I have the following code in my app:
NSURL *url = [NSURL fileURLWithPath: [self.DocDir stringByAppendingPathComponent: self.FileName] isDirectory: NO];
self.avPlayer = [AVPlayer playerWithURL: url];
Float64 duration = CMTimeGetSeconds(self.avPlayer.currentItem.duration);
This worked fine with iOS 6, but with iOS 7 for some reason it returns NaN. When inspecting self.avPlayer.currentItem.duration, the CMTime value is all zeros with a flags value of 17 (kCMTimeFlags_Valid | kCMTimeFlags_Indefinite, i.e. an indefinite duration).
Interestingly, the player works fine; only the duration is wrong.
Has anyone else experienced the same issues? I am importing the following:
#import <Foundation/Foundation.h>
#import <AVFoundation/AVFoundation.h>
#import <MediaPlayer/MediaPlayer.h>
#import <CoreMedia/CoreMedia.h>
#import <AVFoundation/AVAsset.h>
After playing around with different ways of initializing the objects I arrived at a working solution:
AVURLAsset *asset = [AVURLAsset assetWithURL: url];
Float64 duration = CMTimeGetSeconds(asset.duration);
AVPlayerItem *item = [AVPlayerItem playerItemWithAsset: asset];
self.avPlayer = [[AVPlayer alloc] initWithPlayerItem: item];
It appears the duration value isn't always immediately available from an AVPlayerItem, but it is available from an AVAsset immediately.
In iOS 7, for an already-created AVPlayerItem, you can also get the duration from the underlying asset:
CMTimeGetSeconds([[[[self player] currentItem] asset] duration]);
instead of getting it directly from the AVPlayerItem, which gives you NaN:
CMTimeGetSeconds([[[self player] currentItem] duration]);
The recommended way of doing this, as described in the documentation, is to observe the player item's status:
[self.avPlayer.currentItem addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew|NSKeyValueObservingOptionInitial context:nil];
Then, inside observeValueForKeyPath:ofObject:change:context:
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
    // TODO: use either keyPath or context to differentiate between value changes
    if (self.avPlayer.currentItem.status == AVPlayerItemStatusReadyToPlay) {
        Float64 duration = CMTimeGetSeconds(self.avPlayer.currentItem.duration);
        // ...
    }
}
Also, make sure that you remove the observer when you change the player item:
if (self.avPlayer.currentItem) {
    [self.avPlayer.currentItem removeObserver:self forKeyPath:@"status"];
}
By the way, you can also observe the duration property directly; however, in my personal experience the results aren't as reliable as they should be ;-)
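For completeness, observing duration directly uses the same add/remove pattern as the status observer above:
[self.avPlayer.currentItem addObserver:self forKeyPath:@"duration" options:NSKeyValueObservingOptionNew context:nil];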
Swift version
You can get the duration from the AVAsset, which is a property of AVPlayerItem:
func getVideoDuration(from player: AVPlayer) -> Double? {
    guard let duration = player.currentItem?.asset.duration else { return nil }
    let durationSeconds = CMTimeGetSeconds(duration)
    return durationSeconds
}
or by creating an AVAsset from scratch:
func getVideoDuration(for videoUrl: URL) -> Double {
    let asset = AVAsset(url: videoUrl)
    let duration = asset.duration
    let durationSeconds = CMTimeGetSeconds(duration)
    return durationSeconds
}