I am using AVPlayer to stream video from a URL like this:
-(void)initVideo
{
    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithURL:[NSURL URLWithString:[NSString stringWithFormat:@"%@VideoFileURl", Video_BASE_URL]]];
    avPlayer = [AVPlayer playerWithPlayerItem:playerItem];
    AVPlayerLayer *avPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:avPlayer];
    avPlayerLayer.frame = self.imageUserImage.frame;
    avPlayerLayer.videoGravity = AVLayerVideoGravityResize;
    [self.imageUserImage.layer addSublayer:avPlayerLayer];
    [self.avPlayer.currentItem addObserver:self forKeyPath:@"status" options:0 context:nil];
    [avPlayer play];
}
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context {
    if (object == self.avPlayer && [keyPath isEqualToString:@"status"]) {
        if (self.avPlayer.currentItem.status == AVPlayerStatusFailed) {
            NSLog(@"AVPlayer Failed");
        } else if (self.avPlayer.status == AVPlayerStatusReadyToPlay) {
            NSLog(@"AVPlayerStatusReadyToPlay");
        } else if (self.avPlayer.currentItem.status == AVPlayerItemStatusUnknown) {
            NSLog(@"AVPlayer Unknown");
        }
    }
}
Everything works fine and my video loads, but under some conditions the video pauses forever (it hangs on a frame and never continues)!
Problem
I only receive the AVPlayerStatusFailed status at the start of the stream if there is a problem, but I need to be notified of problems that occur while streaming, such as an interruption or a lost connection.
Please help me: how can I handle network errors during streaming (a connection interruption or a change in network status, e.g. reachability)?
UPDATE
I found a solution: I should use AVPlayerItemPlaybackStalledNotification,
but I don't know why I receive it continuously.
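For reference, here is a minimal sketch of how that notification (together with the item's buffering key) can be observed. It assumes the avPlayer property from the code above; the observeStalls / playerItemPlaybackStalled: method names are hypothetical:
// Sketch: register for stall notifications on the current item and watch its buffering state.
-(void)observeStalls
{
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(playerItemPlaybackStalled:)
                                                 name:AVPlayerItemPlaybackStalledNotification
                                               object:self.avPlayer.currentItem];
    // playbackLikelyToKeepUp flips back to YES once enough data has buffered to resume.
    [self.avPlayer.currentItem addObserver:self
                                forKeyPath:@"playbackLikelyToKeepUp"
                                   options:NSKeyValueObservingOptionNew
                                   context:nil];
}

-(void)playerItemPlaybackStalled:(NSNotification *)notification
{
    // Posted whenever playback stalls, e.g. when the buffer runs dry after a dropped connection.
    // On a flaky network it can be posted repeatedly, which would explain the continuous notifications.
    NSLog(@"Playback stalled: %@", notification.object);
}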
Related
I'm creating an iOS app with a streaming player made with AVPlayer. This is my code:
- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    [self InitPlayer];
    [self ReadMetaData];
}

-(void)InitPlayer{
    NSURL *url = [[NSURL alloc] initWithString:@"http://www.fakeurl.com/stream"];
    // create a player view controller
    self.player = [AVPlayer playerWithURL:url];
    self.player.closedCaptionDisplayEnabled = NO;
}

-(void)ReadMetaData{
    [self.player.currentItem addObserver:self forKeyPath:@"timedMetadata" options:NSKeyValueObservingOptionNew context:nil];
}
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context {
    if ([keyPath isEqualToString:@"timedMetadata"])
    {
        AVPlayerItem *playerItem = object;
        for (AVMetadataItem *metadata in playerItem.timedMetadata)
        {
            if ([metadata.commonKey isEqualToString:@"title"]) {
                TitleLabel.text = metadata.stringValue;
            }
        }
    }
}
With this code I can successfully play and pause the stream. I can also print the title of the track.
If I try to log the timedMetadata with something like this:
NSLog(@"%@", self.player.currentItem.timedMetadata);
I retrieve this:
"<AVMetadataItem: 0x15649500, identifier=common/title, keySpace=comn, key class = __NSCFConstantString, key=title, commonKey=title, extendedLanguageTag=(null), dataType=(null), time={21888/44100 = 0.496}, duration={INVALID}, startDate=(null), extras={\n}, value=Keepin-'fake song title>"
Now my question is: for that specific stream URL, is the timedMetadata I logged the only metadata I can retrieve? If so, how can I achieve a more complex player (something like a "go to next track" button, a "go to previous track" button, a history of tracks, etc.)? This is the first time I have worked with streamed data, and I expected audio metadata to contain a lot of information; in practice it seems I can only get the track title. Is there a problem with my code, or is the stream source simply poor in metadata?
I tried to find the best solution for getting metadata from AVPlayer and found this:
-(IBAction)BtnGoClick:(id)sender {
    NSURL *url = [[NSURL alloc] initWithString:@"http://cast.loungefm.com.ua/loungefm"];
    [self setupAVPlayerForURL:url];
}

-(void)setupAVPlayerForURL:(NSURL *)url {
    AVAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
    AVPlayerItem *anItem = [AVPlayerItem playerItemWithAsset:asset];
    player = [AVPlayer playerWithPlayerItem:anItem];
    [player addObserver:self forKeyPath:@"status" options:0 context:nil];
    [anItem addObserver:self forKeyPath:@"timedMetadata" options:NSKeyValueObservingOptionNew context:NULL];
}

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if (object == player && [keyPath isEqualToString:@"status"]) {
        if (player.status == AVPlayerStatusFailed) {
            NSLog(@"AVPlayer Failed");
        } else if (player.status == AVPlayerStatusReadyToPlay) {
            NSLog(@"AVPlayer Ready to Play");
        } else if (player.status == AVPlayerItemStatusUnknown) {
            NSLog(@"AVPlayer Unknown");
        }
    }
    if ([keyPath isEqualToString:@"timedMetadata"])
    {
        AVPlayerItem *playerItem = object;
        for (AVMetadataItem *metadata in playerItem.timedMetadata)
        {
            if ([metadata.commonKey isEqualToString:@"title"]) {
                NSLog(@"%@", metadata.stringValue);
            }
        }
    }
}
Result: the track title is logged each time new timedMetadata arrives.
I want to play a remote .mp4 video in AVPlayer. Here is my code:
NSString *encodedString = [urlString stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding];
NSURL *videoUrl = [[NSURL alloc] initWithString:encodedString];
self.asset = [AVAsset assetWithURL:videoUrl];
self.player = [AVPlayer playerWithURL:videoUrl];
self.player.actionAtItemEnd = AVPlayerActionAtItemEndNone;
self.playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
self.playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;

[self.player addObserver:self forKeyPath:@"status" options:0 context:nil];
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(playerItemDidReachEnd:)
                                             name:AVPlayerItemDidPlayToEndTimeNotification
                                           object:[self.player currentItem]];

self.playerLayer.frame = CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height);
[self.myPlayerView.layer addSublayer:self.playerLayer];
self.videoPlaybackPosition = 0;
[self.player play];
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if (object == self.player && [keyPath isEqualToString:@"status"]) {
        if (self.player.status == AVPlayerStatusFailed) {
            NSLog(@"AVPlayer Failed --> %ld", (long)self.player.currentItem.status);
        } else if (self.player.status == AVPlayerStatusReadyToPlay) {
            NSLog(@"AVPlayerStatusReadyToPlay");
            [self.player play];
        } else if (self.player.status == AVPlayerItemStatusUnknown) {
            NSLog(@"AVPlayer Unknown");
        }
    }
}
AVPlayerStatusReadyToPlay takes some time while the video buffers, and it only fires once.
The same code works fine when I record a video and play it from a temp file. But after uploading the video to the server, when I try to play it from the server the video plays for just 1-2 seconds; after that it gets stuck and nothing happens. I can't figure out where the problem is.
Can anyone please help me?
I want to play a .ts file from a server with:
- (void)play {
    NSURL *url = [NSURL URLWithString:@"http://10.0.0.18/11.ts"];
    player = [[AVPlayer alloc] initWithURL:url];
    [player addObserver:self forKeyPath:@"status" options:0 context:nil];
}

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context {
    if (object == player && [keyPath isEqualToString:@"status"]) {
        if (player.status == AVPlayerStatusReadyToPlay) {
            [player setVolume:1.0];
            [player play];
        } else if (player.status == AVPlayerStatusFailed) {
            // something went wrong. player.error should contain some information
        }
    }
}
But the player does not play the file.
I checked that the file can be downloaded in my browser, and that works perfectly.
Any idea how to fix it?
AVPlayer (like every other class in Apple's frameworks) cannot play standalone TS files.
The only way is to reference the segment from an .m3u8 playlist that you serve via a local HTTP server and play that playlist. Otherwise, you have to convert the .ts file to an .mp4; see TS2MP4 or GPAC.
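For completeness, a minimal sketch of pointing AVPlayer at an HLS playlist instead of the raw .ts segment; the playlist URL below is only a placeholder:
- (void)playHLS {
    // AVPlayer understands .m3u8 (HLS) playlists natively; this URL is hypothetical.
    NSURL *playlistURL = [NSURL URLWithString:@"http://10.0.0.18/stream/playlist.m3u8"];
    player = [[AVPlayer alloc] initWithURL:playlistURL];
    [player addObserver:self forKeyPath:@"status" options:0 context:nil];
    // As in the original code, call [player play] once KVO reports AVPlayerStatusReadyToPlay.
}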
My app currently plays two videos at the same time and it works, but all the SO answers I've seen use a lot of key-value observing code. Is it bad/wrong if I just do the bare minimum listed below?
In viewDidLoad:
AVURLAsset *asset = [AVURLAsset assetWithURL:[NSURL URLWithString:firstVideo.videoURL]];
AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:asset];
self.player1 = [AVPlayer playerWithPlayerItem:item];
[topPlayer setMovieToPlayer:self.player1];
AVURLAsset *asset2 = [AVURLAsset assetWithURL:[NSURL URLWithString:secondVideo.videoURL]];
AVPlayerItem *item2 = [AVPlayerItem playerItemWithAsset:asset2];
self.player2 = [AVPlayer playerWithPlayerItem:item2];
[bottomPlayer setMovieToPlayer:self.player2];
((AVPlayerLayer *)[self.topPlayer layer]).videoGravity = AVLayerVideoGravityResizeAspectFill;
((AVPlayerLayer *)[self.bottomPlayer layer]).videoGravity = AVLayerVideoGravityResizeAspectFill;
[self.player1 play];
[self.player2 play];
The above code is all I use to play a video, and it works fine. Occasionally there is a ~1s delay; how can I wait until both videos are ready to play, and then start them together?
Use KVO to observe each player item's status, and play the videos once both are ready.
[yourPlayerItem addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew context:nil];
and in the KVO callback:
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
    if ([keyPath isEqualToString:@"status"]) {
        AVPlayerStatus status = [change[NSKeyValueChangeNewKey] integerValue];
        switch (status) {
            case AVPlayerStatusUnknown:
                // do something
                break;
            case AVPlayerStatusReadyToPlay:
            {
                // Check which item is ready by comparing the observed object with your player items,
                // e.g. [object isEqual:yourPlayerItem].
                // Record the ready state in a BOOL per item; once both BOOLs are YES, start both players.
            }
                break;
            case AVPlayerStatusFailed:
            {
                AVPlayerItem *playerItem = (AVPlayerItem *)object;
                [self assetFailedToPrepareForPlayback:playerItem.error];
            }
                break;
        }
    }
}
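A minimal sketch of the ready-flag idea described in the comments above, assuming item1/item2 are the two AVPlayerItems and player1/player2 the players from the question; the property names are assumptions:
// Inside the AVPlayerStatusReadyToPlay case:
if ([object isEqual:self.item1]) {
    self.firstReady = YES;
} else if ([object isEqual:self.item2]) {
    self.secondReady = YES;
}
// Start both players only after both items are ready, so neither starts early.
if (self.firstReady && self.secondReady) {
    [self.player1 play];
    [self.player2 play];
}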
I am using AVPlayer to play audio from a URL
In viewDidLoad:
self.playerItem = [AVPlayerItem playerItemWithURL:[NSURL URLWithString:imageText]];
self.player = [AVPlayer playerWithPlayerItem:playerItem];
[player addObserver:self forKeyPath:#"status" options:0 context:nil];
[player play];
Observer
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context {
    if (object == player && [keyPath isEqualToString:@"status"]) {
        if (player.status == AVPlayerStatusReadyToPlay) {
            //[playingLbl setText:@"Playing Audio"];
            NSLog(@"fineee");
            [playBtn setEnabled:YES];
        } else if (player.status == AVPlayerStatusFailed) {
            // something went wrong. player.error should contain some information
            NSLog(@"not fineee");
            NSLog(@"%@", player.error);
        } else if (player.status == AVPlayerItemStatusUnknown) {
            NSLog(@"AVPlayer Unknown");
        }
    }
}
But the player sometimes gets stuck and does not play the audio, yet the status is still AVPlayerStatusReadyToPlay. It never enters the AVPlayerStatusFailed or AVPlayerItemStatusUnknown branches. Since I want to handle AVPlayer's errors, it needs to reach those branches as well. Please help!
You should observe the current item's status. AVPlayer fails because its AVPlayerItem fails: when anything goes wrong, the error appears on the AVPlayerItem first and then propagates to the AVPlayer.
try:
[item addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:nil];
in your observeValueForKeyPath:
if (object == audioPlayer.currentItem && [keyPath isEqualToString:@"status"]) {
    if (audioPlayer.currentItem.status == AVPlayerItemStatusFailed) {
        NSLog(@"------player item failed: %@", audioPlayer.currentItem.error);
    }
}
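In addition to the item's status, an error that occurs mid-playback can be caught with AVPlayerItemFailedToPlayToEndTimeNotification; here is a minimal sketch (the selector name is an assumption):
// e.g. right after creating the item:
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(itemFailedToPlayToEnd:)
                                             name:AVPlayerItemFailedToPlayToEndTimeNotification
                                           object:audioPlayer.currentItem];

- (void)itemFailedToPlayToEnd:(NSNotification *)notification {
    // The underlying error is delivered in the notification's userInfo.
    NSError *error = notification.userInfo[AVPlayerItemFailedToPlayToEndTimeErrorKey];
    NSLog(@"------item failed to play to end: %@", error);
}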
You can take a look at how this is handled in HysteriaPlayer, my open-source project, or use it directly.