Load two AVPlayers with one video - iOS

I have two different views that are meant to play the same video; I am creating an app that will switch several times between the two views while the video is running.
I currently load the video in the first view as follows:
NSURL *url = [NSURL URLWithString:@"http://[URL TO VIDEO HERE]"];
AVURLAsset *avasset = [[AVURLAsset alloc] initWithURL:url options:nil];
AVPlayerItem *item = [[AVPlayerItem alloc] initWithAsset:avasset];
player = [[AVPlayer alloc] initWithPlayerItem:item];
playerLayer = [[AVPlayerLayer playerLayerWithPlayer:player] retain];
CGSize size = self.bounds.size;
float x = size.width/2.0 - 202.0;
float y = size.height/2.0 - 100;
//[player play];
playerLayer.frame = CGRectMake(x, y, 404, 200);
playerLayer.backgroundColor = [UIColor blackColor].CGColor;
[self.layer addSublayer:playerLayer];

NSString *tracksKey = @"tracks";
[avasset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:tracksKey] completionHandler:^{
    dispatch_async(dispatch_get_main_queue(), ^{
        NSError *error = nil;
        AVKeyValueStatus status = [avasset statusOfValueForKey:tracksKey error:&error];
        if (status == AVKeyValueStatusLoaded) {
            //videoInitialized = YES;
            [player play];
        }
        else {
            // You should deal with the error appropriately.
            NSLog(@"The asset's tracks were not loaded:\n%@", [error localizedDescription]);
        }
    });
}];
In my second view I want to load the video from the same dispatch_get_main_queue block so that the video in both views stays in sync.
I was hoping someone could help me out with loading the data of the video from the first view into the second view.

It is very simple:
Init the first player:
AVAsset *asset = [AVAsset assetWithURL:URL];
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
Create the second player in the same way, BUT use the same asset from the first one.
I have verified that it works.
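For example, a minimal sketch of the second player (the view name is just a placeholder, not from the question):
// Second player: a new item and layer, but the SAME asset instance.
AVPlayerItem *secondItem = [AVPlayerItem playerItemWithAsset:asset];
AVPlayer *secondPlayer = [AVPlayer playerWithPlayerItem:secondItem];
AVPlayerLayer *secondLayer = [AVPlayerLayer playerLayerWithPlayer:secondPlayer];
secondLayer.frame = self.secondView.bounds;   // "secondView" stands in for your other view
[self.secondView.layer addSublayer:secondLayer];
Sharing the asset avoids loading the resource twice; keeping the two players at the same time position is still up to you (e.g. by seeking both to the same CMTime).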
All the info you need is on the Apple page:
https://developer.apple.com/library/mac/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/02_Playback.html
"This abstraction means that you can play a given asset using different players simultaneously"
This quote is from that page.

I don't think you will be able to get this approach to work. Videos are decoded in hardware and then the graphics buffer is sent to the graphics card. What you seem to want to do is decode a video in one view, then capture the contents of the first view and show it in a second view. That will not stay in sync, because it would take time to capture the contents of the first window back into main memory, and then those contents would need to be sent to the video card again. Basically, that is not going to work. You also cannot decode two h.264 video streams and expect them to be in sync.
You could implement this with another approach entirely: decode the h.264 video to frames on disk (save each frame as a PNG), then write your own loop that decodes the Nth PNG in the series and displays the result in the two different windows. That will work fast enough to be an effective implementation on the newer iPhone 4 and 5 and iPad 2 and 3. If you want a more advanced implementation, take a look at my AVAnimator library for iOS; you could get this approach working in 20 minutes if you use the existing code.
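As a rough illustration of the frame-loop idea (this is not AVAnimator itself; the property names, frame rate, and file naming are all assumptions):
// Display the Nth decoded frame in two views at once, driven by a timer.
// Assumes frames were already written to the caches directory as frame0001.png, frame0002.png, ...
- (void)startFrameLoop {
    self.frameIndex = 1;
    self.frameTimer = [NSTimer scheduledTimerWithTimeInterval:1.0 / 24.0   // assumed 24 fps
                                                       target:self
                                                     selector:@selector(showNextFrame)
                                                     userInfo:nil
                                                      repeats:YES];
}

- (void)showNextFrame {
    NSString *cachesDir = [NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES) firstObject];
    NSString *frameName = [NSString stringWithFormat:@"frame%04d.png", (int)self.frameIndex];
    self.frameIndex += 1;
    UIImage *frame = [UIImage imageWithContentsOfFile:[cachesDir stringByAppendingPathComponent:frameName]];
    if (frame == nil) {
        [self.frameTimer invalidate];   // ran out of frames
        return;
    }
    // The same decoded image is assigned to both views, so they cannot drift out of sync.
    self.firstImageView.image = frame;
    self.secondImageView.image = frame;
}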

For this ten-year-old question, which has only ten-year-old answers that are out of date, here's the up-to-date answer.
var leadPlayer: AVPlayer // ... the lead player you want to dupe
This does not work:
let leadPlayerItem: AVPlayerItem = leadPlayer.currentItem!
yourPlayer = AVPlayer(playerItem: leadPlayerItem)
yourPlayer.play()
Apple does not allow that (try it, see error).
This works. You must use the item:
let dupeItem: AVPlayerItem = AVPlayerItem(asset: leadPlayer.currentItem!.asset)
yourPlayer = AVPlayer(playerItem: dupeItem)
yourPlayer.play()
Fortunately it's now that easy.

Related

MP3 Queue Player - Load in background thread?

I have an AVQueuePlayer that is used to play a list of MP3 songs from the internet (HTTP). I also need to know which song is currently playing. The current problem is that loading a song causes a delay that blocks the main thread while waiting for the song to load (for the first song as well as for subsequent songs after the previous one has completed playback).
The following code blocks the main thread:
queuePlayer = [[AVQueuePlayer alloc] init];
[queuePlayer insertItem: [AVPlayerItem playerItemWithURL:url] afterItem: nil]; // etc.
[queuePlayer play];
I am looking for a way to create a playlist of MP3s where the next file to be played back is preloaded in the background.
I tried the following code:
NSArray* tracks = [NSArray arrayWithObjects:@"http://example.com/song1.mp3", @"http://example.com/song2.mp3", @"http://example.com/song3.mp3", nil];
for (NSString* trackName in tracks)
{
    AVURLAsset* audioAsset = [[AVURLAsset alloc] initWithURL:[NSURL URLWithString:trackName]
                                                     options:nil];
    AVMutableCompositionTrack* audioTrack = [_composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                      preferredTrackID:kCMPersistentTrackID_Invalid];
    NSError* error;
    [audioTrack insertTimeRange:CMTimeRangeMake([_composition duration], audioAsset.duration)
                        ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                         atTime:kCMTimeZero
                          error:&error];
    if (error)
    {
        NSLog(@"%@", [error localizedDescription]);
    }
    // Store the track IDs as track name -> track ID
    [_audioMixTrackIDs setValue:[NSNumber numberWithInteger:audioTrack.trackID]
                         forKey:trackName];
}
_player = [[AVPlayer alloc] initWithPlayerItem:playerItem];
[_player play];
The issue with this is that I am not sure how to detect when the next song starts playing. Also, the docs don't specify whether this will pre-load the MP3 files.
I am looking for a solution that:
Plays MP3s by pre-loading them in the background prior to playback (ideally it starts loading the next song before the current song finishes, so it is ready for immediate playback once the current song finishes)
Lets me see which song is currently playing.
AVFoundation has some classes designed to do exactly what you're looking for.
It looks like your current solution is to build a single AVPlayerItem that concatenates all of the MP3 files that you want to play. A better solution is to create an AVQueuePlayer with an array of the AVPlayerItem objects that you want to play.
NSArray* tracks = [NSArray arrayWithObjects:@"http://example.com/song1.mp3", @"http://example.com/song2.mp3", @"http://example.com/song3.mp3", nil];
NSMutableArray *playerItems = [[NSMutableArray alloc] init];
for (NSString* trackName in tracks)
{
    NSURL *assetURL = [NSURL URLWithString:trackName];
    if (!assetURL) {
        continue;
    }
    AVURLAsset* audioAsset = [[AVURLAsset alloc] initWithURL:assetURL options:nil];
    AVPlayerItem *playerItem = [[AVPlayerItem alloc] initWithAsset:audioAsset];
    [playerItems addObject:playerItem];
}
_player = [[AVQueuePlayer alloc] initWithItems:playerItems];
[_player play];
In answer to your final wrap-up questions:
Yes, AVQueuePlayer DOES preload the next item in the playlist while it's playing the current one.
You can access the currentItem property to determine which AVPlayerItem is currently playing.
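For example, a minimal sketch of watching the queue advance via KVO on currentItem (observer removal and error handling omitted; listening for AVPlayerItemDidPlayToEndTimeNotification would also work):
// Somewhere after creating the queue player:
[_player addObserver:self
          forKeyPath:@"currentItem"
             options:NSKeyValueObservingOptionNew
             context:NULL];

- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context
{
    if ([keyPath isEqualToString:@"currentItem"]) {
        AVPlayerItem *nowPlaying = [(AVQueuePlayer *)object currentItem];
        AVURLAsset *asset = (AVURLAsset *)nowPlaying.asset;
        NSLog(@"Now playing: %@", asset.URL);   // map the URL back to your track name
    }
}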

How to manually change the streaming video quality in AVPlayer in iOS?

I am building an application in which online streaming is handled by AVPlayer (the default iOS player).
I want to add a button for HD streaming. How do I achieve that?
The solution I found was to ensure that the underlying AVAsset is ready to return basic info, such as its duration, before feeding it to the AVPlayer. AVAsset has a method loadValuesAsynchronouslyForKeys: which is handy for this:
AVAsset *asset = [AVAsset assetWithURL:self.mediaURL];
[asset loadValuesAsynchronouslyForKeys:@[@"duration"] completionHandler:^{
    AVPlayerItem *newItem = [[AVPlayerItem alloc] initWithAsset:asset];
    [self.avPlayer replaceCurrentItemWithPlayerItem:newItem];
}];
In my case the URL is a network resource, and replaceCurrentItemWithPlayerItem: will actually block for several seconds waiting for this information to download otherwise.

What is the best way to play video in UITableViewCell

I'm trying to auto-play video in a UITableViewCell depending on the cell position.
I'm using AVPlayer.
Here is my code:
__weak typeof(self) this = self;
NSString *videoPath = @"http://test.com/test.mp4";
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:[NSURL URLWithString:videoPath] options:nil];
NSArray *keys = [NSArray arrayWithObjects:@"playable", nil];
[asset loadValuesAsynchronouslyForKeys:keys completionHandler:^(){
    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
    this.avPlayer = [AVPlayer playerWithPlayerItem:playerItem];
    this.avPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:this.avPlayer];
    this.avPlayerLayer.frame = _videoContent.frame;
    dispatch_async(dispatch_get_main_queue(), ^{
        [this.videoContent.layer addSublayer:this.avPlayerLayer];
        [this.avPlayer play];
    });
}];
But my UITableView freezes when I scroll the table.
I think there is a lot of time-consuming work here, but the biggest part seems to be
[this.avPlayer play]
So my question is: is AVPlayer the best choice in this situation?
And is there any way to improve the performance?
Are you sure that creating the AVPlayerItem, AVPlayer, and AVPlayerLayer can all be performed off the main thread? You might want to try putting those inside the block that dispatches on the main queue.
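For example, the completion handler from the question could be restructured so that everything AVPlayer-related is created on the main queue (a sketch of the same code, not a guaranteed fix for the scrolling hitch):
[asset loadValuesAsynchronouslyForKeys:keys completionHandler:^{
    dispatch_async(dispatch_get_main_queue(), ^{
        // Create the item, player, and layer on the main thread.
        AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
        this.avPlayer = [AVPlayer playerWithPlayerItem:playerItem];
        this.avPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:this.avPlayer];
        this.avPlayerLayer.frame = this.videoContent.frame;
        [this.videoContent.layer addSublayer:this.avPlayerLayer];
        [this.avPlayer play];
    });
}];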
Use the link below; it suits your question:
https://github.com/PRX/PRXPlayer

How to play a YouTube video using a URL in AVPlayer on iOS?

I want to load a video in AVPlayer using a YouTube URL, but it is not showing anything. Whenever I load from local storage using NSBundle, it works fine. Is there any alternative way to load the video, or can we do something in AVPlayer?
This is my code:
- (void)viewDidLoad
{
    [super viewDidLoad];
    NSError *setCategoryError = nil;
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&setCategoryError];
    AVAsset *asset = [AVAsset assetWithURL:[NSURL fileURLWithPath:@"http://www.youtube.com/watch?v=zPP6lXaL7KA&feature=youtube_gdata_player"]];
    avPlayerItem = [[AVPlayerItem alloc] initWithAsset:asset];
    self.songPlayer = [AVPlayer playerWithPlayerItem:avPlayerItem];
    self.avPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:self.songPlayer];
    self.avPlayerLayer.frame = self.view.layer.bounds;
    UIView *newView = [[UIView alloc] initWithFrame:self.view.bounds];
    [newView.layer addSublayer:avPlayerLayer];
    [self.view addSubview:newView];
    [self.songPlayer play];
}
You should use the iOS YouTube Helper library for playing YouTube videos.
https://developers.google.com/youtube/v3/guides/ios_youtube_helper
I don't know if you can use AVPlayer. I've seen some examples using MPMoviePlayerController on CocoaControls, like this one: https://www.cocoacontrols.com/controls/hcyoutubeparser or this one: https://www.cocoacontrols.com/controls/xcdyoutubevideoplayerviewcontroller
But I don't think using YouTube's URL directly in your player fits the ToS of the platform, so I recommend you use the YouTube Helper library if you are planning to publish your app.
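For reference, the helper library boils down to embedding a YTPlayerView and loading by video id (a minimal sketch; the outlet name and video id are only examples, not from the question):
#import "YTPlayerView.h"

// self.playerView is assumed to be a YTPlayerView added in the storyboard or in code.
[self.playerView loadWithVideoId:@"zPP6lXaL7KA"];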
Use the XCDYouTubeKit pod.
Get the YouTube video id from a URL like https://www.youtube.com/watch?v=dLEATulyCdw
You can use this code:
extension URL {
    func youtubeVideoId() -> String? {
        let pattern = #"(?<=(youtu\.be\/)|(v=)).+?(?=\?|\&|$)"#
        let testString = absoluteString
        if let matchRange = testString.range(of: pattern, options: .regularExpression) {
            let subStr = testString[matchRange]
            return String(subStr)
        } else {
            return .none
        }
    }
}
Then call XCDYouTubeClient.default().getVideoWithIdentifier(videoId)
In the completion handler you can get the URL. video?.streamURLs contains URLs with different qualities; choose the desired one.
Finally, just pass this URL to AVPlayer...
Update: use first instead of youtubeMaxAvailableQuality.
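XCDYouTubeKit itself is an Objective-C library, so for completeness here is a sketch of the lookup in Objective-C (quality selection simplified to the first available stream; error handling omitted; the video id is just an example):
NSString *videoId = @"dLEATulyCdw";   // e.g. extracted from the URL as shown above
[[XCDYouTubeClient defaultClient] getVideoWithIdentifier:videoId
                                       completionHandler:^(XCDYouTubeVideo *video, NSError *error) {
    // streamURLs is a dictionary keyed by quality; pick whichever entry you want.
    NSURL *streamURL = video.streamURLs.allValues.firstObject;
    if (streamURL) {
        self.player = [AVPlayer playerWithURL:streamURL];
        [self.player play];
    }
}];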
I don't think you can use this, because I just tried it and encountered the same case.
I read the Apple documentation; it definitely states that "You cannot directly create an AVAsset instance to represent the media in an HTTP Live Stream."
Instead, here is Apple's example:
NSURL *url = [NSURL URLWithString:@"<#Live stream URL#>"];
// You may find a test stream at
// http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8
self.playerItem = [AVPlayerItem playerItemWithURL:url];
See, if it is not a local URL, it should be playerItemWithURL: ^_^

tracks in AVComposition losing time when paused

I've created an AVMutableComposition that consists of a bunch of audio tracks that start at specific times. From there, following Apple recommendations, I turned it into an AVComposition before playing it with AVPlayer.
It all works fine playing this AVPlayer item, but if I pause it and then continue, all the tracks in the composition appear to slip back about 0.2 seconds relative to each other (i.e., they bunch up). Hitting pause and continuing several times compounds the effect and the overlap is more significant (basically if I hit it enough, I will end up with all 8 tracks playing simultaneously).
if (self.player.rate > 0.0) {
    //if player is playing, pause
    [self.player pause];
} else {
    if (self.player) {
        [self.player play];
        return;
    }

    /* CODE CREATING COMPOSITION - missed out big chunk of code relating to finding the track and retrieving its position and scale */

    NSDictionary *options = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES]
                                                         forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
    AVURLAsset *sourceAsset = [[AVURLAsset alloc] initWithURL:url options:options];

    //calculate times
    NSNumber *time = [soundArray1 objectAtIndex:1]; //this is the time scale - e.g. 96 or 120 etc.
    double timenow = [time doubleValue];
    double insertTime = (240*y);

    AVMutableCompositionTrack *track =
        [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                 preferredTrackID:kCMPersistentTrackID_Invalid];

    //insert the audio track from the asset into the track added to the mutable composition
    AVAssetTrack *myTrack = [[sourceAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
    CMTimeRange myTrackRange = myTrack.timeRange;
    NSError *error = nil;
    [track insertTimeRange:myTrackRange
                   ofTrack:myTrack
                    atTime:CMTimeMake(insertTime, timenow)
                     error:&error];
    [sourceAsset release];
}
}

AVComposition *immutableSnapshotOfMyComposition = [composition copy];
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:immutableSnapshotOfMyComposition];
self.player = [[AVPlayer alloc] initWithPlayerItem:playerItem];
NSLog(@"here");
[self.player play];
Thanks
OK, this feels a little hacky, but it definitely works if anybody is stuck. If someone has a better answer, do let me know!
Basically, I just save the player.currentTime of the track when I hit pause and remake the track when I hit play, starting from the point at which I paused it. There is no discernible delay, but I'd still be happier without wasting this extra processing.
Make sure you properly release your player item after you hit pause, otherwise you'll end up with a giant stack of AVPlayers!
I have a solution that is a bit less hacky but still hacky.
The solution comes from the fact that I noticed that if you seek on the player, the latency between audio and video introduced by pausing disappears.
Hence: just save player.currentTime right before pausing, and call seekToTime on the player right before playing again. It works pretty well on iOS 6; I haven't tested other versions yet.
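A minimal sketch of that idea (the property and method names are assumed, not from the question):
// Remember the position when pausing.
- (void)pausePlayback {
    self.pausedTime = self.player.currentTime;   // CMTime property, assumed
    [self.player pause];
}

// Seek back to exactly that position before resuming, which keeps the
// composition's tracks from slipping relative to each other.
- (void)resumePlayback {
    [self.player seekToTime:self.pausedTime
            toleranceBefore:kCMTimeZero
             toleranceAfter:kCMTimeZero
          completionHandler:^(BOOL finished) {
        if (finished) {
            [self.player play];
        }
    }];
}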
