I'm using two instances of AVPlayer.
The first player's audio works fine, but the moment the second audio starts, the first audio stops for a brief moment.
For the first player I have made a singleton class.
This is the second player.
AVAsset *asset = [AVAsset assetWithURL:url];
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
// Or, equivalently: AVPlayerItem *playerItem = [AVPlayerItem playerItemWithURL:url];
[self.player replaceCurrentItemWithPlayerItem:playerItem];
[self.player play];
The solution I found was to ensure that the underlying AVAsset is ready to return basic info, such as its duration, before feeding it to the AVPlayer. AVAsset has a method loadValuesAsynchronouslyForKeys: which is handy for this:
AVAsset *asset = [AVAsset assetWithURL:self.mediaURL];
[asset loadValuesAsynchronouslyForKeys:@[@"duration"] completionHandler:^{
    // The handler runs on a background queue, so to be safe hop to the
    // main queue before touching the player.
    dispatch_async(dispatch_get_main_queue(), ^{
        AVPlayerItem *newItem = [[AVPlayerItem alloc] initWithAsset:asset];
        [self.avPlayer replaceCurrentItemWithPlayerItem:newItem];
    });
}];
In my case the URL is a network resource, and without this, replaceCurrentItemWithPlayerItem: will actually block for several seconds waiting for that information to download.
I am using AVPlayer to play video in a UITableView. The video plays properly, but sometimes when a video starts, the sound comes through while the screen stays black; the video only becomes visible after 5-6 seconds of playback. I am using the following code:
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
playerItem = [AVPlayerItem playerItemWithAsset:asset];
AVPlayer *avPlayer = [AVPlayer playerWithPlayerItem:playerItem];
self.avPlayer = avPlayer;
__weak CLBAVPlayer *weakSelf = self;
[self.avPlayer addPeriodicTimeObserverForInterval:CMTimeMakeWithSeconds(1.0 / 60.0, NSEC_PER_SEC)
                                            queue:nil // nil defaults to the main queue
                                       usingBlock:^(CMTime time) {
                                           [weakSelf progress];
                                       }];
self.layer = [AVPlayerLayer playerLayerWithPlayer:self.avPlayer];
self.avPlayer.actionAtItemEnd = AVPlayerActionAtItemEndNone;
Please help me to figure out the issue.
I got the answer for this. The issue was that I was creating the player layer and adding it as a sublayer off the main queue. Wrapping that work in dispatch_async on the main queue solved my problem.
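A minimal sketch of that fix, reusing self.layer and self.avPlayer from the code above (containerView is a stand-in name for whatever view hosts the video):
dispatch_async(dispatch_get_main_queue(), ^{
    // Layer work must happen on the main queue, or the first frames
    // may not render even though audio is already playing.
    self.layer.frame = containerView.bounds;
    [containerView.layer addSublayer:self.layer];
    [self.avPlayer play];
});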
I am developing a music player app where the tracks are played from a URL, and I am using AVPlayer to play them.
Basically, I want to keep playing the tracks even when the user navigates to some other screen or the application goes to the background! I really don't have an idea how to achieve this.
This is the demo code I am using to play the track from the URL:
NSURL *url = [NSURL URLWithString:@"<#Live stream URL#>"];
// You may find a test stream at <http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8>.
self.playerItem = [AVPlayerItem playerItemWithURL:url];
// (optional) [self.playerItem addObserver:self forKeyPath:@"status" options:0 context:&ItemStatusContext];
self.player = [AVPlayer playerWithPlayerItem:self.playerItem];
// Alternatively, create the player straight from the URL:
// self.player = [AVPlayer playerWithURL:url];
// (optional) [self.player addObserver:self forKeyPath:@"status" options:0 context:&PlayerStatusContext];
I also want to play the next track automatically when the current track finishes.
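For reference, a hedged sketch of the two standard ingredients this usually takes, assuming the "audio" entry has been added to UIBackgroundModes in Info.plist (handleItemDidEnd: is a made-up selector name, not real API):
// 1. Configure the audio session so playback continues in the background.
NSError *sessionError = nil;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&sessionError];
[[AVAudioSession sharedInstance] setActive:YES error:&sessionError];

// 2. Observe the end of the current item to advance to the next track.
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(handleItemDidEnd:)
                                             name:AVPlayerItemDidPlayToEndTimeNotification
                                           object:self.playerItem];
In handleItemDidEnd: you would build an AVPlayerItem for the next track's URL and hand it to the player with replaceCurrentItemWithPlayerItem:.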
I'm currently using an AVQueuePlayer to play mono mp3 files that are downloaded from a URL. The audio sounds fine when it plays on the speakers, but when I put on headphones, the audio only comes out on one side.
What is the simplest way to ensure that the AVPlayer plays mono audio files on both channels?
As Andreas Zöllner states in his answer:
You can easily add an MTAudioProcessingTap to your existing AVPlayer item and copy the selected channel's samples to the other channel during your process callback function. Here is a great tutorial explaining the basics: http://chritto.wordpress.com/2013/01/07/processing-avplayers-audio-with-mtaudioprocessingtap/
The CODE:
NSURL *assetURL = [[NSBundle mainBundle] URLForResource:@"skyfall" withExtension:@"m4a"];
assert(assetURL);
// Create the AVAsset
AVAsset *asset = [AVAsset assetWithURL:assetURL];
assert(asset);
// Create the AVPlayerItem
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
assert(playerItem);
assert([asset tracks]);
assert([[asset tracks] count]);
self.player = [AVPlayer playerWithPlayerItem:playerItem];
assert(self.player);
[self.player play];
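The snippet above only sets up playback; the tap itself is not shown. Below is a sketch of attaching an MTAudioProcessingTap that mirrors channel 0 into the remaining channels, assuming the tap's processing format is de-interleaved (one AudioBuffer per channel) and that channel 0 is the one carrying audio. Link the MediaToolbox framework to use it:
#import <MediaToolbox/MediaToolbox.h>

static void tapInit(MTAudioProcessingTapRef tap, void *clientInfo, void **tapStorageOut) {
    *tapStorageOut = clientInfo;
}
static void tapFinalize(MTAudioProcessingTapRef tap) {}
static void tapPrepare(MTAudioProcessingTapRef tap, CMItemCount maxFrames,
                       const AudioStreamBasicDescription *format) {}
static void tapUnprepare(MTAudioProcessingTapRef tap) {}

static void tapProcess(MTAudioProcessingTapRef tap, CMItemCount numberFrames,
                       MTAudioProcessingTapFlags flags, AudioBufferList *bufferListInOut,
                       CMItemCount *numberFramesOut, MTAudioProcessingTapFlags *flagsOut) {
    // Pull the source audio, then copy channel 0 over the remaining channels.
    MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut,
                                       flagsOut, NULL, numberFramesOut);
    for (UInt32 i = 1; i < bufferListInOut->mNumberBuffers; i++) {
        memcpy(bufferListInOut->mBuffers[i].mData,
               bufferListInOut->mBuffers[0].mData,
               bufferListInOut->mBuffers[0].mDataByteSize);
    }
}

// Attach the tap to the player item via an audio mix:
AVAssetTrack *audioTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] firstObject];
AVMutableAudioMixInputParameters *params =
    [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:audioTrack];

MTAudioProcessingTapCallbacks callbacks = {
    .version = kMTAudioProcessingTapCallbacksVersion_0,
    .clientInfo = NULL,
    .init = tapInit, .finalize = tapFinalize,
    .prepare = tapPrepare, .unprepare = tapUnprepare,
    .process = tapProcess,
};
MTAudioProcessingTapRef tap = NULL;
if (MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks,
                               kMTAudioProcessingTapCreationFlag_PostEffects, &tap) == noErr) {
    params.audioTapProcessor = tap;
    CFRelease(tap); // the input parameters retain the tap
    AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
    audioMix.inputParameters = @[params];
    playerItem.audioMix = audioMix;
}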
I have two different views that are meant to play the same video; I am creating an app that will switch several times between the two views while the video is running.
I currently load the first view with the video as follows:
NSURL *url = [NSURL URLWithString:@"http://[URL TO VIDEO HERE]"];
AVURLAsset *avasset = [[AVURLAsset alloc] initWithURL:url options:nil];
AVPlayerItem *item = [[AVPlayerItem alloc] initWithAsset:avasset];
player = [[AVPlayer alloc] initWithPlayerItem:item];
playerLayer = [[AVPlayerLayer playerLayerWithPlayer:player] retain];
CGSize size = self.bounds.size;
float x = size.width/2.0-202.0;
float y = size.height/2.0 - 100;
//[player play];
playerLayer.frame = CGRectMake(x, y, 404, 200);
playerLayer.backgroundColor = [UIColor blackColor].CGColor;
[self.layer addSublayer:playerLayer];
NSString *tracksKey = @"tracks";
[avasset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:tracksKey] completionHandler:^{
    dispatch_async(dispatch_get_main_queue(), ^{
        NSError *error = nil;
        AVKeyValueStatus status = [avasset statusOfValueForKey:tracksKey error:&error];
        if (status == AVKeyValueStatusLoaded) {
            //videoInitialized = YES;
            [player play];
        }
        else {
            // You should deal with the error appropriately.
            NSLog(@"The asset's tracks were not loaded:\n%@", [error localizedDescription]);
        }
    });
}];
In my second view I want to start the video from the same dispatch_get_main_queue block so that the videos in both views stay in sync.
I was hoping someone could help me out with loading the data of the video from the first view into the second view.
It is very simple:
Init the first player:
AVAsset *asset = [AVAsset assetWithURL:URL];
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
Then create the second player in the same way, BUT use the same asset as for the first one.
I have verified that it works.
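For example, a minimal sketch (secondItem, secondPlayer, and secondLayer are my own names; asset is the one created above):
// The second player shares the AVAsset but gets its own item and layer.
AVPlayerItem *secondItem = [AVPlayerItem playerItemWithAsset:asset];
AVPlayer *secondPlayer = [AVPlayer playerWithPlayerItem:secondItem];
AVPlayerLayer *secondLayer = [AVPlayerLayer playerLayerWithPlayer:secondPlayer];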
All the info you need is on the Apple page:
https://developer.apple.com/library/mac/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/02_Playback.html
This quote is from that page: "This abstraction means that you can play a given asset using different players simultaneously."
I don't think you will be able to get this approach to work. Videos are decoded in hardware and then the graphics buffer is sent to the graphics card. What you seem to want to do is decode a video in one view but then capture the contents of the first view and show it in a second view. That will not stay in sync because it would take time to capture the contents of the first window back into main memory and then those contents would need to be sent to the video card again. Basically, that is not going to work. You also cannot decode two h.264 videos streams and expect them to be in sync.
You could implement this with another approach entirely: decode the h.264 video to frames on disk (saving each frame as a PNG), then write your own loop that decodes the Nth PNG in the series and displays the result in the two different windows, as sketched below. That will run fast enough to be an effective implementation on the newer iPhone 4 and 5 and iPad 2 and 3. If you want a more advanced implementation, take a look at my AVAnimator library for iOS; you could get this approach working in 20 minutes if you use the existing code.
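A hypothetical sketch of that lock-step display loop (frameIndex, frameCount, framePathForIndex:, firstImageView, and secondImageView are all assumed properties and helpers, not real API):
- (void)startFrameLoop {
    // Drive both views from one CADisplayLink so they can never drift apart.
    CADisplayLink *link = [CADisplayLink displayLinkWithTarget:self
                                                      selector:@selector(displayLinkFired:)];
    [link addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
}

- (void)displayLinkFired:(CADisplayLink *)link {
    // Both image views always show the same frame index.
    UIImage *frame = [UIImage imageWithContentsOfFile:[self framePathForIndex:self.frameIndex]];
    self.firstImageView.image = frame;
    self.secondImageView.image = frame;
    self.frameIndex = (self.frameIndex + 1) % self.frameCount;
}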
For this ten-year-old question, which has only ten-year-old answers that are now out of date, here's the up-to-date answer.
var leadPlayer: AVPlayer // ... the lead player you want to dupe
This does not work:
let leadPlayerItem: AVPlayerItem = leadPlayer.currentItem!
yourPlayer = AVPlayer(playerItem: leadPlayerItem)
yourPlayer.play()
Apple does not allow that (try it, see error).
This works; you must create a new item from the lead player's asset:
let dupeItem: AVPlayerItem = AVPlayerItem(asset: leadPlayer.currentItem!.asset)
yourPlayer = AVPlayer(playerItem: dupeItem)
yourPlayer.play()
Fortunately it's now that easy.