I have an AVPlayer object that I am using to play a video. When the user taps a button, the video is swapped out for a different video, which continues playing. Just using replaceCurrentItem(with:) is resulting in a few tenths of a second of a black frame appearing on my AVPlayerLayer that is displaying the video content.
I have code in place to render an image at the current frame before the AVPlayerItem is swapped out, to bridge the gap before the new AVPlayerItem has a frame ready to display, but the black frame is blocking this image from view. Is there any way to control what the AVPlayerLayer will render before it has actual video data to display?
Alternatively, is there a way to be notified that the AVPlayerLayer (or the AVPlayerItem) has actually begun displaying video data? Observing for when the item's status becomes .readyToPlay triggers too early; hiding the image at that point still leaves the black frame visible.
I had the same flash problem in my macOS app. I resolved it by creating a new AVPlayerLayer each time I play a new file.
I have an AVPlayer subclass. The function below removes any existing AVPlayerLayer from its superlayer and adds a new layer to the view.
import AVFoundation
import AppKit

class MyVideoPlayer: AVPlayer {
    func addLayer(view: NSView) {
        // Remove any existing AVPlayerLayer before attaching a fresh one.
        view.layer!.sublayers?
            .filter { $0 is AVPlayerLayer }
            .forEach { $0.removeFromSuperlayer() }
        let layer = AVPlayerLayer(player: self)
        layer.videoGravity = .resize
        layer.frame = view.bounds
        view.layer!.addSublayer(layer)
    }
}
In my ViewController, I call this before playing video content.
myVideoPlayer.addLayer(view: self.view)
Related
In a radio app, pausing the AVPlayer does not really pause it; per the app requirement, we just mute it for some time.
The issue is that everything works fine, but MPNowPlayingInfoCenter does not update to the paused state when I mute the player. Is there any way I can control the MPNowPlayingInfoCenter controls programmatically?
I tried all the obvious solutions, such as calling setActive(false), but that causes issues in the normal player.
Here is the solution.
When pausing, I mute the player for 120 seconds; if it is still in that state afterwards, I actually pause the player. Here is the code I'm using:
self.player.rate = 0.0 // also updates the MPNowPlayingInfoCenter controls to the paused state
self.player.isMuted = true
secs = 120
To set MPNowPlayingInfoCenter to paused, you can set MPNowPlayingInfoPropertyPlaybackRate to 0.0:
let nowPlayingInfo: [String: Any] =
    [MPNowPlayingInfoPropertyPlaybackRate: 0.0]
MPNowPlayingInfoCenter.default().nowPlayingInfo = nowPlayingInfo
Note that assigning a whole new dictionary replaces any existing nowPlayingInfo keys, so merge the rate into your current dictionary if you have other metadata set.
Here I have a method that adds the video layer to the UIView that I have set up in IB:
-(void)loadPlayer {
    self.asset = [AVAsset assetWithURL:url];
    AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:self.asset];
    self.player = [AVPlayer playerWithPlayerItem:item];
    self.playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
    self.playerLayer.videoGravity = AVLayerVideoGravityResizeAspect;
    self.player.actionAtItemEnd = AVPlayerActionAtItemEndNone;
    item.videoComposition = [AVVideoComposition videoCompositionWithPropertiesOfAsset:self.asset];
    [self.player pause];
    [self.player addObserver:self forKeyPath:@"status" options:0 context:nil];
    dispatch_async(dispatch_get_main_queue(), ^{
        [self.videoLayer.layer addSublayer:self.playerLayer];
    });
}
The above method gets called in viewDidAppear of the view controller, so each time the view controller appears, the video should start to play.
Here I check for the player status and play the asset if ready:
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context {
    if (object == self.player && [keyPath isEqualToString:@"status"]) {
        if (self.player.status == AVPlayerStatusReadyToPlay) {
            [self.player play];
        }
    }
}
The problem is that the view appears and the video starts playing, but only the audio, not the video. All I get is a black screen where the video should be. The audio plays back fine, but the video frames do not render for some unknown reason.
This happens only on iOS 10 devices; the same code runs like a charm on iOS 9 devices, where the video shows up as expected.
Is there anything that changed in the AVFoundation framework for iOS 10 in terms of AVPlayer? Or is there anything I am missing here, because iOS 9 plays it fine.
If anyone has faced this issue, and if you have solved it, please post your solution. It would be helpful.
Thanks.
THE SOLUTION
Thanks for your replies! I just found that, for some reason, viewDidLayoutSubviews does not get called on iOS 10 the way it was on iOS 9. I was setting the player layer's frame in viewDidLayoutSubviews; since it wasn't called, the player never appeared on screen. What I did was set the player layer's frame in the loadPlayer method above, and it works fine. Hope this helps someone.
This is the line of code that I moved from viewDidLayoutSubviews to my loadPlayer method and it worked:
self.playerLayer.frame = CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height);
You need to retain your instance of AVPlayerItem, e.g. as a property or an instance variable; the reference to the movie player is lost if you do not retain it.
This can happen even under ARC, so make it a property.
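As a minimal sketch (the class and property names here are illustrative, not from the question), keeping strong references might look like:

```swift
import AVFoundation

final class PlayerHolder {
    // Strong properties keep the player and its item alive for the
    // lifetime of this object; a local variable would be released
    // as soon as the enclosing scope ends, losing playback.
    private var player: AVPlayer?
    private var playerItem: AVPlayerItem?

    func load(url: URL) {
        let item = AVPlayerItem(url: url)
        playerItem = item                    // retained here, not just passed along
        player = AVPlayer(playerItem: item)
        player?.play()
    }
}
```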
In my case, this problem occurred because I started playback simultaneously with a UIView.animate call that had view.layoutIfNeeded() in its animation block.
The fix is to avoid animating layout changes when they aren't needed, and in particular not to animate them when starting video playback.
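A sketch of that fix, assuming the layout change and the play() call happen together (the function and parameter names are illustrative):

```swift
import UIKit
import AVFoundation

// Hypothetical helper: flush any pending layout outside of an
// animation block, then start playback, so the player layer is
// never resized inside an animation.
func startPlayback(player: AVPlayer, in view: UIView) {
    UIView.performWithoutAnimation {
        view.layoutIfNeeded()
    }
    player.play()
}
```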
The solutions on this page did not work for me.
I found another solution to this problem. In my case the video played back as black, but the sound was fine.
I discovered that scrubbing the video with the time slider made the video appear.
So, my solution was to seek to the last frame, then back to the first, before playing:
CMTime duration = self.player.currentItem.duration;
[self.player seekToTime:duration];
[self.player seekToTime:kCMTimeZero];
The black color is the background color of the AVPlayerViewController, so set the AVPlayerViewController's background color to match its superview's color and the black flash will not be visible.
self.view.backgroundColor =[UIColor whiteColor];
avMoviePlayer.view.backgroundColor=[UIColor whiteColor];
[self.view addSubview:avMoviePlayer.view];
Here is something I recently put together for AVPlayer; you can try it: XYAVMoviePlayer. Enjoy your day :)
All I am trying to accomplish is to place an AVPlayerLayer behind an active OpenGL layer.
I am using a GLKViewController (with respective GLKView). I load a video with AVPlayer and establish a corresponding AVPlayerLayer. Both my GL Layer and AVPlayerLayer appear, however, regardless of what form of "insert sublayer" I call (above, below, at index, etc.), the GL Layer always appears behind the AVPlayerLayer.
_videoData = playerItem;
_videoPlayer = [[AVPlayer alloc] initWithPlayerItem:_videoData];
_playerLayer = [AVPlayerLayer playerLayerWithPlayer:_videoPlayer];
_playerLayer.frame = CGRectMake(250, 250, 300, 300);
[self.view.layer insertSublayer:_playerLayer below:self.view.layer];
Solved this by inserting a GLKViewController as a child of a UIView containing the video player. After much experimentation, it appears you cannot place anything (subviews or layers) behind the GL layer when using a GLKViewController on its own.
I am trying to crop a video that was taken with an iPhone camera and then place it on top of a UIImageView. I have been following this SO question: How to crop a video to a circle in iOS?. I can now take the video and put it on top of another previously recorded video. Now I want the background to be an image and the foreground to be the cropped video. My main issue right now is getting the video cropped in the part I want. I cannot post all the code here, but the GitHub repo is at https://github.com/mayoff/stackoverflow-28258270-video-in-oval-on-video/tree/master/video, and the class that does the modifying is called CustomVideoCompositor.m. I am having trouble editing it into the shape I want: an oval in the bottom half of the frame that is taller than it is wide.
EDIT
I want to make the cut so that only the content inside this rounded-rectangle region is kept and available.
If you only want the visual effect of a cropped video, without modifying the video file, you can simply add a mask to your video layer, i.e. the CALayer mask property.
You can create a circle mask by filling a circular path in an otherwise transparent layer; the video remains visible only where the mask is opaque.
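As a sketch of that idea using a CAShapeLayer (`playerLayer` is assumed to be an AVPlayerLayer that is already laid out; the helper name is illustrative):

```swift
import UIKit
import AVFoundation

// Hypothetical helper: clip a player layer to a centered circle.
// With a CALayer mask, content stays visible only where the mask
// is opaque; a CAShapeLayer's path is filled opaquely by default.
func applyCircleMask(to playerLayer: AVPlayerLayer) {
    let bounds = playerLayer.bounds
    let diameter = min(bounds.width, bounds.height)
    let circleRect = CGRect(x: (bounds.width - diameter) / 2,
                            y: (bounds.height - diameter) / 2,
                            width: diameter,
                            height: diameter)
    let mask = CAShapeLayer()
    mask.frame = bounds
    mask.path = UIBezierPath(ovalIn: circleRect).cgPath
    playerLayer.mask = mask
}
```

For the oval-in-the-bottom-half shape from the question, replace the circle rect with an appropriately positioned oval rect before building the path.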
You'll need to use AVPlayer, as this enables you to play several videos at once or have your video appear on top of another view. Also, by using AVPlayer and an AVPlayerLayer, it's easy to make the video appear circular. (Here's a good tutorial to learn more: http://jacopretorius.net/2013/02/playing-video-in-ios.html)
Here's the code (for Objective-C):
In your view controller .h file:
#import <AVFoundation/AVFoundation.h>
Then in viewDidLoad:
[super viewDidLoad];

// Set up the image view first (the imageView is an IBOutlet)
self.imageView.image = [UIImage imageNamed:@"image"];
self.imageView.alpha = 0.2;

NSURL *url = [[NSBundle mainBundle] URLForResource:@"video" withExtension:@"mp4"];
AVPlayer *player = [AVPlayer playerWithURL:url];

CGFloat diameter = MIN(self.view.frame.size.width, self.view.frame.size.height) * 0.8;
AVPlayerLayer *layer = [AVPlayerLayer playerLayerWithPlayer:player];
layer.videoGravity = AVLayerVideoGravityResizeAspectFill;
layer.frame = CGRectMake((self.view.frame.size.width - diameter) / 2,
                         (self.view.frame.size.height - diameter) / 2,
                         diameter, diameter);
layer.cornerRadius = diameter / 2;
layer.masksToBounds = YES;

// Put the AVPlayerLayer on top of the image view.
[self.view.layer addSublayer:layer];
[player play];
I have a main view of type UIView. This view has two SKViews (left and right), in landscape-only orientation. All views and scenes have a black background.
I start out with the two SKViews hidden. The main screen shows questions on either side of the screen. If an answer is incorrect, I set a scene with an effect, enable that effect, and unhide the SKView (left or right). The effect is shown for several seconds, after which I disable it with [self.leftView presentScene:nil]; I do the same with the rightView. I then hide the views again.
The issue is that when the scenes are shown, there is a brief flash of the view in a white color before the scene is rendered appropriately with the black background. In fact, this is why I started hiding the SKViews in the first place: if I don't, they show with a lighter background despite my setting it to black.
How can I show these scenes without the flash of a lighter background color?
Some representative code:
self.fireSceneLeft = [FireScene sceneWithSize:self.leftView.bounds.size];
self.fireSceneLeft.scaleMode = SKSceneScaleModeAspectFill;
self.fireSceneLeft.backgroundColor = [UIColor blackColor];
[self.leftView presentScene:self.fireSceneLeft];
[self.leftView setHidden:NO];
As for the scene effect itself:
@implementation FireScene

- (SKEmitterNode *)fireEffect {
    SKEmitterNode *emitter = [NSKeyedUnarchiver unarchiveObjectWithFile:
        [[NSBundle mainBundle] pathForResource:@"FireEffect" ofType:@"sks"]];
    emitter.position = CGPointMake(150, 80);
    emitter.name = @"fire";
    emitter.targetNode = self.scene;
    emitter.numParticlesToEmit = 2;
    return emitter;
}

- (void)update:(CFTimeInterval)currentTime {
    [self addChild:[self fireEffect]];
}

@end
I solved this issue by creating another scene that is simply black. Instead of setting the scene I want to move away from to nil, I substitute in the black scene. A scene with nothing on it (a blank SKView) has a gray color, which was the cause of my problem. By starting the app with the black scene and using it whenever no scene should display, my problem was solved. An important point here is pausing the scene: even a black scene with no movement will use CPU if not paused. The best place I found to pause the black scene was in the following method of the scene controller:
- (void)update:(CFTimeInterval)currentTime {
    [self addChild:[self blackEffect]];
    [self.view setPaused:YES];
}
This was also tricky because pausing the scene elsewhere actually paused the older scene. This is a timing problem because even if you call a new scene and then pause, it won't have had the time to get the new scene in place before the pause. The method above was the only place I could reliably call the pause and have it take effect on the correct scene.
On iOS you can't have two (or more) SKView instances at the same time. While it technically works at first sight, it is actually unusable: one view will dominate the other(s), leaving very little time for the other view(s) to update and render their contents, as well as to receive touch and other events. I suspect the brief flash is related to this.