I am using MPMoviePlayerController to show video inside a UIView, using the code below:
self.moviePlayerSmall = MPMoviePlayerController(contentURL: self.objUrl)
self.moviePlayerSmall.view.frame = self.videoView.bounds
self.videoView.addSubview(self.moviePlayerSmall.view)
self.moviePlayerSmall.view.autoresizingMask = [.FlexibleWidth, .FlexibleHeight]
for subV: UIView in self.moviePlayerSmall.view.subviews {
subV.backgroundColor = UIColor.whiteColor()
}
self.moviePlayerSmall.fullscreen = false
self.moviePlayerSmall.controlStyle = MPMovieControlStyle.Default
self.moviePlayerSmall.scalingMode = .AspectFill
Here videoView is the UIView to which I am adding the MPMoviePlayerController's view.
The issue I am facing is that whenever the video player comes out of full screen, the player view shows spacing on the left and right sides.
I have tried setting contentMode, scalingMode, and even the autoresizingMask, but nothing works. Has anyone experienced the same issue?
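One thing worth trying (a guess, not a confirmed fix): re-apply the embedded frame once the fullscreen transition finishes, by listening for MPMoviePlayerDidExitFullscreenNotification. A minimal sketch, reusing the names from the question (moviePlayerSmall, videoView); the handler name is just illustrative:
// Register once, e.g. right after creating moviePlayerSmall.
NSNotificationCenter.defaultCenter().addObserver(self,
    selector: "playerDidExitFullscreen:",
    name: MPMoviePlayerDidExitFullscreenNotification,
    object: self.moviePlayerSmall)
func playerDidExitFullscreen(notification: NSNotification) {
    // Re-apply the embedded frame after the fullscreen transition ends.
    self.moviePlayerSmall.view.frame = self.videoView.bounds
}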
I created an iOS app for live streaming using AVPlayer. I added an AirPlay route button with the help of MPVolumeView.
After updating to iOS 11, the AirPlay route button no longer appears properly on screen.
Sometimes it shows on screen and sometimes it does not.
Can anybody help me with this? Is it an iOS 11 issue, or is there some change I need to make to MPVolumeView in my code?
I am using AVPlayerViewController, to which I added the MPVolumeView as a subview.
self.airButton.frame = CGRect(x:0,y:0,width:45,height:45)
self.airButton.backgroundColor = UIColor.green
let volumeView = MPVolumeView(frame: self.airButton.frame)
volumeView.showsVolumeSlider = false
volumeView.showsRouteButton = true
volumeView.backgroundColor = UIColor.red
self.airButton.addSubview(volumeView)
self.airButton.translatesAutoresizingMaskIntoConstraints = false
self.controlView.addSubview(airButton)
Thanks
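One thing worth checking (a hedged guess rather than a confirmed iOS 11 fix): once translatesAutoresizingMaskIntoConstraints is set to false, airButton needs explicit constraints, otherwise it can end up with a zero-sized frame and the route button silently disappears. A minimal sketch, with illustrative constants:
// Illustrative constraints pinning airButton (45x45, as above) inside controlView.
NSLayoutConstraint.activate([
    self.airButton.widthAnchor.constraint(equalToConstant: 45),
    self.airButton.heightAnchor.constraint(equalToConstant: 45),
    self.airButton.trailingAnchor.constraint(equalTo: self.controlView.trailingAnchor, constant: -8),
    self.airButton.topAnchor.constraint(equalTo: self.controlView.topAnchor, constant: 8)
])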
I have an application written in Swift. I want to play some video using moviePlayer. I am using the following code.
let url = NSURL.fileURLWithPath(path)
moviePlayer = MPMoviePlayerController(contentURL: url)
// moviePlayer?.controlStyle = .None
if let player = moviePlayer {
player.view.frame = CGRect(x: 20, y: 165, width: widthVideoView, height: heightVideoView)
player.prepareToPlay()
player.scalingMode = .AspectFill
self.view.addSubview(player.view)
}
Playing the video works fine, no issues. But my video is stretched. I think this is happening because of
player.scalingMode = .AspectFill
My screenshot looks like this.
So I changed it to
player.scalingMode = .AspectFit
Then my screen looks like this (black bars at the top and bottom). How can I handle this?
Sorry for the edited answer. If you read the documentation: https://developer.apple.com/library/prerelease/ios/documentation/MediaPlayer/Reference/MPMoviePlayerController_Class/index.html#//apple_ref/c/tdef/MPMovieScalingMode
.AspectFit will create the black bars because it keeps the aspect ratio of the video.
.AspectFill will act like a zoom (there are no black bars, but the video gets cropped).
.Fill will stretch the video to fit (I think you want this one, or change the view to follow the aspect ratio).
If you don't respect the original aspect ratio, the video will always either stretch or have black bars to fill the view.
So you can change the scalingMode to .Fill, or change the view to respect the aspect ratio.
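For the second option, a minimal sketch reusing widthVideoView and the (20, 165) origin from the question: wait for MPMovieNaturalSizeAvailableNotification, then size the player view to the video's aspect ratio so .AspectFit shows neither stretching nor black bars. The handler name is just illustrative.
// Register once, e.g. right after creating moviePlayer.
NSNotificationCenter.defaultCenter().addObserver(self,
    selector: "naturalSizeAvailable:",
    name: MPMovieNaturalSizeAvailableNotification,
    object: moviePlayer)
func naturalSizeAvailable(notification: NSNotification) {
    guard let player = moviePlayer else { return }
    let size = player.naturalSize
    guard size.width > 0 else { return }
    // Height that keeps the video's aspect ratio for the given width.
    let aspectHeight = widthVideoView * size.height / size.width
    player.view.frame = CGRect(x: 20, y: 165, width: widthVideoView, height: aspectHeight)
    player.scalingMode = .AspectFit
}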
In my project I am using AVSampleBufferDisplayLayer and AVPlayerLayer. Both of them have a similar interface and a videoGravity property.
When I change the AVPlayerLayer videoGravity property, the video is resized immediately with an animation effect. With AVSampleBufferDisplayLayer, changing videoGravity does nothing until I change the device orientation, and then the video is resized without animation.
How can I change the videoGravity of AVSampleBufferDisplayLayer so that it behaves like AVPlayerLayer?
The only solution I've found is to reinitialize the AVSampleBufferDisplayLayer, for example:
var displayBufferLayer: AVSampleBufferDisplayLayer?
...
func reinitBufferLayer(videoGravity: AVLayerVideoGravity) {
    // Tear down the old layer: drop pending sample buffers and detach it.
    displayBufferLayer?.flush()
    displayBufferLayer?.stopRequestingMediaData()
    displayBufferLayer?.removeFromSuperlayer()
    // Create a fresh layer with the desired gravity and reattach it.
    let bufferLayer = AVSampleBufferDisplayLayer()
    bufferLayer.frame = view.bounds
    bufferLayer.videoGravity = videoGravity // (e.g. .resizeAspectFill)
    bufferLayer.isOpaque = true
    view.layer.insertSublayer(bufferLayer, at: 0)
    self.displayBufferLayer = bufferLayer
}
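A rough usage sketch (the actual enqueueing depends on your pipeline and is only hinted at here): rebuild the layer with the new gravity, then ask it to pull sample buffers again.
reinitBufferLayer(videoGravity: .resizeAspect)
displayBufferLayer?.requestMediaDataWhenReady(on: .main) {
    // enqueue the next CMSampleBuffer(s) from your source here
}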
I'm trying to present a camera on only the top half of my screen, and my code isn't resizing the camera properly. I'm trying to add a view to the top half of my screen and then have the camera's cameraOverlayView property conform to that view's frame. Regardless of what I try, however, the camera still appears in full-screen mode. If someone can tell me what I'm doing wrong, I'd really appreciate it. Thanks!
// Setting Up The Camera View
cameraView = UIView(frame: CGRectMake(0.0, 0.0, view.bounds.width, view.bounds.width))
view.addSubview(cameraView)
// Setting Up The Camera
var cam = UIImagePickerController()
cam.delegate = self
cam.allowsEditing = false
cam.videoMaximumDuration = 7
cam.videoQuality = UIImagePickerControllerQualityType.TypeMedium
cam.mediaTypes = UIImagePickerController.availableMediaTypesForSourceType(UIImagePickerControllerSourceType.Camera)!
cam.sourceType = UIImagePickerControllerSourceType.Camera
cam.cameraDevice = UIImagePickerControllerCameraDevice.Rear
cam.cameraCaptureMode = UIImagePickerControllerCameraCaptureMode.Video
cam.cameraFlashMode = UIImagePickerControllerCameraFlashMode.Off
cam.showsCameraControls = true
cam.cameraOverlayView = cameraView
cam.cameraOverlayView?.frame = cameraView.frame
camera = cam
self.presentViewController(camera, animated: false, completion: nil)
I do not think what you want to do can be accomplished using a UIImagePickerController. AVFoundation is what you want to use if you want complete customization of the camera. You can read about it here: https://developer.apple.com/library/ios/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/04_MediaCapture.html#//apple_ref/doc/uid/TP40010188-CH5-SW2.
P.S. It is not the simplest framework to use.
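For what it's worth, here is a rough AVFoundation sketch of a camera preview confined to the top half of the screen (Swift 2-era names to match the question; it assumes camera permission has already been granted and is a starting point rather than a drop-in replacement):
import AVFoundation
let session = AVCaptureSession()
session.sessionPreset = AVCaptureSessionPresetMedium
if let device = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo),
    input = try? AVCaptureDeviceInput(device: device) where session.canAddInput(input) {
    session.addInput(input)
}
// The preview layer, not the session, determines where the video is drawn.
let previewLayer = AVCaptureVideoPreviewLayer(session: session)
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
previewLayer.frame = CGRect(x: 0, y: 0, width: view.bounds.width, height: view.bounds.height / 2)
view.layer.addSublayer(previewLayer)
session.startRunning()
To record the 7-second clips from the original code you would also need an AVCaptureMovieFileOutput, which is covered in the linked Media Capture guide.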
I have implemented a custom movie player with AVPlayer. On setting the value of videoGravity in AVPlayerLayer to AVLayerVideoGravityResizeAspectFill, I see the desired effect on iOS 4.2 and 4.3. But somehow on iOS 5.0 it has no effect. Is anybody seeing a similar issue? Am I doing something wrong?
On iOS 5 you should reset the layer's bounds after setting videoGravity.
This worked for me:
((AVPlayerLayer *)[self layer]).videoGravity = AVLayerVideoGravityResizeAspectFill;
((AVPlayerLayer *)[self layer]).bounds = ((AVPlayerLayer *)[self layer]).bounds;
EDITED: "self" points to a PlayerView (a subclass of UIView) object from the "Putting It All Together" example:
https://developer.apple.com/library/ios/#documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/02_Playback.html
Found the solution to this issue. Tick the "Clip Subviews" checkbox in IB for the view whose layer you're going to attach the video player to. Then set the videoGravity of your AVPlayerLayer object to AVLayerVideoGravityResizeAspectFill. If you don't have the view in IB and you're creating it programmatically, set its clipsToBounds property to YES.