Video stretch issue in iOS (Swift)

I have a Swift application and want to play a video using MPMoviePlayerController. I am using the following code:
let url = NSURL.fileURLWithPath(path)
moviePlayer = MPMoviePlayerController(contentURL: url)
// moviePlayer?.controlStyle = .None
if let player = moviePlayer {
    player.view.frame = CGRect(x: 20, y: 165, width: widthVideoView, height: heightVideoView)
    player.prepareToPlay()
    player.scalingMode = .AspectFill
    self.view.addSubview(player.view)
}
The video plays fine, but it appears stretched. I think this is caused by
player.scalingMode = .AspectFill
So I changed it to
player.scalingMode = .AspectFit
but then the video is letterboxed (black bars at the top and bottom). How can I handle this?

If you read the documentation: https://developer.apple.com/library/prerelease/ios/documentation/MediaPlayer/Reference/MPMoviePlayerController_Class/index.html#//apple_ref/c/tdef/MPMovieScalingMode
.AspectFit keeps the video's aspect ratio, so black bars fill the leftover space.
.AspectFill acts like a zoom: there are no black bars, but the video gets cropped.
.Fill stretches the video to fill the view (I think you want this one, or else resize the view to match the video's aspect ratio).
If the view does not respect the original aspect ratio, the video will always either stretch or show black bars to fill it.
So either change scalingMode to .Fill, or resize the view to respect the aspect ratio.
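For the second option, here is a minimal sketch of resizing the player's view to match the video's aspect ratio, assuming you already know the video's natural size (e.g. from player.naturalSize after the movie's metadata has loaded). AVMakeRect does the same math as aspect-fit, but applied to the frame itself, so no bars remain:

```swift
import AVFoundation

// Hypothetical helper: fit a video of `videoSize` inside `container`
// while preserving its aspect ratio. (On older Swift versions this
// function is named AVMakeRectWithAspectRatioInsideRect.)
func frameFitting(videoSize: CGSize, inside container: CGRect) -> CGRect {
    return AVMakeRect(aspectRatio: videoSize, insideRect: container)
}

// Usage (videoArea is an illustrative rect for where the player may go):
// player.view.frame = frameFitting(videoSize: player.naturalSize,
//                                  inside: videoArea)
// player.scalingMode = .Fill  // safe now: the frame matches the ratio
```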

Related

RTCVideoTrack shows stretched WebRTC

I am using the core WebRTC framework and rendering my local stream in iPhone full-screen mode. Unfortunately, my video appears stretched and doesn't look like the video view in the Camera app.
I tried adding an aspect ratio in RTCMediaConstraints and also used the adaptOutputFormatToWidth method to fix the output.
NSDictionary *mandatoryConstraints;
/* want to calculate aspect ratio dynamically */
NSString *aspectRatio = [NSString stringWithFormat:@"%f", (double)4 / 3];
if (aspectRatio) {
    mandatoryConstraints = @{ kRTCMediaConstraintsMaxAspectRatio: aspectRatio };
}
RTCMediaConstraints *cameraConstraints =
    [[RTCMediaConstraints alloc] initWithMandatoryConstraints:mandatoryConstraints
                                          optionalConstraints:nil];
RTCAVFoundationVideoSource *localVideoSource =
    [peerFactory avFoundationVideoSourceWithConstraints:cameraConstraints];
[localVideoSource adaptOutputFormatToWidth:devicewidth height:devicewidth fps:30];
The link below shows the difference between the Camera app's video view and my app's call video view:
https://drive.google.com/file/d/1HN3KQcJphtC3VzJjlI4Hm-D3u2E6qmdQ/view?usp=sharing
I believe you are rendering your video in an RTCEAGLVideoView, which requires a size adjustment; you can use RTCMTLVideoView in place of RTCEAGLVideoView.
If you want to keep using RTCEAGLVideoView, implement the RTCEAGLVideoViewDelegate method:
- (void)videoView:(RTCEAGLVideoView *)videoView didChangeVideoSize:(CGSize)size;
This method will give you the correct size of the video.
(For Swift) -> Use RTCMTLVideoView and set videoContentMode
#if arch(arm64)
    let renderer = RTCMTLVideoView(frame: videoView.frame)
    renderer.videoContentMode = .scaleAspectFill
#else
    let renderer = RTCEAGLVideoView(frame: videoView.frame)
#endif
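If you stay with RTCEAGLVideoView, a minimal sketch of the delegate callback might look like the following (CallViewController and containerView are illustrative names, not from the question; AVMakeRect fits the reported size inside the container while preserving aspect ratio):

```swift
import AVFoundation

extension CallViewController: RTCEAGLVideoViewDelegate {
    func videoView(_ videoView: RTCEAGLVideoView, didChangeVideoSize size: CGSize) {
        // Resize the renderer to the video's aspect ratio so the
        // frame no longer stretches the picture.
        videoView.frame = AVMakeRect(aspectRatio: size,
                                     insideRect: containerView.bounds)
    }
}
```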

iPhone 7 Plus AVPlayer has border around it (Colors mismatch on white)

I'm seeing strange behavior on the iPhone 7 Plus and iPhone 6 Plus. It doesn't happen in the simulator, only on the physical device.
If you have an AVPlayer (the video has a white background) and the view it is attached to also has a white background (the player is smaller than its parent view), a border appears around the AVPlayer.
The goal was to blend the video into the background to create a cool effect. It's working great on every device except the physical Plus models.
My best guess is that there is some subtle difference in how the two whites are rendered. Does anyone know how to fix or avoid this?
I had this exact problem, and my solution was to add the AVPlayerLayer inside a UIView container and add a mask onto the playerLayer with a 1 pt inset:
override func layoutSubviews() {
    super.layoutSubviews()
    // ... sets frame to the player's source size
    let maskLayer = playerLayer.mask ?? CALayer()
    maskLayer.frame = playerLayer.bounds.insetBy(dx: 1, dy: 1)
    maskLayer.backgroundColor = UIColor.white.cgColor
    playerLayer.mask = maskLayer
}

MPMoviePlayer view issue when coming out of full screen

I am using MPMoviePlayerController for showing video inside a UIView, using the code below:
self.moviePlayerSmall = MPMoviePlayerController(contentURL: self.objUrl)
self.moviePlayerSmall.view.frame = self.videoView.bounds
self.videoView.addSubview(self.moviePlayerSmall.view)
self.moviePlayerSmall.view.autoresizingMask = [.FlexibleWidth, .FlexibleHeight]
for subV: UIView in self.moviePlayerSmall.view.subviews {
    subV.backgroundColor = UIColor.whiteColor()
}
self.moviePlayerSmall.fullscreen = false
self.moviePlayerSmall.controlStyle = MPMovieControlStyle.Default
self.moviePlayerSmall.scalingMode = .AspectFill
Here videoView is the UIView to which I am adding the MPMoviePlayer view.
The issue I am facing is that whenever the player comes out of full screen, its view shows spacing on the left and right sides.
I have tried setting contentMode, scalingMode, and even autoresizingMask, but nothing works. Has anyone experienced the same issue?

AVPreviewLayer contentsGravity not filling Layer

I'm using AVFoundation to do some video recording, and I've looked everywhere for how to make the video aspect-fill. I've read through Apple's AVFoundation programming guide and the class reference for AVCaptureVideoPreviewLayer. I also read this question: AVFoundation camera preview layer not working. Here is my code:
videoPreviewLayer.frame = captureView.frame
videoPreviewLayer.frame.origin = CGPoint(x: 0, y: 0)
videoPreviewLayer.backgroundColor = UIColor.redColor().CGColor
videoPreviewLayer.contentsGravity = AVLayerVideoGravityResizeAspectFill
videoPreviewLayer.masksToBounds = true
captureView.layer.addSublayer(videoPreviewLayer)
I put this in viewDidLayoutSubviews() so that I get the correct size for captureView.frame, which is the UIView my preview layer is inside of. Any clue why aspect-fill isn't working? As you can see from the red background, the layer is the correct size, but contentsGravity isn't filling the layer.
Found my answer in this link: AVCaptureVideoPreviewLayer. I needed to use videoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill instead of videoPreviewLayer.contentsGravity = AVLayerVideoGravityResizeAspectFill. videoGravity is the property AVCaptureVideoPreviewLayer actually uses to scale its video content; contentsGravity only affects the layer's ordinary contents.
For me, the answer to my problem was provided by NSGangster in the question:
"I put this in viewDidLayoutSubviews() so that I can get the correct size for captureView.frame which is the UIView my previewLayer is inside of."
I originally had the code in viewDidLoad(), which meant the layer was sized before the true bounds of the view were determined.
Thanks for sharing.
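The fix above can be sketched as follows, using videoPreviewLayer and captureView from the question (body of the comment is illustrative):

```swift
override func viewDidLayoutSubviews() {
    super.viewDidLayoutSubviews()
    // By this point captureView has its final bounds, so the
    // preview layer picks up the correct size.
    videoPreviewLayer.frame = captureView.bounds
}
```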

UIImagePicker that only fills the top half of the screen

I'm trying to present a camera on only the top half of my screen, and my code isn't resizing the camera properly. I'm trying to add a view to the top half of the screen and then have the camera's cameraOverlayView property conform to that view's frame. Regardless of what I try, however, the camera still appears in full-screen mode. If someone can tell me what I'm doing wrong, I'd really appreciate it. Thanks!
// Setting Up The Camera View
cameraView = UIView(frame: CGRectMake(0.0, 0.0, view.bounds.width, view.bounds.width))
view.addSubview(cameraView)
// Setting Up The Camera
var cam = UIImagePickerController()
cam.delegate = self
cam.allowsEditing = false
cam.videoMaximumDuration = 7
cam.videoQuality = UIImagePickerControllerQualityType.TypeMedium
cam.mediaTypes = UIImagePickerController.availableMediaTypesForSourceType(UIImagePickerControllerSourceType.Camera)!
cam.sourceType = UIImagePickerControllerSourceType.Camera
cam.cameraDevice = UIImagePickerControllerCameraDevice.Rear
cam.cameraCaptureMode = UIImagePickerControllerCameraCaptureMode.Video
cam.cameraFlashMode = UIImagePickerControllerCameraFlashMode.Off
cam.showsCameraControls = true
cam.cameraOverlayView = cameraView
cam.cameraOverlayView?.frame = cameraView.frame
camera = cam
self.presentViewController(camera, animated: false, completion: nil)
I do not think what you want to do can be accomplished with a UIImagePickerController. AVFoundation is what you want to use if you need complete customization of the camera. You can read about it here: https://developer.apple.com/library/ios/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/04_MediaCapture.html#//apple_ref/doc/uid/TP40010188-CH5-SW2
P.S. It is not the simplest framework to use.
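A minimal sketch of that AVFoundation approach, written in current Swift (class and variable names are illustrative, and error handling and permission checks are omitted), placing the camera preview only in the top half of the screen:

```swift
import AVFoundation
import UIKit

final class HalfScreenCameraViewController: UIViewController {
    private let session = AVCaptureSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Wire the default back camera into the session.
        if let camera = AVCaptureDevice.default(for: .video),
           let input = try? AVCaptureDeviceInput(device: camera),
           session.canAddInput(input) {
            session.addInput(input)
        }
        // The preview layer's frame covers only the top half of the
        // screen, which UIImagePickerController cannot do.
        let previewLayer = AVCaptureVideoPreviewLayer(session: session)
        previewLayer.videoGravity = .resizeAspectFill
        previewLayer.frame = CGRect(x: 0, y: 0,
                                    width: view.bounds.width,
                                    height: view.bounds.height / 2)
        view.layer.addSublayer(previewLayer)
        session.startRunning()
    }
}
```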
