I am working on a barcode-scanning iOS app that uses AVFoundation.
I have created a square box with constraints in Interface Builder, and the box itself lays out perfectly fine.
I have the following code to add the AVCaptureVideoPreviewLayer to the square box:
self.captureLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.captureSession];
[self.captureLayer setFrame:self.cameraPreviewView.layer.bounds];
[self.captureLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
[self.cameraPreviewView.layer addSublayer:self.captureLayer];
The layer follows the square box's leading constraint, but not the trailing one: the newly added preview layer runs off the right edge of the screen, while the square box itself is fine. What am I missing here?
thanks!
I think you should try setting self.captureLayer's bounds and position instead of its frame.
Cheers!
This can happen if you are setting the frame in viewDidLoad, before Auto Layout has given the view its final size. If so, try doing it in viewWillAppear: (or viewDidLayoutSubviews) instead.
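For example, a minimal sketch of that idea, reusing the cameraPreviewView and captureLayer properties from the question:
// Auto Layout has finished positioning cameraPreviewView by the time
// viewDidLayoutSubviews runs, so its bounds are final here.
- (void)viewDidLayoutSubviews {
    [super viewDidLayoutSubviews];
    self.captureLayer.frame = self.cameraPreviewView.layer.bounds;
}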
This may solve your problem:
CGRect bounds = view.layer.bounds;
captureLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
captureLayer.bounds = bounds;
captureLayer.position = CGPointMake(CGRectGetMidX(bounds), CGRectGetMidY(bounds));
Or
Since you are using AVLayerVideoGravityResizeAspectFill, the video will extend beyond the view; you can use AVLayerVideoGravityResizeAspectFit instead.
Alternatively, if you keep AVLayerVideoGravityResizeAspectFill, set clipsToBounds = YES on the view so the layer is clipped to it:
view.clipsToBounds = YES;
or, equivalently, on its layer:
view.layer.masksToBounds = YES;
and then add the sublayer to the view.
I have an AVFoundation camera app. The camera preview is the result of a filter applied in the didOutputSampleBuffer delegate method.
When I set up the camera, I follow what Apple did in one of their sample projects (CIFunHouse):
// setting up the camera
CGRect bounds = [self.containerOpenGL bounds];
_eaglContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES3];
_videoPreviewView = [[GLKView alloc] initWithFrame:bounds
context:_eaglContext];
[self.containerOpenGL addSubview:_videoPreviewView];
[self.containerOpenGL sendSubviewToBack:_videoPreviewView];
id<MTLDevice> device = MTLCreateSystemDefaultDevice();
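// Core Image context options: GPU rendering, low GPU priority, color management disabled.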
NSDictionary *options = @{kCIContextUseSoftwareRenderer : @(NO),
                          kCIContextPriorityRequestLow : @(YES),
                          kCIContextWorkingColorSpace : [NSNull null]};
_ciContext = [CIContext contextWithEAGLContext:_eaglContext options:options];
[_videoPreviewView bindDrawable];
_videoPreviewViewBounds = CGRectZero;
_videoPreviewViewBounds.size.width = _videoPreviewView.drawableWidth;
_videoPreviewViewBounds.size.height = _videoPreviewView.drawableHeight;
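// On the main queue, rotate the preview view 180 degrees and pin it to the container's bounds.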
dispatch_async(dispatch_get_main_queue(), ^(void) {
CGAffineTransform transform = CGAffineTransformMakeRotation(M_PI);
_videoPreviewView.transform = transform;
_videoPreviewView.frame = bounds;
});
self.containerOpenGL is a full screen view and is constrained to the four corners of the screen. Autorotation is on.
But this is the problem...
When I set up the GLKView and self.ciContext, they are created assuming the device is in a particular orientation. If I run the application in that orientation, the previewView fills the entire self.containerOpenGL area, but when I rotate the device the previewView ends up off-center.
I see that Apple's code works perfectly and they don't use any constraints. They don't use any autorotation methods or viewDidLayoutSubviews, and when you rotate the device running their code, everything rotates except the preview view. Worse than that, my previewView appears to rotate but theirs does not.
Is this black magic? How do they do that?
They add their preview view to a UIWindow, and that is why it does not rotate. I hope this answers the question; if not, I will continue to look through their source code.
Quote from the source code:
we make our video preview view a subview of the window, and send it to the back; this makes FHViewController's view (and its UI elements) on top of the video preview, and also makes video preview unaffected by device rotation
They also add this:
_videoPreviewView.enableSetNeedsDisplay = NO;
This may keep it from redrawing automatically as well.
Edit: It appears that now both the preview and the UI rotate. To combat this, you can add a second window, send it to the back, make the main window's background clear, and add the previewView to the second window with a dummy view controller that disables autorotation by overriding the appropriate method. This lets the UI rotate while the preview stays fixed.
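A rough sketch of that idea, assuming previewView already exists; the PreviewHostViewController class, the window-level choice, and the property names are illustrative, not taken from the sample:
// Dummy root view controller that refuses to rotate, so the window
// hosting the preview stays fixed while the main UI rotates.
@interface PreviewHostViewController : UIViewController
@end

@implementation PreviewHostViewController
- (BOOL)shouldAutorotate {
    return NO;
}
@end

// During setup (e.g. in the app delegate):
UIWindow *previewWindow = [[UIWindow alloc] initWithFrame:[UIScreen mainScreen].bounds];
previewWindow.rootViewController = [[PreviewHostViewController alloc] init];
previewWindow.windowLevel = UIWindowLevelNormal - 1;   // behind the main window
[previewWindow.rootViewController.view addSubview:previewView];
previewWindow.hidden = NO;

// Make the main window see-through so the preview behind it shows.
self.window.backgroundColor = [UIColor clearColor];
self.window.rootViewController.view.backgroundColor = [UIColor clearColor];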
I'm using AVFoundation to do some video recording, and I've looked all over for how to get the video to aspect-fill. I've read through Apple's AVFoundation guide and the class reference for AVCaptureVideoPreviewLayer. I also read this question: AVFoundation camera preview layer not working. Here is my code:
videoPreviewLayer.frame = captureView.frame
videoPreviewLayer.frame.origin = CGPoint(x: 0, y: 0)
videoPreviewLayer.backgroundColor = UIColor.redColor().CGColor
videoPreviewLayer.contentsGravity = AVLayerVideoGravityResizeAspectFill
videoPreviewLayer.masksToBounds = true
captureView.layer.addSublayer(videoPreviewLayer)
I put this in viewDidLayoutSubviews() so that I can get the correct size for captureView.frame, which is the UIView my previewLayer is inside of. Any clue why aspect-fill isn't working? As you can see from the red background, the layer is the correct size, but the content isn't filling it.
Found my answer in this link: AVCaptureVideoPreviewLayer. I needed to use videoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill instead of videoPreviewLayer.contentsGravity = AVLayerVideoGravityResizeAspectFill.
For me, the answer to my problem was provided by NSGangster in the question:
I put this in viewDidLayoutSubviews() so that I can get the correct size for captureView.frame which is the UIView my previewLayer is inside of.
I originally had the code in viewDidLoad(), which meant the layer was being sized before the view's true bounds were determined.
Thanks for sharing.
I'm trying to get the video output onto the screen in Swift, but the screen stays completely white. I found a tutorial in Objective-C and followed it (just translated into Swift syntax).
In it there is the line previewLayer.frame = myView.bounds;
But the .frame property seems to be read-only in Swift, and I think this might be why I don't see anything on the screen.
How can I set the frame for the previewLayer in Swift?
I see three points in that tutorial where you could end up not displaying the preview, and thus getting a white screen. Below are the Obj-C and Swift counterparts.
1) You might have missed adding the input to the capture session:
// [session addInput:input];
session.addInput(input)
2) You might not have set the preview layer's frame to the bounds of your view controller's view:
// UIView *myView = self.view;
// previewLayer.frame = myView.bounds;
previewLayer.frame = self.view.bounds
3) You might not have added the preview layer as a sublayer of your view:
// [self.view.layer addSublayer:previewLayer];
self.view.layer.addSublayer(previewLayer)
I have implemented a custom movie player with AVPlayer. On setting videoGravity on AVPlayerLayer to AVLayerVideoGravityResizeAspectFill, I see the desired effect on iOS 4.2 and 4.3, but somehow on iOS 5.0 it has no effect. Is anybody seeing a similar issue? Am I doing something wrong?
On iOS 5 you should reset the layer's bounds after setting videoGravity.
This worked for me:
((AVPlayerLayer *)[self layer]).videoGravity = AVLayerVideoGravityResizeAspectFill;
((AVPlayerLayer *)[self layer]).bounds = ((AVPlayerLayer *)[self layer]).bounds;
EDITED: "self" points to a PlayerView (a UIView subclass) object from the "Putting It All Together" example:
https://developer.apple.com/library/ios/#documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/02_Playback.html
Found the solution to this issue: tick the "Clip Subviews" checkbox in Interface Builder for the view whose layer you're going to attach the video player to, then set videoGravity on your AVPlayerLayer object to AVLayerVideoGravityResizeAspectFill. If you don't have the view in IB but are creating it programmatically, set its clipsToBounds property to YES.
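A minimal programmatic sketch of the same idea; playerView is a placeholder name and player is assumed to be an existing AVPlayer:
// Clip anything the aspect-filled video draws outside the view's bounds.
UIView *playerView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 320, 180)];
playerView.clipsToBounds = YES;

AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
playerLayer.frame = playerView.bounds;
[playerView.layer addSublayer:playerLayer];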
I must be doing something wrong, but I don't know what.
I'm trying to add a subview with this code:
subMenuView = [[UISubMenuViewMainController alloc] init];
[subMenuView.view setFrame:CGRectMake(10,0,990,100)];
subMenuView.view.backgroundColor = [UIColor whiteColor];
[self.view addSubview:subMenuView.view];
I want my view to be at (10, 0) with a width/height of 990/100, but I don't get the expected result.
Tell me if I'm wrong: if I want a 10x10 square view at the center, I have to add the following line:
[subMenuView.view setFrame:CGRectMake(512,384,10,10)];
That's not what I get: the position is correct, but the width/height are wrong. Any ideas?
If you use Auto Layout, calling setFrame: has no effect; try calling setTranslatesAutoresizingMaskIntoConstraints: before setFrame:.
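A short sketch of that suggestion applied to the code from the question (assuming the view is otherwise being managed by Auto Layout):
// Opt the view out of Auto Layout so the explicit frame is respected.
subMenuView.view.translatesAutoresizingMaskIntoConstraints = YES;
[subMenuView.view setFrame:CGRectMake(10.0f, 0.0f, 990.0f, 100.0f)];
[self.view addSubview:subMenuView.view];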
Problem fixed by setting:
self.view.autoresizesSubviews = NO;
The code is a little unconventional, but I'll hazard a guess it's because you are setting the view's frame before you add it as a subview. Hence the subview's position probably gets changed by layoutSubviews before you see it.
So try putting the second line of code last and see if that does the trick.
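In other words, something like this (the same code from the question, just reordered):
subMenuView = [[UISubMenuViewMainController alloc] init];
subMenuView.view.backgroundColor = [UIColor whiteColor];
[self.view addSubview:subMenuView.view];
// Set the frame only after the view is in the hierarchy.
[subMenuView.view setFrame:CGRectMake(10, 0, 990, 100)];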
Try using floats instead of ints:
[subMenuView.view setFrame:CGRectMake(10,0,990,100)];
to:
[subMenuView.view setFrame:CGRectMake(10.0f,0.0f,990.0f,100.0f)];