Apply CGAffineTransform prior to layer being displayed - ios

I'm trying to scale an AVCaptureVideoPreviewLayer so it displays a "zoomed in" preview from the camera; I can make this work using setAffineTransform:CGAffineTransformMakeScale(2.0f, 2.0f), but only once the layer is visible on screen. That is, the scale transform doesn't seem to take effect unless I call it (from within a CATransaction) during a callback like viewDidAppear.
What I'd really like is for the scale transform to be applied before the layer becomes visible, e.g. at the time I initialize it and add it to the UIView, but all my attempts to do so have proven futile. If I set the transform at that point and then read the value back once the layer is visible, it does report the desired scale transform, yet the transform has no effect on the view. I'm new to iOS programming, so I'm sure there's something basic I'm missing here; any help is appreciated!

Two simple alternative approaches, one (better) for iOS 7.x and one for earlier versions of iOS:
float zoomLevel = 2.0f;
if ([device respondsToSelector:@selector(setVideoZoomFactor:)]
    && device.activeFormat.videoMaxZoomFactor >= zoomLevel) {
    // iOS 7.x with compatible hardware
    if ([device lockForConfiguration:nil]) {
        [device setVideoZoomFactor:zoomLevel];
        [device unlockForConfiguration];
    }
} else {
    // Earlier iOS versions (or older hardware): enlarge the preview
    // layer's bounds so the visible region is effectively zoomed
    CGRect frame = captureVideoPreviewLayer.frame;
    float width = frame.size.width * zoomLevel;
    float height = frame.size.height * zoomLevel;
    float x = (frame.size.width - width) / 2;
    float y = (frame.size.height - height) / 2;
    captureVideoPreviewLayer.bounds = CGRectMake(x, y, width, height);
}

Related

CALayer frame gives strange position

I am currently trying to use a CALayer to show a mask and then crop the picture according to that mask, but I can't find a way to get the correct position and size of the mask within my image.
When I draw the mask, I use kCAGravityResizeAspectFill to keep the image's aspect ratio. In this case the layer uses the height to fill my screen height and computes the proper width and (x, y) to keep the ratio.
CGRect screenViewRect = [self.viewForBaselineLayout bounds];
CGFloat screenViewWidth = screenViewRect.size.width;
CGFloat screenViewHeight = screenViewRect.size.height;
masqueLayer.frame = CGRectMake(screenViewWidth * 0.45, screenViewHeight * 0.05, screenViewWidth * 0.10, screenViewHeight * 0.92);
masqueLayer.contents = (__bridge id)([UIImage imageNamed:masqueJustif].CGImage);
masqueLayer.contentsGravity = kCAGravityResizeAspectFill;
[self.layer insertSublayer:masqueLayer atIndex:2];
As I can see the mask on my screen, the screenViewWidth * 0.10 is clearly not respected, but my trouble comes from the fact that when I read back the layer's frame, the width isn't updated, so I can't get the real position of my layer on screen.
Is there a method to get the real position of my layer on the screen?
I am currently trying to get the crop rectangle with the following (my ratio is 21/29.7, as it is an A4 mask). This code works on iPad but not on iPhone, as the ratio is different:
CGRect outputRect = [masqueLayer convertRect:masqueLayer.bounds toLayer:self.layer];
outputRect.origin.y *= 2;
outputRect.size.height *= 2;
outputRect.size.width = outputRect.size.height * (21/29.7);
I also tried using my mask's percentages:
CGRect outputRect = masqueLayer.frame;
outputRect.origin.y = its.size.height * 0.05;
outputRect.origin.x = its.size.width * 0.45 * masqueLayer.anchorPoint.x;
outputRect.size.height = its.size.height * 0.92;
outputRect.size.width = its.size.height * 0.92 * (21/29.7);
Here is a screenshot of my mask over another layer. I want to extract the image bounded by the blue corners (the border of my layer).
Thanks.

Aligning arrow with content view with custom iPad popover

I am customizing the appearance of my popovers by subclassing UIPopoverBackgroundView. Everything is working fine, except there are some instances where there is a gap between the arrow and the content view. If I keep rotating the device to dismiss the popover then bring it back up, sometimes they line up and others they do not. I'm logging the frame sizes and they are consistently the same, so I have no idea what's going on. I'm attaching screenshots and my layoutSubviews code below.
Where you see me subtracting 4 in the case of arrow direction right, this was based on trial and error to get it to work at least some of the time. I dislike coding like this because I don't know what this 4-point gap is or why I don't need it for every arrow direction.
In the images the 1st shot shows the problem. This happened on a random triggering of the popover. The next image shows it randomly working when triggered later within the same execution of the app.
Also worth mentioning, I found some sample code at http://blog.teamtreehouse.com/customizing-the-design-of-uipopovercontroller and tried this version of layoutSubviews, even though it's already very similar to what I have. Even with this code I was still seeing the same issue.
EDIT: Worth mentioning it might be obvious from my screenshots but I am not including a border for my popover:
+ (UIEdgeInsets)contentViewInsets
{
return UIEdgeInsetsMake(0.0f, 0.0f, 0.0f, 0.0f);
}
Possibly also worth mentioning I am not using an image file for my arrow image but rather am drawing it custom. Again, even when I comment out the drawing I'm still having the issue, so I don't think it's related, but wanted to mention it for full disclosure.
Any help greatly appreciated. Thanks.
CGSize arrowSize = CGSizeMake(kCMAPopoverBackgroundViewArrowBase, kCMAPopoverBackgroundViewArrowHeight);
// Leaving code out for drawArrowImage - even when I leave the arrowImageView as
// an empty image view and set a non-clear background color this gap is still
// sporadically showing up
self.arrowImageView.image = [self drawArrowImage:arrowSize];

float arrowXCoordinate = 0.0f;
float arrowYCoordinate = 0.0f;
CGAffineTransform arrowRotation = CGAffineTransformIdentity;
float borderMargin = 2.0f;

switch (self.arrowDirection) {
    case UIPopoverArrowDirectionUp:
        arrowXCoordinate = ((self.bounds.size.width / 2.0f) - (arrowSize.width / 2.0f));
        arrowYCoordinate = 0.0f;
        break;
    case UIPopoverArrowDirectionDown:
        arrowXCoordinate = ((self.bounds.size.width / 2.0f) - (arrowSize.width / 2.0f));
        arrowYCoordinate = self.bounds.size.height - arrowSize.height;
        arrowRotation = CGAffineTransformMakeRotation(M_PI);
        break;
    case UIPopoverArrowDirectionLeft:
        arrowXCoordinate = 0.0f - (arrowSize.height / 2.0f) + borderMargin;
        arrowYCoordinate = ((self.bounds.size.height / 2.0f) - (arrowSize.height / 2.0f));
        arrowRotation = CGAffineTransformMakeRotation(-M_PI_2);
        break;
    case UIPopoverArrowDirectionRight:
        arrowXCoordinate = self.bounds.size.width - arrowSize.height - 4.0f;
        arrowYCoordinate = ((self.bounds.size.height / 2.0f) - (arrowSize.height / 2.0f));
        arrowRotation = CGAffineTransformMakeRotation(M_PI_2);
        break;
    default:
        break;
}
self.arrowImageView.frame = CGRectMake(arrowXCoordinate, arrowYCoordinate, arrowSize.width, arrowSize.height);
[self.arrowImageView setTransform:arrowRotation];
Is your arrow image square? If not, I'm guessing that the transform is affecting its frame slightly and this is generating the unexpected gap.
Possibly you are getting subpixel values in your calculations (most likely x.5).
Try using floorf or ceilf to make sure your values land exactly on integral pixel values.
For example...
arrowXCoordinate = floorf(((self.bounds.size.width / 2.0f) - (arrowSize.width / 2.0f)));
See floor reference.

Pentagon with UISLiders as circumradii

I want to draw a pentagon in iOS with UISliders as the circumradii (a set of five UISliders going in different directions and originating from the center of a pentagon). Currently I have rotated five UISliders around a common anchorPoint (via setAnchorPoint:), but they don't seem to originate from a common point.
How can I go about this? Do I need to start working on a custom control? What more from Quartz2D can I use?
Some specific indicators would be useful (please don't just say "Use Quartz2D")
Here is what I have achieved Imgur
And here is what I want to achieve Imgur
I defined the various sliders as an IBOutletCollection, so I can access them as an array in the controller. Then I run the following code in viewWillAppear:
CGPoint viewCenter = self.view.center;
for (NSInteger i = 0, count = self.sliders.count; i < count; ++i) {
    UISlider *slider = self.sliders[i];
    CGFloat angle = 2.0f * M_PI / count;
    CALayer *layer = slider.layer;
    layer.anchorPoint = CGPointMake(0.0f, 0.5f);
    layer.position = viewCenter;
    layer.transform = CATransform3DMakeRotation(i * angle - M_PI_2, 0.0f, 0.0f, 1.0f);
}
Here's a screenshot of the result:
Is this what you want?
I would try just to turn the sliders. The following code should help:
firstSlider.transform = CGAffineTransformRotate(firstSlider.transform, 72.0 / 180 * M_PI);
secondSlider.transform = CGAffineTransformRotate(secondSlider.transform, 144.0 / 180 * M_PI);
//and so on
The 72.0 / 180 * M_PI converts degrees to radians.
Hope it helps.
P.S. I don't have a Mac nearby to check that it works.

Acceleration, moving items

I'm developing a game that uses the accelerometer to play. I've figured out how to make my item move, but not how to change its 'origin', or more precisely, the origin used for the acceleration calculation:
In fact, my image moves, and its center is defined like this:
imageView.center = CGPointMake(230, 240);
As you can see, I use landscape mode. But I want my image to move "progressively". What I mean by progressively is like in the game Lane Splitter:
You can see that the bike moves; for example, when it's completely on the left side, the player can hold the iPad level, but the bike doesn't go back to the middle of the screen. I don't know how to do that: whenever I try a solution, my image moves but snaps back to the center as soon as my iPhone is level. I understand why, but I don't know how to correct it.
This is my current code:
- (void)accelerometer:(UIAccelerometer *)accelerometer didAccelerate:(UIAcceleration *)acceleration
{
    int i = 0;
    float current;
    if (i == 0)
    {
        imageView.center = CGPointMake(230, 240);
        current = 240;
        i++;
    }
    //try to modify the origin of acceleration
    imageView.center = CGPointMake(230, current - (acceleration.y * 200));
    current = imageView.center.y;
}
The problem is that i is a local variable. Your code is equivalent to
imageView.center = CGPointMake(230, 240);
float current = 240;
imageView.center = CGPointMake(230, current - (acceleration.y*200));
[imageView center];
Instead, try something like this (assuming your image view is at the right location on startup):
CGPoint current = imageView.center;
current.y -= acceleration.y*200;
imageView.center = current;
Also bear in mind that acceleration.y is in the device coordinate space; you'll need to compensate for interface rotation if your UI supports multiple orientations.

How to track successive CGAffineTransforms so as to arrive at a specific point on screen?

How do you execute multiple CGAffineTransform operations (in animation blocks) without keeping track of every operation executed?
The translation operation doesn't take x,y coordinates but rather offsets to shift by. So unless you know where you are currently translated to, say at "location 2," how do you know what values to shift by to get to "location 3?"
For example:
A UIView's frame is at (0, 0) - Position 1. I set the transform to translate to (768, 0) and rotate -90 degrees - Position 2. Some time passes and now I want to move to (768, 1024) and rotate another -90 degrees - Position 3.
How do I know what to translate by to move from Position 2 to Position 3?
In context, I'm trying to achieve an iPad view like the following:
a UIView that takes up the entire screen
a UIToolbar that takes up the top edge and is on top of the UIView
when the iPad rotates, the UIView stays with the device, but the toolbar will rotate so that it is always on the top edge of the screen.
I am using CGAffineTransform translate and rotate to move the toolbar. Works great, except when I rotate the iPad multiple times. The first translate/rotate will work perfect. The following transforms will be off because I don't know the correct values to shift by.
UPDATE:
It looks like if I take the current translation (tx, ty) values in the UIView.transform struct and use the difference between them and the new location, it works. However, if the view has been rotated, this does not work. The tx and ty values can be flipped because of the previous rotation. I'm not sure how to handle that.
UPDATE 2:
Doing some research, I've found that I can get the original, unrotated points from tx, ty by getting the abs value of the points and possibly swapping x and y if the UIView is perpendicular. Now I am stuck figuring out how to correctly apply the next set of transforms in the right order. It seems no matter how I concat them, the UIView ends up in the wrong place. It just seems like this is all too complicated and there must be an easier way.
The answer is, apparently, you don't track the transforms.
So the way to rotate the toolbar around the screen is not by concatenating a rotate and a translate transform. Instead, create a rotate transform and set the frame inside the animation block. Then, based on the new UIInterfaceOrientation, set the rotation in degrees using the compass values 0, -90, -180, -270, and set the frame origin and size from the same orientations.
So:
CGPoint portrait = CGPointMake(0, 0);
CGPoint landscapeLeft = CGPointMake(768 - 44, 0);
CGPoint landscapeRight = CGPointMake(0, 0);
CGPoint upsideDown = CGPointMake(0, 1024 - 44);
CGSize portraitSize = CGSizeMake(768, 44);
CGSize landscapeLeftSize = CGSizeMake(44, 1024);
CGSize landscapeRightSize = CGSizeMake(44, 1024);
CGSize upsideDownSize = CGSizeMake(768, 44);

CGFloat rotation;
CGRect newLocation;
switch (orientation) {
    case UIDeviceOrientationPortrait:
        NSLog(@"Changing to Portrait");
        newLocation.origin = portrait;
        newLocation.size = portraitSize;
        rotation = 0.0;
        break;
    case UIDeviceOrientationLandscapeRight:
        NSLog(@"Changing to Landscape Right");
        newLocation.origin = landscapeRight;
        newLocation.size = landscapeRightSize;
        rotation = -90.0;
        break;
    case UIDeviceOrientationLandscapeLeft:
        NSLog(@"Changing to Landscape Left");
        newLocation.origin = landscapeLeft;
        newLocation.size = landscapeLeftSize;
        rotation = -270.0;
        break;
    case UIDeviceOrientationPortraitUpsideDown:
        NSLog(@"Changing to Upside Down");
        newLocation.origin = upsideDown;
        newLocation.size = upsideDownSize;
        rotation = -180.0;
        break;
    default:
        NSLog(@"Unknown orientation: %d", orientation);
        newLocation.origin = portrait;
        newLocation.size = portraitSize;
        rotation = 0.0;
        break;
}

CGRect frame = newLocation;
// DEGREES_TO_RADIANS is a helper macro, e.g. #define DEGREES_TO_RADIANS(d) ((d) * M_PI / 180.0)
CGAffineTransform transform = CGAffineTransformMakeRotation(DEGREES_TO_RADIANS(rotation));
if (lastOrientation) {
    [UIView beginAnimations:nil context:NULL];
    [UIView setAnimationDuration:.3];
    [UIView setAnimationCurve:UIViewAnimationCurveEaseIn];
    [UIView setAnimationBeginsFromCurrentState:YES];
}
toolbar.transform = transform;
toolbar.frame = frame;
// Commit the changes
if (lastOrientation) {
    [UIView commitAnimations];
}
lastOrientation = orientation;
This works beautifully. However, an unexpected problem is that UI elements that iOS shows on your behalf are not oriented correctly. I.e., modal windows and popovers all keep the same orientation as the underlying UIView. That problem renders this whole thing moot.