Find the direction a UISlider is pointing - iOS

I have a UIView class that contains a custom UISlider. When this UIView is added to a view controller it is randomly rotated using:
newSlider.transform = CGAffineTransformRotate(newSlider.transform, degreesToRadians(random));
Now what I'm trying to do is animate the UISlider thumb image flying off the end of the slider.
The problem I'm facing is getting the coordinates of the start/end of the slider so I can work out which way the thumb image is travelling.
I have tried using trackRectForBounds: but it gives me the exact same coordinates regardless of the rotation applied to the UIView.
I have tried these inside my UIView class:
CGRect trackRect = [customSlider trackRectForBounds:customSlider.bounds];
and
CGRect trackRect2 = [customSlider trackRectForBounds:self.window.bounds];
which give me {{2, 1}, {288, 50}} & {{2, 215}, {316, 50}} regardless of rotation. I think it's giving me the rect in the UIView's own coordinates rather than in screen coordinates.

Getting the end points of the slider can be done like this:
// Read the rotation (in radians) applied to the slider's layer
CGFloat rotation = [[newSlider.layer valueForKeyPath:@"transform.rotation.z"] floatValue];
CGFloat halfWidth = newSlider.bounds.size.width / 2;
CGPoint sliderCenter = newSlider.center;
sliderCenter.y += newSlider.bounds.size.height / 2; // this shouldn't be needed, but it seems it is
// Project half the track length out from the center in both directions
CGFloat x1 = sliderCenter.x - halfWidth * cos(rotation);
CGFloat x2 = sliderCenter.x + halfWidth * cos(rotation);
CGFloat y1 = sliderCenter.y - halfWidth * sin(rotation);
CGFloat y2 = sliderCenter.y + halfWidth * sin(rotation);
CGPoint T1 = CGPointMake(x1, y1); // one end of the track
CGPoint T2 = CGPointMake(x2, y2); // the other end of the track
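For the original goal of animating the thumb flying off the end of the track, a minimal sketch built on the values above (thumbImageView and flyOffDistance are assumptions, not part of the original answer) could look like this:
CGFloat flyOffDistance = 200.0; // how far past the end of the track the thumb should travel (assumed)
// Continue along the track's direction vector, which is (cos(rotation), sin(rotation))
CGPoint destination = CGPointMake(T2.x + flyOffDistance * cos(rotation),
                                  T2.y + flyOffDistance * sin(rotation));
// thumbImageView is assumed to be a UIImageView holding the thumb image,
// added to the slider's superview and initially centred on T2
[UIView animateWithDuration:0.4 animations:^{
    thumbImageView.center = destination;
} completion:^(BOOL finished) {
    [thumbImageView removeFromSuperview];
}];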

Related

how do I make the base of each UIView a tangent to a circle so they radiate from the centre?

I am trying to find angles of rotation for a series of light and dark rectangular UIViews placed at regular points on a circle's perimeter. Each point on the circle is calculated as an angle of displacement from the centre, and I have tried using the same angle to rotate each UIView so it radiates from the centre. But I didn't expect it to look like this.
I expected the angle of displacement from the centre to be the same as the angle of rotation for each new UIView. Is this assumption correct? And if so, how do I make the base of each UIView a tangent to a circle so they radiate from the centre?
UPDATE
In case someone finds it useful, here is an update of my original code. The problem, as explained by rmaddy, has been rectified.
I've included two versions of the transform statement and their resulting rotated UIViews. The result on the left uses radians + arcStart + M_PI / 2.0; the result on the right uses radians + arcStart.
Here is the method.
- (void)sprocket
{
    CGRect canvas = [UIScreen mainScreen].bounds;
    CGPoint circleCentre = CGPointMake(canvas.size.width / 2, canvas.size.height / 2);
    CGFloat width = 26.0f;
    CGFloat height = 50.0f;
    CGPoint miniViewCentre;
    CGFloat circleRadius = 90;
    int miniViewCount = 16;

    for (int i = 0; i < miniViewCount; i++)
    {
        // to place the next view, calculate angular displacement along an arc
        CGFloat circumference = 2 * M_PI;
        CGFloat radians = circumference * i / miniViewCount;
        CGFloat arcStart = M_PI + 1.25f; // start the circle from this point, in radians
        miniViewCentre.x = circleCentre.x + circleRadius * cos(radians + arcStart);
        miniViewCentre.y = circleCentre.y + circleRadius * sin(radians + arcStart);

        CGPoint placeMiniView = CGPointMake(miniViewCentre.x, miniViewCentre.y);
        CGRect swivellingFrame = CGRectMake(placeMiniView.x, placeMiniView.y, width, height);
        UIView *miniView = [[UIView alloc] initWithFrame:swivellingFrame];

        if ((i % 2) == 0)
        {
            miniView.backgroundColor = [UIColor darkGrayColor];
        }
        else
        {
            miniView.backgroundColor = [UIColor grayColor];
        }

        miniView.layer.borderWidth = 1;
        miniView.layer.borderColor = [UIColor blackColor].CGColor;
        miniView.layer.cornerRadius = 3;
        miniView.clipsToBounds = YES;
        miniView.layer.masksToBounds = YES;
        miniView.alpha = 1.0;

        // using the same angle, rotate the view around its centre
        miniView.transform = CGAffineTransformRotate(miniView.transform, radians + arcStart + M_PI / 2.0);

        [page1 addSubview:miniView];
    }
}
The problem is that your calculation of the center of each miniView is based on radians plus arcStart, but the transform of each miniView is based only on radians.
Also note that angle 0 is at the 3 o'clock position of the circle. You actually want an additional 90° (or π/2 radians) rotation of miniView so the rectangle "sticks out" from the circle.
You need two small changes to make your code work:
Change the loop to:
for (int i = 0; i < miniViewCount; i++)
And change the transform:
miniView.transform = CGAffineTransformRotate(miniView.transform, radians + arcStart + M_PI / 2.0);

SKSpriteNode ScaleUpFrom?

Problem
Scaling up a 2D image scales it up from the image's centre point; however, I need it to scale up from a specific coordinate.
[backgroundImage setScale:rifleZoom];
This is my current technique for scaling up the image.
I need to scale up from the centre of the screen as opposed to the centre of the image.
The way I currently place a rifle shot at the centre of the screen is this:
CGPoint positionNow = CGPointMake(backgroundImage.position.x, backgroundImage.position.y);
CGPoint positionPrev = CGPointMake(0.5, 0.5);
float xdiff = positionNow.x - positionPrev.x;
float ydiff = positionNow.y - positionPrev.y;
CGPoint newPositionOne = CGPointMake(0.5 - xdiff, 0.5 - ydiff);
newPositionOne = CGPointMake(newPositionOne.x/rifleZoom, newPositionOne.y/rifleZoom);
Now this works perfectly no matter how much the image is scaled; however, I cannot seem to apply it to scaling the image up from the centre of the screen as opposed to the centre of the image.
What I've Tried
I've tried changing the position of the image before scaling it up, to the same point as newPositionOne. However, this either does not work or is not being done right.
EDIT
This scales and brings the centre to the screen centre point, or messes up completely. It's a little too off the cuff to say exactly what it's doing.
CGPoint positionNow = CGPointMake(backgroundImage.position.x, backgroundImage.position.y);
CGPoint positionPrev = CGPointMake(0.5, 0.5);
float xdiff = positionNow.x - positionPrev.x;
float ydiff = positionNow.y - positionPrev.y;
CGPoint newPositionOne = CGPointMake(0.5 - xdiff, 0.5 - ydiff);
newPositionOne = CGPointMake(newPositionOne.x/rifleZoom, newPositionOne.y/rifleZoom);
double xPosition = newPositionOne.x / backgroundImage.size.width;
double yPosition = newPositionOne.y / backgroundImage.size.height;
CGPoint prevAnchorPoint = backgroundImage.anchorPoint;
backgroundImage.anchorPoint = CGPointMake(xPosition,yPosition);
double positionX = backgroundImage.position.x + (backgroundImage.anchorPoint.x - prevAnchorPoint.x) * backgroundImage.size.width;
double positionY = backgroundImage.position.y + (backgroundImage.anchorPoint.y - prevAnchorPoint.y) * backgroundImage.size.height;
backgroundImage.position = CGPointMake(positionX,positionY);
[backgroundImage runAction:[SKAction repeatAction:[SKAction scaleTo:rifleZoom duration:1.0] count:1]];
You can use the anchorPoint property of the background node to change the point from which the image scales. By default the anchorPoint is at (0.5, 0.5), which indicates the center of the node. If you make the anchorPoint (0, 0), then it's moved to the bottom-left corner.
anchorPoint: Defines the point in the sprite that corresponds to the node's position. You specify the value for this property in the unit coordinate space. The default value is (0.5, 0.5), which means that the sprite is centered on its position.
When you shift the anchorPoint, you have to adjust the background position again to counteract the movement due to changing the anchorPoint.
So you can use:
CGFloat xPosition = convertedPoint.x / background.size.width;
CGFloat yPosition = convertedPoint.y / background.size.height;
CGPoint prevAnchorPoint = background.anchorPoint;
background.anchorPoint = CGPointMake(xPosition, yPosition);
CGFloat positionX = background.position.x + (background.anchorPoint.x - prevAnchorPoint.x) * background.size.width;
CGFloat positionY = background.position.y + (background.anchorPoint.y - prevAnchorPoint.y) * background.size.height;
background.position = CGPointMake(positionX, positionY);
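The snippet above assumes convertedPoint is the zoom origin expressed in the background node's own coordinate space. One way to obtain it (a sketch only, assuming the zoom should originate from the centre of the view and that self here is the SKScene presenting background):
// Centre of the screen in view coordinates (the assumed zoom origin)
CGPoint viewCentre = CGPointMake(CGRectGetMidX(self.view.bounds), CGRectGetMidY(self.view.bounds));
// Map it into scene coordinates, then into the background node's coordinate space
CGPoint scenePoint = [self convertPointFromView:viewCentre];
CGPoint convertedPoint = [background convertPoint:scenePoint fromNode:self];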

Zoom a rotated image inside scroll view to fit (fill) frame of overlay rect

Through this question and answer I've now got a working means of detecting when an arbitrarily rotated image doesn't completely cover a cropping rect.
The next step is to figure out how to correctly adjust its containing scroll view's zoom to ensure that there are no empty spaces inside the cropping rect. To clarify, I want to enlarge (zoom in) the image; the crop rect should remain un-transformed.
The layout hierarchy looks like this:
containing UIScrollView
UIImageView (this gets arbitrarily rotated)
crop rect overlay view
... where the UIImageView can also be zoomed and panned inside the scrollView.
There are 4 gesture events that need to be accounted for:
1. Pan gesture (done): accomplished by detecting when it's been panned incorrectly and resetting the contentOffset.
2. Rotation (CGAffineTransform)
3. Scroll view zoom
4. Adjustment of the cropping rect overlay frame
As far as I can tell, I should be able to use the same logic for 2, 3, and 4 to adjust the zoomScale of the scroll view to make the image fit properly.
How do I properly calculate the zoom ratio necessary to make the rotated image fit perfectly inside the crop rect?
To better illustrate what I'm trying to accomplish, here's an example of the incorrect size:
I need to calculate the zoom ratio necessary to make it look like this:
Here's the code I've got so far using Oluseyi's solution below. It works when the rotation angle is minor (e.g. less than 1 radian), but anything over that and it goes really wonky.
CGRect visibleRect = [_scrollView convertRect:_scrollView.bounds toView:_imageView];
CGRect cropRect = _cropRectView.frame;
CGFloat rotationAngle = fabs(self.rotationAngle);

CGFloat a = visibleRect.size.height * sinf(rotationAngle);
CGFloat b = visibleRect.size.width * cosf(rotationAngle);
CGFloat c = visibleRect.size.height * cosf(rotationAngle);
CGFloat d = visibleRect.size.width * sinf(rotationAngle);

CGFloat zoomDiff = MAX(cropRect.size.width / (a + b), cropRect.size.height / (c + d));
CGFloat newZoomScale = (zoomDiff > 1) ? zoomDiff : 1.0 / zoomDiff;

[UIView animateWithDuration:0.2
                      delay:0.05
                    options:0
                 animations:^{
                     [self centerToCropRect:[self convertRect:cropRect toView:self.zoomingView]];
                     _scrollView.zoomScale = _scrollView.zoomScale * newZoomScale;
                 } completion:^(BOOL finished) {
                     if (![self rotatedView:_imageView containsViewCompletely:_cropRectView])
                     {
                         // Damn, it's still broken - this happens a lot
                     }
                     else
                     {
                         // Woo! Fixed
                     }
                     _didDetectBadRotation = NO;
                 }];
Note that I'm using Auto Layout, which makes frames and bounds goofy.
Assume your image rectangle (blue in the diagram) and crop rectangle (red) have the same aspect ratio and center. When rotated, the image rectangle now has a bounding rectangle (green) which is what you want your crop scaled to (effectively, by scaling down the image).
To scale effectively, you need to know the dimensions of the new bounding rectangle and use a scale factor that fits the crop rect into it. The dimensions of the bounding rectangle are rather obviously
(a + b) x (c + d)
Notice that each segment a, b, c, d is either the adjacent or opposite side of a right triangle formed by the bounding rect and the rotated image rect.
a = image_rect_height * sin(rotation_angle)
b = image_rect_width * cos(rotation_angle)
c = image_rect_width * sin(rotation_angle)
d = image_rect_height * cos(rotation_angle)
Your scale factor is simply
MAX(crop_rect_width / (a + b), crop_rect_height / (c + d))
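As a rough translation of that formula into code (a sketch only; imageRect, cropRect, and rotationAngle are assumed inputs rather than names from the question):
// Dimensions of the bounding box of the rotated image rect, per the segments above
CGFloat a = imageRect.size.height * fabs(sin(rotationAngle));
CGFloat b = imageRect.size.width * fabs(cos(rotationAngle));
CGFloat c = imageRect.size.width * fabs(sin(rotationAngle));
CGFloat d = imageRect.size.height * fabs(cos(rotationAngle));
// Scale factor that fits the crop rect to that bounding box
CGFloat scaleFactor = MAX(cropRect.size.width / (a + b), cropRect.size.height / (c + d));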
Here's a reference diagram:
Fill frame of overlay rect:
For a square crop you need to know the new bounds of the rotated image that will fill the crop view.
Let's take a look at the reference diagram:
You need to find the altitude of a right triangle (see image number 2). Both altitudes are equal.
CGFloat sinAlpha = sin(alpha);
CGFloat cosAlpha = cos(alpha);
CGFloat hypotenuse = /* calculate */;
CGFloat altitude = hypotenuse * sinAlpha * cosAlpha;
Then you need to calculate the new width for the rotated image and the desired scale factor as follows:
CGFloat newWidth = previousWidth + altitude * 2;
CGFloat scale = newWidth / previousWidth;
I have implemented this method here.
I will answer using sample code, but basically this problem becomes really easy if you think in the rotated view's coordinate system.
// container plays the role of the crop rect; content2 plays the role of the rotated content
UIView *container = [[UIView alloc] initWithFrame:CGRectMake(80, 200, 100, 100)];
container.backgroundColor = [UIColor blueColor];
UIView *content2 = [[UIView alloc] initWithFrame:CGRectMake(-50, -50, 150, 150)];
content2.backgroundColor = [[UIColor greenColor] colorWithAlphaComponent:0.5];
[container addSubview:content2];
[self.view setBackgroundColor:[UIColor blackColor]];
[self.view addSubview:container];
// rotate the content inside the container
[container.layer setSublayerTransform:CATransform3DMakeRotation(M_PI / 8.0, 0, 0, 1)];

// And now the calculations
// Express the container's bounds in the rotated content's coordinate system
CGRect containerFrameInContentCoordinates = [content2 convertRect:container.bounds fromView:container];
CGRect unionBounds = CGRectUnion(content2.bounds, containerFrameInContentCoordinates);
CGFloat midX = CGRectGetMidX(content2.bounds);
CGFloat midY = CGRectGetMidY(content2.bounds);
// Required scale on each side: distance from the content's centre to the union's edge,
// divided by the content's own half-size on that axis
CGFloat scaleX1 = (-1 * CGRectGetMinX(unionBounds) + midX) / midX;
CGFloat scaleX2 = (CGRectGetMaxX(unionBounds) - midX) / midX;
CGFloat scaleY1 = (-1 * CGRectGetMinY(unionBounds) + midY) / midY;
CGFloat scaleY2 = (CGRectGetMaxY(unionBounds) - midY) / midY;
CGFloat scaleX = MAX(scaleX1, scaleX2);
CGFloat scaleY = MAX(scaleY1, scaleY2);
CGFloat scale = MAX(scaleX, scaleY);
// Scale the content up just enough to cover the container
content2.transform = CGAffineTransformScale(content2.transform, scale, scale);
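If the goal is to drive the containing scroll view's zoom rather than a view transform (a guess at how this sample maps onto the asker's hierarchy, where the rotated UIImageView plays the role of content2), the same factor would presumably be applied to the zoom scale instead:
// Apply the computed cover-the-crop factor through the scroll view
_scrollView.zoomScale = _scrollView.zoomScale * scale;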

Calculate rotation angle for anchor point

I got stuck with a problem where I need to reposition views to predefined locations.
All views have a UIPanGestureRecognizer and a UIRotationGestureRecognizer and are positioned/rotated inside the controllers view. Upon a certain event the views should move to a new position with a new rotation angle.
Everything works fine, but as soon as one of the gesture recognizers has been active (and thus the anchorPoint has changed), repositioning/rotation fails.
Here is the method I use to try to take the shift in the anchorPoint into account.
- (CGPoint)centerPointWithInVisibleAreaForPoint:(CGPoint)point
{
    CGPoint anchorP = self.layer.anchorPoint;
    anchorP.x -= 0.5;
    anchorP.y -= 0.5;

    CGRect rect = self.bounds;
    CGFloat widthDelta = CGRectGetWidth(self.bounds) * anchorP.x;
    CGFloat heightDelta = CGRectGetHeight(self.bounds) * anchorP.y;

    CGPoint newCenter = CGPointMake(point.x + widthDelta, point.y + heightDelta);
    return newCenter;
}
The controller asks for the corrected center point and sets the center value of the view. Afterwards, the rotation transform is set using CGAffineTransformConcat(view.transform, CGAffineTransformMakeRotation(differenceAngle)).
I think the problem is caused by the fact that the predefined target angle is based on a rotation around the center, which is obviously different when rotating around a different anchorPoint, but I don't know how to compensate for that.
The only solution I found (and it is, after all, the easiest one) is to reset the anchorPoint to 0.5/0.5 and correct the position accordingly.
- (void)resetAnchorPoint
{
    if (!CGPointEqualToPoint(self.layer.anchorPoint, CGPointMake(0.5, 0.5))) {
        CGFloat width = CGRectGetWidth(self.bounds);
        CGFloat height = CGRectGetHeight(self.bounds);

        CGPoint newPoint = CGPointMake(width * 0.5, height * 0.5);
        CGPoint oldPoint = CGPointMake(width * self.layer.anchorPoint.x, height * self.layer.anchorPoint.y);
        newPoint = CGPointApplyAffineTransform(newPoint, self.transform);
        oldPoint = CGPointApplyAffineTransform(oldPoint, self.transform);

        CGPoint position = self.layer.position;
        position.x += (newPoint.x - oldPoint.x);
        position.y += (newPoint.y - oldPoint.y);

        [CATransaction setDisableActions:YES];
        self.layer.position = position;
        self.layer.anchorPoint = CGPointMake(0.5, 0.5);
        [CATransaction setDisableActions:NO];
    }
}
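As a usage sketch (an assumption about where this fits, not from the original post), one place to call it is when a gesture ends, before the view is repositioned:
- (void)handleRotation:(UIRotationGestureRecognizer *)gesture
{
    // ... apply the rotation to gesture.view as usual ...
    if (gesture.state == UIGestureRecognizerStateEnded || gesture.state == UIGestureRecognizerStateCancelled) {
        // Put the anchor point back at the centre so later center/transform
        // calculations behave as expected; RepositionableView is a hypothetical class name
        [(RepositionableView *)gesture.view resetAnchorPoint];
    }
}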

Get the distance between two CGPoints in a UIImageView nested in a UIScrollView

CGRect screenSize = [[UIScreen mainScreen] bounds];
CGFloat screenWidth = screenSize.size.width;

CGRect visibleRect;
visibleRect.origin = scrollView.contentOffset;
visibleRect.size = imageView.frame.size;

CGPoint midPoint = CGPointMake(visibleRect.origin.x + visibleRect.size.width / 2,
                               visibleRect.origin.y + visibleRect.size.height / 2);
CGPoint place = CGPointMake(class.xValue, class.yValue);
xValue and yValue being ints declared in the class.
- (int)distanceFrom:(CGPoint)point1 to:(CGPoint)point2
{
    CGFloat xDist = (point2.x - point1.x);
    CGFloat yDist = (point2.y - point1.y);
    return (int)sqrt((xDist * xDist) + (yDist * yDist));
}
The problem is that once I change the zoomScale by scrolling in the app and call the method again, the numbers change drastically (~1000 pixels). How do I take the zoomScale into account? Thanks in advance,
Alex
When you zoom in, the result from the distance method is multiplied by the zoomScale of the scroll view. If you want the actual distance, try dividing the distance by the scroll view's zoomScale.
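A minimal sketch of that adjustment, assuming the distanceFrom:to: method and scrollView shown above:
// Distance measured in the zoomed content's coordinates
int zoomedDistance = [self distanceFrom:midPoint to:place];
// Dividing by the current zoomScale gives the distance at the original (unzoomed) scale
CGFloat actualDistance = zoomedDistance / scrollView.zoomScale;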
