SKSpriteNode ScaleUpFrom? - ios

Problem
Scaling up a 2D image scales it from the image's centre point; however, I need it to scale up from a specific co-ordinate.
[backgroundImage setScale:rifleZoom];
This is my current technique for scaling up the image.
I need to scale up from the centre of the screen as opposed to the centre of the image.
Currently, the way I place a rifle shot at the screen centre is this:
CGPoint positionNow = CGPointMake(backgroundImage.position.x, backgroundImage.position.y);
CGPoint positionPrev = CGPointMake(0.5, 0.5);
float xdiff = positionNow.x - positionPrev.x;
float ydiff = positionNow.y - positionPrev.y;
CGPoint newPositionOne = CGPointMake(0.5- xdiff, 0.5 - ydiff);
newPositionOne = CGPointMake(newPositionOne.x/rifleZoom, newPositionOne.y/rifleZoom);
This works perfectly no matter how much the image is scaled; however, I cannot seem to apply it to scaling the image up from the centre of the screen as opposed to the centre of the image.
What I've Tried
I've tried changing the position of the image to the same point as newPositionOne before scaling it up; however, this either does not work or I am not doing it right.
EDIT
This scales and brings the centre to the screen's centre point, or messes up completely. It's a little too erratic to say exactly what it's doing.
CGPoint positionNow = CGPointMake(backgroundImage.position.x, backgroundImage.position.y);
CGPoint positionPrev = CGPointMake(0.5, 0.5);
float xdiff = positionNow.x - positionPrev.x;
float ydiff = positionNow.y - positionPrev.y;
CGPoint newPositionOne = CGPointMake(0.5- xdiff, 0.5 - ydiff);
newPositionOne = CGPointMake(newPositionOne.x/rifleZoom, newPositionOne.y/rifleZoom);
double xPosition = newPositionOne.x / backgroundImage.size.width;
double yPosition = newPositionOne.y / backgroundImage.size.height;
CGPoint prevAnchorPoint = backgroundImage.anchorPoint;
backgroundImage.anchorPoint = CGPointMake(xPosition,yPosition);
double positionX = backgroundImage.position.x + (backgroundImage.anchorPoint.x - prevAnchorPoint.x) * backgroundImage.size.width;
double positionY = backgroundImage.position.y + (backgroundImage.anchorPoint.y - prevAnchorPoint.y) * backgroundImage.size.height;
backgroundImage.position = CGPointMake(positionX,positionY);
[backgroundImage runAction:[SKAction repeatAction:[SKAction scaleTo:rifleZoom duration:1.0] count:1]];

You can use the anchorPoint property of the background node to change the point from which the image scales. By default the anchorPoint is at (0.5,0.5), which indicates the center of the node. If you make the anchorPoint (0,0), it moves to the bottom-left corner.
anchorPoint: Defines the point in the sprite that corresponds to the node's position. You specify the value for this property in the unit coordinate space. The default value is (0.5,0.5), which means that the sprite is centered on its position.
When you shift the anchorPoint, you have to adjust the background position again to counteract the movement due to changing the anchorPoint.
So you can use:
CGFloat xPosition = convertedPoint.x / background.size.width;
CGFloat yPosition = convertedPoint.y / background.size.height;
CGPoint prevAnchorPoint = background.anchorPoint;
background.anchorPoint = CGPointMake(xPosition, yPosition);
CGFloat positionX = background.position.x + (background.anchorPoint.x - prevAnchorPoint.x) * background.size.width;
CGFloat positionY = background.position.y + (background.anchorPoint.y - prevAnchorPoint.y) * background.size.height;
background.position = CGPointMake(positionX, positionY);

Related

how do I make the base of each UIView a tangent to a circle so they radiate from the centre?

I am trying to find angles of rotation for a series of light and dark rectangular UIViews placed at regular points on a circle perimeter. Each point on the circle is calculated as an angle of displacement from the centre and I have tried using the same angle to rotate each UIView so it radiates from the centre. But I didn't expect it to look like this.
I expected the angle of displacement from the centre to be the same as the angle of rotation for each new UIView. Is this assumption correct? And if so, how do I make the base of each UIView a tangent to a circle so they radiate from the centre?
UPDATE
In case someone finds it useful here is an update of my original code. The problem as explained by rmaddy has been rectified.
I’ve included two versions of the transform statement and their resulting rotated UIViews. The result on the left uses radians + arcStart + M_PI / 2.0; the result on the right uses radians + arcStart.
Here is the method.
- (void)sprocket
{
    CGRect canvas = [UIScreen mainScreen].bounds;
    CGPoint circleCentre = CGPointMake((canvas.size.width)/2, (canvas.size.height)/2);
    CGFloat width = 26.0f;
    CGFloat height = 50.0f;
    CGPoint miniViewCentre;
    CGFloat circleRadius = 90;
    int miniViewCount = 16;

    for (int i = 0; i < miniViewCount; i++)
    {
        // to place the next view calculate angular displacement along an arc
        CGFloat circumference = 2 * M_PI;
        CGFloat radians = circumference * i / miniViewCount;
        CGFloat arcStart = M_PI + 1.25f; // start circle from this point in radians
        miniViewCentre.x = circleCentre.x + circleRadius * cos(radians + arcStart);
        miniViewCentre.y = circleCentre.y + circleRadius * sin(radians + arcStart);
        CGPoint placeMiniView = CGPointMake(miniViewCentre.x, miniViewCentre.y);
        CGRect swivellingFrame = CGRectMake(placeMiniView.x, placeMiniView.y, width, height);
        UIView *miniView = [[UIView alloc] initWithFrame:swivellingFrame];
        if ((i % 2) == 0)
        {
            miniView.backgroundColor = [UIColor darkGrayColor];
        }
        else
        {
            miniView.backgroundColor = [UIColor grayColor];
        }
        miniView.layer.borderWidth = 1;
        miniView.layer.borderColor = [UIColor blackColor].CGColor;
        miniView.layer.cornerRadius = 3;
        miniView.clipsToBounds = YES;
        miniView.layer.masksToBounds = YES;
        miniView.alpha = 1.0;
        // using the same angle rotate the view around its centre
        miniView.transform = CGAffineTransformRotate(miniView.transform, radians + arcStart + M_PI / 2.0);
        [page1 addSubview:miniView];
    }
}
The problem is that your calculation of the center of each miniView is based on radians plus arcStart, but the transform of each miniView is only based on radians.
Also note that angle 0 is at the 3 o'clock position of the circle. You actually want a 90° (or π/2 radians) rotation of miniView so the rectangle "sticks out" from the circle.
You need two small changes to make your code work:
Change the loop to:
for (int i = 0; i < miniViewCount; i++)
And change the transform:
miniView.transform = CGAffineTransformRotate(miniView.transform, radians + arcStart + M_PI / 2.0);
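For reference, the placement and rotation of a single view can also be written compactly like this (a sketch using the same radians, arcStart, circleCentre and circleRadius variables from the loop above; unlike the original method it centres each view on its circle point via the center property rather than placing the frame origin there):

// angleOnCircle is measured from the 3 o'clock position, as noted above.
CGFloat angleOnCircle = radians + arcStart;
miniView.center = CGPointMake(circleCentre.x + circleRadius * cos(angleOnCircle),
                              circleCentre.y + circleRadius * sin(angleOnCircle));
// The extra M_PI / 2.0 turns the view's long axis onto the radius,
// so its base sits tangent to the circle.
miniView.transform = CGAffineTransformMakeRotation(angleOnCircle + M_PI / 2.0);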

Core Animation rotation around point

I would like to make an IBOutlet rotate around a specific point in the parent view. Currently I only know how to rotate it around an anchor point; however, I want to use a point outside the object's layer.
The rotation angle is calculated relative to the device heading from the point.
- (void)viewDidLoad {
    [super viewDidLoad];
    locationManager = [[CLLocationManager alloc] init];
    locationManager.desiredAccuracy = kCLLocationAccuracyBest;
    locationManager.headingFilter = 1;
    locationManager.delegate = self;
    [locationManager startUpdatingHeading];
    [locationManager startUpdatingLocation];
    compassImage.layer.anchorPoint = CGPointZero;
}

- (void)locationManager:(CLLocationManager *)manager didUpdateHeading:(CLHeading *)newHeading {
    // Convert Degree to Radian and move the needle
    float oldRad = -manager.heading.trueHeading * M_PI / 180.0f;
    float newRad = -newHeading.trueHeading * M_PI / 180.0f;
    CABasicAnimation *theAnimation;
    theAnimation = [CABasicAnimation animationWithKeyPath:@"transform.rotation"];
    theAnimation.fromValue = [NSNumber numberWithFloat:oldRad];
    theAnimation.toValue = [NSNumber numberWithFloat:newRad];
    theAnimation.duration = 0.5f;
    [compassImage.layer addAnimation:theAnimation forKey:@"animateMyRotation"];
    compassImage.transform = CGAffineTransformMakeRotation(newRad);
    NSLog(@"%f (%f) => %f (%f)", manager.heading.trueHeading, oldRad, newHeading.trueHeading, newRad);
}
How can I rotate the UIImageView around (x, y) by alpha?
For rotating around a specific point (internal or external) you can change the anchor point of the layer and then apply a regular rotation transform animation similar to what I wrote about in this blog post.
You just need to be aware that the anchor point also affects where the layer appears on the screen. When you change the anchor point you will also have to change the position to make the layer appear in the same place on the screen.
Assuming that the layer is already placed in its start position and that the point to rotate around is known, the anchor point and the position can be calculated like this (note that the anchor point is in the unit coordinate space of the layer's bounds, meaning that x and y range from 0 to 1 within the bounds):
CGPoint rotationPoint = // The point we are rotating around
CGFloat minX = CGRectGetMinX(view.frame);
CGFloat minY = CGRectGetMinY(view.frame);
CGFloat width = CGRectGetWidth(view.frame);
CGFloat height = CGRectGetHeight(view.frame);
CGPoint anchorPoint = CGPointMake((rotationPoint.x-minX)/width,
(rotationPoint.y-minY)/height);
view.layer.anchorPoint = anchorPoint;
view.layer.position = rotationPoint;
Then you simply apply a rotation animation to it, for example:
CABasicAnimation *rotate = [CABasicAnimation animationWithKeyPath:@"transform.rotation.z"];
rotate.toValue = @(-M_PI_2); // The angle we are rotating to
rotate.duration = 1.0;
[view.layer addAnimation:rotate forKey:@"myRotationAnimation"];
Just note that you have changed the anchor point so the position is no longer in the center of the frame and if you apply other transforms (such as a scale) they will also be relative to the anchor point.
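Tying the two steps together for the compass example above, a minimal sketch (assuming compassImage, rotationPoint, oldRad and newRad are set up as in the question and answer, and that compassImage has no transform applied yet when the anchor point is moved; the model-layer transform is also set, as in the original code, so the layer does not snap back when the animation is removed):

// 1. Rotate around an external point: move the anchor point there
//    and reposition the layer so it stays put on screen.
CGRect frame = compassImage.frame;
compassImage.layer.anchorPoint = CGPointMake((rotationPoint.x - CGRectGetMinX(frame)) / CGRectGetWidth(frame),
                                             (rotationPoint.y - CGRectGetMinY(frame)) / CGRectGetHeight(frame));
compassImage.layer.position = rotationPoint;

// 2. Animate the rotation, then commit the final value to the model layer.
CABasicAnimation *rotate = [CABasicAnimation animationWithKeyPath:@"transform.rotation.z"];
rotate.fromValue = @(oldRad);
rotate.toValue = @(newRad);
rotate.duration = 0.5;
[compassImage.layer addAnimation:rotate forKey:@"animateMyRotation"];
compassImage.transform = CGAffineTransformMakeRotation(newRad);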
David's answer translated to Swift 3:
let rotationPoint = CGPoint(x: layer.frame.width / 2.0, y: layer.frame.height / 2.0) // The point we are rotating around
print(rotationPoint.debugDescription)
let width = layer.frame.width
let height = layer.frame.height
let minX = layer.frame.minX
let minY = layer.frame.minY
let anchorPoint = CGPoint(x: (rotationPoint.x-minX)/width,
y: (rotationPoint.y-minY)/height)
layer.anchorPoint = anchorPoint;
layer.position = rotationPoint;

UIPanGestureRecognizer - Translations and rotations not happening evenly with negatives vs positives

Towards the top of the file I have this:
#define DEGREES_TO_RADIANS(x) (M_PI * x / 180.0)
In viewDidLoad I have this:
imageView.layer.anchorPoint = CGPointMake(0.5,0.5);
Then finally in the gesture recognizer I have this:
- (IBAction)handlePan:(UIPanGestureRecognizer *)sender {
    CGPoint translation = [sender translationInView:self.view];
    xPos += translation.x;
    if (xPos > 150) xPos = 150;
    if (xPos < -150) xPos = -150;
    float rotate = 0;
    if (xPos >= 50) {
        rotate = (xPos - 50) * 0.025;
    } else if (xPos <= -50) {
        rotate = (xPos + 50) * 0.025;
    }
    imageView.transform = CGAffineTransformTranslate(CGAffineTransformMakeRotation(DEGREES_TO_RADIANS(rotate)), (xPos * 0.35), 0);
}
The code appears to apply the exact same effect to negatives (left) and positives (right).
Since the UIImageView being rotated is placed in the center of the screen horizontally, with its anchor set to the center, I would expect the maximum in both directions to be cut off equally by the edge of the screen. Instead, the image goes much further to the right.
Left effect (negative):
Right effect (positive):
It looks like Auto Layout changes the anchor point as the image view is rotated and the frame changes.

Calculate rotation angle for anchor point

I got stuck with a problem where I need to reposition views to predefined locations.
All views have a UIPanGestureRecognizer and a UIRotationGestureRecognizer and are positioned/rotated inside the controllers view. Upon a certain event the views should move to a new position with a new rotation angle.
Everything works fine, but as soon as one of the gesture recognizers has been active, and thus the anchorPoint has changed, repositioning/rotation fails.
Here is the method I use to try to take the shift in the anchorPoint into account.
- (CGPoint)centerPointWithInVisibleAreaForPoint:(CGPoint)point
{
    CGPoint anchorP = self.layer.anchorPoint;
    anchorP.x -= 0.5;
    anchorP.y -= 0.5;
    CGRect rect = self.bounds;
    CGFloat widthDelta = CGRectGetWidth(self.bounds) * anchorP.x;
    CGFloat heightDelta = CGRectGetHeight(self.bounds) * anchorP.y;
    CGPoint newCenter = CGPointMake(point.x + widthDelta, point.y + heightDelta);
    return newCenter;
}
The controller asks for the corrected center point and sets the center value of the view. Afterwards the rotation transform is set using CGAffineTransformConcat(view.transform, CGAffineTransformMakeRotation(differenceAngle)).
I think the problem is caused by the fact that the predefined target angle is based on a rotation around the center which is obviously different when rotated around a different anchorPoint, but I don't know how to compensate for that.
The only solution I found (and it is, after all, the easiest one) is to reset the anchorPoint to 0.5/0.5 and correct the position accordingly.
- (void)resetAnchorPoint
{
    if (!CGPointEqualToPoint(self.layer.anchorPoint, CGPointMake(0.5, 0.5))) {
        CGFloat width = CGRectGetWidth(self.bounds);
        CGFloat height = CGRectGetHeight(self.bounds);
        CGPoint newPoint = CGPointMake(width * 0.5, height * 0.5);
        CGPoint oldPoint = CGPointMake(width * self.layer.anchorPoint.x, height * self.layer.anchorPoint.y);

        newPoint = CGPointApplyAffineTransform(newPoint, self.transform);
        oldPoint = CGPointApplyAffineTransform(oldPoint, self.transform);

        CGPoint position = self.layer.position;
        position.x += (newPoint.x - oldPoint.x);
        position.y += (newPoint.y - oldPoint.y);

        [CATransaction setDisableActions:YES];
        self.layer.position = position;
        self.layer.anchorPoint = CGPointMake(0.5, 0.5);
        [CATransaction setDisableActions:NO];
    }
}
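Once the anchor point is back at (0.5, 0.5), applying a predefined target becomes straightforward again. A minimal sketch, where targetCenter and targetAngle are hypothetical stand-ins for the predefined location and rotation angle, and view is an instance of the class that implements resetAnchorPoint:

[view resetAnchorPoint];
// With the anchor point centred, center and a plain rotation behave as expected.
view.center = targetCenter;
view.transform = CGAffineTransformMakeRotation(targetAngle);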

Find the direction UISlider is pointing

I have a UIView class that contains a custom UISlider. When this UIView is added to a viewController it is randomly rotated using
newSlider.transform = CGAffineTransformRotate(newSlider.transform, degreesToRadians(random));
Now what I'm trying to do is animate the UISlider thumb image flying off the end of the slider.
The problem I'm facing is getting the coordinates of the start/end of the slider so I can work out which way the thumb image is travelling.
I have tried using trackRectForBounds, but it gives me the exact same coordinates regardless of the rotation applied to the UIView.
I have tried these inside my UIView class:
CGRect trackRect = [customSlider trackRectForBounds:customSlider.bounds];
and
CGRect trackRect2 = [customSlider trackRectForBounds:self.window.bounds];
which give me {{2, 1}, {288, 50}} & {{2, 215}, {316, 50}} regardless of rotation. I think it's giving me the rect from within the UIView and not the screen.
Getting the end points of the slider can be done like this:
CGFloat rotation = [[newSlider.layer valueForKeyPath:@"transform.rotation.z"] floatValue];
CGFloat halfWidth = newSlider.bounds.size.width/2;
CGPoint sliderCenter = newSlider.center;
sliderCenter.y += newSlider.bounds.size.height/2; //this shouldn't be needed but it seems it is
CGFloat x1 = sliderCenter.x - halfWidth * cos(rotation);
CGFloat x2 = sliderCenter.x + halfWidth * cos(rotation);
CGFloat y1 = sliderCenter.y - halfWidth * sin(rotation);
CGFloat y2 = sliderCenter.y + halfWidth * sin(rotation);
CGPoint T1 = CGPointMake(x1, y1);
CGPoint T2 = CGPointMake(x2, y2);
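With T1 and T2 in hand, one way to animate the thumb flying off the end is sketched below; thumbImageView and the 200-point fling distance are made-up stand-ins, not from the question.

// Direction of the slider's axis, from the T1 end towards the T2 end.
CGFloat dx = cos(rotation);
CGFloat dy = sin(rotation);

// Fling a separate view showing the thumb image off past the T2 end.
CGPoint flyTo = CGPointMake(T2.x + 200.0 * dx, T2.y + 200.0 * dy);
[UIView animateWithDuration:0.5 animations:^{
    thumbImageView.center = flyTo;
}];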

Resources