Keep pannable UIView inside bounds - iOS

I am developing an iOS application that contains a scalable and pannable UIView. Since the user is allowed to pan and scale the view, I'd like to keep the UIView within the boundaries of the screen.
I searched a lot on the internet, but there do not seem to be any examples that really fit my need.
Below is the panning code I wrote:
private void HandlePan(UIPanGestureRecognizer recognizer)
{
    if (recognizer.State != UIGestureRecognizerState.Began &&
        recognizer.State != UIGestureRecognizerState.Changed)
        return;

    var translation = recognizer.TranslationInView(this);
    _posX += translation.X;
    _posY += translation.Y;

    var maxX = (Bounds.Size.Width / 2) * _currentScale;
    var maxY = (Bounds.Size.Height / 2) * _currentScale;

    // TODO: The min values are wrong
    var minX = (Bounds.Size.Width / 2) / _currentScale;
    var minY = (Bounds.Size.Height / 2) / _currentScale;

    if (_posX > maxX)
        _posX = maxX;
    else if (_posX < minX)
        _posX = minX;

    if (_posY > maxY)
        _posY = maxY;
    else if (_posY < minY)
        _posY = minY;

    var translatedCenter = new CGPoint(_posX, _posY);
    Center = translatedCenter;

    recognizer.SetTranslation(CGPoint.Empty, this);
}
I managed to get the boundaries working for only two sides of the screen: maxX and maxY are correct. I just can't figure out how to calculate the correct minX and minY values.
Below I added a screen recording to show what is going wrong:
You can see that the view gets blocked too early (and increasingly so as the scale grows beyond 1) when I try to drag toward the right/bottom side of the image, which are the sides governed by the minX and minY values.
What is wrong in this calculation, or what should the correct calculation be? Note that maxX and maxY work perfectly.
var minX = (Bounds.Size.Width / 2) / _currentScale;
var minY = (Bounds.Size.Height / 2) / _currentScale;

Another approach for calculating boundaries could also be:
int centerPoint = (int)(Bounds.Size.Width / 2);
var minX = centerPoint - ((_currentScale - 1) * centerPoint);
if (_posX < minX)
    _posX = minX;

After lots of trial and error I managed to fix it by manually moving the image to the borders of the frame and writing down all the values.
The values looked like this:
Bounds = 375
maxX = 187.5, minX = 187.5, scale = 1
maxX = 549.039223765595, minX = -173.989914508274, scale = 2.92820919341651
maxX = 915.278849493097, minX = -539.668274239639, scale = 4.88052876267261
maxX = 1575.12891219963, minX = -1198.9316584188, scale = 8.40338022601214
This means the min values can be obtained with the following formula:
var minX = -(maxX - Bounds.Size.Width);
var minY = -(maxY - Bounds.Size.Height);
Now it's working fine.
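For reference, here is the same clamping written as a Swift pan handler on the pannable view itself; a minimal untested sketch where posX, posY and currentScale are assumed stored properties mirroring the C# fields above:
@objc private func handlePan(_ recognizer: UIPanGestureRecognizer) {
    guard recognizer.state == .began || recognizer.state == .changed else { return }

    let translation = recognizer.translation(in: self)
    posX += translation.x
    posY += translation.y

    // The center may travel out to half the scaled size on one side...
    let maxX = (bounds.width / 2) * currentScale
    let maxY = (bounds.height / 2) * currentScale
    // ...and symmetrically on the other side, measured back from the far edge.
    let minX = -(maxX - bounds.width)
    let minY = -(maxY - bounds.height)

    posX = min(max(posX, minX), maxX)
    posY = min(max(posY, minY), maxY)

    center = CGPoint(x: posX, y: posY)
    recognizer.setTranslation(.zero, in: self)
}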

Related

Show direction to CGPoint SpriteKit

I need an arrow (a white circle) that shows the direction to a CGPoint while the user moves their icon and the camera.
The arrow needs to sit on the edge of the visible screen and point the way back toward the followed CGPoint.
Demo gif
You want two things:
place the arrow on the correct screen side given the target CGPoint's position
orient the arrow towards the target CGPoint
In your touchesMoved(_:) method, you can update the arrow's rotation and position. This is not tested, but the principle should work:
private func placeArrow(at sourceNode: SKNode, for targetNode: SKNode) {
    let screenSizeX = UIScreen.main.bounds.width
    let screenSizeY = UIScreen.main.bounds.height

    // Do not display the arrow if the target is already on screen.
    guard targetNode.position.x < cameraNode.position.x - screenSizeX/2
        || targetNode.position.x > cameraNode.position.x + screenSizeX/2
        || targetNode.position.y < cameraNode.position.y - screenSizeY/2
        || targetNode.position.y > cameraNode.position.y + screenSizeY/2
    else {
        arrowNode.isHidden = true
        return
    }
    arrowNode.isHidden = false

    // Find the arrow position: if the target is to the left, place the arrow on the left
    // edge, otherwise on the right edge, at the mid y between the 2 points (clamped to the screen).
    let ymin = cameraNode.position.y - screenSizeY/2 + 10
    let ymax = cameraNode.position.y + screenSizeY/2 - 10
    let midY = (sourceNode.position.y + targetNode.position.y)/2
    var clampedMidY = midY
    if midY > ymax {
        clampedMidY = ymax
    } else if midY < ymin {
        clampedMidY = ymin
    }
    arrowNode.position = CGPoint(
        x: (targetNode.position.x < sourceNode.position.x)
            ? cameraNode.position.x - screenSizeX/2
            : cameraNode.position.x + screenSizeX/2,
        y: clampedMidY)

    // Find the arrow orientation.
    // See https://stackoverflow.com/questions/38411494/rotating-a-sprite-towards-a-point-and-move-it-towards-it-with-a-duration
    let v1 = CGVector(dx: 0, dy: 1)
    let v2 = CGVector(dx: targetNode.position.x - sourceNode.position.x,
                      dy: targetNode.position.y - sourceNode.position.y)
    arrowNode.zRotation = atan2(v2.dy, v2.dx) - atan2(v1.dy, v1.dx)
}
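A rough way to drive it from the scene (untested sketch; playerNode and targetNode are assumed to be properties of the scene):
override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
    // ... move the player / camera here as you already do ...
    placeArrow(at: playerNode, for: targetNode)
}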

Cocos2d-x Parallax with Accelerometer (How to stop smoothly when reaching the edges and when changing direction)

I am creating a game that has 3 layers of background. They are added to a CCParallaxNode, which is moved by tilting the device right, left, up and down. I am using this code to move the CCParallaxNode (the accelerometer delegate method, didAccelerate):
void SelectScreen::didAccelerate(cocos2d::CCAcceleration *pAccelerationValue)
{
    float deceleration = 0.1f, sensitivity = 30.0f, maxVelocity = 200;
    accelX = pAccelerationValue->x * sensitivity;
    accelY = pAccelerationValue->z * sensitivity;
    parallaxMovementX = parallaxMovementX * deceleration + pAccelerationValue->x * sensitivity;
    parallaxMovementX = fmaxf(fminf(parallaxMovementX, maxVelocity), -maxVelocity);
    float offset = -calibration * sensitivity;
    parallaxMovementY = (parallaxMovementY * deceleration + pAccelerationValue->z * sensitivity) + offset;
}
Then, in the update method:
void SelectScreen::update(float dt)
{
    CCNode* node = getChildByTag(100);
    float maxX = (Data::getInstance()->getWinSize().width * 2) + 100;
    float minX = node->getContentSize().width - 100;
    float maxY = Data::getInstance()->getWinSize().height * 0.1f;
    float minY = -200;
    float diffX = parallaxMovementX;
    float diffY = parallaxMovementY;
    float newX = node->getPositionX() + diffX;
    float newY = node->getPositionY() + diffY;
    newX = MIN(MAX(newX, minX), maxX);
    newY = MIN(MAX(newY, minY), maxY);
    if (isUpdating)
        node->setPositionX(newX);
    if (isUpdatingY)
        node->setPositionY(newY);
}
The movement works nicely; however, when reaching any of the 4 edges it stops abruptly. It also changes direction abruptly (e.g. moving to the right and then moving to the left).
Question: How can I get a smooth stop and a smooth direction change (maybe with a little bouncing effect)? I think this is also related to the accelerometer data (when moving fast it should bounce longer than when moving slowly).
Thanks in advance.
You need some math to smooth the movements.
Try checking the code here:
http://www.nscodecenter.com/preguntas/10768/3d-parallax-con-accelerometer
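Engine details aside, the usual trick is to ease the node toward the clamped target each frame instead of snapping to it. A rough sketch of that idea, written in Swift for brevity (the names and the smoothing constant are assumptions; the same math carries over to the cocos2d-x update method):
import UIKit

func smoothedPosition(current: CGPoint, velocity: CGPoint,
                      minX: CGFloat, maxX: CGFloat,
                      minY: CGFloat, maxY: CGFloat, dt: CGFloat) -> CGPoint {
    // Where the node would like to be this frame, clamped to the edges.
    let target = CGPoint(x: min(max(current.x + velocity.x * dt, minX), maxX),
                         y: min(max(current.y + velocity.y * dt, minY), maxY))
    // Exponential smoothing: cover only a fraction of the remaining distance each
    // frame, so the node eases into the edges and into direction changes.
    let smoothing: CGFloat = 8 // higher = snappier, lower = floatier
    let t = 1 - CGFloat(exp(-Double(smoothing * dt)))
    return CGPoint(x: current.x + (target.x - current.x) * t,
                   y: current.y + (target.y - current.y) * t)
}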

Animation with an image in iPad

Hi, I have an image that looks like the round top of a table.
I want to rotate it clockwise whenever the user swipes from left to right, and counterclockwise when the user swipes from right to left.
Like spinning a round table top in real time.
How can I do this in the app?
I am using the following code for the rotation. It's from the TrackBall example.
The problem I am having is that whenever the image spins, it changes its position.
- (CATransform3D)rotationTransformForLocation:(CGPoint)location
{
    CGFloat trackBallCurrentPoint[3] = {location.x - trackBallCenter.x, location.y - trackBallCenter.y, 0.0f};
    if (fabs(trackBallCurrentPoint[0] - trackBallStartPoint[0]) < kTol && fabs(trackBallCurrentPoint[1] - trackBallStartPoint[1]) < kTol)
    {
        return CATransform3DIdentity;
    }
    CGFloat dist = trackBallCurrentPoint[0] * trackBallCurrentPoint[0] + trackBallCurrentPoint[1] * trackBallCurrentPoint[1];
    if (dist > trackBallRadius * trackBallRadius)
    {
        // outside the center of the sphere so make it zero
        trackBallCurrentPoint[2] = 0.0f;
    }
    else
    {
        trackBallCurrentPoint[2] = sqrt(trackBallRadius * trackBallRadius - dist);
    }
    // cross product yields the rotation vector
    CGFloat rotationVector[3];
    rotationVector[0] = trackBallStartPoint[1] * trackBallCurrentPoint[2] - trackBallStartPoint[2] * trackBallCurrentPoint[1];
    rotationVector[1] = -trackBallStartPoint[0] * trackBallCurrentPoint[2] + trackBallStartPoint[2] * trackBallCurrentPoint[0];
    rotationVector[2] = trackBallStartPoint[0] * trackBallCurrentPoint[1] - trackBallStartPoint[1] * trackBallCurrentPoint[0];
    // calc the angle between the current point vector and the starting point vector
    // use arctan so we get all eight quadrants instead of just the positive ones
    // cos(a) = (start . current) / (||start|| ||current||)
    // sin(a) = (||start X current||) / (||start|| ||current||)
    // a = atan2(sin(a), cos(a))
    CGFloat startLength = sqrt(trackBallStartPoint[0] * trackBallStartPoint[0] + trackBallStartPoint[1] * trackBallStartPoint[1] + trackBallStartPoint[2] * trackBallStartPoint[2]);
    CGFloat currentLength = sqrt(trackBallCurrentPoint[0] * trackBallCurrentPoint[0] + trackBallCurrentPoint[1] * trackBallCurrentPoint[1] + trackBallCurrentPoint[2] * trackBallCurrentPoint[2]);
    CGFloat startDotCurrent = trackBallStartPoint[0] * trackBallCurrentPoint[0] + trackBallStartPoint[1] * trackBallCurrentPoint[1] + trackBallStartPoint[2] * trackBallCurrentPoint[2]; // (start . current)
    // start X current we have already calculated in the rotation vector
    CGFloat rotationLength = sqrt(rotationVector[0] * rotationVector[0] + rotationVector[1] * rotationVector[1] + rotationVector[2] * rotationVector[2]);
    CGFloat angle = atan2(rotationLength / (startLength * currentLength), startDotCurrent / (startLength * currentLength));
    // normalize the rotation vector
    rotationVector[0] = rotationVector[0] / rotationLength;
    rotationVector[1] = rotationVector[1] / rotationLength;
    rotationVector[2] = rotationVector[2] / rotationLength;
    CATransform3D rotationTransform = CATransform3DMakeRotation(angle, rotationVector[0], rotationVector[1], rotationVector[2]);
    return CATransform3DConcat(baseTransform, rotationTransform);
}
Thanks in advance.
Take a look at a question I posed... you might be trying to do the same thing (I don't think the question covered it, but after getting rotation working I implemented a pan gesture to allow the user to spin the disc in either direction):
How to rotate a flat object around its center in perspective view?
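Along the same lines, here is a minimal UIKit sketch of the pan-to-spin idea (not the linked answer's code; imageView and the stored lastAngle are assumptions). Rotating the existing transform keeps the view spinning about its own center, so its position does not drift:
private var lastAngle: CGFloat = 0

@objc private func handleSpin(_ recognizer: UIPanGestureRecognizer) {
    let location = recognizer.location(in: imageView.superview)
    let center = imageView.center
    // Angle of the touch point around the view's center.
    let angle = atan2(location.y - center.y, location.x - center.x)

    switch recognizer.state {
    case .began:
        lastAngle = angle
    case .changed:
        // Apply only the change in angle so the disc spins in place.
        imageView.transform = imageView.transform.rotated(by: angle - lastAngle)
        lastAngle = angle
    default:
        break
    }
}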

How to rotate a triangle?

I'm struggling with rotating a triangle resulting from a UIRotationGestureRecognizer. If you could look over my approach and offer suggestions, I'd greatly appreciate it.
I ask the gesture recognizer object for the rotation, which the documentation says is returned in radians.
My strategy was to think of each vertex as a point on a circle centered at the triangle's center and passing through that vertex, and then use the radians of rotation to find the new point on that circumference. I'm not totally sure this is a valid approach, but I wanted to at least try it; visually I'd know whether or not it was working.
Here's the code I created in that attempt:
- (CGPoint)rotateVertex:(CGPoint)vertex byRadians:(float)radians
{
    float deltaX = center.x - vertex.x;
    float deltaY = center.y - vertex.y;
    float currentAngle = atanf( deltaX / deltaY );
    float newAngle = currentAngle + radians;
    float newX = cosf(newAngle) + vertex.x;
    float newY = sinf(newAngle) + vertex.y;
    return CGPointMake(newX, newY);
}
When executed, there's a slight rotation at the beginning, but then as I continue rotating my fingers the vertices just start getting farther away from the center point, indicating I'm confusing something here.
I looked at what the CGContextRotateCTM could do for me, but ultimately I need to know what the vertices are after the rotation, so just rotating the graphics context doesn't appear to leave me with those changed coordinates.
I also tried the technique described here, but that resulted in the triangle being flipped about the second vertex, which seems odd; then again, that technique works with p and q being the x and y coordinates of the second vertex.
Thanks for taking a look!
Solved: Here is the corrected function. It assumes you have calculated the center of the triangle. I used the ((x1 + x2 + x3)/3, (y1 + y2 + y3)/3) method described in the Wikipedia article on centroids.
- (CGPoint)rotatePoint:(CGPoint)currentPoint byRadians:(float)radiansOfRotation
{
    float deltaX = currentPoint.x - center.x;
    float deltaY = currentPoint.y - center.y;
    float radius = sqrtf(powf(deltaX, 2.0) + powf(deltaY, 2.0));
    float currentAngle = atan2f( deltaY, deltaX );
    float newAngle = currentAngle + radiansOfRotation;
    float newRun = radius * cosf(newAngle);
    float newX = center.x + newRun;
    float newRise = radius * sinf(newAngle);
    float newY = center.y + newRise;
    return CGPointMake(newX, newY);
}
Worth noting about why the first code listing did not work: the arguments to atan2 were reversed, and the delta values were also calculated the wrong way around.
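For completeness, a small Swift sketch (names are hypothetical) of the same idea end to end: compute the centroid as described, then rotate each vertex about it by the recognizer's rotation.
import UIKit

func rotatedTriangle(_ vertices: [CGPoint], byRadians radians: CGFloat) -> [CGPoint] {
    // Centroid: the average of the three vertices.
    let center = CGPoint(x: (vertices[0].x + vertices[1].x + vertices[2].x) / 3,
                         y: (vertices[0].y + vertices[1].y + vertices[2].y) / 3)
    return vertices.map { vertex in
        let deltaX = vertex.x - center.x
        let deltaY = vertex.y - center.y
        let radius = (deltaX * deltaX + deltaY * deltaY).squareRoot() // distance from centroid
        let newAngle = atan2(deltaY, deltaX) + radians
        return CGPoint(x: center.x + radius * cos(newAngle),
                       y: center.y + radius * sin(newAngle))
    }
}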
You're forgetting to multiply by the radius of the circle. Also, since the Y axis points down in the UIKit coordinate system, you have to subtract the radians instead of adding them and negate the y coordinate at the end. And you need to use atan2, because atan only gives output in the range -pi/2 to pi/2:
float currentAngle = atan2f(deltaY, deltaX);
float newAngle = currentAngle - radians;
float radius = sqrtf(powf(deltaX, 2.0) + powf(deltaY, 2.0));
float newX = radius * cosf(newAngle) + vertex.x;
float newY = -1.0 * radius * sinf(newAngle) + vertex.y;
The answer is embedded now in the original question. Gun shy about proper decorum ;-)

Get angle from 2 positions

I have 2 objects and when I move one, I want to get the angle from the other.
For example:
Object1X = 211.000000, Object1Y = 429.000000
Object2X = 246.500000, Object2Y = 441.500000
I have tried the following and every variation under the sun:
double radians = ccpAngle(Object1,Object2);
double degrees = ((radians * 180) / Pi);
But I just get 2.949023 returned where I want something like 45 degrees etc.
Does this other answer help?
How to map atan2() to degrees 0-360
I've written it like this:
- (CGFloat)pointPairToBearingDegrees:(CGPoint)startingPoint secondPoint:(CGPoint)endingPoint
{
    CGPoint originPoint = CGPointMake(endingPoint.x - startingPoint.x, endingPoint.y - startingPoint.y); // translate so startingPoint becomes the origin
    float bearingRadians = atan2f(originPoint.y, originPoint.x); // get bearing in radians
    float bearingDegrees = bearingRadians * (180.0 / M_PI); // convert to degrees
    bearingDegrees = (bearingDegrees > 0.0 ? bearingDegrees : (360.0 + bearingDegrees)); // correct discontinuity
    return bearingDegrees;
}
Running the code:
CGPoint p1 = CGPointMake(10, 10);
CGPoint p2 = CGPointMake(20,20);
CGFloat f = [self pointPairToBearingDegrees:p1 secondPoint:p2];
And this returns 45.
Hope this helps.
Here's how I'm doing it in Swift, for those interested. It's based on @bshirley's answer above, with a few modifications to match the CALayer rotation system:
extension CGFloat {
    var degrees: CGFloat {
        return self * CGFloat(180) / .pi
    }
}

extension CGPoint {
    func angle(to comparisonPoint: CGPoint) -> CGFloat {
        let originX = comparisonPoint.x - x
        let originY = comparisonPoint.y - y
        let bearingRadians = atan2f(Float(originY), Float(originX))
        var bearingDegrees = CGFloat(bearingRadians).degrees
        while bearingDegrees < 0 {
            bearingDegrees += 360
        }
        return bearingDegrees
    }
}
This provides a coordinate system like this:
      90
180         0
     270
Usage:
point.angle(to: point2)
CGPoint.zero.angle(to: CGPoint(x: 0, y: 1)) // 90
I modified @tomas' solution to streamline it. It's likely (it was for me) that this math is going to be called frequently.
In my incarnation, you have to perform the difference between the two points yourself (or if you're lucky, (0,0) is already one of your points). The value being calculated is the direction of the point from (0,0). Yes, that's simple enough and you could inline it if you really want to. My preference is for more readable code.
I also converted it to a function call:
CGFloat CGPointToDegree(CGPoint point) {
    // Provides a directional bearing from (0,0) to the given point.
    // standard cartesian plane coords: X goes up, Y goes right
    // result returns degrees, -180 to 180 ish: 0 degrees = up, -90 = left, 90 = right
    CGFloat bearingRadians = atan2f(point.y, point.x);
    CGFloat bearingDegrees = bearingRadians * (180. / M_PI);
    return bearingDegrees;
}
If you don't want negative values, you need to convert it yourself. Negative values were fine for me - no need to make unneeded calculations.
I was using this in a cocos2d environment; this is how I call it. (Mathematically, we are translating the plane to make p0 the origin, thus subtracting p0 from p1 (p0 - p0 = {0,0}). The angles are unchanged when the plane is translated.)
CGPoint p0 = self.position;
CGPoint p1 = other.position;
CGPoint pnormal = ccpSub(p1, p0);
CGFloat angle = CGPointToDegree(pnormal);
ccpSub is provided by cocos2d; it's subtraction of a tuple - you can do that yourself if you don't have it available.
Aside: it's generally not polite style to name the method as above with the CG___ naming scheme, which identifies the function as part of CoreGraphics - so if you want to rename it to MyConvertCGPointToBearing() or FredLovesWilma(), you should do that.
Tomas' answer in Swift 5
func angle(between starting: CGPoint, ending: CGPoint) -> CGFloat {
    let center = CGPoint(x: ending.x - starting.x, y: ending.y - starting.y)
    let radians = atan2(center.y, center.x)
    let degrees = radians * 180 / .pi
    return degrees > 0 ? degrees : 360 + degrees
}
There is no angle between two points. If you want to know the angle between the vectors from the origin (0,0) to the objects, use the scalar (dot) product:
theta = arccos( (veca . vecb) / (|veca| * |vecb|) )
The math standard library of the language you are using surely provides functions for arc cosine, scalar product and length.
The vertex of the angle is the point (0,0).
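A minimal Swift sketch of that formula (the function name is mine; the vectors run from the origin (0,0) to each object):
import UIKit

// theta = arccos((a . b) / (|a| * |b|)), in radians
func angleBetween(_ a: CGPoint, _ b: CGPoint) -> CGFloat {
    let dot = Double(a.x * b.x + a.y * b.y)
    let lengthA = Double(a.x * a.x + a.y * a.y).squareRoot()
    let lengthB = Double(b.x * b.x + b.y * b.y).squareRoot()
    return CGFloat(acos(dot / (lengthA * lengthB)))
}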
Consider object1X = x1, object1Y = y1, object2X = x2, object2Y = y2.
Angle(object1-object2) =
90 * ( (1 + sign(x1)) * (1 - sign(y1^2))
- (1 + sign(x2)) * (1 - sign(y2^2)) )
+ 45 * ( (2 + sign(x1)) * sign(y1)
- (2 + sign(x2)) * sign(y2) )
+ 180/pi() * sign(x1*y1) * atan( (abs(x1) - abs(y1)) / (abs(x1) + abs(y1)) )
- 180/pi() * sign(x2*y2) * atan( (abs(x2) - abs(y2)) / (abs(x2) + abs(y2)) )
I'll leave this here: corrected code, with the axes additionally rotated by 90 degrees counterclockwise. I've used it for touches; viewCenter is just the center of the view.
override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
    if let touch = touches.first {
        let location = touch.location(in: self)
        guard let viewCenter = self.viewCenter else { return }
        let angle = angle(between: CGPoint(x: location.x, y: location.y), ending: viewCenter)
        print(angle)
    }
}

func angle(between starting: CGPoint, ending: CGPoint) -> CGFloat {
    let center = CGPoint(x: ending.x - starting.x, y: ending.y - starting.y)
    let angle90 = deg2rad(90)
    // Rotate the axes by 90 degrees counterclockwise
    let rotatedX = center.x * cos(angle90) + center.y * sin(angle90)
    let rotatedY = -center.x * sin(angle90) + center.y * cos(angle90)
    let radians = atan2(rotatedY, rotatedX)
    let degrees = radians * 180 / .pi
    return degrees > 0 ? degrees : degrees + 360
}

func deg2rad(_ number: CGFloat) -> CGFloat {
    return number * .pi / 180
}
