I have a line, which is a sprite made using this code:
CGPoint diff = ccpSub(startLocation, endLocation);
float rads = atan2f( diff.y, diff.x);
float degs = -CC_RADIANS_TO_DEGREES(rads);
float dist = ccpDistance(endLocation, startLocation);
CCSprite *line = [CCSprite spriteWithFile:@"line.png"];
[line setAnchorPoint:ccp(0.0f, 0.5f)];
[line setPosition:endLocation];
[line setScaleX:dist / line.boundingBox.size.width];
[line setRotation: degs];
line.tag = 1;
[_lines addObject:line];
[self addChild:line];
Now in my collision detection code I use the following code to create a CGRect:
CGRect lineRect = CGRectMake(
line.position.x - (line.contentSize.width/2),
line.position.y - (line.contentSize.height/2),
line.contentSize.width,
line.contentSize.height);
This of course is faulty because the line is made using an angle.
I'm trying to compare a rectangle (a square sprite) with this rectangle.
The idea is that a character is moving and the player can draw a line; if the character hits the line, it bounces off in the opposite direction.
I have the angle, the x&y position of one side of the line and the length of the line.
How do I get the other x&y position of the other side of the line?
Hope you guys can help me out.
Thanks in advance!
It's a little hard to know if I understand you correctly, but when you say "...the other x&y position of the other side of the line," I'm assuming you mean the coordinates of the opposite endpoint of the line.
If so, I believe the actual formula you may be looking for is:
{x1, y1} = starting point that you already have
{x2, y2} = point you're looking to find
x2 = x1 + (distance * cosA)
y2 = y1 + (distance * sinA)
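In the asker's cocos2d terms, that works out to something like this (a small sketch reusing the rads and dist values already computed in the question; note the angle has to be the radian value, not the negated degs):
CGPoint otherEnd = ccp(endLocation.x + dist * cosf(rads),
                       endLocation.y + dist * sinf(rads));
// Since diff = startLocation - endLocation, this lands back on startLocation.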
But as I mentioned earlier, I'm a little unsure whether this would give you exactly what you need. If you're actually looking for the vertices of the opposite side of the bounding box for line.png, then perhaps you could still use that formula, but change the angle by 90 degrees to determine the coordinate offsets from the x&y that you already know.
Anyway, I hope that at least helps point you in a helpful direction.
You should probably look up the documentation for CGRectApplyAffineTransform, CGAffineTransformMakeRotation and possibly CGAffineTransformMakeTranslation.
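For example, those functions could be combined like this (just a sketch of mine, not a drop-in fix; it produces the axis-aligned bounding box of the rotated line, which is still a coarse test for a thin diagonal sprite):
// Sketch: compute an axis-aligned rect for the rotated line sprite.
// In the sprite's local frame the line extends from its (0, 0.5) anchor along +x.
CGFloat length = line.contentSize.width * line.scaleX; // scaled to dist
CGFloat height = line.contentSize.height;
CGRect localRect = CGRectMake(0.0f, -height / 2.0f, length, height);
// cocos2d rotation is in clockwise degrees; convert back to a math-convention angle.
CGFloat angle = -CC_DEGREES_TO_RADIANS(line.rotation);
CGAffineTransform t = CGAffineTransformMakeTranslation(line.position.x, line.position.y);
t = CGAffineTransformRotate(t, angle);
CGRect lineRect = CGRectApplyAffineTransform(localRect, t);
If I remember correctly, cocos2d's line.boundingBox already returns this same axis-aligned box in parent coordinates, so it may be enough on its own; an exact hit test against a thin rotated line would need a proper segment-vs-rect check.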
I am working on an iOS camera-based app in which I have to capture a first point and then draw a line from that first captured point to the current focus point. MagicPlan works this way.
Here is an image:
I have tried to fix a position for the first point using accelerometer values and the tilt angle of the device, but no luck so far. And how would I draw the line from the first point to the second point?
This is the code that I have tried so far:
if (self.motionManager.deviceMotionAvailable)
{
[self.motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue currentQueue]
withHandler: ^(CMDeviceMotion *motion, NSError *error) {
CATransform3D transform;
transform = CATransform3DMakeRotation(motion.attitude.pitch, 1, 0, 0);
transform = CATransform3DRotate(transform,motion.attitude.roll, 0, 1, 0);
transform = CATransform3DRotate(transform,motion.attitude.yaw, 0, 0, 1);
self.viewObject.layer.transform = transform;
}];
}
if (self.motionManager.deviceMotionActive)
{
/**
* Pulling gravity values from deviceMotion sensor
*/
CGFloat x = [self convertRadianToDegree:self.motionManager.deviceMotion.gravity.x];
CGFloat y = [self convertRadianToDegree:self.motionManager.deviceMotion.gravity.y];
CGFloat z = [self convertRadianToDegree:self.motionManager.deviceMotion.gravity.z];
CGFloat r = sqrtf(x*x + y*y + z*z);
/**
* Calculating device forward/backward tilt angle in degrees
*/
CGFloat tiltForwardBackward = acosf(z/r) * 180.0f / M_PI - 90.0f;
[self.lblTilForwardBackward setText:[@(tiltForwardBackward) stringValue]];
}
You have a lot of issues to resolve here. It isn't just a matter of adjusting for camera orientation, as the height at which the camera is held and the position of the camera in the room are also changing. Even in MagicPlan, when the person turns around, the camera moves (rotates about the axis going through the person's head down to his feet).
There is quite a lot of algebra and rotation/translation matrix operations to work out. No one is going to do this for you. You'll have to figure it all out and derive it yourself (or look it up from old graphics text books).
I suggest doing something as straightforward and multi-step as possible (so you can debug each step along the way). Assume flat ground (indoor environment).
Get camera position/orientation/focal length from the first snapshot.
Figure out the touch point in real-world Cartesian coordinates (start with video coordinates and translate via roll/pitch/yaw and a ray-traced projection onto the ground plane, using the camera height); a rough sketch of this step follows the list.
From the focal length you can figure out the field of view and the depth to the center of the field of view; using the camera orientation and the touch's distance from the center of the screen, determine the xyz offset from some origin (your feet, maybe).
Determine and track camera position and orientation relative to that origin.
On the second snapshot (or when motion wakes), figure out the distance of the center or touched point from the origin and its exact xyz (as above).
Once you have those two points in xyz you can plot the line by taking the standard orthogonal projection onto the view plane, clipping as needed in case the original point is out of the FOV.
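As a rough illustration of the touch-to-ground projection step above (a sketch of mine under big simplifying assumptions: flat ground, a known camera height, a pitch angle already extracted from CMDeviceMotion, and a simple pinhole camera with a known vertical field of view; every name and sign convention here is an assumption, not from the question):
// Sketch: intersect the ray through a touched screen point with a flat ground plane.
// cameraHeight is in metres; pitch is in radians with 0 = camera level with the
// horizon and negative = tilted down; fovY is the vertical field of view in radians.
static CGPoint GroundPointForTouch(CGPoint touch, CGSize screenSize,
                                   CGFloat cameraHeight, CGFloat pitch, CGFloat fovY)
{
    // How far below the optical axis the touched pixel looks (UIKit y grows downward).
    CGFloat normalizedY = (touch.y - screenSize.height / 2.0f) / (screenSize.height / 2.0f);
    CGFloat angleBelowAxis = normalizedY * (fovY / 2.0f);

    // Total downward angle of the ray relative to the horizon.
    CGFloat downAngle = -pitch + angleBelowAxis;
    if (downAngle <= 0.0f) {
        return CGPointZero; // ray points at or above the horizon: it never hits the ground
    }

    // Forward distance along the ground from the point directly under the camera.
    CGFloat forward = cameraHeight / tanf(downAngle);

    // Sideways offset, approximating the horizontal FOV from the aspect ratio.
    CGFloat fovX = fovY * (screenSize.width / screenSize.height);
    CGFloat normalizedX = (touch.x - screenSize.width / 2.0f) / (screenSize.width / 2.0f);
    CGFloat side = forward * tanf(normalizedX * (fovX / 2.0f));

    return CGPointMake(side, forward); // ground-plane coordinates relative to the camera's base
}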
I have two image views. The first is the blueish arrow, and the second is the white circle, with a black dot drawn to represent the center of the circle.
I'm trying to rotate the arrow so that its anchor point is the black dot in the picture, like this:
Right now I'm setting the anchor point of the arrow's layer to a point calculated like this
CGFloat y = _userImageViewContainer.center.y - CGRectGetMinY(_directionArrowView.frame);
CGFloat x = _userImageViewContainer.center.x - CGRectGetMinX(_directionArrowView.frame);
CGFloat yOff = y / CGRectGetHeight(_directionArrowView.frame);
CGFloat xOff = x / CGRectGetWidth(_directionArrowView.frame);
_directionArrowView.center = _userImageViewContainer.center;
CGPoint anchor = CGPointMake(xOff, yOff);
NSLog(#"anchor: %#", NSStringFromCGPoint(anchor));
_directionArrowView.layer.anchorPoint = anchor;
Since the anchor point is set as a percentage of the view, i.e. the coords for the center are (.5, .5), I'm calculating the percentage of the height of the arrow's frame where the black dot falls. But my math, even after working it out by hand, keeps resulting in .5, which isn't right because the dot is farther than halfway down when the arrow is in its original position (vertical, with the point up).
I'm rotating based on the user's compass heading
CLHeading *heading = [notif object];
// update direction of arrow
CGFloat degrees = [self p_calculateAngleBetween:[PULAccount currentUser].location.coordinate
and:_user.location.coordinate];
_directionArrowView.transform = CGAffineTransformMakeRotation((degrees - heading.trueHeading) * M_PI / 180);
The rotation is correct, it's just the anchor point that's not working right. Any ideas of how to accomplish this?
I've always found the anchor point stuff flaky, especially with rotation. I'd try something like this.
CGPoint convertedCenter = [_directionArrowView convertPoint:_userImageViewContainer.center fromView:_userImageViewContainer ];
CGSize offset = CGSizeMake(_directionArrowView.center.x - convertedCenter.x, _directionArrowView.center.y - convertedCenter.y);
// I may have that backwards, try the one below if it offsets the rotation in the wrong direction..
// CGSize offset = CGSizeMake(convertedCenter.x -_directionArrowView.center.x , convertedCenter.y - _directionArrowView.center.y);
CGFloat rotation = 0; //get your angle (radians)
CGAffineTransform tr = CGAffineTransformMakeTranslation(-offset.width, -offset.height);
tr = CGAffineTransformConcat(tr, CGAffineTransformMakeRotation(rotation) );
tr = CGAffineTransformConcat(tr, CGAffineTransformMakeTranslation(offset.width, offset.height) );
[_directionArrowView setTransform:tr];
NB. the transform property on UIView is animatable, so you could put that last line there in an animation block if desired..
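For example, a minimal animated version of that last line (just a usage sketch of the note above) would be:
[UIView animateWithDuration:0.25 animations:^{
    [_directionArrowView setTransform:tr];
}];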
Maybe it's better to use a much easier solution: make the arrow image bigger and square, so that the black point is in the center of the image.
Compare the attached images and you'll see what I'm talking about.
New image with black dot in center
Old image with shifted dot
Now you can easily use the standard anchor point (0.5, 0.5) to rotate the edited image.
I'm trying to find out how to tell if my two round frames are intersecting each other. Since they are round I can't use CGRectIntersectsRect, and I'm not sure how to go about this. Is there a CGFrameIntersectsFrame or something along those lines?
For my round UIImageViews I did
circle1 = [[UIImageView alloc] initWithFrame:CGRectMake(100, 100, 50, 50)];
circle1.layer.cornerRadius = 25;
circle1.clipsToBounds = YES;
[self.view addSubview:circle1];
The other circle is basically like that too, except with a different x and y origin.
I have also already imported QuartzCore.
Just calculate the distance between the centers of your circles and check whether it's smaller than the sum of their radii:
CGFloat radius = 25.0f; // both circles have 50-point frames and a 25-point corner radius
float distanceBetweenCenters = sqrt(pow(circle1.center.x - circle2.center.x, 2) +
                                    pow(circle1.center.y - circle2.center.y, 2));
BOOL isIntersecting = distanceBetweenCenters <= 2 * radius;
This will tell you whether the circles are intersecting or touching each other. Replace the <= with < to exclude 'touching'.
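If this check runs every frame, the same test can also be written without the square root by comparing squared distances (a small variant of the snippet above, reusing the radius defined there):
CGFloat dx = circle1.center.x - circle2.center.x;
CGFloat dy = circle1.center.y - circle2.center.y;
BOOL isIntersecting = (dx * dx + dy * dy) <= (2 * radius) * (2 * radius);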
I have a UIImageView which contains my main character, and I have made the UIImageView appear as a circle; see the code below for creating my character:
copter = [[UIImageView alloc] initWithFrame:CGRectMake(100, 500, 90, 90)];
[copter setContentMode:UIViewContentModeScaleAspectFit];
copter.layer.cornerRadius = roundf(copter.frame.size.width/2.0);
copter.layer.masksToBounds = YES;
[copter startAnimating];
[[self view] addSubview:copter];
[self setBat:copter];
I am having trouble with my collision between my character and other objects in the game. The collision is being detected on the rectangle rather than the circle. I have searched everywhere for an answer to fix this but no luck.
Here is my collision code I am using:
self.batVelocity += [link duration] * FBDownardBatAccelerationIpad;
[[self copter] setFrame:CGRectOffset([[self copter] frame], 0, [self batVelocity])];
UIView *removeBlock = nil;
for (UIView *block in [self blocks]) {
    [block setFrame:CGRectOffset([block frame], [link duration] * FBSidewaysVelocityIpad, 0)];
    if (CGRectIntersectsRect([[self copter] frame], [block frame])) {
        [self failed];
    }
}
So basically I need the circular bounds of the character to collide with a rectangular object, not the rectangular bounds that the circle sits inside. I hope that makes sense.
Thanks in advance
I'd use a CGPathRef or CGMutablePathRef. So as @user3386109 suggests, check for collision between rects first; then, if you have a CGPathRef for your sprite (or whatever it is), you can use CGPathContainsPoint(). Quartz has a few other comparison functions for CGPathRef against rects, as well as for comparing a path against a second path; check the docs. I find CGPathRef quite efficient, but don't forget to release it with CGPathRelease() to match every CGPathCreate or CGPathCopy call, or the leaks can add up really fast.
Right, so first you check for overlapping rectangles (which you've already done). When you find an overlap, then you need to refine the collision test. For each corner of the offending rectangle, compute the distance from the center of your character to the corner of the offending rectangle. If the distance from the center to any of the corners is less than the radius of the circle, then you have a collision. To optimize the code a little bit, compute (dx * dx + dy * dy) and compare that to the radius squared. That way you don't have to compute any square roots.
Upon further review, you also need to do an edge check in addition to the corner check. For example, if the top y value is above the center of the circle and the bottom y value is below the center of the circle, then compute the difference in x between the rectangle left edge and the center of the circle, and if that distance is less than the radius, then a collision has occurred. Likewise for the other three edges of the rectangle.
Here's some pseudo-code for the corner checking
int dx, dy, radius, radiusSquared;
radiusSquared = radius * radius;
for ( each rectangle that overlaps the player rectangle )
{
for ( each corner of the rectangle )
{
dx = corner.x - center.x;
dy = corner.y - center.y;
if ( dx * dx + dy * dy < radiusSquared )
Collision!!!
}
}
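For completeness, here is one way that combined corner/edge test could look in Objective-C (my own sketch, not the asker's code; it uses the equivalent "closest point" formulation, clamping the circle's center to the rectangle, which covers the corner and edge cases in a single check):
static BOOL CircleIntersectsRect(CGPoint center, CGFloat radius, CGRect rect)
{
    // Clamp the circle's center to the rectangle to find the nearest point on or in it.
    CGFloat closestX = MAX(CGRectGetMinX(rect), MIN(center.x, CGRectGetMaxX(rect)));
    CGFloat closestY = MAX(CGRectGetMinY(rect), MIN(center.y, CGRectGetMaxY(rect)));
    // Compare squared distance to squared radius to avoid the square root.
    CGFloat dx = center.x - closestX;
    CGFloat dy = center.y - closestY;
    return (dx * dx + dy * dy) <= radius * radius;
}
It could be called inside the asker's existing loop once CGRectIntersectsRect passes, e.g. CircleIntersectsRect([[self copter] center], 45.0f, [block frame]), 45 being half of the 90-point copter frame from the question.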
Possible Duplicate:
Rotate CGPath without changing its position
I searched and tested a variety of code for a couple of hours and I can't get this to work.
I am adding an arbitrary UIBezierPath at a random location to a CAShapeLayer which gets added to a view. I need to rotate the path so that I can handle device rotations. I can rotate the layer instead of the path. I just need the result to be rotated.
I already have methods to handle transforming the bezier path by scaling and translation. It works great, but now I need to simply rotate 90 degrees left or right.
Any recommendations on how to do this?
Basic code:
UIBezierPath *path = <create arbitrary path>
CAShapeLayer *layer = [CAShapeLayer layer];
[self addPathToLayer:layer
fromPath:path];
// I could get the center of the box but where is the box center for the view it is in?
// CGRect box = CGPathGetPathBoundingBox(path.CGPath);
// layer.anchorPoint = ? How to find the center of the box for the anchor point?
// Rotating here appears to rotate around 0,0 of the view
layer.transform = CATransform3DMakeRotation(DegreesToRadians(-90), 0.0, 0.0, 1.0);
I see the following post:
BezierPath Rotation in a UIView
I suppose I could rotate as-is and then translate the path back into place. I just need to figure out what the translation values would be.
I should also state that what I am seeing after I try to rotate is that the image moves off-screen somewhere. I tried rotating 25 degrees to see movement, and it pivots around the view's origin of 0,0, so that if I rotate 90 degrees the image is off-screen. I am running these tests WITHOUT rotating the device - just to see how rotation works.
UPDATE #1 - 12/4/2012: For some bizarre reason if I set the position to a value I found empirically it moves the rotated bezier path into the correct position after rotation:
layer.position = CGPointMake(280, 60);
These values are a guess from starting/stopping the app and making adjustments. I have no idea why I need to adjust the position on rotation. The anchor point should be in the center of the layer. However, I did find that both the frame and position of a CAShapeLayer are all ZERO even though the path is set, and also that the path is in the correct position within the view. The 280, 60 position shifts the path into what would be the center of the path bounding box when a rotation of +90 is made. If I change the rotation value I need to adjust the position. I should not have to make this adjustment manually.
I think a last resort is to somehow convert the bezier path to an image and then add it. I found that if I set the layer content to an image, then rotate, it rotates about its center point with no positional adjustment needed. Not so with setting the path.
UPDATE #2 12/4/2012 - I tried setting the frame, and with some fiddling I got it to center as follows:
CGRect box = CGPathGetPathBoundingBox(path.CGPath);
CGRect rect = CGRectMake(0, 0, box.origin.x + (3.5 * box.size.width), box.origin.y + (3.5 * box.size.height));
layer.frame = rect;
layer.transform = CATransform3DMakeRotation(DegreesToRadians(90), 0.0, 0.0, 1.0);
Why multiply by 3.5? I have no clue. I found that adding the box origin with about 3.5 times the size of the box shifts the rotated CAShapeLayer path to about where it should be.
There must be a better way to do this. This is a better solution than my previous post since the frame size does not depend on the rotation angle. I just don't know why the frame needs to be set to the value I am setting it to. I THOUGHT it should be
CGRectMake(0, 0, box.origin.x + (box.size.width / 2), box.origin.y + (box.size.height / 2));
However, it shifts the image to the left too much.
Another clue I found is that if I set the frame to [self view].frame (the frame of the entire parent view, which is the screen of the iPhone), then rotate, the rotation point is the center of the screen, and the path/image orbits around this center point. This is why I tried shifting the frame to what the center of the path should be, so that it orbits around the box center.
UPDATE #3 12/4/2012 - I tried to render the layer as an image. However, it appears that just setting the path of a layer does not make it an "image" in the layer since it is empty
CGRect box = CGPathGetPathBoundingBox(path.CGPath);
layer.frame = box;
UIImage *image = [ImageHelper imageFromLayer:layer]; // ImageHelper library I created
CAShapeLayer *newLayer = [CAShapeLayer layer];
newLayer.frame = CGRectMake(box.origin.x, box.origin.y, image.size.width, image.size.height);
newLayer.contents = (id) image.CGImage;
It appears that rotating the layer with its path set is no different than simply rotating the bezier path itself. I will go back to rotating the bezier path and see if I can fiddle with the position elements or something. There's got to be a solution to this.
Goal: Rotate a UIBezierPath around its center point within the view it was originally created in.
UPDATE #4 12/4/2012 - I ran a series of tests measuring the values needed for translation in order to place a UIBezierPath in its previous center location.
CGAffineTransform rotate = CGAffineTransformMakeRotation(DegreesToRadians(-15));
[path applyTransform:rotate];
// CGAffineTransform translate = CGAffineTransformMakeTranslation(-110, 70); // -45
CGAffineTransform translate = CGAffineTransformMakeTranslation(-52, -58); // -15
[path applyTransform:translate];
However, the ratios of x/y translations do not correspond, so I cannot extrapolate what translation is required based on the angle. It appears that CGAffineTransformMakeRotation uses some arbitrary anchor point for the rotation, which at the moment appears to be maybe (viewWidth / 2, 0). I am making this much harder than it needs to be. There's something I am missing to make a simple rotation so that the center point is maintained. I just need to "spin" the path 90 degrees left or right.
UPDATE #5 12/4/2012 - After running additional tests, it appears that the anchor point for rotating a UIBezierPath is the origin from which all of the points were drawn. In this case the origin is 0,0 and all of the points are relative to that point. Therefore, if a rotation is applied, the rotation occurs around that origin, which is why the path shifts up-right on -90 and up-left on 90. I need to somehow set the anchor point for the rotation to the center so it "spins" around the center, rather than the original origin point. 12 hours spent on this one issue.
After some detailed analysis and graphing the bounding box on paper, I found that my assertion that the origin is 0,0 is correct.
A solution to this problem is to translate the path (the underlying matrix) so that the center of its bounding box is at the origin, rotate it, then translate the path back to its original location.
Here's how to rotate a UIBezierPath 90 degrees:
CGRect box = CGPathGetPathBoundingBox(path.CGPath); // bounding box of the path being rotated
CGAffineTransform translate = CGAffineTransformMakeTranslation(-1 * (box.origin.x + (box.size.width / 2)), -1 * (box.origin.y + (box.size.height / 2)));
[path applyTransform:translate];
CGAffineTransform rotate = CGAffineTransformMakeRotation(DegreesToRadians(90));
[path applyTransform:rotate];
translate = CGAffineTransformMakeTranslation((box.origin.x + (box.size.width / 2)), (box.origin.y + (box.size.height / 2)));
[path applyTransform:translate];
Plug in -90 degrees to rotate in the other direction.
This formula can be used when rotating the device from portrait to landscape and vice versa.
I still don't think this is the ideal solution but the result is what I need for now.
If anyone has a better solution for this please post.
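For reference, the translate/rotate/translate-back sequence above can also be composed into a single transform before applying it (a sketch along the same lines, not a different technique):
// Rotate a UIBezierPath about the center of its bounding box.
static void RotatePathAroundCenter(UIBezierPath *path, CGFloat degrees)
{
    CGRect box = CGPathGetPathBoundingBox(path.CGPath);
    CGPoint center = CGPointMake(CGRectGetMidX(box), CGRectGetMidY(box));
    // Each call prepends, so points are translated to the origin, rotated, then moved back.
    CGAffineTransform t = CGAffineTransformMakeTranslation(center.x, center.y);
    t = CGAffineTransformRotate(t, degrees * M_PI / 180.0);
    t = CGAffineTransformTranslate(t, -center.x, -center.y);
    [path applyTransform:t];
}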
UPDATE 12/7/2012 - I found what I think is the best solution, and it is as simple as I thought it would be. Rather than using the rotate, translate, and scale methods on the bezier path, I instead extract the array of points as CGPoint objects and scale/translate them as needed based on the view size and the orientation. I then create a new bezier path and set the layer's path to it.
The result is perfect scaling, translation, rotation.