Touch point coordinates in iOS are not mapped properly

I'm trying to create an application that allows the user to move a frame over an image, so that I can apply some effects to a selected region.
I need to allow the user to precisely drag and scale the masked frame on the image. This has to be exact, just like in any other photo app.
My strategy is to get the user's touch points in a touch-moved event and scale my frame accordingly. That was pretty intuitive. I wrote the following handler for the touch-moved event:
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint touchPoint = [touch locationInView:[self view]];
    float currX = touchPoint.x;
    float currY = touchPoint.y;
    /* proceed with other operations on currX and currY,
       which is coming out quite well */
}
But the only problem is that the coordinates in currX and currY are not quite where they are supposed to be. There is a parallax-like error, which shifts from device to device. I also think the x and y coordinates get swapped on the iPad.
Could you please help me figure out how to get the exact touch coordinates?
My background image is in one view (imageBG) and the masked frame is in a separate one (maskBG). I have tried:
CGPoint touchPoint = [touch locationInView:[maskBG view]];
and
CGPoint touchPoint = [touch locationInView:[imageBG view]];
...but the same problem persists. I have also noticed that the touch error is worse on an iPad than on an iPhone or iPod.

image.center = [[[event allTouches] anyObject] locationInView:self.view];

Hi, your issue is that the image and the iPhone screen are not necessarily in the same aspect ratio. Your touch point might not translate correctly to your actual image.
- (UIImage *)getCroppedImage {
    CGRect rect = self.movingView.frame;
    CGPoint a;
    // translate the moving view's origin into the image view's coordinate space
    a.x = rect.origin.x - self.imageView.frame.origin.x;
    a.y = rect.origin.y - self.imageView.frame.origin.y;
    // scale from view coordinates to image pixel coordinates
    a.x = a.x * (self.imageView.image.size.width / self.imageView.frame.size.width);
    a.y = a.y * (self.imageView.image.size.height / self.imageView.frame.size.height);
    rect.origin = a;
    rect.size.width = rect.size.width * (self.imageView.image.size.width / self.imageView.frame.size.width);
    rect.size.height = rect.size.height * (self.imageView.image.size.height / self.imageView.frame.size.height);
    UIGraphicsBeginImageContext(rect.size);
    CGContextRef context = UIGraphicsGetCurrentContext();
    // translated rectangle for drawing the sub image
    CGRect drawRect = CGRectMake(-rect.origin.x, -rect.origin.y, self.imageView.image.size.width, self.imageView.image.size.height);
    // clip to the bounds of the image context
    // (not strictly necessary, as drawing is clipped to the context anyway)
    CGContextClipToRect(context, CGRectMake(0, 0, rect.size.width, rect.size.height));
    // draw the full image, offset so only the crop region lands in the context
    [self.imageView.image drawInRect:drawRect];
    // grab the cropped image
    UIImage *croppedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return croppedImage;
}
This is what I did to crop: movingView.frame is the rect I pass for cropping; see how it is translated so that it maps correctly onto the image. Make sure the image view on which the user sees the image uses the aspect-fit content mode.
Note: I make the image view's frame fit the aspect-fit image.
Use this to do it:
- (CGSize)makeSize:(CGSize)originalSize fitInSize:(CGSize)boxSize
{
    float widthScale = boxSize.width / originalSize.width;
    float heightScale = boxSize.height / originalSize.height;
    float scale = MIN(widthScale, heightScale);
    CGSize newSize = CGSizeMake(originalSize.width * scale, originalSize.height * scale);
    return newSize;
}
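For illustration, sizing the image view to the aspect-fit rect before cropping could look like this (a minimal sketch; the 300x300 box and the outlet names are assumptions, not from the original code):

// Fit the image view to its image's aspect ratio inside an assumed
// 300x300 box, then center it in the containing view.
CGSize fitted = [self makeSize:self.imageView.image.size
                     fitInSize:CGSizeMake(300, 300)];
self.imageView.frame = CGRectMake(0, 0, fitted.width, fitted.height);
self.imageView.center = self.view.center;
self.imageView.contentMode = UIViewContentModeScaleAspectFit;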

Have you tried this:
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint touchPoint = [touch locationInView:selectedImageView];
    // normalize to the image view's size, giving values in 0..1
    float currX = touchPoint.x / selectedImageView.frame.size.width;
    float currY = touchPoint.y / selectedImageView.frame.size.height;
    /* proceed with other operations on currX and currY */
}
Or you can use a UIPanGestureRecognizer instead.
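A minimal sketch of the gesture-recognizer approach, assuming maskView is the draggable frame (an illustrative name):

// e.g. in viewDidLoad:
UIPanGestureRecognizer *pan =
    [[UIPanGestureRecognizer alloc] initWithTarget:self
                                            action:@selector(handlePan:)];
[self.maskView addGestureRecognizer:pan];

- (void)handlePan:(UIPanGestureRecognizer *)gesture
{
    // the translation is queried in the superview's coordinate system,
    // which sidesteps the view-mismatch problem entirely
    CGPoint translation = [gesture translationInView:self.view];
    CGPoint center = gesture.view.center;
    gesture.view.center = CGPointMake(center.x + translation.x,
                                      center.y + translation.y);
    [gesture setTranslation:CGPointZero inView:self.view];
}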

Related

iOS: Cropping around a drawn path?

I'm using ACEDrawingView to draw within a view.
How would I detect the width and height of the drawing, so that I can crop around it, something like this:
Update: After @Duncan pointed me in the right direction, I was able to look through the source code and found the following:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    // save all the touches in the path
    UITouch *touch = [touches anyObject];
    previousPoint2 = previousPoint1;
    previousPoint1 = [touch previousLocationInView:self];
    currentPoint = [touch locationInView:self];
    if ([self.currentTool isKindOfClass:[ACEDrawingPenTool class]]) {
        CGRect bounds = [(ACEDrawingPenTool *)self.currentTool addPathPreviousPreviousPoint:previousPoint2
                                                                          withPreviousPoint:previousPoint1
                                                                           withCurrentPoint:currentPoint];
        // pad the dirty rect by the line width so thick strokes are not clipped
        CGRect drawBox = bounds;
        drawBox.origin.x -= self.lineWidth * 2.0;
        drawBox.origin.y -= self.lineWidth * 2.0;
        drawBox.size.width += self.lineWidth * 4.0;
        drawBox.size.height += self.lineWidth * 4.0;
        self.drawingBounds = bounds; // I added this property to expose the bounds to my view controller
        [self setNeedsDisplayInRect:drawBox];
    }
    else if ([self.currentTool isKindOfClass:[ACEDrawingTextTool class]]) {
        [self resizeTextViewFrame:currentPoint];
    }
    else {
        [self.currentTool moveFromPoint:previousPoint1 toPoint:currentPoint];
        [self setNeedsDisplay];
    }
}
However I get this when I test the bounds:
I'm going to keep trying to figure it out, but if anyone could help that would be great!
Update 3: Using CGContextGetPathBoundingBox I was finally able to achieve it.
Every time you get a touchesMoved, record the location of the point you are now drawing. When you are all done, you have all the points. Now look at the largest x value and the smallest x value and the largest y value and the smallest y value in all of those points. That's the bounding box of the drawing.
Another approach (which you've already discovered) is to save the CGPath and then call CGContextGetPathBoundingBox. Basically that does exactly the same thing.
Note that a path has no thickness, whereas your stroke does. You will need to inset the bounding box negatively to allow for this (my screencast doesn't do that).
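A minimal sketch of the min/max approach, assuming the points were collected into an NSArray of NSValue-wrapped CGPoints during the touch callbacks:

- (CGRect)boundingBoxForPoints:(NSArray *)points lineWidth:(CGFloat)lineWidth
{
    if (points.count == 0) return CGRectNull;
    CGPoint first = [[points firstObject] CGPointValue];
    CGFloat minX = first.x, maxX = first.x, minY = first.y, maxY = first.y;
    for (NSValue *value in points) {
        CGPoint p = [value CGPointValue];
        minX = MIN(minX, p.x); maxX = MAX(maxX, p.x);
        minY = MIN(minY, p.y); maxY = MAX(maxY, p.y);
    }
    CGRect box = CGRectMake(minX, minY, maxX - minX, maxY - minY);
    // grow the box to account for the stroke thickness
    return CGRectInset(box, -lineWidth / 2.0, -lineWidth / 2.0);
}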
I'm not familiar with the ACEDrawingView class. I can tell you how to do it with the iOS frameworks, though:
Create your path as a UIBezierPath.
Interrogate the bounds property of the path.
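For example (a sketch; the points come from your own touch handling, and lineWidth is your stroke width):

UIBezierPath *path = [UIBezierPath bezierPath];
[path moveToPoint:firstPoint];    // firstPoint captured in touchesBegan:
[path addLineToPoint:nextPoint];  // one addLineToPoint: per touchesMoved:
// ... keep appending points as the touch moves ...

// the path's bounds, padded for the stroke width
CGRect box = CGRectInset(path.bounds, -lineWidth / 2.0, -lineWidth / 2.0);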

Measure distance of line drawn on iPhone screen

I am trying to create an application that allows the user to draw a line on the screen and measures the distance of the drawn line. I have been able to successfully draw the line, but I don't know how to measure it. The line does not have to be perfectly straight, either; it is basically a squiggle. If someone could please point me in the right direction or help guide me, that would be awesome. I am using Xcode 5.1.1 and Objective-C. I only just started dabbling in the language this summer.
EDIT: I am looking to measure the distance in either inches or cm. I would like the measurement to follow the curve of the entire line: the distance, not the displacement.
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    mouseSwipe = YES; // declared in the header
    UITouch *touch = [touches anyObject];
    currentPoint = [touch locationInView:self.view]; // track finger movement on screen
    UIGraphicsBeginImageContext(CGSizeMake(320, 568)); // 568 for iPhone 5, 480 for iPhone 4
    [drawImage.image drawInRect:CGRectMake(0, 0, 320, 568)]; // (0,0) is the top-left corner
    CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound); // round line ends
    CGContextSetLineWidth(UIGraphicsGetCurrentContext(), 5.0); // line width
    CGContextSetStrokeColorWithColor(UIGraphicsGetCurrentContext(), [[UIColor redColor] CGColor]); // red stroke
    //CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), 0, 1, 0, 1); // green stroke
    CGContextBeginPath(UIGraphicsGetCurrentContext()); // start the drawn path
    CGContextMoveToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
    CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), currentPoint.x, currentPoint.y);
    CGContextStrokePath(UIGraphicsGetCurrentContext());
    [drawImage setFrame:CGRectMake(0, 0, 320, 568)];
    drawImage.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext(); // finished drawing for this move
    lastPoint = currentPoint;
    [self.view addSubview:drawImage]; // note: better to add this subview once, not on every move
}
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    location = [touch locationInView:touch.view];
    lastClick = [NSDate date];
    lastPoint = [touch locationInView:self.view]; // stops connecting to the previous line
    [super touchesBegan:touches withEvent:event];
}
First add a property that accumulates the length of the path.
@property (nonatomic, assign) CGFloat pathLength;
Initialize it to 0.0 when the user begins drawing the path; do this in touchesBegan, or wherever in your code you detect that drawing is starting. Then add a method that computes the Cartesian distance between two points:
- (CGFloat)distanceFrom:(CGPoint)p1 to:(CGPoint)p2 {
    CGFloat x = p2.x - p1.x;
    CGFloat y = p2.y - p1.y;
    return sqrt(x * x + y * y);
}
As touches-moved events arrive, you are already handling the current and last touch positions. All you must do now is accumulate the distance between successive points:
// in touches moved, after you have lastPoint and currentPoint
self.pathLength += [self distanceFrom:currentPoint to:lastPoint];
There are quite a few references here and elsewhere for converting these points to inches or cm. As far as I can see, all of them are hampered by the SDK's inability to report the device's resolution at runtime. If you're willing to add a (dangerous) constant to the code, you can look up the device's PPI and divide it into the pathLength computed above.
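A sketch of that conversion. The 163 points-per-inch constant holds for most iPhones (touch coordinates are in points, not pixels), but it cannot be queried from the SDK; it is exactly the fragile assumption mentioned above:

// Hypothetical conversion; kPointsPerInch is a hard-coded assumption.
static const CGFloat kPointsPerInch = 163.0;

CGFloat inches = self.pathLength / kPointsPerInch;
CGFloat centimeters = inches * 2.54;
NSLog(@"Line length: %.2f in (%.2f cm)", inches, centimeters);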

Cropping ImageView to Produce a New Image

I am trying to crop an image whenever the user touches the UIImageView on the screen. The UIImageView is a 640 x 300 area, and I allow the user to touch anywhere in it. Then I use the following code to view the cropped image, and it always shows me the wrong image. I think I am having trouble getting the correct coordinates.
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint location = [[touches anyObject] locationInView:self.view];
    UIImage *originalImage = self.imageView.image;
    NSLog(@"x = %f, y = %f", location.x, location.y);
    CGRect cropRegion = CGRectMake(location.x, location.y, 10, 10);
    CGImageRef subImage = CGImageCreateWithImageInRect(originalImage.CGImage, cropRegion);
    UIImage *croppedImage = [UIImage imageWithCGImage:subImage];
    CGImageRelease(subImage); // CGImageCreateWithImageInRect returns a +1 reference
}
You're getting the coordinates relative to self.view, not relative to the image view. Try:
CGPoint location = [[touches anyObject] locationInView:self.imageView];
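Even with the right view, the crop rect is still in view points, not image pixels; if the image is larger than the view you also need to scale. A sketch, assuming the image view uses UIViewContentModeScaleToFill so the mapping is a plain ratio (aspect-fit would also need the letterboxing offset):

CGPoint location = [[touches anyObject] locationInView:self.imageView];
UIImage *originalImage = self.imageView.image;

// scale from view points to image pixels
CGFloat scaleX = originalImage.size.width  / self.imageView.bounds.size.width;
CGFloat scaleY = originalImage.size.height / self.imageView.bounds.size.height;
CGRect cropRegion = CGRectMake(location.x * scaleX, location.y * scaleY,
                               10 * scaleX, 10 * scaleY);

CGImageRef subImage = CGImageCreateWithImageInRect(originalImage.CGImage, cropRegion);
UIImage *croppedImage = [UIImage imageWithCGImage:subImage];
CGImageRelease(subImage);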

Define custom touch area in custom UIControl object

I am creating a custom UIControl object as detailed here. It is all working well except for the touch area.
I want to find a way to limit the touch area to only part of the control, in the example above I want it to be restricted to the black circumference only rather than the whole control area.
Any idea?
Cheers
You can override UIView's pointInside:withEvent: to reject unwanted touches.
Here's a method that checks if the touch occurred in a ring around the center of the view:
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
    // the point parameter is already in this view's coordinate system,
    // so there is no need to dig the touch out of the event
    CGRect bounds = self.bounds;
    CGPoint center = { CGRectGetMidX(bounds), CGRectGetMidY(bounds) };
    CGVector delta = { point.x - center.x, point.y - center.y };
    CGFloat squareDistance = delta.dx * delta.dx + delta.dy * delta.dy;
    CGFloat outerRadius = bounds.size.width * 0.5;
    if (squareDistance > outerRadius * outerRadius)
        return NO; // outside the ring
    CGFloat innerRadius = outerRadius * 0.5;
    if (squareDistance < innerRadius * innerRadius)
        return NO; // inside the hole
    return YES;
}
To detect other hits on more complex shapes you can use a CGPath to describe the shape and test using CGPathContainsPoint. Another way is to use an image of the control and test the pixel's alpha value.
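A sketch of the CGPath variant; the ring path here is illustrative, so build whatever shape your control actually draws:

- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
    CGRect bounds = self.bounds;
    CGFloat outerRadius = bounds.size.width * 0.5;
    CGFloat ringWidth = outerRadius * 0.5;
    // an even-odd ring: outer ellipse minus inner ellipse
    CGMutablePathRef ring = CGPathCreateMutable();
    CGPathAddEllipseInRect(ring, NULL, bounds);
    CGPathAddEllipseInRect(ring, NULL, CGRectInset(bounds, ringWidth, ringWidth));
    bool inside = CGPathContainsPoint(ring, NULL, point, true); // true = even-odd rule
    CGPathRelease(ring);
    return inside ? YES : NO;
}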
All that depends on how you build your control.

Draw images evenly spaced along a path in iOS

What I want to do is move my finger across the screen (touchesMoved) and draw evenly spaced images (perhaps CGImageRefs) along the points generated by the touchesMoved. I can draw lines, but what I want to generate is something that looks like this (for this example I am using an image of an arrow, but it could be any image; it could be a picture of my dog :) ). The main thing is to space the images evenly when drawing with a finger on an iPhone or iPad.
First of all, HUGE props go out to Kendall. So, based on his answer, here is the code to take a UIImage, draw it on screen along a path (not a real CGPathRef, just a logical path created by the points) based on the distance between the touches, and then rotate the image correctly based on the vector between the current and previous points. I hope you like it:
First you need to load an image to be used as a CGImage over and over again:
NSString *imagePath = [[NSBundle mainBundle] pathForResource:@"arrow.png" ofType:nil];
UIImage *img = [UIImage imageWithContentsOfFile:imagePath];
image = CGImageRetain(img.CGImage);
Make sure that in your dealloc you call:
CGImageRelease(image);
Then, in touchesBegan, store the starting point in a variable scoped outside the method; declare the ivars in your header like this (in this case I am drawing into a UIView):
@interface myView : UIView {
    CGImageRef image;       // the stamp loaded above
    CGPoint lastPoint;
    CGPoint currentPoint;
    UIImageView *drawImage; // accumulates the drawn output
}
@end
Then in touchesBegan:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    lastPoint = [touch locationInView:self];
}
And finally, in touchesMoved, composite the accumulated bitmap to the screen; once the distance moved is large enough (in my case 73, since my image is 73 x 73 pixels), draw the stamp image, save the new composite, and set lastPoint equal to currentPoint:
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    currentPoint = [touch locationInView:self];
    double deltaX = lastPoint.x - currentPoint.x;
    double deltaY = lastPoint.y - currentPoint.y;
    double distance = sqrt(deltaX * deltaX + deltaY * deltaY);
    if (distance >= 73) {
        lastPoint = currentPoint;
        UIGraphicsBeginImageContext(self.frame.size);
        [drawImage.image drawInRect:CGRectMake(0, 0, self.frame.size.width, self.frame.size.height)];
        CGContextSaveGState(UIGraphicsGetCurrentContext());
        // atan2 already returns radians; pass them straight to the rotation helper
        float angle = atan2(deltaY, deltaX);
        CGContextDrawImage(UIGraphicsGetCurrentContext(),
                           CGRectMake(currentPoint.x, currentPoint.y, 73, 73),
                           [self CGImageRotatedByAngle:image angle:-angle]);
        CGContextRestoreGState(UIGraphicsGetCurrentContext());
        drawImage.image = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
    }
}
- (CGImageRef)CGImageRotatedByAngle:(CGImageRef)imgRef angle:(CGFloat)angleInRadians
{
    CGFloat width = CGImageGetWidth(imgRef);
    CGFloat height = CGImageGetHeight(imgRef);
    CGRect imgRect = CGRectMake(0, 0, width, height);
    CGAffineTransform transform = CGAffineTransformMakeRotation(angleInRadians);
    CGRect rotatedRect = CGRectApplyAffineTransform(imgRect, transform);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef bmContext = CGBitmapContextCreate(NULL,
                                                   rotatedRect.size.width,
                                                   rotatedRect.size.height,
                                                   8,
                                                   0,
                                                   colorSpace,
                                                   kCGImageAlphaPremultipliedFirst);
    CGContextSetAllowsAntialiasing(bmContext, FALSE);
    CGContextSetInterpolationQuality(bmContext, kCGInterpolationNone);
    CGColorSpaceRelease(colorSpace);
    // rotate around the center of the bitmap
    CGContextTranslateCTM(bmContext,
                          +(rotatedRect.size.width / 2),
                          +(rotatedRect.size.height / 2));
    CGContextRotateCTM(bmContext, angleInRadians);
    CGContextTranslateCTM(bmContext,
                          -(rotatedRect.size.width / 2),
                          -(rotatedRect.size.height / 2));
    CGContextDrawImage(bmContext, CGRectMake(0, 0,
                                             rotatedRect.size.width,
                                             rotatedRect.size.height),
                       imgRef);
    CGImageRef rotatedImage = CGBitmapContextCreateImage(bmContext);
    CFRelease(bmContext);
    [(id)rotatedImage autorelease]; // pre-ARC code: hand ownership to the autorelease pool
    return rotatedImage;
}
This will create an image that looks like this:
I'm also going to add the following (with some changes to the code above) to try to fill in the gaps where touchesMoved skips points when you move fast:
CGPoint point1 = CGPointMake(100, 200);
CGPoint point2 = CGPointMake(300, 100);
double deltaX = point2.x - point1.x;
double deltaY = point2.y - point1.y;
double distance = sqrt(deltaX * deltaX + deltaY * deltaY);
// walk along the segment in 73-pixel steps, emitting the intermediate points
for (int j = 1; j * 73 < distance; j++)
{
    double x = point1.x + (deltaX / distance) * 73 * j;
    double y = point1.y + (deltaY / distance) * 73 * j;
    NSLog(@"My new point is x: %f y: %f", x, y);
}
Assuming that you already have code that tracks the user's touch as they move their touch around the screen, it sounds like you want to detect when they have moved a distance equal to the length of your image, at which time you want to draw another copy of your image under their touch.
To achieve this, I think you will need to:
calculate the length (width) of your image
implement code to draw copies of your image onto your view, rotated to any angle
each time the user's touch moves (e.g. in touchesMoved:):
calculate the delta of the touch each time it moves and generate the "length" of that delta (e.g. something like sqrt(dx^2 + dy^2))
accumulate the distance since the last image was drawn
if the distance has reached the length of your image, draw a copy of your image under the touch's current position, rotated appropriately (probably according to the vector from the position of the last image to the current position)
How does that sound? A compact sketch of this flow follows.
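A minimal sketch of steps 3 to 5; stampLength and stampImageAt:angle: are illustrative names for your image's width and your step-2 drawing routine, not existing APIs:

// ivar, reset to 0 in touchesBegan:
// CGFloat distanceSinceLastStamp;

-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint current = [touch locationInView:self];
    CGPoint previous = [touch previousLocationInView:self];
    CGFloat dx = current.x - previous.x;
    CGFloat dy = current.y - previous.y;
    // accumulate the length of the delta since the last stamp
    distanceSinceLastStamp += sqrt(dx * dx + dy * dy);
    if (distanceSinceLastStamp >= stampLength) {
        CGFloat angle = atan2(dy, dx); // direction of travel, in radians
        [self stampImageAt:current angle:angle]; // hypothetical drawing routine
        distanceSinceLastStamp = 0;
    }
}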