I am trying to draw an arrow using my finger on the screen. The idea is that touching the screen sets the initial coordinates of the arrow, and as I drag on the screen the arrow extends and follows my finger. The arrow's width stays the same; it's the length that changes: the arrow gets longer as I drag away from the starting point. I tried doing it with something like this:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UIGraphicsBeginImageContext(CGSizeMake(1536, 2048));
    UITouch *touch = [touches anyObject];
    CGPoint p1 = [touch locationInView:self.view];
    CGSize size;
    size.width = 50;
    size.height = 400;
    CGContextRef context = UIGraphicsGetCurrentContext();
    [self drawArrowWithContext:context atPoint:p1 withSize:size lineWidth:4 arrowHeight:20 andColor:[UIColor whiteColor]];
    // converts your context into a UIImage
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    // Adds that image into an imageView and sticks it on the screen.
    UIImageView *imageView = [[UIImageView alloc] initWithImage:image];
    [self.view addSubview:imageView];
}
and
- (void)drawArrowWithContext:(CGContextRef)context atPoint:(CGPoint)startPoint withSize:(CGSize)size lineWidth:(float)width arrowHeight:(float)aheight andColor:(UIColor *)color
{
    float width_wing = (size.width - width) / 2;
    float main = size.height - aheight;
    CGContextSetFillColorWithColor(context, [color CGColor]);
    CGContextSetStrokeColorWithColor(context, [color CGColor]);
    CGPoint rectangle_points[] = {
        CGPointMake(startPoint.x + width_wing, startPoint.y + 0.0),
        CGPointMake(startPoint.x + width_wing, startPoint.y + main),
        CGPointMake(startPoint.x + 0.0, startPoint.y + main), // left point
        CGPointMake(startPoint.x + size.width / 2, startPoint.y + size.height), // tip
        CGPointMake(startPoint.x + size.width, startPoint.y + main), // right point
        CGPointMake(startPoint.x + size.width - width_wing, startPoint.y + main),
        CGPointMake(startPoint.x + size.width - width_wing, startPoint.y + 0.0),
        CGPointMake(startPoint.x + width_wing, startPoint.y + 0.0),
    };
    CGContextAddLines(context, rectangle_points, 8);
    CGContextFillPath(context);
}
The arrow does appear on the screen if I run the code from a normal IBOutlet instead of touchesMoved, but that was not the idea. I haven't managed to get this code to work yet, and I think that even if it did, it would cause a crash, since I am deleting and redrawing the shape each time. Is this the right approach? What should I do?
Basically, there are two different options:
Stick to the UIImageView approach and stop recreating the image view all the time: keep a reference to that UIImageView in the class detecting the touch events and just replace its image when something changes (see the sketch after this list). This might be worth the extra effort of off-screen drawing if you need the image for something else.
Implement the arrow drawing dynamically in an extra view.
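A minimal sketch of the first option (my assumptions: the controller keeps a UIImageView property named imageView, and the drawArrowWithContext: method from the question is available):
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint p1 = [touch locationInView:self.view];
    UIGraphicsBeginImageContext(self.view.bounds.size);
    CGContextRef context = UIGraphicsGetCurrentContext();
    [self drawArrowWithContext:context atPoint:p1 withSize:CGSizeMake(50, 400) lineWidth:4 arrowHeight:20 andColor:[UIColor whiteColor]];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext(); // balance the Begin call every time
    if (self.imageView == nil) {
        // Create the image view once and reuse it on every subsequent move.
        self.imageView = [[UIImageView alloc] initWithImage:image];
        [self.view addSubview:self.imageView];
    } else {
        self.imageView.image = image;
    }
}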
I'll briefly describe the second one here:
The class detecting the events needs a member variable/property, ArrowView *arrowView, and something to remember the start point, say CGPoint startPoint.
ArrowView needs properties for the arrow parameters: CGPoint arrowStart and CGSize arrowSize.
In the touch event handlers, do:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint position = [touch locationInView:self.view];
    self.startPoint = position;
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint next = [touch locationInView:self.view];
    CGSize size;
    size.width = next.x - self.startPoint.x;
    size.height = next.y - self.startPoint.y;
    self.arrowView.arrowStart = self.startPoint;
    self.arrowView.arrowSize = size;
    // Trigger a redraw; drawRect: will be called with the new parameters.
    [self.arrowView setNeedsDisplay];
}
In ArrowView:
- (void)drawRect:(CGRect)rect {
    [super drawRect:rect];
    CGContextRef context = UIGraphicsGetCurrentContext();
    // Now do the drawing stuff here using that context!
    // self.arrowStart / self.arrowSize contain your drawing parameters.
    ...
}
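For instance, a minimal sketch of that body, reusing the drawArrowWithContext: method from the question (this assumes you move that method into ArrowView):
- (void)drawRect:(CGRect)rect {
    [super drawRect:rect];
    CGContextRef context = UIGraphicsGetCurrentContext();
    // Draw with the stored parameters; the view redraws whenever
    // setNeedsDisplay is called from the touch handlers.
    [self drawArrowWithContext:context
                       atPoint:self.arrowStart
                      withSize:self.arrowSize
                     lineWidth:4
                   arrowHeight:20
                      andColor:[UIColor whiteColor]];
}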
I hope that gives you an idea.
For further reading, start here.
You can try using a UIPanGestureRecognizer to follow your finger and draw on top of it.
Related
I am creating a custom UIControl object as detailed here. It is all working well except for the touch area.
I want to find a way to limit the touch area to only part of the control, in the example above I want it to be restricted to the black circumference only rather than the whole control area.
Any idea?
Cheers
You can override UIView's pointInside:withEvent: to reject unwanted touches.
Here's a method that checks if the touch occurred in a ring around the center of the view:
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
    // The point parameter already arrives in this view's coordinate space, so
    // there is no need to pull a UITouch out of the event (during hit-testing
    // the touch is not yet associated with this view anyway).
    CGRect bounds = self.bounds;
    CGPoint center = { CGRectGetMidX(bounds), CGRectGetMidY(bounds) };
    CGVector delta = { point.x - center.x, point.y - center.y };
    CGFloat squareDistance = delta.dx * delta.dx + delta.dy * delta.dy;
    // Outside the outer circle: reject.
    CGFloat outerRadius = bounds.size.width * 0.5;
    if (squareDistance > outerRadius * outerRadius)
        return NO;
    // Inside the ring's hole: reject.
    CGFloat innerRadius = outerRadius * 0.5;
    if (squareDistance < innerRadius * innerRadius)
        return NO;
    return YES;
}
To detect other hits on more complex shapes you can use a CGPath to describe the shape and test using CGPathContainsPoint. Another way is to use an image of the control and test the pixel's alpha value.
All that depends on how you build your control.
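For instance, a minimal path-based variant of the ring test above (a sketch; the geometry here is illustrative and should match your control's actual shape):
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
    CGRect bounds = self.bounds;
    CGMutablePathRef ring = CGPathCreateMutable();
    // Outer circle plus inner circle; the even-odd rule turns them into a ring.
    CGPathAddEllipseInRect(ring, NULL, bounds);
    CGPathAddEllipseInRect(ring, NULL, CGRectInset(bounds, bounds.size.width * 0.25, bounds.size.height * 0.25));
    BOOL inside = CGPathContainsPoint(ring, NULL, point, true); // true = even-odd
    CGPathRelease(ring);
    return inside;
}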
In my app, a circle is drawn based on a user's drag: the user taps, and that point is the center of the circle that will be drawn; as they drag their finger, the circle grows toward that point. This works, except that for some reason the center moves down and to the right as the radius of the circle grows. Why is this happening? Here is what I am trying:
@implementation CircleView {
    CGPoint center;
    CGPoint endPoint;
    CGFloat distance;
}
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    center = [touch locationInView:self];
    [self setNeedsDisplay];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    endPoint = [touch locationInView:self];
    CGFloat xDist = (endPoint.x - center.x);
    CGFloat yDist = (endPoint.y - center.y);
    distance = sqrt((xDist * xDist) + (yDist * yDist));
    [self setNeedsDisplay];
}
- (void)drawRect:(CGRect)rect {
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetLineWidth(context, 2.0);
    CGContextSetStrokeColorWithColor(context, [UIColor blueColor].CGColor);
    CGRect rectangle = CGRectMake(center.x, center.y, distance, distance);
    CGContextAddEllipseInRect(context, rectangle);
    CGContextStrokePath(context);
}
What should be happening is that the center point never moves and the circle just grows. Any ideas?
CGRect rectangle = CGRectMake(center.x,center.y,distance, distance);
should be:
CGRect rectangle = CGRectMake(center.x - distance, center.y - distance, distance * 2, distance * 2);
Because in CGRectMake you have to specify the origin (and the size) of the rectangle, not the center.
I'm trying to draw a ruler that follows the touches on the screen. On the first touch, I set the first point; from then on, the ruler follows the finger on the screen, resizing and rotating around the first point depending on where the finger is.
I tried to do this:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint p = [touch locationInView:self.view];
    self.regleFirstPoint = p;
    UIImageView *img = [[UIImageView alloc] initWithImage:self.regleImg];
    img.frame = CGRectMake(self.regleFirstPoint.x, self.regleFirstPoint.y, 0, self.regleImg.size.height);
    [self.view addSubview:img];
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint p = [touch locationInView:self.view];
    // Width
    float deltaY = p.y - self.regleFirstPoint.y;
    float deltaX = p.x - self.regleFirstPoint.x;
    float width = sqrt((deltaX * deltaX) + (deltaY * deltaY));
    // Angle
    float angleInRadians = atanf(deltaY / deltaX);
    float angleInDegrees = angleInRadians * 180 / M_PI; // JUST FOR INFO
    NSLog(@"angle : %f / %f", angleInRadians, angleInDegrees);
    // Resizing image
    UIImageView *img = [self.regles lastObject];
    img.frame = CGRectMake(self.regleFirstPoint.x, self.regleFirstPoint.y, width, self.regleImg.size.height / 2);
    img.center = self.regleFirstPoint;
    CGAffineTransform transform = CGAffineTransformIdentity;
    img.transform = CGAffineTransformRotate(transform, angleInRadians);
}
The ruler doesn't follow the finger correctly; I think I missed something. What's wrong with my code?
EDIT: I also tried this after some research:
// Resizing images
img.frame = CGRectMake(self.regleFirstPoint.x, self.regleFirstPoint.y, largeur, self.regleImg.size.height/2);
[img.layer setAnchorPoint:CGPointMake(self.regleFirstPoint.x / img.bounds.size.width, self.regleFirstPoint.y / img.bounds.size.height)];
img.transform = CGAffineTransformRotate(img.transform, angleInRadians);
This is just a question of order:
1. Set the transform of your view to identity.
2. Change the frame of your view.
3. Finally, apply your transform.
Here is an updated piece of code, just used an UIView instead of your image:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    // Hold first touch
    self.regleFirstPoint = [touch locationInView:self.view];
    // Reset view / image
    _rulerView.transform = CGAffineTransformIdentity;
    _rulerView.frame = CGRectMake(self.regleFirstPoint.x, self.regleFirstPoint.y, 0, CGRectGetHeight(_rulerView.frame));
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint p = [touch locationInView:self.view];
    // Compute width
    float deltaX = p.x - self.regleFirstPoint.x;
    float deltaY = p.y - self.regleFirstPoint.y;
    float width = sqrt((deltaX * deltaX) + (deltaY * deltaY));
    // Compute angle
    float angleInRadians = atan2(deltaY, deltaX);
    NSLog(@"angle (rad) : %f", angleInRadians);
    NSLog(@"angle (deg) : %f", angleInRadians * 180 / M_PI);
    // First reset the transformation to identity
    _rulerView.transform = CGAffineTransformIdentity;
    // Set anchor point
    _rulerView.layer.anchorPoint = CGPointMake(0.0f, 0.5f);
    // Resizing view / image
    _rulerView.frame = CGRectMake(self.regleFirstPoint.x, self.regleFirstPoint.y, width, CGRectGetHeight(_rulerView.frame));
    // Reset the layer position to the first point
    _rulerView.layer.position = self.regleFirstPoint;
    // Apply rotation transformation
    _rulerView.transform = CGAffineTransformMakeRotation(angleInRadians);
}
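One note on why the order matters here: changing layer.anchorPoint shifts where the already-set frame is rendered, which is why the layer's position is reset to the first point immediately afterwards; the final rotation then pivots around that point.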
Hope that helps.
Cyril
I'm trying to create an application that allows the user to move a frame over an image, so that I can apply some effects to a selected region.
I need to allow the user to precisely drag and scale the masked frame on the image. I need this to be exact, just like any other photo app.
My strategy is to get the user's touch points in a touch-moved event and scale my frame accordingly. That was pretty intuitive. I wrote the following to handle the touch-moved event:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint touchPoint = [touch locationInView:[self view]];
    float currX = touchPoint.x;
    float currY = touchPoint.y;
    /* proceed with other operations on currX and currY,
       which is coming out quite well */
}
But the only problem is that the coordinates in the currX and currY variables are not quite where they are supposed to be. There is a parallax error that varies from device to device. I also think the x and y coordinates get swapped in the case of an iPad.
Could you please help me figure out how to get the exact touch coordinates?
My background image is in one view (imageBG) and the masked frame is in a separate one (maskBG). I have tried out:
CGPoint touchPoint = [touch locationInView:[maskBG view]];
and
CGPoint touchPoint = [touch locationInView:[imageBG view]];
...but the same problem persists. I have also noticed that the touch error is worse on an iPad than on an iPhone or iPod.
image.center = [[[event allTouches] anyObject] locationInView:self.view];
Hi, your issue is that the image and the iPhone screen are not necessarily in the same aspect ratio. Your touch point might not translate correctly to your actual image.
- (UIImage *)getCroppedImage {
    CGRect rect = self.movingView.frame;
    CGPoint a;
    a.x = rect.origin.x - self.imageView.frame.origin.x;
    a.y = rect.origin.y - self.imageView.frame.origin.y;
    a.x = a.x * (self.imageView.image.size.width / self.imageView.frame.size.width);
    a.y = a.y * (self.imageView.image.size.height / self.imageView.frame.size.height);
    rect.origin = a;
    rect.size.width = rect.size.width * (self.imageView.image.size.width / self.imageView.frame.size.width);
    rect.size.height = rect.size.height * (self.imageView.image.size.height / self.imageView.frame.size.height);
    UIGraphicsBeginImageContext(rect.size);
    CGContextRef context = UIGraphicsGetCurrentContext();
    // translated rectangle for drawing sub image
    CGRect drawRect = CGRectMake(-rect.origin.x, -rect.origin.y, self.imageView.image.size.width, self.imageView.image.size.height);
    // clip to the bounds of the image context
    // not strictly necessary as it will get clipped anyway?
    CGContextClipToRect(context, CGRectMake(0, 0, rect.size.width, rect.size.height));
    // draw image
    [self.imageView.image drawInRect:drawRect];
    // grab image
    UIImage *croppedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return croppedImage;
}
This is what I did to crop: movingView's frame is the rect I pass for cropping; see how it's translated so it maps correctly onto the image. Make sure the image view on which the user sees the image uses the aspect-fit content mode.
Note: I make the image view's rect fit the aspect-fit image.
Use this to do it:
- (CGSize)makeSize:(CGSize)originalSize fitInSize:(CGSize)boxSize
{
    CGFloat widthScale = boxSize.width / originalSize.width;
    CGFloat heightScale = boxSize.height / originalSize.height;
    float scale = MIN(widthScale, heightScale);
    CGSize newSize = CGSizeMake(originalSize.width * scale, originalSize.height * scale);
    return newSize;
}
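A hypothetical usage (assuming the controller owns imageView and sizes it to fit its own bounds):
CGSize fitted = [self makeSize:self.imageView.image.size fitInSize:self.view.bounds.size];
self.imageView.frame = CGRectMake(0, 0, fitted.width, fitted.height);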
Have you tried this? It normalizes the touch point against the image view's size, so currX and currY come out as fractions of the view's width and height regardless of device:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint touchPoint = [touch locationInView:selectedImageView];
    float currX = touchPoint.x / selectedImageView.frame.size.width;
    float currY = touchPoint.y / selectedImageView.frame.size.height;
    /* proceed with other operations on currX and currY,
       which is coming out quite well */
}
Or you can also use a UIPanGestureRecognizer.
I am working on a sketching app on the iPhone.
I got it working, but it's not pretty, as seen here.
I am looking for any suggestions to smooth the drawing.
Basically, what I did is: when the user places a finger on the screen, I call
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
then I collect each touch in an array with
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
and when the user lifts the finger from the screen, I call
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
then I draw all the points in the array using
NSMutableArray *points = [collectedArray points];
CGPoint firstPoint;
[[points objectAtIndex:0] getValue:&firstPoint];
CGContextMoveToPoint(context, firstPoint.x, firstPoint.y);
CGContextSetLineCap(context, kCGLineCapRound);
CGContextSetLineJoin(context, kCGLineJoinRound);
for (int i = 1; i < [points count]; i++) {
    NSValue *value = [points objectAtIndex:i];
    CGPoint point;
    [value getValue:&point];
    CGContextAddLineToPoint(context, point.x, point.y);
}
CGContextStrokePath(context);
UIGraphicsPushContext(context);
And now I want to improve the drawing to be more like the "Sketch Book" app.
I think there is some signal-processing algorithm for rearranging all the points in the array, but I am not sure. Any help would be much appreciated.
Thanks in advance :)
CGPoint midPoint(CGPoint p1, CGPoint p2)
{
    return CGPointMake((p1.x + p2.x) * 0.5, (p1.y + p2.y) * 0.5);
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    previousPoint1 = [touch previousLocationInView:self];
    previousPoint2 = [touch previousLocationInView:self];
    currentPoint = [touch locationInView:self];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    previousPoint2 = previousPoint1;
    previousPoint1 = [touch previousLocationInView:self];
    currentPoint = [touch locationInView:self];
    // calculate mid points
    CGPoint mid1 = midPoint(previousPoint1, previousPoint2);
    CGPoint mid2 = midPoint(currentPoint, previousPoint1);
    UIGraphicsBeginImageContext(self.imageView.frame.size);
    CGContextRef context = UIGraphicsGetCurrentContext();
    [self.imageView.image drawInRect:CGRectMake(0, 0, self.imageView.frame.size.width, self.imageView.frame.size.height)];
    CGContextMoveToPoint(context, mid1.x, mid1.y);
    // Using a quad curve is the key
    CGContextAddQuadCurveToPoint(context, previousPoint1.x, previousPoint1.y, mid2.x, mid2.y);
    CGContextSetLineCap(context, kCGLineCapRound);
    CGContextSetLineWidth(context, 2.0);
    CGContextSetRGBStrokeColor(context, 1.0, 0.0, 0.0, 1.0);
    CGContextStrokePath(context);
    self.imageView.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
}
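The reason this looks smooth: consecutive segments share a midpoint, and the shared touch point acts as the quadratic control point, so each curve segment starts off in the same direction the previous one ended, eliminating the visible corners you get with straight line segments.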
The easiest way to smooth a curve like this is to use a Bezier curve instead of straight line segments. For the math behind this, see this article (pointed to in this answer), which describes how to calculate the curves required to smooth a curve that passes through multiple points.
I believe that the Core Plot framework now has the ability to smooth the curves of plots, so you could look at the code used there to implement this kind of smoothing.
There's no magic to any of this, as these smoothing routines are fast and relatively easy to implement.
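If you want to implement the smoothing yourself, one common formulation (a sketch under the usual uniform Catmull-Rom assumptions, not code from the linked article) converts each run of four consecutive points into a cubic Bezier segment that passes through the middle two:
// Appends one Catmull-Rom segment (p1 -> p2) to a path as a cubic Bezier.
// Assumes the current point of the path is already at p1.
static void addCatmullRomSegment(CGMutablePathRef path, CGPoint p0, CGPoint p1, CGPoint p2, CGPoint p3)
{
    CGPoint c1 = CGPointMake(p1.x + (p2.x - p0.x) / 6.0, p1.y + (p2.y - p0.y) / 6.0);
    CGPoint c2 = CGPointMake(p2.x - (p3.x - p1.x) / 6.0, p2.y - (p3.y - p1.y) / 6.0);
    CGPathAddCurveToPoint(path, NULL, c1.x, c1.y, c2.x, c2.y, p2.x, p2.y);
}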
I really love this topic. Thanks for all the implementations, especially Krzysztof Zabłocki's and Yu-Sen Han's.
I have modified Yu-Sen Han's version to change the line thickness depending on the speed of panning (in fact, the distance between the last touches). I've also implemented dot drawing (for when the touchesBegan and touchesEnded locations are close to each other).
Here is the result:
To define the line thickness, I've chosen this function of the distance (don't ask me why... I just thought it suited well, but I'm sure you can find a better one):
CGFloat dist = distance(previousPoint1, currentPoint);
CGFloat newWidth = 4*(atan(-dist/15+1) + M_PI/2)+2;
One more hint: to be sure the thickness changes smoothly, I've bounded it based on the previous segment's thickness and a custom coefficient (WIDTH_RANGE_COEF below):
self.lineWidth = MAX(MIN(newWidth,lastWidth*WIDTH_RANGE_COEF),lastWidth/WIDTH_RANGE_COEF);
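The distance() helper isn't shown in the answer; a minimal version (my assumption of what it computes) could be:
// Euclidean distance between two touch locations.
static CGFloat distance(CGPoint a, CGPoint b)
{
    CGFloat dx = b.x - a.x;
    CGFloat dy = b.y - a.y;
    return sqrt(dx * dx + dy * dy);
}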
I translated kyoji's answer into Swift, as a reusable subclass of UIImageView. The subclass TouchDrawImageView allows the user to draw on an image view with her finger.
Once you've added this TouchDrawImageView class to your project, make sure to open your storyboard and
select TouchDrawImageView as the "Custom Class" of your image view
check "User Interaction Enabled" property of your image view
Here's the code of TouchDrawImageView.swift:
import UIKit

class TouchDrawImageView: UIImageView {

    var previousPoint1 = CGPoint()

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        previousPoint1 = touch.previousLocation(in: self)
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        let previousPoint2 = previousPoint1
        previousPoint1 = touch.previousLocation(in: self)
        let currentPoint = touch.location(in: self)

        // calculate mid points
        let mid1 = midPoint(p1: previousPoint1, p2: previousPoint2)
        let mid2 = midPoint(p1: currentPoint, p2: previousPoint1)

        UIGraphicsBeginImageContext(self.frame.size)
        guard let context = UIGraphicsGetCurrentContext() else { return }
        if let image = self.image {
            image.draw(in: CGRect(x: 0, y: 0, width: frame.size.width, height: frame.size.height))
        }
        context.move(to: mid1)
        context.addQuadCurve(to: mid2, control: previousPoint1)
        context.setLineCap(.round)
        context.setLineWidth(2.0)
        context.setStrokeColor(red: 1.0, green: 0, blue: 0, alpha: 1.0)
        context.strokePath()
        self.image = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
    }

    func midPoint(p1: CGPoint, p2: CGPoint) -> CGPoint {
        return CGPoint(x: (p1.x + p2.x) / 2.0, y: (p1.y + p2.y) / 2.0)
    }
}
Thanks for the input. I'm updating my question here because I need the space for it.
I looked into both the Core Plot and Bezier curve solutions you suggested, with little success.
For Core Plot, I am able to get a graph plotted from an array of ints, but I can't find anything related to curve smoothing. BTW, here I am using CPScatterPlot with some random numbers.
As for the Bezier curve, my quest led me here; it has something to do with a Catmull-Rom spline implementation in iOS:
CatmullRomSpline *myC = [[CatmullRomSpline alloc] initAtPoint:CGPointMake(1.0, 1.0)];
[myC addPoint:CGPointMake(1.0, 1.5)];
[myC addPoint:CGPointMake(1.0, 1.15)];
[myC addPoint:CGPointMake(1.0, 1.25)];
[myC addPoint:CGPointMake(1.0, 1.23)];
[myC addPoint:CGPointMake(1.0, 1.24)];
[myC addPoint:CGPointMake(1.0, 1.26)];
NSLog(@"xxppxx %@", [myC asPointArray]);
NSLog(@"xxppxx2 %@", myC.curves);
and the result I get is:
2011-02-24 14:45:53.915 DVA[10041:40b] xxppxx (
"NSPoint: {1, 1}",
"NSPoint: {1, 1.26}"
)
2011-02-24 14:45:53.942 DVA[10041:40b] xxppxx2 (
"QuadraticBezierCurve: 0x59eea70"
)
I am not really sure how to go from there. So I am stuck on that front as well :(
I did look at GLPaint, as a last resort. It uses OpenGL ES and a "soft dot" sprite to plot the points in the array. I know it's more like avoiding the problem than fixing it, but I guess I'll share my findings here anyway.
The black one is GLPaint, the white one is the old method, and the last one is a drawing from the "Sketch Book" app, just to compare.
I am still trying to get this done right; any further suggestions are most welcome.
To get rid of the silly dot in the GLPaint code, change the following inside
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
// Changed by OLLE
/*
// Convert touch point from UIView referential to OpenGL one (upside-down flip)
if (firstTouch) {
    firstTouch = NO;
    previousLocation = [touch previousLocationInView:self];
    previousLocation.y = bounds.size.height - previousLocation.y;
} else {
    location = [touch locationInView:self];
    location.y = bounds.size.height - location.y;
    previousLocation = [touch previousLocationInView:self];
    previousLocation.y = bounds.size.height - previousLocation.y;
}
*/
location = [touch locationInView:self];
location.y = bounds.size.height - location.y;
previousLocation = [touch previousLocationInView:self];
previousLocation.y = bounds.size.height - previousLocation.y;
// Changed by OLLE
I know that this isn't the solution for our problem, but it's something.