I am trying to implement a view that draws the user's handwriting (cursor position) on the iPad 4. I looked at Apple's sample code, which uses OpenGL, but there were parts I couldn't understand, so I tried implementing this with Core Graphics.
#import "PaintView.h"
#include <stdlib.h>
@implementation PaintView
- (id)initWithCoder:(NSCoder *)aDecoder {
self = [super initWithCoder:aDecoder];
if(self) {
//
pointsToDraw = [[NSMutableArray alloc] init];
}
return self;
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *touch = [touches anyObject];
NSLog(@"%@", touch);
CGPoint location = [touch locationInView:self];
CGPoint previousLocation = [touch previousLocationInView:self];
Ink *ink = [[Ink alloc] initWithPoint:previousLocation toPoint:location time:touch.timestamp];
// UITouch *newTouch = [touch copy];
[pointsToDraw addObject:ink];
[self setNeedsDisplay];
}
- (void)drawLine:(CGPoint)startingPoint toPoint:(CGPoint)endingPoint context:(CGContextRef)context
{
// Drawing code
CGContextSetStrokeColorWithColor(context, [UIColor redColor].CGColor);
CGRect temp = CGRectMake(10, 10, 100, 100);
// Draw them with a 2.0 stroke width so they are a bit more visible.
CGContextSetLineWidth(context, 2.0);
CGContextMoveToPoint(context, startingPoint.x,startingPoint.y); //start at this point
CGContextAddLineToPoint(context, endingPoint.x, endingPoint.y); //draw to this point
// and now draw the Path!
}
- (void)drawRect:(CGRect)rect
{
[super drawRect:rect];
// [self drawLine:CGPointMake(10, 10) toPoint:CGPointMake(30, 30)];
CGContextRef context = UIGraphicsGetCurrentContext();
for (Ink *ink in pointsToDraw){
[self drawLine:ink.point toPoint:ink.previousPoint context:context];
}
CGContextStrokePath(context);
}
@end
The problem is that on every touch I redraw everything (Ink is a class that contains two CGPoints and a timestamp), and after a while this dramatically slows things down, creating substantial lag.
My goal is to be able to both capture handwriting in a precise way, and play it back precisely.
Another thing to consider is that I am using a stylus that gives pressure information, so I need to be able to draw my line with varying widths.
Any advice will be greatly appreciated.
Instead of storing the points only in an array, store them in an array and in a UIBezierPath. Then drawRect: only needs to stroke the bezier path instead of rebuilding and stroking every segment.
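For illustration, a minimal sketch of that idea, assuming a fresh UIView subclass (the class and property names here are placeholders, not the asker's code):
#import <UIKit/UIKit.h>

@interface BezierPaintView : UIView
@property (nonatomic, strong) UIBezierPath *strokePath; // accumulated stroke
@end

@implementation BezierPaintView

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    if (!self.strokePath) {
        self.strokePath = [UIBezierPath bezierPath];
        self.strokePath.lineWidth = 2.0;
        [self.strokePath moveToPoint:[touch previousLocationInView:self]];
    }
    [self.strokePath addLineToPoint:[touch locationInView:self]];
    [self setNeedsDisplay];
}

- (void)drawRect:(CGRect)rect
{
    [[UIColor redColor] setStroke];
    [self.strokePath stroke]; // one stroke of the whole path instead of one per segment
}

@end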
The stylus doesn't give pressure information - at least not on iOS. iPhones have capacitive touch screens, not resistive ones. The standard algorithm for varying the width is to use the drawing speed as the factor and to draw with little triangles that you fill in to build up the line.
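As a rough sketch of just the speed-to-width part (the constants and the lastTimestamp parameter are assumptions; the triangle-strip drawing itself is not shown):
#import <UIKit/UIKit.h>
#include <math.h>

// Maps touch speed to a stroke width: the faster the stroke, the thinner the line.
static CGFloat StrokeWidthForTouch(UITouch *touch, UIView *view, NSTimeInterval lastTimestamp)
{
    CGPoint p0 = [touch previousLocationInView:view];
    CGPoint p1 = [touch locationInView:view];
    NSTimeInterval dt = touch.timestamp - lastTimestamp;
    CGFloat distance = hypot(p1.x - p0.x, p1.y - p0.y);
    CGFloat speed = (dt > 0) ? distance / dt : 0.0;  // points per second
    return MAX(1.0, 8.0 - speed / 200.0);            // clamp so fast strokes stay visible
}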
Funny! Here is a relevant article: http://www.nearinfinity.com/blogs/jason_harwig/2012/11/06/capture-a-signature-on-ios.html
Related
Is there any way to change the shape that a UIBezierPath draws? See the image below: it draws a line when the user drags a finger, but I want a star, a circle, and other shapes. Is there any way to achieve that?
My Expectation is
This is my UIBezierPath code:
-(void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event
{
UITouch *touch = [[event allTouches] anyObject];
touchPoint = [touch locationInView:self];
if (!CGPointEqualToPoint(startingPoint, CGPointZero))
{
UIBezierPath *path = [UIBezierPath bezierPath];
[path moveToPoint:CGPointMake(touchPoint.x,touchPoint.y)];
[path addLineToPoint:CGPointMake(startingPoint.x,startingPoint.y)];
CAShapeLayer *shapeLayer = [CAShapeLayer layer];
shapeLayer.path = [path CGPath];
shapeLayer.strokeColor = [single.arrColor[single.i] CGColor];
if([UIDevice currentDevice].userInterfaceIdiom ==UIUserInterfaceIdiomPad)
{
shapeLayer.lineWidth = 7.0;
}
else
{
shapeLayer.lineWidth = 5.0;
}
shapeLayer.fillColor = [[UIColor redColor] CGColor];
[self.layer addSublayer:shapeLayer];
[clearBeizer addObject:shapeLayer];
}
startingPoint=touchPoint;
// [arrLayer addObject:shapeLayer];
NSLog(@"Touch moving point =x : %f Touch moving point =y : %f", touchPoint.x, touchPoint.y);
}
Yes, it is doable, but it is not trivial. What you essentially want is to stroke a path with stars instead of normal dashes. As far as I know, iOS only provides an API for the standard method, i.e. stroking with a rectangular dash pattern.
If you want custom stroking, you have to implement it yourself. You will probably have to flatten the bezier path first, then "walk" along the path and draw stars/circles/squirrels at regular intervals manually. It is especially difficult if you need the interval between the stars to be equal.
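To make that concrete, here is a rough sketch (the helper name and its parameters are hypothetical) of stamping an image at a fixed interval along one flattened segment; a complete solution would also carry the leftover distance from one segment into the next:
#import <UIKit/UIKit.h>
#include <math.h>

// Stamps 'stamp' every 'spacing' points along the straight segment from..to.
// Call this for each segment of the flattened bezier path, inside drawRect:.
static void StampAlongSegment(UIImage *stamp, CGPoint from, CGPoint to, CGFloat spacing)
{
    CGFloat length = hypot(to.x - from.x, to.y - from.y);
    if (length <= 0 || spacing <= 0) return;
    CGFloat dx = (to.x - from.x) / length;
    CGFloat dy = (to.y - from.y) / length;
    for (CGFloat d = 0; d <= length; d += spacing) {
        CGPoint center = CGPointMake(from.x + dx * d, from.y + dy * d);
        [stamp drawAtPoint:CGPointMake(center.x - stamp.size.width / 2.0,
                                       center.y - stamp.size.height / 2.0)];
    }
}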
You can have a look at the DrawKit library for MacOS. DrawKit is for MacOS, not iOS! It is just a reference for you to get the idea.
DrawKit has an NSBezierPath+Geometry.h category on the NSBezierPath class. You can start with the (NSBezierPath*)bezierPathWithZig:(CGFloat)zig zag:(CGFloat)zag method and see how the zig-zag path is implemented:
https://github.com/DrawKit/DrawKit/.../NSBezierPath-Geometry.m#L1206
or wavy path [(NSBezierPath*)bezierPathWithWavelength:amplitude:spread:]
https://github.com/DrawKit/DrawKit/..../NSBezierPath-Geometry.m#L1270
Just FYI: UIBezierPath (iOS) is often missing methods that are available in NSBezierPath (MacOS).
If DrawKit confuses you, there are probably open-source drawing libraries for iOS on the Internet; try searching for them and see how custom drawing is done.
Yes, you can do that, but you will have to get custom-shaped icons for this task.
You can try this excellent answer provided by RobMayoff here: and the git repo
Here is another way to do it:
I have made a simple image editing app similar to what you are doing.
You can draw heart shapes on the image:
Like that, you can draw many custom shapes:
The code is pretty straightforward and simple:
You need to create a few custom-shaped erasers. I call them erasers because they just erase the pic :P.
Here are the methods for customising the eraser:
- (void)newMaskWithColor:(UIColor *)color eraseSpeed:(CGFloat)speed {
wipingInProgress = NO;
eraseSpeed = speed; //how fast eraser should move
maskColor = color; //eraser color
[self setNeedsDisplay];
}
-(void)setErase:(UIImage *)img{
eraser =img; //set the custom shaped image here
}
And to draw the custom shaped eraser on view:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
wipingInProgress = YES;
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
if ([touches count] == 1) {
UITouch *touch = [touches anyObject];
location = [touch locationInView:self];
location.x -= [eraser size].width/2;
location.y -= [eraser size].height/2; // use the height for the vertical offset
[self setNeedsDisplay];
}
}
And finally the drawRect: method:
- (void)drawRect:(CGRect)rect {
CGContextRef context = UIGraphicsGetCurrentContext();
if (wipingInProgress) {
if (imageRef) {
// Restore the screen that was previously saved
CGContextTranslateCTM(context, 0, rect.size.height);
CGContextScaleCTM(context, 1.0, -1.0);
CGContextDrawImage(context, rect, imageRef);
CGImageRelease(imageRef);
CGContextTranslateCTM(context, 0, rect.size.height);
CGContextScaleCTM(context, 1.0, -1.0);
}
[eraser drawAtPoint:location blendMode:kCGBlendModeDestinationOut alpha:eraseSpeed];
}
// Save the screen to restore next time around
imageRef = CGBitmapContextCreateImage(context);
}
Here are some variables declared in the .h file:
CGPoint location;
CGImageRef imageRef;
UIImage *eraser;
BOOL wipingInProgress;
UIColor *maskColor;
CGFloat eraseSpeed;
I simply did this using a UIImageView:
UIImageView *imageView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:single.arrimages[single.colorimages]]];
imageView.frame = CGRectMake(touchPoint.x, touchPoint.y, 30,30);
[self addSubview:imageView];
Just add an image view while the user touches the screen.
Get the x, y coordinates of the user's touch and use them for the UIImageView's frame, as in the sketch below.
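A minimal sketch of where that snippet could live, assuming the touch handling sits in the same view ("shape.png" is a placeholder image name, not from the original code):
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint touchPoint = [[touches anyObject] locationInView:self];
    // "shape.png" is a placeholder for whatever custom shape you want to stamp.
    UIImageView *imageView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"shape.png"]];
    imageView.frame = CGRectMake(touchPoint.x - 15, touchPoint.y - 15, 30, 30); // centred under the finger
    [self addSubview:imageView];
}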
I hope it will help someone.
I'm using the Core Text API to draw text in the UI. (I don't know other methods; most Google search results lead me to the Core Text API.)
I have read some tutorials online about using the Core Text API. They walk through the steps (text matrix, attributed string, framesetter, and so on), but none of them explain the meaning of each step carefully, so I cannot modify the code myself.
Below is the function that draws text on screen, where (x, y) is the location at which I want the drawing to start. The code describes the drawing steps clearly; nevertheless, I don't know where to apply the x and y parameters so the text starts drawing at that point in the rectangle.
// draw text on screen.
+ (void)drawText:(CGContextRef)context bound:(CGRect)rect text:(NSString *)text x:(float)x y:(float) y color:(UIColor *)color size:(float)textSize{
CGContextSaveGState(context);
// always remember to reset the text matrix before drawing.
// otherwise the result will be unpredictable, like using uninitialized memory
CGContextSetTextMatrix(context, CGAffineTransformIdentity);
CGContextSetTextMatrix(context, CGAffineTransformMakeScale(1.0f, -1.0f));
// Flip the coordinate system. because core Text uses different coordinate system with UI Kit
CGContextTranslateCTM(context, 0, rect.size.height);
CGContextScaleCTM(context, 1.0, -1.0);
// step 1: prepare attribute string
NSDictionary *attributes;
attributes = @{
(NSString *) kCTFontAttributeName : [UIFont fontWithName:@"Helvetica" size:textSize],
(NSString *) kCTForegroundColorAttributeName : color
};
NSAttributedString *str = [[NSAttributedString alloc]
initWithString:text attributes:attributes];
CFAttributedStringRef attrString = (__bridge CFAttributedStringRef)str;
// step 2: create CTFFrameSetter
CTFramesetterRef framesetter =
CTFramesetterCreateWithAttributedString((CFAttributedStringRef)attrString);
// step 3. create CGPath
CGMutablePathRef path = CGPathCreateMutable();
CGPathAddRect(path, NULL, rect);
// step 4. get the frame
// use the CGPath and CTFramesetter to create a CTFrame, then draw it in the current context
CTFrameRef frame = CTFramesetterCreateFrame
(framesetter, CFRangeMake(0, 0), path, NULL);
CTFrameDraw(frame, context);
// step 5. release resource
//CFRelease(attrString);
CFRelease(framesetter);
CFRelease(frame);
CFRelease(path);
CGContextRestoreGState(context);
}
Moreover, I see there is a function, CGContextSetTextPosition. This seems to be what I need, but where should I put it in the above code? I have tried a few places, but without success. Please tell me how to fix this.
Thanks :)
This is not a Core Text solution, but after your comment under your question I assume this would be a proper answer.
Basic text drawing on a view.
.h
#import <UIKit/UIKit.h>
@interface aView : UIView
@end
.m
#import "aView.h"
@implementation aView
- (void)drawText:(CGFloat)xPosition yPosition:(CGFloat)yPosition canvasWidth:(CGFloat)canvasWidth canvasHeight:(CGFloat)canvasHeight
{
//Draw Text
CGRect textRect = CGRectMake(xPosition, yPosition, canvasWidth, canvasHeight);
NSMutableParagraphStyle* textStyle = NSMutableParagraphStyle.defaultParagraphStyle.mutableCopy;
textStyle.alignment = NSTextAlignmentLeft;
NSDictionary* textFontAttributes = @{NSFontAttributeName: [UIFont fontWithName: @"Helvetica" size: 12], NSForegroundColorAttributeName: UIColor.redColor, NSParagraphStyleAttributeName: textStyle};
[@"Hello, World!" drawInRect: textRect withAttributes: textFontAttributes];
}
- (void)drawRect:(CGRect)rect {
[self drawText:0 yPosition:0 canvasWidth:200 canvasHeight:150];
}
@end
Starting with iOS 7 you can have the string render itself into the current context. To do this, import UIKit; it extends NSAttributedString with a drawAtPoint: method. So, for example, you might do something like this:
[str drawAtPoint:CGPointMake(x, y)];
For more info about this and other related methods please see the following:
https://developer.apple.com/library/ios/documentation/uikit/reference/NSAttributedString_UIKit_Additions/Reference/Reference.html#//apple_ref/occ/instm/NSAttributedString/drawAtPoint:
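Putting it together, a minimal sketch of that drawAtPoint: approach inside drawRect: (the font, string, and point are placeholders):
- (void)drawRect:(CGRect)rect
{
    NSDictionary *attributes = @{ NSFontAttributeName            : [UIFont fontWithName:@"Helvetica" size:14],
                                  NSForegroundColorAttributeName : [UIColor redColor] };
    NSAttributedString *str = [[NSAttributedString alloc] initWithString:@"Hello, World!"
                                                              attributes:attributes];
    [str drawAtPoint:CGPointMake(20, 40)]; // (x, y) in UIKit coordinates, no flipping needed
}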
To draw text (or a line) at the selected point in a view, you can use UIView touch events as below.
Create variables for the touch start point and the touch end point:
CGPoint startingPoint;
CGPoint endingPoint;
And then draw text or line as below
#pragma mark - Touch Event
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *touch = [[event allTouches] anyObject];
startingPoint = [touch locationInView:self];
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
}
-(void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event{
UITouch *touch = [touches anyObject];
endingPoint = [touch locationInView:self];
[self makeLineLayer:self.layer lineFromPointA:startingPoint
toPointB:endingPoint];
}
-(void)makeLineLayer:(CALayer *)layer lineFromPointA:(CGPoint)pointA
toPointB:(CGPoint)pointB
{
CAShapeLayer *line = [CAShapeLayer layer];
UIBezierPath *linePath=[UIBezierPath bezierPath];
[linePath moveToPoint: pointA];
[linePath addLineToPoint:pointB];
line.path=linePath.CGPath;
line.fillColor = nil;
line.opacity = 1.0; // opacity is clamped to the range 0..1
line.lineWidth = 4.0;
line.strokeColor = [UIColor redColor].CGColor;
[layer addSublayer:line];
}
Although this thread is a bit old, this is my workaround for future visitors. It requires only a small modification, and it makes the text start at position (x, y).
You have to change step 3 like this:
// step 3. create CGPath
CGMutablePathRef path = CGPathCreateMutable();
CGRect newRect = rect;
newRect.origin.x = x;
newRect.origin.y = -y;
CGPathAddRect(path, NULL, newRect);
and the coordinates (x, y) will start working. The minus sign is due to Core Text's flipped axis.
Also change the text matrix as below to get the correct transformation; otherwise the text came out flipped along the Y axis:
// always remember to reset the text matrix before drawing.
// otherwise the result will be unpredictable, like using uninitialized memory
CGContextSetTextMatrix(context, CGAffineTransformIdentity);
CGContextSetTextMatrix(context, CGAffineTransformMakeScale(1.0f, 1.0f));
Attached screenshot from IB (Designable):
Hope it helps!
Need a fresh pair of eyes, mine have stopped working and can't make sense of my own code anymore...
I am trying to make a drawing app with a pen-on-paper style of drawing. Here's how it is supposed to work:
user touches, app grabs location
a var is set to tell drawRect to configure the CGContext, create a new path, move to the point, etc. (because I always get errors/warnings in my log whenever I touch the CGContext outside the drawRect method)
the var is then set to determine whether to place a dot or a line when the user lifts their finger
if the user drags, the var is changed to tell drawRect to draw a line, and [self setNeedsDisplay] is called
every time the user places a finger on the screen a timer is started; every 5 seconds (or when they lift their finger) the app 'captures' the screen contents into an image, wipes the screen, puts the image back, and continues drawing (in effect caching the image so all those lines don't have to be re-drawn)
UPDATE #1
So I re-wrote it (because it was obviously really bad and not working...) and ended up with this:
- (void)drawRect:(CGRect)rect {
[_incrImage drawInRect:rect]; /* draw image... this will be blank at first
and then supposed to be filled by the Graphics context so that when the view
is refreshed the user is adding lines ontop of an image (but to them it looks
like a continuous drawing)*/
CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapSquare);
CGContextSetLineJoin(UIGraphicsGetCurrentContext(), kCGLineJoinRound);
CGContextSetLineWidth(UIGraphicsGetCurrentContext(), _brushW);
CGContextSetStrokeColorWithColor(UIGraphicsGetCurrentContext(), [UIColor colorWithRed:_brushR green:_brushG blue:_brushB alpha:_brushO].CGColor);
CGContextSetBlendMode(UIGraphicsGetCurrentContext(), kCGBlendModeNormal);
CGContextAddPath(UIGraphicsGetCurrentContext(), _pathref);
CGContextDrawPath(UIGraphicsGetCurrentContext(), kCGPathStroke);
}
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
_drw = 2; //set this to draw a dot if the user taps
UITouch *touch = [touches anyObject];
_cp = [touch locationInView:self];
_lp = _cp;
CGPathRelease(_pathref); /* this line and the line below is to clear the
path so the app is drawing as little as possible (the idea is that as
the user draws and lifts their finger, the drawing goes into _incrImage
and then the contents is cleared and _incrImage is applied to the
screen so there is as little memory being used as possible and so the
user feels as if they're adding to what they've drawn when they touch
the device again) */
_pathref = CGPathCreateMutable();
CGPathMoveToPoint(_pathref, NULL, _lp.x, _lp.y);
touch = nil;
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
_drw = 1; //user moved their finger, this was not a tap so they want
//to draw a line
UITouch *touch = [touches anyObject];
_cp = [touch locationInView:self];
CGPathAddLineToPoint(_pathref, NULL, _cp.x, _cp.y);
[self setNeedsDisplay]; //as the user moves their finger, it draws
_lp = _cp;
touch = nil;
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
UITouch *touch = [touches anyObject];
_cp = [touch locationInView:self];
switch (_drw) { //logic to determine what to draw when the user
//lifts their finger
case 1:
//line
CGPathAddLineToPoint(_pathref, NULL, _cp.x, _cp.y);
break;
case 2:
//dot
CGPathAddArc(_pathref, NULL, _cp.x, _cp.y, _brushW, 0.0, 2 * M_PI, 1); // arc angles are in radians
break;
default:
break;
}
/* here's the fun bit, the Graphics context doesn't seem to be going
into _incrImage and therefore is not being displayed when the user
goes to continue drawing after they've drawn a line/dot on the screen */
UIGraphicsBeginImageContext(self.frame.size);
CGContextAddPath(UIGraphicsGetCurrentContext(), _pathref); /* tried adding
my drawn path to the context and then adding the context to the image
before the user taps down again and the path is cleared... not sure
why it isn't working. */
[_incrImage drawAtPoint:CGPointZero];
_incrImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
[self setNeedsDisplay]; //finally, refresh the contents
touch = nil;
}
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
[self touchesEnded:touches withEvent:event];
}
My new problem is that everything is erased when drawRect is called, and _incrImage never picks up the current graphics contents, so nothing previously drawn is displayed.
__old, I guess no longer needed but keeping here for reference___
Here is the relevant code:
- (void)drawRect:(CGRect)rect {
/* REFERENCE: _drw: 0 = clear everything
1 = cache the screen contents in the image and display that
so the device doesn't have to re-draw everything
2 = set up new path
3 = draw lines instead of a dot at current point
4 = draw a 'dot' instead of a line
*/
CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapSquare);
CGContextSetLineJoin(UIGraphicsGetCurrentContext(), kCGLineJoinRound);
CGContextSetLineWidth(UIGraphicsGetCurrentContext(), _brushW);
CGContextSetStrokeColorWithColor(UIGraphicsGetCurrentContext(), [UIColor colorWithRed:_brushR green:_brushG blue:_brushB alpha:_brushO].CGColor);
CGContextSetBlendMode(UIGraphicsGetCurrentContext(), kCGBlendModeNormal);
switch (_drw) {
case 0:
//clear everything...this is the first draw... it can also be called to clear the view
UIGraphicsBeginImageContext(self.frame.size);
CGContextSetBlendMode(UIGraphicsGetCurrentContext(), kCGBlendModeClear);
CGContextSetFillColorWithColor(UIGraphicsGetCurrentContext(), [UIColor clearColor].CGColor);
CGContextFillRect(UIGraphicsGetCurrentContext(), self.frame);
CGContextFlush(UIGraphicsGetCurrentContext());
_incrImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
[_incrImage drawAtPoint:CGPointZero];
[_incrImage drawInRect:rect];
break;
case 1:
//capture the screen content and stick it in _incrImage...
//then apply _incrImage to screen so the user can continue drawing on top of it
_incrImage = UIGraphicsGetImageFromCurrentImageContext();
[_incrImage drawAtPoint:CGPointZero];
[_incrImage drawInRect:rect];
break;
case 2:
//begin path and set everything up, this is called when touchesBegan: fires...
_incrImage = UIGraphicsGetImageFromCurrentImageContext();
[_incrImage drawAtPoint:CGPointZero];
[_incrImage drawInRect:rect];
CGContextBeginPath(UIGraphicsGetCurrentContext());
CGContextMoveToPoint(UIGraphicsGetCurrentContext(), _p.x, _p.y);
break;
case 3:
//add lines, this is after the path is created and set...this is fired when touchesMoved: gets activated and _drw is set to draw lines instead of adding dots
CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), _p.x, _p.y);
CGContextStrokePath(UIGraphicsGetCurrentContext());
break;
case 4:
//this is fired when touchesEnd: is activated... this sets up ready if the app needs to draw a 'dot' or the arc with a fill...
CGContextMoveToPoint(UIGraphicsGetCurrentContext(), _p.x, _p.y);
CGContextAddArc(UIGraphicsGetCurrentContext(), _p.x, _p.y, _brushW, 0.0, 360.0, 1);
CGContextFillPath(UIGraphicsGetCurrentContext());
CGContextFlush(UIGraphicsGetCurrentContext());
break;
default:
break;
}
}
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
dTimer = [NSTimer timerWithTimeInterval:5.0 target:self selector:@selector(timeUP) userInfo:nil repeats:YES]; // note: timerWithTimeInterval: does not schedule the timer on a run loop
_drw = 2;
UITouch *touch = [touches anyObject];
_p = [touch locationInView:self];
[self setNeedsDisplay];
_drw = 4;
}
- (void)timeUP {
_drw = 1;
[self setNeedsDisplay];
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
_drw = 3;
UITouch *touch = [touches anyObject];
_p = [touch locationInView:self];
[self setNeedsDisplay];
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event // (2)
{
[dTimer invalidate];
UITouch *touch = [touches anyObject];
_p = [touch locationInView:self];
[self setNeedsDisplay];
[self timeUP];
}
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
[self touchesEnded:touches withEvent:event];
}
My questions are:
is this efficient? or is there a better way to do this?
and
Why is this not drawing? I get nothing but I can see my memory being used...
Egad.
Your first problem is that setNeedsDisplay doesn’t draw immediately, it just marks the view to be drawn at the end of the event. So, when you set _drw=2 and call setNeedsDisplay and then set _drw=4, it’s only going to actually call -drawRect: with _drw=4 (if that, since that still might not be the end of the current event).
But, also don’t, uh, use that _drw switch thing. That’s not good.
You want to create an image and draw into the image as the touches happen, and then in drawRect: just blat the image to the screen. If you ever find yourself calling UIGraphicsGetImageFromCurrentImageContext() inside -drawRect: you are doing things backwards (as you are here). Don’t slurp the image from the screen, create an image that you blat to the screen.
The screen should never be your ‘state’. That way lies madness.
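A rough sketch of that image-backed approach, reusing the poster's _incrImage, _lp and _brushW ivars; treat it as an outline, not a drop-in fix:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint p = [[touches anyObject] locationInView:self];

    // Draw the new segment into the offscreen image, not onto the screen.
    UIGraphicsBeginImageContextWithOptions(self.bounds.size, NO, 0);
    [_incrImage drawInRect:self.bounds];            // previous strokes
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    CGContextSetLineCap(ctx, kCGLineCapRound);
    CGContextSetLineWidth(ctx, _brushW);
    CGContextSetStrokeColorWithColor(ctx, [UIColor blackColor].CGColor);
    CGContextMoveToPoint(ctx, _lp.x, _lp.y);
    CGContextAddLineToPoint(ctx, p.x, p.y);
    CGContextStrokePath(ctx);
    _incrImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    _lp = p;
    [self setNeedsDisplay];
}

- (void)drawRect:(CGRect)rect
{
    [_incrImage drawInRect:self.bounds];            // drawRect: only blits the cached image
}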
On top of the answer from @WilShipley, which I agree with (don't put state management logic in drawRect:, only pure redrawing logic): you are currently never drawing anything other than a minuscule line, because CGContextStrokePath clears the current path from the context.
The context isn't intended to be a temporary cache of incomplete drawing operations, it's your portal to the screen (or a backing image / PDF file / ...). You need to create your drawing state outside drawRect: and then render it to the screen inside drawRect:.
To ease performance issues, only redraw the area around the newest touch (or between the newest and previous touch).
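For example, a sketch of a helper that invalidates only the area around the newest segment (the padding value is an assumption; _brushW is the poster's ivar):
// Marks only the area around the latest segment as dirty.
- (void)setNeedsDisplayAroundSegmentFrom:(CGPoint)a to:(CGPoint)b
{
    CGFloat pad = _brushW + 2.0; // a little slack for the stroke width and round caps
    CGRect dirty = CGRectMake(MIN(a.x, b.x) - pad,
                              MIN(a.y, b.y) - pad,
                              fabs(b.x - a.x) + 2.0 * pad,
                              fabs(b.y - a.y) + 2.0 * pad);
    [self setNeedsDisplayInRect:dirty];
}
It would be called as [self setNeedsDisplayAroundSegmentFrom:_lp to:_cp] in place of [self setNeedsDisplay].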
New programmer here trying to take things step by step. I am trying to find a way to draw a circle around each currently touched location on a device: two fingers on the screen, one circle under each finger.
I currently have working code that draws a circle at one touch location, but once I lay a second finger on the screen, the circle moves to that second touch location, leaving the first touch location empty; when I add a third, it moves there, and so on.
Ideally I would like to have up to 5 active circles on the screen, one for each finger.
Here is my current code.
@interface TapView ()
@property (nonatomic) BOOL touched;
@property (nonatomic) CGPoint firstTouch;
@property (nonatomic) CGPoint secondTouch;
@property (nonatomic) int tapCount;
@end
@implementation TapView
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
[super touchesBegan:touches withEvent:event];
NSArray *twoTouch = [touches allObjects];
if(touches.count == 1)
{
self.tapCount = 1;
UITouch *tOne = [twoTouch objectAtIndex:0];
self.firstTouch = [tOne locationInView:[tOne view]];
self.touched = YES;
[self setNeedsDisplay];
}
if(touches.count > 1 && touches.count < 3)
{
self.tapCount = 2;
UITouch *tTwo = [twoTouch objectAtIndex:1];
self.secondTouch = [tTwo locationInView:[tTwo view]];
[self setNeedsDisplay];
}
}
-(void)drawRect:(CGRect)rect
{
if(self.touched && self.tapCount == 1)
{
[self drawTouchCircle:self.firstTouch :self.secondTouch];
}
}
-(void)drawTouchCircle:(CGPoint)firstTouch :(CGPoint)secondTouch
{
CGContextRef ctx = UIGraphicsGetCurrentContext();
CGContextSetRGBStrokeColor(ctx,0.1,0.1,0.1,1.0);
CGContextSetLineWidth(ctx,10);
CGContextAddArc(ctx,self.firstTouch.x,self.firstTouch.y,30,0.0,M_PI*2,YES);
CGContextStrokePath(ctx);
}
I do have setMultipleTouchEnabled:YES declared in my didFinishLaunchingWithOptions method in the appDelegate.m.
I have attempted to use an if statement in the drawTouchCircle method that switches self.firstTouch.x to self.secondTouch.x based on self.tapCount, but that seems to break the whole thing, leaving me with no circles at any touch location.
I'm having an immensely hard time trying to find my issue, and I am aware that it might be something quite simple.
I just wrote some code that seems to work. I've added an NSMutableArray property called circles to the view, which contains a UIBezierPath for each circle.
In -awakeFromNib I setup the array and set self.multipleTouchEnabled = YES - (I think you did this using a reference to the view in your appDelegate.m).
In the view I call this method in the -touchesBegan and -touchesMoved methods.
-(void)setCircles:(NSSet*)touches
{
[_circles removeAllObjects]; //clear circles from previous touch
for(UITouch *t in touches)
{
CGPoint pt= [t locationInView:self];
CGFloat circSize = 200; //or whatever you need
pt = CGPointMake(pt.x - circSize/2.0, pt.y - circSize/2.0);
CGRect circOutline = CGRectMake(pt.x, pt.y, circSize, circSize);
UIBezierPath *circle = [UIBezierPath bezierPathWithOvalInRect:circOutline];
[_circles addObject:circle];
}
[self setNeedsDisplay];
}
Touches ended is:
-(void)touchesEnded:(NSSet*)touches withEvent:(UIEvent*)event
{
[_circles removeAllObjects];
[self setNeedsDisplay];
}
Then I loop over circles in -drawRect and call [circle stroke] on each one
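A sketch of what that loop, and the touch handlers feeding -setCircles:, might look like (the stroke colour and width are assumptions):
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self setCircles:[event allTouches]]; // rebuild a circle for every active finger
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self setCircles:[event allTouches]];
}

- (void)drawRect:(CGRect)rect
{
    [[UIColor darkGrayColor] setStroke];
    for (UIBezierPath *circle in _circles) {
        circle.lineWidth = 10;
        [circle stroke];
    }
}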
I've been trying to draw a sprite line between 2 points marked by sprites, using touch events, in Xcode.
I have been following the steps given on a forum in this link:
cocos2d forums
But when I run the code, the line runs all the way across the simulator, just like this:
snapshot1
The line should stop at the second circle sprite generated by the code, but it doesn't; it keeps going all the way across.
My Scene is something like this.
My .h class
#import <Foundation/Foundation.h>
#import "cocos2d.h"
#import "Constants.h"
#import "SceneManager.h"
@interface EscenaInfo : CCLayer{
CGPoint lastTouchPoint;
CCSprite * background;
}
@property (nonatomic, assign) BOOL iPad;
@end
My .mm
#import "EscenaInfo.h"
@implementation EscenaInfo
@synthesize iPad;
- (void)onBack: (id) sender {
/*
This is where you choose where clicking 'back' sends you.
*/
[SceneManager goMenuPrincipal];
}
- (void)addBackButton {
if (self.iPad) {
// Create a menu image button for iPad
CCMenuItemImage *goBack = [CCMenuItemImage itemFromNormalImage:@"Arrow-Normal-iPad.png"
selectedImage:@"Arrow-Selected-iPad.png"
target:self
selector:@selector(onBack:)];
// Add menu image to menu
CCMenu *back = [CCMenu menuWithItems: goBack, nil];
// position menu in the bottom left of the screen (0,0 starts bottom left)
back.position = ccp(64, 64);
// Add menu to this scene
[self addChild: back];
}
else {
// Create a menu image button for iPhone / iPod Touch
CCMenuItemImage *goBack = [CCMenuItemImage itemFromNormalImage:@"Arrow-Normal-iPhone.png"
selectedImage:@"Arrow-Selected-iPhone.png"
target:self
selector:@selector(onBack:)];
// Add menu image to menu
CCMenu *back = [CCMenu menuWithItems: goBack, nil];
// position menu in the bottom left of the screen (0,0 starts bottom left)
back.position = ccp(32, 32);
// Add menu to this scene
[self addChild: back];
}
}
- (id)init {
if( (self=[super init])) {
// Determine Screen Size
CGSize screenSize = [CCDirector sharedDirector].winSize;
//Boton en la Interfaz del iPad
self.iPad = UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiomPad;
// Put a 'back' button in the scene
[self addBackButton];
///
self.isTouchEnabled = YES;
lastTouchPoint = ccp(-1.0f,-1.0f);
///
[CCTexture2D setDefaultAlphaPixelFormat:kCCTexture2DPixelFormat_RGB565];
background = [CCSprite spriteWithFile:@"background.png"];
background.anchorPoint = ccp(0,0);
[self addChild:background z:-1];
[CCTexture2D setDefaultAlphaPixelFormat:kCCTexture2DPixelFormat_Default];
}
return self;
}
- (void) dealloc
{
// in case you have something to dealloc, do it in this method
// in this particular example nothing needs to be released.
// cocos2d will automatically release all the children (Label)
// don't forget to call "super dealloc"
[super dealloc];
}
- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *touch = [touches anyObject];
if( touch ) {
CGPoint location = [touch locationInView: [touch view]];
location = [[CCDirector sharedDirector] convertToGL:location];
CCLOG(@"location(%f,%f)", location.x, location.y);
if( CGPointEqualToPoint(lastTouchPoint, ccp(-1.0f,-1.0f) ) )
{
lastTouchPoint = ccp(location.x, location.y);
CCSprite *circle = [CCSprite spriteWithFile:@"circle.png"];
[circle setPosition:lastTouchPoint];
[self addChild:circle];
CCLOG(@"initial touchpoint set. to (%f,%f)", lastTouchPoint.x, lastTouchPoint.y);
}
else {
CCLOG(@"lastTouchPoint is now(%f,%f), location is (%f,%f)", lastTouchPoint.x, lastTouchPoint.y, location.x, location.y);
CGPoint diff = ccpSub(location, lastTouchPoint);
float rads = atan2f( diff.y, diff.x);
float degs = -CC_RADIANS_TO_DEGREES(rads);
float dist = ccpDistance(lastTouchPoint, location);
CCSprite *line = [CCSprite spriteWithFile:@"line.png"];
[line setAnchorPoint:ccp(0.0f, 0.5f)];
[line setPosition:lastTouchPoint];
[line setScaleX:dist];
[line setRotation: degs];
[self addChild:line];
CCSprite *circle = [CCSprite spriteWithFile:@"circle.png"];
[circle setPosition:location];
[self addChild:circle];
// lastTouchPoint = ccp(location.x, location.y);
lastTouchPoint = ccp(-1.0f,-1.0f);
}
}
}
@end
Does anyone know how to work this out, or can anyone point out my mistake? I have been trying lots of things but nothing has worked for me. I would really appreciate the help.
I've not run the code but it looks fairly straightforward. The cause of the problem lies in this section:
float dist = ccpDistance(lastTouchPoint, location);
CCSprite *line = [CCSprite spriteWithFile:#"line.png"];
[line setAnchorPoint:ccp(0.0f, 0.5f)];
[line setPosition:lastTouchPoint];
[line setScaleX:dist];
This code calculates the distance between the two touch points, creates a new sprite (that will become the line), and sets the anchor point to the left-hand edge, centred vertically. It positions the sprite at the last touch point and then sets its horizontal scale based on the distance calculated earlier. This scaling factor ensures the sprite is 'long' enough to reach between the two points.
Your issue:
This isn't taking into account the initial dimensions of the image you are loading (line.png). If it isn't a 1x1 pixel png, then setScaleX is going to make the resulting sprite too large - the overrun you are experiencing.
The Solution
Make line.png a 1 x 1 pixel image. Your code will work perfectly, though you will have a very thin line that is not aesthetically pleasing.
Or, for best results, calculate the scale for the sprite by taking into account the width of line.png. This way the sprite can be more detailed and won't overrun.
Change the setScaleX line to this:
[line setScaleX:dist / line.boundingBox.size.width];
Using Cocos2D v3.x this works:
in -(void)update:(CCTime)delta{} you do this:
[self.drawnode drawSegmentFrom:ccp(50,100) to:ccp(75, 25) radius:3 color:self.colorDraw];
The self.drawnode and self.colorDraw properties are initialized like this, maybe inside -(void)onEnter{} :
self.drawnode = [CCDrawNode node];
self.colorDraw = [CCColor colorWithCcColor3b:ccRED];
[self addChild:self.drawnode];
I think you can use Core Graphics here:
- (void)drawRect:(CGRect)rect {
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextSetLineWidth(context,4);
CGContextSetStrokeColorWithColor(context, [UIColor redColor].CGColor);
CGContextMoveToPoint(context,startPoint.x , startPoint.y);
CGContextAddLineToPoint(context, endPoint.x, endPoint.y);
CGContextStrokePath(context);
}
- (void) touchesBegan:(NSSet*)touches withEvent:(UIEvent*)event
{
UITouch* touchPoint = [touches anyObject];
startPoint = [touchPoint locationInView:self];
endPoint = [touchPoint locationInView:self];
[self setNeedsDisplay];
}
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch* touch = [touches anyObject];
endPoint=[touch locationInView:self];
[self setNeedsDisplay];
}
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch* touch = [touches anyObject];
endPoint = [touch locationInView:self];
[self setNeedsDisplay];
}
I think this will help you.