How to change the shape of a UIBezierPath drawing line? - ios

Is there any way to change the UIBezierPath drawing shape? See the image below: it draws a plain line as the user drags a finger, but I want stars, circles, and other shapes instead. Is there any way to achieve that?
My expectation is:
This is my UIBezierPath code:
-(void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    touchPoint = [touch locationInView:self];
    if (!CGPointEqualToPoint(startingPoint, CGPointZero))
    {
        UIBezierPath *path = [UIBezierPath bezierPath];
        [path moveToPoint:CGPointMake(touchPoint.x, touchPoint.y)];
        [path addLineToPoint:CGPointMake(startingPoint.x, startingPoint.y)];

        CAShapeLayer *shapeLayer = [CAShapeLayer layer];
        shapeLayer.path = [path CGPath];
        shapeLayer.strokeColor = [single.arrColor[single.i] CGColor];
        if ([UIDevice currentDevice].userInterfaceIdiom == UIUserInterfaceIdiomPad)
        {
            shapeLayer.lineWidth = 7.0;
        }
        else
        {
            shapeLayer.lineWidth = 5.0;
        }
        shapeLayer.fillColor = [[UIColor redColor] CGColor];
        [self.layer addSublayer:shapeLayer];
        [clearBeizer addObject:shapeLayer];
    }
    startingPoint = touchPoint;
    // [arrLayer addObject:shapeLayer];
    NSLog(@"Touch moving point = x : %f Touch moving point = y : %f", touchPoint.x, touchPoint.y);
}

Yes, it is doable, but it is not trivial. What you essentially want is to stroke a path with stars instead of normal dashes. As far as I know, iOS only provides an API for the standard method, i.e. stroking with a rectangular dash pattern.
If you want to implement custom stroking, you have to do it yourself. You probably have to flatten the bezier path first, then "walk" along the path and draw stars/circles/squirrels at certain intervals manually. It is especially difficult if you need the interval between the stars to be equal.
You can have a look at the DrawKit library for macOS. DrawKit is for macOS, not iOS! It is just a reference for you to get the idea.
DrawKit has an NSBezierPath+Geometry.h category on the NSBezierPath class. You can start with the (NSBezierPath*)bezierPathWithZig:(CGFloat)zig zag:(CGFloat)zag method and see how the zig-zag path is implemented:
https://github.com/DrawKit/DrawKit/.../NSBezierPath-Geometry.m#L1206
or the wavy path [(NSBezierPath*)bezierPathWithWavelength:amplitude:spread:]:
https://github.com/DrawKit/DrawKit/..../NSBezierPath-Geometry.m#L1270
Just FYI: UIBezierPath (iOS) often lacks methods that are available in NSBezierPath (macOS).
If DrawKit confuses you, there are probably open-source drawing libraries for iOS on the Internet; try searching for them and see how custom drawing is done.
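To illustrate the "walk along the path and stamp shapes at intervals" idea in the asker's own setting, here is a minimal, hedged sketch (Objective-C, not DrawKit) that drops a small star-shaped CAShapeLayer whenever the finger has travelled a chosen distance since the last stamp. The starPathWithCenter:radius: helper, the lastStampPoint ivar, and the 20-point spacing are assumptions for illustration only:
    // Hypothetical sketch: stamp a star every kStampSpacing points of finger travel.
    static const CGFloat kStampSpacing = 20.0;

    - (UIBezierPath *)starPathWithCenter:(CGPoint)c radius:(CGFloat)r
    {
        UIBezierPath *star = [UIBezierPath bezierPath];
        for (NSInteger i = 0; i < 10; i++) {
            CGFloat len = (i % 2 == 0) ? r : r * 0.4;          // alternate outer/inner vertices
            CGFloat angle = (CGFloat)i * M_PI / 5.0 - M_PI_2;  // start at the top, 36 degrees apart
            CGPoint p = CGPointMake(c.x + len * cos(angle), c.y + len * sin(angle));
            if (i == 0) [star moveToPoint:p];
            else        [star addLineToPoint:p];
        }
        [star closePath];
        return star;
    }

    - (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event
    {
        CGPoint p = [[touches anyObject] locationInView:self];
        CGFloat dx = p.x - lastStampPoint.x;
        CGFloat dy = p.y - lastStampPoint.y;
        if (sqrt(dx * dx + dy * dy) < kStampSpacing) return;   // not far enough since the last stamp
        lastStampPoint = p;                                    // assumed CGPoint ivar of the view

        CAShapeLayer *star = [CAShapeLayer layer];
        star.path = [self starPathWithCenter:p radius:8.0].CGPath;
        star.fillColor = [UIColor redColor].CGColor;
        [self.layer addSublayer:star];
    }
Keeping the stamps equally spaced along the true curve (rather than along the raw touch samples) is where flattening the path first becomes necessary, as described above.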

Yes, you can do that, but you will have to get custom-shaped icons for this task.
You can try the excellent answer provided by RobMayoff here, and the git repo.
Here is another way to do it:
I have made a simple image editing app similar to what you are doing.
You can draw heart shapes on the image:
Like that, you can draw many custom shapes:
The code is pretty straightforward and simple:
You need to create a few custom-shaped erasers. I call them erasers because they just erase the pic :P.
Here are the methods for customising the eraser:
- (void)newMaskWithColor:(UIColor *)color eraseSpeed:(CGFloat)speed {
    wipingInProgress = NO;
    eraseSpeed = speed;   // how fast the eraser should move
    maskColor = color;    // eraser color
    [self setNeedsDisplay];
}

- (void)setErase:(UIImage *)img {
    eraser = img;         // set the custom-shaped image here
}
And to draw the custom-shaped eraser on the view:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    wipingInProgress = YES;
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    if ([touches count] == 1) {
        UITouch *touch = [touches anyObject];
        location = [touch locationInView:self];
        // Center the eraser image on the finger.
        location.x -= [eraser size].width/2;
        location.y -= [eraser size].height/2;
        [self setNeedsDisplay];
    }
}
And finally the drawRect: method:
- (void)drawRect:(CGRect)rect {
    CGContextRef context = UIGraphicsGetCurrentContext();
    if (wipingInProgress) {
        if (imageRef) {
            // Restore the screen that was previously saved
            CGContextTranslateCTM(context, 0, rect.size.height);
            CGContextScaleCTM(context, 1.0, -1.0);
            CGContextDrawImage(context, rect, imageRef);
            CGImageRelease(imageRef);
            CGContextTranslateCTM(context, 0, rect.size.height);
            CGContextScaleCTM(context, 1.0, -1.0);
        }
        [eraser drawAtPoint:location blendMode:kCGBlendModeDestinationOut alpha:eraseSpeed];
    }
    // Save the screen to restore next time around
    imageRef = CGBitmapContextCreateImage(context);
}
Here are some variables declared in the .h file:
CGPoint location;
CGImageRef imageRef;
UIImage *eraser;
BOOL wipingInProgress;
UIColor *maskColor;
CGFloat eraseSpeed;
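For completeness, a minimal sketch of how this view might be wired up from a controller; DrawingView and the "heart" asset name are placeholders I have assumed, not names taken from the code above:
    // Hypothetical wiring; the class name and image name are placeholders.
    DrawingView *canvas = (DrawingView *)self.view;
    [canvas newMaskWithColor:[UIColor blackColor] eraseSpeed:1.0]; // fully opaque stamps
    [canvas setErase:[UIImage imageNamed:@"heart"]];               // custom-shaped "eraser" image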

I simply did this using a UIImageView:
UIImageView *imageView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:single.arrimages[single.colorimages]]];
imageView.frame = CGRectMake(touchPoint.x, touchPoint.y, 30, 30);
[self addSubview:imageView];
Just add an image when the user touches the screen: get the x and y coordinates of the touch and use them for the UIImageView's frame.
I hope this helps someone.
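Putting that together, a hedged sketch of the touch handler might look like this; the only change from the snippet above is an assumed -15 offset so the 30x30 stamp is centered under the finger:
    - (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event
    {
        CGPoint touchPoint = [[touches anyObject] locationInView:self];
        UIImage *stamp = [UIImage imageNamed:single.arrimages[single.colorimages]];
        UIImageView *imageView = [[UIImageView alloc] initWithImage:stamp];
        // Center the 30x30 stamp on the finger instead of hanging it off the top-left corner (assumed tweak).
        imageView.frame = CGRectMake(touchPoint.x - 15, touchPoint.y - 15, 30, 30);
        [self addSubview:imageView];
    }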

Related

iOS: Draw text at a point in a rectangle

I'm using the Core Text API to draw text on the UI. (I don't know other methods, because most Google search results lead me to the Core Text API.)
I have read some tutorials online about using the Core Text API. Those tutorials go step by step through configuring the matrix, the attributed string, the framesetter, and so on, but none of them explains the meaning of each step carefully, so I cannot modify the code by myself.
The code below is the function that draws text on screen, where (x, y) is the location at which I want to draw. It describes clearly, step by step, how to draw on screen. Nevertheless, I don't know where to put the x and y parameters so the text will start drawing at that point in the rectangle.
// draw text on screen.
+ (void)drawText:(CGContextRef)context bound:(CGRect)rect text:(NSString *)text x:(float)x y:(float)y color:(UIColor *)color size:(float)textSize {
    CGContextSaveGState(context);
    // Always remember to reset the text matrix before drawing;
    // otherwise the result will be unpredictable, like using uninitialized memory.
    CGContextSetTextMatrix(context, CGAffineTransformIdentity);
    CGContextSetTextMatrix(context, CGAffineTransformMakeScale(1.0f, -1.0f));
    // Flip the coordinate system, because Core Text uses a different coordinate system than UIKit.
    CGContextTranslateCTM(context, 0, rect.size.height);
    CGContextScaleCTM(context, 1.0, -1.0);
    // step 1: prepare the attributed string
    NSDictionary *attributes = @{
        (NSString *) kCTFontAttributeName : [UIFont fontWithName:@"Helvetica" size:textSize],
        (NSString *) kCTForegroundColorAttributeName : color
    };
    NSAttributedString *str = [[NSAttributedString alloc] initWithString:text attributes:attributes];
    CFAttributedStringRef attrString = (__bridge CFAttributedStringRef)str;
    // step 2: create the CTFramesetter
    CTFramesetterRef framesetter = CTFramesetterCreateWithAttributedString(attrString);
    // step 3: create a CGPath
    CGMutablePathRef path = CGPathCreateMutable();
    CGPathAddRect(path, NULL, rect);
    // step 4: get the frame
    // use the CGPath and the CTFramesetter to create a CTFrame, then draw it in the current context
    CTFrameRef frame = CTFramesetterCreateFrame(framesetter, CFRangeMake(0, 0), path, NULL);
    CTFrameDraw(frame, context);
    // step 5: release resources
    //CFRelease(attrString);
    CFRelease(framesetter);
    CFRelease(frame);
    CFRelease(path);
    CGContextRestoreGState(context);
}
Moreover, I see there is a function, CGContextSetTextPosition. This seems to be what I need, but where should I put it in the above code? I have tried a few places with no success. Please tell me how to fix this.
Thanks :)
This is not a Core Text solution, but after your comment under your question I assume this would be a proper answer.
Basic text drawing on a view:
.h
#import <UIKit/UIKit.h>

@interface aView : UIView
@end

.m
#import "aView.h"

@implementation aView

- (void)drawText:(CGFloat)xPosition yPosition:(CGFloat)yPosition canvasWidth:(CGFloat)canvasWidth canvasHeight:(CGFloat)canvasHeight
{
    // Draw text
    CGRect textRect = CGRectMake(xPosition, yPosition, canvasWidth, canvasHeight);
    NSMutableParagraphStyle *textStyle = NSMutableParagraphStyle.defaultParagraphStyle.mutableCopy;
    textStyle.alignment = NSTextAlignmentLeft;
    NSDictionary *textFontAttributes = @{NSFontAttributeName: [UIFont fontWithName:@"Helvetica" size:12],
                                         NSForegroundColorAttributeName: UIColor.redColor,
                                         NSParagraphStyleAttributeName: textStyle};
    [@"Hello, World!" drawInRect:textRect withAttributes:textFontAttributes];
}

- (void)drawRect:(CGRect)rect {
    [self drawText:0 yPosition:0 canvasWidth:200 canvasHeight:150];
}

@end
Starting with iOS 7 you can have the string render itself into the current context. To do this, import UIKit; this extends the definition of NSAttributedString with the drawAtPoint: method. So, for example, you might do something like this:
[str drawAtPoint:CGPointMake(x, y)];
For more info about this and other related methods please see the following:
https://developer.apple.com/library/ios/documentation/uikit/reference/NSAttributedString_UIKit_Additions/Reference/Reference.html#//apple_ref/occ/instm/NSAttributedString/drawAtPoint:
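A small self-contained sketch of that approach, to be placed inside a UIView subclass's drawRect: (the font, color, and coordinates are only example values):
    // Example only: draw an attributed string at (40, 60) in UIKit coordinates.
    NSDictionary *attrs = @{ NSFontAttributeName: [UIFont fontWithName:@"Helvetica" size:14],
                             NSForegroundColorAttributeName: [UIColor blueColor] };
    NSAttributedString *str = [[NSAttributedString alloc] initWithString:@"Hello"
                                                               attributes:attrs];
    [str drawAtPoint:CGPointMake(40, 60)];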
To draw text or a line on the touched view you can use UIView touch events as below.
Create global variables for the touch start point and touch end point:
CGPoint startingPoint;
CGPoint endingPoint;
And then draw the text or line as below:
#pragma mark - Touch Events
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    startingPoint = [touch locationInView:self];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
}

- (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    endingPoint = [touch locationInView:self];
    [self makeLineLayer:self.layer lineFromPointA:startingPoint toPointB:endingPoint];
}

- (void)makeLineLayer:(CALayer *)layer lineFromPointA:(CGPoint)pointA toPointB:(CGPoint)pointB
{
    CAShapeLayer *line = [CAShapeLayer layer];
    UIBezierPath *linePath = [UIBezierPath bezierPath];
    [linePath moveToPoint:pointA];
    [linePath addLineToPoint:pointB];
    line.path = linePath.CGPath;
    line.fillColor = nil;
    line.opacity = 1.0;   // opacity is clamped to [0, 1]
    line.lineWidth = 4.0;
    line.strokeColor = [UIColor redColor].CGColor;
    [layer addSublayer:line];
}
Although this thread is a bit old, this is my workaround for future visitors. It requires only a small modification, and it makes the text start at position (x, y).
You have to change step 3 like this:
// step 3. create CGPath
CGMutablePathRef path = CGPathCreateMutable();
CGRect newRect = rect;
newRect.origin.x = x;
newRect.origin.y = -y;
CGPathAddRect(path, NULL, newRect);
and the coordinates (x, y) will start working. The minus sign is due to the different axis orientation of the Core Text API.
Also change the text matrix as below to get the correct transformation; otherwise the text was drawn with a flipped Y axis:
// Always remember to reset the text matrix before drawing;
// otherwise the result will be unpredictable, like using uninitialized memory.
CGContextSetTextMatrix(context, CGAffineTransformIdentity);
CGContextSetTextMatrix(context, CGAffineTransformMakeScale(1.0f, 1.0f));
Attached screenshot from IB (Designable):
Hope it helps!
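For reference, a hedged usage sketch of the modified method; TextRenderer is a placeholder name of mine for whatever class implements +drawText:bound:text:x:y:color:size:, called from a view's drawRect::
    - (void)drawRect:(CGRect)rect
    {
        CGContextRef context = UIGraphicsGetCurrentContext();
        // TextRenderer is a placeholder; draw "Hello" 20 points in and 40 points down.
        [TextRenderer drawText:context bound:self.bounds text:@"Hello"
                             x:20 y:40 color:[UIColor blackColor] size:14.0];
    }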

Measure distance of line drawn on iPhone screen

I am trying to create an application that allows the user to draw a line on the screen and measures the length of the drawn line. I have been able to draw the line successfully, but I don't know how to measure it. The line does not have to be perfectly straight; it is basically a squiggle. If someone could point me in the right direction or help guide me, that would be awesome. I am using Xcode 5.1.1 and Objective-C, and I only just started dabbling in the language this summer.
EDIT: I am looking to measure the length in either inches or cm. I would like the measurement to follow the curve of the entire line: the distance, not the displacement.
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    mouseSwipe = YES; // swipe flag declared in the header
    UITouch *touch = [touches anyObject];
    currentPoint = [touch locationInView:self.view]; // track finger movement on screen

    UIGraphicsBeginImageContext(CGSizeMake(320, 568)); // 568 for iPhone 5, 480 for iPhone 4
    [drawImage.image drawInRect:CGRectMake(0, 0, 320, 568)]; // (0,0) is the top-left corner
    CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound); // round line ends
    CGContextSetLineWidth(UIGraphicsGetCurrentContext(), 5.0); // width of the line
    CGContextSetStrokeColorWithColor(UIGraphicsGetCurrentContext(), [[UIColor redColor] CGColor]); // stroke in red
    //CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), 0, 1, 0, 1); // green color
    CGContextBeginPath(UIGraphicsGetCurrentContext()); // start the drawn path
    CGContextMoveToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
    CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), currentPoint.x, currentPoint.y);
    CGContextStrokePath(UIGraphicsGetCurrentContext());
    [drawImage setFrame:CGRectMake(0, 0, 320, 568)];
    drawImage.image = UIGraphicsGetImageFromCurrentImageContext(); // important
    UIGraphicsEndImageContext(); // finished drawing for this touch sample

    lastPoint = currentPoint;
    [self.view addSubview:drawImage]; // adds to the page
}

-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    location = [touch locationInView:touch.view];
    lastClick = [NSDate date];
    lastPoint = [touch locationInView:self.view]; // stops connecting to the previous line
    lastPoint.y -= 0;
    [super touchesEnded:touches withEvent:event];
}
First add a property that accumulates the length of the path:
@property (nonatomic, assign) CGFloat pathLength;
Initialize it to 0.0 when the user begins drawing the path; you might do this in touchesBegan, or wherever else in your code you know a new path is starting. Then add a method that computes the Cartesian distance between two points:
- (CGFloat)distanceFrom:(CGPoint)p1 to:(CGPoint)p2 {
    CGFloat x = (p2.x - p1.x);
    CGFloat y = (p2.y - p1.y);
    return sqrt(x*x + y*y);
}
As touches move, you are already handling the current and last touch positions. All you must do now is accumulate the distance between successive points:
// in touches moved, after you have lastPoint and currentPoint
self.pathLength += [self distanceFrom:currentPoint to:lastPoint];
There are quite a few references here and elsewhere for converting these points to inches or cm. As far as I can see, all are fraught with the inability to get the device resolution at runtime from the SDK. If you're willing to add a (dangerous) constant to the code, you can look up the PPI here and divide it into the pathLength computed above.
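For example, a hedged conversion sketch with a hard-coded PPI constant; 326 PPI matches Retina iPhones of that era, but you must verify the value for your target device:
    // Assumed constant: pixels per inch of the target device (326 for Retina iPhones of that era).
    static const CGFloat kDevicePPI = 326.0;

    CGFloat pixels = self.pathLength * [UIScreen mainScreen].scale; // points -> pixels
    CGFloat inches = pixels / kDevicePPI;
    CGFloat centimeters = inches * 2.54;
    NSLog(@"Drawn length: %.2f in (%.2f cm)", inches, centimeters);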

How to 'record' a UIBezierPath to be able to play back a stroked drawing path

In my current iOS app, the user is able to draw with their finger using a UIBezierPath with path smoothing; this part is pretty simple. What I would like to know is whether it's possible to record the path, the dots, and the color associated with each path and dot as the user lifts their finger and changes pencil colors. My goal is that a play button would then play back everything they just created in real time, sped up with an animation in case they took several minutes drawing.
I appreciate your responses. Here's the code I'm currently using for drawing (not the best code):
@property (nonatomic, strong) UIBezierPath *path;
@property uint ctr;
@end

@implementation DrawViewController
{
    CGPoint pts[4];
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    self.ctr = 0;
    UITouch *touch = [touches anyObject];
    pts[0] = [touch locationInView:self.drawImage];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint p = [touch locationInView:self.drawImage];
    self.ctr++;
    pts[self.ctr] = p;
    if (self.ctr == 3)
    {
        pts[2] = CGPointMake((pts[1].x + pts[3].x)/2.0, (pts[1].y + pts[3].y)/2.0);
        [self.path moveToPoint:pts[0]];
        [self.path addQuadCurveToPoint:pts[2] controlPoint:pts[1]];
        //[self.drawImage setNeedsDisplay];
        pts[0] = pts[2];
        pts[1] = pts[3];
        self.ctr = 1;
        dispatch_async(dispatch_get_main_queue(), ^{
            UIGraphicsBeginImageContextWithOptions(self.drawImage.bounds.size, NO, 0.0);
            [self.drawImage.image drawAtPoint:CGPointZero];
            [[UIColor colorWithRed:self.red green:self.green blue:self.blue alpha:1.0] setStroke];
            [self.path stroke];
            self.drawImage.image = UIGraphicsGetImageFromCurrentImageContext();
            UIGraphicsEndImageContext();
            [self.path removeAllPoints];
            self.ctr = 0;
        });
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (self.ctr == 0)
    {
        [self.path moveToPoint:pts[0]];
        [self.path addLineToPoint:pts[0]];
    }
    else if (self.ctr == 1)
    {
        [self.path moveToPoint:pts[0]];
        [self.path addLineToPoint:pts[1]];
    }
    else if (self.ctr == 2)
    {
        [self.path moveToPoint:pts[0]];
        [self.path addQuadCurveToPoint:pts[2] controlPoint:pts[1]];
    }
    self.ctr = 0;
    dispatch_async(dispatch_get_main_queue(), ^{
        UIGraphicsBeginImageContextWithOptions(self.drawImage.bounds.size, NO, 0.0);
        [self.drawImage.image drawAtPoint:CGPointZero];
        [[UIColor colorWithRed:self.red green:self.green blue:self.blue alpha:1.0] setStroke];
        [self.path stroke];
        self.drawImage.image = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        [self.path removeAllPoints];
        self.ctr = 0;
    });
}
What I would do is create a custom object that represents a single "segment" of your drawing. (Let's call it a BezierSegment.) From a quick glance, it looks like you're using quadratic Bezier segments, so create an object that saves the three control points for the segment and the color used to draw it. Each time you draw a new segment, create one of these objects and add it to a mutable array of segment objects.
Then you could loop through your array of BezierSegment objects, create UIBezierPath objects out of each one, and draw them to the screen in order to recreate the drawing.
You could also save things like line thickness, optional closed paths with a separate pen color, etc.
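A minimal sketch of such a segment object and its playback, with property and method names of my own choosing (they are not from the question's code):
    // One recorded quadratic segment plus the state needed to redraw it.
    @interface BezierSegment : NSObject
    @property (nonatomic) CGPoint startPoint;
    @property (nonatomic) CGPoint controlPoint;
    @property (nonatomic) CGPoint endPoint;
    @property (nonatomic, strong) UIColor *strokeColor;
    @property (nonatomic) CGFloat lineWidth;
    @property (nonatomic) NSTimeInterval timestamp; // handy for timed playback
    @end

    @implementation BezierSegment
    @end

    // Playback: rebuild and stroke each recorded segment in order.
    // Must be called while a graphics context is current (e.g. inside drawRect:).
    - (void)replaySegments:(NSArray<BezierSegment *> *)segments
    {
        for (BezierSegment *seg in segments) {
            UIBezierPath *path = [UIBezierPath bezierPath];
            [path moveToPoint:seg.startPoint];
            [path addQuadCurveToPoint:seg.endPoint controlPoint:seg.controlPoint];
            path.lineWidth = seg.lineWidth;
            [seg.strokeColor setStroke];
            [path stroke];
        }
    }
For sped-up playback, the saved timestamps could drive a timer that strokes one segment per tick at whatever rate you like.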

How can I make a bokeh effect with Core Graphics?

I want to create an app that has a paintbrush to add a bokeh effect to an image as the finger moves. Here is the code.
#import "fingerDrawView.h"
////create bokeh image
-(UIImage*)drawCircle{
///1) create a bitmap context
UIGraphicsBeginImageContext(self.bounds.size);
///2) get the context
CGContextRef circleContext = UIGraphicsGetCurrentContext();
CGContextSetLineWidth(circleContext, 3.0f);
//circle1
CGContextSetFillColorWithColor(circleContext, [UIColor colorWithRed:0.5 green:1 blue:0.5 alpha:0.4].CGColor);
CGRect circle1Point = CGRectMake(0, 0, 80, 80);///// When play it in simulator, it look smaller than this size. I don’t know why???
CGContextFillEllipseInRect(circleContext, circle1Point);
CGContextSetStrokeColorWithColor(circleContext, [UIColor colorWithRed:0.3 green:0.9 blue:0 alpha:0.6].CGColor);
CGContextStrokeEllipseInRect(circleContext, circle1Point);
////4) export the context into an image
UIImage *circleImage = UIGraphicsGetImageFromCurrentImageContext();
//// 5) destroy the context
UIGraphicsEndImageContext();
return circleImage;
}
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch * touch = [touches anyObject];
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0),^{
_imageBuffer = [self drawCircle];
dispatch_async(dispatch_get_main_queue(), ^{
CGPoint touchPoint = [touch locationInView:self];
CGPoint prev_touchPoint = [touch previousLocationInView:self];
if (ABS(touchPoint.x - prev_touchPoint.x) > 6
|| ABS(touchPoint.y - prev_touchPoint.y) > 6) {
_aImageView = [[UIImageView alloc]initWithImage:_imageBuffer ];
_aImageView.multipleTouchEnabled = YES;
_aImageView.userInteractionEnabled = YES;
[_aImageView setFrame:CGRectMake(touchPoint.x, touchPoint.y, 100.0, 100.0)];
[self addSubview:_aImageView];
}
});
});
}
It works in the simulator. However, it crashes while running on a device (iPad 4); the console reports "Received memory warning". I used GCD to draw the bokeh image, but it didn't help.
By the way, I want to make the bokeh image 80x80 in size (-(UIImage*)drawCircle), but when I run it in the simulator it looks smaller than that.

Building 'DrawSomething' style app for neuroscience project research purposes

I am trying to implement a view that draws the user's handwriting (cursor position) for the iPad (4). I saw Apple's sample code that uses OpenGL; however, there were parts I couldn't understand, so I tried implementing this using Core Graphics.
#import "PaintView.h"
#include <stdlib.h>
#implementation PaintView
- (id)initWithCoder:(NSCoder *)aDecoder {
self = [super initWithCoder:aDecoder];
if(self) {
//
pointsToDraw = [[NSMutableArray alloc] init];
}
return self;
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *touch = [touches anyObject];
NSLog(#"%#", touch);
CGPoint location = [touch locationInView:self];
CGPoint previousLocation = [touch previousLocationInView:self];
Ink *ink = [[Ink alloc] initWithPoint:previousLocation toPoint:location time:touch.timestamp];
// UITouch *newTouch = [touch copy];
[pointsToDraw addObject:ink];
[self setNeedsDisplay];
}
- (void)drawLine:(CGPoint)startingPoint toPoint:(CGPoint)endingPoint context:(CGContextRef)context
{
// Drawing code
CGContextSetStrokeColorWithColor(context, [UIColor redColor].CGColor);
CGRect temp = CGRectMake(10, 10, 100, 100);
// Draw them with a 2.0 stroke width so they are a bit more visible.
CGContextSetLineWidth(context, 2.0);
CGContextMoveToPoint(context, startingPoint.x,startingPoint.y); //start at this point
CGContextAddLineToPoint(context, endingPoint.x, endingPoint.y); //draw to this point
// and now draw the Path!
}
- (void)drawRect:(CGRect)rect
{
[super drawRect:rect];
// [self drawLine:CGPointMake(10, 10) toPoint:CGPointMake(30, 30)];
CGContextRef context = UIGraphicsGetCurrentContext();
for (Ink *ink in pointsToDraw){
[self drawLine:ink.point toPoint:ink.previousPoint context:context];
}
CGContextStrokePath(context);
}
#end
The problem is that on every touch I redraw everything (Ink is a class that contains two CGPoints and a timestamp), and after a while this dramatically slows things down, creating substantial lag.
My goal is to be able to both capture handwriting precisely and play it back precisely.
Another thing to consider is that I am using a stylus that gives pressure information, so I need to be able to draw my line with varying widths.
Any advice will be greatly appreciated.
Instead of storing the points only in an array, store them in an array and in a UIBezierPath. Then you only need to stroke that bezier path within drawRect: instead of rebuilding the whole drawing each time.
The stylus doesn't give pressure information - at least not on iOS. iPhones have capacitive touch screens, not resistive. The standard algorithm for varying the width is to use the speed as the factor and draw using little triangles that you fill in to create your line.
Funny! Here is a relevant article: http://www.nearinfinity.com/blogs/jason_harwig/2012/11/06/capture-a-signature-on-ios.html
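A hedged sketch of the first suggestion, assuming a single strokePath ivar (a UIBezierPath created once in initWithCoder:) alongside, or instead of, the points array:
    // Assumed ivar: UIBezierPath *strokePath, created in initWithCoder:.
    - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
    {
        UITouch *touch = [touches anyObject];
        [strokePath moveToPoint:[touch previousLocationInView:self]];
        [strokePath addLineToPoint:[touch locationInView:self]];
        [self setNeedsDisplay];
    }

    - (void)drawRect:(CGRect)rect
    {
        [[UIColor redColor] setStroke];
        strokePath.lineWidth = 2.0;
        [strokePath stroke]; // one accumulated path, no per-segment loop
    }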
