Why is the response slow when stroking PNG images with UITouch? - ios

I used the code below to stroke PNG images as the finger moves. There are two UIImageViews: one sits in the background to hold the background image, and the other is a clear UIImageView on top of it for stroking the PNG images.
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *touch in touches) {
        currentPoint = [touch locationInView:self.view];
        lastPoint = [touch previousLocationInView:self.view];
        // set up array to add space between PNG images
        if (ABS(currentPoint.x - lastPoint.x) > 16
            || ABS(currentPoint.y - lastPoint.y) > 13) {
            [brushLocations addObject:[NSValue valueWithCGPoint:currentPoint]];
        }
        [self drawingWithArray];
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    [brushLocations removeAllObjects]; // reset
}

-(void)drawingWithArray
{
    UIGraphicsBeginImageContext(self.view.frame.size);
    [drawImage.image drawInRect:CGRectMake(0, 0, drawImage.frame.size.width, drawImage.frame.size.height)];
    for (int i = 0; i < [brushLocations count]; i++) {
        CGPoint center = [[brushLocations objectAtIndex:i] CGPointValue];
        // bokehImage is a UIImage
        bokehImage = [bokehImgArray objectAtIndex:i % [bokehImgArray count]];
        // the PNG images are not semi-transparent, even with alpha set to 0.5??
        [bokehImage drawAtPoint:center blendMode:kCGBlendModeOverlay alpha:0.5f];
    }
    // drawImage is a UIImageView on top of the background image view, used for stroking the PNG images
    drawImage.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
}
Now, the problem I have is that the response is slow. The PNG images don't display immediately while the finger moves on the device (iPad 4).
Also, the PNG images are not semi-transparent. I assumed that drawAtPoint:blendMode:alpha: would make the images semi-transparent (with alpha set to 0.5).

Yes, something like this should work:
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *touch in touches) {
        currentPoint = [touch locationInView:self.view];
        lastPoint = [touch previousLocationInView:self.view];
        // set up array to add space between PNG images
        if (ABS(currentPoint.x - lastPoint.x) > 16
            || ABS(currentPoint.y - lastPoint.y) > 13) {
            [brushLocations addObject:[NSValue valueWithCGPoint:currentPoint]];
        }
        // [self drawingWithArray]; // don't call the draw routine during the touch handler
        [self setNeedsDisplay]; // queue the redraw instead
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Not needed here
    // [brushLocations removeAllObjects]; // reset
}

//-(void)drawingWithArray
- (void)drawRect:(CGRect)rect
{
    // A CGContext is already set up when drawRect: is called
    // UIGraphicsBeginImageContext(self.view.frame.size);
    // [drawImage.image drawInRect:CGRectMake(0, 0, drawImage.frame.size.width, drawImage.frame.size.height)];
    [drawImage.image drawInRect:rect];
    for (int i = 0; i < [brushLocations count]; i++) {
        CGPoint center = [[brushLocations objectAtIndex:i] CGPointValue];
        // bokehImage is a UIImage
        bokehImage = [bokehImgArray objectAtIndex:i % [bokehImgArray count]];
        [bokehImage drawAtPoint:center blendMode:kCGBlendModeOverlay alpha:0.5f];
    }
    // drawImage.image = UIGraphicsGetImageFromCurrentImageContext(); // don't snapshot the context inside drawRect:
    // UIGraphicsEndImageContext();
    [brushLocations removeAllObjects]; // reset
}
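One likely reason for the transparency issue (a general Core Graphics point, not something stated in the answer above): kCGBlendModeOverlay is a color-blend mode that multiplies or screens the source against the backdrop, so the result can look nearly opaque even at alpha 0.5. For plain semi-transparency, normal blending is usually what's wanted; a one-line sketch using the same names as above:
// assuming the same bokehImage/center as above; kCGBlendModeNormal gives
// ordinary alpha compositing instead of an overlay color blend
[bokehImage drawAtPoint:center blendMode:kCGBlendModeNormal alpha:0.5f];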

Related

Touch coordinates/location distorted

Hi all, I have a SpriteKit scene "Menu"
into which I am loading a UIView as a subview (CGScratchViewController.xib):
-(void)addscratchView:(SKView *)view
{
    CGRect viewFrame = CGRectMake(0, 0, [UIScreen mainScreen].bounds.size.height, [UIScreen mainScreen].bounds.size.width);
    // ScratchableView : UIView
    myScratchView = [[ScratchView alloc] initWithFrame:viewFrame];
    self.name = @"menuScene";
    [self.view addSubview:myScratchView];
}
The ScratchableView class loads a layer that I can erase with my finger: an overlay drawing revealing a drawing below.
The code seems to be working, however the touches are off and seem to be "scaled" somehow, meaning that if I draw in the top left corner the touch is in the correct place, but as I drag the finger outward the touch becomes more and more distorted.
Any thoughts on what to look for to correct this?
(I am a newbie making newbie mistakes.)
ScratchableView.m
// Created by Olivier Yiptong on 11-01-11.
//
#import "ScratchableView.h"

@implementation ScratchableView
@synthesize contentScale;

- (id)initWithFrame:(CGRect)frame {
    self = [super initWithFrame:frame];
    if (self) {
        scratchable = [UIImage imageNamed:@"scratchable.jpg"].CGImage;
        width = CGImageGetWidth(scratchable);
        height = CGImageGetHeight(scratchable);
        self.opaque = NO;
        CGColorSpaceRef colorspace = CGColorSpaceCreateDeviceGray();
        CFMutableDataRef pixels = CFDataCreateMutable(NULL, width * height);
        alphaPixels = CGBitmapContextCreate(CFDataGetMutableBytePtr(pixels), width, height, 8, width, colorspace, kCGImageAlphaNone);
        provider = CGDataProviderCreateWithCFData(pixels);
        CGContextSetFillColorWithColor(alphaPixels, [UIColor blackColor].CGColor);
        CGContextFillRect(alphaPixels, frame);
        CGContextSetStrokeColorWithColor(alphaPixels, [UIColor whiteColor].CGColor);
        CGContextSetLineWidth(alphaPixels, 20.0);
        CGContextSetLineCap(alphaPixels, kCGLineCapRound);
        CGImageRef mask = CGImageMaskCreate(width, height, 8, 8, width, provider, nil, NO);
        scratched = CGImageCreateWithMask(scratchable, mask);
        CGImageRelease(mask);
        CGColorSpaceRelease(colorspace);
    }
    return self;
}

- (void)drawRect:(CGRect)rect {
    CGContextDrawImage(UIGraphicsGetCurrentContext(), [self bounds], scratched);
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    if ([[touch view] isKindOfClass:[UIImageView class]]) {
        CGPoint point = [touch locationInView:touch.view];
        NSLog(@"%f %f", point.x, point.y);
        location = point;
    }
    firstTouch = YES;
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event touchesForView:self] anyObject];
    if (firstTouch) {
        firstTouch = NO;
        previousLocation = [touch previousLocationInView:self];
    } else {
        location = [touch locationInView:self];
        previousLocation = [touch previousLocationInView:self];
    }
    // Render the stroke
    [self renderLineFromPoint:previousLocation toPoint:location];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event touchesForView:self] anyObject];
    if (firstTouch) {
        firstTouch = NO;
        previousLocation = [touch previousLocationInView:self];
        [self renderLineFromPoint:previousLocation toPoint:location];
    }
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
}

- (void)renderLineFromPoint:(CGPoint)start toPoint:(CGPoint)end {
    CGContextMoveToPoint(alphaPixels, start.x, start.y);
    CGContextAddLineToPoint(alphaPixels, end.x, end.y);
    CGContextStrokePath(alphaPixels);
    [self setNeedsDisplay];
}

- (void)dealloc {
    CGContextRelease(alphaPixels);
    CGImageRelease(scratchable);
    CGDataProviderRelease(provider);
}
and the menu scene:
-(void)didMoveToView:(SKView *)view {
    self.userInteractionEnabled = YES;
    SKView *skView = (SKView *)self.view;
    skView.showsFPS = YES;
    skView.showsNodeCount = YES;
    // [self createButtons];
    [self addscratchView:view]; // for testing
}

-(void)addscratchView:(SKView *)view
{
    CGRect viewFrame = CGRectMake(0, 0, [UIScreen mainScreen].bounds.size.height, [UIScreen mainScreen].bounds.size.width);
    myScratchView = [[ScratchView alloc] initWithFrame:viewFrame];
    self.name = @"menuScene";
    [self.view addSubview:myScratchView];
}
I've tried to convert the coordinates... still not much luck (maybe it's not the right thing to do anyway):
// convert between view coordinates and scene coordinates
// (the coordinate system is not the same in a Sprite Kit scene as in a UIView)
UITouch *touch = [[event allTouches] anyObject];
CGPoint touchlocation = [touch locationInView:touch.view];
spriteView = (Menu *)spriteView.scene.view;
CGPoint positionInScene = [self convertPoint:touchlocation fromCoordinateSpace:spriteView.scene.view];
Video of the effect/distorted touches
https://www.youtube.com/watch?v=dMrcufcKpao
It looks like you are not layering your views the way you think you are layering them in your head.
Right now you have them layered like this:
Back of the screen
SKView
--SKScene
ScratchView
And here is how your views are probably laid out, size-wise:
SKView -> Interface Builder default (600x600, I think)
--SKScene -> scene editor default (600x600)
ScratchView -> screen size
Your SKView will then resize to the screen size with Auto Layout constraints, but your scene will stay at 600x600 because you do not set the scaleMode to .ResizeFill. So now you have two different coordinate systems, with the scaling all off.
I think the best option for you right now is to just set the scene's scaleMode to .ResizeFill (scene.scaleMode = .ResizeFill) so that all your coordinates line up; as you learn more about what SpriteKit has to offer, you can refactor your code to better suit your needs.
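In Objective-C, to match the rest of the code here, that might look like the sketch below; the question doesn't show where the scene is created, so the surrounding code is an assumption:
// Hedged sketch: set the scale mode when creating/presenting the scene
// (e.g. in the hosting view controller), so scene coordinates match view
// coordinates one-to-one.
SKView *skView = (SKView *)self.view;
Menu *scene = [Menu sceneWithSize:skView.bounds.size]; // "Menu" is the scene class from the question
scene.scaleMode = SKSceneScaleModeResizeFill; // scene resizes to fill the SKView
[skView presentScene:scene];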

How to get all the coordinate points (X and Y) in iOS

I am writing an app which allows the user to draw whatever they want in the view. While they are drawing, I simultaneously send the coordinate values to the web as JSON and draw the same diagram there. When I draw slowly, I get all the coordinate values, like this:
{"XY":["92,240","94,240","96,240","97,240","98,240","99,240","100,240","102,240","103,240","104,240","104,240","105,240","106,240","107,240","108,240","108,240","110,240","110,240","112,240","112,240","114,240","115,240","116,240","117,240","118,240","120,240","120,240","120,240","122,240","122,240","124,240","124,240","126,240"]}
But when I draw quickly, I get the desired drawing but miss a lot of coordinate values.
{"XY":["96,320","117,302","170,262","252,208"]}
The following is the code I used to implement this:
@implementation PJRSignatureView
{
    UIBezierPath *beizerPath;
    UIImage *incrImage;
    CGPoint points[10];
    uint control;
}

- (void)drawRect:(CGRect)rect
{
    [incrImage drawInRect:rect];
    [beizerPath stroke];
    // Set initial color for drawing
    UIColor *fillColor = INITIAL_COLOR;
    [fillColor setFill];
    UIColor *strokeColor = INITIAL_COLOR;
    [strokeColor setStroke];
    [beizerPath stroke];
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    if ([lblSignature superview]) {
        [lblSignature removeFromSuperview];
    }
    rectArray = [[NSMutableArray alloc] init];
    control = 0;
    UITouch *touch = [touches anyObject];
    points[0] = [touch locationInView:self];
    CGPoint startPoint = points[0];
    CGPoint endPoint = CGPointMake(startPoint.x + 1.5, startPoint.y + 2);
    [beizerPath moveToPoint:startPoint];
    NSLog(@"myPoint = %@", [NSValue valueWithCGPoint:endPoint]);
    NSLog(@"beizerPath :%@", beizerPath);
    [beizerPath addLineToPoint:endPoint];
    NSLog(@"beizerPath end:%@", beizerPath);
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint touchPoint = [touch locationInView:self];
    control++;
    points[control] = touchPoint;
    if (control == 4)
    {
        points[3] = CGPointMake((points[2].x + points[4].x)/2.0, (points[2].y + points[4].y)/2.0);
        [beizerPath moveToPoint:points[0]];
        [beizerPath addCurveToPoint:points[3] controlPoint1:points[1] controlPoint2:points[2]];
        [self setNeedsDisplay];
        points[0] = points[3];
        points[1] = points[4];
        control = 1;
    }
    NSLog(@"beizerPathmove:%@", beizerPath);
    NSString *rect_xy = [NSString stringWithFormat:@"%.f,%.f", touchPoint.x, touchPoint.y];
    [rectArray addObject:rect_xy];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self drawBitmapImage];
    [self setNeedsDisplay];
    [beizerPath removeAllPoints];
    NSMutableDictionary *rectDict = [[NSMutableDictionary alloc] init];
    [rectDict setValue:rectArray forKey:@"XY"];
    NSData *jsonData = [NSJSONSerialization dataWithJSONObject:rectDict options:0 error:nil];
    // Checking the format
    NSLog(@"%@", [[NSString alloc] initWithData:jsonData encoding:NSUTF8StringEncoding]);
    control = 0;
}
How can I get all the coordinate values even when I draw quickly?
Apple devices send touch events at a certain interval. So if you draw/write slowly, it's possible to get all of the coordinate values; if you draw/write fast, you will miss some of them. Apply the same kind of smoothing function on the web side that you have used in the app. Hope this will be fixed now.
The system sends touch events at a certain interval. If you move slowly you get more, and moving fast you get fewer. You can't get more if you move fast, but you probably don't need more: you just have to draw the line between points, no matter whether the distance between them is small or large.
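If the web side really does need densely spaced points, one option (my sketch, not part of the answers above) is to interpolate extra points between consecutive touch samples before sending them:
// Hedged sketch: linearly interpolate between two touch samples so the
// transmitted point list stays dense even for fast strokes.
// "spacing" is the maximum desired gap between points, in points.
- (NSArray *)interpolatedPointsFrom:(CGPoint)start to:(CGPoint)end spacing:(CGFloat)spacing
{
    NSMutableArray *result = [NSMutableArray array];
    CGFloat distance = hypot(end.x - start.x, end.y - start.y);
    NSInteger steps = MAX(1, (NSInteger)ceil(distance / spacing));
    for (NSInteger i = 1; i <= steps; i++) {
        CGFloat t = (CGFloat)i / steps;
        [result addObject:[NSString stringWithFormat:@"%.f,%.f",
                           start.x + t * (end.x - start.x),
                           start.y + t * (end.y - start.y)]];
    }
    return result;
}
Calling this from touchesMoved: with the previous and current touch points and appending the result to rectArray should fill the gaps seen in the fast-drawn JSON above.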

CGContext and drawRect not drawing

Need a fresh pair of eyes; mine have stopped working and I can't make sense of my own code anymore...
I am trying to make a drawing app with a pen-on-paper style of drawing. Here's how it is supposed to work:
the user touches, and the app grabs the location
a var is set to tell drawRect to configure the CGContext, create a new path, move to the point, etc. (because I always get errors/warnings in my NSLog whenever I do anything with the CGContext outside the drawRect method)
the var is then set to determine whether to place a dot or a line when the user lifts their finger
if the user drags, the var is changed to tell drawRect to draw a line, and [self setNeedsDisplay] is called
every time the user places their finger on the screen a timer is activated; every 5 seconds (or when they lift their finger) the app 'captures' the screen contents, sticks it in an image, and wipes the screen, replacing it with the image and continuing the drawing (in effect caching the image so all these lines don't have to be re-drawn)
UPDATE #1
So I re-wrote it (because it was obviously really bad and not working...) and ended up with this:
- (void)drawRect:(CGRect)rect {
    /* Draw the cached image... this will be blank at first, and is then
       supposed to be filled by the graphics context, so that when the view
       is refreshed the user is adding lines on top of an image (but to them
       it looks like one continuous drawing). */
    [_incrImage drawInRect:rect];
    CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapSquare);
    CGContextSetLineJoin(UIGraphicsGetCurrentContext(), kCGLineJoinRound);
    CGContextSetLineWidth(UIGraphicsGetCurrentContext(), _brushW);
    CGContextSetStrokeColorWithColor(UIGraphicsGetCurrentContext(), [UIColor colorWithRed:_brushR green:_brushG blue:_brushB alpha:_brushO].CGColor);
    CGContextSetBlendMode(UIGraphicsGetCurrentContext(), kCGBlendModeNormal);
    CGContextAddPath(UIGraphicsGetCurrentContext(), _pathref);
    CGContextDrawPath(UIGraphicsGetCurrentContext(), kCGPathStroke);
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    _drw = 2; // set this to draw a dot if the user taps
    UITouch *touch = [touches anyObject];
    _cp = [touch locationInView:self];
    _lp = _cp;
    /* This line and the one below clear the path so the app is drawing as
       little as possible. The idea is that as the user draws and lifts their
       finger, the drawing goes into _incrImage, the path is cleared, and
       _incrImage is applied to the screen, so as little memory as possible is
       used and the user feels as if they're adding to what they've drawn when
       they touch the device again. */
    CGPathRelease(_pathref);
    _pathref = CGPathCreateMutable();
    CGPathMoveToPoint(_pathref, NULL, _lp.x, _lp.y);
    touch = nil;
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    _drw = 1; // the user moved their finger; this was not a tap, so they want to draw a line
    UITouch *touch = [touches anyObject];
    _cp = [touch locationInView:self];
    CGPathAddLineToPoint(_pathref, NULL, _cp.x, _cp.y);
    [self setNeedsDisplay]; // as the user moves their finger, it draws
    _lp = _cp;
    touch = nil;
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    _cp = [touch locationInView:self];
    switch (_drw) { // logic to determine what to draw when the user lifts their finger
        case 1:
            // line
            CGPathAddLineToPoint(_pathref, NULL, _cp.x, _cp.y);
            break;
        case 2:
            // dot (note: CGPathAddArc takes angles in radians, so 360.0 here is suspect)
            CGPathAddArc(_pathref, NULL, _cp.x, _cp.y, _brushW, 0.0, 360.0, 1);
            break;
        default:
            break;
    }
    /* Here's the fun bit: the graphics context doesn't seem to be going into
       _incrImage, and therefore is not being displayed when the user goes to
       continue drawing after they've drawn a line/dot on the screen. */
    UIGraphicsBeginImageContext(self.frame.size);
    /* Tried adding my drawn path to the context and then adding the context
       to the image before the user taps down again and the path is cleared...
       not sure why it isn't working. */
    CGContextAddPath(UIGraphicsGetCurrentContext(), _pathref);
    [_incrImage drawAtPoint:CGPointZero];
    _incrImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    [self setNeedsDisplay]; // finally, refresh the contents
    touch = nil;
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    [self touchesEnded:touches withEvent:event];
}
My new problem is that everything is erased when drawRect is called, and '_incrImage' is not getting the contents of the current graphics context to display them.
___old code, I guess no longer needed, but keeping it here for reference___
here is the relevant code:
- (void)drawRect:(CGRect)rect {
    /* REFERENCE: _drw: 0 = clear everything
                        1 = cache the screen contents in the image and display
                            that, so the device doesn't have to re-draw everything
                        2 = set up a new path
                        3 = draw lines instead of a dot at the current point
                        4 = draw a 'dot' instead of a line */
    CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapSquare);
    CGContextSetLineJoin(UIGraphicsGetCurrentContext(), kCGLineJoinRound);
    CGContextSetLineWidth(UIGraphicsGetCurrentContext(), _brushW);
    CGContextSetStrokeColorWithColor(UIGraphicsGetCurrentContext(), [UIColor colorWithRed:_brushR green:_brushG blue:_brushB alpha:_brushO].CGColor);
    CGContextSetBlendMode(UIGraphicsGetCurrentContext(), kCGBlendModeNormal);
    switch (_drw) {
        case 0:
            // clear everything... this is the first draw... it can also be called to clear the view
            UIGraphicsBeginImageContext(self.frame.size);
            CGContextSetBlendMode(UIGraphicsGetCurrentContext(), kCGBlendModeClear);
            CGContextSetFillColorWithColor(UIGraphicsGetCurrentContext(), [UIColor clearColor].CGColor);
            CGContextFillRect(UIGraphicsGetCurrentContext(), self.frame);
            CGContextFlush(UIGraphicsGetCurrentContext());
            _incrImage = UIGraphicsGetImageFromCurrentImageContext();
            UIGraphicsEndImageContext();
            [_incrImage drawAtPoint:CGPointZero];
            [_incrImage drawInRect:rect];
            break;
        case 1:
            // capture the screen content and stick it in _incrImage, then apply
            // _incrImage to the screen so the user can continue drawing on top of it
            _incrImage = UIGraphicsGetImageFromCurrentImageContext();
            [_incrImage drawAtPoint:CGPointZero];
            [_incrImage drawInRect:rect];
            break;
        case 2:
            // begin the path and set everything up; this is called when touchesBegan: fires
            _incrImage = UIGraphicsGetImageFromCurrentImageContext();
            [_incrImage drawAtPoint:CGPointZero];
            [_incrImage drawInRect:rect];
            CGContextBeginPath(UIGraphicsGetCurrentContext());
            CGContextMoveToPoint(UIGraphicsGetCurrentContext(), _p.x, _p.y);
            break;
        case 3:
            // add lines; fired when touchesMoved: is activated and _drw is set
            // to draw lines instead of adding dots
            CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), _p.x, _p.y);
            CGContextStrokePath(UIGraphicsGetCurrentContext());
            break;
        case 4:
            // fired when touchesEnded: is activated... draws a 'dot', i.e. an arc with a fill
            CGContextMoveToPoint(UIGraphicsGetCurrentContext(), _p.x, _p.y);
            CGContextAddArc(UIGraphicsGetCurrentContext(), _p.x, _p.y, _brushW, 0.0, 360.0, 1);
            CGContextFillPath(UIGraphicsGetCurrentContext());
            CGContextFlush(UIGraphicsGetCurrentContext());
            break;
        default:
            break;
    }
}
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    dTimer = [NSTimer timerWithTimeInterval:5.0 target:self selector:@selector(timeUP) userInfo:nil repeats:YES];
    _drw = 2;
    UITouch *touch = [touches anyObject];
    _p = [touch locationInView:self];
    [self setNeedsDisplay];
    _drw = 4;
}

- (void)timeUP {
    _drw = 1;
    [self setNeedsDisplay];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    _drw = 3;
    UITouch *touch = [touches anyObject];
    _p = [touch locationInView:self];
    [self setNeedsDisplay];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [dTimer invalidate];
    UITouch *touch = [touches anyObject];
    _p = [touch locationInView:self];
    [self setNeedsDisplay];
    [self timeUP];
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    [self touchesEnded:touches withEvent:event];
}
My questions are:
Is this efficient, or is there a better way to do this?
And why is this not drawing? I get nothing, but I can see my memory being used...
Egad.
Your first problem is that setNeedsDisplay doesn't draw immediately; it just marks the view to be drawn at the end of the event. So when you set _drw=2, call setNeedsDisplay, and then set _drw=4, it's only going to actually call -drawRect: with _drw=4 (if that, since that still might not be the end of the current event).
But also, don't, uh, use that _drw switch thing. That's not good.
You want to create an image and draw into the image as the touches happen, and then in drawRect: just blat the image to the screen. If you ever find yourself calling UIGraphicsGetImageFromCurrentImageContext() inside -drawRect:, you are doing things backwards (as you are here). Don't slurp the image from the screen; create an image that you blat to the screen.
The screen should never be your 'state'. That way lies madness.
On top of the answer from @WilShipley, which I agree with (don't put state management logic in drawRect:, only pure redrawing logic), you are currently never drawing anything other than a minuscule line, because CGContextStrokePath clears the current path from the context.
The context isn't intended to be a temporary cache of incomplete drawing operations; it's your portal to the screen (or to a backing image / PDF file / ...). You need to create your drawing state outside drawRect: and then render it to the screen inside drawRect:.
To ease performance issues, only redraw the area around the newest touch (or between the newest and previous touch).
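Putting both answers together, the usual shape of the fix looks roughly like this sketch (my paraphrase of the advice, reusing the question's _incrImage, _pathref, and _brushW ivars; not the poster's actual solution):
// Hedged sketch of the "draw into an image, blit in drawRect:" pattern.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint p = [[touches anyObject] locationInView:self];
    CGPathAddLineToPoint(_pathref, NULL, p.x, p.y);
    [self setNeedsDisplay];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    // Bake the finished stroke into the backing image, off the screen.
    UIGraphicsBeginImageContextWithOptions(self.bounds.size, NO, 0.0);
    [_incrImage drawAtPoint:CGPointZero];
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    CGContextAddPath(ctx, _pathref);
    CGContextSetLineWidth(ctx, _brushW); // set stroke color etc. here as well
    CGContextStrokePath(ctx); // stroke into the image context, not the screen
    _incrImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    CGPathRelease(_pathref);
    _pathref = CGPathCreateMutable();
    [self setNeedsDisplay];
}

- (void)drawRect:(CGRect)rect {
    [_incrImage drawInRect:self.bounds]; // finished strokes
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    CGContextAddPath(ctx, _pathref); // stroke in progress
    CGContextSetLineWidth(ctx, _brushW);
    CGContextStrokePath(ctx);
}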

How to 'record' a UIBezierPath to be able to playback a stroked drawing path

In my current iOS app, the user is able to draw with their finger using a UIBezierPath with path smoothing; this part is pretty simple. What I would like to know is whether it's possible to record the path, the dots, and the color associated with the path and dots, for when the user lifts their finger and changes pencil colors. My goal is that a play button would then play back everything they just created in real time, sped up with an animation in case they took several minutes drawing.
I appreciate your responses. Here's the code I'm currently using for drawing (not the best code):
@property (nonatomic, strong) UIBezierPath *path;
@property uint ctr;
@end

@implementation DrawViewController
{
    CGPoint pts[4];
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    self.ctr = 0;
    UITouch *touch = [touches anyObject];
    pts[0] = [touch locationInView:self.drawImage];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint p = [touch locationInView:self.drawImage];
    self.ctr++;
    pts[self.ctr] = p;
    if (self.ctr == 3)
    {
        pts[2] = CGPointMake((pts[1].x + pts[3].x)/2.0, (pts[1].y + pts[3].y)/2.0);
        [self.path moveToPoint:pts[0]];
        [self.path addQuadCurveToPoint:pts[2] controlPoint:pts[1]];
        //[self.drawImage setNeedsDisplay];
        pts[0] = pts[2];
        pts[1] = pts[3];
        self.ctr = 1;
        dispatch_async(dispatch_get_main_queue(), ^{
            UIGraphicsBeginImageContextWithOptions(self.drawImage.bounds.size, NO, 0.0);
            [self.drawImage.image drawAtPoint:CGPointZero];
            [[UIColor colorWithRed:self.red green:self.green blue:self.blue alpha:1.0] setStroke];
            [self.path stroke];
            self.drawImage.image = UIGraphicsGetImageFromCurrentImageContext();
            UIGraphicsEndImageContext();
            [self.path removeAllPoints];
            self.ctr = 0;
        });
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (self.ctr == 0)
    {
        [self.path moveToPoint:pts[0]];
        [self.path addLineToPoint:pts[0]];
    }
    else if (self.ctr == 1)
    {
        [self.path moveToPoint:pts[0]];
        [self.path addLineToPoint:pts[1]];
    }
    else if (self.ctr == 2)
    {
        [self.path moveToPoint:pts[0]];
        [self.path addQuadCurveToPoint:pts[2] controlPoint:pts[1]];
    }
    self.ctr = 0;
    dispatch_async(dispatch_get_main_queue(), ^{
        UIGraphicsBeginImageContextWithOptions(self.drawImage.bounds.size, NO, 0.0);
        [self.drawImage.image drawAtPoint:CGPointZero];
        [[UIColor colorWithRed:self.red green:self.green blue:self.blue alpha:1.0] setStroke];
        [self.path stroke];
        self.drawImage.image = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        [self.path removeAllPoints];
        self.ctr = 0;
    });
}
What I would do would be to create a custom object that represents a single "segment" of your drawing. (Let's call it a "BezierSegment".) From a quick glance, it looks like you're using quadratic Bezier segments. So create an object that saves the 3 control points for the bezier and the color used to draw it. Each time you draw a new "segment", create one of these objects and add it to a mutable array of segment objects.
Then you could loop through your array of BezierSegment objects, create UIBezierPath objects from each one, and draw them to the screen in order to recreate the drawing.
You could also save things like line thickness, optional closed paths with a separate pen color, etc.
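A minimal sketch of that idea, assuming quadratic segments as in the question's addQuadCurveToPoint: calls (BezierSegment and recordedSegments are made-up names):
// Hedged sketch: one recorded segment of the drawing.
@interface BezierSegment : NSObject
@property (nonatomic) CGPoint startPoint;
@property (nonatomic) CGPoint controlPoint;
@property (nonatomic) CGPoint endPoint;
@property (nonatomic, strong) UIColor *strokeColor;
@end

@implementation BezierSegment
@end

// Recording: call this whenever a segment is added to the live path.
- (void)recordSegmentFrom:(CGPoint)start control:(CGPoint)control to:(CGPoint)end
{
    BezierSegment *seg = [[BezierSegment alloc] init];
    seg.startPoint = start;
    seg.controlPoint = control;
    seg.endPoint = end;
    seg.strokeColor = [UIColor colorWithRed:self.red green:self.green blue:self.blue alpha:1.0];
    [self.recordedSegments addObject:seg]; // recordedSegments: hypothetical NSMutableArray property
}

// Playback: rebuild and stroke one segment at a time (e.g. one per timer tick).
- (void)replaySegment:(BezierSegment *)s
{
    UIBezierPath *p = [UIBezierPath bezierPath];
    [p moveToPoint:s.startPoint];
    [p addQuadCurveToPoint:s.endPoint controlPoint:s.controlPoint];
    [s.strokeColor setStroke];
    [p stroke]; // assumes a current image context, as in the question's code
}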

iOS Bezier path shows with zig-zags

Trying to learn some Quartz graphics and trying to do smooth drawing. I am trying to combine these two tutorials:
http://mobile.tutsplus.com/tutorials/iphone/ios-sdk_freehand-drawing/
http://www.raywenderlich.com/18840/how-to-make-a-simple-drawing-app-with-uikit
And so far this is my code:
#import "ViewController.h"
#interface ViewController ()
{
UIBezierPath *path;
CGPoint pts[5];
uint ctr;
}
#end
#implementation ViewController
- (void)viewDidLoad
{
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
[self setupData];
}
- (void)didReceiveMemoryWarning
{
[super didReceiveMemoryWarning];
// Dispose of any resources that can be recreated.
}
-(void) setupData
{
[self.mainImage setMultipleTouchEnabled:NO];
[self.tempImage setMultipleTouchEnabled:NO];
path = [UIBezierPath bezierPath];
[path setLineWidth:2.0];
}
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
ctr = 0;
UITouch *touch = [touches anyObject];
pts[0] = [touch locationInView:self.tempImage];
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *touch = [touches anyObject];
CGPoint p = [touch locationInView:self.tempImage];
ctr++;
pts[ctr] = p;
if (ctr == 4)
{
pts[3] = CGPointMake((pts[2].x + pts[4].x)/2.0, (pts[2].y + pts[4].y)/2.0); // move the endpoint to the middle of the line joining the second control point of the first Bezier segment and the first control point of the second Bezier segment
[path moveToPoint:pts[0]];
[path addCurveToPoint:pts[3] controlPoint1:pts[1] controlPoint2:pts[2]]; // add a cubic Bezier from pt[0] to pt[3], with control points pt[1] and pt[2]
[self draw];
[self.mainImage setNeedsDisplay];
[self.tempImage setNeedsDisplay];
// replace points and get ready to handle the next segment
pts[0] = pts[3];
pts[1] = pts[4];
ctr = 1;
}
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
[path removeAllPoints];
ctr = 0;
UIGraphicsBeginImageContext(self.tempImage.frame.size);
[self.mainImage.image drawInRect:CGRectMake(0, 0, self.tempImage.frame.size.width, self.tempImage.frame.size.height) blendMode:kCGBlendModeNormal alpha:1.0];
[self.tempImage.image drawInRect:CGRectMake(0, 0, self.tempImage.frame.size.width, self.tempImage.frame.size.height) blendMode:kCGBlendModeNormal alpha:1.0];
self.mainImage.image = UIGraphicsGetImageFromCurrentImageContext();
self.tempImage.image = nil;
[self.tempImage setNeedsDisplay];
[self.mainImage setNeedsDisplay];
UIGraphicsEndImageContext();
}
- (void)draw
{
UIGraphicsBeginImageContext(self.tempImage.frame.size);
[self.tempImage.image drawInRect:CGRectMake(0, 0, self.tempImage.frame.size.width, self.tempImage.frame.size.height)];
pts[3] = CGPointMake((pts[2].x + pts[4].x)/2.0, (pts[2].y + pts[4].y)/2.0); // move the endpoint to the middle of the line joining the second control point of the first Bezier segment and the first control point of the second Bezier segment
[path moveToPoint:pts[0]];
[path addCurveToPoint:pts[3] controlPoint1:pts[1] controlPoint2:pts[2]]; // add a cubic Bezier from pt[0] to pt[3], with control points pt[1] and pt[2]
[path stroke];
CGContextSetBlendMode(UIGraphicsGetCurrentContext(),kCGBlendModeNormal);
// CGContextStrokePath(UIGraphicsGetCurrentContext());
self.tempImage.image = UIGraphicsGetImageFromCurrentImageContext();
[self.tempImage setAlpha:1.0];
UIGraphicsEndImageContext();
}
The corners look much better now, but the lines themselves look like they have zigzags in them. I would appreciate it if you could point me to my mistake. Thanks.
I think the image you're drawing into is non-retina sized, because self.tempImage.frame.size returns the frame size in points, not pixels. When the image is drawn to the screen it is scaled up and pixelated. If you make the buffer image match the pixel size of the on-screen component, you should be fine.
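Concretely, that usually means creating the image contexts with UIGraphicsBeginImageContextWithOptions and a scale of 0.0 (which means "use the main screen's scale") instead of UIGraphicsBeginImageContext, which always creates a 1.0-scale buffer:
// Scale 0.0 = use the device's screen scale, so the buffer is retina-sized.
UIGraphicsBeginImageContextWithOptions(self.tempImage.frame.size, NO, 0.0);
The same change applies in both touchesEnded: and the draw method above.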
