I have a UIImageView (wImage) in which I am trying to draw a line. The code runs fine in the simulator, but when I test it on a device it is extremely slow and triggers a memory warning. Could someone please tell me what the problem is?
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint thirdPoint = lastPoint;
    lastPoint = [touch previousLocationInView:self.view];
    CGPoint currentPoint = [touch locationInView:self.view];
    CGPoint mid1 = CGPointMake((lastPoint.x + thirdPoint.x) / 2, (lastPoint.y + thirdPoint.y) / 2);
    CGPoint mid2 = CGPointMake((currentPoint.x + lastPoint.x) / 2, (currentPoint.y + lastPoint.y) / 2);

    UIGraphicsBeginImageContext(wImage.frame.size);
    CGContextSetAllowsAntialiasing(UIGraphicsGetCurrentContext(), true);
    CGContextSetShouldAntialias(UIGraphicsGetCurrentContext(), true);
    [wImage.image drawInRect:CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height)];
    CGContextMoveToPoint(UIGraphicsGetCurrentContext(), mid1.x, mid1.y);
    CGContextAddQuadCurveToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y, mid2.x, mid2.y);
    CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
    CGContextSetLineWidth(UIGraphicsGetCurrentContext(), brush);
    CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), red, green, blue, 1);
    CGContextSetBlendMode(UIGraphicsGetCurrentContext(), kCGBlendModeNormal);
    CGContextStrokePath(UIGraphicsGetCurrentContext());
    wImage.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
}
You are having such a performance problem because you are doing a ton of work to create a new image every time the user's finger moves. Don't draw directly on the image itself; instead, create a UIView with a transparent background that is responsible for the user's drawing. There is a good amount you can do with that approach, more than I can fit here, but there is a great tutorial, complete with some really nice code that smooths out the line as the user draws. It results in a much better-looking path. Here it is:
http://code.tutsplus.com/tutorials/smooth-freehand-drawing-on-ios--mobile-13164
Go ahead and read through all the sections - it would be good for you to understand what's going on rather than just trying the last implementation.
For your implementation, make sure the view has a transparent background, so you can see the ImageView underneath.
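If it helps, here is a minimal sketch of that approach (the class and property names are mine, and it skips the curve smoothing that the tutorial adds):

#import <UIKit/UIKit.h>

@interface DrawingOverlayView : UIView
@property (nonatomic, strong) UIColor *strokeColor;
@property (nonatomic) CGFloat brush;
@end

@implementation DrawingOverlayView {
    UIBezierPath *_path;   // the accumulated stroke
}

- (instancetype)initWithFrame:(CGRect)frame {
    if ((self = [super initWithFrame:frame])) {
        self.backgroundColor = [UIColor clearColor];   // transparent, so the image view shows through
        self.opaque = NO;
        _path = [UIBezierPath bezierPath];
        _path.lineCapStyle = kCGLineCapRound;
        _strokeColor = [UIColor blackColor];
        _brush = 3.0;
    }
    return self;
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [_path moveToPoint:[[touches anyObject] locationInView:self]];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [_path addLineToPoint:[[touches anyObject] locationInView:self]];
    [self setNeedsDisplay];   // only this view redraws; no offscreen UIImage is rebuilt per touch
}

- (void)drawRect:(CGRect)rect {
    _path.lineWidth = self.brush;
    [self.strokeColor setStroke];
    [_path stroke];
}
@end

You would add an instance of this with the same frame as wImage, on top of it. If you need a single flattened UIImage at the end, you can render the image view plus the overlay into one image context once, when the user finishes drawing.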
I have an iPhone app where I provide a sketch pad for the user to save a signature. A UIImageView gets added to the main view, and it holds the strokes. For some reason you can only draw short lines on the pad, as in the following image.
I have another application, for the iPad, that uses the same code and it works fine. I'm not sure what could be causing this. I'm not using any touch or gesture code that would interfere with it. The following is some of the code I use.
UPDATE: If I create a UIViewController with the same class and make it the root view controller then it works fine. Something in my navigation hierarchy is doing something weird.
-(void)SetUpSignaturePad {
    //create a frame for our signature capture
    imageFrame = CGRectMake(self.view.frame.origin.x,
                            self.view.frame.origin.y,
                            self.view.frame.size.width + 23,
                            self.view.frame.size.height + 7);

    //allocate an image view and add to the main view
    mySignatureImage = [[UIImageView alloc] initWithImage:nil];
    mySignatureImage.frame = imageFrame;
    mySignatureImage.backgroundColor = [UIColor whiteColor];
    [self.view addSubview:mySignatureImage];
}
//when one or more fingers touch down in a view or window
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    //has our finger moved yet?
    fingerMoved = NO;
    UITouch *touch = [touches anyObject];

    //we need 3 points of contact to make our signature smooth using a quadratic Bézier curve
    currentPoint = [touch locationInView:mySignatureImage];
    lastContactPoint1 = [touch previousLocationInView:mySignatureImage];
    lastContactPoint2 = [touch previousLocationInView:mySignatureImage];
}
//when one or more fingers associated with an event move within a view or window
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    //well, it's obvious that our finger moved on the screen
    fingerMoved = YES;
    UITouch *touch = [touches anyObject];

    //save previous contact locations
    lastContactPoint2 = lastContactPoint1;
    lastContactPoint1 = [touch previousLocationInView:mySignatureImage];
    //save current location
    currentPoint = [touch locationInView:mySignatureImage];

    //find mid points to be used for the quadratic Bézier curve
    CGPoint midPoint1 = [self midPoint:lastContactPoint1 withPoint:lastContactPoint2];
    CGPoint midPoint2 = [self midPoint:currentPoint withPoint:lastContactPoint1];

    //create a bitmap-based graphics context and make it the current context
    UIGraphicsBeginImageContext(imageFrame.size);
    //draw the entire image in the specified rectangle frame
    [mySignatureImage.image drawInRect:CGRectMake(0, 0, imageFrame.size.width, imageFrame.size.height)];

    //set line cap, width, stroke color and begin path
    CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
    CGContextSetLineWidth(UIGraphicsGetCurrentContext(), 3.0f);
    CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), 0.0, 0.0, 0.0, 1.0);
    CGContextBeginPath(UIGraphicsGetCurrentContext());
    //begin a new subpath at this point
    CGContextMoveToPoint(UIGraphicsGetCurrentContext(), midPoint1.x, midPoint1.y);
    //create a quadratic Bézier curve from the current point using a control point and an end point
    CGContextAddQuadCurveToPoint(UIGraphicsGetCurrentContext(),
                                 lastContactPoint1.x, lastContactPoint1.y, midPoint2.x, midPoint2.y);
    //set the miter limit for the joins of connected lines in a graphics context
    CGContextSetMiterLimit(UIGraphicsGetCurrentContext(), 2.0);
    //paint a line along the current path
    CGContextStrokePath(UIGraphicsGetCurrentContext());

    //set the image based on the contents of the current bitmap-based graphics context
    mySignatureImage.image = UIGraphicsGetImageFromCurrentImageContext();
    //remove the current bitmap-based graphics context from the top of the stack
    UIGraphicsEndImageContext();
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    //if the finger never moved, draw a point
    if (!fingerMoved) {
        UIGraphicsBeginImageContext(imageFrame.size);
        [mySignatureImage.image drawInRect:CGRectMake(0, 0, imageFrame.size.width, imageFrame.size.height)];
        CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
        CGContextSetLineWidth(UIGraphicsGetCurrentContext(), 3.0f);
        CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), 0.0, 0.0, 0.0, 1.0);
        CGContextMoveToPoint(UIGraphicsGetCurrentContext(), currentPoint.x, currentPoint.y);
        CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), currentPoint.x, currentPoint.y);
        CGContextStrokePath(UIGraphicsGetCurrentContext());
        CGContextFlush(UIGraphicsGetCurrentContext());
        mySignatureImage.image = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
    }
}
//calculate the midpoint between two points
- (CGPoint)midPoint:(CGPoint)p0 withPoint:(CGPoint)p1 {
    return (CGPoint) {
        (p0.x + p1.x) / 2.0,
        (p0.y + p1.y) / 2.0
    };
}
I'm sorry to say I don't have a complete solution, but your problem is almost certainly a performance issue. Why? Because you are creating a new image every time a gesture event is handled, and creating images requires offscreen rendering that takes time and resources.
You should base your code on a sample project with drawing functionality; these usually use a view that updates its drawing in the drawRect: method. For your case a CAShapeLayer may also work well.
Run the Time Profiler in Instruments and find out which methods are taking the time.
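For illustration, a CAShapeLayer-based version could look roughly like this. This is a sketch only, not drop-in code: _shapeLayer and _path are assumed instance variables, the setup method name is mine, and I keep plain line segments rather than your quadratic curves.

#import <QuartzCore/QuartzCore.h>

- (void)setUpShapeLayer {
    _shapeLayer = [CAShapeLayer layer];
    _shapeLayer.frame = self.view.bounds;
    _shapeLayer.strokeColor = [UIColor blackColor].CGColor;
    _shapeLayer.fillColor = nil;          // stroke only, no fill
    _shapeLayer.lineWidth = 3.0f;
    _shapeLayer.lineCap = kCALineCapRound;
    [self.view.layer addSublayer:_shapeLayer];
    _path = [UIBezierPath bezierPath];
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [_path moveToPoint:[[touches anyObject] locationInView:self.view]];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [_path addLineToPoint:[[touches anyObject] locationInView:self.view]];
    _shapeLayer.path = _path.CGPath;      // the layer redraws; no offscreen image is created per touch
}

The point of the design is that no bitmap is composited on every touch; the layer only re-renders the path it is given.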
The problem: I have a small game where you can scratch an image that sits above another image. The upper image becomes invisible where you are scratching. The whole thing is placed in a SpriteKit view. The problem is that on weaker devices (iPhone 4) the scratch image is only updated when the user stops scratching.
I assume the image only gets updated when the user pauses scratching long enough for it to be completely rendered and displayed.
Can someone suggest a better way to see the scratching immediately? I am aware that the scratching consumes a lot of performance, but I don't mind if it lags a little.
Here is the code:
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:self.view];

    UIGraphicsBeginImageContext(self.view.frame.size);
    [_scratchImage drawInRect:CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height)];
    CGContextMoveToPoint(UIGraphicsGetCurrentContext(), _lastPoint.x, _lastPoint.y);
    CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), currentPoint.x, currentPoint.y);
    CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
    CGContextSetLineWidth(UIGraphicsGetCurrentContext(), 25.0f);
    CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), 0.0, 0.0, 0.0, 1.0);
    CGContextSetBlendMode(UIGraphicsGetCurrentContext(), kCGBlendModeClear);
    CGContextStrokePath(UIGraphicsGetCurrentContext());
    _scratchImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    _lastPoint = currentPoint;

    SKTexture *scratchTexture = [SKTexture textureWithImage:_scratchImage];
    _scratchSprite.texture = scratchTexture;
}
UPDATE
I ended up recording the points as suggested in the accepted answer, but instead of updating the image in a CADisplayLink callback I update the screen in SpriteKit's
-(void)update:(NSTimeInterval)currentTime
callback (which amounts to the same thing for SpriteKit use cases).
My code in the update method looks like the following:
-(void)displayUpdated
{
    UIGraphicsBeginImageContext(self.view.frame.size);
    [_scratchImage drawInRect:CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height)];

    CGMutablePathRef linePath = CGPathCreateMutable();
    CGPathMoveToPoint(linePath, NULL, _lastPoint.x, _lastPoint.y);
    for (NSValue *val in _recordedPoints) {
        CGPoint p = [val CGPointValue];
        CGPathAddLineToPoint(linePath, NULL, p.x, p.y);
        NSLog(@"%f, %f", p.x, p.y);
        _lastPoint = p;
    }
    //flush the array
    [_recordedPoints removeAllObjects];

    CGContextAddPath(UIGraphicsGetCurrentContext(), linePath);
    CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
    CGContextSetLineWidth(UIGraphicsGetCurrentContext(), 25.0f);
    CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), 0.0, 0.0, 0.0, 1.0);
    CGContextSetBlendMode(UIGraphicsGetCurrentContext(), kCGBlendModeClear);
    CGContextStrokePath(UIGraphicsGetCurrentContext());

    _scratchImage = UIGraphicsGetImageFromCurrentImageContext();
    CGPathRelease(linePath);
    UIGraphicsEndImageContext();

    SKTexture *scratchTexture = [SKTexture textureWithImage:_scratchImage];
    _scratchSprite.texture = scratchTexture;
}
More performant code would also hold a reference to the context, but I could not manage to get that working; if someone can suggest a solution I would be happy. Nevertheless, my frame rate on an iPhone 4 went from about 5 fps up to 25-30 fps.
I would try the following:
1. Measure with the Time Profiler in Instruments.
2. Speed up drawing by creating a CGBitmapContext as an instance variable. Then you can skip the [_scratchImage drawInRect:...] in each step.
3. Separate the drawing from the event handling. Right now you create a new UIImage for every touch event. It would be better to coalesce the stroked lines so that creating a UIImage does not happen as often, e.g. put the line segments in a queue and draw them in a callback from a CADisplayLink. A rough sketch of points 2 and 3 follows below.
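Here is a sketch of points 2 and 3 combined, assuming roughly the same instance variables as in the question. It is a best-guess outline rather than tested code: the method names are mine, the CADisplayLink could just as well be SpriteKit's -update: callback, and retina scaling is ignored for brevity.

// Assumed ivars: CGContextRef _bitmapContext; NSMutableArray *_recordedPoints;
// CGPoint _lastPoint; UIImage *_scratchImage; SKSpriteNode *_scratchSprite;
// (Release _bitmapContext with CGContextRelease in dealloc.)

- (void)setUpScratchContext {
    CGSize size = self.view.bounds.size;
    CGColorSpaceRef space = CGColorSpaceCreateDeviceRGB();
    _bitmapContext = CGBitmapContextCreate(NULL, size.width, size.height, 8, 0,
                                           space, kCGImageAlphaPremultipliedLast);
    CGColorSpaceRelease(space);

    // Flip to UIKit's top-left origin so touch points and UIImage drawing line up.
    CGContextTranslateCTM(_bitmapContext, 0, size.height);
    CGContextScaleCTM(_bitmapContext, 1.0, -1.0);

    // Draw the cover image once; later strokes with kCGBlendModeClear punch holes in it.
    UIGraphicsPushContext(_bitmapContext);
    [_scratchImage drawInRect:CGRectMake(0, 0, size.width, size.height)];
    UIGraphicsPopContext();

    CGContextSetLineCap(_bitmapContext, kCGLineCapRound);
    CGContextSetLineWidth(_bitmapContext, 25.0f);
    CGContextSetBlendMode(_bitmapContext, kCGBlendModeClear);

    _recordedPoints = [NSMutableArray array];
    CADisplayLink *link = [CADisplayLink displayLinkWithTarget:self
                                                      selector:@selector(flushScratchPoints)];
    [link addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    _lastPoint = [[touches anyObject] locationInView:self.view];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    // Just record the point; no drawing happens in the touch handler.
    CGPoint p = [[touches anyObject] locationInView:self.view];
    [_recordedPoints addObject:[NSValue valueWithCGPoint:p]];
}

- (void)flushScratchPoints {
    if (_recordedPoints.count == 0) return;

    CGContextBeginPath(_bitmapContext);
    CGContextMoveToPoint(_bitmapContext, _lastPoint.x, _lastPoint.y);
    for (NSValue *val in _recordedPoints) {
        CGPoint p = [val CGPointValue];
        CGContextAddLineToPoint(_bitmapContext, p.x, p.y);
        _lastPoint = p;
    }
    [_recordedPoints removeAllObjects];
    CGContextStrokePath(_bitmapContext);

    // The only per-frame image creation is pulling a CGImage out of the persistent context.
    CGImageRef cgImage = CGBitmapContextCreateImage(_bitmapContext);
    _scratchSprite.texture = [SKTexture textureWithCGImage:cgImage];
    CGImageRelease(cgImage);
}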
I don't know much about SpriteKit, but in a UIView you should put your drawing code in the drawRect: method. Is it the same with SpriteKit?
So, I am working on a virtual wall painting app.
I am able to draw some random lines by moving my finger on the image view (which has an image in it). Now I am trying to erase the drawings made on it, without success so far.
I googled for a solution, and the line below came highly recommended:
CGContextSetBlendMode(UIGraphicsGetCurrentContext(),kCGBlendModeClear);
Using this, the drawing can be erased, but it also erases the image the drawings are done on. Is there any way to prevent the image from being erased and only erase the drawing made on top of it?
Following is the code I am using to erase the drawings.
In touchesMoved:
UITouch *touch = [touches anyObject];
CGPoint previousPoint = [touch locationInView:self.upperImageView];

UIGraphicsBeginImageContext(self.upperImageView.frame.size);
[self.upperImageView.image drawInRect:CGRectMake(0, 0, self.upperImageView.frame.size.width, self.upperImageView.frame.size.height)];
CGContextSaveGState(UIGraphicsGetCurrentContext());
CGContextSetShouldAntialias(UIGraphicsGetCurrentContext(), YES);
CGContextSetLineCap(UIGraphicsGetCurrentContext(), eraserCap);
CGContextSetLineWidth(UIGraphicsGetCurrentContext(), 45.0);
CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), 0.25, 0.25, 0.25, 1.0);

CGMutablePathRef pathB = CGPathCreateMutable();
CGPathMoveToPoint(pathB, nil, location.x, location.y);
CGPathAddLineToPoint(pathB, nil, previousPoint.x, previousPoint.y);

CGContextSetBlendMode(UIGraphicsGetCurrentContext(), kCGBlendModeClear);
CGContextAddPath(UIGraphicsGetCurrentContext(), pathB);
CGContextStrokePath(UIGraphicsGetCurrentContext());
CGPathRelease(pathB);

upperImageView.image = UIGraphicsGetImageFromCurrentImageContext();
CGContextRestoreGState(UIGraphicsGetCurrentContext());
UIGraphicsEndImageContext();

location = previousPoint;
and in touchesEnded:

UITouch *touch = [touches anyObject];
CGPoint previousPoint = [touch locationInView:self.upperImageView];

UIGraphicsBeginImageContext(self.upperImageView.frame.size);
[self.upperImageView.image drawInRect:CGRectMake(0, 0, self.upperImageView.frame.size.width, self.upperImageView.frame.size.height)];
CGContextSaveGState(UIGraphicsGetCurrentContext());
CGContextSetShouldAntialias(UIGraphicsGetCurrentContext(), YES);
CGContextSetLineCap(UIGraphicsGetCurrentContext(), eraserCap);
CGContextSetLineWidth(UIGraphicsGetCurrentContext(), 5.0);
CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), 1.0, 1.0, 1.0, 0.0);

CGMutablePathRef pathB = CGPathCreateMutable();
CGPathMoveToPoint(pathB, nil, location.x, location.y);
CGPathAddLineToPoint(pathB, nil, previousPoint.x, previousPoint.y);

CGContextSetBlendMode(UIGraphicsGetCurrentContext(), kCGBlendModeClear);
CGContextAddPath(UIGraphicsGetCurrentContext(), pathB);
CGContextStrokePath(UIGraphicsGetCurrentContext());
CGPathRelease(pathB);

self.upperImageView.image = UIGraphicsGetImageFromCurrentImageContext();
// CGContextRestoreGState(UIGraphicsGetCurrentContext());
UIGraphicsEndImageContext();

previousPoint = location;
Is there anything I am doing wrong? Please guide me in the right direction; any help will be truly appreciated.
You can use different views for your image and your drawings.
Or keep the drawn paths in an array and delete each path separately, as in the sketch below.
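A small sketch of that second idea (the names _strokes, _currentStroke and drawingImageView are mine, and drawingImageView is assumed to be a separate overlay image view that sits above the photo): keep one UIBezierPath per stroke, so removing a stroke is just dropping it from the array and redrawing the overlay.

// Assumed ivars/properties: NSMutableArray *_strokes; UIBezierPath *_currentStroke;
// UIImageView *drawingImageView (an overlay on top of the photo).

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    _currentStroke = [UIBezierPath bezierPath];
    _currentStroke.lineWidth = 45.0;
    _currentStroke.lineCapStyle = kCGLineCapRound;
    [_currentStroke moveToPoint:[[touches anyObject] locationInView:self.drawingImageView]];
    [_strokes addObject:_currentStroke];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [_currentStroke addLineToPoint:[[touches anyObject] locationInView:self.drawingImageView]];
    [self redrawStrokes];
}

- (void)undoLastStroke {
    [_strokes removeLastObject];
    [self redrawStrokes];
}

- (void)redrawStrokes {
    // Re-render only the overlay; the photo underneath is never touched.
    UIGraphicsBeginImageContext(self.drawingImageView.frame.size);
    [[UIColor colorWithRed:0.25 green:0.25 blue:0.25 alpha:1.0] setStroke];
    for (UIBezierPath *stroke in _strokes) {
        [stroke stroke];
    }
    self.drawingImageView.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
}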
Move all of your drawing to a separate UIView that overlays the UIImageView. That way, when you go to "clear" the contents, the original UIImageView is still completely intact underneath.
Something like this will get you set up:
- (void)viewWillAppear:(BOOL)animated {
    [super viewWillAppear:animated];
    // Assumes you add a property named drawingView, of type UIImageView, to hold the strokes
    self.drawingView = [[UIImageView alloc] initWithFrame:self.upperImageView.bounds];
    [self.upperImageView addSubview:self.drawingView];
}
Then your drawing code only needs small changes (notice that it no longer touches upperImageView.image at all, so the underlying image can never be erased):
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint previousPoint = [touch locationInView:self.drawingView];

    UIGraphicsBeginImageContext(self.drawingView.frame.size);
    // Redraw only the drawing layer's previous content, never the underlying image.
    [self.drawingView.image drawInRect:CGRectMake(0, 0, self.drawingView.frame.size.width, self.drawingView.frame.size.height)];
    CGContextSaveGState(UIGraphicsGetCurrentContext());
    CGContextSetShouldAntialias(UIGraphicsGetCurrentContext(), YES);
    CGContextSetLineCap(UIGraphicsGetCurrentContext(), eraserCap);
    CGContextSetLineWidth(UIGraphicsGetCurrentContext(), 45.0);
    CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), 0.25, 0.25, 0.25, 1.0);
    CGMutablePathRef pathB = CGPathCreateMutable();
    CGPathMoveToPoint(pathB, nil, location.x, location.y);
    CGPathAddLineToPoint(pathB, nil, previousPoint.x, previousPoint.y);
    CGContextSetBlendMode(UIGraphicsGetCurrentContext(), kCGBlendModeClear);
    CGContextAddPath(UIGraphicsGetCurrentContext(), pathB);
    CGContextStrokePath(UIGraphicsGetCurrentContext());
    CGPathRelease(pathB);
    self.drawingView.image = UIGraphicsGetImageFromCurrentImageContext();
    CGContextRestoreGState(UIGraphicsGetCurrentContext());
    UIGraphicsEndImageContext();
    location = previousPoint;
}
Make similar changes in the touchesEnded: method.
I have a really strange issue. In iOS 8, only on the iPad 2 (it works in iOS 7 on all devices, and on the iPad mini in iOS 8), a signature panel I have will cause lines/drawn objects to fade away the longer the user holds their finger on the UIImageView.
Googling turns up nothing; any ideas? Here is the code for drawing the user's taps.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    swiped = YES;
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:self.view];
    currentPoint.y -= 20;

    UIGraphicsBeginImageContext(self.view.frame.size);
    [_drawImage.image drawInRect:CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height)];
    CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
    CGContextSetLineWidth(UIGraphicsGetCurrentContext(), 5.0);
    CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), 0.0, 0.0, 0.0, 1.0);
    CGContextBeginPath(UIGraphicsGetCurrentContext());
    CGContextMoveToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
    CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), currentPoint.x, currentPoint.y);
    CGContextStrokePath(UIGraphicsGetCurrentContext());
    _drawImage.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    lastPoint = currentPoint;
}
I have been having the same issue, and just discovered the cause.
My UIImageView was inside a UITableViewCell, and the autoresizing mask of the UIImageView was set to UIViewAutoresizingFlexibleHeight | UIViewAutoresizingFlexibleWidth.
It turns out that removing UIViewAutoresizingFlexibleHeight, and instead setting an explicit height for the UIImageView, resolved the problem.
I suspect this is down to the different behaviour of table cells in iOS 8, since there is now a self-sizing feature (which I haven't used yet, as I have my own implementation). That feature seems to play havoc with the flexible-height autoresizing mask, causing the distortion.
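For reference, the change amounted to something like the following sketch (imageView stands in for your signature image view, and the 150-point height is just a placeholder; use whatever your cell layout needs):

// Before: flexible width and height, which distorted drawing inside iOS 8 table cells.
// imageView.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight;

// After: keep only flexible width and pin the height explicitly.
imageView.autoresizingMask = UIViewAutoresizingFlexibleWidth;
imageView.frame = CGRectMake(CGRectGetMinX(imageView.frame),
                             CGRectGetMinY(imageView.frame),
                             CGRectGetWidth(imageView.frame),
                             150.0);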
I'm working on a drawing app for iPhone. It works fine for about 5 seconds in the iPhone simulator, but the more I draw, the laggier it gets. When I test it on a device it is even worse, and I can't even draw a simple tree. When I check the CPU usage in Xcode it usually sits between 97% and 100%. Is there any way to fix this?
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    currentPoint = [touch locationInView:self.view];

    UIGraphicsBeginImageContext(CGSizeMake(320, 568));
    [drawImage.image drawInRect:CGRectMake(0, 0, 320, 568)];
    CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
    CGContextSetLineWidth(UIGraphicsGetCurrentContext(), 5.0);
    CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), 0, 1, 0, 1);
    CGContextBeginPath(UIGraphicsGetCurrentContext());
    CGPathMoveToPoint(path, NULL, lastPoint.x, lastPoint.y);
    CGPathAddLineToPoint(path, NULL, currentPoint.x, currentPoint.y);
    CGContextAddPath(UIGraphicsGetCurrentContext(), path);
    CGContextStrokePath(UIGraphicsGetCurrentContext());
    [drawImage setFrame:CGRectMake(0, 0, 320, 568)];
    drawImage.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    lastPoint = currentPoint;
    [self.view addSubview:drawImage];
}
When running Instruments on your method, I got the following results:
What that tells me:
- Setting up a new context every time you want to draw something is a waste of time. Consider setting it up only once and storing it somewhere; that saves almost a third of the time the method currently needs.
- drawImage is the other most time-consuming part. It is enough to set that up only once!
- All other calls are almost negligible.
A sketch of what that one-time setup could look like follows below.
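To make that concrete, here is one way to apply those points. The split into viewDidLoad is my assumption about where your image view setup belongs, and I also stroke only the newest segment instead of re-adding the ever-growing path ivar on every move (that accumulation is another reason the drawing gets slower the longer you draw):

- (void)viewDidLoad {
    [super viewDidLoad];
    // One-time setup: size and add the image view here, not on every touch.
    drawImage.frame = CGRectMake(0, 0, 320, 568);
    [self.view addSubview:drawImage];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    currentPoint = [touch locationInView:self.view];

    UIGraphicsBeginImageContext(CGSizeMake(320, 568));
    [drawImage.image drawInRect:CGRectMake(0, 0, 320, 568)];
    CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
    CGContextSetLineWidth(UIGraphicsGetCurrentContext(), 5.0);
    CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), 0, 1, 0, 1);
    // Stroke only the newest segment; earlier strokes are already baked into drawImage.image.
    CGContextBeginPath(UIGraphicsGetCurrentContext());
    CGContextMoveToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
    CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), currentPoint.x, currentPoint.y);
    CGContextStrokePath(UIGraphicsGetCurrentContext());
    drawImage.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    lastPoint = currentPoint;
}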
There's a great talk about graphics performance including a demo and code walkthrough of a drawing app here:
https://developer.apple.com/videos/wwdc/2012/
The iOS App Performance: Graphics and Animations video