Issue trying to screen capture a UIView in iOS

My app has a ViewController that contains a UIView subclass (CanvasView) where you can draw your signature (I added a label "Hi Little John" to differentiate it), and below it a UIImageView that should receive a capture when a camera button is touched.
Now when I touch the camera button it only captures the UIView and the UILabel, but not the signature.
I have two classes: my UIView subclass (CanvasView) and my UIViewController. In CanvasView I have this code for the screenshot:
@implementation CanvasView

- (UIImage *)getCanvasScreenshot {
    // first, render the view's layer into a UIImage
    UIGraphicsBeginImageContext(self.bounds.size);
    [self.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *sourceImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    // now position the image X/Y away from the top-left corner to get the portion required
    UIGraphicsBeginImageContext(self.frame.size);
    [sourceImage drawAtPoint:CGPointMake(0, 0)];
    UIImage *croppedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return croppedImage;
}

// some other code goes here...

@end
In my ViewController class I have the IBAction that triggers the capture:
- (IBAction)captureSignature:(id)sender {
    self.imageFirma.image = [canvasView getCanvasScreenshot];
}
I want to capture the signature in my picture.
Any help is appreciated.
Thanks in advance.
EDIT: The code that renders the signature is in CanvasView as well. There are three touch-handling methods:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
Depending on whether the touch is just beginning (touchesBegan), moving (touchesMoved), or finished (touchesEnded), each of them renders the stroke by calling a method named:
- (void)addLineToAndRenderStroke:(SmoothStroke*)currentStroke toPoint:(CGPoint)end toWidth:(CGFloat)width toColor:(UIColor*)color
I can go deeper if you want, but it's fairly long because it is part of a framework for making strokes on the canvas. The code for all of these methods is:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    if (![JotStylusManager sharedInstance].enabled) {
        for (UITouch *touch in touches) {
            [self addLineToAndRenderStroke:[self getStrokeForTouchHash:touch.hash]
                                   toPoint:[touch locationInView:self]
                                   toWidth:[self widthForPressure:JOT_MIN_PRESSURE]
                                   toColor:[self colorForPressure:JOT_MIN_PRESSURE]];
        }
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    if (![JotStylusManager sharedInstance].enabled) {
        for (UITouch *touch in touches) {
            // check for other brands of stylus,
            // or process non-Jot touches
            //
            // for this example, we'll simply draw every touch if
            // the jot sdk is not enabled
            [self addLineToAndRenderStroke:[self getStrokeForTouchHash:touch.hash]
                                   toPoint:[touch locationInView:self]
                                   toWidth:[self widthForPressure:JOT_MIN_PRESSURE]
                                   toColor:[self colorForPressure:JOT_MIN_PRESSURE]];
        }
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    if (![JotStylusManager sharedInstance].enabled) {
        for (UITouch *touch in touches) {
            SmoothStroke *currentStroke = [self getStrokeForTouchHash:touch.hash];
            // now line to the end of the stroke
            [self addLineToAndRenderStroke:currentStroke
                                   toPoint:[touch locationInView:self]
                                   toWidth:[self widthForPressure:JOT_MIN_PRESSURE]
                                   toColor:[self colorForPressure:JOT_MIN_PRESSURE]];
            // this stroke is now finished, so add it to our completed strokes stack
            // and remove it from the current strokes, and reset our undo state if any
            [_stackOfStrokes addObject:currentStroke];
            [currentStrokes removeObjectForKey:@([touch hash])];
            [stackOfUndoneStrokes removeAllObjects];
        }
    }
}
- (void)addLineToAndRenderStroke:(SmoothStroke*)currentStroke toPoint:(CGPoint)end toWidth:(CGFloat)width toColor:(UIColor*)color {
    // fetch the current and previous elements
    // of the stroke. these will help us
    // step over their length for drawing
    AbstractBezierPathElement *previousElement = [currentStroke.segments lastObject];

    // Convert touch point from UIView referential to OpenGL one (upside-down flip)
    end.y = self.bounds.size.height - end.y;

    if (![currentStroke addPoint:end withWidth:width andColor:color]) return;

    //
    // ok, now we have the current + previous stroke segment
    // so let's set to drawing it!
    [self renderElement:[currentStroke.segments lastObject] fromPreviousElement:previousElement includeOpenGLPrep:YES];
}
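A note on a likely cause: the coordinate flip in addLineToAndRenderStroke suggests the strokes are rendered with OpenGL, and [CALayer renderInContext:] only reproduces the Core Animation layer tree, not the contents of a GL renderbuffer. A minimal alternative sketch, assuming iOS 7 or later; drawViewHierarchyInRect:afterScreenUpdates: takes a screen-style snapshot and can capture content that renderInContext: misses:
- (UIImage *)getCanvasScreenshot {
    // snapshot what is actually displayed, rather than walking the layer tree
    UIGraphicsBeginImageContextWithOptions(self.bounds.size, NO, 0.0);
    [self drawViewHierarchyInRect:self.bounds afterScreenUpdates:YES];
    UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return snapshot;
}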

Related

Draw UIBezierPath on UIImageView for image crop

I am working on a photo editing application and need the user to be able to draw a path to crop out part of the photo. I have a class that works with a UIView for drawing smooth UIBezierPath lines. However, I really need to apply this to a UIImageView so that the drawing can be done over the image. If I change the class to subclass UIImageView instead, it no longer works. Any ideas on what I can do to fix this, or better options for achieving the same goal? Below is my implementation:
#import "DrawView.h"
#implementation DrawView
{
UIBezierPath *path;
}
- (id)initWithCoder:(NSCoder *)aDecoder // (1)
{
if (self = [super initWithCoder:aDecoder])
{
[self setMultipleTouchEnabled:NO]; // (2)
[self setBackgroundColor:[UIColor whiteColor]];
path = [UIBezierPath bezierPath];
[path setLineWidth:2.0];
}
return self;
}
- (void)drawRect:(CGRect)rect // (5)
{
[[UIColor blackColor] setStroke];
[path stroke];
}
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *touch = [touches anyObject];
CGPoint p = [touch locationInView:self];
[path moveToPoint:p];
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *touch = [touches anyObject];
CGPoint p = [touch locationInView:self];
[path addLineToPoint:p]; // (4)
[self setNeedsDisplay];
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
[self touchesMoved:touches withEvent:event];
}
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
[self touchesEnded:touches withEvent:event];
}
#end
If I place a breakpoint on touchesBegan or touchesMoved, it does fire as expected.
First problem: the userInteractionEnabled property should be set to YES to receive touch events.
Second one: the drawRect: method is not called for UIImageView subclasses:
The UIImageView class is optimized to draw its images to the display.
UIImageView will not call drawRect: in a subclass. If your subclass needs
custom drawing code, it is recommended you use UIView as the base
class.
So DrawView should be a UIView subclass that draws the UIImage in drawRect: before the bezier path is drawn. Something like this (only the changed parts of the code):
// DrawView.h
@interface DrawView : UIView
@property (nonatomic, strong) UIImage *image;
@end

// DrawView.m
@implementation DrawView

- (void)drawRect:(CGRect)rect
{
    // draw the image first, then stroke the bezier path on top of it
    [self.image drawInRect:self.bounds];
    [[UIColor blackColor] setStroke];
    [path stroke];
}

@end
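A brief usage sketch (the drawView and photo names here are hypothetical): the image is assigned to the new property, and the drawn path could later serve as a clipping mask to produce the cropped image:
// assign the photo to the DrawView and let drawRect: paint it under the path
drawView.image = photo;
[drawView setNeedsDisplay];

// one possible way to extract the crop, assuming DrawView exposes its path
UIGraphicsBeginImageContextWithOptions(drawView.bounds.size, NO, 0.0);
[drawView.path addClip];
[photo drawInRect:drawView.bounds];
UIImage *cropped = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();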

CGContext and drawRect not drawing

Need a fresh pair of eyes; mine have stopped working and I can't make sense of my own code anymore...
I am trying to make a drawing app with a pen-on-paper style of drawing. Here's how it is supposed to work:
the user touches, the app grabs the location
a var is set to tell drawRect to configure the CGContext, create a new path, move to the point, etc. (because I always get errors/warnings in my NSLog whenever I do anything with the CGContext outside the drawRect method)
the var is then used to determine whether to place a dot or a line when the user lifts their finger
if the user drags, the var is changed to tell drawRect to draw a line, and [self setNeedsDisplay] is called
every time the user places their finger on the screen a timer is started; every 5 seconds (or when they lift their finger) the app 'captures' the screen contents, sticks them in an image, wipes the screen, puts the image back and continues drawing (in effect caching the image so all these lines don't have to be re-drawn)
UPDATE #1
So I re-wrote it (because it was obviously really bad and not working...) and ended up with this:
- (void)drawRect:(CGRect)rect {
    [_incrImage drawInRect:rect]; /* draw the image... this will be blank at first
        and is then supposed to be filled by the graphics context, so that when the
        view is refreshed the user is adding lines on top of an image (but to them
        it looks like a continuous drawing) */
    CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapSquare);
    CGContextSetLineJoin(UIGraphicsGetCurrentContext(), kCGLineJoinRound);
    CGContextSetLineWidth(UIGraphicsGetCurrentContext(), _brushW);
    CGContextSetStrokeColorWithColor(UIGraphicsGetCurrentContext(), [UIColor colorWithRed:_brushR green:_brushG blue:_brushB alpha:_brushO].CGColor);
    CGContextSetBlendMode(UIGraphicsGetCurrentContext(), kCGBlendModeNormal);
    CGContextAddPath(UIGraphicsGetCurrentContext(), _pathref);
    CGContextDrawPath(UIGraphicsGetCurrentContext(), kCGPathStroke);
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    _drw = 2; // set this to draw a dot if the user taps
    UITouch *touch = [touches anyObject];
    _cp = [touch locationInView:self];
    _lp = _cp;
    CGPathRelease(_pathref); /* this line and the line below clear the path so the
        app is drawing as little as possible (the idea is that as the user draws and
        lifts their finger, the drawing goes into _incrImage, the path is cleared, and
        _incrImage is applied to the screen, so as little memory as possible is used and
        the user feels as if they're adding to what they've drawn when they touch the
        device again) */
    _pathref = CGPathCreateMutable();
    CGPathMoveToPoint(_pathref, NULL, _lp.x, _lp.y);
    touch = nil;
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    _drw = 1; // the user moved their finger; this was not a tap, so they want to draw a line
    UITouch *touch = [touches anyObject];
    _cp = [touch locationInView:self];
    CGPathAddLineToPoint(_pathref, NULL, _cp.x, _cp.y);
    [self setNeedsDisplay]; // as the user moves their finger, it draws
    _lp = _cp;
    touch = nil;
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    _cp = [touch locationInView:self];
    switch (_drw) { // logic to determine what to draw when the user lifts their finger
        case 1:
            // line
            CGPathAddLineToPoint(_pathref, NULL, _cp.x, _cp.y);
            break;
        case 2:
            // dot
            CGPathAddArc(_pathref, NULL, _cp.x, _cp.y, _brushW, 0.0, 360.0, 1);
            break;
        default:
            break;
    }
    /* here's the fun bit: the graphics context doesn't seem to be going into
       _incrImage and therefore is not being displayed when the user goes to
       continue drawing after they've drawn a line/dot on the screen */
    UIGraphicsBeginImageContext(self.frame.size);
    CGContextAddPath(UIGraphicsGetCurrentContext(), _pathref); /* tried adding my drawn
        path to the context and then adding the context to the image before the user
        taps down again and the path is cleared... not sure why it isn't working. */
    [_incrImage drawAtPoint:CGPointZero];
    _incrImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    [self setNeedsDisplay]; // finally, refresh the contents
    touch = nil;
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    [self touchesEnded:touches withEvent:event];
}
My new problem is that everything is erased when drawRect is called, and '_incrImage' never gets the contents of the current graphics context, so nothing is displayed.
--- old code, I guess no longer needed, but keeping it here for reference ---
Here is the relevant code:
- (void)drawRect:(CGRect)rect {
    /* REFERENCE: _drw: 0 = clear everything
                        1 = cache the screen contents in the image and display that
                            so the device doesn't have to re-draw everything
                        2 = set up a new path
                        3 = draw lines instead of a dot at the current point
                        4 = draw a 'dot' instead of a line
    */
    CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapSquare);
    CGContextSetLineJoin(UIGraphicsGetCurrentContext(), kCGLineJoinRound);
    CGContextSetLineWidth(UIGraphicsGetCurrentContext(), _brushW);
    CGContextSetStrokeColorWithColor(UIGraphicsGetCurrentContext(), [UIColor colorWithRed:_brushR green:_brushG blue:_brushB alpha:_brushO].CGColor);
    CGContextSetBlendMode(UIGraphicsGetCurrentContext(), kCGBlendModeNormal);
    switch (_drw) {
        case 0:
            // clear everything... this is the first draw... it can also be called to clear the view
            UIGraphicsBeginImageContext(self.frame.size);
            CGContextSetBlendMode(UIGraphicsGetCurrentContext(), kCGBlendModeClear);
            CGContextSetFillColorWithColor(UIGraphicsGetCurrentContext(), [UIColor clearColor].CGColor);
            CGContextFillRect(UIGraphicsGetCurrentContext(), self.frame);
            CGContextFlush(UIGraphicsGetCurrentContext());
            _incrImage = UIGraphicsGetImageFromCurrentImageContext();
            UIGraphicsEndImageContext();
            [_incrImage drawAtPoint:CGPointZero];
            [_incrImage drawInRect:rect];
            break;
        case 1:
            // capture the screen content and stick it in _incrImage...
            // then apply _incrImage to the screen so the user can continue drawing on top of it
            _incrImage = UIGraphicsGetImageFromCurrentImageContext();
            [_incrImage drawAtPoint:CGPointZero];
            [_incrImage drawInRect:rect];
            break;
        case 2:
            // begin the path and set everything up; this is called when touchesBegan: fires...
            _incrImage = UIGraphicsGetImageFromCurrentImageContext();
            [_incrImage drawAtPoint:CGPointZero];
            [_incrImage drawInRect:rect];
            CGContextBeginPath(UIGraphicsGetCurrentContext());
            CGContextMoveToPoint(UIGraphicsGetCurrentContext(), _p.x, _p.y);
            break;
        case 3:
            // add lines; this happens after the path is created and set... fired when touchesMoved:
            // gets activated and _drw is set to draw lines instead of adding dots
            CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), _p.x, _p.y);
            CGContextStrokePath(UIGraphicsGetCurrentContext());
            break;
        case 4:
            // fired when touchesEnded: is activated... draws a 'dot', i.e. an arc with a fill...
            CGContextMoveToPoint(UIGraphicsGetCurrentContext(), _p.x, _p.y);
            CGContextAddArc(UIGraphicsGetCurrentContext(), _p.x, _p.y, _brushW, 0.0, 360.0, 1);
            CGContextFillPath(UIGraphicsGetCurrentContext());
            CGContextFlush(UIGraphicsGetCurrentContext());
            break;
        default:
            break;
    }
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    dTimer = [NSTimer timerWithTimeInterval:5.0 target:self selector:@selector(timeUP) userInfo:nil repeats:YES];
    _drw = 2;
    UITouch *touch = [touches anyObject];
    _p = [touch locationInView:self];
    [self setNeedsDisplay];
    _drw = 4;
}

- (void)timeUP {
    _drw = 1;
    [self setNeedsDisplay];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    _drw = 3;
    UITouch *touch = [touches anyObject];
    _p = [touch locationInView:self];
    [self setNeedsDisplay];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event // (2)
{
    [dTimer invalidate];
    UITouch *touch = [touches anyObject];
    _p = [touch locationInView:self];
    [self setNeedsDisplay];
    [self timeUP];
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    [self touchesEnded:touches withEvent:event];
}
My questions are:
Is this efficient, or is there a better way to do this?
and
Why is this not drawing? I get nothing, but I can see memory being used...
Egad.
Your first problem is that setNeedsDisplay doesn’t draw immediately, it just marks the view to be drawn at the end of the event. So, when you set _drw=2 and call setNeedsDisplay and then set _drw=4, it’s only going to actually call -drawRect: with _drw=4 (if that, since that still might not be the end of the current event).
But, also don’t, uh, use that _drw switch thing. That’s not good.
You want to create an image and draw into the image as the touches happen, and then in drawRect: just blat the image to the screen. If you ever find yourself calling UIGraphicsGetImageFromCurrentImageContext() inside -drawRect:, you are doing things backwards (as you are here). Don't slurp the image from the screen; create an image that you blat to the screen.
The screen should never be your 'state'. That way lies madness.
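A minimal sketch of that pattern, reusing the _incrImage, _brush*, _lp and _cp ivars from the question: each touch strokes only the newest segment into the image, and drawRect: does nothing but draw the image:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint p = [[touches anyObject] locationInView:self];

    // render the new segment on top of everything drawn so far
    UIGraphicsBeginImageContextWithOptions(self.bounds.size, NO, 0.0);
    [_incrImage drawInRect:self.bounds];
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    CGContextSetLineCap(ctx, kCGLineCapRound);
    CGContextSetLineWidth(ctx, _brushW);
    CGContextSetStrokeColorWithColor(ctx, [UIColor colorWithRed:_brushR green:_brushG blue:_brushB alpha:_brushO].CGColor);
    CGContextMoveToPoint(ctx, _lp.x, _lp.y);
    CGContextAddLineToPoint(ctx, p.x, p.y);
    CGContextStrokePath(ctx);
    _incrImage = UIGraphicsGetImageFromCurrentImageContext(); // accumulate the drawing
    UIGraphicsEndImageContext();

    _lp = p;
    [self setNeedsDisplay];
}

- (void)drawRect:(CGRect)rect {
    // pure redrawing: just blat the accumulated image to the screen
    [_incrImage drawInRect:self.bounds];
}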
On top of the answer from @WilShipley, which I agree with (don't put state-management logic in drawRect:, only pure redrawing logic), you are currently never drawing anything other than a minuscule line, because CGContextStrokePath clears the current path from the context.
The context isn't intended to be a temporary cache of incomplete drawing operations; it's your portal to the screen (or a backing image / PDF file / ...). You need to build your drawing state outside drawRect: and then render it to the screen inside drawRect:.
To ease performance issues, only redraw the area around the newest touch (or between the newest and previous touch).
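A small sketch of that last point, assuming _lp and _cp hold the previous and current touch points and _brushW is the stroke width; only the rectangle spanning the newest segment is invalidated instead of the whole view:
CGRect dirty = CGRectUnion(CGRectMake(_lp.x, _lp.y, 1, 1),
                           CGRectMake(_cp.x, _cp.y, 1, 1));
dirty = CGRectInset(dirty, -_brushW, -_brushW); // allow for the stroke width
[self setNeedsDisplayInRect:dirty];             // redraw only the dirty area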

How to erase the drawing using button click?

I am having some trouble with a drawing app in iOS. I created free-hand drawing with the help of some tutorials, but I am stuck on erasing the drawing. In my app I have a button with an eraser as its background image. After the eraser button is tapped, swiping over the drawing should erase it wherever I swipe. Can anyone help me do this?
Thanks in advance.
Given below is my code:
@implementation LinearInterpView
{
    UIBezierPath *path;
}

- (id)initWithFrame:(CGRect)frame
{
    self = [super initWithFrame:frame];
    if (self) {
        // Initialization code
    }
    return self;
}

- (id)initWithCoder:(NSCoder *)aDecoder {
    if (self = [super initWithCoder:aDecoder]) {
        [self setMultipleTouchEnabled:YES];
        [self setBackgroundColor:[UIColor whiteColor]];
        path = [UIBezierPath bezierPath];
        [path setLineWidth:2.0];
    }
    return self;
}

- (void)drawRect:(CGRect)rect {
    [[UIColor blackColor] setStroke];
    [path stroke];
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint p = [touch locationInView:self];
    [path moveToPoint:p];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint p = [touch locationInView:self];
    [path addLineToPoint:p];
    [self setNeedsDisplay];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self touchesMoved:touches withEvent:event];
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    [self touchesEnded:touches withEvent:event];
}

// This is the button action to erase the drawing.
- (IBAction)erase:(id)sender {
    CGContextRef cgref = UIGraphicsGetCurrentContext();
    CGContextSetBlendMode(cgref, kCGBlendModeClear);
}
Could you kindly point out what mistake I made?
So by drawing you mean you have drawn lines on the screen, say with some color; you can do the same by setting the color to white with alpha 1, so that white lines replace the existing colored lines. A better tutorial is here. This also seemed important.
First of all, your logic should be to make a layer on the ImageView.
Then you can draw on that layer and pass a white color to erase.
It will look like erasing, and your view will look the way the requirement describes.
That will surely work.
Try this to erase a drawing in iOS:
- (IBAction)eraserPressed:(id)sender {
    red = 255.0/255.0;
    green = 255.0/255.0;
    blue = 255.0/255.0;
    opacity = 1.0;
}
Why not implement the same logic in the eraser button as you did in the draw button? Just make the default stroke color in the eraser white, or whatever color your background is.
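A minimal sketch of that "paint with the background color" idea, assuming a hypothetical erasing BOOL property added to LinearInterpView, a hypothetical drawingView outlet, and a white background (note that with a single shared path this recolors the whole path; a fuller eraser would keep one path/color pair per stroke):
// in LinearInterpView: pick the stroke color based on the (hypothetical) erasing flag
- (void)drawRect:(CGRect)rect {
    [(self.erasing ? [UIColor whiteColor] : [UIColor blackColor]) setStroke];
    [path stroke];
}

// in the view controller: the eraser button simply flips the flag
- (IBAction)erase:(id)sender {
    self.drawingView.erasing = YES;
}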

Why can't I drag images?

I am working off of http://www.raywenderlich.com/33806/how-to-make-a-letterword-game-with-uikit-part-2, and in reference to the view it says to add the following to the implementation:
int _xOffset, _yOffset;
For the initializer:
self.userInteractionEnabled = YES;
It also gives three methods to implement to handle dragging:
#pragma mark - dragging the tile

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint pt = [[touches anyObject] locationInView:self.superview];
    _xOffset = pt.x - self.center.x;
    _yOffset = pt.y - self.center.y;
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint pt = [[touches anyObject] locationInView:self.superview];
    self.center = CGPointMake(pt.x - _xOffset, pt.y - _yOffset);
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self touchesMoved:touches withEvent:event];
}
Nothing seems to happen, though, when I touch one of the images and try to drag it around; it never moves.
There is more work to do after this; I want to do something specific based on where a dragged image is dropped. But right now I'm working on the smoke test, and I cannot see any change in the UI behavior from before I added this code.
Thanks for any help.
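One thing worth double-checking (an assumption, since the tile class itself isn't shown): if the touches actually land on a UIImageView, either the tile itself or a child image view inside it, UIImageView has user interaction disabled by default, so the touch methods above would never fire for it:
// UIImageView ships with userInteractionEnabled == NO, unlike plain UIViews
UIImageView *tileImageView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"tile"]]; // hypothetical tile image
tileImageView.userInteractionEnabled = YES; // without this, touchesBegan: is never delivered to it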

cocos2d for iphone touch detection on a sprite

I am using the ccTouchesBegan and ccTouchesEnded methods to register touches. Everything was fine until I placed some buttons (CCMenuItems) on my node. Now when I place my finger down on a button and then move it to any other place on the screen, the ccTouchesEnded method does not get called. What am I doing wrong? How can I detect touches on a button?
some code:
- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint location = [self convertTouchToNodeSpace:[touches anyObject]];
    // here I check if the touch is in the right place
    if ([self ptInRect:location :CGRectMake(0, center.y - 160, winSize.width, 40)]) {
        dragBeginLocation = location;
    }
}

- (void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint location = [self convertTouchToNodeSpace:[touches anyObject]];
    if (ABS(location.x - dragBeginLocation.x) < 8) {
        NSLog(@"TAP");
    } else {
        NSLog(@"SWIPE");
    }
}
So, when I begin a touch on a sprite and release it on the background, I get nothing in the console.
The buttons are in a CCMenu, which sits above the background.
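For what it's worth, CCMenu registers itself as a targeted touch delegate that swallows a touch which begins on a menu item, which is why the layer's ccTouchesEnded: never fires in that case. A rough sketch, assuming cocos2d-iphone 1.x (on 2.x the dispatcher is reached via [[CCDirector sharedDirector] touchDispatcher]), of letting the layer observe touches ahead of the menu without swallowing them:
// register the layer as a targeted delegate with higher priority than CCMenu
// (requires self.isTouchEnabled = YES so registerWithTouchDispatcher is called)
- (void)registerWithTouchDispatcher {
    [[CCTouchDispatcher sharedDispatcher] addTargetedDelegate:self
                                                     priority:kCCMenuHandlerPriority - 1
                                              swallowsTouches:NO];
}

- (BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event {
    dragBeginLocation = [self convertTouchToNodeSpace:touch];
    return YES; // claim the touch so ccTouchEnded: is delivered to this layer
}

- (void)ccTouchEnded:(UITouch *)touch withEvent:(UIEvent *)event {
    CGPoint location = [self convertTouchToNodeSpace:touch];
    NSLog(@"%@", ABS(location.x - dragBeginLocation.x) < 8 ? @"TAP" : @"SWIPE");
}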
