How can I make a bokeh effect with Core Graphics? - iOS

I want to create an app with a paintbrush that adds a bokeh effect to an image as the finger moves. Here is the code.
#import "fingerDrawView.h"
////create bokeh image
-(UIImage*)drawCircle{
///1) create a bitmap context
UIGraphicsBeginImageContext(self.bounds.size);
///2) get the context
CGContextRef circleContext = UIGraphicsGetCurrentContext();
CGContextSetLineWidth(circleContext, 3.0f);
//circle1
CGContextSetFillColorWithColor(circleContext, [UIColor colorWithRed:0.5 green:1 blue:0.5 alpha:0.4].CGColor);
CGRect circle1Point = CGRectMake(0, 0, 80, 80);///// When play it in simulator, it look smaller than this size. I don’t know why???
CGContextFillEllipseInRect(circleContext, circle1Point);
CGContextSetStrokeColorWithColor(circleContext, [UIColor colorWithRed:0.3 green:0.9 blue:0 alpha:0.6].CGColor);
CGContextStrokeEllipseInRect(circleContext, circle1Point);
////4) export the context into an image
UIImage *circleImage = UIGraphicsGetImageFromCurrentImageContext();
//// 5) destroy the context
UIGraphicsEndImageContext();
return circleImage;
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        _imageBuffer = [self drawCircle];
        dispatch_async(dispatch_get_main_queue(), ^{
            CGPoint touchPoint = [touch locationInView:self];
            CGPoint prev_touchPoint = [touch previousLocationInView:self];
            if (ABS(touchPoint.x - prev_touchPoint.x) > 6
                || ABS(touchPoint.y - prev_touchPoint.y) > 6) {
                _aImageView = [[UIImageView alloc] initWithImage:_imageBuffer];
                _aImageView.multipleTouchEnabled = YES;
                _aImageView.userInteractionEnabled = YES;
                [_aImageView setFrame:CGRectMake(touchPoint.x, touchPoint.y, 100.0, 100.0)];
                [self addSubview:_aImageView];
            }
        });
    });
}
It works in the simulator. However, it crashes when run on a device (iPad 4); the console reports "received memory warning". I moved the bokeh drawing onto GCD, but that didn't help.
By the way, I want the bokeh image to be 80×80 (see -(UIImage*)drawCircle). When I run it in the simulator, it looks smaller than that size.
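A minimal sketch of one way to address both issues, assuming the 80×80 target is the whole brush: size the bitmap context to the circle itself rather than to self.bounds (with a view-sized image squeezed into a 100×100 image view, the 80×80 circle gets scaled down, which would explain why it looks smaller), and build the image once instead of re-rendering it and adding a new subview on every touch:

// Sketch only: sizes the context to the brush and caches the result in the
// existing _imageBuffer ivar, so touchesMoved: no longer needs the GCD call.
- (UIImage *)bokehBrushImage {
    if (_imageBuffer) {
        return _imageBuffer; // reuse the cached brush; re-rendering per touch exhausts memory
    }
    CGRect brushRect = CGRectMake(0, 0, 80, 80);
    // Scale 0 means "use the screen scale", so the brush stays sharp on Retina displays.
    UIGraphicsBeginImageContextWithOptions(brushRect.size, NO, 0);
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    CGContextSetLineWidth(ctx, 3.0f);
    // Inset so the 3 pt stroke is not clipped at the bitmap's edges.
    CGRect circleRect = CGRectInset(brushRect, 2, 2);
    CGContextSetFillColorWithColor(ctx, [UIColor colorWithRed:0.5 green:1 blue:0.5 alpha:0.4].CGColor);
    CGContextFillEllipseInRect(ctx, circleRect);
    CGContextSetStrokeColorWithColor(ctx, [UIColor colorWithRed:0.3 green:0.9 blue:0 alpha:0.6].CGColor);
    CGContextStrokeEllipseInRect(ctx, circleRect);
    _imageBuffer = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return _imageBuffer;
}

With a cached brush there is no need for the background queue at all, and setting the image view's frame to 80×80 instead of 100×100 keeps the drawn size and the on-screen size in agreement.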

Related

How to change the shape of a UIBezierPath drawing line?

Is there any way to change the UIBezierPath drawing shape? See the image below: it draws a line as the user drags a finger, but I want stars, circles, and other shapes. Is there any way to achieve that?
My expectation is shown in the second image.
This is my UIBezierPath code:
-(void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event
{
UITouch *touch = [[event allTouches] anyObject];
touchPoint = [touch locationInView:self];
if (!CGPointEqualToPoint(startingPoint, CGPointZero))
{
UIBezierPath *path = [UIBezierPath bezierPath];
[path moveToPoint:CGPointMake(touchPoint.x,touchPoint.y)];
[path addLineToPoint:CGPointMake(startingPoint.x,startingPoint.y)];
CAShapeLayer *shapeLayer = [CAShapeLayer layer];
shapeLayer.path = [path CGPath];
shapeLayer.strokeColor = [single.arrColor[single.i] CGColor];
if([UIDevice currentDevice].userInterfaceIdiom ==UIUserInterfaceIdiomPad)
{
shapeLayer.lineWidth = 7.0;
}
else
{
shapeLayer.lineWidth = 5.0;
}
shapeLayer.fillColor = [[UIColor redColor] CGColor];
[self.layer addSublayer:shapeLayer];
[clearBeizer addObject:shapeLayer];
}
startingPoint=touchPoint;
// [arrLayer addObject:shapeLayer];
NSLog(#"Touch moving point =x : %f Touch moving point =y : %f", touchPoint.x, touchPoint.y);
}
Yes, it is doable, but it is not trivial. What you essentially want is to stroke a path with stars instead of normal dashes. As far as I know, iOS only provides an API for the standard method, i.e. stroking with a rectangular dash pattern.
If you want custom stroking, you have to implement it yourself. You probably have to flatten the bezier path first, then "walk" along the path and draw stars/circles/squirrels at a certain interval manually, as sketched below. It is especially difficult if you need the interval between the stars to be equal.
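A rough sketch of the "walk and stamp" idea (the method name and spacing parameter are my own, not from any library): given one flattened segment, place a stamp image at a fixed interval along it, then repeat for every segment of the flattened path:

// Must be called while a drawing context is active (drawRect: or an image context).
- (void)stampImage:(UIImage *)stamp
              from:(CGPoint)a
                to:(CGPoint)b
           spacing:(CGFloat)spacing
{
    CGFloat dx = b.x - a.x, dy = b.y - a.y;
    CGFloat length = hypot(dx, dy);
    if (length < 0.5) return;
    if (spacing < 1) spacing = 1;               // guard against an infinite loop
    CGFloat ux = dx / length, uy = dy / length; // unit direction along the segment
    for (CGFloat d = 0; d <= length; d += spacing) {
        CGPoint p = CGPointMake(a.x + ux * d, a.y + uy * d);
        // drawAtPoint: positions the top-left corner, so center the stamp on p.
        [stamp drawAtPoint:CGPointMake(p.x - stamp.size.width / 2,
                                       p.y - stamp.size.height / 2)];
    }
}

Keeping the spacing equal across segments needs extra bookkeeping: carry the leftover distance from the end of one segment over into the next one.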
You can have a look at the DrawKit library for macOS. DrawKit is for macOS, not iOS! It is just a reference to help you get the idea.
DrawKit has an NSBezierPath+Geometry.h category on the NSBezierPath class. You can start with the (NSBezierPath*)bezierPathWithZig:(CGFloat)zig zag:(CGFloat)zag method and see how the zig-zaggy path is implemented:
https://github.com/DrawKit/DrawKit/.../NSBezierPath-Geometry.m#L1206
or the wavy path method [(NSBezierPath*)bezierPathWithWavelength:amplitude:spread:]:
https://github.com/DrawKit/DrawKit/..../NSBezierPath-Geometry.m#L1270
Just FYI: UIBezierPath (iOS) often lacks methods that are available in NSBezierPath (macOS).
If DrawKit confuses you, there are probably open-source drawing libraries for iOS on the Internet; try searching for them and see how custom drawing is done.
Yes, you can do that, but you have to get custom-shaped icons for this task.
You can try the excellent answer provided by RobMayoff here, along with its git repo.
Here is another way to do it:
I have made a simple image-editing app similar to what you are doing.
You can draw heart shapes on the image:
Like that, you can draw many custom shapes:
The code is pretty straightforward and simple:
You need to create a few custom-shaped erasers. I call them erasers because they just erase the pic :P.
Here are the methods for customising the eraser:
- (void)newMaskWithColor:(UIColor *)color eraseSpeed:(CGFloat)speed {
    wipingInProgress = NO;
    eraseSpeed = speed; // how fast the eraser should move
    maskColor = color;  // eraser color
    [self setNeedsDisplay];
}

- (void)setErase:(UIImage *)img {
    eraser = img; // set the custom-shaped image here
}
And to draw the custom-shaped eraser on the view:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    wipingInProgress = YES;
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    if ([touches count] == 1) {
        UITouch *touch = [touches anyObject];
        location = [touch locationInView:self];
        location.x -= [eraser size].width/2;
        location.y -= [eraser size].height/2; // center vertically using the height, not the width
        [self setNeedsDisplay];
    }
}
And finally, the drawRect: method:
- (void)drawRect:(CGRect)rect {
    CGContextRef context = UIGraphicsGetCurrentContext();
    if (wipingInProgress) {
        if (imageRef) {
            // Restore the screen that was previously saved
            CGContextTranslateCTM(context, 0, rect.size.height);
            CGContextScaleCTM(context, 1.0, -1.0);
            CGContextDrawImage(context, rect, imageRef);
            CGImageRelease(imageRef);
            CGContextTranslateCTM(context, 0, rect.size.height);
            CGContextScaleCTM(context, 1.0, -1.0);
        }
        [eraser drawAtPoint:location blendMode:kCGBlendModeDestinationOut alpha:eraseSpeed];
    }
    // Save the screen to restore next time around
    imageRef = CGBitmapContextCreateImage(context);
}
Here are some variables declared in the .h file:
CGPoint location;
CGImageRef imageRef;
UIImage *eraser;
BOOL wipingInProgress;
UIColor *maskColor;
CGFloat eraseSpeed;
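For completeness, a hypothetical usage sketch; the view class name ScratchView and the image asset are assumptions, not from the original code:

// Wire up the eraser view with a custom-shaped brush image.
ScratchView *scratchView = [[ScratchView alloc] initWithFrame:self.view.bounds];
[scratchView setErase:[UIImage imageNamed:@"heart_brush.png"]];
[scratchView newMaskWithColor:[UIColor blackColor] eraseSpeed:0.8];
[self.view addSubview:scratchView];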
I simply did this using a UIImageView:
UIImageView *imageView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:single.arrimages[single.colorimages]]];
imageView.frame = CGRectMake(touchPoint.x, touchPoint.y, 30,30);
[self addSubview:imageView];
Just add an image when the user touches the screen: get the x, y coordinates of the touch and use them for the UIImageView's frame.
I hope this helps someone.

Drawing with finger on UIImageView that does not cover entire screen

I have managed to get drawing on top of a recently captured UIImage working, but it only works when the UIImageView covers the entire screen. It begins to lose accuracy the moment I start drawing away from the horizontal center. Any idea how I can make it work on the image view alone and keep it accurate?
- (void)viewDidLoad
{
    [super viewDidLoad];
    float height = (self.view.frame.size.width * self.chosenImage.size.height) / self.chosenImage.size.width;
    self.imageView.frame = CGRectMake(0, self.navigationController.navigationBar.frame.size.height, self.view.frame.size.width, height);
    self.imageView.center = self.imageView.superview.center;
    self.imageView.image = self.chosenImage;
}

- (UIImage *)drawLineFromPoint:(CGPoint)from_Point toPoint:(CGPoint)to_Point image:(UIImage *)image
{
    CGSize sizeOf_Screen = self.imageView.frame.size;
    UIGraphicsBeginImageContext(sizeOf_Screen);
    CGContextRef current_Context = UIGraphicsGetCurrentContext();
    [image drawInRect:CGRectMake(0, 0, sizeOf_Screen.width, sizeOf_Screen.height)];
    CGContextSetLineCap(current_Context, kCGLineCapRound);
    CGContextSetLineWidth(current_Context, 1.0);
    CGContextSetRGBStrokeColor(current_Context, 1, 0, 0, 1);
    CGContextBeginPath(current_Context);
    CGContextMoveToPoint(current_Context, from_Point.x, from_Point.y);
    CGContextAddLineToPoint(current_Context, to_Point.x, to_Point.y);
    CGContextStrokePath(current_Context);
    UIImage *rect = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return rect;
}

- (void)touchesBegan:(NSSet *)_touches withEvent:(UIEvent *)_event
{
    // retrieve the touch point
    UITouch *_touch = [_touches anyObject];
    CGPoint current_Point = [_touch locationInView:self.imageView];
    // record the touch points to use as input to our line-smoothing algorithm
    self.drawnPoints = [NSMutableArray arrayWithObject:[NSValue valueWithCGPoint:current_Point]];
    self.previousPoint = current_Point;
    // save the unmodified image so we can replace the jagged polylines with smooth ones
    self.cleanImage = self.imageView.image;
}

- (void)touchesMoved:(NSSet *)_touches withEvent:(UIEvent *)_event
{
    UITouch *_touch = [_touches anyObject];
    CGPoint current_Point = [_touch locationInView:self.imageView];
    [self.drawnPoints addObject:[NSValue valueWithCGPoint:current_Point]];
    self.imageView.image = [self drawLineFromPoint:self.previousPoint toPoint:current_Point image:self.imageView.image];
    self.previousPoint = current_Point;
}
EDIT: With these settings I get accuracy within the UIImageView, but the image blurs progressively as I draw.
When I change [image drawInRect:CGRectMake(0, 0, sizeOf_Screen.width, sizeOf_Screen.height)]; to [image drawInRect:CGRectMake(0, self.imageView.frame.origin.y, sizeOf_Screen.width, sizeOf_Screen.height)]; the image flies out of the view as soon as I try to draw.
I've tried tweaking the sizeOf_Screen value and the CGRectMake of the image in the drawLineFromPoint method. I've also set locationInView: to self.imageView in both touch handlers, but to no avail. It's very strange. Any ideas or code samples that might accomplish what I'm looking for? I can't seem to find anything.
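A likely cause of the progressive blur (an assumption, not a confirmed fix for this project): UIGraphicsBeginImageContext always creates a scale-1.0 context, so on a Retina screen every draw-and-read-back round trip through drawLineFromPoint:toPoint:image: resamples the image downward. A minimal sketch of the same render method using a screen-scale context instead:

- (UIImage *)drawLineFromPoint:(CGPoint)from_Point toPoint:(CGPoint)to_Point image:(UIImage *)image
{
    CGSize size = self.imageView.bounds.size;
    // Scale 0 means "use the main screen's scale", so repeated round trips
    // through this method no longer down-sample the image on Retina devices.
    UIGraphicsBeginImageContextWithOptions(size, NO, 0);
    [image drawInRect:CGRectMake(0, 0, size.width, size.height)];
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    CGContextSetLineCap(ctx, kCGLineCapRound);
    CGContextSetLineWidth(ctx, 1.0);
    CGContextSetRGBStrokeColor(ctx, 1, 0, 0, 1);
    CGContextMoveToPoint(ctx, from_Point.x, from_Point.y);
    CGContextAddLineToPoint(ctx, to_Point.x, to_Point.y);
    CGContextStrokePath(ctx);
    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return result;
}

Keeping the touch coordinates in the image view's own coordinate space (as the touch handlers above already do), together with a context sized to the image view's bounds, should keep the drawing accurate regardless of where the image view sits on screen.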

Scratch an SKNode and reveal the SKNode underneath it, an effect like wiping misted glass

Basically I want to achieve something like this in SpriteKit: https://github.com/moqod/iOS-Scratch-n-See.
In an SKScene, I have a car image, 2-3 different layers of dirt images, one layer of soap, and one layer of water droplets. By layers I mean they are all UIImages the size of the car's frame (which I can eventually use as SKTextures and SKNodes).
The project mentioned above stacks UIImageViews on top of one another and then erases the images.
I need to manage many layers: if the soap tool is selected, I want to bring up the dirt image layer, erase the dirt image wherever the user touches, and place the semi-transparent soap image below it, which then becomes visible, with the car image below that.
After merging them (half-erased/half-present dirt + soap + car image) I get another image and display it on top, so it looks to the user as if he is applying soap to the car and removing dirt.
I hope you can see what I am trying to explain.
I want to take the project mentioned above and achieve these tasks in SpriteKit.
I can't use zPosition to bring images forward and push them back, because it works only on SKSpriteNodes, and the example above is coded with UIKit (UIImages) to erase images, not nodes.
I can't stack transparent SKScenes on top of one another (see: Making a SKScene's background transparent not working... is this a bug?) the way UIImageViews are stacked in that project, because I am working on iOS 7 and want my application to be compatible with it.
The last resort would be to drop SpriteKit and work in UIKit.
Is there any logic to swipe over an SKSpriteNode and make the swiped area transparent by changing its alpha value or something?
Any help or suggestions are most welcome. Thank you.
You can implement a "scratch and see" using Sprite Kit's SKCropNode. An SKCropNode applies a mask to its children nodes. The mask is used to hide part or all of the crop node's children. For this app, the child node is the image you would like to uncover by "scratching."
The basic steps:
1. Start with an empty image as the texture for the mask.
2. Add circles to the mask where the user touches the hidden image, to uncover the picture below.
Here's an example of how to do that:
First, define these properties
#property UIImage *image;
#property SKSpriteNode *maskNode;
#property SKNode *node;
then add the contents of the scene to didMoveToView.
- (void)didMoveToView:(SKView *)view {
    self.node = [SKNode node];
    _node.name = @"tree";

    // Create a node that will hold the image that's hidden and later uncovered by "scratching"
    CGPoint position = CGPointMake(CGRectGetWidth(self.frame)/2, CGRectGetHeight(self.frame)/2);
    SKSpriteNode *imageNode = [SKSpriteNode spriteNodeWithImageNamed:@"hidden_pic.png"];
    imageNode.position = CGPointZero;
    CGSize size = imageNode.size;

    // This is the layer that you "scratch" off
    SKSpriteNode *background = [SKSpriteNode spriteNodeWithColor:[SKColor grayColor] size:size];
    background.position = position;
    background.name = @"background";
    [_node addChild:background];

    // This is the mask node. Initialize it with an empty image, so it completely hides the image
    UIImage *image = [self blankImageWithSize:size];
    self.image = image;
    SKTexture *texture = [SKTexture textureWithImage:image];
    SKSpriteNode *maskNode = [SKSpriteNode spriteNodeWithTexture:texture];
    maskNode.position = CGPointZero;
    maskNode.name = @"mask";
    self.maskNode = maskNode;
    [_node addChild:maskNode];

    // This is the node that crops its children
    SKCropNode *cropNode = [SKCropNode node];
    cropNode.position = position;
    cropNode.maskNode = maskNode;
    cropNode.zPosition = 100;
    cropNode.name = @"crop";
    [_node addChild:cropNode];
    [cropNode addChild:imageNode];
    [self addChild:_node];
}
This creates an empty image. It is used as the initial mask image so that the picture is completely hidden.
- (UIImage *)blankImageWithSize:(CGSize)size
{
    UIGraphicsBeginImageContext(size);
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
This method draws a circle on an image at a specified point. It is used to update the mask node's image. Each circle drawn on the mask uncovers more of the hidden picture.
#define kCircleRadius 22

- (UIImage *)imageByDrawingCircleOnImage:(UIImage *)image atPoint:(CGPoint)point
{
    UIGraphicsBeginImageContext(image.size);
    [image drawAtPoint:CGPointZero];
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextScaleCTM(context, 1, -1);
    CGContextTranslateCTM(context, 0, -image.size.height);
    CGRect rect = CGRectMake(point.x - kCircleRadius, point.y - kCircleRadius,
                             kCircleRadius*2, kCircleRadius*2);
    UIBezierPath *roundedRectanglePath = [UIBezierPath bezierPathWithOvalInRect:rect];
    [[UIColor blackColor] setFill];
    [roundedRectanglePath fill];
    CGContextAddPath(context, roundedRectanglePath.CGPath);
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
This method converts the specified point to the mask node's coordinates, calls a method to draw a circle in the mask node, and updates the mask node's texture.
- (void)drawCircleInImageAtPoint:(CGPoint)point
{
    CGPoint location = [self convertPoint:point toNode:_maskNode];
    location = CGPointMake(location.x + _maskNode.size.width/2, location.y + _maskNode.size.height/2);
    UIImage *newImage = [self imageByDrawingCircleOnImage:_image atPoint:location];
    SKTexture *texture = [SKTexture textureWithImage:newImage];
    self.image = newImage;
    _maskNode.texture = texture;
}
These methods handle touch events. They add circles to the mask node's image where the user touched the screen.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    /* Called when a touch begins */
    for (UITouch *touch in touches) {
        CGPoint location = [touch locationInNode:self];
        NSArray *nodes = [self nodesAtPoint:location];
        for (SKNode *node in nodes) {
            if ([node.name isEqualToString:@"crop"]) {
                [self drawCircleInImageAtPoint:location];
            }
        }
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    /* Called when a touch moves */
    for (UITouch *touch in touches) {
        CGPoint location = [touch locationInNode:self];
        NSArray *nodes = [self nodesAtPoint:location];
        for (SKNode *node in nodes) {
            if ([node.name isEqualToString:@"crop"]) {
                [self drawCircleInImageAtPoint:location];
            }
        }
    }
}

renderInContext: is incredibly slow on iOS 7

I am implementing "scratch" functionality in my application: the user scratches the screen and sees the image below.
In touchesMoved: I update the mask image and apply it to the layer. In general, the code is like this:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesMoved:touches withEvent:event];
    UITouch *touch = [touches anyObject];
    CGPoint cPoint = [touch locationInView:self];
    UIImageView *imageView = [[UIImageView alloc] initWithFrame:self.bounds];
    imageView.image = _maskImage;
    // ... add some subviews to imageView corresponding to the touch ...
    _maskImage = [UIImage imageFromLayer:imageView.layer];
    [self setNeedsDisplay];
}

- (void)drawRect:(CGRect)rect
{
    _maskImageView.image = _maskImage;
    _viewWithOurImage.layer.mask = _maskImageView.layer;
}
I get a UIImage from the CALayer using this code (a category on UIImage):
+ (UIImage *)imageFromLayer:(CALayer *)layer
{
    UIGraphicsBeginImageContextWithOptions([layer frame].size, NO, 0);
    [layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *outputImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return outputImage;
}
This code works perfectly on iOS 6 (tested on an iPhone 4S and iPad 2), with no lag at all.
But when I run it on iOS 7 (Xcode 4 or Xcode 5), it is awfully slow and laggy. I used the Time Profiler, and it clearly points to the renderInContext: line.
Then I tried the following code:

...
if (SYSTEM_VERSION_LESS_THAN(@"7.0"))
    _maskImage = [UIImage imageFromLayer:imageView.layer];
else
    _maskImage = [UIImage imageFromViewIniOS7:imageView];
...
+ (UIImage *)imageFromViewIniOS7:(UIView *)view
{
    UIGraphicsBeginImageContextWithOptions(view.frame.size, NO, 0);
    CGContextSetInterpolationQuality(UIGraphicsGetCurrentContext(), kCGInterpolationNone);
    // actually there is an NSInvocation here, but I've shortened it for the example
    [view drawViewHierarchyInRect:view.bounds afterScreenUpdates:YES];
    UIImage *outputImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return outputImage;
}
And it is still very slow. Tested on an iPhone 4S (the same device as on iOS 6), a new iPod touch 5, and an iPad 3.
What am I doing wrong? Obviously it's a problem with iOS 7...
I would appreciate any suggestions.
I would suggest you try some other approach; sorry to say, touchesMoved works slowly on iOS 7. There's nothing wrong with your code.
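One alternative worth trying (my own suggestion, not part of the original answer): skip the offscreen rendering entirely and use a CAShapeLayer as the mask, extending a single UIBezierPath on each touch. Only the path changes per move, so no renderInContext: call is needed at all. A minimal sketch, assuming scratchPath and maskLayer properties and the _viewWithOurImage ivar from the question:

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesMoved:touches withEvent:event];
    CGPoint cPoint = [[touches anyObject] locationInView:self];
    if (!self.scratchPath) {
        self.scratchPath = [UIBezierPath bezierPath];
        [self.scratchPath moveToPoint:cPoint];
        self.maskLayer = [CAShapeLayer layer];
        self.maskLayer.frame = self.bounds;
        self.maskLayer.lineWidth = 40.0;  // width of the "scratched" track
        self.maskLayer.lineCap = kCALineCapRound;
        // The masked view shows through wherever the mask is opaque,
        // so an opaque stroke reveals the image along the finger's path.
        self.maskLayer.strokeColor = [UIColor blackColor].CGColor;
        self.maskLayer.fillColor = nil; // stroke only, no fill
        _viewWithOurImage.layer.mask = self.maskLayer;
    }
    [self.scratchPath addLineToPoint:cPoint];
    self.maskLayer.path = self.scratchPath.CGPath;
}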

Why is the response slow when stroking PNG images with UITouch?

I used the code below to stroke PNG images with a finger move. There are two UIImageViews: one sits in the background to hold the background image, and the other is a clear UIImageView for stroking PNG images on top of it.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *touch in touches) {
        currentPoint = [touch locationInView:self.view];
        lastPoint = [touch previousLocationInView:self.view];
        // build up the array, leaving space between the PNG images
        if (ABS(currentPoint.x - lastPoint.x) > 16
            || ABS(currentPoint.y - lastPoint.y) > 13) {
            [brushLocations addObject:[NSValue valueWithCGPoint:currentPoint]];
        }
        [self drawingWithArray];
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [brushLocations removeAllObjects]; // reset
}
- (void)drawingWithArray {
    UIGraphicsBeginImageContext(self.view.frame.size);
    [drawImage.image drawInRect:CGRectMake(0, 0, drawImage.frame.size.width, drawImage.frame.size.height)];
    for (int i = 0; i < [brushLocations count]; i++) {
        CGPoint center = [[brushLocations objectAtIndex:i] CGPointValue];
        // bokehImage is a UIImage
        bokehImage = [bokehImgArray objectAtIndex:i % [bokehImgArray count]];
        // the PNG images are not semi-transparent, even with alpha set to 0.5??
        [bokehImage drawAtPoint:center blendMode:kCGBlendModeOverlay alpha:0.5f];
    }
    // drawImage is the UIImageView on top of the background image view for stroking PNG images
    drawImage.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
}
Now, my problem is that the response is slow: the PNG images don't display immediately while the finger moves on the device (iPad 4).
Also, the PNG images are not semi-transparent. I supposed that drawAtPoint:blendMode:alpha: would make the images semi-transparent (with alpha set to 0.5).
Yes, something like this should work:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *touch in touches) {
        currentPoint = [touch locationInView:self.view];
        lastPoint = [touch previousLocationInView:self.view];
        // build up the array, leaving space between the PNG images
        if (ABS(currentPoint.x - lastPoint.x) > 16
            || ABS(currentPoint.y - lastPoint.y) > 13) {
            [brushLocations addObject:[NSValue valueWithCGPoint:currentPoint]];
        }
        // [self drawingWithArray]; // don't call the draw routine during the touch handler
        [self setNeedsDisplay];     // queue the redraw instead; redraws coalesce to one per frame
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Not needed here
    // [brushLocations removeAllObjects]; // reset
}

//- (void)drawingWithArray
- (void)drawRect:(CGRect)rect
{
    // The CGContext is already set up when drawRect: is called
    // UIGraphicsBeginImageContext(self.view.frame.size);
    // [drawImage.image drawInRect:CGRectMake(0, 0, drawImage.frame.size.width, drawImage.frame.size.height)];
    [drawImage.image drawInRect:rect];
    for (int i = 0; i < [brushLocations count]; i++) {
        CGPoint center = [[brushLocations objectAtIndex:i] CGPointValue];
        // bokehImage is a UIImage
        bokehImage = [bokehImgArray objectAtIndex:i % [bokehImgArray count]];
        [bokehImage drawAtPoint:center blendMode:kCGBlendModeOverlay alpha:0.5f];
        // No image context is active here, so don't read one back:
        // drawImage.image = UIGraphicsGetImageFromCurrentImageContext();
        // UIGraphicsEndImageContext();
    }
    [brushLocations removeAllObjects]; // reset
}
