Swift drawing app - changing line width value based on a slider - iOS

I'm trying to change the line width of my drawing app using a slider but every time I change the slider and redraw a line, all lines on the screen change to the currently selected line width. I must be doing something wrong.
var layers: [Array<CGPoint>] = []
var layerIndex = 0
var sliderValue: CGFloat = 3.0
var strokeInfo: [[String: Any]] = []

//change the slider
func slider(value: CGFloat) {
    sliderValue = value
    print("Slider value is \(sliderValue)")
}

//on new touch, start a new array (layer) of points
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    var points: [CGPoint] = []
    layers.append(points)
    var info = Dictionary<String, Any>()
    info["line"] = sliderValue
    strokeInfo.insert(info, at: layerIndex)
    let pt = (touches.first!).location(in: self)
    points.append(pt)
}

//add each point to the correct array as the finger moves
override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
    let pt = (touches.first!).location(in: self)
    layers[layerIndex].append(pt)
    self.setNeedsDisplay()
}

override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
    print("layer \(layerIndex) now has \(layers[layerIndex].count)")
    layerIndex += 1
}

override func draw(_ rect: CGRect) {
    //get pointer to view
    let context = UIGraphicsGetCurrentContext()
    context?.clear(rect)
    for (index, element) in strokeInfo.enumerated() {
        if index == layerIndex {
            context?.setLineWidth(element["line"] as! CGFloat)
        }
    }
    //loop through the layers if any
    for points in layers {
        UIColor.cyan.set()
        //loop through each layer's point values
        if points.count - 1 > 0 {
            for i in 0 ..< points.count - 1 {
                let pt1 = points[i] as CGPoint
                let pt2 = points[i + 1] as CGPoint
                context?.setLineCap(.round)
                context?.move(to: pt1)
                context?.addLine(to: pt2)
                context?.strokePath()
            }
        }
    }
}

You're changing the line width and line cap of the context. The settings of the graphics context apply to everything you stroke with that context, so the last width you set is used for every line.
If you want to draw different paths, I suggest you use multiple UIBezierPath objects, each with its own width, color, and line cap settings. You can draw your bezier paths in your draw(_:) method.
Alternately you could use multiple CAShapeLayers, each with different path drawing settings and stack them on top of each other to get the composite image you want.
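For illustration, a minimal sketch of the per-stroke bezier-path approach using the question's own layers and strokeInfo storage (the "line" key and cyan color are carried over from the question; everything else is illustrative, not the poster's exact code):
override func draw(_ rect: CGRect) {
    UIColor.cyan.set()
    // Build and stroke one UIBezierPath per stroke, each with its own width.
    for (index, points) in layers.enumerated() {
        guard points.count > 1, index < strokeInfo.count else { continue }
        let path = UIBezierPath()
        path.lineWidth = strokeInfo[index]["line"] as? CGFloat ?? 3.0
        path.lineCapStyle = .round
        path.move(to: points[0])
        for pt in points.dropFirst() {
            path.addLine(to: pt)
        }
        path.stroke()
    }
}
Because each path carries its own lineWidth, changing the slider only affects strokes started after the change.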

Related

How do I rotate a SpriteNode with a one finger “touch and drag”

How do I rotate a SpriteNode with a one finger “touch and drag” so that:
It doesn’t jerk around
It moves in a circle (I’ve successfully accomplished this part several times- both with a code only solution and with an SKS file)
It produces a meaningful value (as a physical control knob would)
It moves while my finger is on it but not when my finger is off
The things I’ve tried:
Using CGAffineTransform’s CGAffineTransformMakeRotation to effect a rotation of the knob in SpriteKit. But I cannot figure out how to use CGAffineTransformMakeRotation on a SpriteNode. I could put a different sort of object into my Scene or on top of it, but that’s just not right.
For example, Matthijs Hollemans' MHRotaryKnob: https://github.com/hollance/MHRotaryKnob
I translated Hollemans' knob from Objective-C to Swift 4 but ran into trouble attempting to use it in SpriteKit to rotate sprites. That didn't work out because I could not figure out how to use knobImageView.transform = CGAffineTransformMakeRotation(newAngle * M_PI/180.0); in Swift with SpriteKit. I know I could use Hollemans' Objective-C class and push a UIImage over the top of my scene, but that doesn't seem like the best or most elegant solution.
I also translated Wex's solution from Objective-C to Swift:
Rotate image on center using one finger touch
using Allan Weir's suggestions for dealing with the CGAffineTransform portions of the code (https://stackoverflow.com/a/41724075/1678060), but that doesn't work.
I've tried setting the zRotation on my sprite directly, without using the physicsBody, to no avail. It has the same jerky movement and will not stop where you want it to stop, and it moves in the opposite direction of your finger drag, even when you put a '-' in front of the radian angle.
I've also tried 0x141E's solution on Stack Overflow:
Drag Rotate a Node around a fixed point. This is the solution posted below using an .sks file (somewhat modified; I've tried the unmodified version and it is no better). This solution jerks around, doesn't smoothly follow my finger, and cannot consistently move the knob to a specific point. It doesn't matter if I set physicsBody attributes to create friction, mass, angularDamping and linearDamping, or reduce the speed of the SKSpriteNode.
I have also scoured the Internet looking for a good solution in Swift 4 using SpriteKit, but so far to no avail.
import Foundation
import SpriteKit

class Knob: SKSpriteNode {
    var startingAngle: CGFloat?
    var currentAngle: CGFloat?
    var startingTime: TimeInterval?
    var startingTouchPoint: CGPoint?

    override init(texture: SKTexture?, color: UIColor, size: CGSize) {
        super.init(texture: texture, color: color, size: size)
        self.setupKnob()
    }

    required init?(coder aDecoder: NSCoder) {
        super.init(coder: aDecoder)
        self.setupKnob()
    }

    func setupKnob() {
        self.physicsBody = SKPhysicsBody(circleOfRadius: CGFloat(self.size.height))
        self.physicsBody?.pinned = true
        self.physicsBody?.isDynamic = true
        self.physicsBody?.affectedByGravity = false
        self.physicsBody?.allowsRotation = true
        self.physicsBody?.mass = 30.0
        //self.physicsBody?.friction = 0.8
        //self.physicsBody?.angularDamping = 0.8
        //self.physicsBody?.linearDamping = 0.9
        //self.speed = 0.1
        self.isUserInteractionEnabled = true
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches {
            let location = touch.location(in: self)
            let node = atPoint(location)
            startingTouchPoint = CGPoint(x: location.x, y: location.y)
            if node.name == "knobHandle" {
                let dx = location.x - node.position.x
                let dy = location.y - node.position.y
                startingAngle = atan2(dy, dx)
            }
        }
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches {
            let location = touch.location(in: self)
            let node = atPoint(location)
            guard startingAngle != nil else { return }
            if node.name == "knobHandle" {
                let dx: CGFloat = location.x - node.position.x
                let dy: CGFloat = location.y - node.position.y
                var angle: CGFloat = atan2(dy, dx)
                angle = ((angle) * (180.0 / CGFloat.pi))
                let rotate = SKAction.rotate(toAngle: angle, duration: 2.0, shortestUnitArc: false)
                self.run(rotate)
                startingAngle = angle
            }
        }
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        let touch: UITouch = touches.first!
        let location: CGPoint = touch.location(in: self)
        self.removeAllActions()
        startingAngle = nil
        startingTime = nil
    }
}
Edit: If I remove the conversion to degrees and change the duration of the SKAction to 1.0 in SKAction.rotate(toAngle:duration:shortestUnitArc:), then it almost works: it's not as jerky, but it still jerks, and the lever doesn't change direction well, meaning that sometimes if you attempt to move it opposite to the direction it was traveling, it continues to go the old direction around the anchorPoint instead of the new direction you're dragging it.
Edit 2: GreatBigBore and I discussed both the SKAction rotation and the self.zRotation- the code above and the code below.
Edit 3: sicvayne suggested some code for the SKScene and I've adapted it to SKSpriteNode (below). It doesn't move consistently or allow you to stop in a specific place.
import Foundation
import SpriteKit

class Knob: SKSpriteNode {
    var fingerLocation = CGPoint()

    override init(texture: SKTexture?, color: UIColor, size: CGSize) {
        super.init(texture: texture, color: color, size: size)
        self.setupKnob()
    }

    required init?(coder aDecoder: NSCoder) {
        super.init(coder: aDecoder)
        self.setupKnob()
    }

    func setupKnob() {
        self.isUserInteractionEnabled = true
    }

    func rotateKnob() {
        let radians = atan2(fingerLocation.x - self.position.x, fingerLocation.y - self.position.y)
        self.zRotation = -radians
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches {
        }
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches {
            fingerLocation = touch.location(in: self)
        }
        self.rotateKnob()
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
    }

    /*override func update(_ currentTime: TimeInterval) { //this is a SKScene function
        rotateKnob()
    }*/
}
The math was wrong. Here's what I learned you need: the angle from atan has to be offset by a different amount depending on which side the touch is on, because atan on its own only covers half of the circle. Here is how this looks in Swift:
if point.x.sign == .minus {
    angle = atan(point.y/point.x) + CGFloat.pi/2
} else {
    angle = atan(point.y/point.x) + CGFloat.pi/2 + CGFloat.pi
}
Also, you have to get the coordinates of another object in the scene because the entire coordinate system rotates with the object:
let body = parent?.childNode(withName: "objectInScene")
let point = touch.location(in: body!)
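Putting those pieces together, the rotation handler could look roughly like this (a sketch only: "objectInScene" stands for whatever non-rotating reference node is in your scene, and it assumes the knob sits at that node's origin; otherwise subtract the knob's position from point first):
override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
    guard let touch = touches.first,
          let body = parent?.childNode(withName: "objectInScene") else { return }

    // Read the touch in a node that does NOT rotate with the knob,
    // otherwise the coordinate system spins under the finger.
    let point = touch.location(in: body)

    // Quadrant-corrected angle, as described above.
    let angle: CGFloat
    if point.x.sign == .minus {
        angle = atan(point.y / point.x) + CGFloat.pi / 2
    } else {
        angle = atan(point.y / point.x) + CGFloat.pi / 2 + CGFloat.pi
    }
    zRotation = angle
}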
I usually do something like this without any jittering or jerking issues.
var fingerLocation = CGPoint()

override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    for touch: AnyObject in touches {
        fingerLocation = touch.location(in: self)
    }
}

func rotatePlayer() {
    let radians = atan2(fingerLocation.x - playerNode.position.x, fingerLocation.y - playerNode.position.y)
    playerNode.zRotation = -radians // this rotates the player
}

override func update(_ currentTime: TimeInterval) {
    rotatePlayer()
}
Depending on how your images are facing, you're probably going to have to mess around with the radians. In my case, my "player" image is facing upwards. Hope this helped.

Implement ink annotations on iOS 11 PDFKit document

I want to allow the user to draw on an iOS 11 PDFKit document viewed in a PDFView. The drawing should ultimately be embedded inside the PDF.
The latter I have solved by adding a PDFAnnotation of type "ink" to the PDFPage with a UIBezierPath corresponding to the user's drawing.
However, how do I actually record the touches the user makes on top of the PDFView to create such a UIBezierPath?
I have tried overriding touchesBegan on the PDFView and on the PDFPage, but it is never called. I have tried adding a UIGestureRecognizer, but didn't accomplish anything.
I'm assuming that I need to afterwards use the PDFView instance method convert(_ point: CGPoint, to page: PDFPage) to convert the coordinates obtained to PDF coordinates suitable for the annotation.
In the end I solved the problem by creating a PDFViewController class extending UIViewController and UIGestureRecognizerDelegate. I added a PDFView as a subview, and a UIBarButtonItem to the navigationItem, that serves to toggle annotation mode.
I record the touches in a UIBezierPath called signingPath, and have the current annotation in currentAnnotation of type PDFAnnotation using the following code:
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    if let touch = touches.first {
        let position = touch.location(in: pdfView)
        signingPath = UIBezierPath()
        signingPath.move(to: pdfView.convert(position, to: pdfView.page(for: position, nearest: true)!))
        annotationAdded = false
        UIGraphicsBeginImageContext(CGSize(width: 800, height: 600))
        lastPoint = pdfView.convert(position, to: pdfView.page(for: position, nearest: true)!)
    }
}

override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
    if let touch = touches.first {
        let position = touch.location(in: pdfView)
        let convertedPoint = pdfView.convert(position, to: pdfView.page(for: position, nearest: true)!)
        let page = pdfView.page(for: position, nearest: true)!
        signingPath.addLine(to: convertedPoint)
        let rect = signingPath.bounds
        if( annotationAdded ) {
            pdfView.document?.page(at: 0)?.removeAnnotation(currentAnnotation)
            currentAnnotation = PDFAnnotation(bounds: rect, forType: .ink, withProperties: nil)
            var signingPathCentered = UIBezierPath()
            signingPathCentered.cgPath = signingPath.cgPath
            signingPathCentered.moveCenter(to: rect.center)
            currentAnnotation.add(signingPathCentered)
            pdfView.document?.page(at: 0)?.addAnnotation(currentAnnotation)
        } else {
            lastPoint = pdfView.convert(position, to: pdfView.page(for: position, nearest: true)!)
            annotationAdded = true
            currentAnnotation = PDFAnnotation(bounds: rect, forType: .ink, withProperties: nil)
            currentAnnotation.add(signingPath)
            pdfView.document?.page(at: 0)?.addAnnotation(currentAnnotation)
        }
    }
}

override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
    if let touch = touches.first {
        let position = touch.location(in: pdfView)
        signingPath.addLine(to: pdfView.convert(position, to: pdfView.page(for: position, nearest: true)!))
        pdfView.document?.page(at: 0)?.removeAnnotation(currentAnnotation)
        let rect = signingPath.bounds
        let annotation = PDFAnnotation(bounds: rect, forType: .ink, withProperties: nil)
        annotation.color = UIColor(hex: 0x284283)
        signingPath.moveCenter(to: rect.center)
        annotation.add(signingPath)
        pdfView.document?.page(at: 0)?.addAnnotation(annotation)
    }
}
The annotation toggle button just runs:
pdfView.isUserInteractionEnabled = !pdfView.isUserInteractionEnabled
This was really the key to it, as this disables scrolling on the PDF and enables me to receive the touch events.
Because the touch events are recorded and converted into a PDFAnnotation immediately, the annotation is visible while writing on the PDF, and it is recorded into the correct position in the PDF no matter the scroll position.
Making sure it ends up on the right page is just a matter of similarly changing the hardcoded page number 0 to the pdfView.page(for: position, nearest: true) value.
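For example, a small helper along these lines could replace the hardcoded lookups (a sketch only; it assumes the same pdfView property used above), so that pdfView.document?.page(at: 0)?.addAnnotation(...) becomes page(under: position)?.addAnnotation(...):
// Resolves the PDFPage under a touch location (given in pdfView coordinates),
// falling back to the first page if none is found.
func page(under position: CGPoint) -> PDFPage? {
    return pdfView.page(for: position, nearest: true) ?? pdfView.document?.page(at: 0)
}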
I've done this by creating a new view class (e.g. AnnotateView) and putting it on top of the PDFView when the user is annotating.
This view uses its default touchesBegan/touchesMoved/touchesEnded methods to create a bezier path following the gesture. Once the touch has ended, the view saves it as an annotation on the PDF.
Note: you would need a way for the user to decide if they were in an annotating state.
For my main class
class MyViewController: UIViewController, PDFViewDelegate, VCDelegate {
    var pdfView: PDFView?
    var touchView: AnnotateView?

    override func loadView() {
        touchView = AnnotateView(frame: CGRect(x: 0, y: 0, width: 375, height: 600))
        touchView?.backgroundColor = .clear
        touchView?.delegate = self
        view.addSubview(touchView!)
    }

    func addAnnotation(_ annotation: PDFAnnotation) {
        print("Annotation added")
        pdfView?.document?.page(at: 0)?.addAnnotation(annotation)
    }
}
My annotation view
class AnnotateView: UIView {
    var path: UIBezierPath?
    var delegate: VCDelegate?

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        // Initialize a new path for the user gesture
        path = UIBezierPath()
        path?.lineWidth = 4.0
        let touch: UITouch = touches.first!
        path?.move(to: touch.location(in: self))
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        // Add new points to the path
        let touch: UITouch = touches.first!
        path?.addLine(to: touch.location(in: self))
        self.setNeedsDisplay()
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        let touch = touches.first
        path?.addLine(to: touch!.location(in: self))
        self.setNeedsDisplay()
        let annotation = PDFAnnotation(bounds: self.bounds, forType: .ink, withProperties: nil)
        annotation.add(self.path!)
        delegate?.addAnnotation(annotation)
    }

    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
        self.touchesEnded(touches, with: event)
    }

    override func draw(_ rect: CGRect) {
        // Draw the path
        path?.stroke()
    }

    override init(frame: CGRect) {
        super.init(frame: frame)
        self.isMultipleTouchEnabled = false
    }

    // Required by UIView when a designated initializer is overridden.
    required init?(coder aDecoder: NSCoder) {
        super.init(coder: aDecoder)
        self.isMultipleTouchEnabled = false
    }
}
EDIT:
jksoegaard's answer, while being the inspiration for all of my work on this matter, has a flaw: during touchesMoved, multiple PDF annotations are created, and consequently the PDF page becomes bogged down with annotations, which affects its loading time severely. I wrote code that draws on a CAShapeLayer during the touchesMoved phase and creates the completed PDF annotation only in the touchesEnded phase.
My implementation is a subclass of UIGestureRecognizer, and it allows you to choose between pen, highlighter and eraser, and choose color and width. It also includes an undo manager. The example project is here.
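The gist of that approach, as a rough sketch (names here are illustrative, not the project's API; the full version with pen/highlighter/eraser and undo is in the linked project):
import UIKit
import PDFKit

// The in-progress stroke lives on a CAShapeLayer while the finger moves;
// only one ink annotation is committed to the page when the touch ends.
final class InkStrokeBuilder {
    private var strokeLayer = CAShapeLayer()
    private var strokePath = UIBezierPath()

    func begin(at point: CGPoint, over pdfView: PDFView) {
        strokePath = UIBezierPath()
        strokePath.move(to: point)
        strokeLayer = CAShapeLayer()
        strokeLayer.strokeColor = UIColor.red.withAlphaComponent(0.5).cgColor
        strokeLayer.fillColor = UIColor.clear.cgColor
        strokeLayer.lineWidth = 10
        pdfView.layer.addSublayer(strokeLayer)
    }

    // Only the layer's path changes while the finger moves; no annotations yet.
    func addPoint(_ point: CGPoint) {
        strokePath.addLine(to: point)
        strokeLayer.path = strokePath.cgPath
    }

    // The expensive work happens once: a single ink annotation is added to the page.
    func end(on page: PDFPage, pathInPageSpace: UIBezierPath) {
        strokeLayer.removeFromSuperlayer()
        let annotation = PDFAnnotation(bounds: pathInPageSpace.bounds, forType: .ink, withProperties: nil)
        annotation.add(pathInPageSpace)
        page.addAnnotation(annotation)
    }
}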
Original Answer
Adding to jksoegaard's excellent answer, a few clarifications for newbies like myself:
You need to include UIBezierPath+.swift in your project for .moveCenter and rect.center to be recognized. Download from https://github.com/xhamr/paintcode-path-scale.
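If you just want to see roughly what those two helpers do, something like the following behaves similarly (an approximation only; the UIBezierPath+.swift from the linked repo is the version the code above was written against):
import UIKit

extension CGRect {
    // Center of the rect, as used by rect.center above.
    var center: CGPoint { return CGPoint(x: midX, y: midY) }
}

extension UIBezierPath {
    // Translate the path so the center of its bounds lands on the given point.
    func moveCenter(to point: CGPoint) {
        let offset = CGPoint(x: point.x - bounds.midX, y: point.y - bounds.midY)
        apply(CGAffineTransform(translationX: offset.x, y: offset.y))
    }
}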
These lines can be excluded:
UIGraphicsBeginImageContext(CGSize(width: 800, height: 600))
and
let page = pdfView.page(for: position, nearest: true)!
You need to declare a few global vars outside the functions:
var signingPath = UIBezierPath()
var annotationAdded : Bool?
var lastPoint : CGPoint?
var currentAnnotation : PDFAnnotation?
Finally, if you want the ink to be wider and nicely colored, you need to do two things:
a. Everywhere annotation.add or currentAnnotation.add is called, add the following just before it (use annotation or currentAnnotation as appropriate for that function):
let b = PDFBorder()
b.lineWidth = { choose a pixel number here }
currentAnnotation?.border = b
currentAnnotation?.color=UIColor.{ your color of choosing }
I recommend specifying a low alpha for the color. The result is beautiful, and affected by the speed of your stroke. For example, red would be:
UIColor(red: 255/255.0, green: 0/255.0, blue: 0/255.0, alpha: 0.1)
b. The rect in which every touch is recorded needs to accommodate the thicker lines. Instead of
let rect = signingPath.bounds
try, for example with 10 px of thickness:
let rect = CGRect(x: signingPath.bounds.minX - 5,
                  y: signingPath.bounds.minY - 5,
                  width: signingPath.bounds.maxX - signingPath.bounds.minX + 10,
                  height: signingPath.bounds.maxY - signingPath.bounds.minY + 10)
IMPORTANT: The touchesEnded function also makes use of the currentAnnotation variable. You must repeat the definition of rect within that function as well (either the short one or my suggested one above), and repeat the definition of currentAnnotation there as well:
currentAnnotation = PDFAnnotation(bounds: rect, forType: .ink, withProperties: nil)
If you don't, a single tap that did not move will make your app crash.
I can verify that once the file is saved, the annotations are retained. Sample code for saving:
let pdfData = pdfDocument?.dataRepresentation()
let annotatedPdfUrl = URL(fileURLWithPath: "\(NSSearchPathForDirectoriesInDomains(.documentsDirectory, .userDomainMask, true)[0])/AnnotatedPDF.pdf")
try! pdfData!.write(to: annotatedPdfUrl)

Active clickable area for UIButton with rounded corners?

So I have created a button with a border in my storyboard.
And then I rounded its corners and added a border color:
button.layer.cornerRadius = button.bounds.size.width / 2
button.layer.borderColor = greenColor
So the runtime result looks like this:
However the user can tap slightly outside the area of the button (where the corners used to be) and still call the button function. Is there a way to restrict the enabled area of the button to just be the circle?
With the other answers you block the touch; I needed it to fall through. And it's even easier:
1) Set up your preferred path (for me, a circle):
private var touchPath: UIBezierPath {return UIBezierPath(ovalIn: self.bounds)}
2) Override the point(inside:with:) function:
override func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
return touchPath.contains(point)
}
All inside your UIButton subclass.
So I figured it out. Basically I have to detect the touch on the button, and then calculate the distance between the touch and the center of the button. If that distance is less than the radius of the circle (the width of the button / 2) then the tap was inside the circle.
Here's my code:
override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?) {
    let radius: CGFloat = (self.frame.width / 2)
    var point: CGPoint = CGPoint()
    if let touch = touches.first {
        point = touch.locationInView(self.superview)
    }
    let distance: CGFloat = sqrt(CGFloat(powf((Float(self.center.x - point.x)), 2) + powf((Float(self.center.y - point.y)), 2)))
    if distance < radius {
        super.touchesBegan(touches, withEvent: event)
    }
}
(Swift 3) This solution works with any button shape, not just round ones. First we have to create the path as a private property of the button class, and then we can simply write this:
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    if let touch = touches.first {
        let location = touch.location(in: self)
        if path.contains(location) {
            print("This print is shown only in case of button location tap")
        }
    }
}
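The path property referenced above isn't shown in the answer; one way to declare it (an assumption, matching the rounded button from the question; the class name is illustrative) would be:
class ShapedButton: UIButton {
    // Path describing the button's visible shape; for a circular button,
    // UIBezierPath(ovalIn: bounds) works just as well.
    private var path: UIBezierPath {
        return UIBezierPath(roundedRect: bounds, cornerRadius: layer.cornerRadius)
    }
}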

Let the user draw a rectangle to select an area

I'm new to Swift and I'm trying to let the user draw a rectangle (by touching and dragging) to select an area of an image, just like when cropping, but I don't want to crop; I just want to know the CGRect the user created.
So far I have a .xib with a UIImageView inside and its ViewController. I want to draw above the image, but every tutorial I found about drawing is about subclassing UIView, overriding drawRect and setting that as the xib's class.
I figured it out. I just created a UIView and changed its frame depending on the touch events:
let overlay = UIView()
var lastPoint = CGPointZero

override func viewDidLoad() {
    super.viewDidLoad()
    // Do any additional setup after loading the view.
    overlay.layer.borderColor = UIColor.blackColor().CGColor
    overlay.backgroundColor = UIColor.clearColor().colorWithAlphaComponent(0.5)
    overlay.hidden = true
    self.view.addSubview(overlay)
}

override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?) {
    //Save original tap Point
    if let touch = touches.first {
        lastPoint = touch.locationInView(self.view)
    }
}

override func touchesMoved(touches: Set<UITouch>, withEvent event: UIEvent?) {
    //Get the current known point and redraw
    if let touch = touches.first {
        let currentPoint = touch.locationInView(view)
        reDrawSelectionArea(lastPoint, toPoint: currentPoint)
    }
}

func reDrawSelectionArea(fromPoint: CGPoint, toPoint: CGPoint) {
    overlay.hidden = false
    //Calculate rect from the original point and last known point
    let rect = CGRectMake(min(fromPoint.x, toPoint.x),
                          min(fromPoint.y, toPoint.y),
                          fabs(fromPoint.x - toPoint.x),
                          fabs(fromPoint.y - toPoint.y))
    overlay.frame = rect
}

override func touchesEnded(touches: Set<UITouch>, withEvent event: UIEvent?) {
    overlay.hidden = true
    //User has lifted their finger, use the rect
    applyFilterToSelectedArea(overlay.frame)
    overlay.frame = CGRectZero //reset overlay for next tap
}

Drawing curved lines without lagging?

I have a class below that, when attached to a view, draws curved lines as the user touches their device. The problem is that the lines drawn seem to lag behind the position of the finger on the screen. The lag is enough to be noticeable and mildly annoying, as new sections of the line appear a small distance away from the finger touching the screen.
The code uses the addCurveToPoint curve method. (The alternative addQuadCurveToPoint method appears to produce lower-quality curved lines but does display on screen faster.)
I suspect that this issue relates to when setNeedsDisplay is called once counter == 4. It appears the code waits until 4 new touch points are received before a curved segment is drawn. Ideally a curved segment would be drawn at every single touch point (i.e. counter == 1), eliminating the lag. (Changing the check to counter == 1 doesn't seem to work.)
I'm lost and don't know how to update the code to remove that short lag while retaining the curved lines. What needs to change in the code below to remove that short lag?
// Swift 2 code below tested using Xcode 7.0.1.
class drawView: UIView {
    var path: UIBezierPath?
    var incrementalImage: UIImage?
    var points = [CGPoint?](count: 5, repeatedValue: nil)
    var counter: Int?
    var infoView: UIView = UIView()
    var strokeColor: UIColor?

    required init?(coder aDecoder: NSCoder) {
        super.init(coder: aDecoder)
        self.multipleTouchEnabled = false
        self.backgroundColor = UIColor.whiteColor()
        path = UIBezierPath()
        path?.lineWidth = 20.0
        strokeColor = UIColor.darkGrayColor()
        path?.lineCapStyle = CGLineCap.Round
    }

    override init(frame: CGRect) {
        super.init(frame: frame)
        self.multipleTouchEnabled = false
        path = UIBezierPath()
        path?.lineWidth = 20.0
    }

    override func drawRect(rect: CGRect) {
        incrementalImage?.drawInRect(rect)
        strokeColor?.setStroke()
        path?.stroke()
    }

    override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?) {
        counter = 0
        let touch: AnyObject? = touches.first
        points[0] = touch!.locationInView(self)
        infoView.removeFromSuperview()
    }

    override func touchesMoved(touches: Set<UITouch>, withEvent event: UIEvent?) {
        let touch: AnyObject? = touches.first
        let point = touch!.locationInView(self)
        counter = counter! + 1
        points[counter!] = point
        if counter == 4 {
            points[3]! = CGPointMake((points[2]!.x + points[4]!.x)/2.0, (points[2]!.y + points[4]!.y)/2.0)
            path?.moveToPoint(points[0]!)
            path?.addCurveToPoint(points[3]!, controlPoint1: points[1]!, controlPoint2: points[2]!)
            self.setNeedsDisplay()
            points[0]! = points[3]!
            points[1]! = points[4]!
            counter = 1
        }
    }

    override func touchesEnded(touches: Set<UITouch>, withEvent event: UIEvent?) {
        self.drawBitmap()
        self.setNeedsDisplay()
        path?.removeAllPoints()
        counter = 0
    }

    override func touchesCancelled(touches: Set<UITouch>?, withEvent event: UIEvent?) {
        self.touchesEnded(touches!, withEvent: event)
    }

    func drawBitmap() {
        UIGraphicsBeginImageContextWithOptions(self.bounds.size, true, 0.0)
        strokeColor?.setStroke()
        if incrementalImage == nil {
            let rectPath: UIBezierPath = UIBezierPath(rect: self.bounds)
            UIColor.whiteColor().setFill()
            rectPath.fill()
        }
        incrementalImage?.drawAtPoint(CGPointZero)
        path?.stroke()
        incrementalImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
    }
}
To start, I believe that you are doing it wrong. The approach you have usually works well if you want to draw a few lines, not necessarily from the user's input: circles, squiggly lines, simple things.
when using:
self.setNeedsDisplay()
You are redrawing ALL the lines EVERY TIME! This is tough on the CPU, and that's why you have lag. Imagine the user draws a few hundred lines, then into the thousands, and every time he/she touches the screen it will redraw ALL of those lines.
OK. So, what I recommend doing is to have 2 UIImageViews: 1) mainImageView, which will hold the overall drawing, and 2) tempImageView, which the user draws on.
When the user touches/draws on tempImageView, it draws until they let go of the screen, then tempImageView gets merged into mainImageView.
Here is a tutorial on:
http://www.raywenderlich.com/87899/make-simple-drawing-app-uikit-swift
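In outline, the merge step described above looks roughly like this (a sketch in current Swift syntax; mainImageView and tempImageView are the two assumed image views):
// Called from touchesEnded: merge the finished stroke into the main image,
// then clear the temporary image view for the next stroke.
func mergeTempIntoMain() {
    UIGraphicsBeginImageContextWithOptions(mainImageView.bounds.size, false, 0.0)
    mainImageView.image?.draw(in: mainImageView.bounds)
    tempImageView.image?.draw(in: tempImageView.bounds, blendMode: .normal, alpha: 1.0)
    mainImageView.image = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    tempImageView.image = nil
}
This way touchesMoved only redraws the current stroke, and the finished strokes are baked into a single bitmap.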
