Swift Drawing App is Laggy - ios

I am currently working on a drawing app. I'm building it for iOS, and am using Swift 3 to do so.
It's just a basic drawing app, but I'm trying to add an extra feature. I started with a UIScrollView, then added an Image View to that scroll view; all of the drawing is done on the image view. When you first launch the app, the scroll view is completely zoomed in, and switching to "zoom mode" allows you to pinch to zoom. The problem is that when you draw while zoomed in, the drawing is really fuzzy. I can fix this with a line of code like this:
UIGraphicsBeginImageContextWithOptions((self.view.frame.size), false, 7.0)
This makes the drawing look great while zoomed in, but causes the app to run very laggy. The thing that confuses me, though, is that if I change the above code to this:
UIGraphicsBeginImageContextWithOptions((self.view.frame.size), false, 0.0)
and zoom out all the way, the drawing looks exactly the same (granted, I'm zoomed all the way out), but it's no longer laggy. I know this probably isn't coming across super clearly, so here's a video showing what happens in the first scenario: https://youtu.be/E_9FKf1pUTY and in the second: https://youtu.be/OofFTS4Q0OA
So basically, I'm wondering if there's a way to treat the zoomed-in area as if it were its own view. It seems to me as if the app is updating the entire image view, rather than just the part that is visible at any given time. Is there a way to update only the portion of the image view that is drawn on? Sorry if this is a bit of a confusing post; feel free to ask questions if there's anything you don't understand. Just for clarity's sake, I'll include all of the drawing code below:
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    print("Touches began")
    swiped = true
    if let touch = touches.first {
        lastPoint = touch.location(in: scrollView)
        lastPoint.x = lastPoint.x / scrollView.zoomScale
        lastPoint.y = lastPoint.y / scrollView.zoomScale
    }
}
func drawLines(fromPoint: CGPoint, toPoint: CGPoint) {
    print("\(fromPoint.x), \(fromPoint.y)")
    //UIGraphicsBeginImageContext(self.view.frame.size)
    UIGraphicsBeginImageContextWithOptions(scrollView.frame.size, false, 0.0)
    imageView.image?.draw(in: CGRect(x: 0, y: 0, width: self.view.frame.width, height: self.view.frame.height))
    let context = UIGraphicsGetCurrentContext()
    context?.move(to: CGPoint(x: fromPoint.x, y: fromPoint.y))
    context?.addLine(to: CGPoint(x: toPoint.x, y: toPoint.y))
    context?.setBlendMode(.normal)
    context?.setLineCap(.round)
    context?.setLineWidth(erase ? 30 : CGFloat(sizeVar))
    switch color {
    case "black":   context?.setStrokeColor(UIColor.black.cgColor)
    case "white":   context?.setStrokeColor(UIColor.white.cgColor)
    case "blue":    context?.setStrokeColor(UIColor.blue.cgColor)
    case "cyan":    context?.setStrokeColor(UIColor.cyan.cgColor)
    case "green":   context?.setStrokeColor(UIColor.green.cgColor)
    case "magenta": context?.setStrokeColor(UIColor.magenta.cgColor)
    case "red":     context?.setStrokeColor(UIColor.red.cgColor)
    case "yellow":  context?.setStrokeColor(UIColor.yellow.cgColor)
    default:        break
    }
    context?.strokePath()
    imageView.image = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
}
override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
    swiped = true
    if let touch = touches.first {
        var currentPoint = touch.location(in: scrollView)
        currentPoint.x = currentPoint.x / scrollView.zoomScale
        currentPoint.y = currentPoint.y / scrollView.zoomScale
        drawLines(fromPoint: lastPoint, toPoint: currentPoint)
        lastPoint = currentPoint
    }
}

override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
    if !swiped {
        drawLines(fromPoint: lastPoint, toPoint: lastPoint)
    }
}

The scale parameter in UIGraphicsBeginImageContextWithOptions(_:_:_:) is not a quality setting, and you should not put arbitrary values there. It's a scale factor, akin to the @1x, @2x, and @3x settings for images: it tells the system the scaling factor to use when mapping image pixels to screen pixels. In most (almost all) cases you should use 0, which means "use the native scale of the screen" (@2x for normal Retina, or @3x for the iPhone 6+ and 7+). You should never set it to an arbitrary value like 7. That creates an image with 7x the pixel density in each dimension, and forces the system to scale it for screen drawing every time, which is slow.
Next, creating a new image for each new line is a dreadfully inefficient way to do drawing. It creates and releases large blocks of memory constantly, and then has to completely redraw the screen each time. Instead, I would set up a view that has a CAShapeLayer as its backing layer and update the path that's installed in the layer.
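A minimal sketch of that approach (class and property names here are illustrative, not from the post; a real app would also need to handle colors, erasing, and flattening finished strokes) might look like this:

```swift
import UIKit

// A view whose backing layer is a CAShapeLayer. The in-progress stroke
// lives in a UIBezierPath that is pushed into the layer, so Core
// Animation re-renders only what changed -- no image context, no full
// bitmap copy per touch event.
class StrokeView: UIView {
    override class var layerClass: AnyClass { return CAShapeLayer.self }

    private var shapeLayer: CAShapeLayer { return layer as! CAShapeLayer }
    private let path = UIBezierPath()

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        path.move(to: touch.location(in: self))
        shapeLayer.strokeColor = UIColor.black.cgColor
        shapeLayer.fillColor = nil
        shapeLayer.lineWidth = 3
        shapeLayer.lineCap = kCALineCapRound   // Swift 3 string constant
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        path.addLine(to: touch.location(in: self))
        shapeLayer.path = path.cgPath   // layer redraws incrementally
    }
}
```

Because the layer holds a vector path rather than a bitmap, it also stays sharp at any zoom scale without a huge backing image.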

Related

Cannot find CGPoint in CGPath

I have the following function for drawing a line between two points:
override func draw(from fromPoint: CGPoint, to toPoint: CGPoint) {
    let path = UIBezierPath()
    path.move(to: fromPoint)
    path.addLine(to: toPoint)
    path.close()
    layer.path = path.cgPath
    layer.strokeColor = pencil.color.cgColor
    layer.lineWidth = pencil.strokeSize
}
This is called from touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?).
It works fine: the line draws itself at the correct screen position.
I also have an eraser feature, where the user can erase previous drawings, so I'm trying to determine whether a touch position is contained by any of the drawn paths:
private func findLayer(in touch: UITouch) -> CAShapeLayer? {
    let point = touch.location(in: view)
    // All the drawn paths are contained in CAShapeLayers.
    // Circle and rectangle layers have a corresponding frame that contains the UIBezierPath.
    if let hitLayer = view?.layer.hitTest(point) as? CAShapeLayer,
        hitLayer.path?.contains(point) == true || hitLayer.frame.contains(point) {
        return hitLayer
    }
    guard let sublayers = view?.layer.sublayers else { return nil }
    // This extra check is for layers that don't have a frame (lines, angles, and free drawing).
    for layer in sublayers {
        if let shapeLayer = layer as? CAShapeLayer,
            shapeLayer.path?.contains(point) == true || shapeLayer.frame.contains(point) {
            return shapeLayer
        }
    }
    return nil
}
The problem is that when the user draws a line, findLayer never returns the layer with the line, but it works perfectly when the user draws a circle or a rectangle.
I don't want to give line drawings a frame, because then the hit box could be too big, and the user could delete the drawing even when the touch isn't near the real path.
How can I find whether a touch point is part of a CAShapeLayer's path?
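One likely explanation, sketched here as an assumption rather than a confirmed answer: CGPath.contains(_:) tests the *fill* region of a path, which is empty for an open line (circles and rectangles have a fillable interior, which is why they work). A common fix is to convert the path to its stroked outline first, so the hit region follows the drawn stroke:

```swift
import QuartzCore

// Hit-test against the stroked outline of the layer's path instead of
// its (empty) fill region. `tolerance` widens the hit area for thin lines.
func layerPathContains(_ layer: CAShapeLayer, point: CGPoint, tolerance: CGFloat = 10) -> Bool {
    guard let path = layer.path else { return false }
    let hitPath = path.copy(strokingWithWidth: max(layer.lineWidth, tolerance),
                            lineCap: .round,
                            lineJoin: .round,
                            miterLimit: 0)
    return hitPath.contains(point)
}
```

This uses CGPath's copy(strokingWithWidth:lineCap:lineJoin:miterLimit:) to build a closed path covering the stroke, which contains(_:) can then test normally.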

How can I make a CGRect move randomly around the screen so long as the user is touching it?

I'd like a rectangle to move around the screen randomly when the user holds a finger on it, and stop moving when the user's finger moves off of it. In other words, as long as the user can keep up with it, it keeps moving. How should I go about doing this?
let rect = CGRect(x: 157, y: 398, width: 100, height: 100) // create rect
let view = UIView(frame: rect) // create view for rect
view.backgroundColor = .red // color rect
self.view.addSubview(view) // display rect
You can simply use the touch methods to move your view:
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    guard let touch = touches.first else { return }
    let location = touch.location(in: self.view) // gives you a CGPoint
    yourView.frame = CGRect(origin: location, size: yourView.frame.size)
}
That might help!
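The snippet above only follows the finger; for the random movement the question actually asks about, one possible sketch (all names illustrative, interval and behavior are assumptions) runs a timer while the touch stays on the view:

```swift
import UIKit

class ChaseViewController: UIViewController {
    let box = UIView(frame: CGRect(x: 157, y: 398, width: 100, height: 100))
    var timer: Timer?

    override func viewDidLoad() {
        super.viewDidLoad()
        box.backgroundColor = .red
        view.addSubview(box)
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first,
              box.frame.contains(touch.location(in: view)) else { return }
        // Jump to a random on-screen origin every half second while held.
        timer = Timer.scheduledTimer(withTimeInterval: 0.5, repeats: true) { [weak self] _ in
            guard let strongSelf = self else { return }
            let maxX = strongSelf.view.bounds.width - strongSelf.box.frame.width
            let maxY = strongSelf.view.bounds.height - strongSelf.box.frame.height
            strongSelf.box.frame.origin = CGPoint(
                x: CGFloat(arc4random_uniform(UInt32(maxX))),
                y: CGFloat(arc4random_uniform(UInt32(maxY))))
        }
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        // Stop when the finger is no longer over the box.
        if let touch = touches.first, !box.frame.contains(touch.location(in: view)) {
            timer?.invalidate()
        }
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        timer?.invalidate()
    }
}
```

Note Timer.scheduledTimer(withTimeInterval:repeats:block:) requires iOS 10+; on earlier systems you'd use the selector-based variant.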

Drawing on a UIImageView inside a UIScrollView

I have a UIImageView inside a UIScrollView which automatically zooms out to fit the image supplied. The user can zoom as usual with a pinch gesture, and the pan gesture is set to require two touches since the drawing takes precedence.
On launch, everything looks great, but when I invoke my drawing code, this happens:
As you can see, when drawLineFrom(fromPoint:toPoint:) is invoked, the UIImageView shrinks. After that, the drawing appears to work as intended (though it skips the first part of the line on every touch).
My UIPanGestureRecognizer selector:
@objc func onOneFingerDrawing(_ sender: UIPanGestureRecognizer) {
    switch sender.state {
    case .began:
        swiped = false
        lastPoint = sender.location(in: drawView.imageView)
    case .changed:
        swiped = true
        let currentPoint = sender.location(in: drawView.imageView)
        drawLineFrom(fromPoint: lastPoint, toPoint: currentPoint)
        lastPoint = currentPoint
    case .ended:
        guard drawView.scrollView.frame.contains(sender.location(in: drawView.imageView)) else {
            return
        }
        if let newImage = drawView.imageView.image {
            if history.count > historyIndex + 1 {
                history.removeLast((history.count - 1) - historyIndex)
            }
            history.append(newImage)
            historyIndex = history.count - 1
        }
    case .possible,
         .cancelled,
         .failed:
        return
    }
}
and my drawLineFrom(fromPoint:toPoint:):
@objc func drawLineFrom(fromPoint: CGPoint, toPoint: CGPoint) {
    UIGraphicsBeginImageContextWithOptions(drawView.imageView.frame.size, false, UIScreen.main.scale)
    let context = UIGraphicsGetCurrentContext()
    context?.interpolationQuality = .none
    drawView.imageView.image?.draw(in: CGRect(x: 0, y: 0, width: drawView.imageView.frame.size.width, height: drawView.imageView.frame.size.height))
    context?.move(to: fromPoint)
    context?.addLine(to: toPoint)
    context?.setLineCap(.round)
    context?.setLineWidth(lineWidth)
    context?.setStrokeColor(lineColor)
    context?.setBlendMode(blendMode)
    context?.strokePath()
    let newImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    drawView.imageView.image = newImage
}
There is an issue with the image view's constraints inside the scroll view. When you start rendering the image, the image view's frame changes to match its content. You need to add an aspect-ratio constraint (as I did) or some other size constraint to the image view. See the GIFs for reference.
Before add image view size constraint.
After adding image view size constraint.
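A sketch of what pinning the image view's aspect ratio might look like in code (the answer above used Interface Builder; `imageView` and its image are assumed to already exist):

```swift
// Constrain the image view's aspect ratio to its image's, so redrawing
// the image doesn't let Auto Layout resize the view out from under you.
imageView.translatesAutoresizingMaskIntoConstraints = false
if let image = imageView.image {
    imageView.widthAnchor.constraint(
        equalTo: imageView.heightAnchor,
        multiplier: image.size.width / image.size.height).isActive = true
}
```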
The drawing skips the first part of the line because you are using a UIPanGestureRecognizer. The system first has to determine that the gesture is a pan before it sends you a .began event. You could swap it for a plain UIGestureRecognizer subclass to have it start immediately; you'd then want additional logic to check for movement and the number of fingers.
As for the resizing, it's tough to say without more info. I'd color the background of the image view as well. The first question is: is the whole image view shrinking, or just the actual image inside it?

Sprite-Kit: moving an SKSpriteNode

I know it's not that hard to move an object, but my case is different; I tried various ways and none of them worked.
What I want to achieve:
If the user taps the left side of the screen, the ball should go left.
If the user taps the right side of the screen, the ball should go right.
So I went straight to touchesBegan and wrote the following:
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    ball?.removeAllActions()
    for touch in touches {
        let moveBY = SKAction.moveTo(x: touch.location(in: view).x, duration: 1.0)
        self.ball?.run(moveBY)
    }
}
I tried four different ways and couldn't get past it. For reference, here's a photo of my ball's info:
In order to move a node where you want, you first have to know three things:
the anchorPoint of the node you want to move
the anchorPoint of the parent of the node you want to move
the position of the node you want to move
The anchorPoint property of a node stores its value in a normalized way, from (0, 0) to (1, 1). It defaults to (0.5, 0.5) for SKSpriteNode.
The position property of a node stores its position as (x, y) coordinates in an orthogonal coordinate system whose origin (0, 0) is located at the anchor point of the PARENT.
So if your scene's anchorPoint is (0.5, 0.5), placing a direct child at coordinates (0, 0) positions the child so that its anchorPoint sits at (0, 0), which coincides with its parent's anchorPoint.
When you override touchesBegan, touchesMoved, etc. inside an SKScene (or an SKNode in general) and want the touch position expressed in the coordinate system that the scene's (or node's) direct children live in, you should do something like:
class GameScene: SKScene {
    override func didMove(to view: SKView) {
        super.didMove(to: view)
        let rect = SKSpriteNode(color: .green, size: CGSize(width: 100, height: 100))
        addChild(rect)
        rect.name = "rect"
        // Don't pay attention to these two lines; I need them because I don't have an .sks file.
        size = CGSize(width: 1334, height: 750)
        anchorPoint = CGPoint(x: 0.5, y: 0.5)
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        let moveAction = SKAction.move(to: touch.location(in: self), duration: 1)
        childNode(withName: "rect")?.run(moveAction)
    }
}
Hope that helps. Another problem you may have is that your ball is not a direct child of the scene, in which case you'll get its coordinates wrong.
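Applied to the tap-left/tap-right behavior from the question, a sketch (assuming `ball` is a direct child of the scene; the step distance of 100 points is arbitrary) could compare the touch's x position, in scene coordinates, to the scene's horizontal center:

```swift
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    guard let touch = touches.first, let ball = ball else { return }
    ball.removeAllActions()
    // location(in: self) gives scene coordinates, so the comparison
    // against frame.midX works regardless of the scene's anchorPoint.
    let x = touch.location(in: self).x
    let step: CGFloat = x < frame.midX ? -100 : 100
    ball.run(SKAction.moveBy(x: step, y: 0, duration: 1.0))
}
```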

Drawing on UIImageView within UIScrollView

About my app: The user can view a PDF file within a UIWebView. I have an option for the user to choose whether they want to scroll through the pdf or take notes on it. When they take notes, the scrolling is disabled, and vice-versa. However, when the user is drawing, the lines move up and become hazy as shown:
(The red boxes are text within the pdf)
Here is my code:
Switching between the pen and scroll:
var usingPen = false

@IBAction func usePen(sender: AnyObject) {
    usingPen = true
    webView.userInteractionEnabled = false
    UIView.animateWithDuration(0.3) { () -> Void in
        self.popUpView.alpha = 0
    }
}

@IBAction func useScroll(sender: AnyObject) {
    usingPen = false
    webView.userInteractionEnabled = true
    UIView.animateWithDuration(0.3) { () -> Void in
        self.popUpView.alpha = 0
    }
}
The imageView the user draws on (objectView):
var objectView = UIImageView()

override func viewDidAppear(animated: Bool) {
    objectView.frame.size = webView.scrollView.contentSize
    webView.scrollView.addSubview(objectView)
}
Drawing on the image view:
var start = CGPoint()
let size: CGFloat = 3
var color = UIColor.blackColor()

func draw(start: CGPoint, end: CGPoint) {
    if usingPen == true {
        UIGraphicsBeginImageContext(self.objectView.frame.size)
        let context = UIGraphicsGetCurrentContext()
        objectView.image?.drawInRect(CGRect(x: 0, y: 0, width: objectView.frame.width, height: objectView.frame.height))
        CGContextSetFillColorWithColor(context, color.CGColor)
        CGContextSetStrokeColorWithColor(context, color.CGColor)
        CGContextSetLineWidth(context, size)
        CGContextBeginPath(context)
        CGContextMoveToPoint(context, start.x, start.y)
        CGContextAddLineToPoint(context, end.x, end.y)
        CGContextStrokePath(context)
        let newImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        objectView.image = newImage
    }
}

override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?) {
    start = (touches.first?.locationInView(self.objectView))!
}

override func touchesMoved(touches: Set<UITouch>, withEvent event: UIEvent?) {
    draw(start, end: (touches.first?.locationInView(self.objectView))!)
    start = (touches.first?.locationInView(self.objectView))!
}
How can I prevent the haziness and movement of the drawings? Thanks for your help.
This probably occurs due to an image-scaling issue. Since you're drawing into an image context with a scale factor of 1 (the default) and your screen has a scale factor of 2 or 3, the lines blur slightly every time they're copied and redrawn. The solution is to specify the screen's scale when you create your image context:
UIGraphicsBeginImageContextWithOptions(self.objectView.frame.size, false, UIScreen.mainScreen().scale)
Note that the way you're drawing the line is fairly inefficient. Instead, you probably want to create a CGBitmapContext and continually draw to that; it'd be much faster and would also eliminate the "generation loss" problem you have.
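A rough sketch of that bitmap-context idea, in Swift 3 syntax for brevity (variable names follow the question's code; this is an outline of the technique, not a drop-in replacement, and a production version would also flip the context to match UIKit's top-left origin):

```swift
// Create ONE persistent bitmap context at screen scale, instead of
// rebuilding an image context and recopying the image on every segment.
let scale = UIScreen.main.scale
let context = CGContext(data: nil,
                        width: Int(objectView.frame.width * scale),
                        height: Int(objectView.frame.height * scale),
                        bitsPerComponent: 8,
                        bytesPerRow: 0,
                        space: CGColorSpaceCreateDeviceRGB(),
                        bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)!
context.scaleBy(x: scale, y: scale)   // draw in points, render in pixels
// NOTE: Core Graphics bitmaps have a bottom-left origin; flip here if needed.

// On each touch event, stroke only the new segment into the same context...
context.move(to: start)
context.addLine(to: end)
context.setLineWidth(size)
context.setStrokeColor(color.cgColor)
context.strokePath()

// ...then refresh the image view from the live bitmap.
if let cgImage = context.makeImage() {
    objectView.image = UIImage(cgImage: cgImage, scale: scale, orientation: .up)
}
```

The key difference is that the existing strokes are never re-composited through a UIImage, so there's no repeated copy and no generation loss.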