Question
I'm trying to understand different drawing methods in Swift and why they perform as they do.
The code below draws smooth lines using a UIBezierPath; the project is available from https://github.com/limhowe/LimSignatureView.
Initially the code is highly responsive when the user begins touching the screen. However, the longer a touch-and-move gesture lasts, the more the performance lags: the drawing on screen doesn't keep up and isn't accurate (some points appear to be missed). Performance only returns to being highly responsive once the touch ends and a new drawing is started.
Things I've noticed:
- When self.setNeedsDisplay() is commented out (in override func touchesMoved), nothing is drawn while touching the screen, but once the finger is lifted and override func touchesEnded fires, the final result is a perfect drawing, with no lag or inaccurate points no matter how long the gesture lasts. (This is the result I ideally want displayed while drawing.)
- When beizerPath.removeAllPoints() is commented out (in override func touchesEnded), the lag continues even after the user lifts their finger and starts touching the screen again.
- It seems that beizerPath.removeAllPoints() might reset the lag, while self.setNeedsDisplay() might be causing the lag over time.
- When increasing beizerPath.lineWidth from 2 to a larger width (e.g. 50 or 100), the drawing tends to lag slightly.
Hmmm. I'm confused as to why this all is as it is.
- What exactly is going on here with the lagging over time?
- What is the relationship between beizerPath.removeAllPoints(), self.setNeedsDisplay(), and the lagging?
- Why does increasing beizerPath.lineWidth cause some minor lagging?
- What else might be causing lagging?
- Could removing some of the points while drawing improve performance, and if so, how can that be achieved?
- What needs to be modified in the code below to ensure that the drawing stays perfect over time with no lagging?
Appreciate any enlightened feedback and improvements to the code. Thank you.
//  LimSignatureView.swift
//  SwiftSignatureView
//
//  Created by MyAdmin on 3/6/15.
//  Copyright (c) 2015 MyAdmin. All rights reserved.
//

import UIKit

class LimSignatureView: UIView {

    var beizerPath: UIBezierPath = UIBezierPath()
    var incrImage: UIImage?
    var points: [CGPoint] = Array<CGPoint>(count: 5, repeatedValue: CGPointZero)
    var control: Int = 0
    var lblSignature: UILabel = UILabel()
    var shapeLayer: CAShapeLayer?

    required init(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    // Only override drawRect: if you perform custom drawing.
    // An empty implementation adversely affects performance during animation.
    override func drawRect(rect: CGRect) {
        // Drawing code
        incrImage?.drawInRect(rect)
        beizerPath.stroke()

        // Set initial color for drawing
        UIColor.redColor().setFill()
        UIColor.redColor().setStroke()
        beizerPath.stroke()
    }

    override init(frame: CGRect) {
        super.init(frame: frame)
        var lblHeight: CGFloat = 61.0
        self.backgroundColor = UIColor.blackColor()
        beizerPath.lineWidth = 2.0

        lblSignature.frame = CGRectMake(0, self.frame.size.height/2 - lblHeight/2, self.frame.size.width, lblHeight)
        lblSignature.font = UIFont(name: "HelveticaNeue-UltraLight", size: 30)
        lblSignature.text = "Sign Here"
        lblSignature.textColor = UIColor.lightGrayColor()
        lblSignature.textAlignment = NSTextAlignment.Center
        lblSignature.alpha = 0.3
        self.addSubview(lblSignature)
    }

    // MARK: - TOUCH Implementation

    override func touchesBegan(touches: NSSet, withEvent event: UIEvent) {
        if lblSignature.superview != nil {
            lblSignature.removeFromSuperview()
        }
        control = 0
        var touch = touches.anyObject() as UITouch
        points[0] = touch.locationInView(self)

        var startPoint = points[0]
        var endPoint = CGPointMake(startPoint.x + 1.5, startPoint.y + 2)
        beizerPath.moveToPoint(startPoint)
        beizerPath.addLineToPoint(endPoint)
    }

    override func touchesMoved(touches: NSSet, withEvent event: UIEvent) {
        var touch = touches.anyObject() as UITouch
        var touchPoint = touch.locationInView(self)
        control++
        points[control] = touchPoint
        if (control == 4) {
            points[3] = CGPointMake((points[2].x + points[4].x)/2.0, (points[2].y + points[4].y)/2.0)
            beizerPath.moveToPoint(points[0])
            beizerPath.addCurveToPoint(points[3], controlPoint1: points[1], controlPoint2: points[2])
            self.setNeedsDisplay()

            points[0] = points[3]
            points[1] = points[4]
            control = 1
        }
    }

    override func touchesEnded(touches: NSSet, withEvent event: UIEvent) {
        self.drawBitmapImage()
        self.setNeedsDisplay()
        beizerPath.removeAllPoints()
        control = 0
    }

    override func touchesCancelled(touches: NSSet!, withEvent event: UIEvent!) {
        self.touchesEnded(touches, withEvent: event)
    }

    // MARK: - LOGIC

    func drawBitmapImage() {
        UIGraphicsBeginImageContextWithOptions(self.bounds.size, false, 0)
        if incrImage != nil {
            var rectpath = UIBezierPath(rect: self.bounds)
            UIColor.clearColor().setFill()
            rectpath.fill()
        }
        incrImage?.drawAtPoint(CGPointZero)

        // Set final color for drawing
        UIColor.redColor().setStroke()
        beizerPath.stroke()

        incrImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
    }
}
Answer 1:
The reason it becomes laggy is twofold.
First, you are doing the drawing in the touch events. That is generally considered a bad idea. I agree that it is the best way to get the cleanest tracing of touch movement, but it is definitely not a good idea if you are concerned with performance.
Second, you are drawing the entire path (from touch-down to touch-up) on every move event. So even though you have already drawn the first three segments when you get to the fourth, you clear the screen and draw the first three again. This, paired with the drawing happening on every touch event, causes major slowdowns.
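To make that concrete, here is a rough, illustrative cost model (not from the original post). In the question's code a new curve segment is appended roughly every third move event, and each setNeedsDisplay() re-strokes every segment accumulated so far, so the total stroking work grows roughly quadratically with the length of the gesture:

    // Illustrative only: if the n-th appended segment triggers a redraw of all
    // n segments so far, the total number of segment-strokes is 1 + 2 + ... + n.
    func strokesPerformed(afterSegments n: Int) -> Int {
        return n * (n + 1) / 2
    }

    print(strokesPerformed(afterSegments: 60))    // 1,830 segment-strokes
    print(strokesPerformed(afterSegments: 600))   // 180,300 segment-strokes

That is why the view feels fine at the start of a stroke and gets progressively worse the longer you keep drawing.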
Ideally you would cache the latest touch event into an object. Then create a timer (60 fps maybe?) and draw a line from the last timer event to the current cached touch location. This could cause problems with the line not following the touch event as closely, but you may have to just experiment with that.
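As an illustration of the timer idea, here is a minimal sketch in current Swift syntax (the question uses Swift 1.x). The class and property names (ThrottledStrokeView, pendingPoint, and so on) are made up for the example; the point is that the touch handlers only cache the latest location, and a CADisplayLink does the drawing at most once per screen refresh:

    import UIKit

    final class ThrottledStrokeView: UIView {
        private var displayLink: CADisplayLink?
        private var pendingPoint: CGPoint?      // latest touch location, written by touch events
        private var lastDrawnPoint: CGPoint?    // where the previous tick left off
        private let path = UIBezierPath()

        override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
            lastDrawnPoint = touches.first?.location(in: self)
            pendingPoint = lastDrawnPoint
            displayLink = CADisplayLink(target: self, selector: #selector(tick))
            displayLink?.add(to: .main, forMode: .common)
        }

        override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
            // Only remember the latest point; the display link does the drawing.
            pendingPoint = touches.first?.location(in: self)
        }

        override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
            displayLink?.invalidate()
            displayLink = nil
        }

        @objc private func tick() {
            guard let from = lastDrawnPoint, let to = pendingPoint, from != to else { return }
            path.move(to: from)
            path.addLine(to: to)
            lastDrawnPoint = to
            setNeedsDisplay()                   // at most one redraw per frame
        }

        override func draw(_ rect: CGRect) {
            // Still strokes the whole path; the next optimization below fixes that.
            UIColor.red.setStroke()
            path.stroke()
        }
    }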
Then with that optimization, you should draw into an image context, then draw that context to the screen when needed. That way you are only drawing the latest segment into the context instead of redrawing the entire path.
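A sketch of that incremental approach, loosely modeled on the question's drawBitmapImage()/incrImage pattern but run per segment rather than only in touchesEnded (current Swift syntax, assumes iOS 10+ for UIGraphicsImageRenderer; names are illustrative):

    import UIKit

    final class IncrementalStrokeView: UIView {
        private var cachedImage: UIImage?           // everything drawn so far
        private let segmentPath = UIBezierPath()    // only the segment in flight

        func appendSegment(from start: CGPoint,
                           control1: CGPoint,
                           control2: CGPoint,
                           to end: CGPoint) {
            segmentPath.move(to: start)
            segmentPath.addCurve(to: end, controlPoint1: control1, controlPoint2: control2)
            flushSegmentIntoCache()
            setNeedsDisplay()
        }

        private func flushSegmentIntoCache() {
            let renderer = UIGraphicsImageRenderer(bounds: bounds)
            cachedImage = renderer.image { _ in
                cachedImage?.draw(at: .zero)        // previous strokes, one image blit
                UIColor.red.setStroke()
                segmentPath.lineWidth = 2
                segmentPath.stroke()                // just the newest segment
            }
            segmentPath.removeAllPoints()           // keep the live path tiny
        }

        override func draw(_ rect: CGRect) {
            cachedImage?.draw(in: bounds)           // cost independent of stroke length
        }
    }

The key point is that the live path never grows: each completed curve is baked into the cached image and then discarded, so the per-frame work stays constant no matter how long the signature gets.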
Those two things should improve your performance immensely. They will have a detrimental effect on the clarity of tracing touch events, but you should be able to find a happy medium in there somewhere. Maybe you cache all touch events and then, on the timer event, draw all of the latest points into the context and draw that context to the screen. Then you should be able to keep the clarity of tracing and improve the performance.
You could also look into drawing into a UIImage that is inside a UIImageView onscreen. That may preserve your historical drawn path without requiring you to redraw it every pass, but I don't have any experience in that.
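For completeness, a sketch of that UIImageView variant (again current Swift, illustrative names): the accumulated drawing lives in a UIImage shown by an image view, so the system composites the historical strokes for you and the view never needs a custom draw(_:) for them.

    import UIKit

    final class ImageBackedSignatureView: UIView {
        private let canvasView = UIImageView()
        private var lastPoint: CGPoint = .zero

        override init(frame: CGRect) {
            super.init(frame: frame)
            canvasView.frame = bounds
            canvasView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
            addSubview(canvasView)
        }

        required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

        override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
            lastPoint = touches.first?.location(in: self) ?? .zero
        }

        override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
            guard let point = touches.first?.location(in: self) else { return }
            let renderer = UIGraphicsImageRenderer(bounds: bounds)
            canvasView.image = renderer.image { _ in
                canvasView.image?.draw(at: .zero)   // previous strokes
                let segment = UIBezierPath()
                segment.move(to: lastPoint)
                segment.addLine(to: point)
                segment.lineWidth = 2
                UIColor.red.setStroke()
                segment.stroke()
            }
            lastPoint = point
        }
    }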
Source: https://stackoverflow.com/questions/32829778/drawing-performance-over-time-for-a-uibezierpath-with-swift-for-ios