CGContext: how do I erase pixels (e.g. kCGBlendModeClear) outside of a bitmap context?

Asked by 梦毁少年i, 2021-02-01 09:29

I'm trying to build an eraser tool using Core Graphics, and I'm finding it incredibly difficult to make a performant eraser - it all comes down to:

CGContextSetB

3 Answers
  • 2021-02-01 10:08

    Core Graphics is a 2D drawing API that follows a painting paradigm. When you are painting, it's hard to remove paint you've already put on the canvas, but super easy to add more paint on top. The blend modes in a bitmap context give you a way to do something hard (scrape paint off the canvas) in a few lines of code. Those few lines of code do not make it a cheap operation, though, which is why it performs slowly.

    The easiest way to fake clearing out pixels without having to do the offscreen bitmap buffering is to paint the background of your view over the image.

    - (void)drawRect:(CGRect)rect
    {
        if (drawingStroke) {
            CGColorRef lineCgColor = lineColor.CGColor;
            if (eraseModeOn) {
                // Use a concrete background color to display erasing. You could use the
                // backgroundColor property of the view, or define a color here.
                lineCgColor = [[self backgroundColor] CGColor];
            }
            [curImage drawAtPoint:CGPointZero];
            CGContextRef context = UIGraphicsGetCurrentContext();
            CGContextAddPath(context, currentPath);
            CGContextSetLineCap(context, kCGLineCapRound);
            CGContextSetLineWidth(context, lineWidth);
            CGContextSetBlendMode(context, kCGBlendModeNormal);
            CGContextSetStrokeColorWithColor(context, lineCgColor);
            CGContextStrokePath(context);
        } else {
            [curImage drawAtPoint:CGPointZero];
        }
    }
    

    The more difficult (but more correct) way is to do the image editing on a background serial queue in response to an editing event. When you get a new action, you do the bitmap rendering in the background to an image buffer. When the buffered image is ready, you call setNeedsDisplay to allow the view to be redrawn during the next update cycle. This is more correct as drawRect: should be displaying the content of your view as quickly as possible, not processing the editing action.

    @interface ImageEditor : UIView
    
    @property (nonatomic, strong) UIImage * imageBuffer;
    @property (nonatomic, strong) dispatch_queue_t serialQueue;
    @end
    
    @implementation ImageEditor
    
    - (dispatch_queue_t) serialQueue
    {
        if (_serialQueue == nil)
        {
            _serialQueue = dispatch_queue_create("com.example.com.imagebuffer", DISPATCH_QUEUE_SERIAL);
        }
        return _serialQueue;
    }
    
    - (void)editingAction
    {
        dispatch_async(self.serialQueue, ^{
            CGSize bufferSize = [self.imageBuffer size];

            UIGraphicsBeginImageContext(bufferSize);

            CGContextRef context = UIGraphicsGetCurrentContext();

            // Draw through UIKit so the image is not flipped; CGContextDrawImage would
            // draw it upside down because the UIKit image context is flipped relative
            // to Core Graphics coordinates.
            [self.imageBuffer drawInRect:CGRectMake(0, 0, bufferSize.width, bufferSize.height)];

            // Do editing action: draw a clear line, solid line, etc., using `context`

            self.imageBuffer = UIGraphicsGetImageFromCurrentImageContext();
            UIGraphicsEndImageContext();

            dispatch_async(dispatch_get_main_queue(), ^{
                [self setNeedsDisplay];
            });
        });
    }
    -(void)drawRect:(CGRect)rect
    {
        [self.imageBuffer drawAtPoint:CGPointZero];
    }
    
    @end
    
  • 2021-02-01 10:12

    The key is CGContextBeginTransparencyLayer: stroke the erase path inside a transparency layer, using clearColor as the stroke color and setting CGContextSetBlendMode(context, kCGBlendModeClear);
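    A minimal sketch of that idea inside -drawRect: (the backgroundImage and erasePath names are placeholders for whatever your view stores, not taken from this answer):

        CGContextRef context = UIGraphicsGetCurrentContext();
        CGContextBeginTransparencyLayer(context, NULL);   // composite the following draws as one layer
        [backgroundImage drawAtPoint:CGPointZero];        // the pixels being erased from
        CGContextAddPath(context, erasePath);
        CGContextSetLineCap(context, kCGLineCapRound);
        CGContextSetBlendMode(context, kCGBlendModeClear);
        CGContextSetStrokeColorWithColor(context, [[UIColor clearColor] CGColor]);
        CGContextStrokePath(context);                     // punches a transparent hole in the layer
        CGContextEndTransparencyLayer(context);           // the flattened layer is composited into the view

    Clearing only works inside the layer: without the transparency layer, kCGBlendModeClear would erase straight through to the window background.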

  • 2021-02-01 10:32

    I've managed to get good results by using the following code:

    - (void)drawRect:(CGRect)rect
    {
        if (drawingStroke) {
            if (eraseModeOn) {
                CGContextRef context = UIGraphicsGetCurrentContext();
                CGContextBeginTransparencyLayer(context, NULL);
                [eraseImage drawAtPoint:CGPointZero];
    
                CGContextAddPath(context, currentPath);
                CGContextSetLineCap(context, kCGLineCapRound);
                CGContextSetLineWidth(context, ERASE_WIDTH);
                CGContextSetBlendMode(context, kCGBlendModeClear);
                CGContextSetStrokeColorWithColor(context, [[UIColor clearColor] CGColor]);
                CGContextStrokePath(context);
                CGContextEndTransparencyLayer(context);
            } else {
                [curImage drawAtPoint:CGPointZero];
                CGContextRef context = UIGraphicsGetCurrentContext();
                CGContextAddPath(context, currentPath);
                CGContextSetLineCap(context, kCGLineCapRound);
                CGContextSetLineWidth(context, self.lineWidth);
                CGContextSetBlendMode(context, kCGBlendModeNormal);
                CGContextSetStrokeColorWithColor(context, self.lineColor.CGColor);
                CGContextStrokePath(context);
            }
        } else {
            [curImage drawAtPoint:CGPointZero];
        }
    
        self.empty = NO;
    }
    

    The trick was to wrap the following into CGContextBeginTransparencyLayer / CGContextEndTransparencyLayer calls:

    • Blitting the erase background image to the context
    • Drawing the "erase" path on top of the erase background image, using kCGBlendModeClear

    Because the erase background image's pixel data and the erase path are drawn into the same transparency layer, the clear blend mode actually removes pixels when the layer is flattened, rather than just painting transparent color on top.
