Fast blur on iOS

孤独总比滥情好 2020-12-07 10:29

I'm running into trouble trying to blur part of the screen in my iOS app. See the image for a better idea of what I'm trying to do.

5 Answers
  • 2020-12-07 10:45

    You might try the Apple Core Image Filter (CIFilter) set of routines. They require iOS 5, so you might not be able to use them.

    I am not sure whether it is faster than the methods you have tried, but I have used it in past projects and it works really well. If you can grab the part of the screen you want to blur, put it into an image, pass it through a filter, and then re-display it at the appropriate place on the screen, that should work (see the sketch below).

    I used the filters to change the colors of an image in real-time, and it worked well.

    http://developer.apple.com/library/ios/#DOCUMENTATION/GraphicsImaging/Reference/QuartzCoreFramework/Classes/CIFilter_Class/Reference/Reference.html
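
    For illustration, here is a rough sketch of that approach using CIGaussianBlur (check that the filter is available on your deployment target). The BlurRegion name, the snapshot image, and the regionToBlur rect are assumptions for the example; Retina scale handling is omitted.

    #import <CoreImage/CoreImage.h>
    #import <UIKit/UIKit.h>

    // Sketch: blur one region of a snapshot with CIGaussianBlur.
    // `snapshot` and `regionToBlur` are assumed to come from the caller.
    UIImage *BlurRegion(UIImage *snapshot, CGRect regionToBlur)
    {
        CIImage *input = [CIImage imageWithCGImage:snapshot.CGImage];

        CIFilter *blur = [CIFilter filterWithName:@"CIGaussianBlur"];
        [blur setValue:input forKey:kCIInputImageKey];
        [blur setValue:@8.0 forKey:kCIInputRadiusKey];

        // Core Image's origin is the lower-left corner, so flip the UIKit rect.
        CGFloat imageHeight = CGImageGetHeight(snapshot.CGImage);
        CGRect ciRect = CGRectMake(regionToBlur.origin.x,
                                   imageHeight - CGRectGetMaxY(regionToBlur),
                                   regionToBlur.size.width,
                                   regionToBlur.size.height);

        // In real code, reuse this context; creating it is the expensive part.
        CIContext *context = [CIContext contextWithOptions:nil];
        CGImageRef cgResult = [context createCGImage:blur.outputImage fromRect:ciRect];
        UIImage *result = [UIImage imageWithCGImage:cgResult];
        CGImageRelease(cgResult);
        return result;
    }

    The blurred piece can then be shown in a UIImageView placed over that region of the screen.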

  • 2020-12-07 10:51

    I would recommend Brad Larson's GPUImage, which is fully GPU-backed and offers a wide variety of image processing effects. It's very fast: in his demo app he does real-time video processing from the camera, and the frame rate is excellent.

    https://github.com/BradLarson/GPUImage

    Here is a code snippet I wrote that applies a tilt-shift blur: it blurs the bottom and top thirds of the image but leaves the middle un-blurred. His library is extremely extensive and contains almost every kind of image filter effect imaginable.

    #import "GPUImage.h"

    // Blur the top and bottom thirds of a screenshot, leaving the middle sharp.
    GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:[self screenshot]];

    GPUImageTiltShiftFilter *boxBlur = [[GPUImageTiltShiftFilter alloc] init];
    boxBlur.blurSize = 0.5;

    [stillImageSource addTarget:boxBlur];
    [stillImageSource processImage];

    // Read the result back from the filter (the last target in the chain).
    UIImage *processedImage = [boxBlur imageFromCurrentlyProcessedOutput];
    
  • 2020-12-07 10:59

    I haven't tested this, but I wonder if you could place a CALayer where you want the blurred box to be and then find a useful CIFilter to set as the CALayer's backgroundFilters. Just a thought; a sketch of the idea is below.

    See CALayer.backgroundFilters
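
    For what it's worth, an untested sketch of what that might look like. Note that Apple's CALayer documentation marks backgroundFilters as not supported on iOS layers, so in practice it may only take effect on OS X; the frame values here are placeholders.

    #import <QuartzCore/QuartzCore.h>
    #import <CoreImage/CoreImage.h>

    // Untested sketch of the suggestion above, e.g. inside viewDidLoad.
    CALayer *blurOverlay = [CALayer layer];
    blurOverlay.frame = CGRectMake(0.0, 100.0, 320.0, 200.0);  // region to blur (placeholder)

    CIFilter *blur = [CIFilter filterWithName:@"CIGaussianBlur"];
    [blur setDefaults];
    [blur setValue:@5.0 forKey:kCIInputRadiusKey];

    // Filters applied to whatever renders behind this layer.
    blurOverlay.backgroundFilters = @[blur];
    [self.view.layer addSublayer:blurOverlay];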

  • 2020-12-07 11:02

    Though it may be a bit late to respond: you can use Core Image filters. The reason it is so slow is this line:

    CIContext *context = [CIContext contextWithOptions:nil];
    

    Apple's documentation on getting the best performance out of Core Image states, first of all:

    "Don’t create a CIContext object every time you render. Contexts store a lot of state information; it’s more efficient to reuse them."

    My personal solution to this is to make a singleton for the Core Image context, so I only ever create one; a rough sketch of that pattern is at the end of this answer.

    My code is in this demo project on GitHub.

    https://github.com/KyleLopez/DemoCoreImage

    Feel free to use it, or find another solution to your liking. The slowest part I've found in Core Image is creating the context; image processing after that is really fast.
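
    As a minimal sketch of that singleton pattern (illustrative only, not the code from the linked demo project; the class name is made up):

    #import <CoreImage/CoreImage.h>

    // Shared Core Image context: created once, reused for every render.
    @interface ImageContext : NSObject
    + (CIContext *)sharedContext;
    @end

    @implementation ImageContext
    + (CIContext *)sharedContext
    {
        static CIContext *context = nil;
        static dispatch_once_t onceToken;
        dispatch_once(&onceToken, ^{
            context = [CIContext contextWithOptions:nil];
        });
        return context;
    }
    @end

    // Usage (assuming `blurred` is a CIImage you have already built):
    // CGImageRef cg = [[ImageContext sharedContext] createCGImage:blurred
    //                                                    fromRect:blurred.extent];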

  • 2020-12-07 11:02

    Have you tried this library:

    https://github.com/gdawg/uiimage-dsp

    It uses the vDSP/Accelerate framework and seems easy to use; a rough sketch of an Accelerate-based blur follows below.

    BTW: 0.01s seems far too quick. 0.03 should do as well.
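
    In case it helps, here is a rough sketch of what an Accelerate-based box blur looks like when calling vImage directly. This is not the linked library's API; it assumes an 8-bit, 4-channel (RGBA) bitmap and an odd kernel size.

    #import <Accelerate/Accelerate.h>
    #import <UIKit/UIKit.h>

    // Sketch: box-blur a UIImage with vImageBoxConvolve_ARGB8888.
    // `kernelSize` must be odd; larger kernels give a stronger blur.
    UIImage *BoxBlurredImage(UIImage *image, uint32_t kernelSize)
    {
        CGImageRef cgImage = image.CGImage;
        CFDataRef pixelData = CGDataProviderCopyData(CGImageGetDataProvider(cgImage));

        vImage_Buffer src = {
            .data     = (void *)CFDataGetBytePtr(pixelData),
            .height   = CGImageGetHeight(cgImage),
            .width    = CGImageGetWidth(cgImage),
            .rowBytes = CGImageGetBytesPerRow(cgImage)
        };

        NSMutableData *destData = [NSMutableData dataWithLength:src.rowBytes * src.height];
        vImage_Buffer dest = src;
        dest.data = destData.mutableBytes;

        vImageBoxConvolve_ARGB8888(&src, &dest, NULL, 0, 0,
                                   kernelSize, kernelSize, NULL, kvImageEdgeExtend);

        // Wrap the blurred bytes back up into a UIImage.
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef ctx = CGBitmapContextCreate(dest.data, dest.width, dest.height, 8,
                                                 dest.rowBytes, colorSpace,
                                                 CGImageGetBitmapInfo(cgImage));
        CGImageRef blurredCGImage = CGBitmapContextCreateImage(ctx);
        UIImage *result = [UIImage imageWithCGImage:blurredCGImage];

        CGImageRelease(blurredCGImage);
        CGContextRelease(ctx);
        CGColorSpaceRelease(colorSpace);
        CFRelease(pixelData);
        return result;
    }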
