How to create simple custom filter for iOS using Core Image Framework?

爱一瞬间的悲伤 2021-02-06 13:33

I want to use a custom filter in my app. I know that I need to use the Core Image framework, but I'm not sure that is the right way. Core Image fram

4 answers
  • 2021-02-06 13:59

    The original accepted answer is deprecated. From iOS 8 you can create custom kernels for filters; a minimal sketch follows the links below. You can find more information about this in:

    • WWDC 2014 Session 514 Advances in Core Image
    • Session 514 transcript
    • WWDC 2014 Session 515 Developing Core Image Filters for iOS
    • Session 515 transcript
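
    For example, on iOS 8 and later a color kernel can be built directly from a Core Image Kernel Language string. The following is a minimal, hypothetical sketch (the kernel source, the channel-swap effect it implements, and the imageURL variable are illustrative assumptions, not part of the sessions above):

    #import <CoreImage/CoreImage.h>
    
    // Hypothetical sketch: a color kernel that swaps the red and green channels.
    CIColorKernel *swapKernel = [CIColorKernel kernelWithString:
        @"kernel vec4 swapRG(__sample s) { return s.grba; }"];
    
    CIImage *sourceImage = [CIImage imageWithContentsOfURL:imageURL]; // imageURL is an assumed NSURL
    CIImage *filteredImage = [swapKernel applyWithExtent:sourceImage.extent
                                               arguments:@[sourceImage]];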
  • OUTDATED

    You can't create your own custom kernels/filters in iOS yet. See http://developer.apple.com/library/mac/#documentation/graphicsimaging/Conceptual/CoreImaging/ci_intro/ci_intro.html, specifically:

    Although this document is included in the reference library, it has not been updated in detail for iOS 5.0. A forthcoming revision will detail the differences in Core Image on iOS. In particular, the key difference is that **Core Image on iOS does not include the ability to create custom image filters**.

    (Bolding mine)

  • 2021-02-06 14:10

    As Adam states, Core Image on iOS does not currently support custom kernels the way the older Mac implementation does. This limits what you can do with the framework to some combination of the existing filters.

    (Update: 2/13/2012)

    For this reason, I've created an open source framework for iOS called GPUImage, which lets you create custom filters to be applied to images and video using OpenGL ES 2.0 fragment shaders. I describe more about how this framework operates in my post on the topic. Basically, you can supply your own custom OpenGL Shading Language (GLSL) fragment shaders to create a custom filter, and then run that filter against static images or live video. This framework is compatible with all iOS devices that support OpenGL ES 2.0, and can be used in applications targeting iOS 4.0 and later.

    For example, you can set up filtering of live video using code like the following:

    GPUImageVideoCamera *videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];
    GPUImageFilter *customFilter = [[GPUImageFilter alloc] initWithFragmentShaderFromFile:@"CustomShader"];
    GPUImageView *filteredVideoView = [[GPUImageView alloc] initWithFrame:CGRectMake(0.0, 0.0, viewWidth, viewHeight)];
    
    // Add the view somewhere so it's visible
    
    [videoCamera addTarget:customFilter];
    [customFilter addTarget:filteredVideoView];
    
    [videoCamera startCameraCapture];
    

    As an example of a custom fragment shader program that defines a filter, the following applies a sepia tone effect:

    varying highp vec2 textureCoordinate;
    
    uniform sampler2D inputImageTexture;
    
    void main()
    {
        lowp vec4 textureColor = texture2D(inputImageTexture, textureCoordinate);
        lowp vec4 outputColor;
        outputColor.r = (textureColor.r * 0.393) + (textureColor.g * 0.769) + (textureColor.b * 0.189);
        outputColor.g = (textureColor.r * 0.349) + (textureColor.g * 0.686) + (textureColor.b * 0.168);    
        outputColor.b = (textureColor.r * 0.272) + (textureColor.g * 0.534) + (textureColor.b * 0.131);
        outputColor.a = textureColor.a; // carry the source alpha through
    
        gl_FragColor = outputColor;
    }
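
    The same custom shader can also be run against a still image. A hedged sketch, assuming a recent version of GPUImage and a UIImage named inputUIImage:

    GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:inputUIImage]; // inputUIImage is assumed
    GPUImageFilter *customFilter = [[GPUImageFilter alloc] initWithFragmentShaderFromFile:@"CustomShader"];
    
    [stillImageSource addTarget:customFilter];
    [customFilter useNextFrameForImageCapture];
    [stillImageSource processImage];
    
    UIImage *filteredImage = [customFilter imageFromCurrentFramebuffer];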
    

    The language used for writing custom Core Image kernels on the Mac is very similar to GLSL. In fact, you'll be able to do a few things that you can't in desktop Core Image, because Core Image's kernel language lacks some things that GLSL has (like branching).

  • 2021-02-06 14:14

    You can create custom filters for iOS more easily than an Image Unit plug-in for Mac OS X, so much so that they would be preferable even if Image Unit plug-ins were supported by iOS. The problem is that you cannot actually package them or otherwise bundle them as a resource the way you can with Image Unit plug-ins; you have to expose your source code to the developers that use them. Moreover, they are only useful to developers; you cannot distribute them to end users of iOS graphics apps the way you can for Mac OS X graphics apps that import third-party Core Image filters. For that, you must embed them in a Photo Editing Extension.

    Still, even processing images with a custom Core Image filter for iOS is easier than with an Image Unit plug-in. There's no importing, followed by the confusing task of configuring .plist and description files and what-not.

    A custom Core Image filter for iOS is simply a Cocoa Touch class that subclasses CIFilter; in it, you specify the input parameters (always at least the image), custom attribute settings and their defaults, and then any combination of built-in or custom Core Image filters. If you want to add a kernel to the image-processing pipeline, you simply add a CIKernel method that loads the .cikernel file you write separately.

    The beauty of this particular method for developing a custom Core Image Filter for iOS is that custom filters are instantiated and called the same way as built-in filters:

    CIFilter *prewittKernel = [CIFilter filterWithName:@"PrewittKernel" keysAndValues:kCIInputImageKey, self.inputImage, nil];
    
    CIImage *result = prewittKernel.outputImage;
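
    As a hedged alternative (assuming the PrewittKernel class shown below is compiled into the app), the subclass can also be instantiated directly rather than looked up by name:

    PrewittKernel *prewitt = [[PrewittKernel alloc] init];   // direct instantiation of the CIFilter subclass
    prewitt.inputImage = self.inputImage;
    CIImage *result = prewitt.outputImage;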
    

    Here's a simple example that uses a GPU kernel to apply the Prewitt operator to an image; first, the Cocoa Touch class (subclassing CIFilter), then the CIKernel file (written in the GLSL-like Core Image Kernel Language):

    The header file:

    //
    //  PrewittKernel.h
    //  Photo Filter
    //
    //  Created by James Alan Bush on 5/23/15.
    //
    //
    
    #import <CoreImage/CoreImage.h>
    
    @interface PrewittKernel : CIFilter
    {
        CIImage *inputImage;
    }
    
    @property (retain, nonatomic) CIImage *inputImage;
    
    @end
    

    The implementation file:

    //
    //  PrewittKernel.m
    //  Photo Filter
    //
    //  Created by James Alan Bush on 5/23/15.
    //
    //
    
    #import "PrewittKernel.h"
    
    @implementation PrewittKernel
    
    @synthesize inputImage;
    
    - (CIKernel *)prewittKernel
    {
        static CIKernel *kernelPrewitt = nil;
        static dispatch_once_t onceToken;
    
        // Load and compile the kernel source only once; reuse it on later calls.
        dispatch_once(&onceToken, ^{
            NSBundle *bundle = [NSBundle bundleForClass:[self class]];
            NSError  *error  = nil;
            NSString *code   = [NSString stringWithContentsOfFile:[bundle pathForResource:@"PrewittKernel" ofType:@"cikernel"] encoding:NSUTF8StringEncoding error:&error];
            kernelPrewitt = [CIKernel kernelWithString:code];
        });
    
        return kernelPrewitt;
    }
    
    - (CIImage *)outputImage
    {
        CIImage *result = self.inputImage;
        // The Prewitt operator samples a one-pixel neighborhood, so the region of
        // interest is the requested rect expanded by one pixel in every direction.
        return [[self prewittKernel] applyWithExtent:result.extent roiCallback:^CGRect(int index, CGRect rect) {
            return CGRectInset(rect, -1.0, -1.0);
        } arguments:@[result]];
    }
    
    @end
    

    The CIKernel file (Core Image Kernel Language):

    /* PrewittKernel.cikernel */
    
    kernel vec4 prewittKernel(sampler image)
    {
        vec2 xy = destCoord();
        vec4 bottomLeftIntensity = sample(image, samplerTransform(image, xy + vec2(-1, -1)));
        vec4 topRightIntensity = sample(image, samplerTransform(image, xy + vec2(+1, +1)));
        vec4 topLeftIntensity = sample(image, samplerTransform(image, xy + vec2(+1, -1)));
        vec4 bottomRightIntensity = sample(image, samplerTransform(image, xy + vec2(-1, +1)));
        vec4 leftIntensity = sample(image, samplerTransform(image, xy + vec2(-1, 0)));
        vec4 rightIntensity = sample(image, samplerTransform(image, xy + vec2(+1, 0)));
        vec4 bottomIntensity = sample(image, samplerTransform(image, xy + vec2(0, -1)));
        vec4 topIntensity = sample(image, samplerTransform(image, xy + vec2(0, +1)));
        vec4 h = vec4(-topLeftIntensity - topIntensity - topRightIntensity + bottomLeftIntensity + bottomIntensity + bottomRightIntensity);
        vec4 v = vec4(-bottomLeftIntensity - leftIntensity - topLeftIntensity + bottomRightIntensity + rightIntensity + topRightIntensity);
        float h_max = max(h.r, max(h.g, h.b));
        float v_max = max(v.r, max(v.g, v.b));
        float mag = length(vec2(h_max, v_max)) * 1.0;
    
        return vec4(vec3(mag), 1.0);
    }
    

    Here's another filter that generates an unsharp mask by differencing a Gaussian-blurred copy of the image with the original, using only built-in Core Image filters and no custom kernel code; it also shows how to specify and use a custom attribute, namely the radius of the Gaussian blur:

    The header file:

    //
    //  GaussianKernel.h
    //  Chroma
    //
    //  Created by James Alan Bush on 7/12/15.
    //  Copyright © 2015 James Alan Bush. All rights reserved.
    //
    
    #import <CoreImage/CoreImage.h>
    
    @interface GaussianKernel : CIFilter
    {
        CIImage *inputImage;
        NSNumber *inputRadius;
    }
    
    @property (retain, nonatomic) CIImage *inputImage;
    @property (retain, nonatomic) NSNumber *inputRadius;
    
    @end
    

    The implementation file:

    //
    //  GaussianKernel.m
    //  Chroma
    //
    //  Created by James Alan Bush on 7/12/15.
    //  Copyright © 2015 James Alan Bush. All rights reserved.
    //
    
    #import "GaussianKernel.h"
    
    @implementation GaussianKernel
    
    @synthesize inputImage;
    @synthesize inputRadius;
    
    + (NSDictionary *)customAttributes
    {
        return @{
                 @"inputRadius" :
                     @{
                         kCIAttributeMin       : @3.0,
                         kCIAttributeMax       : @15.0,
                         kCIAttributeDefault   : @7.5,
                         kCIAttributeType      : kCIAttributeTypeScalar
                         }
                 };
    }
    
    - (void)setDefaults
    {
        self.inputRadius = @7.5;
    }
    
    - (CIImage *)outputImage
    {
        CIImage *result = self.inputImage;
    
        CGRect rect = [result extent];
        rect.origin = CGPointZero;
        CGRect cropRectLeft = CGRectMake(0, 0, rect.size.width, rect.size.height);
        CIVector *cropRect = [CIVector vectorWithX:rect.origin.x Y:rect.origin.y Z:rect.size.width W:rect.size.height];
    
        // Blur the image, then crop the result back to the original bounds.
        result = [[CIFilter filterWithName:@"CIGaussianBlur" keysAndValues:kCIInputImageKey, result, @"inputRadius", self.inputRadius, nil].outputImage imageByCroppingToRect:cropRectLeft];
    
        result = [CIFilter filterWithName:@"CICrop" keysAndValues:@"inputImage", result, @"inputRectangle", cropRect, nil].outputImage;
    
        // Difference the blurred copy against the original input to produce the mask.
        result = [CIFilter filterWithName:@"CIDifferenceBlendMode" keysAndValues:kCIInputImageKey, result, kCIInputBackgroundImageKey, self.inputImage, nil].outputImage;
    
        return result;
    }
    
    @end
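
    Using the filter then looks something like this minimal sketch (assuming the GaussianKernel class above is part of the app and sourceImage is a CIImage you already have):

    GaussianKernel *unsharpMask = [[GaussianKernel alloc] init];
    [unsharpMask setDefaults];                                    // inputRadius falls back to 7.5
    [unsharpMask setValue:sourceImage forKey:kCIInputImageKey];   // sourceImage is assumed
    [unsharpMask setValue:@10.0 forKey:@"inputRadius"];
    
    CIImage *maskedImage = unsharpMask.outputImage;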
    