CIFilter is not working correctly when app is in background

最后都变了 - Posted on 2019-12-21 06:59:10

Question


We are applying a 'CIGaussianBlur' filter to a few images. The process works fine most of the time, but when the app moves to the background, the process produces white stripes on the image. (Images below; notice that the left and bottom of the image are striped to white, and that the image is shrunk slightly compared to the original image.)

The Code:

- (UIImage *)imageWithBlurRadius:(CGFloat)radius
{
    UIImage *image = self;
    LOG(@"(1) image size before resize = %@", NSStringFromCGSize(image.size));
    NSData *imageData = UIImageJPEGRepresentation(self, 1.0);
    LOG(@"(2) image data length = %lu", (unsigned long)imageData.length);

    // Create our blurred image.
    CIContext *context = [CIContext contextWithOptions:nil];
    CIImage *inputImage = [CIImage imageWithCGImage:image.CGImage];

    // Set up the Gaussian blur (we could use any of the many filters offered by Core Image).
    CIFilter *filter = [CIFilter filterWithName:@"CIGaussianBlur"];
    [filter setValue:inputImage forKey:kCIInputImageKey];
    [filter setValue:[NSNumber numberWithFloat:radius] forKey:@"inputRadius"];
    CIImage *result = [filter valueForKey:kCIOutputImageKey];

    // CIGaussianBlur has a tendency to shrink the image a little; rendering from
    // the input image's extent ensures it matches up exactly to the bounds of our
    // original image.
    CGImageRef cgImage = [context createCGImage:result fromRect:[inputImage extent]];
    UIImage *finalImage = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);

    LOG(@"(3) final image size after resize = %@", NSStringFromCGSize(finalImage.size));
    return finalImage;
}

Before Filter

(image: the original photo)

After Filter

(image: the blurred photo, with white stripes along the left and bottom edges)


Answer 1:


Actually, I just faced this exact problem, and found a solution that's different than what @RizwanSattar describes.

What I do, based on an exchange with "Rincewind" on the Apple developer boards, is to first apply a CIAffineClamp on the image, with the transform value set to identity. This creates an image at the same scale, but with an infinite extent. That causes the blur to blur the edges correctly.

Then, after I apply the blur, I crop the image to its original extent, cropping away the feathering that takes place on the edges.
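In condensed form, the whole clamp-blur-crop pipeline looks roughly like this (a minimal sketch; sourceImage and the radius of 10 are placeholders, not names from the demo app):

// Sketch: clamp to infinite extent, blur, then crop back to the original extent.
CIImage *input = [CIImage imageWithCGImage:sourceImage.CGImage];

CIFilter *clamp = [CIFilter filterWithName:@"CIAffineClamp"];
[clamp setValue:input forKey:kCIInputImageKey];
[clamp setValue:[NSValue valueWithBytes:&CGAffineTransformIdentity
                               objCType:@encode(CGAffineTransform)]
         forKey:@"inputTransform"];

CIFilter *blur = [CIFilter filterWithName:@"CIGaussianBlur"];
[blur setValue:clamp.outputImage forKey:kCIInputImageKey];
[blur setValue:@10.0 forKey:@"inputRadius"];

// The clamped image extends infinitely, so render only the original extent.
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [context createCGImage:blur.outputImage
                                   fromRect:input.extent];
UIImage *blurred = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);

(On iOS 8 and later, [input imageByClampingToExtent] does the same clamping in a single call.)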

You can see the code in a CI Filter demo app I've posted on github:

CIFilter demo project on github

It's a general-purpose program that handles all the different CI filters, but it has code to deal with the Gaussian blur filter.

Take a look at the method showImage. It has special-case code to set the extent on the source image before applying the blur filter:

if ([currentFilterName isEqualToString: @"CIGaussianBlur"])
{
  CIFilter *clampFilter = [self clampFilter];

  CIImage *sourceCIImage = [CIImage imageWithCGImage: imageToEdit.CGImage];
  [clampFilter setValue: sourceCIImage
                 forKey: kCIInputImageKey];

  // An identity transform clamps the image without scaling it, giving it an
  // infinite extent so the blur has pixels to sample beyond the edges.
  [clampFilter setValue: [NSValue valueWithBytes: &CGAffineTransformIdentity
                                        objCType: @encode(CGAffineTransform)]
                 forKey: @"inputTransform"];

  sourceCIImage = [clampFilter valueForKey: kCIOutputImageKey];
  [currentFilter setValue: sourceCIImage
                   forKey: kCIInputImageKey];
}

(Where the method "clampFilter" just lazily loads a CIAffineClamp filter.)
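The original post doesn't show clampFilter, but a lazy loader for it might look something like this hypothetical sketch (assuming a _clampFilter instance variable):

- (CIFilter *)clampFilter
{
    // Create the CIAffineClamp filter once and reuse it on later calls.
    if (_clampFilter == nil)
    {
        _clampFilter = [CIFilter filterWithName:@"CIAffineClamp"];
    }
    return _clampFilter;
}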

Then I apply the user-selected filter:

outputImage = [currentFilter valueForKey: kCIOutputImageKey];

Then, after applying the selected filter, I check the extent of the resulting image and crop it back to the original extent if it's bigger:

CGSize newSize = outputImage.extent.size;

if (newSize.width > sourceImageExtent.width || newSize.height > sourceImageExtent.height)
{
  // NSLog(@"new image is bigger");
  CIFilter *cropFilter = [self cropFilter];  // Lazily loads a CICrop filter

  CGRect boundsRect = CGRectMake(0, 0, sourceImageExtent.width, sourceImageExtent.height);

  [cropFilter setValue: outputImage forKey: @"inputImage"];

  CIVector *rectVector = [CIVector vectorWithCGRect: boundsRect];
  [cropFilter setValue: rectVector
                forKey: @"inputRectangle"];

  outputImage = [cropFilter valueForKey: kCIOutputImageKey];
}



Answer 2:


The reason you are seeing those "white stripes" in the blurred image is that the resulting CIImage is bigger than your original image, because it includes the fuzzy edges of the blur. When you hard-crop the resulting image to be the same size as your original image, it doesn't account for those fuzzy edges.

After:

CIImage *result = [filter valueForKey:kCIOutputImageKey];

Take a look at result.extent, which is a CGRect showing the new bounding box relative to the original image (e.g., for positive radii, result.extent.origin.y will be negative).
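A quick way to see this is to log both extents (illustrative only; inputImage is the pre-filter CIImage from the question's code):

// The output extent is larger than the input, and its origin goes negative.
NSLog(@"input extent = %@, output extent = %@",
      NSStringFromCGRect(inputImage.extent),
      NSStringFromCGRect(result.extent));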

Here's some code (you should really test it):

CIImage *result = blurFilter.outputImage;

// The blur filter creates a larger image to cover the "fuzz", but we should
// cut that out, since it fades to transparent and looks like a vignette.
CGFloat imageSizeDifference = -result.extent.origin.x;

// NOTE: on iOS 7 it seems to generate an image that still ends up vignetting,
// so as a hack just multiply the vertical inset by 2x.
CGRect imageInset = CGRectInset(result.extent, imageSizeDifference, imageSizeDifference * 2);

CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [context createCGImage:result fromRect:imageInset];

Hope that helps.



Source: https://stackoverflow.com/questions/18315684/cifilter-is-not-working-correctly-when-app-is-in-background
