I want to blur my view, and I use this code:
//Get a UIImage from the UIView
NSLog(@"blur capture");
UIGraphicsBeginImageContext(BlurContrainerView.frame.size);
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
The issue isn't that it fails to blur all of the image; rather, the blur extends beyond the boundary of the image, making the output larger, so it no longer lines up properly.
To keep the image the same size, after the line:

CIImage *resultImage = [gaussianBlurFilter valueForKey:@"outputImage"];

you can compute the CGRect for a rectangle the size of the original image, centered within this resultImage:
// note, adjust rect because blur changed size of image
CGRect rect = [resultImage extent];
rect.origin.x += (rect.size.width - viewImage.size.width ) / 2;
rect.origin.y += (rect.size.height - viewImage.size.height) / 2;
rect.size = viewImage.size;
And then use a CIContext to render just that portion of the image:
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgimg = [context createCGImage:resultImage fromRect:rect];
UIImage *blurredImage = [UIImage imageWithCGImage:cgimg];
CGImageRelease(cgimg);
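The centering arithmetic above can be factored into a small helper. Here is a sketch in Swift (the function name centeredCropRect is mine, not from the original code):

```swift
import Foundation  // provides CGRect/CGSize/CGPoint (CoreGraphics types)

// Compute a rect of originalSize centered within extent.
// CIGaussianBlur enlarges the output extent symmetrically,
// so the centered rect recovers the original image's region.
func centeredCropRect(extent: CGRect, originalSize: CGSize) -> CGRect {
    let x = extent.origin.x + (extent.size.width  - originalSize.width)  / 2
    let y = extent.origin.y + (extent.size.height - originalSize.height) / 2
    return CGRect(origin: CGPoint(x: x, y: y), size: originalSize)
}
```

You would then pass the returned rect to createCGImage:fromRect: exactly as in the Objective-C code above.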
Alternatively, for iOS 7, if you go to the iOS UIImageEffects sample code and download iOS_UIImageEffects.zip, you can grab the UIImage+ImageEffects category. That provides a few new methods:
- (UIImage *)applyLightEffect;
- (UIImage *)applyExtraLightEffect;
- (UIImage *)applyDarkEffect;
- (UIImage *)applyTintEffectWithColor:(UIColor *)tintColor;
- (UIImage *)applyBlurWithRadius:(CGFloat)blurRadius tintColor:(UIColor *)tintColor saturationDeltaFactor:(CGFloat)saturationDeltaFactor maskImage:(UIImage *)maskImage;
So, to blur an image and lighten it (giving that "frosted glass" effect), you can then do:
UIImage *newImage = [image applyLightEffect];
Interestingly, Apple's code does not employ CIFilter, but rather calls vImageBoxConvolve_ARGB8888 from the vImage high-performance image processing framework. This technique is illustrated in the WWDC 2013 video Implementing Engaging UI on iOS.
Looks like the blur filter is giving you back an image that's bigger than the one you started with, which makes sense since pixels at the edges get blurred out past them. The easiest solution would probably be to make newView use a contentMode of UIViewContentModeCenter so it doesn't try to squash the blurred image down; you could also crop blurredImage by drawing it in the center of a new context of the appropriate size, but you don't really need to.
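If you do choose to crop by redrawing, the draw origin is just the negative of the symmetric overflow on each side. A minimal sketch of that arithmetic in Swift (the helper name centeredDrawOrigin is my own, for illustration):

```swift
import Foundation  // provides CGSize/CGPoint (CoreGraphics types)

// Origin at which to draw an oversized image inside a context of
// targetSize so the image is centered: the overflow is split evenly,
// so the offsets are negative when the image is larger than the target.
func centeredDrawOrigin(imageSize: CGSize, targetSize: CGSize) -> CGPoint {
    CGPoint(x: (targetSize.width  - imageSize.width)  / 2,
            y: (targetSize.height - imageSize.height) / 2)
}
```

Drawing the blurred image at this origin in a context of the original size clips the extra blurred border evenly on all sides.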
A faster solution is to avoid CGImageRef altogether and perform all transformations at the lazy CIImage level. So, instead of this (which produces the wrong size):
// create UIImage from filtered image (but size is wrong)
blurredImage = [[UIImage alloc] initWithCIImage:resultImage];
A nice solution is to write:
// cropping rect because blur changed size of image
CIImage *croppedImage = [resultImage imageByCroppingToRect:imageToBlur.extent];
// create UIImage from filtered cropped image
blurredImage = [[UIImage alloc] initWithCIImage:croppedImage];
Swift 3:

// cropping rect because blur changed size of image
let croppedImage = resultImage.cropping(to: imageToBlur.extent)
// create UIImage from filtered cropped image
let blurredImage = UIImage(ciImage: croppedImage)
Swift 4:

// cropping rect because blur changed size of image
let croppedImage = resultImage.cropped(to: imageToBlur.extent)
// create UIImage from filtered cropped image
let blurredImage = UIImage(ciImage: croppedImage)