I'm trying to do a Gaussian blur on a UIImage that replicates my Photoshop mockup.
Desired Behavior: In Photoshop, when I run a Gaussian blur filter, the blur extends past the edges of the original image, so the blurred result ends up larger than the source.
GPUImage will only produce a result that is processed up to the limits of your image. In order to extend past your image, you'll need to expand the canvas on which it operates.
To do this, feed your image into a GPUImageTransformFilter, then use -forceProcessingAtSize: or -forceProcessingAtSizeRespectingAspectRatio: to enlarge the working area. By default, this will also enlarge the image itself, so to counter that, apply a scale transform with your GPUImageTransformFilter to shrink the image relative to the larger canvas. This keeps the image at its original pixel dimensions while placing it within a larger overall image.
Then all you need to do is feed this output into your blur filter and the blur will now extend past the edge of your original image. The size you force the image to be will depend on how far the blur needs to extend past the original image's edges.
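A minimal sketch of that pipeline, assuming a reasonably recent GPUImage release; the 1.2x canvas, the 10 px blur radius, and the method name blurredImageWithPaddedCanvas: are illustrative choices, not part of the original answer:

#import <GPUImage/GPUImage.h>

- (UIImage *)blurredImageWithPaddedCanvas:(UIImage *)inputImage
{
    // Enlarge the working canvas by 20% in each dimension so the blur
    // has room to spread past the original image's edges.
    CGSize paddedSize = CGSizeMake(inputImage.size.width * 1.2f,
                                   inputImage.size.height * 1.2f);

    GPUImagePicture *source = [[GPUImagePicture alloc] initWithImage:inputImage];

    // The transform filter supplies the larger canvas...
    GPUImageTransformFilter *transformFilter = [[GPUImageTransformFilter alloc] init];
    [transformFilter forceProcessingAtSize:paddedSize];
    // ...and the inverse scale keeps the image at its original pixel size,
    // centered within that canvas.
    transformFilter.affineTransform = CGAffineTransformMakeScale(1.0f / 1.2f,
                                                                 1.0f / 1.2f);

    GPUImageGaussianBlurFilter *blurFilter = [[GPUImageGaussianBlurFilter alloc] init];
    blurFilter.blurRadiusInPixels = 10.0; // illustrative radius

    [source addTarget:transformFilter];
    [transformFilter addTarget:blurFilter];

    [blurFilter useNextFrameForImageCapture];
    [source processImage];
    return [blurFilter imageFromCurrentFramebuffer];
}

How much padding you need depends on the blur radius; as a rule of thumb, a few multiples of the radius on each side should keep the falloff from being visibly clipped.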
Try resizing the UIImageView's bounds to accommodate the blur; a view clips anything outside its bounds. Note that in your example, the box blurred in Photoshop looks to be about 20% larger than the original image.
// imageView is assumed to be an existing, initialized UIImageView
// (the original snippet declared an uninitialized pointer).
UIImageView *imageView = self.imageView;
// Grow the bounds so the view no longer clips the blurred edges.
// Pad by however far the blur extends (here, 5 points; scale this
// up to match the ~20% growth seen in the Photoshop mockup).
imageView.layer.bounds = CGRectMake(0,
                                    0,
                                    imageView.layer.bounds.size.width + 5,
                                    imageView.layer.bounds.size.height + 5);