8 votes

I'm trying to do a Gaussian blur on a UIImage that replicates my Photoshop mockup.

Desired Behavior: In Photoshop, when I run a Gaussian blur filter, the image layer gets larger as a result of the blurred edges.

Observed Behavior: Using GPUImage, I can successfully blur my UIImages. However, the new image is cropped at the original bounds, leaving a hard edge all the way around.

Setting UIImageView.layer.masksToBounds = NO; doesn't help, as the image itself is cropped, not the view.

I've also tried placing the UIImage centered on a larger clear image before blurring, and then resizing. This also didn't help.
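For reference, this is roughly how I composited onto the larger canvas (the padding value is arbitrary):

CGFloat padding = 40.0; //arbitrary; room for the blur to spill into
CGSize canvasSize = CGSizeMake(sourceImage.size.width + 2.0 * padding,
                               sourceImage.size.height + 2.0 * padding);
UIGraphicsBeginImageContextWithOptions(canvasSize, NO, sourceImage.scale);
//Draw the source image centered in a transparent canvas
[sourceImage drawAtPoint:CGPointMake(padding, padding)];
UIImage *paddedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();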

Is there a way to achieve this "Photoshop-style" blur?

[screenshot: the Photoshop-blurred layer extending past the original image bounds]

UPDATE: Working solution, thanks to Brad Larson:

UIImage *sourceImage = ...
GPUImagePicture *imageSource = [[GPUImagePicture alloc] initWithImage:sourceImage];
GPUImageTransformFilter *transformFilter = [GPUImageTransformFilter new];
GPUImageFastBlurFilter *blurFilter = [GPUImageFastBlurFilter new];

//Force processing at 1.4x the source size (both dimensions use SOURCE_WIDTH,
//so this assumes a square source), and scale the image down by roughly the
//inverse (1 / 1.4 ≈ 0.71; 0.7 is close enough here)
[transformFilter forceProcessingAtSize:CGSizeMake(SOURCE_WIDTH * 1.4, SOURCE_WIDTH * 1.4)];
[transformFilter setAffineTransform:CGAffineTransformMakeScale(0.7, 0.7)];

//Set up the desired blur filter
[blurFilter setBlurSize:3.0f];
[blurFilter setBlurPasses:20];

//Chain Image->Transform->Blur->Output        
[imageSource addTarget:transformFilter];
[transformFilter addTarget:blurFilter];
[imageSource processImage];

UIImage *blurredImage = [blurFilter imageFromCurrentlyProcessedOutputWithOrientation:UIImageOrientationUp];
When you "tried placing the UIImage centered on a larger clear image before blurring", do you mean you merged the original image into a larger UIImage, so it is one image? If so, try using a white image instead of a clear image, and make sure the white image has an alpha channel. I suspect the clear image has no alpha, and thus the result looks clipped.bobnoble

2 Answers

8 votes

GPUImage only processes out to the limits of your input image. To extend the blur past those limits, you'll need to expand the canvas it operates on.

To do this, feed your image into a GPUImageTransformFilter, then use -forceProcessingAtSize: or -forceProcessingAtSizeRespectingAspectRatio: to enlarge the working area. By default this will also enlarge the image, so counter it with a scale transform on the same GPUImageTransformFilter to shrink the image relative to the larger area. This keeps the image at its original pixel dimensions while placing it within a larger overall canvas.

Then all you need to do is feed this output into your blur filter and the blur will now extend past the edge of your original image. The size you force the image to be will depend on how far the blur needs to extend past the original image's edges.
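For example, here's a sketch of sizing the working area from a chosen padding (the padding value and variable names are illustrative, not part of GPUImage):

GPUImageTransformFilter *transformFilter = [GPUImageTransformFilter new];
CGFloat pad = 30.0; //assumed padding per edge; tune to your blur's visible spread
CGSize canvas = CGSizeMake(sourceImage.size.width + 2.0 * pad,
                           sourceImage.size.height + 2.0 * pad);
[transformFilter forceProcessingAtSize:canvas];
//Scale the image down so it keeps its original pixel size within the canvas
[transformFilter setAffineTransform:CGAffineTransformMakeScale(
    sourceImage.size.width / canvas.width,
    sourceImage.size.height / canvas.height)];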

1 vote

Try resizing the UIImageView's bounds to accommodate the blur; a view clips anything outside its bounds. Note that in your example, the box blurred in Photoshop looks to be about 20% larger than the original image.

UIImageView *imageView = ...; //your existing image view
//Grow the bounds by ~20% to make room for the blurred edges
imageView.bounds = CGRectMake(0,
                              0,
                              imageView.bounds.size.width * 1.2,
                              imageView.bounds.size.height * 1.2);