
I've been looking for alternatives to UIVisualEffectView's blur effect that let me vary the blur radius. I've thought about using a Core Image filter to build a variable blur myself, but I'm a little out of practice with Core Image filters. (The last time I used them was in Objective-C.)
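For reference, Core Image ships a built-in filter for exactly this, CIMaskedVariableBlur, which varies the blur radius per pixel according to a grayscale mask. A minimal sketch (the vertical gradient mask here is just an illustrative choice, not anything from my project):

import UIKit
import CoreImage

func variableBlur(_ input: UIImage, maxRadius: Double) -> UIImage? {
    guard let ciInput = CIImage(image: input) else { return nil }

    // A vertical gradient mask: black = no blur, white = full radius.
    let gradient = CIFilter(name: "CISmoothLinearGradient")!
    gradient.setValue(CIVector(x: 0, y: 0), forKey: "inputPoint0")
    gradient.setValue(CIVector(x: 0, y: ciInput.extent.height), forKey: "inputPoint1")
    gradient.setValue(CIColor.black, forKey: "inputColor0")
    gradient.setValue(CIColor.white, forKey: "inputColor1")

    // Blur radius is modulated per pixel by the mask's luminance.
    let blur = CIFilter(name: "CIMaskedVariableBlur")!
    blur.setValue(ciInput, forKey: kCIInputImageKey)
    blur.setValue(gradient.outputImage, forKey: "inputMask")
    blur.setValue(maxRadius, forKey: kCIInputRadiusKey)

    let context = CIContext()
    guard let output = blur.outputImage?.cropped(to: ciInput.extent),
          let cg = context.createCGImage(output, from: output.extent) else { return nil }
    return UIImage(cgImage: cg)
}

That gives a blur that ramps from sharp at the bottom to fully blurred at the top, which may or may not fit the effect you're after.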

I found this answer here on SO that provides an extension to UIView that captures the composited contents of a view and its subviews, generates a blurred version, and adds it as an image view on top of the view: https://stackoverflow.com/a/56521521/205185

I decided to modify it as a custom subclass of UIView that adds a blurred layer on top of the view's layer.

The result is that the contents of the blurred layer are mostly transparent, and it doesn't blur the underlying view very much. I added an image view to the view, and its contents don't seem to blur at all.

Here is a link to the project I created.

When I generate the blurred image, I both install it as the contents of a layer on top of the view's layer, and copy it to a separate image view that is below. The separate image view looks exactly like the results I want, but the view with the layer on top has very little visible change. Mostly, the original image below shows through.

The original composited view looks like this:

[screenshot: the original, unblurred composited view]

In the screenshot below, the top image is the composited view with the blur layer on top (blur radius 10), and the bottom image is the same blur output installed in a separate image view:

[screenshot: blur layer overlay (top) vs. blurred image in a separate image view (bottom)]

You can see from the image view version that the blurred image is completely opaque. The green background shows everywhere; none of the view controller's white background bleeds through. Yet in the top custom view with the blurred layer on top, the crisp edges of the text and the borders on the image view and blue box view are still visible. The BlurView's appearance with the blurred layer on top has barely changed. Why is that?

The two key functions in the BlurView class are below:

func applyBlur() {
    let context = CIContext(options: nil)
    makeBlurredImage(with: blurLevel, context: context) { processedImage in
        // Install the blurred image as the overlay layer's contents,
        // and also hand it to the separate image view for comparison.
        self.blurLayer.contents = processedImage.cgImage
        self.updateBlurImage?(processedImage)
    }
}

private func makeBlurredImage(with level: CGFloat, context: CIContext, completed: @escaping (UIImage) -> Void) {
    // Snapshot the view's layer into a UIImage.
    layer.isOpaque = true
    UIGraphicsBeginImageContextWithOptions(frame.size, false, 1)
    layer.render(in: UIGraphicsGetCurrentContext()!)
    let resultImage = UIGraphicsGetImageFromCurrentImageContext()!
    UIGraphicsEndImageContext()

    guard let beginImage = CIImage(image: resultImage) else { return }

    // Clamp extends the edge pixels to infinity so the Gaussian blur
    // doesn't fade to transparent at the borders.
    let clampFilter = CIFilter(name: "CIAffineClamp")!
    clampFilter.setValue(beginImage, forKey: kCIInputImageKey)
    clampFilter.setValue(CGAffineTransform.identity, forKey: kCIInputTransformKey)

    let blurFilter = CIFilter(name: "CIGaussianBlur")!
    blurFilter.setValue(clampFilter.outputImage, forKey: kCIInputImageKey)
    blurFilter.setValue(level, forKey: kCIInputRadiusKey)

    // Crop the (now infinite) blurred image back to the original extent.
    let cropFilter = CIFilter(name: "CICrop")!
    cropFilter.setValue(blurFilter.outputImage, forKey: kCIInputImageKey)
    cropFilter.setValue(CIVector(cgRect: beginImage.extent), forKey: "inputRectangle")

    guard let output = cropFilter.outputImage else { return }

    // Rendering through the CIContext is the expensive step; do it off
    // the main queue and call back on main when the image is ready.
    DispatchQueue.global(qos: .userInteractive).async {
        guard let cgimg = context.createCGImage(output, from: output.extent) else { return }
        let processedImage = UIImage(cgImage: cgimg)
        DispatchQueue.main.async {
            completed(processedImage)
        }
    }
}

The makeBlurredImage() function operates asynchronously, and invokes a closure once the blurred image is available.

The applyBlur() function calls makeBlurredImage(), and the closure it passes installs the resulting blurred image in a layer that is placed on top of the BlurView's content layer.

I would expect the appearance of the BlurView to be exactly like the appearance of the image view below. I'm stumped.

1 Answer

I found the solution, but I'm not clear why the code in my project exhibits the behavior I described.

I have a didSet observer on my view's frame property that keeps the blur layer's frame in sync with the view's bounds. That lets it adapt to rotation and resizing.

I made my blur layer a lazy var. Here is what that code looks like:

lazy var blurLayer: CALayer = {
    let newLayer = CALayer()
    self.layer.addSublayer(newLayer)
    newLayer.isOpaque = true
    return newLayer
}()

override var frame: CGRect {
    didSet {
        blurLayer.frame = self.bounds
    }
}

It looks like the frame gets set before the view has been added to the view hierarchy, so triggering the lazy initializer (which adds the sublayer) at that point messes things up. If I change my lazy blurLayer so it no longer adds itself as a sublayer:

lazy var blurLayer: CALayer = {
    let newLayer = CALayer()
    //self.layer.addSublayer(newLayer)
    //newLayer.isOpaque = true
    return newLayer
}()

And then do it in awakeFromNib() instead, it works:

override func awakeFromNib() {
    super.awakeFromNib()
    self.layer.addSublayer(blurLayer)
    blurLayer.isOpaque = true
    applyBlur()
}

As I say, I'm not clear on why, but it works.
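For what it's worth, a sketch of an alternative structure (my own restructuring, not code from the project): UIKit's usual hook for geometry-dependent sublayer work is layoutSubviews, which runs after the view is in the hierarchy and again whenever its bounds change, so it sidesteps the early-frame-set problem entirely:

import UIKit

class BlurView: UIView {
    let blurLayer = CALayer()

    override func awakeFromNib() {
        super.awakeFromNib()
        blurLayer.isOpaque = true
        layer.addSublayer(blurLayer)
        // applyBlur() from the question would be called here as well.
    }

    override func layoutSubviews() {
        super.layoutSubviews()
        // Keep the overlay layer in sync with the view's current size,
        // including after rotation and resizing.
        blurLayer.frame = bounds
    }
}

This removes the need for both the frame didSet and the lazy var trick.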