6
votes

I have a UIImage loaded into a UIImageView. The UIImage is larger than the UIImageView and it has been scaled down to fit. Obviously the scaled down UIImage shows jagged edges.

What is the best way to anti-alias this image with regards to performance?

I've seen a method that redraws the image using drawInRect:, but I've also read that drawInRect: does not give the best performance.

I've read several different articles and tried a few methods myself. But after reading a few more posts on the performance differences between UIViews and Core Graphics, I'm wondering which method of anti-aliasing an image gives the best performance.


3 Answers

5
votes

Investigate the list of available Core Image filters. Specifically, the Lanczos scale transform, available via CILanczosScaleTransform, seems to be exactly what you need. It is available on all iOS versions >= 6.0.

Typically, using Core Image filters will be more performant than manually resorting to Core Graphics. However, I urge you to verify the results as well as the performance in your specific case.
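As a rough sketch of how you might apply this filter (the method name and reusable-context pattern here are illustrative, not a fixed API):

    #import <CoreImage/CoreImage.h>

    - (UIImage *)lanczosScaledImage:(UIImage *)image scale:(CGFloat)scale
    {
        CIImage *input = [CIImage imageWithCGImage:image.CGImage];

        CIFilter *filter = [CIFilter filterWithName:@"CILanczosScaleTransform"];
        [filter setValue:input forKey:kCIInputImageKey];
        [filter setValue:@(scale) forKey:kCIInputScaleKey];
        [filter setValue:@1.0 forKey:kCIInputAspectRatioKey];

        CIImage *output = filter.outputImage;

        // CIContext creation is expensive, so reuse a single context.
        static CIContext *context;
        static dispatch_once_t onceToken;
        dispatch_once(&onceToken, ^{
            context = [CIContext contextWithOptions:nil];
        });

        CGImageRef cgImage = [context createCGImage:output fromRect:output.extent];
        UIImage *result = [UIImage imageWithCGImage:cgImage
                                              scale:image.scale
                                        orientation:image.imageOrientation];
        CGImageRelease(cgImage);
        return result;
    }

Pass a scale < 1.0 to downsample; the Lanczos kernel gives noticeably smoother edges than nearest-neighbor or low-quality bilinear scaling.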

2
votes

The best solution is always to give your UIImageView an image of the right size. If you can't control the source image size and need to resize it, another good option is to use Core Graphics to perform the scaling off the main thread.

Since iOS 4.0, Core Graphics drawing operations are thread safe, so you can dispatch all the resizing work to a background queue. Once the resizing has finished, assign the resized image to your UIImageView on the main thread, because all UIKit work must happen there. With this approach you won't block the main thread every time you resize an image.

Once you've done that, you can also cache the resized results to avoid repeating the same calculation (e.g. every time you scroll back to the same UITableViewCell) and improve performance.

You can implement this as a UIImage category; take my code as an example:

- (void)resizeImageWithSize:(CGSize)size
                   cacheKey:(NSString *)cacheKey
            completionBlock:(void (^)(UIImage *croppedImage))completionBlock
{
    dispatch_async([[self class] sharedBackgroundQueue], ^{
        // Check if we have the image cached
        UIImage *resizedImage = [[[self class] resizedImageCache] objectForKey:cacheKey];
        if (nil == resizedImage) {
            // If not, resize and cache it
        @autoreleasepool {
                resizedImage = [self resizeImageWithSize:size];
                [[[self class] resizedImageCache] setObject:resizedImage forKey:cacheKey];
            }
        }
        dispatch_async(dispatch_get_main_queue(), ^{
            completionBlock(resizedImage);
        });
    });
}

Then, the resizeImageWithSize: method is where all the Core Graphics work happens. You may also find the FXImageView library by Nick Lockwood interesting: it takes the same approach, with a resize cache and a background queue for the Core Graphics work.
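For completeness, here's a hedged sketch of what resizeImageWithSize: could look like; the exact implementation isn't shown above, but the standard pattern is to draw into a bitmap context with high-quality interpolation:

    - (UIImage *)resizeImageWithSize:(CGSize)size
    {
        // Thread safe since iOS 4.0, so this can run on a background queue.
        // Passing 0.0 for scale uses the device's screen scale.
        UIGraphicsBeginImageContextWithOptions(size, NO, 0.0);
        CGContextRef context = UIGraphicsGetCurrentContext();

        // High-quality interpolation smooths the jagged edges.
        CGContextSetInterpolationQuality(context, kCGInterpolationHigh);

        [self drawInRect:CGRectMake(0.0, 0.0, size.width, size.height)];

        UIImage *resizedImage = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        return resizedImage;
    }

Because the draw happens once on a background queue and the result is cached, the drawInRect: performance concern from the question no longer applies to scrolling.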

1
vote

Since you asked about Swift in the comments:

AlamofireImage (Alamofire's companion image library) has all of this built in, including an automatic caching layer. If you're not using it for network requests, at the very least it's a great example to work from.

Example:

// Note: this seamlessly handles both scaling AND caching the scaled image
let filter = AspectScaledToFillSizeFilter(size: imageView.frame.size)
imageView.af_setImage(withURL: url, filter: filter)

Just make sure imageView.frame is set before the call (e.g. call layoutIfNeeded() first when using Auto Layout), or the filter will assert.