I've written an app to test image drawing performance on iOS. It tries three different views, all displaying the same large PNG. The first is a custom view that draws the image with CGContextDrawImage(); the second sets self.layer.contents directly; the third is a plain UIImageView.
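For reference, the three setups look roughly like this (a condensed sketch assuming ARC; CGDrawView and the two helper functions are my names, not the exact code in the test app):

    #import <UIKit/UIKit.h>
    #import <QuartzCore/QuartzCore.h>

    // 1. Custom view that draws the image with Core Graphics in -drawRect:.
    @interface CGDrawView : UIView
    @property (nonatomic, strong) UIImage *image;
    @end

    @implementation CGDrawView
    - (void)drawRect:(CGRect)rect
    {
        CGContextRef ctx = UIGraphicsGetCurrentContext();
        // Core Graphics uses a flipped coordinate system relative to UIKit,
        // so flip the context before drawing the CGImage.
        CGContextTranslateCTM(ctx, 0.0f, self.bounds.size.height);
        CGContextScaleCTM(ctx, 1.0f, -1.0f);
        CGContextDrawImage(ctx, self.bounds, self.image.CGImage);
    }
    @end

    // 2. Plain view whose backing CALayer is handed the decoded image directly.
    static UIView *MakeLayerContentsView(UIImage *image, CGRect frame)
    {
        UIView *view = [[UIView alloc] initWithFrame:frame];
        view.layer.contents = (__bridge id)image.CGImage;
        return view;
    }

    // 3. Stock UIImageView with no custom drawing at all.
    static UIImageView *MakeImageView(UIImage *image, CGRect frame)
    {
        UIImageView *imageView = [[UIImageView alloc] initWithImage:image];
        imageView.frame = frame;
        return imageView;
    }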
The image is created with -[UIImage initWithData:] and cached in the view controller. Each test repeatedly allocs a view, adds it to the view hierarchy, and then removes and releases it. Timings are taken from the start of loadView to viewDidAppear and reported as fps (effectively, view draws per second).
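The measurement loop boils down to something like this (simplified and written for ARC; in the real harness the timestamps come from loadView and viewDidAppear rather than wrapping a plain loop, and cachedImage is a placeholder name):

    // Simplified sketch of one test pass, shown for the UIImageView case.
    - (void)runTestWithIterations:(NSUInteger)iterations
    {
        CFAbsoluteTime start = CFAbsoluteTimeGetCurrent();

        for (NSUInteger i = 0; i < iterations; i++) {
            UIImageView *testView = [[UIImageView alloc] initWithImage:self.cachedImage];
            testView.frame = self.view.bounds;
            [self.view addSubview:testView];   // add to the view hierarchy...
            [testView removeFromSuperview];    // ...then immediately tear it down
        }

        CFAbsoluteTime elapsed = CFAbsoluteTimeGetCurrent() - start;
        NSLog(@"%.0f view draws per second", (double)iterations / elapsed);
    }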
Here are the results from an iPad 1 running iOS 5.1 with a 912 x 634 unscaled image:
CGContext: 11 fps
CALayer: 10 fps
UIImageView: 430 fps (!)
Am I hallucinating? It seems almost impossible that UIImageView can draw that fast, but I can actually watch the images flicker. I tried swapping between two similar views to defeat possible caching, but the frame rate was even higher.
I had always assumed that UIImageView was just a thin wrapper around -[CALayer setContents:]. However, profiling the UIImageView test shows almost no time spent in any drawing method I can identify.
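In other words, I expected it to be roughly equivalent to this naive wrapper (purely hypothetical, not UIKit's actual code):

    // Hypothetical "thin wrapper" that just pushes the bitmap into the layer.
    @interface NaiveImageView : UIView
    @property (nonatomic, strong) UIImage *image;
    @end

    @implementation NaiveImageView
    - (void)setImage:(UIImage *)image
    {
        _image = image;
        // No -drawRect: at all; the decoded bitmap goes straight to the layer.
        self.layer.contents = (__bridge id)image.CGImage;
    }
    @end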
I'd love to understand what's going on. Any help would be most appreciated.
viewDidAppear might be called before the view actually shows up on screen. If you aren't forcing the screen to update, you're not really timing anything useful; you might create and destroy several UIImageViews but only one actually gets drawn. I wonder if UIImageView is setting its layer's contents on demand in -layoutSubviews -- that's certainly how I'd do it, to ensure that I touched the layer only once per screen update. - Kurt Revis
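For illustration, the on-demand approach described in that comment might look something like this (a hypothetical sketch, not UIKit's real implementation):

    // Hypothetical lazy variant: the layer is only touched during layout,
    // i.e. at most once per screen update, no matter how many views were
    // created and thrown away in between.
    @interface LazyImageView : UIView
    @property (nonatomic, strong) UIImage *image;
    @end

    @implementation LazyImageView
    - (void)setImage:(UIImage *)image
    {
        _image = image;
        [self setNeedsLayout];   // cheap: just marks the view as needing layout
    }

    - (void)layoutSubviews
    {
        [super layoutSubviews];
        self.layer.contents = (__bridge id)self.image.CGImage;   // deferred work
    }
    @end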