To retrieve pixel values from a CGImage I use CGContextDrawImage (as described here: How to get pixel data from a UIImage (Cocoa Touch) or CGImage (Core Graphics)?). The only difference is that I create a 128 bpp float-component context rather than the usual 32 bpp context. The source CGImage is obtained from a CGImageSource created with the kCGImageSourceShouldAllowFloat option. That way I hoped to get access to float pixel values, color matched to my bitmap context's color space, and use them in further image processing.

The problem is that the resulting image data seems to be losing dynamic range. This is visible in shadows and in areas of plain blue sky: they become contoured and lack detail. Some investigation showed that the loss occurs in CGContextDrawImage itself (the source CGImage contains the full dynamic range; saving it through a CGImageDestination proves this), and after CGContextDrawImage the context contents are posterized.
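Here is a minimal sketch of what I'm doing, simplified from my real code (the CopyFloatPixels helper name is just for illustration, error handling is omitted, and kCGColorSpaceGenericRGBLinear stands in for my actual working color space):

```c
#include <ApplicationServices/ApplicationServices.h> // Core Graphics + ImageIO

// Load an image with float decoding allowed, then draw it into a
// 128 bpp (32 bits per component) float RGBA bitmap context.
float *CopyFloatPixels(CFURLRef url, size_t *outWidth, size_t *outHeight)
{
    // Ask ImageIO to decode to float components where the format allows it.
    CFStringRef keys[] = { kCGImageSourceShouldAllowFloat };
    CFTypeRef values[] = { kCFBooleanTrue };
    CFDictionaryRef options = CFDictionaryCreate(NULL,
        (const void **)keys, (const void **)values, 1,
        &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks);

    CGImageSourceRef source = CGImageSourceCreateWithURL(url, options);
    CGImageRef image = CGImageSourceCreateImageAtIndex(source, 0, options);
    CFRelease(options);
    CFRelease(source);

    size_t width = CGImageGetWidth(image);
    size_t height = CGImageGetHeight(image);

    // 4 float components per pixel => 128 bits per pixel.
    size_t bytesPerRow = width * 4 * sizeof(float);
    float *pixels = malloc(bytesPerRow * height);

    CGColorSpaceRef colorSpace =
        CGColorSpaceCreateWithName(kCGColorSpaceGenericRGBLinear);
    CGContextRef context = CGBitmapContextCreate(pixels, width, height,
        32,              // bits per component
        bytesPerRow,
        colorSpace,
        kCGImageAlphaPremultipliedLast | kCGBitmapFloatComponents);
    CGColorSpaceRelease(colorSpace);

    // This is the step where the dynamic range appears to get lost.
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), image);

    CGContextRelease(context);
    CGImageRelease(image);

    *outWidth = width;
    *outHeight = height;
    return pixels; // caller frees
}
```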
After some more investigation I found this thread: http://lists.apple.com/archives/quartz-dev/2007/mar/msg00026.html It led me to the conclusion that the problem is not in my code but in Core Graphics itself, or that this is intended behaviour.
My question is: what is the correct way to obtain floating-point pixel data from an image using Core Graphics?