I have an unmasked CGImage with four bytes (RGBA) per pixel, whose first few pixels are:
C5 D2 D4 FF
C7 D4 D6 FF
C8 D5 D6 FF
C6 D4 D3 FF
C7 D5 D4 FF
I then make a greyscale masking image (i.e. one byte per pixel) whose first few pixels are:
00
00
00
00
00
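For context, a one-byte-per-pixel mask like this could be built along the following lines (a sketch only, using the same pre-Swift-3 CoreGraphics API as the snippet further down; `maskBytes`, `maskWidth` and `maskHeight` are placeholder names, and I am assuming the mask was created with `CGImageMaskCreate` rather than as a greyscale `CGImage`):

```swift
import CoreGraphics

// Sketch: build an image mask from raw one-byte samples.
// maskBytes / maskWidth / maskHeight are assumed names, not from the question.
let maskWidth = 5                     // placeholder dimensions
let maskHeight = 1
let maskBytes: [UInt8] = [0x00, 0x00, 0x00, 0x00, 0x00]
let maskData = CFDataCreate(nil, maskBytes, maskBytes.count)
let maskProvider = CGDataProviderCreateWithCFData(maskData)
let maskCG = CGImageMaskCreate(maskWidth, maskHeight,
                               8,            // bits per component
                               8,            // bits per pixel
                               maskWidth,    // bytes per row
                               maskProvider,
                               nil,          // decode array
                               false)        // shouldInterpolate
```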
Finally I call CGImageCreateWithMask to apply the mask. The first few pixels of the resulting masked image are:
C5 D2 D4 FF
C7 D4 D6 FF
C8 D5 D6 FF
C6 D4 D3 FF
C7 D5 D4 FF
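The masking call itself is a single line (sketched here with assumed variable names `imageCG` and `maskCG` for the source image and the mask):

```swift
import CoreGraphics

// Apply the mask; the result is nil if the two images are incompatible.
let maskedCG = CGImageCreateWithMask(imageCG, maskCG)
```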
If I convert this CGImage to a UIImage and display it in an image view, these first few pixels render correctly as 100% transparent, and yet their underlying byte values, as shown above, are still fully opaque (alpha FF).
This is the pattern I am using to look at the bytes of any CGImage.
// Copy the bytes backing the CGImage and get a pointer to them
let imageCGDataProvider: CGDataProvider? = CGImageGetDataProvider(imageCG)
let imageCGPixelData: CFData? = CGDataProviderCopyData(imageCGDataProvider!)
let imageCGData: UnsafePointer<UInt8> = CFDataGetBytePtr(imageCGPixelData!)
How do I get the byte values of the pixels in the masked CGImage, i.e. the byte values with the mask applied?
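My working assumption is that CGImageCreateWithMask does not bake the mask into the image's backing store, and that the mask is only applied when the image is drawn. If that is right, one candidate approach (a sketch, not verified) is to render the masked image into an RGBA bitmap context and read that context's bytes instead, where `maskedCG` is the masked image from above:

```swift
import CoreGraphics

// Sketch (assumption: the mask is applied at draw time, not stored in the
// image's backing bytes). Render the masked image into an RGBA bitmap
// context and inspect the rendered pixels instead.
let width = CGImageGetWidth(maskedCG)
let height = CGImageGetHeight(maskedCG)
let colorSpace = CGColorSpaceCreateDeviceRGB()
let context = CGBitmapContextCreate(nil, width, height, 8, width * 4,
                                    colorSpace,
                                    CGImageAlphaInfo.PremultipliedLast.rawValue)
CGContextDrawImage(context,
                   CGRect(x: 0, y: 0, width: width, height: height),
                   maskedCG)
// Alpha is premultiplied in this context, so fully masked pixels should
// read back as 00 00 00 00 rather than keeping their RGB values.
let renderedBytes = UnsafePointer<UInt8>(CGBitmapContextGetData(context))
```

Is something along these lines necessary, or is there a way to read the masked bytes directly?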