
In my app, I convert and process images: from colour to greyscale, then operations such as histogram equalisation, filtering, etc. That part works fine.

My UIImages display correctly, and saving them to JPEG files also works.

The only problem is that, although my images are now greyscale, they are still saved as RGB JPEGs. That is, the red, green and blue values for each pixel are the same, but keeping the duplicated values still wastes space, making the file larger than it needs to be.

So when I open the image file in Photoshop, it is black and white, but when I check "Photoshop > Image > Mode", it still says "RGB" instead of "Greyscale".

Does anyone know how to tell iOS that the UIImageJPEGRepresentation call should create data with one channel per pixel instead of four?

Thanks in advance.


1 Answer


You should do an explicit conversion of your image: redraw it into a bitmap context whose colour space is CGColorSpaceCreateDeviceGray(), i.e. 8 bits per component, one channel, then encode that greyscale image as JPEG.
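
A minimal sketch of this conversion in Swift, assuming you start from a `UIImage` backed by a `CGImage`. The function name `greyscaleJPEGData` is my own; the approach is to draw into a `CGContext` created with `CGColorSpaceCreateDeviceGray()` and no alpha, then ask for the JPEG representation of the resulting single-channel image. Whether the encoder actually writes a one-channel JPEG depends on the platform's ImageIO encoder, but it should, since the source bitmap is now greyscale.

```swift
import UIKit

// Sketch: redraw a UIImage into an 8-bit, single-channel greyscale
// bitmap context, then encode the result as JPEG data.
func greyscaleJPEGData(from image: UIImage, quality: CGFloat = 0.9) -> Data? {
    guard let cgImage = image.cgImage else { return nil }

    let width = cgImage.width
    let height = cgImage.height
    let colorSpace = CGColorSpaceCreateDeviceGray()

    // 8 bits per component, one channel, no alpha.
    // bytesPerRow = 0 lets CoreGraphics pick an appropriate stride.
    guard let context = CGContext(data: nil,
                                  width: width,
                                  height: height,
                                  bitsPerComponent: 8,
                                  bytesPerRow: 0,
                                  space: colorSpace,
                                  bitmapInfo: CGImageAlphaInfo.none.rawValue) else {
        return nil
    }

    // Drawing the RGB image into the grey context performs the conversion.
    context.draw(cgImage, in: CGRect(x: 0, y: 0, width: width, height: height))

    guard let greyImage = context.makeImage() else { return nil }
    return UIImageJPEGRepresentation(UIImage(cgImage: greyImage), quality)
}
```

On newer SDKs, `UIImageJPEGRepresentation(_:_:)` is replaced by `UIImage.jpegData(compressionQuality:)`, but either call will encode the greyscale bitmap you hand it.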