When using libjpeg to feed images into OpenCL, if you want to treat the channels as normalized uint8s via `CL_UNORM_INT8` (read in the kernel as floats in the range [0.0, 1.0]), you can only feed it buffers with 4 channel components per pixel. This is problematic, because libjpeg only outputs 3 (by default in RGB order), since JPEG has no notion of opacity.
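For context, this is roughly what the OpenCL side looks like — a minimal sketch assuming an OpenCL 1.1-style host API (`clCreateImage2D`; 1.2+ would use `clCreateImage`), where `ctx`, `width`, `height`, and `rgba_pixels` are hypothetical stand-ins for whatever the host code already has. The spec only pairs `CL_UNORM_INT8` with 4-component orders like `CL_RGBA`; `CL_RGB` is restricted to packed types such as `CL_UNORM_SHORT_565`, which is why the 3-channel output can't be handed over directly:

```c
#include <CL/cl.h>

/* Sketch: create a 2D image from already-expanded RGBA bytes.
 * ctx, width, height, rgba_pixels are hypothetical names. */
static cl_mem create_rgba_image(cl_context ctx, size_t width, size_t height,
                                const unsigned char *rgba_pixels)
{
    cl_image_format fmt;
    fmt.image_channel_order     = CL_RGBA;       /* 4 components per pixel */
    fmt.image_channel_data_type = CL_UNORM_INT8; /* uint8 seen as [0.0, 1.0] */

    cl_int err;
    return clCreateImage2D(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                           &fmt, width, height,
                           width * 4,            /* row pitch in bytes */
                           (void *)rgba_pixels, &err);
}
```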
The only workaround I see is to read scanlines with libjpeg, then make a duplicate buffer of the appropriate length (with a fourth channel component added for each pixel in the scanlines) and `memcpy` the values over, setting the alpha component to 255 for each pixel. You could even do this in place, if you're tricky, by allocating the buffer at `row_stride * 4` bytes up front and then walking backwards from index `row_stride * 3 - 1` down to 0, moving components to their proper places in the full buffer (and adding 255 for alpha where necessary).
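A minimal sketch of that in-place trick, assuming `row` points to a `width * 4`-byte allocation whose first `width * 3` bytes were filled with packed RGB (e.g. by `jpeg_read_scanlines`):

```c
#include <stddef.h>
#include <stdint.h>

/* In-place RGB -> RGBA expansion for one scanline. `row` must point to an
 * allocation of width * 4 bytes whose first width * 3 bytes hold packed RGB.
 * Walking backwards from the last pixel guarantees no write ever lands on an
 * RGB byte that hasn't been moved yet, so no second buffer is needed. */
static void rgb_to_rgba_inplace(uint8_t *row, size_t width)
{
    for (size_t i = width; i-- > 0; ) {
        row[i * 4 + 3] = 255;            /* opaque alpha */
        row[i * 4 + 2] = row[i * 3 + 2]; /* B */
        row[i * 4 + 1] = row[i * 3 + 1]; /* G */
        row[i * 4 + 0] = row[i * 3 + 0]; /* R */
    }
}
```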
However, this feels hacky, and since I'm dealing with large images, it's unacceptable to spend an extra pass over (what will be, in aggregate) the entire image.
So, is there a way to get libjpeg to just extend the number of components to 4? I've tried setting fields on `cinfo` like `output_components`, to no avail. I've read that the only workaround is to compile a special version of libjpeg with the constant `RGB_PIXELSIZE` set to 4 in `jmorecfg.h`, but that certainly doesn't feel portable, or for that matter necessary, for such a common change of output.
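For reference, the decode path in question — a minimal sketch following the stock libjpeg `example.c` pattern, with error handling omitted and `infile` assumed to be an open `FILE *`. `jpeg_start_decompress()` derives `output_components` from `out_color_space`, which is why assigning to `output_components` directly has no effect:

```c
#include <stdio.h>
#include <jpeglib.h>

/* Decode a JPEG into packed RGB scanlines, one row at a time. */
static void decode_rgb(FILE *infile)
{
    struct jpeg_decompress_struct cinfo;
    struct jpeg_error_mgr jerr;

    cinfo.err = jpeg_std_error(&jerr);
    jpeg_create_decompress(&cinfo);
    jpeg_stdio_src(&cinfo, infile);
    jpeg_read_header(&cinfo, TRUE);

    cinfo.out_color_space = JCS_RGB; /* the default for color JPEGs */
    jpeg_start_decompress(&cinfo);   /* sets output_components = 3 from
                                        out_color_space; writing to it
                                        yourself is simply overwritten */

    int row_stride = cinfo.output_width * cinfo.output_components;
    JSAMPARRAY buffer = (*cinfo.mem->alloc_sarray)
        ((j_common_ptr)&cinfo, JPOOL_IMAGE, row_stride, 1);

    while (cinfo.output_scanline < cinfo.output_height) {
        jpeg_read_scanlines(&cinfo, buffer, 1);
        /* buffer[0] holds one scanline of packed 3-byte RGB pixels */
    }

    jpeg_finish_decompress(&cinfo);
    jpeg_destroy_decompress(&cinfo);
}
```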