I want to load cv::Mat images with an alpha channel into an OpenGL texture.
With the help of the following post I managed to load RGB images from a cv::Mat into an OpenGL texture:
https://stackguides.com/questions/16809833/opencv-image-loading-for-opengl-texture
But when I try to load an image with an alpha channel, there is a problem.
Here is how the glTexImage2D function is called:
glTexImage2D(GL_TEXTURE_2D,    // Type of texture
             0,                // Pyramid level (for mip-mapping) - 0 is the top level
             GL_RGBA,          // Internal colour format to convert to
             image.cols,       // Image width
             image.rows,       // Image height
             0,                // Border width in pixels (can either be 1 or 0)
             GL_BGRA_INTEGER,  // Input image format
             GL_UNSIGNED_BYTE, // Image data type
             image.ptr());     // The actual image data itself
Also, before rendering, I enable blending with glEnable(GL_BLEND);
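A typical alpha-blending setup looks roughly like this (the glBlendFunc choice below is an assumption, since only glEnable(GL_BLEND) is mentioned here):

glEnable(GL_BLEND);
// Conventional "source over" blending: weight the incoming colour by its
// alpha and the existing colour by one minus that alpha (assumed setup,
// not shown in the original code).
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);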
If I specify the input image format and the internal colour format to both be GL_RGBA / GL_BGRA, I get a segmentation fault, but if I set either or both of them to GL_RGBA_INTEGER, at least I can see the window, only it is blank.
I can change the transparency, and the window accordingly becomes more or less transparent, but there is no image in it. There is an image in the cv::Mat, as I can see it with cv::imshow, but somehow there seems to be a problem passing it to the OpenGL texture.
Could anyone suggest something I might be missing? Thanks in advance.
P.S.: I am new to OpenGL, so I would really appreciate an explanation down to the smallest detail. :P
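One detail worth making explicit: the call above assumes the cv::Mat actually holds four tightly packed 8-bit channels. A minimal sketch of preparing such a Mat, assuming the source image is 8-bit BGR (the variable names and the conversion step are illustrative, not taken from the post):

#include <opencv2/imgproc.hpp>

cv::Mat bgra;
// Assumed starting point: an 8-bit, 3-channel BGR image. Convert it to BGRA
// so the data matches a four-component pixel transfer format.
cv::cvtColor(image, bgra, cv::COLOR_BGR2BGRA);

// With the default unpack state, OpenGL expects rows packed back to back,
// so make sure the buffer is contiguous before handing out bgra.ptr().
if (!bgra.isContinuous())
    bgra = bgra.clone();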
_INTEGER pixel transfer formats do not make a lot of sense when used together with GL_UNSIGNED_BYTE; these are for special new packed integer formats like UNSIGNED_SHORT_5_5_5_1 (16 bits, where each component is a fractional portion of an integer type). Try using UNSIGNED_INT_8_8_8_8 instead of GL_UNSIGNED_BYTE if you are going to use GL_BGRA_INTEGER, this at least makes sense. – Andon M. Coleman
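For comparison, the non-integer path (the same approach the linked post uses for RGB, extended to four channels) pairs plain GL_BGRA with GL_UNSIGNED_BYTE and a normalized GL_RGBA internal format. A minimal sketch, assuming a CV_8UC4 BGRA Mat named bgra as in the earlier snippet:

// Row stride may not be a multiple of 4, so relax the default unpack
// alignment when needed (a common precaution when uploading OpenCV images).
glPixelStorei(GL_UNPACK_ALIGNMENT, (bgra.step & 3) ? 1 : 4);

glTexImage2D(GL_TEXTURE_2D,
             0,
             GL_RGBA,          // normalized internal format
             bgra.cols,
             bgra.rows,
             0,
             GL_BGRA,          // plain BGRA, not GL_BGRA_INTEGER
             GL_UNSIGNED_BYTE, // one byte per component
             bgra.ptr());

// Without mipmaps, the texture also needs a non-mipmapped minification
// filter, or it is treated as incomplete and samples as blank/black.
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);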