We are currently switching to WebP for textures in a video game. We've run into a problem where areas of an image whose alpha channel is set to zero lose all of their color detail. You can see this effect in the following example:
Original Image (left is color channels, right is alpha channel)
As you can see, the zero-alpha areas have lost their detail.
This optimization makes sense when the alpha channel is used for transparency. However, in our game we use the alpha channel for something else and need to preserve the color channels independently of the alpha channel. How do I disable this behavior in the encoder so that the color channels are encoded normally?
I should mention I'm using libwebp in C++, calling the function WebPEncodeRGBA.
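For reference, here is a minimal sketch of how we're calling it (the function name, buffer handling, and quality value are just placeholders for this question, not our actual pipeline):

```cpp
#include <cstdint>
#include <vector>
#include <webp/encode.h>

// Encode an 8-bit-per-channel RGBA buffer with libwebp's simple lossy API.
// 'pixels' holds width * height * 4 bytes; the stride argument is in bytes.
bool EncodeTexture(const std::vector<uint8_t>& pixels, int width, int height,
                   std::vector<uint8_t>* out) {
    uint8_t* webp_data = nullptr;
    size_t webp_size = WebPEncodeRGBA(pixels.data(), width, height,
                                      width * 4 /* stride in bytes */,
                                      80.0f /* quality, placeholder value */,
                                      &webp_data);
    if (webp_size == 0 || webp_data == nullptr) {
        return false;  // encoding failed
    }
    out->assign(webp_data, webp_data + webp_size);
    WebPFree(webp_data);  // release the buffer allocated by libwebp
    return true;
}
```

The detail loss in the zero-alpha areas shows up when we decode the output of calls like this one.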
Thanks!