9 votes

I've got a LibGDX game with cartoon clouds that have a smooth gradient. There are other gradients in the game with a similar issue, but the clouds are the most obvious example. They look fine on Android, on iOS and in the desktop version of the game, but in the WebGL version the gradients are not drawn smoothly. Only alpha gradients appear to have the problem; other gradients look OK.

I've tried on 3 different devices in Chrome and IE, and all 3 produce the same results. You can find a test of the HTML5 version here.

https://wordbuzzhtml5.appspot.com/canvas/

I've added an example IntelliJ project on github here

https://github.com/WillCalderwood/CloudTest

If you have IntelliJ, clone that project, open the build.gradle file, press Alt-F12, type gradlew html:superdev and then browse to http://localhost:8080/html/

The critical code is render() here

The bottom image here is the desktop version, the top is the WebGL version, both running on the same hardware.

[image: WebGL version (top) vs. desktop version (bottom)]

There's nothing clever going on with the drawing. It's just a call to

    spriteBatch.draw(texture, getLeft(), getBottom(), getWidth(), getHeight());

I'm using the default shader, with textures packed with premultiplied alpha and the blend function set as

    spriteBatch.setBlendFunction(GL20.GL_ONE, GL20.GL_ONE_MINUS_SRC_ALPHA);
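
For reference, that blend function computes the following per channel (a minimal sketch of the blend equation, with a method name of my own; the source colour is assumed to already be premultiplied):

    // GL_ONE / GL_ONE_MINUS_SRC_ALPHA: the source already carries its alpha,
    // so it is added as-is and the destination is attenuated by source coverage.
    float blendChannel(float src, float srcAlpha, float dst) {
        return src + dst * (1f - srcAlpha);
    }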

This is the actual image, although without premultiplied alpha, as that's done by my packer.

[image: the cloud texture]

Does anyone know a possible reason for this and how I might resolve it?

Update

This only appears to happen when using the blending mode GL20.GL_ONE, GL20.GL_ONE_MINUS_SRC_ALPHA

Another Update

I've tried changing the whole game to use non-premultiplied alpha textures. I use Texture Packer, which can help fix the halo issues that often occur with non-premultiplied alpha. All this works fine in the Android and desktop versions. In the WebGL version, while I get smooth gradients, I still get a small halo effect, so I can't use this as a solution either.

And another update

Here's a new image. Desktop version on the top, web version on the bottom. Blending mode GL20.GL_ONE, GL20.GL_ONE_MINUS_SRC_ALPHA on the left and GL20.GL_SRC_ALPHA, GL20.GL_ONE_MINUS_SRC_ALPHA on the right

[image: desktop (top) vs. web (bottom); GL_ONE/GL_ONE_MINUS_SRC_ALPHA (left) vs. GL_SRC_ALPHA/GL_ONE_MINUS_SRC_ALPHA (right)]

Here's a zoomed version of the bottom left image above with increased contrast to show the issue.

[image: zoomed, contrast-enhanced crop of the bottom-left image showing the banding]

I've also done a lot of playing with the fragment shader to try and work out what's happening. If I set

    gl_FragColor = vec4(c.a, c.a, c.a, 1.0);

then the gradient is smooth, but if I set

    gl_FragColor = vec4(c.r, c.r, c.r, 1.0);

then I get banding. I believe this points towards a precision issue, as the colour channels have been squeezed into the darker end of the spectrum by the pre-multiplication process.
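
As a rough sanity check on that theory, here's a hypothetical back-of-the-envelope sketch (plain Java, not from the project) of how many distinct channel values survive 8-bit premultiplication at low alpha:

    // At ~10% opacity, premultiplying squeezes all 256 possible channel values
    // into the range 0..25 before re-quantizing to 8 bits.
    int alpha = 25;
    java.util.TreeSet<Integer> distinct = new java.util.TreeSet<>();
    for (int r = 0; r <= 255; r++) {
        distinct.add(Math.round(r * alpha / 255f));
    }
    System.out.println(distinct.size()); // 26 - a smooth gradient collapses into bands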

What texture format are you using and what is your framebuffer format? RGBA8 uncompressed? – solidpixel

@Isogen74 Yes, RGBA8 uncompressed. – Will Calderwood

Are you sure the target alpha is 1 all the way? WebGL will composite with the page background as premultiplied alpha. To be sure, just clear alpha to 1 before the blit. – starmole

@starmole Check this. I've changed render() to double-check alpha is set to 1. The same problem still occurs. – Will Calderwood

Hm, I was just guessing. :) What if you clear to 1,1,1 instead? Also: is the background without the clouds already different? It looks a bit darker to me. – starmole

3 Answers

6 votes

WebGL treats alpha slightly differently from standard OpenGL, which can often cause problems.

This site explains the differences quite well.

The biggest difference between OpenGL and WebGL is that OpenGL renders to a backbuffer that is not composited with anything, or effectively not composited with anything by the OS's window manager, so it doesn't matter what your alpha is.

WebGL is composited by the browser with the web page, and the default is to use pre-multiplied alpha, the same as .png <img> tags with transparency and 2d canvas tags.

That site also gives workarounds for the typical problems people face. It's a little involved but should sort out your issues.

I'm not going to paste the whole article in here, but I suspect you'd be best off sticking with non-pre-multiplied alpha and making sure you clear the alpha channel after each render. The site goes into much more detail.
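
For instance, pinning the backbuffer alpha to 1 at the end of render() might look something like this in libGDX (a sketch based on that article's advice, not tested against your project):

    // Write only the alpha channel, clear it to 1.0, then restore the mask, so
    // the browser's premultiplied-alpha page compositing can't darken the colours.
    Gdx.gl.glColorMask(false, false, false, true);
    Gdx.gl.glClearColor(0f, 0f, 0f, 1f);
    Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
    Gdx.gl.glColorMask(true, true, true, true);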

4 votes

I spent the best part of the day looking into this, because I'm also seeing this exact issue. I think I finally got to the bottom of it.

This is caused by the way libGDX loads images. A texture is created from a Pixmap on all platforms, where a Pixmap is basically an in-memory mutable image. This is implemented in the core library with some native code (presumably for speed).

However, since native code is obviously impossible in the browser, Pixmap has a different implementation in the GWT backend. The salient part there is the constructor:

public Pixmap (FileHandle file) {
    GwtFileHandle gwtFile = (GwtFileHandle)file;
    ImageElement img = gwtFile.preloader.images.get(file.path());
    if (img == null) throw new GdxRuntimeException("Couldn't load image '" + file.path() + "', file does not exist");
    create(img.getWidth(), img.getHeight(), Format.RGBA8888);
    context.setGlobalCompositeOperation(Composite.COPY);
    context.drawImage(img, 0, 0);
    context.setGlobalCompositeOperation(getComposite());
}

This creates an HTMLCanvasElement and a CanvasRenderingContext2D, then draws the image to the canvas. This makes sense in the libGDX context, since a Pixmap is supposed to be mutable, but an HTML image is read-only.

I'm not exactly sure how the pixels are eventually retrieved again for upload to the OpenGL texture, but by this point we're already doomed. Note this warning in the canvas2d spec:

Note: Due to the lossy nature of converting to and from premultiplied alpha color values, pixels that have just been set using putImageData() might be returned to an equivalent getImageData() as different values.

To show the effect, I created a JSFiddle: https://jsfiddle.net/gg9tbejf/ This doesn't use libGDX, just raw canvas, JavaScript and WebGL, but you can see that the image is mutilated after a round-trip through canvas2d.

Apparently most (all?) major browsers store their canvas2d data with premultiplied alpha, so lossless recovery is impossible. This SO question shows fairly conclusively that there is currently no way around that.
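
The loss is easy to reproduce with plain arithmetic; here's a hypothetical sketch of the round-trip for a single channel:

    int alpha = 25;                                    // ~10% opacity
    int original = 200;                                // source red value
    int stored = Math.round(original * alpha / 255f);  // premultiplied and quantized: 20
    int recovered = Math.round(stored * 255f / alpha); // un-premultiplied again: 204
    // recovered != original - the pixel comes back "different", as the spec warns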


Edit: I wrote a workaround in my local project without modifying libGDX itself. Create ImageTextureData.java in your GWT project (package name matters; it accesses package-private fields):

package com.badlogic.gdx.backends.gwt;

import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.GL20;
import com.badlogic.gdx.graphics.Pixmap;
import com.badlogic.gdx.graphics.TextureData;
import com.badlogic.gdx.utils.GdxRuntimeException;
import com.google.gwt.dom.client.ImageElement;
import com.google.gwt.webgl.client.WebGLRenderingContext;

public class ImageTextureData implements TextureData {

    private final ImageElement imageElement;
    private final Pixmap.Format format;
    private final boolean useMipMaps;

    public ImageTextureData(ImageElement imageElement, Pixmap.Format format, boolean useMipMaps) {
        this.imageElement = imageElement;
        this.format = format;
        this.useMipMaps = useMipMaps;
    }

    @Override
    public TextureDataType getType() {
        return TextureDataType.Custom;
    }

    @Override
    public boolean isPrepared() {
        return true;
    }

    @Override
    public void prepare() {
    }

    @Override
    public Pixmap consumePixmap() {
        throw new GdxRuntimeException("This TextureData implementation does not use a Pixmap");
    }

    @Override
    public boolean disposePixmap() {
        throw new GdxRuntimeException("This TextureData implementation does not use a Pixmap");
    }

    @Override
    public void consumeCustomData(int target) {
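        // Hand the ImageElement straight to WebGL, bypassing the lossy canvas2d round-trip.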
        WebGLRenderingContext gl = ((GwtGL20) Gdx.gl20).gl;
        gl.texImage2D(target, 0, GL20.GL_RGBA, GL20.GL_RGBA, GL20.GL_UNSIGNED_BYTE, imageElement);
        if (useMipMaps) {
            gl.generateMipmap(target);
        }
    }

    @Override
    public int getWidth() {
        return imageElement.getWidth();
    }

    @Override
    public int getHeight() {
        return imageElement.getHeight();
    }

    @Override
    public Pixmap.Format getFormat() {
        return format;
    }

    @Override
    public boolean useMipMaps() {
        return useMipMaps;
    }

    @Override
    public boolean isManaged() {
        return false;
    }
}

Then add GwtTextureLoader.java anywhere in your GWT project:

package com.example.mygame.gwt;

import com.badlogic.gdx.assets.AssetDescriptor;
import com.badlogic.gdx.assets.AssetManager;
import com.badlogic.gdx.assets.loaders.AsynchronousAssetLoader;
import com.badlogic.gdx.assets.loaders.FileHandleResolver;
import com.badlogic.gdx.assets.loaders.TextureLoader;
import com.badlogic.gdx.backends.gwt.GwtFileHandle;
import com.badlogic.gdx.backends.gwt.ImageTextureData;
import com.badlogic.gdx.files.FileHandle;
import com.badlogic.gdx.graphics.Pixmap;
import com.badlogic.gdx.graphics.Texture;
import com.badlogic.gdx.graphics.TextureData;
import com.badlogic.gdx.utils.Array;
import com.google.gwt.dom.client.ImageElement;

public class GwtTextureLoader extends AsynchronousAssetLoader<Texture, TextureLoader.TextureParameter> {
    TextureData data;
    Texture texture;

    public GwtTextureLoader(FileHandleResolver resolver) {
        super(resolver);
    }

    @Override
    public void loadAsync(AssetManager manager, String fileName, FileHandle fileHandle, TextureLoader.TextureParameter parameter) {
        if (parameter == null || parameter.textureData == null) {
            Pixmap.Format format = null;
            boolean genMipMaps = false;
            texture = null;

            if (parameter != null) {
                format = parameter.format;
                genMipMaps = parameter.genMipMaps;
                texture = parameter.texture;
            }

            // Mostly these few lines changed w.r.t. TextureLoader:
            GwtFileHandle gwtFileHandle = (GwtFileHandle) fileHandle;
            ImageElement imageElement = gwtFileHandle.preloader.images.get(fileHandle.path());
            data = new ImageTextureData(imageElement, format, genMipMaps);
        } else {
            data = parameter.textureData;
            if (!data.isPrepared()) data.prepare();
            texture = parameter.texture;
        }
    }

    @Override
    public Texture loadSync(AssetManager manager, String fileName, FileHandle fileHandle, TextureLoader.TextureParameter parameter) {
        Texture texture = this.texture;
        if (texture != null) {
            texture.load(data);
        } else {
            texture = new Texture(data);
        }
        if (parameter != null) {
            texture.setFilter(parameter.minFilter, parameter.magFilter);
            texture.setWrap(parameter.wrapU, parameter.wrapV);
        }
        return texture;
    }

    @Override
    public Array<AssetDescriptor> getDependencies(String fileName, FileHandle fileHandle, TextureLoader.TextureParameter parameter) {
        return null;
    }
}

Then set that loader on your AssetManager in your GWT project only:

assetManager.setLoader(Texture.class, new GwtTextureLoader(assetManager.getFileHandleResolver()));

Note: You have to ensure that your images are power-of-two to begin with; this approach obviously cannot do any conversions for you. Mipmapping and texture filtering options should be supported, though.

It would be nice if libGDX stopped using canvas2d in the common case of just loading an image, and passed the image element to texImage2D directly. I'm not sure how to fit that in architecturally (and I'm a GWT noob to boot). Since the original GitHub issue is closed, I've filed a new one with the suggested solution.

Update: the issue was fixed in this commit, which is included in libGDX 1.9.4 and above.

1 vote

I wonder if you are hitting a precision issue somewhere - premultiplied alpha textures make the color channels darker than the original.

Conceptually this process squashes color values into the bottom end of the color range, which can cause quantization banding when you re-encode as an 8-bit texture. What I can't explain is why you get oscillations between light and dark, unless this is an interaction of different band strides in the color and alpha channels.

OpenGL and OpenGL ES 3.0 support sRGB textures, which could help with this (effectively they are much better able to express colour differences at the dark end of the spectrum, sacrificing high-luminance values where the eye is less able to distinguish them). Unfortunately this is not widely available on OpenGL ES 2.0 mobile devices (and therefore not in WebGL, as WebGL is based on ES 2.0, although the sRGB extension may be available on some devices).
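
For what it's worth, on a GL30/ES 3.0-capable backend the upload might look roughly like this (a sketch with hypothetical width, height and pixels variables; on WebGL 1.0 you would need the EXT_sRGB extension instead):

    // Allocate the texture with an sRGB internal format so the dark end of the
    // range gets more effective precision; sampling converts back to linear.
    Gdx.gl30.glTexImage2D(GL20.GL_TEXTURE_2D, 0, GL30.GL_SRGB8_ALPHA8,
            width, height, 0, GL20.GL_RGBA, GL20.GL_UNSIGNED_BYTE, pixels);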