To experiment with WebGL, I've written a script that renders a bunch of 2D sprites on a canvas. All the sprites are just textured rectangles, and the texture on each rectangle is the same. They change their positions randomly on each frame.
I'm seeing a strange issue on Windows: in Chrome, the framerate is almost half of what I get in Firefox (on Mac, Chrome and Firefox have similar framerates). Is there a known problem in the Windows build of Chrome that affects WebGL performance?
Another issue is that when I increase the sprite count from about 500 to 5000, the framerate drops from almost 60 to 20-30 frames per second. Correct me if I'm wrong, but shouldn't rendering 5000 textured quads be a relatively light workload for modern graphics cards? Modern games that run at 30 fps probably have MUCH higher polygon counts.
Here's how my rendering works:
1. I create three WebGL buffers: for vertex positions, texture coordinates, and vertex indices. Each of these buffers has more than enough space to accommodate 5000 sprites. I also create in-memory typed arrays to hold the same data (`Float32Array` for vertices and texture coords, `Uint16Array` for indices).
2. On each frame, when adding a sprite, I write its vertex, texture coord, and index data into the in-memory arrays (at this point, no WebGL calls occur).
3. At the end of the frame, I upload the data from the in-memory arrays into the WebGL buffers using `bufferSubData` and call `drawElements`.
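In case it helps, here is a stripped-down sketch of the batching scheme described above. All names (`writeSprite`, `flush`, the buffer variables) are mine, not from my actual code, and the GL buffers are assumed to have been created and sized at init time:

```javascript
const MAX_SPRITES = 5000;

// In-memory staging arrays: 4 vertices per quad (2 floats each),
// 6 indices per quad (two triangles).
const positions = new Float32Array(MAX_SPRITES * 4 * 2);
const texCoords = new Float32Array(MAX_SPRITES * 4 * 2);
const indices   = new Uint16Array(MAX_SPRITES * 6);

// Step 2: write one sprite's quad into the staging arrays.
// No WebGL calls happen here.
function writeSprite(i, x, y, w, h) {
  const p = i * 8;
  positions.set([x, y,  x + w, y,  x + w, y + h,  x, y + h], p);
  texCoords.set([0, 0,  1, 0,  1, 1,  0, 1], p);
  const v = i * 4;
  indices.set([v, v + 1, v + 2,  v, v + 2, v + 3], i * 6);
}

// Step 3: at the end of the frame, upload only the used portion of
// each staging array, then issue a single draw call.
// (gl, positionBuffer, texCoordBuffer, indexBuffer come from init code.)
function flush(gl, positionBuffer, texCoordBuffer, indexBuffer, spriteCount) {
  gl.bindBuffer(gl.ARRAY_BUFFER, positionBuffer);
  gl.bufferSubData(gl.ARRAY_BUFFER, 0, positions.subarray(0, spriteCount * 8));
  gl.bindBuffer(gl.ARRAY_BUFFER, texCoordBuffer);
  gl.bufferSubData(gl.ARRAY_BUFFER, 0, texCoords.subarray(0, spriteCount * 8));
  gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, indexBuffer);
  gl.bufferSubData(gl.ELEMENT_ARRAY_BUFFER, 0, indices.subarray(0, spriteCount * 6));
  gl.drawElements(gl.TRIANGLES, spriteCount * 6, gl.UNSIGNED_SHORT, 0);
}
```

So per frame there is one `bufferSubData` per buffer and one `drawElements` for the whole batch, not one draw call per sprite.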
Am I doing something wrong? Shouldn't the framerate be pretty high for 5k sprites? And finally, why is the framerate lower in Chrome than in Firefox?