Note that much of this code has changed as of edit 3 below.
So I really like a blog post by Brandon Jones (found here). I wanted to convert his code to Three.js, but I am having some issues. You can find his full code here. Here is my attempt so far, with a couple of comments for questions I have:
// Shader
var tilemapVS = [
"attribute vec2 pos;",
"attribute vec2 texture;",
"varying vec2 pixelCoord;",
"varying vec2 texCoord;",
"uniform vec2 viewOffset;",
"uniform vec2 viewportSize;",
"uniform vec2 inverseTileTextureSize;",
"uniform float inverseTileSize;",
"void main(void) {",
" pixelCoord = (texture * viewportSize) + viewOffset;",
" texCoord = pixelCoord * inverseTileTextureSize * inverseTileSize;",
" gl_Position = vec4(pos, 0.0, 1.0);",
"}"
].join("\n");
var tilemapFS = [
"precision highp float;",
"varying vec2 pixelCoord;",
"varying vec2 texCoord;",
"uniform sampler2D tiles;",
"uniform sampler2D sprites;",
"uniform vec2 inverseTileTextureSize;",
"uniform vec2 inverseSpriteTextureSize;",
"uniform float tileSize;",
"uniform int repeatTiles;",
"void main(void) {",
" if(repeatTiles == 0 && (texCoord.x < 0.0 || texCoord.x > 1.0 || texCoord.y < 0.0 || texCoord.y > 1.0)) { discard; }",
" vec4 tile = texture2D(tiles, texCoord);",
" if(tile.x == 1.0 && tile.y == 1.0) { discard; }",
" vec2 spriteOffset = floor(tile.xy * 256.0) * tileSize;",
" vec2 spriteCoord = mod(pixelCoord, tileSize);",
" gl_FragColor = texture2D(sprites, (spriteOffset + spriteCoord) * inverseSpriteTextureSize);",
//" gl_FragColor = tile;",
"}"
].join("\n");
this.material = new THREE.ShaderMaterial({
    attributes: {
        //not really sure what to use here, he uses some quadVertBuffer
        //for these values, but not sure how to translate.
        pos: { type: 'v2', value: new THREE.Vector2(0, 0) },
        texture: { type: 'v2', value: new THREE.Vector2(0, 0) }
    },
    uniforms: {
        viewportSize: { type: 'v2', value: new THREE.Vector2(viewport.width() / this.tileScale, viewport.height() / this.tileScale) },
        inverseSpriteTextureSize: { type: 'v2', value: new THREE.Vector2(1/tileset.image.width, 1/tileset.image.height) },
        tileSize: { type: 'f', value: this.tileSize },
        inverseTileSize: { type: 'f', value: 1/this.tileSize },
        tiles: { type: 't', value: tilemap },
        sprites: { type: 't', value: tileset },
        viewOffset: { type: 'v2', value: new THREE.Vector2(Math.floor(0), Math.floor(0)) },
        inverseTileTextureSize: { type: 'v2', value: new THREE.Vector2(1/tilemap.image.width, 1/tilemap.image.height) },
        //is 'i' the correct type for an int?
        repeatTiles: { type: 'i', value: 1 }
    },
    vertexShader: tilemapVS,
    fragmentShader: tilemapFS,
    transparent: false
});
/*this.material = new THREE.MeshBasicMaterial({
color: 0xCC0000
})*/
this.plane = new THREE.PlaneGeometry(
    tilemap.image.width * this.tileSize * this.tileScale, //width
    tilemap.image.height * this.tileSize * this.tileScale//, //height
    //tilemap.image.width * this.tileScale, //width-segments
    //tilemap.image.height * this.tileScale //height-segments
);
this.plane.dynamic = true;
this.mesh = new THREE.Mesh(this.plane, this.material);
Once I load the page I get the following error:
TypeError: v1 is undefined
customAttribute.array[ offset_custom ] = v1.x;
I'm sure this has to do with how I set the attributes, but I'm not sure what they should be. Any help is appreciated, as there is little to no documentation on custom shaders in Three.js.
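For what it's worth, from poking around in the WebGLRenderer source it looks like a 'v2' custom attribute expects its value to be an array with one THREE.Vector2 per vertex (the v1 in the error appears to be value[vertexIndex]), not a single vector. So my guess (and it is only a guess, including the vertex order) is that the attributes would need to look something like:
// guess: one entry per vertex of the plane, using the same corner / uv values as his quad
attributes: {
    pos: { type: 'v2', value: [
        new THREE.Vector2(-1, -1),
        new THREE.Vector2( 1, -1),
        new THREE.Vector2( 1,  1),
        new THREE.Vector2(-1,  1)
    ]},
    texture: { type: 'v2', value: [
        new THREE.Vector2(0, 1),
        new THREE.Vector2(1, 1),
        new THREE.Vector2(1, 0),
        new THREE.Vector2(0, 0)
    ]}
}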
EDIT: Here is the code used in the blog post to fill the two attributes of the vertex shader (pos and texture):
//in ctor
var quadVerts = [
    //  x   y   u   v
       -1, -1,  0,  1,
        1, -1,  1,  1,
        1,  1,  1,  0,
       -1, -1,  0,  1,
        1,  1,  1,  0,
       -1,  1,  0,  0
];
this.quadVertBuffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, this.quadVertBuffer);
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(quadVerts), gl.STATIC_DRAW);
this.tilemapShader = GLUtil.createProgram(gl, tilemapVS, tilemapFS);
//...
//then on the draw method
gl.bindBuffer(gl.ARRAY_BUFFER, this.quadVertBuffer);
gl.enableVertexAttribArray(shader.attribute.position);
gl.enableVertexAttribArray(shader.attribute.texture);
gl.vertexAttribPointer(shader.attribute.position, 2, gl.FLOAT, false, 16, 0);
gl.vertexAttribPointer(shader.attribute.texture, 2, gl.FLOAT, false, 16, 8);
I really don't fully understand exactly what is happening here, but if I am correct I think it is filling two Float32Arrays with half the data of the quadVertBuffer in each. Not only am I not sure why, I am not sure if I'm correct, nor do I know how to convert this to the Three.js method.
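After staring at it a bit more, my (possibly wrong) reading of the stride and offset arguments is that there is only one interleaved buffer, not two separate arrays:
// my reading of the calls above (assuming 4-byte floats):
//   each vertex in quadVerts is 4 floats = 16 bytes: [ x, y, u, v ]
//   vertexAttribPointer(position, 2, FLOAT, false, 16, 0) -> reads (x, y)
//   vertexAttribPointer(texture,  2, FLOAT, false, 16, 8) -> reads (u, v)
// i.e. the six rows are two triangles forming a fullscreen quad,
// each vertex carrying its own texture coordinate.
If that reading is right, it maps onto a plane's vertex positions and uvs in Three.js, which is what I try in EDIT 3 below.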
EDIT 2: Right now I am using a plane to display the (2D) background; should I be using a sprite instead?
EDIT 3: So I got a little farther when I realized that Three.js will set the position and uv vectors for me (which seem to be similar, if not the same, as pos/texture in the example above). I also noticed that I may have had some types wrong, since many of the 'v2' types I had (which invoke uniform2f) were actually being loaded via uniform2fv, so I changed those to 'v2v' and updated the values. Now I don't get the error, and it does paint something, just not quite the tilemap.
Here is the updated Vertex Shader:
var tilemapVS = [
"varying vec2 pixelCoord;",
"varying vec2 texCoord;",
"uniform vec2 viewOffset;",
"uniform vec2 viewportSize;",
"uniform vec2 inverseTileTextureSize;",
"uniform float inverseTileSize;",
"void main(void) {",
" pixelCoord = (uv * viewportSize) + viewOffset;",
" texCoord = pixelCoord * inverseTileTextureSize * inverseTileSize;",
" gl_Position = vec4(position.x, position.y, 0.0, 1.0);",
"}"
].join("\n");
and the updated Shader Material:
this._material = new THREE.ShaderMaterial({
    uniforms: {
        viewportSize: { type: 'v2v', value: [new THREE.Vector2(viewport.width() / this.tileScale, viewport.height() / this.tileScale)] },
        inverseSpriteTextureSize: { type: 'v2v', value: [new THREE.Vector2(1/tileset.image.width, 1/tileset.image.height)] },
        tileSize: { type: 'f', value: this.tileSize },
        inverseTileSize: { type: 'f', value: 1/this.tileSize },
        tiles: { type: 't', value: tilemap },
        sprites: { type: 't', value: tileset },
        viewOffset: { type: 'v2', value: new THREE.Vector2(0, 0) },
        inverseTileTextureSize: { type: 'v2v', value: [new THREE.Vector2(1/tilemap.image.width, 1/tilemap.image.height)] },
        repeatTiles: { type: 'i', value: 1 }
    },
    vertexShader: tilemapVS,
    fragmentShader: tilemapFS,
    transparent: false
});
And here is the result that I get:
Any ideas are welcome!
EDIT 4: If I change the vertex shader to use what I have found to be the "Three.js method" of setting gl_Position, I can get even closer, but the offset into the sprite sheet is wrong. I think the pixelCoord varying is set wrong (since uv has slightly different values than texture, I think).
I changed the Vertex Shader's main function to:
void main(void) {
    pixelCoord = (uv * viewportSize) + viewOffset;
    texCoord = pixelCoord * inverseTileTextureSize * inverseTileSize;
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}
and now I get actual tiles from the texture sheet, but the tile it chooses is wrong:
Getting closer; any help is still appreciated.
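One difference I can see between the two setups: in the quadVerts table above the bottom-left corner (-1, -1) carries v = 1, while as far as I know a plane's uv at that corner is 0, so uv.y seems to be inverted relative to the blog's texture coordinate. Something I plan to try (no idea yet whether it is correct, or how it interacts with flipY) is inverting it when computing pixelCoord:
void main(void) {
    // guess: flip v so it matches the blog's texture coordinate
    pixelCoord = (vec2(uv.x, 1.0 - uv.y) * viewportSize) + viewOffset;
    texCoord = pixelCoord * inverseTileTextureSize * inverseTileSize;
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}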
EDIT 5:
I suspect this will be my last update, as I am close to having an answer. After setting tileset.flipY = false; (where tileset is the actual tile texture, not the red map), I get all the right tiles landing in the right places, except they are all upside down!
Here is what it looks like after this change:
Is there some way to flip each individual texture over the Y axis (without editing the tileset image)? I feel like there is some simple vector math I could add to my shader to flip each texture it draws and finalize this.
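The only idea I have so far (untested, so treat it as a guess) is to mirror the y part of spriteCoord within each tile in the fragment shader, roughly:
vec2 spriteCoord = mod(pixelCoord, tileSize);
spriteCoord.y = tileSize - spriteCoord.y; // guess: flip each tile vertically
gl_FragColor = texture2D(sprites, (spriteOffset + spriteCoord) * inverseSpriteTextureSize);
(That may need a half-texel or "- 1.0" adjustment to line up exactly.)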
I do note that if I don't flip both (tilemap.flipY = false; and tileset.flipY = false;), I get the right textures, in the right spots, fitting together correctly. But the entire map is upside down! So close...
The code is in js/game/lib/core/TileMap.js, which is instantiated and added to the scene in js/game/lib/core/Engine.js. All the properties in the resources object are THREE.Textures loaded with THREE.TextureLoader. Thanks! – Chad
I am not sure exactly what the .flipY properties (that I am setting to false) do; from the Three.js source it looks like they just feed _gl.pixelStorei( _gl.UNPACK_FLIP_Y_WEBGL, texture.flipY );, but the resulting texture is still upside down. If I don't set this flag on both the tilemap and tileset textures, then the maps don't line up and incorrect textures are used. – Chad