Question updated with new code
I am trying to write a WebGL shader that will draw a TMX layer (exported from the Tiled editor). I am using THREE.js to create a plane mesh and give it a ShaderMaterial that draws the map onto it.
For those who don't know: a tilemap exported from the Tiled editor as JSON has a data attribute for each layer, which is an array of numeric values; each value is the index of a tile in the tileset, like:
"data": [5438, 5436, 5437, 5438, 5436, 5437, 5438, 5436, 5437, 5438, 845, ...]
Since my tilemap is 256 by 256 tiles, the array is 65536 elements long. Each element's value refers to a tile in the tileset, where the indexes are counted like this:
(image of the tileset with tiles numbered row by row; source: melonjs.org)
So index 0 of the layer data holds tile 5438 of the tileset, counted as in the image above. Likewise, each position in the data array corresponds to a tile location in the map, counted in the same row-major order.
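For example, converting one of these indexes into pixel coordinates within the tileset image works roughly like this (a sketch with placeholder names like tilesetWidth; it ignores Tiled's firstgid offset):

//sketch: turn a tile index into pixel coords within the tileset image
//(placeholder names; Tiled's firstgid offset is ignored for simplicity)
var tilesPerRow = tilesetWidth / tileWidth; //e.g. 1024 / 16 = 64
function tileToPixel(index) {
    var tx = index % tilesPerRow;
    var ty = Math.floor(index / tilesPerRow);
    return { x: tx * tileWidth, y: ty * tileHeight };
}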
Here is how I am creating the material, plane, and mesh:
this.size = new THREE.Vector2(256, 256);
this.tileSize = new THREE.Vector2(16, 16);

this._material = new THREE.ShaderMaterial({
    uniforms: this._uniforms,
    vertexShader: this.vShader,
    fragmentShader: this.fShader,
    transparent: (this.opacity === 0)
});

this._plane = new THREE.PlaneGeometry(
    this.size.x * this.tileSize.x,
    this.size.y * this.tileSize.y
);

this._mesh = new THREE.Mesh(this._plane, this._material);
And finally, the uniforms and shaders. Basically, I need to map each data element to an actual tile in the tileset and draw it. To get the data array into the shader, I load it as a THREE.DataTexture and treat it as a texture.
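The tile IDs get packed into the texture's bytes roughly like this (a simplified sketch of the idea, splitting each ID big-endian into 4 bytes, not my exact code):

//sketch: pack each tile ID into 4 bytes (rgba) for the DataTexture
var buf = new Uint8Array(this.data.length * 4);
for (var i = 0; i < this.data.length; ++i) {
    var id = this.data[i];
    buf[i * 4 + 0] = (id >>> 24) & 0xFF; //r
    buf[i * 4 + 1] = (id >>> 16) & 0xFF; //g
    buf[i * 4 + 2] = (id >>> 8) & 0xFF;  //b
    buf[i * 4 + 3] = id & 0xFF;          //a
}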
Here is my second attempt:
//Shaders
var vShader = [
    'varying vec2 pixelCoord;',
    'varying vec2 texCoord;',

    'uniform vec2 layerSize;',
    'uniform vec2 tilesetSize;',
    'uniform vec2 inverseTilesetSize;',
    'uniform vec2 tileSize;',
    'uniform vec2 inverseTileSize;',
    'uniform float scale;',

    'void main(void) {',
    '    pixelCoord = (uv * layerSize) * tileSize * scale;', //pixel we are at
    '    texCoord = pixelCoord * inverseTilesetSize * inverseTileSize;', //calculate the coord on this map
    '    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);',
    '}'
].join('\n');
var fShader = [
    //"precision highp float;",

    'varying vec2 pixelCoord;',
    'varying vec2 texCoord;',

    'uniform vec2 tilesetSize;',
    'uniform vec2 inverseTilesetSize;',
    'uniform vec2 tileSize;',
    'uniform vec2 inverseTileSize;',
    'uniform vec2 numTiles;',
    'uniform float scale;',

    'uniform sampler2D tileset;',
    'uniform sampler2D tileIds;',

    //I store the tile IDs as a texture (1 float value = rgba);
    //this will decode the rgba values back into a float ID
    'highp float decode32(highp vec4 rgba) {',
    '    const vec4 bit_shift = vec4(1.0 / (256.0 * 256.0 * 256.0), 1.0 / (256.0 * 256.0), 1.0 / 256.0, 1.0);',
    '    float depth = dot(rgba, bit_shift);',
    '    return depth;',
    '}',

    'void main(void) {',
    '    vec4 tileId = texture2D(tileIds, texCoord);', //grab this tile's ID from the layer data
    '    tileId.rgba = tileId.abgr;', //flip due to endianness
    //I find that this value is always 0 < tileValue < 1; I think my decode32 sucks...
    '    float tileValue = decode32(tileId);', //decode the vec4 into the float ID
    '    vec2 tileLoc = vec2(mod(tileValue, numTiles.y), floor(tileValue / numTiles.y));', //convert the ID into x, y coords
    '    vec2 coord = floor(tileLoc * 256.0) * tileSize;', //coord in the tileset
    '    vec2 offset = mod(pixelCoord, tileSize);', //offset within the tile to draw
    '    gl_FragColor = texture2D(tileset, (coord + offset) * inverseTilesetSize);', //grab the tile from the tileset
    '}'
].join('\n');
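As a sanity check on decode32 outside the shader: texture2D returns each channel divided by 255, so the dot product above can never exceed about 1.004, which would explain the always-below-1 values I am seeing. Here is the same math redone in JS (ignoring the abgr swizzle):

//decode32 redone in JS; the sampler hands the shader each byte / 255
function decode32(r, g, b, a) {
    return r / (256 * 256 * 256) + g / (256 * 256) + b / 256 + a;
}
//tile ID 5438 = 0x0000153E -> bytes (0, 0, 21, 62)
console.log(decode32(0 / 255, 0 / 255, 21 / 255, 62 / 255)); //≈ 0.2435, not 5438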
And the uniforms and the data texture:
//tried making this 256 x 256 like it is conceptually,
//and also tried 65536 x 1 like the data structure
this.dataTex = new THREE.DataTexture(
    this.data,
    this.data.length, //width (65536)
    1, //height (1)
    THREE.RGBAFormat, //format
    THREE.UnsignedByteType, //type
    THREE.UVMapping, //mapping
    THREE.ClampToEdgeWrapping, //wrapS
    THREE.ClampToEdgeWrapping, //wrapT
    THREE.NearestFilter, //magFilter
    THREE.NearestMipMapNearestFilter //minFilter
);
this.dataTex.needsUpdate = true;
this._uniforms = window._uniforms = {
    layerSize: { type: 'v2', value: this.size },
    tilesetSize: { type: 'v2', value: new THREE.Vector2(this.tileset.image.width, this.tileset.image.height) },
    inverseTilesetSize: { type: 'v2', value: new THREE.Vector2(1 / this.tileset.image.width, 1 / this.tileset.image.height) },
    tileSize: { type: 'v2', value: this.tileSize },
    inverseTileSize: { type: 'v2', value: new THREE.Vector2(1 / this.tileSize.x, 1 / this.tileSize.y) },
    numTiles: { type: 'v2', value: new THREE.Vector2(this.tileset.image.width / this.tileSize.x, this.tileset.image.height / this.tileSize.y) },
    scale: { type: 'f', value: 1 / this.scale },
    tileset: { type: 't', value: this.tileset },
    tileIds: { type: 't', value: this.dataTex },
    repeatTiles: { type: 'i', value: this.repeat ? 1 : 0 }
};
So when this renders, I just get the first tile of the tileset repeated over and over. I am not sure what is causing it, but since that tile sits at position 0, 0 in the tileset, I think I have a zero messing with me somewhere.