I am trying to store a vertex array in a WebGL texture but can't really figure out how to do it correctly. The purpose of this is to pass the vertices to my fragment shader and process them for raytracing.
My JavaScript code currently looks like this:
// Model.VerticeMap is a plain array holding all vertices [x1, y1, z1, x2, y2, z2, ...]
// Passing it to the Float32Array constructor copies its values into a new typed array.
var dataArray = new Float32Array(Model.VerticeMap); // array length: 36864
var vMapTexture = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, vMapTexture);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGB, 2, 2, 0, gl.RGB, gl.FLOAT, dataArray);
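For what it's worth, the size error seems to come from texImage2D expecting exactly width * height * 3 floats for an RGB/FLOAT upload. A sketch of helpers that derive a fitting power-of-two texture size from the data and pad the array to match (the function names are just illustrative, not part of any API):

```javascript
// texImage2D with format gl.RGB and type gl.FLOAT reads exactly
// width * height * 3 floats from the supplied ArrayBufferView.
// 36864 floats = 12288 RGB texels, which cannot fit in a 2x2 texture.
function fitPowerOfTwo(floatCount) {
  var texelCount = Math.ceil(floatCount / 3); // one vertex per RGB texel
  var size = 1;
  while (size * size < texelCount) {
    size *= 2; // grow to the next square power-of-two texture
  }
  return size;
}

function padToTexture(vertices, size) {
  var padded = new Float32Array(size * size * 3); // zero-filled by default
  padded.set(vertices); // copy the real vertex data to the front
  return padded;
}

// 36864 floats (12288 vertices) fit in a 128x128 texture,
// since 128 * 128 = 16384 >= 12288.
var size = fitPowerOfTwo(36864); // -> 128
var data = padToTexture(new Float32Array(36864), size);
// gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGB, size, size, 0, gl.RGB, gl.FLOAT, data);
```

Note that float textures also need gl.NEAREST filtering via texParameteri unless the OES_texture_float_linear extension is available.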
I get no errors with a texture width and height of 2, but taking higher values results in the error "ArrayBufferView not big enough for request".
I am also using the OES_texture_float extension to allow float values in textures.
I must admit that I'm not 100% sure what I'm doing here; it was a lot of trial and error, and I was happy when I reached the point of getting no errors with this code. My main idea was to store the x, y, z values of every vertex as the r, g, b values of one pixel of the texture. The texture would therefore need at least as many pixels as the model has vertices. Unfortunately, WebGL 1 restricts non-power-of-two textures (no mipmaps, and they require CLAMP_TO_EDGE wrapping). The number of vertices should also be variable, since I'm planning to render multiple models in my scene.
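Once the vertex data is in such a texture, the fragment shader can reconstruct a vertex from its linear index. A GLSL sketch of what I have in mind, where uVertexTexture and uTexSize are assumed uniform names and the texture is sampled with gl.NEAREST:

```glsl
uniform sampler2D uVertexTexture; // one vertex per RGB texel
uniform float uTexSize;           // texture width == height

vec3 getVertex(float index) {
  // Convert a linear vertex index into 2D texel coordinates,
  // sampling at texel centers (+0.5) to avoid filtering artifacts.
  float x = mod(index, uTexSize);
  float y = floor(index / uTexSize);
  vec2 uv = (vec2(x, y) + 0.5) / uTexSize;
  return texture2D(uVertexTexture, uv).rgb;
}
```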
I would be grateful for some assistance concerning the correct usage of an ArrayBuffer, a Float32Array and the texImage2D() call to solve my problem of sending vertices to a fragment shader in WebGL.