
I've been banging my head on this for a while now. I have an indexed PlaneBufferGeometry I'm using for a GPGPU cloth physics simulation, and I can't for the life of me calculate the normals correctly in the final vertex shader after rendering the simulation to the texture.

This is my current setup (relevant parts included). I think it's not working because I need a way to know the order of the current vert and its neighbours in the "face". Just can't quite get it right.

javascript:

// get faces (each entry stores position-array offsets, i.e. vertex index * 3)
const indices = geometry.index.array;
const faces = [];
for(let i = 0; i < indices.length; i += 3)
{
    faces.push([indices[i + 0] * 3, indices[i + 1] * 3, indices[i + 2] * 3]);
}

const vertices = geometry.attributes.position.array;

// begin loop

// initializing data texture vertex positions
dataTexturePixels[index * 4 + 0] = vertices[index * 3 + 0];
dataTexturePixels[index * 4 + 1] = vertices[index * 3 + 1];
dataTexturePixels[index * 4 + 2] = vertices[index * 3 + 2];
dataTexturePixels[index * 4 + 3] = 0;

// storing lookup uvs in an attribute for looking up positions
positionReference[index * 3 + 0] = (index % size) / size;
positionReference[index * 3 + 1] = Math.floor(index / size) / size;

// end loop

This is where my brain is tripping up. I've tried using the face index values from the faces array in various ways, but because indices are duplicated, data is being overwritten. I can't think of how to properly store the vertex index information for each face so it can be looked up in the vertex shader using the positionReference (or some other way).
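The overwrite can be reproduced in a few lines of plain JS (a sketch with a hand-built index buffer, no three.js involved): any vertex on a shared edge belongs to more than one face, so a single per-vertex slot can't hold all of that vertex's face data.

```javascript
// Minimal sketch of why per-vertex face data gets overwritten: in an
// indexed quad, the two vertices on the shared edge belong to both
// triangles, so one slot per vertex can't hold every face it's in.
const indices = [0, 1, 2, 2, 1, 3]; // two triangles sharing edge 1-2

// Count how many faces each vertex participates in.
const faceCount = {};
for (let i = 0; i < indices.length; i += 3) {
  for (const v of [indices[i], indices[i + 1], indices[i + 2]]) {
    faceCount[v] = (faceCount[v] || 0) + 1;
  }
}
console.log(faceCount); // vertices 1 and 2 each appear in 2 faces
```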

vertex shader / after simulation runs:

// how I'd calculate the normals if I could get a proper ordered reference
vec2 coord1 = faceVert1UvReference.xy;
vec3 pos1 = texture2D(tPositions, coord1).xyz;

vec2 coord2 = faceVert2UvReference.xy;
vec3 pos2 = texture2D(tPositions, coord2).xyz;

vec2 coord3 = faceVert3UvReference.xy;
vec3 pos3 = texture2D(tPositions, coord3).xyz;

vec3 tangent = pos3 - pos2;
vec3 bitangent = pos1 - pos2;
vec3 normal = normalMatrix * normalize(cross(tangent, bitangent));
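The cross-product math itself can be sanity-checked on the CPU (a plain-JS sketch with hand-picked positions, not values from the simulation texture): for a counter-clockwise triangle in the XY plane, cross(tangent, bitangent) with the subtraction order above points along +Z.

```javascript
// CPU re-check of the shader's face-normal math.
// pos1/pos2/pos3 are hypothetical sample positions, not simulation data.
const sub = (a, b) => [a[0] - b[0], a[1] - b[1], a[2] - b[2]];
const cross = (a, b) => [
  a[1] * b[2] - a[2] * b[1],
  a[2] * b[0] - a[0] * b[2],
  a[0] * b[1] - a[1] * b[0],
];

const pos1 = [0, 0, 0], pos2 = [1, 0, 0], pos3 = [0, 1, 0]; // CCW in XY
const tangent = sub(pos3, pos2);
const bitangent = sub(pos1, pos2);
const normal = cross(tangent, bitangent);
console.log(normal); // [0, 0, 1] — the normal faces +Z, as expected
```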

fragment shader / lighting:

vec3 lightDirection = normalize(lightPosition); // also tried normalize(lightPosition - vWorldPosition);
vec3 normal = normalize(vNormal);
float lightValue = max(0.0, dot(normal, lightDirection)) * lightIntensity;
finalColor.rgb *= lightValue;

Not sure if I'm missing something obvious/doing something dumb, or if this problem is indeed hard. Without posting the many failed ways I've tried, does anyone have any ideas?

Any help is greatly appreciated.


[Edit 1]: I've added a couple examples: this one that uses flat shading with face normals, and this one showing my current messed-up smooth vertex normals progress. Having a hard time finding my error...


[Edit 2]: This is how I'm piping the face data in to each vertex. Everything looks correct to me from a data standpoint, but it's totally messed up visually. I can't for the life of me find where I'm going wrong.

javascript

const indices = geometry.index.array;
const faces = [];

// store faces for each vertex
for(let i = 0; i < indices.length; i += 3)
{
    const vertIndex1 = indices[ i + 0 ];
    const vertIndex2 = indices[ i + 1 ];
    const vertIndex3 = indices[ i + 2 ];

    faces[ vertIndex1 ] = faces[ vertIndex1 ] || [];
    faces[ vertIndex2 ] = faces[ vertIndex2 ] || [];
    faces[ vertIndex3 ] = faces[ vertIndex3 ] || [];

    faces[ vertIndex1 ].push([ vertIndex1, vertIndex2, vertIndex3 ]);
    faces[ vertIndex2 ].push([ vertIndex1, vertIndex2, vertIndex3 ]);
    faces[ vertIndex3 ].push([ vertIndex1, vertIndex2, vertIndex3 ]);
}

const size = 128;
const vertices = geometry.attributes.position;
const faceIndices = new Uint16Array( vertices.array.length );
const indices0 = gpuCompute.createTexture(size * 2, size * 2); // need 256x256 texture for all the data.
const indicesPixels0 = indices0.image.data;

let faceVertPixelIndex = 0,
    faceIndexRangeStart = 0,
    faceIndexRangeEnd = -1,
    index;

for(let i = 0; i < size; i++)
{
    for(let j = 0; j < size; j++)
    {
        index = j + (i * size);

        // ----------------------------------------------
        // writing vertex positions to data texture here
        // ----------------------------------------------


        if(faces[index])
        {
            const face = faces[index];
            const fLen = face.length;

            faceIndexRangeStart = faceIndexRangeEnd + 1;
            faceIndexRangeEnd = faceIndexRangeStart + fLen - 1;

            // face index range for looking up all faces a single vertex is in
            faceIndices[index * 3 + 0] = faceIndexRangeStart;
            faceIndices[index * 3 + 1] = faceIndexRangeEnd;
            faceIndices[index * 3 + 2] = 0; // unused

            for(let v = 0; v < fLen; v++)
            {
                // store face vertex indices in each pixel rgb
                indicesPixels0[faceVertPixelIndex * 4 + 0] = face[v][0]; // current face, vertex 1 index
                indicesPixels0[faceVertPixelIndex * 4 + 1] = face[v][1]; // current face, vertex 2 index
                indicesPixels0[faceVertPixelIndex * 4 + 2] = face[v][2]; // current face, vertex 3 index
                indicesPixels0[faceVertPixelIndex * 4 + 3] = 0; // unused

                faceVertPixelIndex++;
            }
        }
    }
}

geometry.addAttribute('faceIndices', new THREE.BufferAttribute(faceIndices, 3));

uniforms.tIndices.value = indices0;
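As a sanity check on the face-accumulation loop above (a plain-JS sketch using a hand-built 3x3-vertex grid in place of PlaneBufferGeometry): an interior vertex of a regular two-triangles-per-quad grid belongs to 6 faces, which is why the vertex shader's lookup loop can safely cap at 6 iterations.

```javascript
// Hand-built index buffer for a 3x3-vertex (2x2-quad) grid, two
// triangles per quad — the same topology as an indexed plane.
const indices = [];
for (let r = 0; r < 2; r++) {
  for (let c = 0; c < 2; c++) {
    const a = r * 3 + c, b = a + 1, d = a + 3, e = a + 4;
    indices.push(a, d, b,  b, d, e); // two triangles per quad
  }
}

// Same accumulation pattern as the question's loop.
const faces = [];
for (let i = 0; i < indices.length; i += 3) {
  const tri = [indices[i], indices[i + 1], indices[i + 2]];
  for (const v of tri) (faces[v] = faces[v] || []).push(tri);
}
console.log(faces[4].length); // 6 — the center vertex is in six faces
```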

vertex shader (relevant parts)

uniform vec2 resolution;
uniform sampler2D tPositions;
uniform sampler2D tIndices;

attribute vec3 faceIndices;

varying vec3 vNormal;

vec2 getCoord(in float index, in vec2 size)
{
    return vec2(mod(index, size.x) / size.x, floor(index / size.y) / size.y);
}

void addNormal(inout vec3 nrml, in float index)
{
    vec2 coord = getCoord(index, resolution * 2.0); // 256x256 sized texture for faces
    vec4 face = texture2D(tIndices, coord);

    // get uv for each vertex index in the face and grab positions.
    vec2 v1Coord = getCoord(face.x, resolution);
    vec3 v1 = texture2D(tPositions, v1Coord).xyz;

    vec2 v2Coord = getCoord(face.y, resolution);
    vec3 v2 = texture2D(tPositions, v2Coord).xyz;

    vec2 v3Coord = getCoord(face.z, resolution);
    vec3 v3 = texture2D(tPositions, v3Coord).xyz;

    vec3 tangent = v3 - v2;
    vec3 bitangent = v1 - v2;

    vec3 n = normalize(cross(tangent, bitangent));

    nrml += n;
}

void main()
{
    vec3 nrml = vec3(0.0);
    vec2 faceIndexRange = faceIndices.xy;

    float from = faceIndexRange.x;
    float to = faceIndexRange.y;
    float index = from;

    for(int i = 0; i < 6; i++)
    {
        if(index <= to)
        {
            addNormal(nrml, index);
            index += 1.0;
        }
        else
        {
            break;
        }
    }

    vNormal = normalMatrix * normalize(nrml);
}
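For reference, here is a CPU mirror of the shader's getCoord, plus a texel-center variant (the centered version is my assumption based on the 4x4 pixel-center example given in the comments, not code from the shader above):

```javascript
// CPU version of the shader's getCoord (uvs land on texel corners).
function getCoord(index, size) {
  return [(index % size) / size, Math.floor(index / size) / size];
}

// Texel-center variant: the +0.5 offset yields the 1/8, 3/8, 5/8, 7/8
// uvs suggested in the comments for a 4x4 texture.
function getCoordCentered(index, size) {
  return [((index % size) + 0.5) / size, (Math.floor(index / size) + 0.5) / size];
}

console.log(getCoordCentered(0, 4)); // [0.125, 0.125] = [1/8, 1/8]
console.log(getCoordCentered(5, 4)); // [0.375, 0.375] = [3/8, 3/8]
```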
You may have better luck getting help if you first demonstrate a live example with flat shading. Normals not required in that case. Also, your uvs should index into the center of a pixel, so if your texture is 4x4, then your uvs would be 1/8, 3/8, 5/8, 7/8. – WestLangley

@WestLangley I've added a couple codepens showing my current progress. Flat shading - link, and my current normals progress that's pretty messed up - link. – mystaticself

You clearly have the skills to debug this. :) I'd set the size to 4 for debugging. BTW, you no longer need to set type when specifying uniforms. – WestLangley

Thanks for the moral support :) I'm obviously keeping at it (been at it since long before I originally posted), hate asking for help, just running out of time. I know it's quite an involved thing to debug and probably too big of an ask for people to look at it, was worth a shot though... – mystaticself

@WestLangley Does my approach in Edit 2 seem sound? Or is that a dumb way of trying to pipe in face data? – mystaticself

1 Answer


I ended up scrapping the shared-face approach above and used a neighbour-lookup approach instead, thanks to @luigi-de-rosa for the help. I had previously tried this, but I was using incorrect values when grabbing the neighbours from the lookup texture, which made me think the approach wasn't working.

Here's how I'm calculating the normals now in the vertex shader.

float diff = 0.06; // tweak this value to yield different results.
vec2 coord = positionReference.xy;
vec3 transformed = texture2D(tPositions, coord).xyz;
vec3 neighbour1 = texture2D(tPositions, coord + vec2(diff, 0.0)).xyz;
vec3 neighbour2 = texture2D(tPositions, coord + vec2(0.0, diff)).xyz;
vec3 tangent = neighbour1 - transformed;
vec3 bitangent = neighbour2 - transformed;            
vec3 nrml = cross(tangent, bitangent);
vNormal = normalMatrix * -normalize(nrml); // pass to fragment shader

Much easier approach... sigh.
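A note on picking diff (my assumption: a square size x size position texture with uvs at texel centers): an offset of 1/size moves the lookup exactly one texel, i.e. to the adjacent vertex in the grid, whereas a hard-coded 0.06 at size = 128 spans several texels of filtered positions.

```javascript
// Texel-exact neighbour offset for a square data texture.
// Assumes size = 128 (the texture size used in the question) and
// uvs sampled at texel centers.
const size = 128;
const texelStep = 1 / size;
console.log(texelStep);        // 0.0078125 — one texel in uv space
console.log(0.06 / texelStep); // ≈ 7.68 — texels spanned by diff = 0.06
```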