I have created a few experiments to learn how to build better scenes, and I have been getting into more advanced topics that I can get to work but do not truly understand.
Simplified questions:
- How does InstancedBufferGeometry draw each object in the shader using a DataTexture? (My shader code is below.)
- What information gets saved to a WebGLRenderTarget from the shader?
- Can you apply three.js lighting to an InstancedBufferGeometry, or do I have to implement lighting and shadowing in my own shaders?
InstancedBufferGeometry
I understand that if you want to create many instances of a shape you clone the index, position, normal, and uv attributes into your InstancedBufferGeometry, but I do not understand why it works in my shader.
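For reference, I set the geometry up roughly like this (simplified; COUNT stands for the instance count and SIZE for the side of the DataTexture):

const box = new THREE.BoxBufferGeometry(1, 1, 1);
const geometry = new THREE.InstancedBufferGeometry();

// clone the box's index and per-vertex attributes
geometry.index = box.index;
geometry.attributes.position = box.attributes.position;
geometry.attributes.normal = box.attributes.normal;
geometry.attributes.uv = box.attributes.uv;

// one lookup per instance, pointing at that instance's texel in the DataTexture
const lookup = new Float32Array(COUNT * 2);
for (let i = 0; i < COUNT; i++) {
  lookup[i * 2] = (i % SIZE) / SIZE;
  lookup[i * 2 + 1] = Math.floor(i / SIZE) / SIZE;
}
geometry.addAttribute('lookup', new THREE.InstancedBufferAttribute(lookup, 2));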
For example, I can have a DataTexture that holds a position (x, y, z) and a w component for life. In my simulation I ping-pong back and forth between render targets, updating that data in my fragment shader. Next, I draw my InstancedBufferGeometry with a different shader, using the simulation DataTexture's x, y, and z as the position of each object. This is where I get confused: it works as it should, and I want to understand why.
My vertex shader uses the position, normal, and uv from the BoxBufferGeometry that I applied to my InstancedBufferGeometry, plus the position, which is the x, y, z, and w from my DataTexture. From there I go through the basic calculations, and when I get to gl_Position I wonder how it draws the box based on my DataTexture's x, y, and z. The way I thought this worked was that when gl_Position was computed it would just draw that one vertex. I do not fully understand why it draws a full box at that position.
precision highp float;

uniform mat4 modelMatrix;
uniform mat4 modelViewMatrix;
uniform mat4 projectionMatrix;
uniform mat3 normalMatrix;

attribute vec3 position;
attribute vec3 normal;
attribute vec2 uv;
attribute vec2 lookup; // instanced: 0,0 to 1,1, one texel per object

uniform sampler2D curPos;  // data texture: x, y, z and w for each object
uniform sampler2D prevPos; // data texture: x, y, z and w for each object

varying vec2 vUv;

// Orient the box along its direction of travel.
mat3 calcLookAtMatrix(vec3 origin, vec3 target, float roll) {
  vec3 rr = vec3(sin(roll), cos(roll), 0.0);
  vec3 ww = normalize(target - origin);
  vec3 uu = normalize(cross(ww, rr));
  vec3 vv = normalize(cross(uu, ww));
  return mat3(uu, vv, ww);
}

void main() {
  vUv = uv;
  vec2 luv = lookup;
  vec4 i = texture2D(curPos, luv);  // this instance's current position
  vec4 p = texture2D(prevPos, luv); // this instance's previous position
  mat3 rot = calcLookAtMatrix(p.xyz, i.xyz, 0.);
  vec3 vPosition = rot * position;    // rotate the box vertex
  vPosition += mix(p.xyz, i.xyz, .5); // move it to the instance position
  gl_Position = projectionMatrix * modelViewMatrix * vec4(vPosition, 1.0);
}
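The mesh itself is just the instanced geometry plus a RawShaderMaterial wrapping the shader above (simplified; the uniform values get filled in each frame):

const material = new THREE.RawShaderMaterial({
  vertexShader: vertexSource,   // the shader above
  fragmentShader: fragmentSource,
  uniforms: {
    curPos: { value: null },  // set each frame from the simulation targets
    prevPos: { value: null }
  }
});
scene.add(new THREE.Mesh(geometry, material));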
Ping-pong - I ping-pong back and forth on my data texture using WebGLRenderTargets. How does using .texture on the WebGLRenderTarget receive my new data? Does gl_FragColor get saved to the buffer on every execution?
In my simulation I have two render targets stored in an array, my front and back buffers, which get ping-ponged back and forth because you cannot read and write the same target at the same time. My mesh is a simple plane geometry, which uses a shader that updates my DataTexture in the fragment shader. How do the values from gl_FragColor for each execution get saved to the render target's texture buffer?
precision highp float;

uniform sampler2D source; // data texture: x, y, z and w
uniform sampler2D seed;   // original data texture: x, y, z and w
uniform float init;

varying vec2 vUv;

void main() {
  vec4 s = texture2D(source, vUv);
  if (s.w <= 0.) {
    // life ran out: respawn from the seed texture
    s = texture2D(seed, vUv);
    if (init == 0.) s.w = 100.;
  } else {
    s.xyz += vec3(0.); // noise here
    s.w -= 0.;         // decay value here
  }
  gl_FragColor = s; // written into the bound render target's texture
}
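For completeness, the two targets are created roughly like this (a simplified sketch; SIZE stands for the data-texture side, and FloatType keeps the full-precision positions):

const options = {
  minFilter: THREE.NearestFilter,
  magFilter: THREE.NearestFilter,
  format: THREE.RGBAFormat,
  type: THREE.FloatType // full-precision x, y, z, w per texel
};
this.targets = [
  new THREE.WebGLRenderTarget(SIZE, SIZE, options),
  new THREE.WebGLRenderTarget(SIZE, SIZE, options)
];
this.buffersCount = 2;
this.target = 0;
this.front = this.targets[0];
this.back = this.targets[1];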
Render ping-pong
public render() {
  // read from the target we rendered into last frame
  this.shader.uniforms.source.value = this.front.texture;
  // advance to the next target and make it the write buffer
  this.target++;
  this.target %= this.buffersCount;
  this.front = this.targets[this.target];
  let prev = this.target - 1;
  if (prev < 0) prev += this.buffersCount;
  this.back = this.targets[prev]; // the target we just read from
  this.renderer.render(this.orthoScene, this.orthoCamera, this.front);
}
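After stepping the simulation, I point the instancing shader at the two targets, roughly like this (instancedShader stands in for the RawShaderMaterial above):

this.instancedShader.uniforms.curPos.value = this.front.texture;  // just written
this.instancedShader.uniforms.prevPos.value = this.back.texture;  // previous step
this.renderer.render(this.scene, this.camera);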
Lighting - When I switch to using my own InstancedBufferGeometry and RawShaderMaterial, is there a way to use the lighting that three.js provides, or do I always have to add my own custom lighting and shadowing to my shaders?
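To make the question concrete: by custom lighting I mean hand-rolling something like a basic Lambert term in the fragment shader (a sketch; lightDirection and vNormal are things I would have to supply myself):

precision highp float;
uniform vec3 lightDirection; // direction toward the light, managed by me
varying vec3 vNormal;        // rotated normal passed from the vertex shader
void main() {
  float diffuse = max(dot(normalize(vNormal), normalize(lightDirection)), 0.0);
  gl_FragColor = vec4(vec3(diffuse), 1.0);
}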
gl_FragColor gets saved; gl_Position isn't even in the fragment shader. – pailhead