
I'm trying to develop a fragment shader that fades to 0 where the face normals are perpendicular to the direction of the player 'camera'. (This is for spherical planet atmospheres; I want them to fade at their outer extent).

I have the game set up so that the player is always at 'universe position' 0, 0, 0 and all objects have their transform.matrix3Ds translated around the player as it moves.

I should point out that I have multiple shaders working fine, some including mixing textures, interpolating between 2 models and specular shading; THIS problem, though, has me beat.

I've been thinking that the fragment shader needs to know the direction to the player (so that it can take the dot product of the current face normal and the player direction). But it also needs the model's 'current' vertex position (i.e. the vertex position the shader is currently drawing) added onto the inverse player camera direction; that way the direction to the camera from that point on the model's surface will be correct.

Well, apparently not. (I could explain what I've been doing, but I can sense people ignoring this question already...). Can anybody tell me HOW I can correctly calculate the direction to the player, given that I also need to include (I THINK) the fact that the model's vertex positions are 'offset' from the model's central position? This has been doing my head in!
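To put it concretely, what I think I'm describing per vertex is roughly the sketch below. The register choices are placeholders, not my actual code: vc4 would hold the camera position in the same space as the vertex in va0, and va3 is the per-vertex normal.

"sub vt0, vc4, va0    \n" + // vector from this vertex to the camera
"nrm vt0.xyz, vt0    \n" + // normalise it to get the view direction
"dp3 vt1.x, va3, vt0 \n" + // cosine of the angle between the normal and the view direction
"sat vt1.x, vt1.x    \n" + // clamp to 0..1 so it can be used directly as a fade factor
"mov v4, vt1.xxxx    \n" + // pass the fade factor (0 where they're perpendicular) to the fragment shader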

Thanks.

EDIT

Here's the relevant AS3 code followed by the AGAL:

// invMatrix is a cloned and inverted copy of the Planet object's transform.matrix3D...
// .deltaTransformVector applies only the matrix3D's rotation/scale (no translation) to a Vector3D...
var tempVector:Vector3D = invMatrix.deltaTransformVector(transform.matrix3D.position);
tempVector.normalize();
context3D.setProgramConstantsFromVector(Context3DProgramType.VERTEX, 5, Vector.<Number>([-tempVector.x, -tempVector.y, -tempVector.z, 0]));
context3D.setProgramConstantsFromMatrix(Context3DProgramType.VERTEX, 7, rotMatrix, true);
... other constants set here ...
context3D.setVertexBufferAt(3, mesh.normalsBuffer, 0, Context3DVertexBufferFormat.FLOAT_3);

In the vertex shader:

"m44 v3, va3, vc7 \n" + // passed rotMatrix is used to rotate the face normals
"dp3 v4, vc5, va3 \n" + // put the dot product of the normalised, rotated camera position and the rotated normal in v4

Then, in the fragment shader:

"mul ft0, ft0, v4 \n" + // multiply the sampled texture in ft0 by the passed dot product in v4

But this is all making for some weird rendering behaviour, where half of the atmosphere appears drawn, depending on your position relative to the planet. Baffling. Any help well appreciated.


2 Answers


Do you need to find a vector from a vertex to some point in space? If so, you just need to work in one frame, say the world frame. For that you need to know the desired position in world space, and you need to convert your vertices to world space (just multiply them by the model matrix).
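For example, something like this (the register numbers and the names modelMatrix and targetPoint are only placeholders for illustration):

// AS3 side: upload the model (world) matrix and the target point - here, the camera position
context3D.setProgramConstantsFromMatrix(Context3DProgramType.VERTEX, 0, modelMatrix, true);
context3D.setProgramConstantsFromVector(Context3DProgramType.VERTEX, 4, Vector.<Number>([targetPoint.x, targetPoint.y, targetPoint.z, 1]));

Then in the vertex shader:

"m44 vt0, va0, vc0 \n" + // vertex position into world space via the model matrix in vc0-vc3
"sub vt1, vc4, vt0 \n" + // vector from the world-space vertex to the point
"nrm vt1.xyz, vt1  \n" + // normalise it if you only need the direction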


OK - thanks, Volgogradetzzz, for your reply; you definitely reassured me that I was thinking in the right 'direction'. It ended up being a matter of getting many things correct at the same time. I ended up (rough sketch of the result below):
1) passing the relative, UN-rotated camera position to the shader;
2) rotating the vertex AND normal buffers in the vertex shader by the object's rotation matrix (which includes the scale, but NOT the positional translation);
3) finally, the key thing was SUBTRACTING the rotated vertex position from the camera position in the fragment shader.
So - it finally works! (Well THAT was a week of my life I'll never get back.)
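In case it helps anyone else, the shader side of those three steps ended up looking roughly like this. The register numbers and varying choices are illustrative rather than my exact code, and I upload the camera position as a fragment constant here so the subtraction can happen per fragment:

// 1) relative, UN-rotated camera position: the player sits at 0,0,0, so the camera
//    relative to the planet is just the planet's position negated
var planetPos:Vector3D = transform.matrix3D.position;
context3D.setProgramConstantsFromVector(Context3DProgramType.FRAGMENT, 0, Vector.<Number>([-planetPos.x, -planetPos.y, -planetPos.z, 1]));
// rotation/scale-only matrix (no translation) for the vertex shader
context3D.setProgramConstantsFromMatrix(Context3DProgramType.VERTEX, 7, rotMatrix, true);

In the vertex shader:

"m44 v3, va3, vc7 \n" + // 2) rotate the normal and pass it to the fragment shader
"m44 v5, va0, vc7 \n" + // 2) rotate (but don't translate) the vertex position and pass it along too

And in the fragment shader:

"sub ft1, fc0, v5       \n" + // 3) camera position minus the rotated vertex position
"nrm ft1.xyz, ft1       \n" + // direction from this surface point to the camera
"nrm ft2.xyz, v3        \n" + // re-normalise the interpolated normal
"dp3 ft3.x, ft2, ft1    \n" + // 0 where the normal is perpendicular to the view direction
"sat ft3.x, ft3.x       \n" + // clamp to 0..1
"mul ft0, ft0, ft3.xxxx \n" + // fade the sampled atmosphere colour by that factor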