I'm programming a (hopefully) planetary shader for a Unity project. I'm relatively new to shaders; as I understand it, the shader code here is Cg embedded in Unity's ShaderLab syntax.
My question is: can this be done in a shader? I have the formulas to map coordinates onto a sphere, but all I get is a tangled mess. I used the same formulas in an XNA project to map the planes of a unit cube onto a unit sphere, and it worked fine. What am I doing wrong?
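For reference, these are the cube-to-sphere formulas I'm using (the usual mapping, which assumes the input point lies inside the unit cube, i.e. each coordinate is in [-1, 1]):

$$
\begin{aligned}
x' &= x\sqrt{1 - \tfrac{y^2}{2} - \tfrac{z^2}{2} + \tfrac{y^2 z^2}{3}}\\
y' &= y\sqrt{1 - \tfrac{z^2}{2} - \tfrac{x^2}{2} + \tfrac{z^2 x^2}{3}}\\
z' &= z\sqrt{1 - \tfrac{x^2}{2} - \tfrac{y^2}{2} + \tfrac{x^2 y^2}{3}}
\end{aligned}
$$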
Oh, and I'm testing this on a stock plane prefab in Unity placed at (0, 1, 0), scale (1, 1, 1).
Things to note:
- appdata_base : a vertex format that consists of position, normal, and one texture coordinate (roughly sketched below, after this list)
- _Object2World : current model matrix from Unity
- UNITY_MATRIX_VP : current view * projection matrix from Unity
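For completeness, this is roughly what appdata_base looks like in UnityCG.cginc (a sketch from memory, so the exact declaration in your Unity version may differ slightly):

struct appdata_base {
    float4 vertex : POSITION;    // object-space vertex position
    float3 normal : NORMAL;      // object-space normal
    float4 texcoord : TEXCOORD0; // first UV set
};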
Here is my vertex shader:
struct v2f {
    float4 pos : SV_POSITION;
    float3 col : COLOR0;
};

v2f vert(appdata_base v) {
    v2f o;
    o.pos = v.vertex;
    o.pos = mul(_Object2World, o.pos); // _Object2World is the current model matrix from Unity
    o.pos.x *= sqrt(1 - o.pos.y * o.pos.y / 2 - o.pos.z * o.pos.z / 2 + o.pos.y * o.pos.y * o.pos.z * o.pos.z / 3);
    o.pos.y *= sqrt(1 - o.pos.z * o.pos.z / 2 - o.pos.x * o.pos.x / 2 + o.pos.z * o.pos.z * o.pos.x * o.pos.x / 3);
    o.pos.z *= sqrt(1 - o.pos.x * o.pos.x / 2 - o.pos.y * o.pos.y / 2 + o.pos.x * o.pos.x * o.pos.y * o.pos.y / 3);
    o.pos = mul(UNITY_MATRIX_VP, o.pos);
    o.col = v.normal * 0.5 + 0.5;
    return o;
}