I've been struggling with this for a while, partly because I don't know whether I'm even on the right track.
The task
I'm developing a three.js web application that loads JS models and then assigns them materials that I define in three.js. The object at hand is a house. The ultimate goal is to simulate natural lighting (including global illumination and ambient occlusion) on my mesh as closely as possible, in a handful of different combinations. If the scene were completely static, I wouldn't have a problem, but I want to be able to toggle the visibility of various objects on the fly. Needless to say, this creates the problem of sticky shadows (AO) when objects are moved apart from one another: I have pre-rendered and baked the lighting and ambient occlusion into my textures, which then get assigned to the separated mesh content in three.js, so I don't have any lights in my three.js scene to make it easy on the client.
The challenge
I have pre-rendered the lighting for my objects in its various combinations, so there are alternate texture maps available for that. I'm trying to find a way to update the texture for localized faces at runtime so that when I remove the object casting the shadow (AO), I can tell the surface underneath to show the alternate, shadow-free texture instead.
My current approach
After reading through countless online resources and Stack Overflow questions and answers, I've determined that:
to update textures on the fly, I need to use a THREE.MeshFaceMaterial, feed it multiple alternate materials, and refer to their indices later (Change material or color in mesh face);
I can't tell one specific face in my mesh to use the material at index n at runtime; instead I have to change the texture of one of the materials, and then all faces that use that index will show the new texture (which is not very flexible for my problem).
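To make the setup I'm attempting concrete, here is a sketch of the per-face bookkeeping. All names are mine, and the geometry and materials here are plain duck-typed stand-ins shaped like the old THREE.Geometry / THREE.MeshFaceMaterial API, so the logic runs anywhere; in real three.js the flag tells the renderer to rebuild its material groups on the next frame.

```javascript
// Two alternate "materials": one with the baked-AO texture, one shadow-free.
// (In real code these would be THREE.MeshBasicMaterial instances with maps.)
var materials = [
  { name: 'baked_ao', map: 'house_ao.jpg' },   // hypothetical file names
  { name: 'no_ao',    map: 'house_clean.jpg' }
];

// Faces carry a materialIndex into that array (old THREE.Geometry shape).
var geometry = {
  faces: [
    { materialIndex: 0 }, // floor face under the object I will remove
    { materialIndex: 0 }
  ],
  groupsNeedUpdate: false
};

// Point one face at the alternate material and flag the geometry dirty.
function useAlternateTexture(geo, faceIndex, materialIndex) {
  geo.faces[faceIndex].materialIndex = materialIndex;
  geo.groupsNeedUpdate = true; // three.js rebuilds groups on next render
}

useAlternateTexture(geometry, 0, 1);
console.log(geometry.faces[0].materialIndex); // 1
```

This is exactly the flexibility I'm unsure three.js actually gives me at runtime, hence question 1 below.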
I'm trying to implement the example at (WebGL / Three.js Different materials on one complex object (grid)).
1) I can't even seem to get this to work
I have been able to change the color of a face at runtime, but not the texture. When I try to use a THREE.MeshFaceMaterial and assign it to a geometry, the mesh just shows up in a random color, but when I assign one of the materials directly, it shows up fine. See here: (http://tinyurl.com/ndh99j4). Is there a debugging tool that can quickly point me towards the issue? When I inspect the DOM in Firebug, the structure of avalon.materials.planes_m looks correct for a THREE.MeshFaceMaterial, and I don't get any errors.
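One sanity check I can run in the console (a guess on my part, not a confirmed diagnosis): a mesh with a MeshFaceMaterial tends to render with an arbitrary color when some face's materialIndex points outside the materials array. The helper below is duck-typed against the old THREE.Geometry shape so it can be tested standalone:

```javascript
// Return the indices of faces whose materialIndex does not resolve to an
// entry in the materials array (missing, negative, or out of range).
function checkMaterialIndices(geometry, materials) {
  var bad = [];
  geometry.faces.forEach(function (face, i) {
    if (face.materialIndex === undefined ||
        face.materialIndex < 0 ||
        face.materialIndex >= materials.length) {
      bad.push(i);
    }
  });
  return bad; // empty array means every face has a valid material
}
```

Running this against my loaded geometry and my materials array would at least rule out a simple indexing mistake.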
2) Is this even the best approach to accomplishing what I need?
Assuming that I can get this to work, and I am able to update the texture on select faces, I will end up with hard edges where texture A and texture B don't fit together (different lighting scenarios). Will my next step then be to fix that using vertex colors (http://www.chandlerprall.com/2011/06/blending-webgl-textures/) via a custom shader? It would seem that you can blend textures by setting the blend weight at the vertex rather than the face. Do these two solutions go together at all, or is this leading me down the wrong path? I have limited knowledge of game-development best practices. I know that I can simulate ambient occlusion at runtime, but that doesn't cover my need for global illumination and final gathering, so I still have to bake lighting into my textures.
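As I understand the linked approach, the hard seam would be hidden by interpolating a per-vertex weight across each face and letting the fragment shader mix() the two baked textures by that weight. Numerically that mix is just a linear interpolation; a toy sketch in plain JS (texel values are made-up numbers, not from my actual bakes):

```javascript
// Equivalent of GLSL's mix(a, b, t): blend two texel values by weight t,
// where t = 0 shows the baked-AO texture and t = 1 the shadow-free one.
// The GPU interpolates t between vertices, so the seam fades smoothly.
function mix(a, b, t) {
  return a * (1 - t) + b * t;
}

// A texel that is 40 in the AO bake and 200 in the clean bake,
// sampled halfway through the blend region:
mix(40, 200, 0.5); // 120
```

If that mental model is right, the vertex-weight blending and the per-face material swap would be alternatives rather than companions, which is part of what I'm asking below.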
I have also been looking into separating the lighting out into a light map (http://threejs.org/examples/#webgl_materials_lightmap) and into blending textures (http://threejs.org/examples/#webgl_materials_blending), but those really solve a different problem.
How would a game dev pro best approach this challenge?