My overall goal is to create a pixel shader that takes multiple textures as input and renders to multiple targets. Together with an initialise and a finalise shader, repeated runs of this iterate shader will produce my result.
I've created shaders with multiple input textures before, and shaders that render to multiple targets, but I've never combined the two.
What I believe is causing the issue is my incomplete understanding of semantics, and of how to properly bind input and output textures.
I've seen several different ways of declaring input textures and am confused about how they should be set up.
Below is the code for a shared struct that is output by the init and iterate shaders (the finalise shader simply outputs a colour):
struct FRACTAL_OUTPUT
{
    float4 IterationsAndControl : COLOR0;
    float4 Variables1And2 : COLOR1;
    float4 Variables3And4 : COLOR2;
};
Below are the texture declarations for the iterate and finalise shaders (the init shader doesn't use any textures):
Texture2D IterationsAndControl;
sampler IterationsAndControlSampler : register(s4)
{
    Texture = <IterationsAndControl>;
};

Texture2D Variables1And2;
sampler Variables1And2Sampler : register(s5)
{
    Texture = <Variables1And2>;
};

Texture2D Variables3And4;
sampler Variables3And4Sampler : register(s6)
{
    Texture = <Variables3And4>;
};

(Note: in my original code the second and third samplers both referenced `<IterationsAndControl>`, a copy-paste error, corrected above.)
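For context, here is a minimal sketch of how I understand the iterate pixel shader should tie these together: the COLORn semantics on the struct members route each output to the render target bound at that index, while the samplers read the previous pass's results. The iteration step itself is omitted; this is just my assumed structure, not working code:

```hlsl
// Hypothetical iterate pixel shader: each COLORn member of
// FRACTAL_OUTPUT is written to the render target bound at that slot.
FRACTAL_OUTPUT IteratePS(float2 texCoord : TEXCOORD0)
{
    FRACTAL_OUTPUT output;

    // Read the previous pass's state from the input textures.
    float4 state  = tex2D(IterationsAndControlSampler, texCoord);
    float4 vars12 = tex2D(Variables1And2Sampler, texCoord);
    float4 vars34 = tex2D(Variables3And4Sampler, texCoord);

    // ... one fractal iteration step would go here ...

    output.IterationsAndControl = state;  // -> COLOR0 / render target 0
    output.Variables1And2 = vars12;       // -> COLOR1 / render target 1
    output.Variables3And4 = vars34;       // -> COLOR2 / render target 2
    return output;
}
```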
In C#/XNA code, I set the render targets (via GraphicsDevice.SetRenderTargets()), then set the texture parameters (via Effect.Parameters["TextureVariableName"].SetValue()), then draw a quad (via a SpriteBatch).
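To make that concrete, here is a sketch of the loop I'm attempting, with placeholder names (`current`, `next`, `iterateEffect` are mine, not working code). Since a texture can't be read while its render target is being written, I assume two sets of targets have to be alternated between passes:

```csharp
// Hypothetical ping-pong loop: 'current' holds the previous pass's
// output targets, 'next' receives this pass's output (XNA 4.0 style).
RenderTarget2D[] current = targetsA;
RenderTarget2D[] next = targetsB;

for (int i = 0; i < iterationCount; i++)
{
    // Bind this pass's three output targets (COLOR0..COLOR2).
    GraphicsDevice.SetRenderTargets(next[0], next[1], next[2]);

    // Feed the previous pass's results in as input textures.
    iterateEffect.Parameters["IterationsAndControl"].SetValue(current[0]);
    iterateEffect.Parameters["Variables1And2"].SetValue(current[1]);
    iterateEffect.Parameters["Variables3And4"].SetValue(current[2]);

    // Draw a full-screen quad through the iterate effect.
    spriteBatch.Begin(SpriteSortMode.Immediate, BlendState.Opaque,
        SamplerState.PointClamp, null, null, iterateEffect);
    spriteBatch.Draw(current[0], GraphicsDevice.Viewport.Bounds, Color.White);
    spriteBatch.End();

    // Swap the two target sets for the next pass.
    var tmp = current; current = next; next = tmp;
}

// Unbind the targets before the finalise pass samples them.
GraphicsDevice.SetRenderTarget(null);
```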
Any help would be much appreciated, as I can't find any examples of doing something like this.