I'm trying to render to two textures in one pass using the C++ Direct3D 11 SDK. I want one texture to contain the color of each pixel of the rendered image (what I would normally see on screen when rendering a 3D scene), and a second texture to contain each pixel's normal and depth (three floats for the normal and one float for the depth). Right now, the only approach I can think of is to create two render targets and render two passes: the first pass writes the colors and the second writes the normals and depth, each into its own render target. This seems wasteful, though, because the first pass already has every pixel's color, normal, and depth available. So is there a way to output to two textures from the pixel shader in a single pass?
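To illustrate, this is roughly the kind of single-pass output I'm hoping is possible (an HLSL sketch only; `VSOutput`, `ComputeLighting`, `NormalW`, and `DepthV` are placeholders for my own vertex output and lighting code, and I don't know whether a second `SV_Target` like this is actually supported):

```hlsl
// Sketch: can one pixel shader fill two render targets at once?
struct PSOutput
{
    float4 Color       : SV_Target0; // the normal scene color
    float4 NormalDepth : SV_Target1; // xyz = world-space normal, w = depth
};

PSOutput PS(VSOutput input)
{
    PSOutput output;
    output.Color       = ComputeLighting(input);               // placeholder lighting
    output.NormalDepth = float4(input.NormalW, input.DepthV);  // placeholder values
    return output;
}
```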
Any help would be appreciated.
P.S. I'm thinking of something along the lines of a RWTexture2D or RWStructuredBuffer bound in the pixel shader. A little background: I will need the two images for further processing in a compute shader. That raises a side question about synchronization: since the pixel shader (unlike a compute shader dispatch) writes pixels as primitives are rasterized, how do I know when the pixel shader has finished so that I can tell the compute shader to start the image post-processing?
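For concreteness, here is the kind of UAV write I'm imagining, again only a sketch (the register slot, `VSOutput`, `ComputeLighting`, `NormalW`, and `DepthV` are all guesses/placeholders, and I'm not sure this is even a sensible way to do it):

```hlsl
// Sketch: write normal+depth through a UAV while returning color normally
RWTexture2D<float4> gNormalDepth : register(u1); // slot is a guess

float4 PS(VSOutput input, float4 screenPos : SV_Position) : SV_Target
{
    gNormalDepth[uint2(screenPos.xy)] = float4(input.NormalW, input.DepthV);
    return ComputeLighting(input); // placeholder lighting
}
```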