I'm rendering the same geometry, using the same model/view/projection matrices, in both DirectX 11 and DirectX 9. The vertex shader uses
Output.oPosition = mul(float4(Position, 1.0), mul(mul(mHookModel, mHookView), mHookProjection));
in both APIs. Then I'm reading back the depth stencil/z-buffer values in each API, and I'm getting very different values between the two. In DX9, for parts of the scene, I'm getting values between 0.17 and 0.99. For that same range of points, I'm getting between 0.08 and 0.10 in DX11. Same geometry, same matrices, very similar shader. Does anyone know what values from the vertex shader (interpolated) end up being stored in the depth stencil/z-buffer, and what could cause a difference like this?