
I'm working on a compute shader that needs to output several textures as UAVs. Some of them have 8-bit components and some have 16 bits per channel. Consider this line:

RWTexture2D<float4> _watNormTex;

When I bind an R8G8B8A8_UNORM texture to it, the output is black, but when I bind an R32G32B32A32_FLOAT texture, the correct values are written. So it seems the texture format is the problem.
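
A minimal D3D11 sketch of that setup (illustrative only: width, height, device, and context are placeholders, and my real framework code differs):

#include <d3d11.h>

// Create an 8-bit-per-channel texture that a compute shader can write to.
D3D11_TEXTURE2D_DESC desc = {};
desc.Width = width;
desc.Height = height;
desc.MipLevels = 1;
desc.ArraySize = 1;
desc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;   // the format that comes out black
desc.SampleDesc.Count = 1;
desc.Usage = D3D11_USAGE_DEFAULT;
desc.BindFlags = D3D11_BIND_UNORDERED_ACCESS | D3D11_BIND_SHADER_RESOURCE;

ID3D11Texture2D* tex = nullptr;
device->CreateTexture2D(&desc, nullptr, &tex);

// A null view description makes the UAV inherit the texture's format.
ID3D11UnorderedAccessView* uav = nullptr;
device->CreateUnorderedAccessView(tex, nullptr, &uav);

// Bind to UAV slot 0 (u0) for the compute stage.
context->CSSetUnorderedAccessViews(0, 1, &uav, nullptr);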

What can I do to output to 8-bit or 16-bit per channel textures?

PS: My test device has a GeForce 710 GT GPU.

Do you have the debug layer enabled? It should give you some information if something is wrong. Also, without any posted code it is going to be difficult to track down the error. (As a side note, I tried your case and it worked with both formats, so I would think something is wrong in your pipeline setup.) – mrvux
@catflier You are absolutely right. It was a framework setup problem. Thanks! – morteza khosravi
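
For context, the debug layer mentioned in the comments is enabled with the D3D11_CREATE_DEVICE_DEBUG flag at device creation (a minimal sketch, assuming the hardware driver and default feature levels):

#include <d3d11.h>

// Create the device with the debug layer; binding and format problems
// are then reported through the debug output at runtime.
ID3D11Device* device = nullptr;
ID3D11DeviceContext* context = nullptr;
D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr,
                  D3D11_CREATE_DEVICE_DEBUG,
                  nullptr, 0, D3D11_SDK_VERSION,
                  &device, nullptr, &context);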

1 Answer


It was a setup problem, as @catflier suggested. The formats themselves work as expected; a compute shader can store to an R8G8B8A8_UNORM UAV through RWTexture2D<float4>.
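
If in doubt, you can verify at runtime that a format supports typed UAV stores (a minimal sketch; device is assumed to be an existing ID3D11Device*):

#include <d3d11.h>

// Query typed-UAV-store support for the 8-bit format.
D3D11_FEATURE_DATA_FORMAT_SUPPORT2 support = {};
support.InFormat = DXGI_FORMAT_R8G8B8A8_UNORM;
device->CheckFeatureSupport(D3D11_FEATURE_FORMAT_SUPPORT2,
                            &support, sizeof(support));
bool canStore = (support.OutFormatSupport2 &
                 D3D11_FORMAT_SUPPORT2_UAV_TYPED_STORE) != 0;

R8G8B8A8_UNORM is among the formats D3D11 requires to support typed UAV stores on feature level 11_0 hardware, so this check should pass on any D3D11 GPU, including the card mentioned in the question.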