I have written a little DirectX render engine to render and texture simple polygons. Texturing worked very well with CreateDDSTextureFromFile (.dds files).
Now I am trying to set up a texture and a corresponding shader resource view from a memory buffer. Creating the Texture2D works, but CreateShaderResourceView ALWAYS appears to fail. I also tried calling it with the second parameter set to NULL; that fails too. I have been trying to figure out what is wrong here for a whole week.
I hope somebody can help me here.
[Code]:
D3D11_TEXTURE2D_DESC tdesc;
ZeroMemory(&tdesc, sizeof(tdesc));
D3D11_SUBRESOURCE_DATA srInitData;
ZeroMemory(&srInitData, sizeof(srInitData));
ID3D11Texture2D* tex = 0;
D3D11_SHADER_RESOURCE_VIEW_DESC srDesc;
ZeroMemory(&srDesc, sizeof(srDesc));
// ------------------------- [CHECK MATE ] ----------------------------
int w = 512;
int h = 512;
int bpp = 4;
int *buf = new int[w*h];
// fill the image with a checkerboard pattern (32-pixel squares)
for (int i = 0; i < h; i++)
    for (int j = 0; j < w; j++)
    {
        if ((i & 32) == (j & 32))
            buf[i*w + j] = 0x00000000;
        else
            buf[i*w + j] = 0xffffffff;
    }
// setting up D3D11_SUBRESOURCE_DATA
srInitData.pSysMem = (void *)buf;
srInitData.SysMemPitch = w*bpp;
srInitData.SysMemSlicePitch = w*h*bpp; // Not needed since this is a 2d texture
// setting up D3D11_TEXTURE2D_DESC
tdesc.Width = w;
tdesc.Height = h;
tdesc.MipLevels = 1;
tdesc.ArraySize = 1;
tdesc.SampleDesc.Count = 1;
tdesc.SampleDesc.Quality = 0;
tdesc.Usage = D3D11_USAGE_DEFAULT;
tdesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
tdesc.BindFlags = D3D11_BIND_SHADER_RESOURCE;
tdesc.CPUAccessFlags = 0;
tdesc.MiscFlags = 0;
// checking inputs: passing NULL as the last parameter makes CreateTexture2D
// validate the description only; S_FALSE here means the inputs are valid
if (device->CreateTexture2D(&tdesc, &srInitData, NULL) == S_FALSE)
    std::cout << "Inputs correct" << std::endl;
else
    std::cout << "Wrong inputs" << std::endl;
// create the texture
if (FAILED(device->CreateTexture2D(&tdesc, &srInitData, &tex)))
{
    std::cout << "Failed" << std::endl;
    return 0;
}
else
    std::cout << "Success" << std::endl;
// setup the Shader Resource Desc.
srDesc.Format = tdesc.Format;
srDesc.ViewDimension = D3D11_SRV_DIMENSION_TEXTURE2D;
srDesc.Texture2D.MostDetailedMip = 0;
srDesc.Texture2D.MipLevels = 1;
// ------------------ [ Here it always Fails ...]
if (FAILED(device->CreateShaderResourceView(tex, &srDesc, &texture_)));
{
    std::cerr << "Can't create Shader Resource View" << std::endl;
    return false;
}
delete[] buf;
[Side question]: I noticed that when I change D3D11_TEXTURE2D_DESC.Usage to D3D11_USAGE_DYNAMIC, CreateTexture2D fails as well. Why is that?
[Debug]:
D3D11 INFO: Create ID3D11Buffer: Name="unnamed", Addr=0x00C31374, ExtRef=1, IntRef=0 [ STATE_CREATION INFO #2097228: CREATE_BUFFER]
D3D11 INFO: Create ID3D11Buffer: Name="unnamed", Addr=0x00C319A4, ExtRef=1, IntRef=0 [ STATE_CREATION INFO #2097228: CREATE_BUFFER]
The thread 'Win32 Thread' (0x9c8) has exited with code 0 (0x0).
D3D11 INFO: Create ID3D11Texture2D: Name="unnamed", Addr=0x00C41984, ExtRef=1, IntRef=0 [ STATE_CREATION INFO #2097234: CREATE_TEXTURE2D]
D3D11 INFO: Create ID3D11ShaderResourceView: Name="unnamed", Addr=0x00C41F6C, ExtRef=1, IntRef=0 [ STATE_CREATION INFO #2097240: CREATE_SHADERRESOURCEVIEW]
The log doesn't say much to me; I can't see any hint of an error here. If anything, it even shows the ID3D11ShaderResourceView being created.
[Note]: Actually I want to use an RGBA8 texture from camera input (stored in a buffer), but I could not get that to work. So I tried this example to find out why CreateShaderResourceView fails even with a plain int* buffer.
I am very thankful for any help! thanks!
[Solved]: With the code above I always reached the output "Can't create Shader Resource View", so I assumed CreateShaderResourceView had failed. When I checked the HRESULT, though, its value was S_OK, which seemed strange. The real culprit is the stray semicolon right after the if condition: `if (FAILED(...)));` terminates the if statement with an empty body, so the braced block below it executes unconditionally, no matter what the call returned. Removing the semicolon (and using SUCCEEDED for readability) fixed it:
if (SUCCEEDED(device->CreateShaderResourceView(tex, &srDesc, &texture_)))
{
    std::cout << "Successfully created ShaderResourceView" << std::endl;
    delete[] buf;
    return true;
}
return false;
And now it works. The camera stream works as well; many thanks to Adam Miles for the support! ;)