5 votes

I'm just wondering how I can use Texture2DArray in HLSL.

I am trying to implement a model loader where models have different numbers of textures. At the moment my HLSL uses an array of two Texture2D resources (a base texture and a normal map), but since my models have varying numbers of textures, I'd like to switch to Texture2DArray. I have no clue where to start, and I've had no luck finding examples on the internet :(

I load my textures as 'ID3D11ShaderResourceView*' objects, so should I declare an ID3D11ShaderResourceView** variable, build an array of pointers to the textures, and pass that to the shader, or is there a better way?

Any help?


1 Answer

9 votes

On the C++ side, you only need to set the ArraySize field of the D3D11_TEXTURE2D_DESC to the number of slices you want when you create the texture.
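A minimal sketch of what that could look like (the width, height, format, 'textureCount' and 'device' here are placeholders, not anything from your code):

#include <d3d11.h>

// Sketch: describe a texture array with 'textureCount' slices.
D3D11_TEXTURE2D_DESC desc = {};
desc.Width            = 512;                        // placeholder - match your image data
desc.Height           = 512;
desc.MipLevels        = 1;
desc.ArraySize        = textureCount;               // number of slices in the array
desc.Format           = DXGI_FORMAT_R8G8B8A8_UNORM;
desc.SampleDesc.Count = 1;
desc.Usage            = D3D11_USAGE_DEFAULT;
desc.BindFlags        = D3D11_BIND_SHADER_RESOURCE;

ID3D11Texture2D* textureArray = nullptr;
device->CreateTexture2D(&desc, nullptr, &textureArray);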

A single shader resource view covers the entire array; you don't need one view per texture. If you're not providing your own D3D11_SHADER_RESOURCE_VIEW_DESC on creation (and are instead passing nullptr), you're already set.

If you are passing your own D3D11_SHADER_RESOURCE_VIEW_DESC, set its ViewDimension field to D3D11_SRV_DIMENSION_TEXTURE2DARRAY and fill out all four members of its Texture2DArray field (MostDetailedMip, MipLevels, FirstArraySlice, ArraySize).
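In that case the view description might look something like this (a sketch reusing the 'desc', 'textureArray' and 'device' assumed in the snippet above):

// Sketch: an explicit SRV description that views the whole array.
D3D11_SHADER_RESOURCE_VIEW_DESC srvDesc = {};
srvDesc.Format                         = desc.Format;
srvDesc.ViewDimension                  = D3D11_SRV_DIMENSION_TEXTURE2DARRAY;
srvDesc.Texture2DArray.MostDetailedMip = 0;
srvDesc.Texture2DArray.MipLevels       = desc.MipLevels;
srvDesc.Texture2DArray.FirstArraySlice = 0;
srvDesc.Texture2DArray.ArraySize       = desc.ArraySize;   // view every slice

ID3D11ShaderResourceView* srv = nullptr;
device->CreateShaderResourceView(textureArray, &srvDesc, &srv);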

In HLSL you just need to specify:

Texture2DArray myTexture;

instead of

Texture2D myTexture;

and when sampling, the texture coordinates are three-dimensional (a float3): x and y are the usual UVs, and z is the index of the array slice you want to read from, for example myTexture.Sample(mySampler, float3(u, v, sliceIndex)).

If you're initialising the texture at creation time using D3D11_SUBRESOURCE_DATA structures, pass a pointer to an array of them with one entry per subresource; with a single mip level, that's simply one entry per slice of the array.
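A sketch of what that could look like for a single-mip array, where 'pixels[i]' and 'rowPitchInBytes' stand in for your own loaded image data, and 'desc', 'textureCount', 'device' and 'textureArray' are the same placeholders as in the first snippet:

#include <vector>

// One D3D11_SUBRESOURCE_DATA per slice (MipLevels == 1 here).
std::vector<D3D11_SUBRESOURCE_DATA> initData(textureCount);
for (UINT i = 0; i < textureCount; ++i)
{
    initData[i].pSysMem          = pixels[i];        // pixel data for slice i
    initData[i].SysMemPitch      = rowPitchInBytes;  // bytes per row of one slice
    initData[i].SysMemSlicePitch = 0;                // not used for 2D textures
}

// Pass the array instead of nullptr when creating the texture.
device->CreateTexture2D(&desc, initData.data(), &textureArray);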