
I have a shader that no longer draws correctly. It was working in XNA, but it had to be rewritten for DX11 and SharpDX and now uses shader model 5. There are no exceptions, the shader effect compiles fine, and there are no debug-layer messages from the device except for an unrelated(?) default sampler-state complaint. It is a normal fullscreen quad doing the simplest possible texture blit (using a random noise texture for testing).

Vertex Problem Example

I am feeding the vertex shader a VertexPositionTexture, a SharpDX struct which contains:

[VertexElement("SV_Position")]
public Vector3 Position;
[VertexElement("TEXCOORD0")]
public Vector2 TextureCoordinate;

I suspect a mismatch between signatures: VertexPositionTexture has a Vector3 position component declared for "SV_Position".

D3D11 defines "SV_Position" as a float4; it is a system-value semantic with a concrete native type. I fear the shader is reading an extra float past the end of the Vector3, which would also explain why my UVs are broken on several vertices.
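For reference, here is a minimal sketch of the D3D11 input elements I would expect to describe this vertex; the formats and offsets are my own reconstruction, not code taken from SharpDX:

// Sketch only: my assumption of the input-element layout this vertex needs.
// Note that a float3 position against a float4 SV_Position input is legal;
// the input assembler pads the missing w component with 1.0.
var elements = new[]
{
    new SharpDX.Direct3D11.InputElement(
        "SV_Position", 0, SharpDX.DXGI.Format.R32G32B32_Float, 0, 0),
    new SharpDX.Direct3D11.InputElement(
        "TEXCOORD", 0, SharpDX.DXGI.Format.R32G32_Float, 12, 0)
};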

This is my vertex shader:

struct V2P
{
    float4 vPosition    : SV_Position;
    float2 vTexcoord    : TEXCOORD0;
};

V2P VSCopy(float4 InPos : SV_Position,
           float2 InTex : TEXCOORD0)
{
    V2P Out = (V2P)0;

    // pass the position straight through; the quad is already in clip space
    Out.vPosition = InPos;
    Out.vTexcoord = InTex;

    return Out;
}

I have tried changing the input of VSCopy() from float4 to float3 with no change in the result. Short of recompiling SharpDX's VertexPositionTexture, I don't see how to fix this, yet SharpDX fully supports DirectX 11, so I must be doing something incorrectly.

I am defining my vertices and setting them as follows:

//using SharpDX.Toolkit.Graphics
[...] // In the initialization
verts = new VertexPositionTexture[]
{
    new VertexPositionTexture(
        new Vector3(1.0f, -1.0f, 0),
        new Vector2(1, 1)),
    new VertexPositionTexture(
        new Vector3(-1.0f, -1.0f, 0),
        new Vector2(0, 1)),
    new VertexPositionTexture(
        new Vector3(-1.0f, 1.0f, 0),
        new Vector2(0, 0)),
    new VertexPositionTexture(
        new Vector3(1.0f, 1.0f, 0),
        new Vector2(1, 0))
};

// vb is a Buffer<VertexPositionTexture>
short[] indices = new short[] { 0, 1, 2, 2, 3, 0 };
vb = SharpDX.Toolkit.Graphics.Buffer.Vertex.New(Game.Instance.GraphicsDevice, verts, SharpDX.Direct3D11.ResourceUsage.Dynamic);
ib = SharpDX.Toolkit.Graphics.Buffer.Index.New(Game.Instance.GraphicsDevice, indices, SharpDX.Direct3D11.ResourceUsage.Dynamic);

[...] // In the Render method, called every frame to draw the quad
Game.Instance.GraphicsDevice.SetVertexBuffer<VertexPositionTexture>(vb, 0);
Game.Instance.GraphicsDevice.SetIndexBuffer(ib, false);
Game.Instance.GraphicsDevice.DrawIndexed(PrimitiveType.TriangleList, 6);
How have you created your vertex buffer and input layout? – megadan
I've updated the question with exactly how the vertex and index buffers are being set. – selkathguy
Are you creating a VertexInputLayout and calling SetVertexInputLayout as well? – megadan
I had tried that, but I discovered I was doing it incorrectly: I was trying to use the D3D11 device object directly without realizing the Toolkit has a GraphicsDevice.SetVertexInputLayout() method for doing precisely this. This has solved my issue. I'll post an answer. – selkathguy

1 Answer


SharpDX.Toolkit provides a method for specifying the input layout for the vertex shader. I had missed this method and was incorrectly trying to set the input layout via the property:

SharpDX.Direct3D11.ImmediateContext.InputAssembler.InputLayout

You must set the input layout via the SharpDX.Toolkit.Graphics.GraphicsDevice.SetVertexInputLayout() method instead.
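For completeness, a minimal sketch of the fix, assuming the Toolkit's VertexInputLayout.FromBuffer helper, which derives the layout from the buffer's vertex type:

// Build the layout once from the typed vertex buffer, then bind it before
// drawing instead of touching InputAssembler.InputLayout directly.
var layout = SharpDX.Toolkit.Graphics.VertexInputLayout.FromBuffer(0, vb);
Game.Instance.GraphicsDevice.SetVertexInputLayout(layout);

Once the layout is bound, the pipeline knows how to unpack each VertexPositionTexture, and the broken positions and UVs disappear.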