
I have an MTLTexture in RGBA8Unorm format, and a screen texture (in MTKView) in BGRA8Unorm format (the component order is reversed). In the Metal shader, when I sample from that texture using sample(), I get a float4. When I write to a texture in the Metal shader, I also write a float4. It seems that, inside the shader code, float4 always represents the components in RGBA order regardless of the original format the texture is in ([0] for red, [1] for green, [2] for blue, and [3] for alpha). Is my conclusion correct that the meaning of the components of the sampled/written float4 is always the same inside the shader, regardless of what the storage format of the texture is?

UPDATE: I use the following code to write to a texture in RGBA8Unorm format:

kernel void
computeColourMap(constant Uniforms &uniforms [[buffer(0)]],
                 constant array<float, 120> &amps [[buffer(1)]],
                 constant array<float, 120> &red [[buffer(2)]],
                 constant array<float, 120> &green [[buffer(3)]],
                 constant array<float, 120> &blue [[buffer(4)]],
                 texture2d<float, access::write> output [[texture(0)]],
                 uint2 id [[thread_position_in_grid]])
{
    // Skip threads that fall outside the texture bounds.
    if (id.x >= output.get_width() || id.y >= output.get_height()) {
        return;
    }

    uint i = id.x % 120;

    // Build the colour in RGBA order: x = red, y = green, z = blue, w = alpha.
    float4 col(0, 0, 0, 1);
    col.x += amps[i] * red[i];
    col.y += amps[i] * green[i];
    col.z += amps[i] * blue[i];

    output.write(col, id);
}

I then use the following shaders for the rendering stage:

vertex VertexOut
vertexShader(const device VertexIn *vertexArray [[buffer(0)]],
             unsigned int vid [[vertex_id]])
{
    VertexIn vertex_in = vertexArray[vid];

    VertexOut vertex_out;
    vertex_out.position = vertex_in.position;
    vertex_out.textureCoord = vertex_in.textureCoord;

    return vertex_out;
}

// Nearest-filtering sampler assumed for colorTexture.sample() below.
constexpr sampler nearestSampler(coord::normalized, filter::nearest);

fragment float4
fragmentShader(VertexOut interpolated [[stage_in]],
               texture2d<float> colorTexture [[texture(0)]])
{
    const float4 colorSample = colorTexture.sample(nearestSampler,
                                                   interpolated.textureCoord);

    return colorSample;
}

where the colorTexture passed into the fragment shader is the one I generated in RGBA8Unorm format, and in Swift I have:

let renderPipelineDescriptor = MTLRenderPipelineDescriptor()
renderPipelineDescriptor.vertexFunction = library.makeFunction(name: "vertexShader")!
renderPipelineDescriptor.fragmentFunction = library.makeFunction(name: "fragmentShader")!
renderPipelineDescriptor.colorAttachments[0].pixelFormat = colorPixelFormat

The colorPixelFormat of the MTKView is BGRA8Unorm (component order reversed relative to my texture's RGBA8Unorm), so it does not match my texture's format, but the colours on the screen come out correct.

UPDATE 2: A further hint that, within a shader, a colour represented by float4 is always in RGBA order: the float4 type has accessors named v.r, v.g, v.b, v.rgb, etc.
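
For example, here is a minimal sketch (the helper function is purely illustrative, not part of my project) of the equivalent ways of addressing the components of a float4 colour:

// Illustrative helper: float4 components are always addressed in RGBA order,
// whatever storage format the value originally came from.
float4 passThrough(float4 v)
{
    float r = v.r;             // same as v.x and v[0]: red
    float g = v.g;             // same as v.y and v[1]: green
    float b = v.b;             // same as v.z and v[2]: blue
    float a = v.a;             // same as v.w and v[3]: alpha
    return float4(r, g, b, a); // identical to v, and to v.rgba
}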

Comments:

can you post the shader code? - Nikos M.
@NikosM. - done - akuz

1 Answer


The vector always has 4 components, but the type of those components is not necessarily float. When you declare a texture, you specify the component type as a template argument (texture2d<float, ...> in your code).

For example, from Metal Shading Language Specification v2.1, section 5.10.1:

The following member functions can be used to sample from a 1D texture.

Tv sample(sampler s, float coord) const

Tv is a 4-component vector type based on the templated type used to declare the texture type. If T is float, Tv is float4. If T is half, Tv is half4. If T is int, Tv is int4. If T is uint, Tv is uint4. If T is short, Tv is short4 and if T is ushort, Tv is ushort4.

The same Tv type is used in the declaration of write(). The functions for other texture types are documented in a similar manner.
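
For illustration, here is a minimal sketch (the function and texture names are made up, and the usual metal_stdlib header of a .metal file is assumed) of how the templated component type determines what sample() returns:

// Illustrative helper: the texture's templated component type T determines
// the 4-component vector type Tv returned by sample().
float4 sampleAll(texture2d<float> floatTex,
                 texture2d<half>  halfTex,
                 texture2d<uint>  uintTex,
                 float2 uv)
{
    constexpr sampler s(coord::normalized, filter::nearest);

    float4 f = floatTex.sample(s, uv);  // T = float -> Tv = float4
    half4  h = halfTex.sample(s, uv);   // T = half  -> Tv = half4
    uint4  u = uintTex.sample(s, uv);   // T = uint  -> Tv = uint4

    // All three are 4-component vectors with the same RGBA component order.
    return f + float4(h) + float4(u);
}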

And yes, component .r always contains the red component (if present), etc., and [0] always corresponds to .r (or .x). The pixel format only describes how the components are laid out in memory; the conversion between that storage layout and the RGBA-ordered vector you see in the shader is done for you when the texture is sampled or written.
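
As a concrete sketch (assuming the texture bound at index 0 was created as MTLPixelFormat.bgra8Unorm on the CPU side; the kernel name is illustrative), the shader still supplies its value in RGBA order, and the reordering into the BGRA storage layout happens when the value is stored:

// Fills a BGRA8Unorm texture with pure red. The float4 passed to write() is
// still ordered (r, g, b, a); the reordering into BGRA storage happens in hardware.
kernel void
fillWithRed(texture2d<float, access::write> bgraTexture [[texture(0)]],
            uint2 id [[thread_position_in_grid]])
{
    if (id.x >= bgraTexture.get_width() || id.y >= bgraTexture.get_height()) {
        return;
    }

    bgraTexture.write(float4(1.0, 0.0, 0.0, 1.0), id);  // red, not blue
}

Reading that texel back with read() or sample() would again give (1, 0, 0, 1), i.e. red in component .r.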