
In the fragment function of a Metal shader file, is there a way to redefine the "bounds" of a texture with respect to what the sampler will consider its normalized coordinates to be?

By default, a sample coordinate of (0,0) refers to the top-left pixel of the texture and (1,1) to the bottom-right pixel. However, I'm re-using textures for drawing, and at any given render pass only a portion of the texture contains the relevant data.

For example, in a texture of width 500 and height 500, I might have only copied data into the region (0, 0, 250, 250). In my fragment function, I'd like the sampler to interpret a normalized coordinate of 1.0 as 250, not 500. Is that possible?

I realize I can just change the sampler to use pixel addressing, but that comes with a few restrictions, as noted in the Metal Shading Language Specification.

Can you just create another texture sharing the same buffer as the full texture, with the same bytesPerRow but a smaller width and height? – rob mayoff

Well, I suppose I can just create new textures, but that's what I was hoping to avoid. These textures are being recycled from a CVMetalTextureCache, which is being fed with a CVPixelBufferRef that is backed by an IOSurface whose width and height could be anything. The IOSurface is being used to shuttle data from an XPC service, and the XPC service is only going to write into the IOSurface within the region it needs. So I was hoping I could easily clamp my sampler to that region of the texture rather than creating a new texture. – kennyc

Is CVMetalTextureCacheCreateTextureFromImage too expensive? You can use the same CVPixelBufferRef to create multiple textures. – rob mayoff

Good point. I hadn't considered making multiple textures from the same CVPixelBuffer, but looking at the API again I don't see a way to specify an origin for the new texture relative to the source pixel buffer. (There's a width and height parameter for setting the dimensions of the texture, but no origin.) I guess I just assumed that any textures created with a CVPixelBuffer backing should have the same dimensions. – kennyc

1 Answer


No, but if you know the region you want to sample from, it's quite easy to do a little math in the shader to fix up your sampling coordinates. This technique is commonly used with texture atlases.

Suppose you have an image that's 500x500 and you want to sample the bottom-right 125x125 region (just to make things more interesting). You could pass this sampling region in as a float4, storing the bounds as (left, top, width, height) in the xyzw components. In this case, the bounds would be (375, 375, 125, 125). Your incoming texture coordinates are "normalized" with respect to this square. The shader simply scales and biases these coordinates into texel coordinates, then normalizes them to the dimensions of the whole texture:

fragment float4 fragment_main(FragmentParams in [[stage_in]],
                              texture2d<float, access::sample> tex2d [[texture(0)]],
                              sampler sampler2d [[sampler(0)]],
                              // ...
                              constant float4 &spriteBounds [[buffer(0)]])
{
    // original coordinates, normalized with respect to subimage
    float2 texCoords = in.texCoords;

    // texture dimensions
    float2 texSize = float2(tex2d.get_width(), tex2d.get_height());

    // adjusted texture coordinates, normalized with respect to full texture
    texCoords = (texCoords * spriteBounds.zw + spriteBounds.xy) / texSize;

    // sample color at modified coordinates
    float4 color = tex2d.sample(sampler2d, texCoords);
    // ...
}