
I am trying to display a texture loaded with MTKTextureLoader. I have a buffer that stores my vertex coordinates (I build two triangles to form a rectangle in which to display my image), and another buffer that stores the texture coordinates of each vertex.
I made a sampler to read data from my texture; the problem is that I get nothing (a black image).

I have included the Swift code just in case my error comes from there, but I think it comes from the Metal code. If you look at my fragment shader, you will see two commented-out lines; they show something that I can't understand:

  • If I give hard-coded coordinates directly to the sample function, it works (it colours the triangles with the colour found at those coordinates).

  • If I output the coordinates I pass to the sampler as colour components instead, it also displays something coherent (the triangles are coloured according to those coordinates).

So the problem doesn't seem to come from the sampler, nor from the coordinates, and that's what I don't understand.

Here is my Swift code:

import Cocoa
import MetalKit
import Metal

class ViewController: NSViewController, MTKViewDelegate {

    var device:MTLDevice!
    var texture:MTLTexture!
    var commandQueue:MTLCommandQueue!
    var vertexBuffer:MTLBuffer!
    var vertexCoordinates:[Float] = [
        -1,  1, 0, 1,
        -1, -1, 0, 1,
         1, -1, 0, 1,

         1, -1, 0, 1,
         1,  1, 0, 1,
        -1,  1, 0, 1,
    ]

    var vertexUVBuffer:MTLBuffer!
    var vertexUVCoordinates:[Float] = [
        0, 1,
        0, 0,
        1, 0,

        1, 0,
        1, 1,
        0, 1
    ]

    var library:MTLLibrary!
    var defaultPipelineState:MTLRenderPipelineState!
    var samplerState:MTLSamplerState!

    @IBOutlet var metalView: MTKView!

    override func viewDidLoad() {
        super.viewDidLoad()

        device = MTLCreateSystemDefaultDevice()
        let textureLoader = MTKTextureLoader(device: device)

        metalView.device = device
        metalView.delegate = self
        metalView.preferredFramesPerSecond = 0
        metalView.sampleCount = 4

        texture = try! textureLoader.newTextureWithContentsOfURL(NSBundle.mainBundle().URLForResource("abeilles", withExtension: "jpg")!, options: [MTKTextureLoaderOptionAllocateMipmaps:NSNumber(bool: true)])

        commandQueue = device.newCommandQueue()
        library = device.newDefaultLibrary()


        vertexBuffer = device.newBufferWithBytes(&vertexCoordinates, length: sizeof(Float)*vertexCoordinates.count, options: [])
        vertexUVBuffer = device.newBufferWithBytes(&vertexUVCoordinates, length: sizeof(Float)*vertexUVCoordinates.count, options: [])

        let renderPipelineDescriptor = MTLRenderPipelineDescriptor()
        renderPipelineDescriptor.vertexFunction = library.newFunctionWithName("passTroughVertex")
        renderPipelineDescriptor.fragmentFunction = library.newFunctionWithName("myFragmentShader")
        renderPipelineDescriptor.sampleCount = metalView.sampleCount
        renderPipelineDescriptor.colorAttachments[0].pixelFormat = metalView.colorPixelFormat
        defaultPipelineState = try! device.newRenderPipelineStateWithDescriptor(renderPipelineDescriptor)
        let samplerDescriptor = MTLSamplerDescriptor()
        samplerDescriptor.minFilter = .Linear
        samplerDescriptor.magFilter = .Linear
        samplerDescriptor.mipFilter = .Linear
        samplerDescriptor.sAddressMode = .ClampToEdge
        samplerDescriptor.rAddressMode = .ClampToEdge
        samplerDescriptor.tAddressMode = .ClampToEdge
        samplerDescriptor.normalizedCoordinates = true

        samplerState = device.newSamplerStateWithDescriptor(samplerDescriptor)
        metalView.draw()

        // Do any additional setup after loading the view.
    }

    func drawInMTKView(view: MTKView) {

        let commandBuffer = commandQueue.commandBuffer()
        let commandEncoder = commandBuffer.renderCommandEncoderWithDescriptor(metalView.currentRenderPassDescriptor!)
        commandEncoder.setRenderPipelineState(defaultPipelineState)

        commandEncoder.setVertexBuffer(vertexBuffer, offset: 0, atIndex: 0)
        commandEncoder.setVertexBuffer(vertexUVBuffer, offset:0, atIndex:1)
        commandEncoder.setFragmentSamplerState(samplerState, atIndex: 0)
        commandEncoder.setFragmentTexture(texture, atIndex: 0)
        commandEncoder.drawPrimitives(MTLPrimitiveType.Triangle, vertexStart: 0, vertexCount: 6, instanceCount: 1)

        commandEncoder.endEncoding()
        commandBuffer.presentDrawable(metalView.currentDrawable!)
        commandBuffer.commit()

    }

    func mtkView(view: MTKView, drawableSizeWillChange size: CGSize) {
        // view.draw()
    }

    override var representedObject: AnyObject? {
        didSet {
        // Update the view, if already loaded.
        }
    }


}  

Here is my Metal code:

#include <metal_stdlib>
using namespace metal;


struct VertexOut {
    float4 position [[position]];
    float2 texCoord;
};


vertex VertexOut passTroughVertex(uint vid [[ vertex_id]],
                                  constant float4 *vertexPosition [[ buffer(0) ]],
                                  constant float2 *vertexUVPos [[ buffer(1)]]) {

    VertexOut vertexOut;
    vertexOut.position = vertexPosition[vid];
    vertexOut.texCoord = vertexUVPos[vid];
    return vertexOut;
}

fragment float4 myFragmentShader(VertexOut inFrag [[stage_in]],
                                 texture2d<float> myTexture [[ texture(0)]],
                                 sampler mySampler [[ sampler(0) ]]) {


    float4 myColor = myTexture.sample(mySampler,inFrag.texCoord);
    // myColor = myTexture.sample(mySampler,float2(1));
    // myColor = float4(inFrag.texCoord.r,inFrag.texCoord.g,0,1);

    return myColor;
}
Your code works on my system. Which version of OS X and Xcode are you using? - warrenm
@warrenm I am on the latest public version of El Capitan (10.11.2) and the latest public version of Xcode (7.2). - Pop Flamingo
When you say that it works, do you mean that it displays the image? - Pop Flamingo
There is something weird: since you told me that it works for you, I tried with another image and it works... sort of. Actually, for this other image, it works, but only when I enlarge the window. When I downsize the window, it fades to black. I would say that it has something to do with mipmaps, because if I set the sampler mipFilter to .Linear, it does this fading, but if I set it to .Nearest, there is a point where, instead of fading, it suddenly disappears. It looks like there is nothing in the other mipmap levels. - Pop Flamingo
Here's what I think is happening. You're allocating space for mipmaps but not actually generating them. Your sampler configuration (mipmap-linear) causes the texture to be sampled at the base level as long as the texture is small relative to the rect on the screen, but if you feed in a larger texture, it starts sampling down the mip stack, picking up all-black pixels, which are then blended together to either darken the source image or cause the output to be entirely black. Use a MTLBlitCommandEncoder to generate a complete set of mipmaps. - warrenm

1 Answer


You're allocating space for mipmaps but not actually generating them. The docs say that when specifying MTKTextureLoaderOptionAllocateMipmaps, "a full set of mipmap levels are allocated for the texture when the texture is loaded, and it is your responsibility to generate the mipmap contents."

Your sampler configuration causes the resulting texture to be sampled at the base mipmap level as long as the texture is small relative to the rect on the screen, but if you feed in a larger texture, it starts sampling the smaller levels of the mipmap stack, picking up all-black pixels, which are then blended together to either darken the image or cause the output to be entirely black.

You should use the -generateMipmapsForTexture: method on a MTLBlitCommandEncoder to generate a complete set of mipmaps once your texture is loaded.
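
As a minimal sketch (using the same pre-Swift 3 API names as the code in the question; the placement in viewDidLoad right after the texture and command queue are created, as well as the local variable names, are only illustrative), the generation step could look like this:

// Generate the missing mipmap levels on the GPU once the base level is loaded.
// Sketch only: assumes `texture` and `commandQueue` are already set up as in the question.
let mipmapCommandBuffer = commandQueue.commandBuffer()
let blitEncoder = mipmapCommandBuffer.blitCommandEncoder()
blitEncoder.generateMipmapsForTexture(texture)
blitEncoder.endEncoding()
mipmapCommandBuffer.commit()
// Optionally wait, so that every mip level is populated before the first draw samples the texture.
mipmapCommandBuffer.waitUntilCompleted()

(Alternatively, if you never actually want mipmapping, you could drop the MTKTextureLoaderOptionAllocateMipmaps option and use a .NotMipmapped mipFilter, but generating the levels as above keeps your existing sampler setup intact.)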