I am currently working on live filters using Metal. After defining my CIImage, I render it to an MTLTexture.
Below is my rendering code. context is a CIContext backed by Metal; targetTexture is an alias for the texture attached to the currentDrawable property of my MTKView instance:
context?.render(drawImage, to: targetTexture, commandBuffer: commandBuffer, bounds: targetRect, colorSpace: colorSpace)
It renders correctly; I can see the image displayed in the Metal view.
The problem is that after rendering the image (and displaying it), I want to extract a CVPixelBuffer and save it to disk using AVAssetWriter.
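For context, this is roughly how I expect the writing side to look. This is only a sketch under my own assumptions: the output URL, dimensions, and H.264 settings are placeholders, and makeWriter is a hypothetical helper name. The AVAssetWriterInputPixelBufferAdaptor part is what would consume the pixel buffer once I can get hold of it:

```swift
import AVFoundation
import CoreVideo

// Sketch of the writer setup (names and settings are my assumptions).
// Returns the writer, its video input, and a pixel-buffer adaptor.
func makeWriter(outputURL: URL, width: Int, height: Int) throws
    -> (AVAssetWriter, AVAssetWriterInput, AVAssetWriterInputPixelBufferAdaptor) {
    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)
    let settings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: width,
        AVVideoHeightKey: height
    ]
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
    input.expectsMediaDataInRealTime = true
    // The adaptor manages a pixel-buffer pool; requesting Metal-compatible
    // BGRA buffers here should let Core Image render into them directly.
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(
        assetWriterInput: input,
        sourcePixelBufferAttributes: [
            kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA,
            kCVPixelBufferMetalCompatibilityKey as String: true
        ])
    writer.add(input)
    return (writer, input, adaptor)
}
```

Per frame (after startWriting() and startSession(atSourceTime:)), I would check input.isReadyForMoreMediaData and call adaptor.append(_:withPresentationTime:) with the pixel buffer.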
An alternative would be to use two rendering steps: one rendering to the texture and another rendering to a CVPixelBuffer. (But it isn't clear how to create such a buffer, or what impact two rendering steps would have on the framerate.)
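To make the two-step alternative concrete, this is what I imagine the second step could look like. A minimal sketch, assuming a BGRA buffer created with the Metal-compatibility attribute so Core Image can render into it on the GPU; makePixelBuffer and renderToPixelBuffer are hypothetical helper names, and context, drawImage, targetRect, and colorSpace are the same objects as in the code above:

```swift
import CoreImage
import CoreVideo

// Hypothetical helper: creates a Metal-compatible BGRA pixel buffer.
func makePixelBuffer(width: Int, height: Int) -> CVPixelBuffer? {
    let attrs: [CFString: Any] = [
        kCVPixelBufferMetalCompatibilityKey: true,
        kCVPixelBufferIOSurfacePropertiesKey: [:] as CFDictionary
    ]
    var buffer: CVPixelBuffer?
    let status = CVPixelBufferCreate(kCFAllocatorDefault,
                                     width,
                                     height,
                                     kCVPixelFormatType_32BGRA,
                                     attrs as CFDictionary,
                                     &buffer)
    return status == kCVReturnSuccess ? buffer : nil
}

// Second render pass: same CIContext and CIImage, but targeting the
// pixel buffer instead of the drawable's texture.
func renderToPixelBuffer(context: CIContext,
                         image: CIImage,
                         bounds: CGRect,
                         colorSpace: CGColorSpace) -> CVPixelBuffer? {
    guard let buffer = makePixelBuffer(width: Int(bounds.width),
                                       height: Int(bounds.height)) else {
        return nil
    }
    context.render(image, to: buffer, bounds: bounds, colorSpace: colorSpace)
    return buffer
}
```

If the AVAssetWriter path is used, the adaptor's pixelBufferPool could presumably supply these buffers instead of calling CVPixelBufferCreate per frame, which might matter for the framerate concern.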
Any help would be appreciated, thanks!