I am a C#, SharpDX, and DirectX newbie, so please excuse my ignorance. I am following up on an old post, Exception of Texture2D.FromMemory() in SharpDX code, which was very helpful.
My goal:
- Build a Texture2D from a SoftwareBitmap.
- Make the texture available to HLSL.
The way I approached it:
- Using IMemoryBufferByteAccess, I retrieve a byte pointer to the frame's pixel data together with its total capacity in bytes (the interop declaration I use is shown after this list). From the previous post, it seems I need a DataRectangle that points at this buffer.
- Create two textures with different descriptors. Texture 1 (_stagingTexture) has no bind flags, CPU read and write access, and staging usage; I create it with the DataRectangle pointing at the byte buffer. Texture 2 (_finalTexture) has the shader-resource bind flag, no CPU access, and default usage; this is the texture that will eventually be made available to the shader. The intention is to use CopyResource to copy from Texture 1 to Texture 2.
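For completeness, here is the interop declaration I use (nested in my own InteropStatics class; the GUID is the documented one for IMemoryBufferByteAccess):

using System.Runtime.InteropServices;

public static class InteropStatics
{
    [ComImport]
    [Guid("5B0D3235-4DBA-4D44-865E-8F1D0E4FD04D")]
    [InterfaceType(ComInterfaceType.InterfaceIsIUnknown)]
    public unsafe interface IMemoryBufferByteAccess
    {
        // Returns a raw pointer to the underlying buffer and its capacity in bytes.
        void GetBuffer(out byte* buffer, out uint capacity);
    }
}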
Below is my unpolished code for reference:
bitmap = latestFrame.SoftwareBitmap;

// Lock the bitmap for reading and take a reference to its underlying buffer.
// The raw pointer retrieved below is only valid while this reference is alive,
// so I dispose the reference and the buffer only after the copy is done.
Windows.Graphics.Imaging.BitmapBuffer bitmapBuffer = bitmap.LockBuffer(Windows.Graphics.Imaging.BitmapBufferAccessMode.Read);
Windows.Foundation.IMemoryBufferReference bufferReference = bitmapBuffer.CreateReference();

// Staging texture: CPU-accessible, no bind flags; used only as the source of the GPU copy.
var staging_descriptor = new Texture2DDescription
{
    Width = Width,
    Height = Height,
    MipLevels = 1,
    ArraySize = 1,
    Format = SharpDX.DXGI.Format.R8G8B8A8_UNorm,
    SampleDescription = new SharpDX.DXGI.SampleDescription(1, 0),
    Usage = ResourceUsage.Staging,
    BindFlags = BindFlags.None,
    CpuAccessFlags = CpuAccessFlags.Read | CpuAccessFlags.Write,
    OptionFlags = ResourceOptionFlags.None
};

// Final texture: GPU-only (default usage, no CPU access), bindable as a shader resource.
var final_descriptor = new Texture2DDescription
{
    Width = Width,
    Height = Height,
    MipLevels = 1,
    ArraySize = 1,
    Format = SharpDX.DXGI.Format.R8G8B8A8_UNorm,
    SampleDescription = new SharpDX.DXGI.SampleDescription(1, 0),
    Usage = ResourceUsage.Default,
    BindFlags = BindFlags.ShaderResource,
    CpuAccessFlags = CpuAccessFlags.None,
    OptionFlags = ResourceOptionFlags.None
};

var dataRectangle = new SharpDX.DataRectangle();
unsafe
{
    byte* dataInBytes;
    uint capacityInBytes;
    ((InteropStatics.IMemoryBufferByteAccess)bufferReference).GetBuffer(out dataInBytes, out capacityInBytes);
    dataRectangle.DataPointer = (IntPtr)dataInBytes;
    // Pitch is the number of bytes per row, not per pixel. Rows may be padded,
    // so I take the stride reported for plane 0 instead of hard-coding Width * 4.
    dataRectangle.Pitch = bitmapBuffer.GetPlaneDescription(0).Stride;
}

Texture2D _stagingTexture = new Texture2D(device, staging_descriptor, dataRectangle);
Texture2D _finalTexture = new Texture2D(device, final_descriptor);

// SharpDX's CopyResource takes (source, destination).
_stagingTexture.Device.ImmediateContext.CopyResource(_stagingTexture, _finalTexture);
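Once the copy succeeds, my plan for making the texture available to HLSL is roughly the following (untested sketch; device is my existing D3D11 device from the code above):

// Untested sketch: wrap the default-usage texture in a ShaderResourceView
// and bind it to pixel-shader slot 0 (register t0 in HLSL).
var shaderResourceView = new ShaderResourceView(device, _finalTexture);
device.ImmediateContext.PixelShader.SetShaderResource(0, shaderResourceView);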
My question is twofold:
- DataRectangle.DataPointer is an IntPtr, while what I retrieve from the interface is a raw byte pointer. Is this not a problem? Or does the Pitch member of the DataRectangle address this? For now I cast the byte pointer to IntPtr.
- Would this approach work, or is there a better way to handle this? (One alternative I considered is sketched after this list.)
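The alternative I had in mind (untested, and assuming initial data can be supplied to a default-usage texture at creation time) skips the staging texture and the copy entirely:

// Untested alternative: create the default-usage texture directly
// with the initial data, so no staging texture or CopyResource is needed.
Texture2D texture = new Texture2D(device, final_descriptor, dataRectangle);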
Any pointers, suggestions, or constructive criticism would be much appreciated!