
I'm working on real-time video streaming using hardware-accelerated H.264 encoding and decoding, and I was able to make it work using Windows Media Foundation.

I created an IMFSample from an IMFMediaBuffer that I obtained from an ID3D11Texture2D via MFCreateDXGISurfaceBuffer, encoded it to H.264 with the hardware MFT, and it rendered properly after decoding.

The goal is to grab an RGB buffer using BitBlt and create an IMFSample from it to provide as input to the H.264 encoder.

Now, the issue: when I create an IMFMediaBuffer by storing an RGB buffer obtained with GetDIBits in it, wrap it in an IMFSample, and feed it to the H.264 encoder, I get the image below rendered after decoding.
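One common cause of garbled output on this path (an assumption, since no code is posted): GetDIBits fills the buffer bottom-up when BITMAPINFOHEADER::biHeight is positive, while an RGB32 media type for the encoder is typically treated as top-down, so every row lands in the wrong place. A minimal, portable sketch of the row flip, assuming tightly packed 32-bit BGRA frames (the function name is illustrative; alternatively, pass a negative biHeight to GetDIBits to request top-down output directly):

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

// Convert a bottom-up BGRA32 frame (as produced by GetDIBits with a positive
// biHeight) into a top-down copy by reversing the row order.
std::vector<uint8_t> FlipRowsTopDown(const std::vector<uint8_t>& bottomUp,
                                     size_t width, size_t height)
{
    const size_t rowBytes = width * 4; // 4 bytes per BGRA pixel
    std::vector<uint8_t> topDown(bottomUp.size());
    for (size_t y = 0; y < height; ++y)
        std::memcpy(topDown.data() + y * rowBytes,
                    bottomUp.data() + (height - 1 - y) * rowBytes,
                    rowBytes);
    return topDown;
}
```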

[Screenshot: corrupted frame rendered after decoding]

I also tried another approach: copying the data buffer from the ID3D11Texture2D via D3D11_MAPPED_SUBRESOURCE into an IMFMediaBuffer instead of using MFCreateDXGISurfaceBuffer, then creating the IMFSample. It gave the same result as in the image.
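A frequent pitfall on the mapped-texture path (again an assumption, absent posted code): D3D11_MAPPED_SUBRESOURCE::RowPitch is usually larger than width * 4 because of driver alignment, so a single memcpy of width * height * 4 bytes interleaves padding bytes into the pixel data, which renders exactly as the kind of sheared garbage shown above. A portable sketch of a correct row-by-row copy, assuming BGRA32 (the function name is illustrative):

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

// Copy a pitched BGRA32 image into a tightly packed buffer, one row at a
// time. srcPitch is the byte count per source row (RowPitch), which is
// often larger than width * 4.
std::vector<uint8_t> CopyPitchedToPacked(const uint8_t* src, size_t srcPitch,
                                         size_t width, size_t height)
{
    const size_t packedRow = width * 4; // 4 bytes per BGRA pixel
    std::vector<uint8_t> dst(packedRow * height);
    for (size_t y = 0; y < height; ++y)
        std::memcpy(dst.data() + y * packedRow, src + y * srcPitch, packedRow);
    return dst;
}
```

The same concern applies in reverse when filling the IMFMediaBuffer: if its media type declares a stride, each destination row must start at a stride boundary rather than at a packed offset.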

I cannot understand why the H.264 encoder has no problem when the media buffer comes directly from the texture via MFCreateDXGISurfaceBuffer, yet produces corrupted output when I copy the RGB buffer into the media buffer myself, whether via BitBlt or D3D11_MAPPED_SUBRESOURCE.

Any help to solve this issue is appreciated. Thanks in advance.

1 Answer


This is a pretty typical problem, and you have not supplied enough detail to get a good hint. The problem is essentially this: something went wrong with the data on the video encoder's input (or even earlier), so you are encoding wrong data. The encoder accepts the data because it is formally correct, in terms of having the expected properties (resolution, media type, texture format); it does not validate the pixel data itself. The encoder output is therefore compliant with the H.264 specification and the video plays, because the encoder correctly encoded wrong input data. Posted code could have been of some help.