I'm creating Bitmaps and I want them to be (hardware-)encoded and muxed into an .mp4 file.
I'm using the MediaCodec class to encode and MediaMuxer to mux.
The problem is that the encoder (MediaCodec) gives me input buffers that are too small. No matter which color format I choose (in the MediaFormat object passed to the encoder), it always hands me buffers of resWidth * resHeight * 1.5 bytes,
which looks like it expects 12 bits per pixel, i.e. the size of a YUV 4:2:0 frame.
For example, when I choose 960x540 frames, the encoder gives me buffers that are 777600 bytes long. When I choose another resolution, the buffer size scales accordingly.
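For reference, this is roughly how I configure the encoder and where I observe the buffer size (a simplified sketch, not the exact code from my project; the class and tag names are just illustrative):

```java
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.util.Log;

import java.io.IOException;
import java.nio.ByteBuffer;

// Simplified sketch of the encoder setup, roughly what my project does.
class EncoderSetupSketch {
    static MediaCodec configure(int width, int height) throws IOException {
        MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Flexible);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 2_000_000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

        MediaCodec encoder = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        encoder.start();

        int index = encoder.dequeueInputBuffer(10_000); // 10 ms timeout
        if (index >= 0) {
            ByteBuffer input = encoder.getInputBuffer(index);
            // For 960x540 this always logs 777600, i.e. width * height * 1.5,
            // regardless of the color format I set above.
            Log.d("EncoderSketch", "input buffer capacity: " + input.capacity());
        }
        return encoder;
    }
}
```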
The bitmaps are created at runtime and I render Android Views onto them (using Canvas). The Bitmap class doesn't give me many choices of color format, and as far as I know there is no 12-bit option.
I can, however, choose 8 bits per pixel, and this is what I get when I try to fill the input buffers with the contents of my 8-bit-per-pixel bitmaps: https://www.youtube.com/watch?v=5c-fjYp9KMQ
As you can see, everything is greenish and it shouldn't be. Here is a minimal working example (two classes) that exposes the issue: https://github.com/eeprojects/MediaCodecExample
You can generate the video shown above just by running the application and waiting a few seconds. Everything the app does at runtime is logged to LogCat.
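In case the links go stale, the per-frame path looks roughly like this (again a simplified sketch, not the exact repo code; I'm assuming Bitmap.Config.ALPHA_8 here because it's the only 8-bit-per-pixel config):

```java
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.media.MediaCodec;
import android.view.View;

import java.nio.ByteBuffer;

// Simplified per-frame sketch: render a View into an 8-bit bitmap and push it to the encoder.
class FrameFeederSketch {
    static void feedFrame(MediaCodec encoder, View view, int width, int height, long ptsUs) {
        // ALPHA_8 is the only 8-bit-per-pixel Bitmap.Config.
        Bitmap bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ALPHA_8);
        Canvas canvas = new Canvas(bitmap);
        view.draw(canvas);

        int index = encoder.dequeueInputBuffer(10_000);
        if (index < 0) return; // no input buffer available right now

        ByteBuffer input = encoder.getInputBuffer(index);
        input.clear();
        // Copies width * height bytes; the remaining third of the buffer
        // (the chroma part of the YUV 4:2:0 layout) is never written.
        bitmap.copyPixelsToBuffer(input);

        encoder.queueInputBuffer(index, 0, input.position(), ptsUs, 0);
    }
}
```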
I also tried setting the buffer size manually (which is possible via the MediaFormat.KEY_MAX_INPUT_SIZE key), setting it to resWidth * resHeight * 2
and encoding the bitmaps with 16 bits per pixel, but when I then try to dequeue an output buffer, the codec reports a fatal internal error and the application crashes:
```
E/ACodec: [OMX.Exynos.AVC.Encoder] ERROR(0x80001001)
E/ACodec: signalError(omxError 0x80001001, internalError -2147483648)
E/MediaCodec: Codec reported err 0x80001001, actionCode 0, while in state 6
```
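For completeness, the failed 16-bit attempt looked roughly like this (a sketch of the approach; RGB_565 is the 16-bit-per-pixel Bitmap.Config I used, and the class/method names are just illustrative):

```java
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.View;

import java.nio.ByteBuffer;

// Sketch of the failed 16-bit attempt: force a bigger input buffer and push RGB_565 pixels.
class SixteenBitAttemptSketch {
    static void configureFormat(MediaFormat format, int width, int height) {
        // Ask the codec for input buffers of width * height * 2 bytes.
        format.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, width * height * 2);
    }

    static void feedFrame(MediaCodec encoder, View view, int width, int height, long ptsUs) {
        // RGB_565 is 16 bits (2 bytes) per pixel.
        Bitmap bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.RGB_565);
        view.draw(new Canvas(bitmap));

        int index = encoder.dequeueInputBuffer(10_000);
        if (index < 0) return;

        ByteBuffer input = encoder.getInputBuffer(index);
        input.clear();
        bitmap.copyPixelsToBuffer(input); // width * height * 2 bytes of RGB_565 data

        // Dequeuing the corresponding output buffer later is what triggers the
        // OMX.Exynos.AVC.Encoder error shown above.
        encoder.queueInputBuffer(index, 0, input.position(), ptsUs, 0);
    }
}
```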