I'm trying to get this to work on Android 4.1 (using an upgraded Asus Transformer tablet). Thanks to Alex's response to my previous question, I was already able to write some raw H.264 data to a file, but the file is only playable with ffplay -f h264, and it seems to have lost all framerate information (playback is extremely fast). The color space also looks incorrect (at the moment I'm using the camera's default on the encoder side).
public class AvcEncoder {

    private MediaCodec mediaCodec;
    private BufferedOutputStream outputStream;

    public AvcEncoder() {
        File f = new File(Environment.getExternalStorageDirectory(), "Download/video_encoded.264");
        try {
            // FileOutputStream creates the file if it doesn't exist yet
            outputStream = new BufferedOutputStream(new FileOutputStream(f));
            Log.i("AvcEncoder", "outputStream initialized");
        } catch (Exception e) {
            e.printStackTrace();
        }

        mediaCodec = MediaCodec.createEncoderByType("video/avc");
        MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc", 320, 240);
        mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 125000);
        mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 15);
        mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar);
        mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
        mediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        mediaCodec.start();
    }

    public void close() {
        try {
            mediaCodec.stop();
            mediaCodec.release();
            outputStream.flush();
            outputStream.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    // called from Camera.setPreviewCallbackWithBuffer(...) in another class
    public void offerEncoder(byte[] input) {
        try {
            ByteBuffer[] inputBuffers = mediaCodec.getInputBuffers();
            ByteBuffer[] outputBuffers = mediaCodec.getOutputBuffers();

            int inputBufferIndex = mediaCodec.dequeueInputBuffer(-1);
            if (inputBufferIndex >= 0) {
                ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
                inputBuffer.clear();
                inputBuffer.put(input);
                // note: presentationTimeUs is 0 for every frame here
                mediaCodec.queueInputBuffer(inputBufferIndex, 0, input.length, 0, 0);
            }

            MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
            int outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 0);
            while (outputBufferIndex >= 0) {
                ByteBuffer outputBuffer = outputBuffers[outputBufferIndex];
                byte[] outData = new byte[bufferInfo.size];
                outputBuffer.get(outData);
                outputStream.write(outData, 0, outData.length);
                Log.i("AvcEncoder", outData.length + " bytes written");

                mediaCodec.releaseOutputBuffer(outputBufferIndex, false);
                outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 0);
            }
        } catch (Throwable t) {
            t.printStackTrace();
        }
    }
}
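One thing I did notice: I pass 0 as presentationTimeUs for every frame, so the encoder gets no timing information at all. Perhaps something like the following is expected instead (just a sketch assuming the constant 15 fps I configured; I haven't verified it helps, and a raw .264 file may not carry timing either way):

    // Sketch: derive a presentation timestamp from a running frame counter,
    // assuming a constant 15 fps (matching KEY_FRAME_RATE above).
    private long frameIndex = 0;

    private long computePresentationTimeUs() {
        return frameIndex++ * 1000000L / 15;
    }

    // then, instead of passing 0:
    // mediaCodec.queueInputBuffer(inputBufferIndex, 0, input.length,
    //         computePresentationTimeUs(), 0);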
Changing the encoder MIME type to "video/mp4" apparently solves the framerate problem, but since the main goal is to build a streaming service, that isn't a good solution.
I'm aware that I dropped some of Alex's code that handled the SPS and PPS NALUs, but I was hoping that would not be necessary, since that information also comes out of outData
and I assumed the encoder would format it correctly. If that's not the case, how should I arrange the different types of NALUs in my file/stream?
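In case the SPS/PPS handling does turn out to be necessary, this is roughly what I had in mind: as far as I understand, the encoder's first output buffer is flagged BUFFER_FLAG_CODEC_CONFIG and contains only the SPS/PPS NALUs, so one could cache it and repeat it before every sync frame. A minimal sketch (the spsPpsBuffer field and writeNalUnits helper are my own names, and I haven't verified this fixes anything):

    // Hypothetical sketch: cache the codec-config buffer (SPS/PPS) and
    // write it in front of every sync frame, so a decoder can join mid-stream.
    private byte[] spsPpsBuffer;

    private void writeNalUnits(byte[] outData, MediaCodec.BufferInfo bufferInfo) throws IOException {
        if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
            // first buffer from the encoder: contains only SPS and PPS NALUs
            spsPpsBuffer = outData;
            outputStream.write(outData);
        } else {
            if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_SYNC_FRAME) != 0 && spsPpsBuffer != null) {
                // repeat SPS/PPS before each IDR frame
                outputStream.write(spsPpsBuffer);
            }
            outputStream.write(outData);
        }
    }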
So, what am I missing here in order to produce a valid, working H.264 stream? And which settings should I use to match the camera's color space to the encoder's color space?
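On the color issue, my current guess is that the camera delivers NV21 (the documented default preview format) while I configured the encoder for COLOR_FormatYUV420Planar, so a conversion like the following might be needed before queueing the input buffer. This is just a sketch of the NV21-to-I420 swizzle as I understand it, not tested code:

    // Assumed sketch: NV21 (Y plane + interleaved VU) to YUV420Planar/I420
    // (Y plane + U plane + V plane). Width and height must match the
    // MediaFormat passed to the encoder (320x240 above).
    private static byte[] nv21ToI420(byte[] nv21, int width, int height) {
        byte[] i420 = new byte[nv21.length];
        int ySize = width * height;
        // Y plane is identical in both layouts
        System.arraycopy(nv21, 0, i420, 0, ySize);
        // de-interleave VU pairs into separate U and V planes
        int uIndex = ySize;                  // U plane starts after Y
        int vIndex = ySize + ySize / 4;      // V plane starts after U
        for (int i = ySize; i < nv21.length; i += 2) {
            i420[vIndex++] = nv21[i];        // V comes first in NV21
            i420[uIndex++] = nv21[i + 1];
        }
        return i420;
    }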
I have a feeling this is more of an H.264-related question than an Android/MediaCodec topic. Or am I still not using the MediaCodec API correctly?
Thanks in advance.