35
votes

I have a project where I have been asked to display a video stream in Android. The stream is raw H.264, and I will be connecting to a server and receiving a byte stream from it.

Basically I'm wondering: is there a way to send raw bytes to a decoder in Android and display the output on a surface?

I have successfully decoded H.264 wrapped in an MP4 container using the new MediaCodec and MediaExtractor APIs in Android 4.1; unfortunately, I have not found a way to decode a raw H.264 file or stream using these APIs.

I understand that one way is to compile and use FFmpeg, but I'd rather use a built-in method that can use HW acceleration. I also understand that RTSP streaming is supported in Android, but this is not an option. The Android version is not an issue.

Why not use BitmapFactory decodeByteArray or decodeFile? It should support H.264 according to developer.android.com/guide/appendix/media-formats.html#core – steveh
This was asked over a year ago; there's already a solution using the Android MediaCodec API, which was designed for decoding video. I really doubt BitmapFactory could decode H.264 video. – will
I don't think so either. In fact, I don't think any of the built-in classes help. I'm looking into FFmpeg now. – steveh
Well, I mean I was able to play raw H.264 using the method I described in my answer; it should work as long as you give it the correct video data to configure the decoder first. – will

3 Answers

31
votes

I can't provide any code for this unfortunately, but I'll do my best to explain it based on how I got it to work.

So here is my overview of how I got raw H.264 encoded video to work using the MediaCodec class.

There is an example in the MediaCodec documentation (linked above) of getting the decoder set up and how to use it; you will need to set it up for decoding H.264 AVC.
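
As a rough illustration (this is my sketch, not the answerer's code), the decoder setup for raw H.264 can look roughly like this; the width, height, and Surface are placeholders you would supply yourself:

    import android.media.MediaCodec;
    import android.media.MediaFormat;
    import android.view.Surface;

    import java.io.IOException;

    public class AvcDecoderSetup {
        // Creates and starts a decoder for raw H.264 ("video/avc") that renders to the given Surface.
        public static MediaCodec createDecoder(Surface surface, int width, int height) throws IOException {
            MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
            MediaCodec decoder = MediaCodec.createDecoderByType("video/avc");
            // Passing the Surface here is what lets decoded frames be rendered directly to it.
            decoder.configure(format, surface, null, 0);
            decoder.start();
            return decoder;
        }
    }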

H.264 is made up of NAL units, each starting with a three-byte start prefix with the values 0x00, 0x00, 0x01, and each unit has a different type depending on the value of the 4th byte, right after these 3 start bytes. One NAL unit IS NOT one frame of the video; each frame is made up of a number of NAL units.

Basically, I wrote a method that finds each individual unit and passes it to the decoder (one NAL unit being the start prefix and any bytes thereafter, up until the next start prefix).
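
For illustration only (my own sketch, not the answer's method), a splitter along those lines might look like this; feedNalUnit is a hypothetical callback standing in for "pass it to the decoder":

    public class NalUnitSplitter {

        // Hypothetical callback standing in for "pass the unit to the decoder".
        public interface NalSink {
            void feedNalUnit(byte[] data, int offset, int length);
        }

        // Returns the index of the next 0x00 0x00 0x01 prefix at or after 'from', or -1 if none.
        static int findStartCode(byte[] data, int from) {
            for (int i = from; i + 2 < data.length; i++) {
                if (data[i] == 0x00 && data[i + 1] == 0x00 && data[i + 2] == 0x01) {
                    return i;
                }
            }
            return -1;
        }

        // Emits one NAL unit per start prefix: the prefix plus everything up to the next prefix.
        public static void split(byte[] data, NalSink sink) {
            int start = findStartCode(data, 0);
            while (start >= 0) {
                int next = findStartCode(data, start + 3);
                int end = (next >= 0) ? next : data.length;
                sink.feedNalUnit(data, start, end - start);
                start = next;
            }
        }
    }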

Now, if you have the decoder set up for decoding H.264 AVC and have an InputBuffer from the decoder, then you are ready to go. You need to fill this InputBuffer with a NAL unit, pass it back to the decoder, and continue doing this for the length of the stream. But to make this work I had to pass the decoder an SPS (Sequence Parameter Set) NAL unit first. This unit has a byte value of 0x67 after the start prefix (the 4th byte); on some devices the decoder would crash unless it received this unit first. Basically, until you find this unit, ignore all other NAL units and keep parsing the stream; once you get it, you can pass all other units to the decoder.

Some devices didn't need the SPS first and some did, but you are better off passing it in first.
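
A hedged sketch of that feeding loop could look as follows, using the pre-API-21 getInputBuffers() style that matches the Android 4.1 era of the question; the SPS check (0x67 right after the start prefix) follows the answer's description, and the ~30 fps timestamp is an assumption of mine:

    import android.media.MediaCodec;

    import java.nio.ByteBuffer;

    public class NalFeeder {
        private final MediaCodec decoder;
        private final ByteBuffer[] inputBuffers;
        private boolean spsSeen = false;
        private long presentationTimeUs = 0;

        public NalFeeder(MediaCodec decoder) {
            this.decoder = decoder;
            this.inputBuffers = decoder.getInputBuffers();
        }

        // Queues one NAL unit (prefix included) into the decoder, skipping everything before the SPS.
        public void feedNalUnit(byte[] data, int offset, int length) {
            if (!spsSeen) {
                // Per the answer: the SPS has the value 0x67 right after the 3-byte start prefix.
                if (length > 3 && (data[offset + 3] & 0xFF) == 0x67) {
                    spsSeen = true;
                } else {
                    return; // ignore all other units until the SPS has been seen
                }
            }
            int index = decoder.dequeueInputBuffer(10000); // 10 ms timeout, an arbitrary choice
            if (index < 0) {
                return; // no free input buffer right now; a real implementation would retry
            }
            ByteBuffer buffer = inputBuffers[index];
            buffer.clear();
            buffer.put(data, offset, length);
            decoder.queueInputBuffer(index, 0, length, presentationTimeUs, 0);
            presentationTimeUs += 33000; // assume ~30 fps, since a raw stream carries no timing info
        }
    }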

Now, if you passed a surface to the decoder when you configured it, then once it gets enough NAL units for a frame it should display it on that surface.
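
On the output side, a minimal (assumed, not the answerer's) drain step would release each decoded buffer with render set to true so the frame appears on that surface:

    import android.media.MediaCodec;

    public class FrameDrainer {
        // Call this repeatedly; when a decoded frame is ready, releasing it with
        // render == true displays it on the Surface that was passed to configure().
        public static void drainOnce(MediaCodec decoder) {
            MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
            int index = decoder.dequeueOutputBuffer(info, 10000); // 10 ms timeout
            if (index >= 0) {
                decoder.releaseOutputBuffer(index, true); // true = render to the configured Surface
            }
            // Negative values (e.g. INFO_TRY_AGAIN_LATER, INFO_OUTPUT_FORMAT_CHANGED)
            // just mean there is nothing to render yet.
        }
    }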

0
votes

You can download the raw H.264 from the server, offer it via a local HTTP server running on the phone, and then let VLC for Android play it back from that HTTP server. You should use VLC's http/h264:// scheme to force the demuxer to raw H.264 (if you don't force the demuxer, VLC may not be able to recognize the stream, even when the MIME type returned by the HTTP server is set correctly). See

https://github.com/rauljim/tgs-android/blob/integrate_record/src/com/tudelft/triblerdroid/first/VideoPlayerActivity.java#L211

for an example on how to create an Intent that will launch VLC.
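
A rough sketch of such an Intent, assuming VLC's org.videolan.vlc package name and a local server address of 127.0.0.1:8080 (both placeholders), might look like this:

    import android.content.Intent;
    import android.net.Uri;

    public class VlcLauncher {
        // Builds an Intent that asks VLC to play the raw H.264 stream served locally.
        public static Intent buildVlcIntent() {
            // The http/h264:// scheme forces VLC's demuxer to raw H.264.
            Uri uri = Uri.parse("http/h264://127.0.0.1:8080/stream"); // placeholder address
            Intent intent = new Intent(Intent.ACTION_VIEW, uri);
            intent.setPackage("org.videolan.vlc"); // assumed package name of VLC for Android
            return intent;
        }
    }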

Note: raw H.264 apparently has no timing info, so VLC will play it as fast as possible. Embedding it in MPEG-TS first would be better. I haven't found an Android lib that will do that yet.

0
votes

Here are the resources I've found helpful in a similar project:

  1. This video has been super insightful in understanding how MediaCodec handles raw H.264 streams at a high level.
  2. This thread goes into a bit more detail on handling the SPS/PPS NAL units specifically. As was mentioned above, you need to separate individual NAL units using the start prefix and then hand the remaining data to the MediaCodec (an alternative way to supply the SPS/PPS is sketched after this list).
  3. This repo (libstreaming) is a great example of decoding an H.264 stream in Android, using RTSP/RTP for transmission.
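
As an alternative to feeding the SPS/PPS in-band (item 2 above), they can also be supplied up front as codec-specific data on the MediaFormat; this is a sketch under the assumption that you have already parsed the SPS and PPS NAL units (with their start prefixes) out of the stream:

    import android.media.MediaCodec;
    import android.media.MediaFormat;
    import android.view.Surface;

    import java.io.IOException;
    import java.nio.ByteBuffer;

    public class CsdConfig {
        // Configures the decoder with the SPS/PPS supplied up front as codec-specific data.
        public static MediaCodec configureWithCsd(Surface surface, int width, int height,
                                                  byte[] sps, byte[] pps) throws IOException {
            MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
            // csd-0 and csd-1 carry the SPS and PPS NAL units (start prefixes included).
            format.setByteBuffer("csd-0", ByteBuffer.wrap(sps));
            format.setByteBuffer("csd-1", ByteBuffer.wrap(pps));
            MediaCodec decoder = MediaCodec.createDecoderByType("video/avc");
            decoder.configure(format, surface, null, 0);
            decoder.start();
            return decoder;
        }
    }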