
We have an Android application that works with a network camera. Our main problem is that the video is displayed with artifacts: most of the screen is covered in green squares. When you start moving your hand in front of the camera the squares disappear, but the video still shows artifacts. We have checked the buffer length, the packet sizes and many other parameters, and now we have no idea what is wrong.

I will describe the whole process. The camera works over the SIP protocol: following the SIP exchange we collect the SDP data and establish the connection. We have discovered that the video is transmitted as H.264 baseline profile in RTP packets. We receive the UDP packets, extract the RTP, and look at the RTP headers. We receive packets with NAL types 7 (SPS) and 8 (PPS), and we use these two packets to configure the MediaCodec:

private void initMedia(ByteBuffer header_sps, ByteBuffer header_pps) {
    try {
        mMediaCodec = MediaCodec.createDecoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
        //mMediaCodec = MediaCodec.createByCodecName("OMX.google.h264.decoder");
        MediaFormat mediaFormat = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, 640, 480);
        mediaFormat.setByteBuffer("csd-0", header_sps);
        mediaFormat.setByteBuffer("csd-1", header_pps);
        mMediaCodec.configure(mediaFormat, videoView.getHolder().getSurface(), null, 0);
        mMediaCodec.start();
        mConfigured = true;
        startMs = System.currentTimeMillis();
        show.start();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
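For reference, the NAL unit type mentioned above is carried in the lower five bits of the first RTP payload byte (RFC 6184). A minimal sketch of how it can be read (payload is assumed to hold the raw RTP payload, starting with the NAL header byte):

// Sketch only: classify the NAL unit carried in an RTP payload (RFC 6184).
static int nalUnitType(byte[] payload) {
    return payload[0] & 0x1F; // 7 = SPS, 8 = PPS, 28 = FU-A fragment
}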

We also receive packets with type 28 (FU-A), which means the NAL unit is split into fragments that we have to reconstruct:

public ByteBuffer writeRawH264toByteBuffer() throws IOException, NotImplementedException {
    ByteBuffer res = null;
    switch (nal.getType()) {
        case NAL.FU_A: // FU-A, 5.8. Fragmentation Units (FUs) / RFC 6184
            FUHeader fu = getFUHeader();

            if (fu.isFirst()) {
                //if(debug) System.out.println("first");
                res = ByteBuffer.allocate(5 + getH264PayloadLength());
                res.put(H264RTP.NON_IDR_PICTURE);
                res.put(getReconstructedNal());
                res.put(rtp.getBuffer(), getH264PayloadStart(), getH264PayloadLength());
            } else {
                //if(debug) System.out.println("end");
                res = ByteBuffer.allocate(getH264PayloadLength());
                res.put(rtp.getBuffer(), getH264PayloadStart(), getH264PayloadLength());
            }
            break;
        case NAL.SPS: // Sequence parameter set
        case NAL.PPS: // Picture parameter set
        case NAL.NAL_UNIT:
            res = ByteBuffer.allocate(4 + getH264PayloadLength());
            //System.out.println("sps or pps write");
            res.put(H264RTP.NON_IDR_PICTURE);
            res.put(rtp.getBuffer(), rtp.getPayloadStart(), rtp.getPayloadLength());
            break;
        default:
            throw new NotImplementedException("NAL type " + getNAL().getType() + " not implemented");
    }
    return res;
}

NON_IDR_PICTURE is the byte array {0x00, 0x00, 0x00, 0x01}, i.e. the Annex B start code.
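For context, getReconstructedNal() is expected to rebuild the original NAL header from the FU indicator and FU header as defined in RFC 6184, section 5.8. A minimal sketch of that rule (the method name and parameters here are assumptions, not the actual implementation):

// Sketch: the reconstructed NAL header takes the F and NRI bits from the
// FU indicator and the NAL unit type from the FU header (RFC 6184, 5.8).
static byte reconstructNalHeader(byte fuIndicator, byte fuHeader) {
    return (byte) ((fuIndicator & 0xE0) | (fuHeader & 0x1F));
}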

We use a VideoView to display the video on the Android device. This code feeds packets into the decoder:

if (mConfigured) {
    int index = mMediaCodec.dequeueInputBuffer(mTimeoutUsDegueueInput);
    if (index >= 0) {
        ByteBuffer buffer = mMediaCodec.getInputBuffer(index);
        //buffer.clear();
        int capacity = wrapper.getByPayload().writeRawH264toByteBuffer(buffer);
        mMediaCodec.queueInputBuffer(index, 0, capacity, wrapper.getSequence(), 0);
    }
}
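One detail worth checking here: the fourth argument of queueInputBuffer is a presentation timestamp in microseconds, while getSequence() looks like the RTP sequence number. For H.264 over RTP the media clock is 90 kHz, so a conversion along these lines would be more typical (getRtpTimestamp() is an assumed accessor, not part of the code above):

// Sketch, assuming getRtpTimestamp() returns the 32-bit RTP timestamp
// (90 kHz clock for H.264 per RFC 6184).
long presentationTimeUs = (wrapper.getRtpTimestamp() & 0xFFFFFFFFL) * 1_000_000L / 90_000L;
mMediaCodec.queueInputBuffer(index, 0, capacity, presentationTimeUs, 0);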

and this one drains the decoder output and refreshes the view (in a separate thread):

while (true) {
    if (mConfigured) {
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        int index = mMediaCodec.dequeueOutputBuffer(info, mTimeoutUsDegueueOutput);
        if (index >= 0) {
            mMediaCodec.releaseOutputBuffer(index, info.size > 0);
            if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) == MediaCodec.BUFFER_FLAG_END_OF_STREAM) {
                break;
            }
        }
    } else {
        try {
            Thread.sleep(10);
        } catch (InterruptedException ignore) {
        }
    }
}

Now I have no idea why the video is rendered with artifacts or what to debug next.

Example of the video: [screenshot showing the green-square artifacts]


1 Answer


The problem was in the FU-A reconstruction, specifically in this line:

int capacity = wrapper.getByPayload().writeRawH264toByteBuffer(buffer);

A FU-A packet must be reassembled into the complete NAL unit first, and only then put into the decoder.
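A minimal sketch of that idea, accumulating FU-A fragments until the end bit of the FU header is set and only then handing the complete Annex B NAL unit to the decoder (the class and method names are assumptions based on the code above, not the actual implementation):

import java.io.ByteArrayOutputStream;

// Sketch: collect FU-A fragments into one Annex B NAL unit before queueing it.
class FuAAssembler {
    private static final byte[] START_CODE = {0x00, 0x00, 0x00, 0x01};
    private final ByteArrayOutputStream nal = new ByteArrayOutputStream();

    // Returns the complete NAL unit when the end fragment arrives, otherwise null.
    byte[] add(byte fuIndicator, byte fuHeader, byte[] payload, int offset, int length) {
        boolean start = (fuHeader & 0x80) != 0; // S bit
        boolean end = (fuHeader & 0x40) != 0;   // E bit
        if (start) {
            nal.reset();
            nal.write(START_CODE, 0, START_CODE.length);
            // Reconstructed NAL header: F/NRI from the FU indicator, type from the FU header.
            nal.write((fuIndicator & 0xE0) | (fuHeader & 0x1F));
        }
        nal.write(payload, offset, length);
        return end ? nal.toByteArray() : null;
    }
}

In practice, lost or out-of-order fragments (detected from the RTP sequence numbers) should cause the partially assembled unit to be dropped, otherwise the decoder will again be fed corrupted data.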