
I am trying to stream H.264 data over RTSP using Live555.

I capture frames using V4L2, encode them with FFmpeg, and then pass the encoded data to Live555's DeviceSource, from which an H264VideoStreamFramer reads it.

Below are the codec settings I use to configure the encoder's AVCodecContext:

codec = avcodec_find_encoder_by_name(CODEC_NAME);
if (!codec) {
    cerr << "Codec " << codec_name << " not found\n";
    exit(1);
}

c = avcodec_alloc_context3(codec);
if (!c) {
    cerr << "Could not allocate video codec context\n";
    exit(1);
}

pkt = av_packet_alloc();
if (!pkt)
    exit(1);

/* put sample parameters */
c->bit_rate = 400000;
/* resolution must be a multiple of two */
c->width = PIC_WIDTH;
c->height = PIC_HEIGHT;
/* frames per second */
c->time_base = (AVRational){1, FPS};
c->framerate = (AVRational){FPS, 1};
c->gop_size = 10;
c->max_b_frames = 1;
c->pix_fmt = AV_PIX_FMT_YUV420P;
c->rtp_payload_size = 30000;
if (codec->id == AV_CODEC_ID_H264) {
    av_opt_set(c->priv_data, "preset", "fast", 0);
    av_opt_set_int(c->priv_data, "slice-max-size", 30000, 0);
}
/* open it */
ret = avcodec_open2(c, codec, NULL);
if (ret < 0) {
    cerr << "Could not open codec\n";
    exit(1);
}

I retrieve the encoded data with the avcodec_receive_packet() function, which fills an AVPacket.
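For context, here is a minimal sketch of the standard send/receive encode loop (simplified; "frame" and handlePacket() are placeholders rather than my exact code):

/* "frame" comes from the V4L2 capture path; handlePacket() stands in for
 * the code that hands pkt->data / pkt->size over to DeviceSource. */
ret = avcodec_send_frame(c, frame);
if (ret < 0) {
    cerr << "Error sending a frame for encoding\n";
    exit(1);
}
while (ret >= 0) {
    ret = avcodec_receive_packet(c, pkt);
    if (ret == AVERROR(EAGAIN) || ret == AVERROR_EOF)
        break;                // encoder needs more input, or is fully flushed
    else if (ret < 0) {
        cerr << "Error during encoding\n";
        exit(1);
    }
    handlePacket(pkt);        // deliver the encoded data to the Live555 side
    av_packet_unref(pkt);     // release the payload so pkt can be reused
}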

I then pass the AVPacket's data into DeviceSource. Below is a snippet of my Live555 code:

void DeviceSource::deliverFrame() {
    if (!isCurrentlyAwaitingData()) return; // we're not ready for the data yet

    u_int8_t* newFrameDataStart = (u_int8_t*) pkt->data;
    unsigned newFrameSize = pkt->size; //%%% TO BE WRITTEN %%%
    // Deliver the data here:
    if (newFrameSize > fMaxSize) { // Condition becomes true many times
        fFrameSize = fMaxSize;
        fNumTruncatedBytes = newFrameSize - fMaxSize;
    } else {
        fFrameSize = newFrameSize;
    }
    gettimeofday(&fPresentationTime, NULL); // If you have a more accurate time - e.g., from an encoder - then use that instead.
    // If the device is *not* a 'live source' (e.g., it comes instead from a file or buffer), then set "fDurationInMicroseconds" here.
    memmove(fTo, newFrameDataStart, fFrameSize);
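    // NOTE (assumption): if this follows the stock DeviceSource template, the
    // function should end by telling the downstream reader that new data is
    // available:
    FramedSource::afterGetting(this);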
}

But sometimes the packet size is larger than fMaxSize, and as per the Live555 logic above the frame data gets truncated, so I sometimes see corrupted frames in VLC.

From the Live555 forum I learned that the encoder should not produce packets larger than fMaxSize, so my question is:

How can I restrict the encoder so that it limits the size of each packet?

Thanks in Advance,

Harshil

Were you able to identify the problem? I'm also stuck with the same. – iamrameshkumar

1 Answer


You can increase the maximum allowed sample size by changing "maxSize" in the OutPacketBuffer class in MediaSink.cpp. This worked for me. There are cases where high-quality video has to be streamed, and I don't think the encoder can always be restricted to never produce samples above a particular size without hurting video quality. In fact, the samples are fragmented by live555's UDP sink to match the default MTU (1500), so increasing the maximum sample size limit has no side effects.
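For example, a minimal sketch of what I mean (assuming a recent live555 version; OutPacketBuffer::maxSize is the static member declared in MediaSink.hh, and 2000000 is just an arbitrary value comfortably larger than typical H.264 key frames):

#include "liveMedia.hh" // pulls in MediaSink.hh, where OutPacketBuffer is declared

// Rather than editing MediaSink.cpp, the same limit can be raised from
// application code:
OutPacketBuffer::maxSize = 2000000; // in bytes; larger than the library default

As far as I can tell, each sink sizes its internal OutPacketBuffer from this value when it is constructed, so it has to be set before the sinks / RTSP server are created.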