3 votes

I am encoding a real-time H.264 stream using NvEnc and I am trying to send it over RTP as a "live broadcast" (actually multicast). Things work fine so far: dumping the H.264 to disk, or writing an FLV to disk for debugging, both work, and sending a raw UDP stream also works when watching it with MPlayer. As for the stream itself, it uses the LOW_LATENCY preset and I only generate I and P frames (force-inserting an I frame, along with SPS/PPS, every 60 frames). NAL units are also created to be smaller than the MTU size minus the RTP header length.
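
Roughly, the relevant encoder settings look like this (a simplified sketch; the field names come from nvEncodeAPI.h, and the 1400-byte slice budget is just an example value for MTU minus RTP/UDP/IP overhead):

#include "nvEncodeAPI.h"

// Simplified sketch of the encoder settings described above. The
// LOW_LATENCY preset GUID itself is set on NV_ENC_INITIALIZE_PARAMS::presetGUID.
void configureEncoder(NV_ENC_CONFIG &cfg)
{
    cfg.gopLength      = 60;   // force an I (IDR) frame every 60 frames
    cfg.frameIntervalP = 1;    // I and P frames only, no B frames

    NV_ENC_CONFIG_H264 &h264 = cfg.encodeCodecConfig.h264Config;
    h264.idrPeriod     = 60;
    h264.repeatSPSPPS  = 1;    // resend SPS/PPS with every IDR frame
    h264.sliceMode     = 1;    // slice size is given in bytes ...
    h264.sliceModeData = 1400; // ... so every NAL unit stays below the MTU
}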

Now, when sending it over RTP, I use single NAL unit mode (packetization-mode=0) and have trouble figuring out what the RTP timestamp should look like. I am using jrtplib to set up and drive the RTP session.
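
The session is set up more or less like this (a minimal sketch; setupSession and the multicast address are placeholders, the port matches the SDP below):

#include <arpa/inet.h>
#include <jrtplib3/rtpsession.h>
#include <jrtplib3/rtpsessionparams.h>
#include <jrtplib3/rtpudpv4transmitter.h>
#include <jrtplib3/rtpipv4address.h>

using namespace jrtplib;

// Minimal session setup for a 90 kHz H.264 payload (payload type 96).
bool setupSession(RTPSession &session)
{
    RTPSessionParams params;
    params.SetOwnTimestampUnit(1.0 / 90000.0);   // 90 kHz RTP clock for H.264

    RTPUDPv4TransmissionParams transParams;
    transParams.SetPortbase(24712);

    if (session.Create(params, &transParams) < 0)
        return false;

    // 239.0.0.1 is only a placeholder for the actual multicast group.
    session.AddDestination(RTPIPv4Address(ntohl(inet_addr("239.0.0.1")), 24712));
    session.SetDefaultPayloadType(96);
    session.SetDefaultMark(false);
    session.SetDefaultTimestampIncrement(0);
    return true;
}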

For each encoded frame I get from NvEnc I extract n NAL units, and every NAL unit gets sent in its own RTP packet. I tried sending all NAL units of a frame with the same RTP timestamp and increasing the timestamp for the next frame by 1500 (90000 Hz / 60 fps, as I have a fixed 60 fps input). I also tried measuring the time between frame n and frame n+1 and using that as the timestamp increment (which is roughly 1500 anyway). MPlayer still plays the stream just fine, but VLC keeps rebuffering every few seconds, and I think it is related to the timestamps; MPlayer just seems to be much more tolerant of misuse on my side. For more information, here is part of the SDP I am using to play back the stream:

m=video 24712 RTP/AVP 96
a=rtpmap:96 H264/90000
a=fmtp:96 packetization-mode=0
a=recvonly
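
The sending loop then looks roughly like this (a simplified sketch of the fixed-increment variant described above; sendFrame and the NalUnit struct are made up for illustration, and it assumes jrtplib applies the timestamp increment after the current packet, as its examples do):

#include <cstdint>
#include <cstddef>
#include <vector>
#include <jrtplib3/rtpsession.h>

using namespace jrtplib;

// One encoded frame from NvEnc = n NAL units, each already below MTU size.
struct NalUnit { const uint8_t *data; size_t size; };

// "Same timestamp for all NAL units of a frame, +1500 per frame" scheme.
void sendFrame(RTPSession &session, const std::vector<NalUnit> &nals)
{
    for (size_t i = 0; i < nals.size(); ++i)
    {
        bool lastOfFrame = (i + 1 == nals.size());
        // packetization-mode=0: the NAL unit itself is the RTP payload.
        // The increment of 1500 (90000 Hz / 60 fps) passed with the last
        // unit only advances the clock for the next frame's packets; the
        // marker bit signals the end of the access unit.
        session.SendPacket(nals[i].data, nals[i].size,
                           96,                       // payload type
                           lastOfFrame,              // marker bit
                           lastOfFrame ? 1500u : 0u);
    }
}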

Am I misunderstanding something? I tried to read the RFCs but couldn't fix this on my own.

So the question is: what is the proper way to generate RTP timestamps when using single NAL unit mode?

Are you containing these single NALUs inside an FLV? What happens if you record 10 seconds and then use a tool like FFmpeg to convert the FLV first to MP4 and then from that MP4 back to a new FLV? Now compare your old FLV against the newer FLV from FFmpeg: look at the timestamps and CTTS offsets. Do they match? – VC.One
FLV was only used for "offline debugging". When sending over RTP / the network I only send the NAL units. I'll try looking at the timestamps you mentioned when I get back to work. After changing some code it looks better with VLC, but still not perfect. – pettersson

1 Answer

0 votes

After fiddling around some more, I found a bug in my threading code, which resulted in broken frames. Once this was fixed, creating the timestamps as follows works fine for me:

I measure the time between each sent NAL unit. For each new frame the delay since the last sent unit is roughly 16 ms (fixed 60 fps application), and I add this delta, converted to the 90 kHz RTP clock, to the timestamp of the RTP packet. The delay between NAL units of the same frame is very small (near zero).
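
In code this boils down to something like the following (a minimal sketch using std::chrono and jrtplib; sendNalUnit, the marker-bit handling and the payload type are just assumptions for illustration):

#include <chrono>
#include <cstdint>
#include <cstddef>
#include <jrtplib3/rtpsession.h>

using namespace jrtplib;
using Clock = std::chrono::steady_clock;

// Advance the RTP clock by the wall-clock time since the previously sent
// NAL unit (converted to 90 kHz ticks), then send the unit with that timestamp.
void sendNalUnit(RTPSession &session, const uint8_t *data, size_t size,
                 bool lastOfFrame, Clock::time_point &lastSendTime)
{
    Clock::time_point now = Clock::now();
    std::chrono::duration<double> elapsed = now - lastSendTime;
    lastSendTime = now;

    // ~16 ms between frames -> roughly 1500 ticks; near zero between
    // NAL units that belong to the same frame.
    uint32_t increment = static_cast<uint32_t>(elapsed.count() * 90000.0 + 0.5);

    session.IncrementTimestamp(increment);
    session.SendPacket(data, size,
                       96,            // payload type
                       lastOfFrame,   // marker bit on the last unit of a frame
                       0);            // clock was already advanced above
}

With a fixed 60 fps input this gives roughly the same +1500 per frame as a fixed increment would, but it follows the encoder's actual pacing.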

This works fine for my use case.