
Hi, could someone please explain how to achieve lipsync between two RTP streams from the same RTSP session? I'm trying to calculate a proper pts for the ffmpeg AVPacket, but I'm missing something and can't get my head around it. I have the following data available:
u64RTCP_NTP_TS - NTP timestamp from RTCP Sender Report
u32RTCP_TS - RTP timestamp from the RTCP Sender Report (paired with the NTP timestamp above)
u32AudioRTP_TS - Timestamp from the Audio RTP packet
u32VideoRTP_TS - Timestamp from the Video RTP Packet

I've searched for an answer, but I still can't get a clear picture of how this should be done calculation-wise. What am I missing?


1 Answer


OK, I've found the answer by browsing the Live555 source code, so the credit should go there, and many thanks to them for that. The relevant code is in RTPSource.cpp, in RTPReceptionStats::noteIncomingPacket and RTPReceptionStats::noteIncomingSR.

The idea behind it is pretty straightforward: each RTCP Sender Report pairs an NTP (wall-clock) timestamp with an RTP timestamp, and that pair is kept as the stream's sync point; the timestamp of every incoming RTP packet is then converted to wall-clock time relative to that sync point. One thing to note is that after the calculation the current timestamp is stored as the new sync timestamp, and it is also overwritten by each new RTCP SR (as it should be).
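Here is a minimal sketch of that mapping, not the Live555 code itself; the struct, the function name, and the clock rates mentioned in the comments are my own assumptions:

#include <cstdint>

// Hypothetical helper: convert one RTP timestamp into a wall-clock time
// (seconds since the NTP epoch) using the mapping carried by the last
// RTCP Sender Report of that stream.
struct RtcpSyncInfo {
    uint64_t u64RTCP_NTP_TS; // 64-bit NTP timestamp from the SR (32.32 fixed point)
    uint32_t u32RTCP_TS;     // RTP timestamp from the same SR
    uint32_t clockRate;      // media clock rate, e.g. 90000 for video, 8000 for PCMA audio
};

double rtpToWallClock(uint32_t rtpTs, const RtcpSyncInfo& sync) {
    // NTP 32.32 fixed point -> seconds (still relative to the NTP epoch, 1900)
    double ntpSeconds = (double)(sync.u64RTCP_NTP_TS >> 32)
                      + (double)(uint32_t)sync.u64RTCP_NTP_TS / 4294967296.0;

    // Signed difference copes with RTP timestamp wraparound
    int32_t diff = (int32_t)(rtpTs - sync.u32RTCP_TS);

    return ntpSeconds + (double)diff / (double)sync.clockRate;
}

Call this once per packet, for audio with the audio stream's SR data (u32AudioRTP_TS) and for video with the video stream's SR data (u32VideoRTP_TS); since both Sender Reports reference the same sender wall clock, the two results land on a common timeline.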

As a result, the presentation timestamps computed for all streams in the RTSP session end up on the same wall-clock timeline, so packets that should play together get more or less the same presentation time.
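For completeness, here is one way that wall-clock time could be turned into an AVPacket pts in the stream's time_base. This is only a sketch: setPacketPts and baseTime are hypothetical names, and using the first presentation time seen across all streams as the session baseline is an assumption about how you anchor pts at zero.

#include <cmath>
extern "C" {
#include <libavcodec/avcodec.h>
}

// wallClock: presentation time from rtpToWallClock()
// baseTime:  first presentation time observed in the session (assumed baseline)
// timeBase:  the AVStream's time_base for this packet's stream
void setPacketPts(AVPacket* pkt, double wallClock, double baseTime, AVRational timeBase) {
    double elapsed = wallClock - baseTime;                    // seconds since session start
    pkt->pts = (int64_t)llround(elapsed / av_q2d(timeBase));  // seconds -> time_base ticks
    pkt->dts = pkt->pts;                                      // assuming no B-frame reordering
}

Because both streams share the same baseTime, audio and video packets that were captured at the same instant get matching pts values after rescaling, which is exactly what the muxer or player needs for lipsync.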