
I am searching for low-level implementation details on muxing RTP and RTCP streams using BUNDLE on a Java-based server. With Chrome as my source, this is what a local SDP looks like:

o=- 8554465656018336221 2 IN IP4 127.0.0.1
s=-
t=0 0
a=group:BUNDLE audio video data
a=msid-semantic: WMS
m=audio 1 RTP/SAVPF 111 103 104 0 8 126
c=IN IP4 0.0.0.0
a=rtcp:1 IN IP4 0.0.0.0
a=ice-ufrag:Vw+winZTN4ejhvQJ
a=ice-pwd:ufBTUw/iszvCbL53dmPHQAYK
a=ice-options:google-ice
a=fingerprint:sha-256 5C:C6:19:38:4D:54:57:71:16:3F:67:A6:C8:21:CC:29:88:85:22:86:53:E5:7B:3F:3D:A4:5C:E5:BC:29:D8:B5
a=setup:actpass
a=mid:audio
a=extmap:1 urn:ietf:params:rtp-hdrext:ssrc-audio-level
a=extmap:3 http://www.webrtc.org/experiments/rtp-hdrext/abs-send-time
a=recvonly
a=rtcp-mux
a=rtpmap:111 opus/48000/2
a=fmtp:111 minptime=10
a=rtpmap:103 ISAC/16000
a=rtpmap:104 ISAC/32000
a=rtpmap:0 PCMU/8000
a=rtpmap:8 PCMA/8000
a=rtpmap:126 telephone-event/8000
a=maxptime:60
m=video 1 RTP/SAVPF 100 116 117
c=IN IP4 0.0.0.0
a=rtcp:1 IN IP4 0.0.0.0
a=ice-ufrag:Vw+winZTN4ejhvQJ
a=ice-pwd:ufBTUw/iszvCbL53dmPHQAYK
a=ice-options:google-ice
a=fingerprint:sha-256 5C:C6:19:38:4D:54:57:71:16:3F:67:A6:C8:21:CC:29:88:85:22:86:53:E5:7B:3F:3D:A4:5C:E5:BC:29:D8:B5
a=setup:actpass
a=mid:video
a=extmap:2 urn:ietf:params:rtp-hdrext:toffset
a=extmap:3 http://www.webrtc.org/experiments/rtp-hdrext/abs-send-time
a=recvonly
a=rtcp-mux
a=rtpmap:100 VP8/90000
a=rtcp-fb:100 ccm fir
a=rtcp-fb:100 nack
a=rtcp-fb:100 nack pli
a=rtcp-fb:100 goog-remb
a=rtpmap:116 red/90000
a=rtpmap:117 ulpfec/90000
m=application 1 DTLS/SCTP 5000
c=IN IP4 0.0.0.0
a=ice-ufrag:Vw+winZTN4ejhvQJ
a=ice-pwd:ufBTUw/iszvCbL53dmPHQAYK
a=ice-options:google-ice
a=fingerprint:sha-256 5C:C6:19:38:4D:54:57:71:16:3F:67:A6:C8:21:CC:29:88:85:22:86:53:E5:7B:3F:3D:A4:5C:E5:BC:29:D8:B5
a=setup:actpass
a=mid:data
a=sctpmap:5000 webrtc-datachannel 1024

I've Googled around and have not found what I need yet. I did find this page, but it mostly has high-level info, and again I need more: http://tools.ietf.org/html/draft-ejzak-avtcore-rtp-subsessions-01

In addition, I am subscribed to https://groups.google.com/forum/#!aboutgroup/discuss-webrtc, but I haven't seen any low-level information there about how muxing works when

a=group:BUNDLE audio video data

is used.

Related questions:
WebRTC java server trouble
How can I mux/demux RTP media from one stream?

1 Answer


All this means is that the data is sent over the same port. It does not mean that the packets themselves are modified in any way.

The way to separate out the packets (i.e., to know which are audio, which are video, and which are their respective control packets) is to check the SSRC in the RTP/RTCP packet headers. That way you don't feed an audio control packet to your video handler, or vice versa.
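A minimal Java sketch of this demultiplexing (class and method names are my own): RTP vs. RTCP on the shared port is distinguished by the second header byte per RFC 5761 (values 192–223 indicate RTCP), and audio vs. video is then routed by the SSRC, which sits at bytes 8–11 of an RTP header and bytes 4–7 of an RTCP SR/RR.

```java
import java.util.HashMap;
import java.util.Map;

/** Sketch of demuxing a BUNDLEd, rtcp-mux'ed packet stream from one port. */
public class BundleDemuxer {

    // SSRC -> logical stream ("audio", "video"), learned from SDP or signaling.
    private final Map<Long, String> ssrcToStream = new HashMap<>();

    public void mapSsrc(long ssrc, String streamName) {
        ssrcToStream.put(ssrc, streamName);
    }

    /** RFC 5761: a second header byte in 192..223 marks the packet as RTCP. */
    public static boolean isRtcp(byte[] pkt) {
        int secondByte = pkt[1] & 0xFF;
        return secondByte >= 192 && secondByte <= 223;
    }

    /** RTP: SSRC at bytes 8..11; RTCP SR/RR: sender SSRC at bytes 4..7. */
    public static long extractSsrc(byte[] pkt) {
        int offset = isRtcp(pkt) ? 4 : 8;
        return ((pkt[offset] & 0xFFL) << 24)
             | ((pkt[offset + 1] & 0xFFL) << 16)
             | ((pkt[offset + 2] & 0xFFL) << 8)
             |  (pkt[offset + 3] & 0xFFL);
    }

    /** Returns e.g. "audio/RTP" or "video/RTCP", or null for an unknown SSRC. */
    public String route(byte[] pkt) {
        String stream = ssrcToStream.get(extractSsrc(pkt));
        if (stream == null) return null;
        return stream + (isRtcp(pkt) ? "/RTCP" : "/RTP");
    }
}
```

In a real server the `route` result would dispatch the packet to the matching media pipeline; the point is that no bytes need rewriting, only inspection.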

In Chrome, you can learn the respective SSRC IDs through the SDP exchange, since an a=ssrc:&lt;ID&gt; line is included for each media level (one for video and one for audio).
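Parsing those a=ssrc lines server-side lets you build the SSRC-to-media mapping before the first packet arrives. A rough sketch (the class name and the choice to key by a=mid are my own):

```java
import java.util.LinkedHashMap;
import java.util.LinkedHashSet;
import java.util.Map;
import java.util.Set;

/** Sketch: collect a=ssrc:<ID> values per m= section of an SDP blob. */
public class SdpSsrcParser {

    /** Keys are the a=mid value when present, else the m= media type. */
    public static Map<String, Set<Long>> ssrcsByMid(String sdp) {
        Map<String, Set<Long>> result = new LinkedHashMap<>();
        String currentKey = null;
        Set<Long> current = null;
        for (String line : sdp.split("\r?\n")) {
            if (line.startsWith("m=")) {
                current = new LinkedHashSet<>();
                currentKey = line.substring(2).split(" ")[0]; // provisional key
                result.put(currentKey, current);
            } else if (line.startsWith("a=mid:") && current != null) {
                // Re-key the section under its explicit mid.
                result.remove(currentKey);
                currentKey = line.substring("a=mid:".length()).trim();
                result.put(currentKey, current);
            } else if (line.startsWith("a=ssrc:") && current != null) {
                String rest = line.substring("a=ssrc:".length());
                current.add(Long.parseLong(rest.split("[ \t]")[0]));
            }
        }
        return result;
    }
}
```

The resulting map plugs straight into whatever SSRC-keyed routing table your demuxer uses.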

It also looks like your SDP is set to recvonly for both media types. This means Chrome will only receive media, not send it; it will still send RTCP feedback back to the sender so that the streams can be adjusted accordingly.