
I'm developing audio calling using WebRTC and Janus.

Below are the steps I follow:

  1. Peer connection creates the offer.
  2. Set the local description of the peer connection.
  3. Send the message to the socket.
  4. Add local candidates to the peer connection.
  5. Receive the message from the socket.
  6. Peer connection sets the remote SDP.
  7. Receive the remote media stream.

But even after receiving the stream, nothing is audible. If code is required, I can post the snippets on request. Please help.
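For reference, the steps above roughly map onto the calls below. This is only a minimal sketch, assuming the GoogleWebRTC iOS SDK in Swift (the same stack as the Swift player linked in the answer); the delegate and the signalling helper are placeholders for your own code, not part of any library API.

    import WebRTC

    let factory = RTCPeerConnectionFactory()
    let config = RTCConfiguration()
    config.iceServers = [RTCIceServer(urlStrings: ["stun:stun.l.google.com:19302"])]
    let constraints = RTCMediaConstraints(mandatoryConstraints: nil, optionalConstraints: nil)

    // Pass your own RTCPeerConnectionDelegate instead of nil to receive
    // generated candidates and the remote stream callbacks.
    let peerConnection = factory.peerConnection(with: config, constraints: constraints, delegate: nil)

    // Publish the local microphone so the call has outgoing audio.
    let audioSource = factory.audioSource(with: constraints)
    let audioTrack = factory.audioTrack(with: audioSource, trackId: "audio0")
    peerConnection.add(audioTrack, streamIds: ["local0"])

    // Hypothetical signalling helper: replace with your own socket code.
    func sendToJanus(type: String, sdp: String) { /* send over your socket */ }

    // Steps 1-3: create the offer, set it as the local description, send it out.
    let offerConstraints = RTCMediaConstraints(
        mandatoryConstraints: ["OfferToReceiveAudio": "true", "OfferToReceiveVideo": "false"],
        optionalConstraints: nil)
    peerConnection.offer(for: offerConstraints) { offer, _ in
        guard let offer = offer else { return }
        peerConnection.setLocalDescription(offer) { _ in
            sendToJanus(type: "offer", sdp: offer.sdp)   // step 3
        }
    }

    // Step 4: candidates received from the remote side are added here; your own local
    // candidates arrive in the delegate's didGenerate callback and go out over the socket.
    func onRemoteCandidate(sdp: String, sdpMLineIndex: Int32, sdpMid: String?) {
        peerConnection.add(RTCIceCandidate(sdp: sdp, sdpMLineIndex: sdpMLineIndex, sdpMid: sdpMid))
    }

    // Steps 5-7: when the answer SDP comes back from Janus, set it as the remote description;
    // the remote audio then shows up in the delegate's didAdd stream callback.
    func onRemoteAnswer(_ sdpString: String) {
        let answer = RTCSessionDescription(type: .answer, sdp: sdpString)
        peerConnection.setRemoteDescription(answer) { error in
            if let error = error { print("setRemoteDescription failed: \(error)") }
        }
    }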

You can check my code; for now it works well for listening to remote streams: github.com/Igor-Khomich/JanusAudioStreamPlayer (Igor)
Does WebRTC handle the playing of audio streams, or do we need to handle it manually? Your code seems different from mine, so I'm not able to get an idea. (Krutika Sonawala)
Actually you just have to exchange SDP data with the server using the Janus API; the WebRTC library will do everything else itself. (Igor)
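For context, the exchange described here is the Janus "message" envelope that carries your SDP as a jsep attachment. Below is a rough sketch based on the Janus API docs; the body contents are plugin specific (the "configure" request shown is only a placeholder), and it assumes you have already created a session and attached a plugin handle.

    import Foundation

    // Placeholder for the SDP string you set as the local description.
    let localSdp = "<your local offer SDP>"

    // Janus envelope carrying the offer; the "body" depends on the plugin you attached
    // (placeholder request shown), while the "jsep" part is the actual SDP exchange.
    let offerMessage: [String: Any] = [
        "janus": "message",
        "transaction": UUID().uuidString,                 // any unique string
        "body": ["request": "configure", "audio": true],  // plugin-specific placeholder
        "jsep": ["type": "offer", "sdp": localSdp]
    ]

    if let data = try? JSONSerialization.data(withJSONObject: offerMessage) {
        // Send `data` to the plugin handle over your WebSocket/HTTP transport.
        print(String(data: data, encoding: .utf8) ?? "")
    }
    // Janus replies asynchronously with an event containing "jsep": {"type": "answer", ...},
    // which is the SDP you pass to setRemoteDescription.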
Yes, that is the step I have done. I have set the local and remote SDPs, and I also got the remote media stream, something like (Janus[A=1:V=0]), but it is not played. So I came to this question wondering whether I need something else to play it, because in a video call you need to add the track to a video view. Is there anything like that for audio? (Krutika Sonawala)
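On the question of an audio counterpart to the video view: as far as I can tell there is none in the iOS WebRTC SDK; remote audio is rendered automatically through the device audio session once the connection is up. A hedged checklist follows, with AVAudioSession shown only for illustration (WebRTC's own RTCAudioSession normally manages this for you).

    import WebRTC
    import AVFoundation

    // 1. In your RTCPeerConnectionDelegate, make sure the remote audio track is enabled.
    func peerConnection(_ peerConnection: RTCPeerConnection, didAdd stream: RTCMediaStream) {
        stream.audioTracks.first?.isEnabled = true
    }

    // 2. Make sure the audio session is active and routed somewhere audible.
    let session = AVAudioSession.sharedInstance()
    try? session.setCategory(.playAndRecord, mode: .voiceChat, options: [.defaultToSpeaker])
    try? session.setActive(true)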
Subscribe to the stream id with "request": "watch" and then start listening with "request": "start", as described in the documentation: janus.conf.meetecho.com/docs/streaming.html (Igor)
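The two Streaming plugin requests mentioned above, sketched from the linked docs: "watch" asks Janus to send you an offer for a mountpoint, and "start" confirms playback together with your answer SDP. The mountpoint id and the answer string below are placeholders.

    import Foundation

    // "watch": ask Janus to offer you the mountpoint you want to listen to.
    let watch: [String: Any] = [
        "janus": "message",
        "transaction": UUID().uuidString,
        "body": ["request": "watch", "id": 1]   // 1 = placeholder mountpoint id
    ]

    // Janus replies with a jsep offer; answer it with peerConnection.answer(for:...),
    // set the answer as the local description, then confirm playback:
    let answerSdp = "<your local answer SDP>"
    let start: [String: Any] = [
        "janus": "message",
        "transaction": UUID().uuidString,
        "body": ["request": "start"],
        "jsep": ["type": "answer", "sdp": answerSdp]
    ]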

1 Answer


Check if this helps. I just found something in Swift:

https://github.com/Igor-Khomich/JanusAudioStreamPlayer