I have a two-way audio chat written in C#. That is: there is a server application that sends wave-encoded audio as a byte array over UDP, and the client application then decodes and plays the audio. This works fine. For recording and encoding/decoding I use the NAudio library.
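For reference, here is a minimal sketch of what the receiving/playing side currently looks like. The port number and the wave format (raw 16-bit PCM, 44.1 kHz, mono) are placeholders, not my exact values:

    // Minimal sketch of the current client side. Assumptions: raw 16-bit PCM,
    // 44.1 kHz, mono; one UDP datagram = one chunk of samples; port 11000 is made up.
    using System.Net;
    using System.Net.Sockets;
    using NAudio.Wave;

    class AudioReceiver
    {
        static void Main()
        {
            var udp = new UdpClient(11000);
            var format = new WaveFormat(44100, 16, 1);
            var buffer = new BufferedWaveProvider(format);

            using (var output = new WaveOutEvent())
            {
                output.Init(buffer);
                output.Play();

                var remote = new IPEndPoint(IPAddress.Any, 0);
                while (true)
                {
                    byte[] packet = udp.Receive(ref remote);     // blocks until a datagram arrives
                    buffer.AddSamples(packet, 0, packet.Length); // queue the samples for playback
                }
            }
        }
    }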
Now the task is a different one: the audio stream should also be played on a website. I would prefer to use ASP.NET, so that I can receive and decode the stream with C# and NAudio. I already display a cam image on that page, which works smoothly and without any problems.
Still, I don't know how to approach this. Can the UDP byte stream simply be decoded and played in the browser? I don't think that is possible directly. Could JavaScript be used instead of C#? The HTML5 audio tag might also be useful. A rough sketch of the direction I am imagining is below.
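To make the question more concrete, this is roughly what I had in mind on the ASP.NET side: a generic handler that relays the incoming UDP data as a WAV stream, so that an HTML5 audio tag could point at it. The handler name, port, and wave format are assumptions, and I don't know whether the WAV header problem makes this a dead end:

    // Rough idea only - a generic handler (.ashx) that relays the UDP audio as a
    // WAV stream for an HTML5 <audio> tag. Class name, port and format are assumptions;
    // whether a browser accepts an endless WAV like this is exactly my question.
    using System.Net;
    using System.Net.Sockets;
    using System.Web;
    using NAudio.Utils;
    using NAudio.Wave;

    public class LiveAudioHandler : IHttpHandler
    {
        public bool IsReusable => false;

        public void ProcessRequest(HttpContext context)
        {
            context.Response.ContentType = "audio/wav";
            context.Response.BufferOutput = false;

            var format = new WaveFormat(44100, 16, 1);

            // WaveFileWriter writes the RIFF header; its length fields are only patched
            // on Dispose, which never happens for a live stream - one of my open questions.
            using (var writer = new WaveFileWriter(new IgnoreDisposeStream(context.Response.OutputStream), format))
            using (var udp = new UdpClient(11000))
            {
                var remote = new IPEndPoint(IPAddress.Any, 0);
                while (context.Response.IsClientConnected)
                {
                    byte[] packet = udp.Receive(ref remote);
                    writer.Write(packet, 0, packet.Length);
                    writer.Flush();
                }
            }
        }
    }

The page itself would then only need something like <audio src="LiveAudio.ashx" autoplay></audio>, if that can work at all.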
I would be very happy to receive answers.