16
votes

I need to have a real time live audio stream from 1 client to a server to multiple listener clients.

Currently I have the recording from the client working and stream the audio through socket.io to the server. The server receives this data and must stream the audio (also through socket.io?) to the clients that want to listen to this stream. It must be as real time as possible (minimize delay).

I'm using getUserMedia to record the microphone (browser compatibility is not important here). I want the listening clients to use the HTML5 audio tag to play the stream. The data received on the server consists of chunks (currently batched per 700) packed into a blob of type audio/wav.

This is my code to send it to the server:

mediaRecorder.ondataavailable = function(e) {
    this.chunks.push(e.data);
    if (this.chunks.length >= 700) {
        this.sendData(this.chunks);
        this.chunks = [];
    }
};
mediaRecorder.sendData = function(buffer) {
    var blob = new Blob(buffer, { 'type' : 'audio/wav' });
    socket.emit('voice', blob);
};
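For reference, a minimal sketch of how the mediaRecorder above might be set up — the getUserMedia call is not shown in the question, so the constraints and the 100 ms timeslice here are assumptions:

```javascript
// Sketch: capture the microphone and create the MediaRecorder used above.
// Assumes a browser with navigator.mediaDevices.getUserMedia support.
navigator.mediaDevices.getUserMedia({ audio: true })
    .then(function(stream) {
        var mediaRecorder = new MediaRecorder(stream);
        mediaRecorder.chunks = [];
        // ondataavailable / sendData handlers as defined above go here.
        // Passing a timeslice makes ondataavailable fire every 100 ms,
        // instead of only when stop() is called.
        mediaRecorder.start(100);
    })
    .catch(function(err) {
        console.error('Could not access microphone:', err);
    });
```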

On the server I'm able to send the chunks to the client the same way like this:

socket.on('voice', function(blob) {
    socket.broadcast.emit('voice', blob);
});

On the client I can play this like this:

var audio = document.createElement('audio');
socket.on('voice', function(arrayBuffer) {
    var blob = new Blob([arrayBuffer], { 'type' : 'audio/wav' });
    audio.src = window.URL.createObjectURL(blob);
    audio.play();
});

This works for the first blob of chunks I send, but you're not allowed to keep changing audio.src to a new URL source, so this is not a working solution.

I think I have to create some kind of stream on the server which I can put in the HTML5 audio tag on the listening clients, but I don't know how. The received blobs of chunks should then be appended to this stream in real time.

What is the best approach to do this? Am I doing it right from client microphone to server?
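The "stream you can keep appending to" described above does exist in the browser as the Media Source Extensions API. A hedged sketch, assuming the server relays the chunks as ArrayBuffers of WebM/Opus data (which is what MediaRecorder actually produces in most browsers — MSE cannot parse audio/wav):

```javascript
// Sketch: feed incoming socket.io chunks into a single <audio> element
// via MediaSource, so the src never has to change.
var audio = document.createElement('audio');
var mediaSource = new MediaSource();
audio.src = window.URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', function() {
    var sourceBuffer = mediaSource.addSourceBuffer('audio/webm;codecs=opus');
    var queue = [];

    socket.on('voice', function(arrayBuffer) {
        // appendBuffer throws if called while a previous append is running,
        // so queue chunks that arrive too fast.
        if (sourceBuffer.updating || queue.length > 0) {
            queue.push(arrayBuffer);
        } else {
            sourceBuffer.appendBuffer(arrayBuffer);
        }
    });

    sourceBuffer.addEventListener('updateend', function() {
        if (queue.length > 0) {
            sourceBuffer.appendBuffer(queue.shift());
        }
    });

    audio.play();
});
```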

2
If you have heard about WebRTC, then that is a natural solution for your problem, minus the server, as WebRTC is peer-to-peer. – Aman Gupta
I know, but I don't want it to be peer-to-peer but broadcast from a server. The client recording the audio doesn't have enough bandwidth to broadcast to all clients, so I need a server to do that. – Peter van den Broek
You could use the MCU architecture of WebRTC, but I guess that is out of scope here. – Aman Gupta
Why don't you just send a binary stream? – jemiloii
Did this work with the answer you got? Very insanely curious about this! – ErikBrandsma

2 Answers

2
votes

I'm a bit late to the party here, but it looks like the Web Audio API will be your friend, if you haven't already finished this. It allows you to play an audio stream directly to the output device without messing around with attaching it to an audio element.

I'm looking at doing the same thing, and your question has answered my question: how to get data from the client to the server. The benefit of the Web Audio API is the ability to add streams together and apply audio effects to them on the server.

MDN Web Audio API

The socket.io events should replace the data in an AudioBuffer object in the audio context. Audio processing can happen in a Node.js web audio context before being emitted as a single stream to each client.

0
votes

You could change the audio src dynamically as follows (assuming mp3 type):

<audio id="audio" controls="controls">
    <source id="mp3Source" type="audio/mp3"></source>
    Your browser does not support the audio format.
</audio>

Call the following function whenever a socket event is received:

function updateSource() {
    var audio = document.getElementById('audio');
    var source = document.getElementById('mp3Source');

    source.src = <blob>;

    audio.load(); // call this to just preload the audio without playing
    audio.play(); // call this to play the song right away
}
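For completeness, one way the socket handler could fill in the `<blob>` placeholder is to turn the received payload into an object URL — a sketch, with the Blob wrapping being an assumption:

```javascript
// Sketch: on each 'voice' event, point the <source> at a fresh
// object URL for the received data, then reload and play.
socket.on('voice', function(data) {
    var blob = new Blob([data], { 'type' : 'audio/mp3' });
    var source = document.getElementById('mp3Source');
    source.src = window.URL.createObjectURL(blob);

    var audio = document.getElementById('audio');
    audio.load();
    audio.play();
});
```

Note that this replays each blob from its start rather than producing one continuous stream, which is the same limitation the question ran into.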