Can I stream microphone audio from client to client using nodejs?

Submitted by 偶尔善良 on 2019-12-04 20:55:01

Question


I'm trying to create a realtime voice chat. Once a client holds a button and talks, I want the sound to be sent over the socket to the Node.js backend, and then I want to stream that data to another client.

Here is the sender client code:

socket.on('connect', function() {
      var session = {
          audio: true,
          video: false
      };

      navigator.getUserMedia(session, function(stream){
          var audioInput = context.createMediaStreamSource(stream);
          var bufferSize = 2048;

          recorder = context.createScriptProcessor(bufferSize, 1, 1);

          recorder.onaudioprocess = onAudio;

          audioInput.connect(recorder);

          recorder.connect(context.destination);

      },function(e){

      });

      function onAudio(e) {

          if(!broadcast) return;

          var mic = e.inputBuffer.getChannelData(0);

          var converted = convertFloat32ToInt16(mic);

          socket.emit('broadcast', converted);
      }

    });
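
(`convertFloat32ToInt16` is not shown in the question; a minimal sketch of what such a helper might look like, assuming standard Web Audio Float32 samples in the -1.0 .. 1.0 range, is:)

function convertFloat32ToInt16(buffer) {
    // scale Float32 samples (-1.0 .. 1.0) to 16-bit signed integers
    var result = new Int16Array(buffer.length);
    for (var i = 0; i < buffer.length; i++) {
        var s = Math.max(-1, Math.min(1, buffer[i])); // clamp to [-1, 1]
        result[i] = s < 0 ? s * 0x8000 : s * 0x7FFF;
    }
    return result;
}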

The server then gets this buffer and streams it to another client (in this example, the same client).

Server Code

socket.on('broadcast', function(buffer) {
    socket.emit('broadcast', new Int16Array(buffer));
});
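
(In a real client-to-client setup, the server would typically relay each chunk to the other connected clients rather than echoing it back to the sender; a minimal sketch using socket.io's broadcast flag, assuming `io` is the socket.io server instance:)

io.on('connection', function(socket) {
    socket.on('broadcast', function(buffer) {
        // send the audio chunk to every connected client except the sender
        socket.broadcast.emit('broadcast', buffer);
    });
});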

Then, to play the sound on the other side (the receiver), the client code looks like this:

socket.on('broadcast', function(raw) {

      var buffer = convertInt16ToFloat32(raw);

      var src = context.createBufferSource();
      var audioBuffer = context.createBuffer(1, buffer.byteLength, context.sampleRate);

      audioBuffer.getChannelData(0).set(buffer);

      src.buffer = audioBuffer;

      src.connect(context.destination);

      src.start(0);
    });
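
(`convertInt16ToFloat32` is also not shown in the question; a minimal sketch, assuming the data arrives as an ArrayBuffer or Int16Array, might be:)

function convertInt16ToFloat32(raw) {
    // view the incoming binary data as 16-bit signed samples
    var data = new Int16Array(raw.buffer || raw);
    var result = new Float32Array(data.length);
    for (var i = 0; i < data.length; i++) {
        result[i] = data[i] / 0x8000; // back to the -1.0 .. 1.0 range
    }
    return result;
}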

My expected result is that the sound from client A will be heard on client B. I can see the buffer on the server, and I can see the buffer back in the client, but I hear nothing.

I know socket.io 1.x supports binary data, but I can't find any example of building a voice chat. I also tried BinaryJS, and the results are the same. I know that with WebRTC this would be a simple task, but I don't want to use WebRTC. Can anyone point me to a good resource or tell me what I am missing?


Answer 1:


I built something like this on my own a few weeks ago. Problems I ran into (and you will at some point):

  • Too much data if you don't reduce the bitrate and sample rate (over the internet)
  • Bad audio quality without interpolation or better audio compression
  • Even if it isn't shown to you, you will get different sample rates from different computers' sound cards (my PC = 48 kHz, my laptop = 32 kHz), which means you have to write a resampler (see the sketch after this list)
  • WebRTC reduces audio quality when a bad internet connection is detected. You cannot do this, because it is low-level stuff!
  • You have to implement this in a fast way, because otherwise JS will block your frontend; use Web Workers
  • Audio codecs translated to JS are very slow and you will get unexpected results (see an audio-codec question of mine: here). I have tried Opus as well, but no good results yet.
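
(A minimal linear-interpolation resampler sketch, just for illustration and not taken from the project below, could look like this:)

function resample(samples, fromRate, toRate) {
    // samples: Float32Array recorded at fromRate; returns a Float32Array at toRate
    if (fromRate === toRate) return samples;
    var ratio = fromRate / toRate;
    var newLength = Math.round(samples.length / ratio);
    var result = new Float32Array(newLength);
    for (var i = 0; i < newLength; i++) {
        var pos = i * ratio;
        var left = Math.floor(pos);
        var right = Math.min(left + 1, samples.length - 1);
        var frac = pos - left;
        // linear interpolation between the two neighbouring input samples
        result[i] = samples[left] * (1 - frac) + samples[right] * frac;
    }
    return result;
}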

I don't work on this project at the moment, but you can get the code at: https://github.com/cracker0dks/nodeJsVoip

and the working example: (link removed) for multi-user VoIP audio. (Not working anymore! The WebSocket server is down!) If you go into Settings > Audio (on the page), you can choose a higher bit depth and sample rate for better audio quality.

EDIT: Can you tell me why you don't want to use WebRTC?



Source: https://stackoverflow.com/questions/30957587/can-i-stream-microphone-audio-from-client-to-client-using-nodejs
