WebRTC and Asp.NetCore

北海茫月 2021-01-31 12:41

I would like to record the audio stream from my Angular web app and send it to my ASP.NET Core API.

I think using SignalR and its WebSockets is a good way to do that.


1 Answer
  • 2021-01-31 12:59

    I found the way to get access to the microphone stream and transmit it to the server, here is the code:

      private audioCtx: AudioContext;
      private stream: MediaStream;
    
      convertFloat32ToInt16(buffer:Float32Array) {
        let l = buffer.length;
        let buf = new Int16Array(l);
    while (l--) {
      // Clamp to [-1, 1] so out-of-range samples do not wrap when scaled
      buf[l] = Math.max(-1, Math.min(1, buffer[l])) * 0x7FFF;
    }
        return buf.buffer;
      }
    
      startRecording() {
        navigator.mediaDevices.getUserMedia({ audio: true })
          .then(stream => {
        this.audioCtx = new AudioContext();
        this.audioCtx.onstatechange = () => { console.log(this.audioCtx.state); }

        var scriptNode = this.audioCtx.createScriptProcessor(4096, 1, 1);
        scriptNode.onaudioprocess = (audioProcessingEvent) => {
          // The input buffer holds the raw microphone samples
          var inputBuffer = audioProcessingEvent.inputBuffer;
          // Loop through the input channels (in this case there is only one)
          for (var channel = 0; channel < inputBuffer.numberOfChannels; channel++) {
            var chunk = inputBuffer.getChannelData(channel);
            // Convert to 16-bit PCM before sending, because endianness
            // and sample format matter on the server side
            this.MySignalRService.send("SendStream", this.convertFloat32ToInt16(chunk));
          }
        }
        var source = this.audioCtx.createMediaStreamSource(stream);
        source.connect(scriptNode);
        scriptNode.connect(this.audioCtx.destination);
    
    
            this.stream = stream;
          })
          .catch(function (e) {
            console.error('getUserMedia() error: ' + e.message);
          });
      }
    
      stopRecording() {
        try {
          let stream = this.stream;
          stream.getAudioTracks().forEach(track => track.stop());
          stream.getVideoTracks().forEach(track => track.stop());
          this.audioCtx.close();
        }
        catch (error) {
          console.error('stopRecording() error: ' + error);
        }
      }
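
The conversion helper can also be exercised outside the browser. This is a standalone copy of the clamped Float32-to-Int16 conversion (same names as above) so it can be sanity-checked in plain Node:

```typescript
// Standalone sketch of the clamped Float32 -> Int16 PCM conversion.
function convertFloat32ToInt16(buffer: Float32Array): ArrayBuffer {
  let l = buffer.length;
  const out = new Int16Array(l);
  while (l--) {
    // Clamp to [-1, 1] before scaling so out-of-range samples do not wrap around
    out[l] = Math.max(-1, Math.min(1, buffer[l])) * 0x7FFF;
  }
  return out.buffer;
}
```

A full-scale sample of 1.0 maps to 0x7FFF (32767), -1.0 maps to -32767, and anything outside [-1, 1] is clamped rather than wrapping to the opposite sign.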
    

The next step will be to convert the Int16Array data to a WAV file.
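
That step can be sketched as wrapping the 16-bit PCM samples in a standard 44-byte RIFF/WAVE header. This is a minimal mono-PCM sketch (the helper name `encodeWav` is mine, not from the answer):

```typescript
// Minimal sketch: wrap mono 16-bit PCM samples in a RIFF/WAVE header.
// "encodeWav" is a hypothetical helper name for illustration.
function encodeWav(samples: Int16Array, sampleRate: number): Uint8Array {
  const dataSize = samples.length * 2;           // 2 bytes per 16-bit sample
  const buffer = new ArrayBuffer(44 + dataSize);
  const view = new DataView(buffer);

  const writeString = (offset: number, s: string) => {
    for (let i = 0; i < s.length; i++) view.setUint8(offset + i, s.charCodeAt(i));
  };

  writeString(0, "RIFF");
  view.setUint32(4, 36 + dataSize, true);        // RIFF chunk size, little-endian
  writeString(8, "WAVE");
  writeString(12, "fmt ");
  view.setUint32(16, 16, true);                  // fmt chunk size
  view.setUint16(20, 1, true);                   // audio format: PCM
  view.setUint16(22, 1, true);                   // channels: mono
  view.setUint32(24, sampleRate, true);
  view.setUint32(28, sampleRate * 2, true);      // byte rate = sampleRate * blockAlign
  view.setUint16(32, 2, true);                   // block align = channels * 2
  view.setUint16(34, 16, true);                  // bits per sample
  writeString(36, "data");
  view.setUint32(40, dataSize, true);
  for (let i = 0; i < samples.length; i++) {
    view.setInt16(44 + i * 2, samples[i], true); // PCM payload, little-endian
  }
  return new Uint8Array(buffer);
}
```

Note that WAV stores PCM little-endian, which is why the earlier endianness comment matters when assembling the file server-side.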

Sources that helped me:

    • https://subvisual.co/blog/posts/39-tutorial-html-audio-capture-streaming-to-node-js-no-browser-extensions/
    • https://medium.com/@yushulx/learning-how-to-capture-and-record-audio-in-html5-6fe68a769bf9

Note: I didn't include the code for configuring SignalR, since that was not the purpose here.
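
For completeness, the client-side wiring could look roughly like this, assuming the official `@microsoft/signalr` package; the hub URL `/audiohub` and the hub method name are assumptions, not from the answer:

```typescript
import * as signalR from "@microsoft/signalr";

// Minimal client configuration sketch; "/audiohub" is an assumed hub URL.
const connection = new signalR.HubConnectionBuilder()
  .withUrl("/audiohub")
  .withAutomaticReconnect()
  .build();

async function sendChunk(chunk: ArrayBuffer): Promise<void> {
  if (connection.state === signalR.HubConnectionState.Connected) {
    // The default JSON hub protocol is inefficient for binary payloads;
    // the MessagePack protocol avoids the serialization overhead.
    await connection.send("SendStream", chunk);
  }
}

connection.start().catch(err => console.error("SignalR start failed:", err));
```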
