Web Audio API for live streaming?


Yes, the Web Audio API (together with AJAX or WebSockets) can be used for streaming.

Basically, you pull down (or receive, in the case of WebSockets) audio chunks of some fixed length n. Then you decode each chunk with the Web Audio API and queue the resulting buffers to be played one after the other.
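A minimal sketch of that pattern using fetch() and decodeAudioData (the chunk URL scheme is hypothetical; the queue and context names match the playback snippet further down):

const context = new AudioContext();
const audiobuffer = [];  // queue of decoded AudioBuffers waiting to be played

async function pullChunk(n) {
  const response = await fetch('/stream/chunk-' + n);     // hypothetical chunk endpoint
  const encoded = await response.arrayBuffer();            // raw encoded audio bytes
  const decoded = await context.decodeAudioData(encoded);  // decode to an AudioBuffer
  audiobuffer.push(decoded);                                // queue it up for playback
}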

Because the Web Audio API has high-precision timing, you won't hear any "seams" between the playback of each buffer if you do the scheduling correctly.
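One way to do that scheduling (a sketch, not from the original answer) is to keep an explicit playhead on the AudioContext clock and start each buffer exactly where the previous one ends, rather than relying on onended:

let nextStartTime = 0;

function scheduleQueued() {
  while (audiobuffer.length > 0) {
    const source = context.createBufferSource();
    source.buffer = audiobuffer.shift();
    source.connect(context.destination);

    // never schedule in the past; give the first buffer a little headroom
    nextStartTime = Math.max(nextStartTime, context.currentTime + 0.05);
    source.start(nextStartTime);

    // the next buffer begins exactly where this one ends, so there is no audible seam
    nextStartTime += source.buffer.duration;
  }
}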

I wrote a streaming Web Audio API system in which web workers handle all of the WebSocket management and communication with node.js, so the browser's main thread simply renders audio. It works just fine on laptops; mobile browsers lag behind in their implementation of WebSockets inside web workers, so you need at least Lollipop for it to run as coded. I posted the full source code here
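The worker side of that arrangement might look roughly like this (a sketch only; the endpoint URL and message shape are assumptions, not the posted source):

// worker.js — receive encoded chunks over a WebSocket, off the main thread
const ws = new WebSocket('wss://example.com/audio');   // hypothetical endpoint
ws.binaryType = 'arraybuffer';
ws.onmessage = (event) => {
  // transfer the ArrayBuffer to the main thread instead of copying it
  postMessage(event.data, [event.data]);
};

// main thread — decode what the worker hands over and queue it
const worker = new Worker('worker.js');
worker.onmessage = (event) => {
  context.decodeAudioData(event.data).then((decoded) => audiobuffer.push(decoded));
};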

To elaborate on the comments about how to play a series of separate buffers stored in an array, shifting the next one off the front each time:

If you create a buffer source through createBufferSource(), it has an onended event to which you can attach a callback that fires when the source has finished playing. You can do something like this to play the various chunks in the array one after the other:

function play() {
  // queue is empty — the end of the stream has been reached
  if (audiobuffer.length === 0) { return; }
  let source = context.createBufferSource();

  // take the next buffer off the front of the queue
  source.buffer = audiobuffer.shift();
  source.connect(context.destination);

  // run this function again when the current buffer reaches its end,
  // so the next buffer in the queue is played
  source.onended = play;
  source.start();
}
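To kick the chain off, you only need to get at least one decoded buffer into audiobuffer and call play() once; from then on the onended callback keeps draining the front of the array. A minimal sketch, assuming encoded chunks arrive as ArrayBuffers from your transport of choice (the onChunk name is hypothetical):

let started = false;

function onChunk(encodedChunk) {
  context.decodeAudioData(encodedChunk).then((decoded) => {
    audiobuffer.push(decoded);
    if (!started) {       // start the playback chain on the first decoded chunk
      started = true;
      play();
    }
  });
}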

Hope that helps. I'm still experimenting with how to get this all smooth and ironed out, but this is a good start, and it's something a lot of the online posts leave out.
