Send MediaStream object with Web Audio effects over PeerConnection


Question


I'm trying to send audio, obtained by getUserMedia() and altered with the Web Audio API, over a PeerConnection from WebRTC. The Web Audio API and WebRTC seem to have the ability to do this but I'm having trouble understanding how this can be done. Within the Web Audio API, the AudioContext object contains a method createMediaStreamSource(), which provides a way to connect the MediaStream obtained by getUserMedia(). Also, there is a createMediaStreamDestination() method, which seems to return an object with a stream attribute.

I'm getting both audio and video from the getUserMedia() method. What I'm having trouble with is how I would pass this stream object (with both audio and video) into those methods (e.g. createMediaStreamSource()). Do I first need to somehow extract the audio from the stream (getAudioTracks) and find a way to combine it back with the video? Or do I pass it as is and the video is left unaffected? Can the audio only be altered once (before it is added to the PeerConnection)?


Answer 1:


The createMediaStreamSource() method takes a MediaStream object as its parameter and uses the first audio MediaStreamTrack from that object as the audio source. It can be used with the MediaStream object returned by getUserMedia(), even if that object contains both audio and video. For instance:

var source = context.createMediaStreamSource(localStream);

Where "context", in the above code, is an AudioContext object and "localStream" is a MediaStream object obtained from getUserMedia(). The createMediaStreamDestination() method creates a destination node object which has a MediaStream object within its "stream" attribute. This MediaStream object only contains one AudioMediaStreamTrack (even if the input stream to the source contained both audio and video or numerous audio tracks): the altered version of the track obtained from the stream within the source. For instance:

var destination = context.createMediaStreamDestination();

Now, before you can access the stream attribute of the newly created destination variable, you must create the audio graph by connecting all the nodes together. For this example, let's assume we have a BiquadFilterNode named filter:

source.connect(filter);
filter.connect(destination);
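
For completeness, the filter node assumed above can be created from the same AudioContext; a minimal sketch using the standard createBiquadFilter() factory method:

var filter = context.createBiquadFilter();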

Then we can obtain the stream attribute from the destination variable and add it to the PeerConnection object to send to a remote peer:

peerConnection.addStream(destination.stream);

Note: the stream attribute contains a MediaStream object with only the altered audio track; therefore, no video. If you want video to be sent as well, you'll have to add this audio track to a stream object that also contains a video track:

var audioTracks = destination.stream.getAudioTracks();
var track = audioTracks[0]; //stream only contains one audio track
localStream.addTrack(track);
peerConnection.addStream(localStream);

Keep in mind that the addTrack() method will not add a track if the MediaStream object already contains a track with the same id. Therefore, you may have to first remove the original audio track that was fed into the source node.
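
A minimal sketch of that (assuming the localStream and destination variables from the snippets above), using the standard removeTrack() method to drop the unprocessed track before adding the processed one:

// revised sequence: remove the unprocessed microphone track first,
// then add the processed track from the destination node
localStream.removeTrack(localStream.getAudioTracks()[0]);
localStream.addTrack(destination.stream.getAudioTracks()[0]);
peerConnection.addStream(localStream);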

The sound should be able to be altered at any time by adjusting the values within the intermediate nodes (between the source and the destination), because the stream passes through those nodes before being sent to the other peer. Check out this example on dynamically changing the effect on a recorded sound (it should work the same way for a stream). Note: I have not tested this code yet. Although it works in theory, there may be some cross-browser issues, since both the Web Audio API and WebRTC are still working drafts and not yet standardized. I expect it to work in Mozilla Firefox and Google Chrome.
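
As a rough illustration (not part of the original answer, and the parameter values are arbitrary), the filter assumed earlier could be adjusted mid-call and the change would be heard on the remote end:

// audio keeps flowing through the graph, so changes to the node's
// parameters take effect immediately for the remote peer
filter.type = 'lowpass';
filter.frequency.value = 800;   // muffle the outgoing audio

// later, e.g. from a UI control:
filter.frequency.value = 8000;  // restore most of the spectrum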

Reference

  • Media Capture and Streams
  • Web Audio API



Answer 2:


@Android Student's response is good per the current specs; however, there are issues with both Firefox's and Chrome's implementations of those specs.

Last I checked, Chrome couldn't process the output of WebRTC through Web Audio, while Firefox can.

However, there are two bugs blocking Firefox from taking WebAudio-generated source streams in a PeerConnection: one has now been fixed in Nightly and Aurora, and the other should be fixed shortly. Firefox does not yet implement stream.addTrack, which is another complication. Chrome apparently can handle WebAudio-sourced streams in a PeerConnection.



Source: https://stackoverflow.com/questions/26431333/send-mediastream-object-with-web-audio-effects-over-peerconnection
