Question
I'm trying to solve a problem with WebRTC Native Android. I've successfully adapted the AppRTCDemo from a 1-1 call to 1-N calls. Currently I have the following scenario:
- A (me) can talk to/listen to B
- A (me) can talk to/listen to C
- B and C can't talk to or listen to each other.
To accomplish that, I (A) have two PeerConnections, one with B and one with C. I understand I need to somehow mix the media streams or audio tracks; in other words:
Mix (A, B) -> send to C
Mix (A, C) -> send to B
Any pointers on how to accomplish that? Thank you
Answer 1:
I believe you are sending the same localstream to both parties (party-B and party-C) across your connections, because if you try to create a separate local stream for each connection it will throw an error: it would try to access the mic and camera a second time, which (I believe) is not possible.
You must be creating localstream something like this:
stream = sessionFactory.createLocalMediaStream(LOCAL_MEDIA_ID);
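For context, here is a minimal sketch of how that local stream is typically assembled on Android, assuming the org.webrtc API; the identifiers `LOCAL_MEDIA_ID` and `AUDIO_TRACK_ID` are placeholders:

```java
import org.webrtc.*;

// Sketch, assuming the org.webrtc Android API. The mic can only be
// captured once, so build this stream once and attach the same
// instance to every PeerConnection.
class LocalStreamSetup {
    MediaStream createLocalStream(PeerConnectionFactory sessionFactory) {
        AudioSource audioSource =
                sessionFactory.createAudioSource(new MediaConstraints());
        AudioTrack audioTrack =
                sessionFactory.createAudioTrack("AUDIO_TRACK_ID", audioSource);

        MediaStream stream =
                sessionFactory.createLocalMediaStream("LOCAL_MEDIA_ID");
        stream.addTrack(audioTrack);
        return stream;
    }
}
```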
For sending the stream of Party-B to Party-C you can do this:
- step1 -- create a p2p connection between party-A and party-B
- step2 -- create a p2p connection between party-A and party-C
- step3 -- when you add the local stream, call something like this:
peerconnection2.addStream(myStream)
In place of myStream (for the peer connection with C) you can add remoteStreamB (the remote stream you are receiving from B). This adds the stream coming from B as the local stream of the second peer connection, the one with C.
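The relay idea above can be sketched as follows, assuming the org.webrtc Android API; `peerConnectionC` and `remoteStreamB` are illustrative names for A's connection to C and the stream delivered in the A–B connection's onAddStream() callback:

```java
import org.webrtc.*;

// Sketch of the answer's relay approach: A forwards B's stream to C
// by attaching the remote stream as if it were a local one.
class RelaySketch {
    void forwardBToC(PeerConnection peerConnectionC,
                     MediaStream remoteStreamB) {
        // Instead of A's own localstream, attach the stream coming
        // from B, so that C hears B through A.
        peerConnectionC.addStream(remoteStreamB);
    }
}
```

Note this makes A the relay point for all media, so A's uplink bandwidth grows with the number of peers.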
For sending the remote stream from Party-C to Party-B:
- Suppose you have peerconnection1 with Party-B. To add the remote stream from C as the local stream, you first have to remove the current tracks and then add the new ones.
For this you can do something like this:

AudioTrack t1 = myStream.audioTracks.get(0);
VideoTrack v1 = myStream.videoTracks.get(0);
if (myStream.removeTrack(t1)) {
    if (myStream.removeTrack(v1)) {
        t1 = remoteStreamC.audioTracks.get(0);
        v1 = remoteStreamC.videoTracks.get(0);
        myStream.addTrack(t1);
        myStream.addTrack(v1);
    }
}
This way you change the content of the localstream you are sending to B; that stream will now carry the audio and video tracks coming from remoteStreamC.
But to make this process robust you will have to add error handling, because when someone breaks the connection (either B or C) you will get null tracks. In that case you have to immediately send a hangup request to the other party as well.
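A defensive version of the track swap, guarding against the null/empty tracks mentioned above, might look like this; it is a sketch assuming the org.webrtc Android API, and the boolean return signals the caller that the peer is gone and a hangup should be sent:

```java
import org.webrtc.*;
import java.util.List;

// Sketch: swap the audio track of the outgoing stream for the one
// coming from C, but bail out safely if C has already disconnected.
class SafeTrackSwap {
    boolean swapAudio(MediaStream myStream, MediaStream remoteStreamC) {
        List<AudioTrack> remoteAudio = remoteStreamC.audioTracks;
        if (remoteAudio.isEmpty() || remoteAudio.get(0) == null) {
            // C's tracks are gone: caller should hang up the other leg too.
            return false;
        }
        if (!myStream.audioTracks.isEmpty()) {
            myStream.removeTrack(myStream.audioTracks.get(0));
        }
        myStream.addTrack(remoteAudio.get(0));
        return true;
    }
}
```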
Source: https://stackoverflow.com/questions/39579653/android-webrtc-mix-audiotracks-for-conference