Question
I'm building a React.js application that uses the WebRTC APIs to do audio/video calling. When a call is successfully established, an 'onaddstream' event is fired on the RTCPeerConnection instance, carrying the remote stream that I, as the developer, am supposed to attach to a video element in order to display the remote video to the user.
The problem I'm having is figuring out the best way to get the stream from that event to the React component for rendering. I have it working by just dumping the stream into my Redux state, but in this other answer, Dan Abramov (the creator of Redux) mentioned this:
[...] don’t use classes inside the state. They are not serializable as is. [...] Just use plain objects and arrays.
Which leaves me wondering: if I shouldn't put these streams in the Redux state, is there a better way to react to the 'onaddstream' event and get the React component to update?
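For context, here is a simplified version of what I have working now (callReducer, config, and the 'REMOTE_STREAM_ADDED' action type are just my own names):

```js
import { createStore } from 'redux';

const store = createStore(callReducer);               // my Redux store
const peerConnection = new RTCPeerConnection(config); // my call's peer connection

peerConnection.onaddstream = (event) => {
  // This is the part that worries me: event.stream is a MediaStream
  // instance (a class), and it goes straight into the Redux store.
  store.dispatch({ type: 'REMOTE_STREAM_ADDED', stream: event.stream });
};

// The connected component then renders roughly:
// <video autoPlay ref={(video) => { if (video) { video.srcObject = this.props.remoteStream; } }} />
```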
Answer 1:
In my experience, things like socket connections and, as in your case, WebRTC objects are well suited to living inside their own middleware, hand-written for your application. You can wire up all the connection management there, dispatch actions to communicate with the UI, and listen for actions coming from it.
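A rough sketch of the kind of middleware I mean (the action types, the streams registry, and getRemoteStream are placeholder names, not anything provided by Redux itself):

```js
// webrtcMiddleware.js -- keeps MediaStream instances out of the store;
// only plain, serializable actions ever cross the Redux boundary.
const streams = {}; // module-level registry: streamId -> MediaStream

export const getRemoteStream = (id) => streams[id];

export const webrtcMiddleware = (store) => {
  let peerConnection = null;

  return (next) => (action) => {
    switch (action.type) {
      case 'CALL_START':
        peerConnection = new RTCPeerConnection(action.config);
        peerConnection.onaddstream = (event) => {
          // Keep the stream in the registry; the store only sees its id.
          streams[event.stream.id] = event.stream;
          store.dispatch({ type: 'REMOTE_STREAM_ADDED', streamId: event.stream.id });
        };
        break;
      case 'CALL_END':
        if (peerConnection) peerConnection.close();
        peerConnection = null;
        break;
      default:
        break;
    }
    return next(action);
  };
};
```

The component then subscribes only to the serializable streamId from the store and looks the actual MediaStream up with getRemoteStream when it assigns video.srcObject.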
Another option would be to look at redux-saga, which seems to be quite a nice fit for handling complex effects such as sockets and WebRTC.
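For instance, a saga can wrap the 'onaddstream' event in redux-saga's eventChannel and dispatch only plain actions from it; here createStreamChannel and the action type are placeholders:

```js
import { eventChannel } from 'redux-saga';
import { call, put, take } from 'redux-saga/effects';

// Wrap the peer connection's stream events in a redux-saga event channel.
function createStreamChannel(peerConnection) {
  return eventChannel((emit) => {
    peerConnection.onaddstream = (event) => emit(event.stream);
    // Unsubscribe function required by eventChannel.
    return () => { peerConnection.onaddstream = null; };
  });
}

export function* watchRemoteStreams(peerConnection) {
  const channel = yield call(createStreamChannel, peerConnection);
  while (true) {
    const stream = yield take(channel);
    // Keep the MediaStream itself outside the store (e.g. in a registry,
    // as in the middleware sketch) and only put a serializable id into Redux.
    yield put({ type: 'REMOTE_STREAM_ADDED', streamId: stream.id });
  }
}
```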
Source: https://stackoverflow.com/questions/34137783/where-to-store-webrtc-streams-when-building-react-app-with-redux