I want to do several small projects experimenting with video and audio streaming from client to server, and from client to server to multiple endpoints. I have several questions:
I disagree with MarijnS95, because I don't think WebRTC is made especially for browsers. You can use it on any platform and in any server or client application outside a browser. That's the good part.
WebRTC is just a set of protocols that already existed, bundled to provide real-time communications. It's called "web" because Google wanted to make it available and widespread through browsers (and that was a big step in spreading the word)...
So, to answer your questions: WebRTC is better suited than WebSockets for streaming media content, for a few obvious reasons: it runs over UDP (SRTP/RTP) rather than TCP, so one late or lost packet doesn't stall the whole stream; it negotiates audio/video codecs for you (via SDP); and it has built-in congestion control and jitter buffering tuned for real-time media.

So the advantages are clear, but yes, you can also use WebSockets to stream data; you just have to handle framing, ordering, and buffering yourself.
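To make concrete what "handling it yourself" over WebSockets means, here is a minimal sketch (the chunk size and the 4-byte sequence-number framing are my own illustrative choices, not from any spec): it splits a media buffer into sequence-numbered chunks, as you would before sending each one as a binary WebSocket message, and reassembles them on the receiving side. This is bookkeeping that WebRTC's RTP layer does for you.

```python
import struct

CHUNK_SIZE = 4096  # arbitrary choice for this sketch


def make_chunks(data: bytes, chunk_size: int = CHUNK_SIZE):
    """Split a media buffer into sequence-numbered chunks.

    Each chunk is prefixed with a 4-byte big-endian sequence number so
    the receiver can detect gaps and restore order -- work that RTP does
    for you in WebRTC but that WebSockets leave to the application.
    """
    for seq, offset in enumerate(range(0, len(data), chunk_size)):
        payload = data[offset:offset + chunk_size]
        yield struct.pack(">I", seq) + payload


def reassemble(chunks) -> bytes:
    """Rebuild the original buffer from (possibly out-of-order) chunks."""
    parsed = []
    for chunk in chunks:
        seq = struct.unpack(">I", chunk[:4])[0]
        parsed.append((seq, chunk[4:]))
    parsed.sort(key=lambda item: item[0])
    return b"".join(payload for _, payload in parsed)


media = bytes(range(256)) * 64  # 16 KiB of fake media data
chunks = list(make_chunks(media))
# Even if the network delivers chunks out of order, the sequence
# numbers let the receiver reconstruct the original buffer.
assert reassemble(reversed(chunks)) == media
```

Over TCP (which WebSockets use) the chunks arrive in order anyway, but the stream stalls on every retransmission; that head-of-line blocking is exactly what WebRTC's UDP transport avoids.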
I can't find any examples of client-server streaming using WebRTC.
Well, WebRTC uses standard protocols, and you can use standard servers to support it. Do a little searching on Asterisk + WebRTC.
Regarding the multi-point question, the answer is the same: you get better results with WebRTC (whether the media goes through the server or not). The problems with peer-to-peer conferencing are well known, as you stated, and the solution is indeed to use a server to reduce the number of streams to one per client. In an ideal world, you would use an MCU (Multipoint Control Unit) to do this job. That's how it's done.
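To make the MCU idea concrete, here is a toy sketch (the frame format and the sample-averaging mix are simplifications I chose for illustration; a real MCU decodes, mixes, and re-encodes actual codec frames): each client uploads one audio frame, and the server mixes the other participants' frames into a single frame per client, so every endpoint sends and receives exactly one stream instead of the N-1 streams a full mesh requires.

```python
from typing import Dict, List


def mix_frames(frames: List[List[int]]) -> List[int]:
    """Mix several PCM audio frames into one by averaging samples.

    Plain integer lists stand in for decoded audio to keep the sketch
    self-contained.
    """
    n = len(frames)
    return [sum(samples) // n for samples in zip(*frames)]


def mcu_round(uploads: Dict[str, List[int]]) -> Dict[str, List[int]]:
    """One conferencing round on the server.

    Each client sends one frame and gets back one mixed frame containing
    everyone else's audio. With N clients, each endpoint handles one
    upstream and one downstream stream, instead of the N-1 streams
    full-mesh peer-to-peer conferencing needs.
    """
    mixed = {}
    for client in uploads:
        others = [frame for name, frame in uploads.items() if name != client]
        mixed[client] = mix_frames(others)
    return mixed


uploads = {
    "alice": [100, 100, 100],
    "bob":   [0, 50, 100],
    "carol": [20, 20, 20],
}
mixed = mcu_round(uploads)
# alice receives the average of bob's and carol's frames
assert mixed["alice"] == [10, 35, 60]
```

The trade-off is that the server now does the media work (decode, mix, encode), which is why MCU hardware and software tend to be expensive; lighter SFU designs forward streams without mixing them.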