I wish to do several little projects experimenting with video and audio streaming, both from client to server and from client to server to multiple endpoints. I have several questions:
This can be done with WebRTC, not with WebSockets. See Do websockets allow for p2p (browser to browser) communication?
WebRTC: Chrome + Firefox (+ Opera)
WebSockets: Chrome + Firefox + IE + Safari (+ Opera and some others)
WebRTC: UDP (SRTP); a TCP mode via a TURN server is also possible (see the sketch after this list). Hopefully always end-to-end encrypted, but I'm not sure in the case of TURN servers.
WebSockets: TCP, can be secured via HTTPS/WSS, but not end-to-end between peers!
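For illustration, forcing that relayed TCP path is just a matter of the RTCPeerConnection configuration; this is only a sketch, and the TURN URL and credentials below are placeholders, not a real service:

```javascript
// Minimal sketch: force a relayed TCP path through a TURN server.
// turn.example.org and the credentials are placeholders.
const pc = new RTCPeerConnection({
  iceServers: [{
    urls: 'turn:turn.example.org:443?transport=tcp',
    username: 'demo-user',
    credential: 'demo-secret'
  }],
  iceTransportPolicy: 'relay' // only use relayed candidates, skip direct UDP
});
```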
Yes it's possible...
Try using Kurento with WebRTC.
You can find 'one to many' call applications in their documentation, from client to server and server to many clients.
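To give a rough idea of what that looks like with the kurento-client Node.js module, here is a sketch only: the URI is the usual local Kurento Media Server address, and the SDP offer/answer handling with the browsers (via processOffer() and your own signaling) is omitted.

```javascript
// Sketch of a Kurento one-to-many pipeline using the kurento-client Node.js module.
const kurento = require('kurento-client');

kurento('ws://localhost:8888/kurento', (error, kurentoClient) => {
  if (error) return console.error(error);

  kurentoClient.create('MediaPipeline', (error, pipeline) => {
    if (error) return console.error(error);

    // One endpoint receives the presenter's stream...
    pipeline.create('WebRtcEndpoint', (error, presenterEndpoint) => {
      if (error) return console.error(error);

      // ...and each viewer gets its own endpoint fed from the presenter.
      pipeline.create('WebRtcEndpoint', (error, viewerEndpoint) => {
        if (error) return console.error(error);
        presenterEndpoint.connect(viewerEndpoint, (error) => {
          if (error) return console.error(error);
          // SDP offers/answers from the browsers would be handled with
          // processOffer() on each endpoint, exchanged over your own signaling.
        });
      });
    });
  });
});
```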
I disagree with MarijnS95, because I don't think WebRTC is made especially for browsers. You can use it on any platform and in any server or client application outside a browser. That's the good part.
WebRTC is just a set of protocols that already existed, bundled to provide real-time communications. It's called "web" because Google wanted to make it available and widespread through browsers (and that was a big step in spreading the word)...
So, to answer your questions: WebRTC is better than WebSockets for streaming media content, for the obvious reasons.
So, the advantages are obvious, but yes, you can also use WebSockets to stream data.
I can't find any experience of streaming client-server using WebRTC.
Well, WebRTC uses the standard protocols, and you can use standard servers to support it. Do a little search about Asterisk + WebRTC.
Regarding the multi-point question, the answer is the same: you get better results with WebRTC (whether it goes through the server or not). The problems with peer-to-peer conferencing are well known, as you stated, and the solution is indeed to use a server to reduce the number of streams to one per client. In an ideal world, you would use an MCU to do this job. That's how it's done.
I don't know whether a clear answer is still requested for this question, but I wanted to do similar things.
I personally used Node.js in combination with the following Node.js plug-in to enable WebRTC on the server side: node-webrtc. It's only supported on Linux and Mac OS X right now, but it allowed me to quickly set up a WebRTC server. You could then use that server to distribute your stream to other peers, connected via WebSockets, WebRTC, or something else.
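As an illustration, a minimal server-side sketch with node-webrtc (the wrtc module) could look roughly like this, assuming a recent version with the promise-based API. The signaling part (how offerSdp arrives and how the answer gets back to the browser, e.g. over WebSockets) is left out, and sendAnswerToBrowser is just a placeholder callback:

```javascript
// Sketch: answering a browser's WebRTC offer from a Node.js process using node-webrtc.
const { RTCPeerConnection, RTCSessionDescription } = require('wrtc');

async function answerOffer(offerSdp, sendAnswerToBrowser) {
  const pc = new RTCPeerConnection();

  // Data channels opened by the browser show up here.
  pc.ondatachannel = (event) => {
    event.channel.onmessage = (msg) => console.log('from browser:', msg.data);
  };

  await pc.setRemoteDescription(new RTCSessionDescription({ type: 'offer', sdp: offerSdp }));
  const answer = await pc.createAnswer();
  await pc.setLocalDescription(answer);

  // Send the answer SDP back over your own signaling channel (placeholder callback).
  sendAnswerToBrowser(pc.localDescription.sdp);
  return pc;
}
```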
The source code is also freely available from the WebRTC webpage. So you can build a native application yourself that acts as a server if you want.
WebRTC is made for browsers. You said it right:
the benefit of WebRTC is to avoid middle communication
It is made especially for browsers. You can make a connection to your server, but then the server needs a UI (e.g. an Ubuntu server with a GUI) and a browser installed (which is probably not what you want). Apart from that, there is no other way to stream to your server than WebSockets.
According to the other answer, it is possible to stream to and from RTC enabled servers
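For the WebSocket route mentioned above, a minimal browser-side sketch could look like this, assuming the browser supports the MediaRecorder API; the wss:// URL is a placeholder, and the server would have to reassemble or transcode the chunks itself:

```javascript
// Sketch: capture the camera/microphone and push encoded chunks to a server over WebSockets.
const ws = new WebSocket('wss://example.org/upload'); // placeholder endpoint

navigator.mediaDevices.getUserMedia({ audio: true, video: true }).then((stream) => {
  const recorder = new MediaRecorder(stream);
  recorder.ondataavailable = (event) => {
    if (event.data.size > 0 && ws.readyState === WebSocket.OPEN) {
      ws.send(event.data); // send each encoded chunk as a binary Blob
    }
  };
  recorder.start(1000); // emit a chunk roughly every second
});
```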
About multiple streams, that is a bit harder. I answered a question about using WebRTC for multiple peer-to-peer connections. Maybe that is what you can do, but it will require good code to manage who connects to whom, and to chain your users. A server is the best option in that case (and if you want browser compatibility, WebSockets are a bit better supported than WebRTC; even IE supports them now (?!?!)).
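Just to illustrate the bookkeeping involved, a browser-side sketch might keep one RTCPeerConnection per remote peer in a map; sendSignal and showRemoteStream are hypothetical helpers standing in for your own signaling and UI code:

```javascript
// Sketch: one RTCPeerConnection per remote peer, keyed by a peer id from your signaling.
const peers = new Map();

function connectToPeer(peerId, localStream, sendSignal) {
  const pc = new RTCPeerConnection();
  // Send our local tracks to this peer.
  localStream.getTracks().forEach((track) => pc.addTrack(track, localStream));
  // Forward ICE candidates to the other side via your own signaling (placeholder helper).
  pc.onicecandidate = (event) => {
    if (event.candidate) sendSignal(peerId, { candidate: event.candidate });
  };
  // Remote media from this peer arrives here (placeholder UI helper).
  pc.ontrack = (event) => showRemoteStream(peerId, event.streams[0]);
  peers.set(peerId, pc);
  return pc;
}
```

Each additional participant means another connection and another outgoing copy of your stream, which is exactly the scaling problem the server/MCU approach described earlier avoids.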
You might want a conclusion: please note that WebRTC requires some data exchange (signaling) before it can start, and you can do that fairly easily with a WebSocket server in Node.js.
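A minimal sketch of such a signaling relay using the ws package could look like this; the port and the naive broadcast-to-everyone-else policy are just assumptions to show the idea:

```javascript
// Sketch: a tiny WebSocket signaling relay in Node.js using the 'ws' package.
// It simply forwards every message (SDP offers/answers, ICE candidates) to all other clients.
const WebSocket = require('ws');

const wss = new WebSocket.Server({ port: 8080 });

wss.on('connection', (socket) => {
  socket.on('message', (message) => {
    // Relay to every other connected client; a real app would route by peer id.
    wss.clients.forEach((client) => {
      if (client !== socket && client.readyState === WebSocket.OPEN) {
        client.send(message.toString());
      }
    });
  });
});
```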
Hope this will help you further in your development process, and I hope to hear from you what kind of solution you are going to use!