I'm assuming that WebRTC is an API that decodes/encodes audio and video, although the communication between the server and the clients is done via WebSockets, or some other network protocol?
No, signaling is not defined by WebRTC.
Here is a post on the IETF mailing list that explains pretty well why it is not: http://www.ietf.org/mail-archive/web/rtcweb/current/msg01143.html
This means you are free to choose how you exchange network information. For example, you could use WebSockets, HTTP, or even email, though that last one would be a bit of a struggle :)
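To illustrate that freedom, here is a rough sketch showing that the information to exchange is just text. The deliverToPeer callback is a hypothetical placeholder, and the wss:// URL is an assumption; any channel that can carry a string to the other peer would do.

```typescript
// Sketch: the signaling payload is just text. WebRTC hands you an SDP blob
// and leaves delivery entirely up to you. `deliverToPeer` is a hypothetical
// placeholder; it could write to a WebSocket, POST over HTTP, or even paste
// the blob into an email body.
const pc = new RTCPeerConnection();

async function createOfferBlob(
  deliverToPeer: (payload: string) => void
): Promise<void> {
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  // Plain text: any channel that can move a string to the other peer will do.
  deliverToPeer(JSON.stringify({ type: offer.type, sdp: offer.sdp }));
}

// One possible transport: a WebSocket to a hypothetical signaling server.
const socket = new WebSocket("wss://example.com/signal");
socket.onopen = () => createOfferBlob((payload) => socket.send(payload));
```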
There are two sides to WebRTC. The first is getUserMedia, an API that allows an app to access camera and microphone hardware. You can use this access to simply display the stream locally (perhaps applying effects), or send the stream over the network. You could send the data to your server, or you could use the second API, PeerConnection, which allows browsers to establish direct peer-to-peer socket connections. You can establish a connection directly to someone else's browser and exchange data directly. This is very useful for high-bandwidth data like video, where you don't want your server to have to relay large amounts of data.

Take a look at the demos to see both parts of WebRTC in action.
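Here is a minimal sketch of both halves, using the promise-based navigator.mediaDevices.getUserMedia; the video element id "preview" is an assumption for local playback:

```typescript
// A minimal sketch of both halves described above, using the promise-based
// navigator.mediaDevices.getUserMedia. The <video id="preview"> element is
// an assumption for local playback.
const pc = new RTCPeerConnection();

async function startCapture(): Promise<void> {
  const stream = await navigator.mediaDevices.getUserMedia({
    video: true,
    audio: true,
  });

  // Side one: display the stream locally (effects could be applied here).
  const preview = document.getElementById("preview") as HTMLVideoElement;
  preview.srcObject = stream;
  await preview.play();

  // Side two: hand the tracks to PeerConnection for direct peer-to-peer
  // transport, so your server never has to relay the media.
  for (const track of stream.getTracks()) {
    pc.addTrack(track, stream);
  }
}
```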
So in a nutshell: PeerConnection allows full-duplex communication between two browsers.

Instead of PeerConnection you can also look at the WebRTC data channel draft: http://tools.ietf.org/html/draft-jesup-rtcweb-data-protocol-00, which is basically bidirectional UDP. It can be a really valuable alternative to WebSockets, as it doesn't have the "negative" sides of a TCP connection (such as head-of-line blocking).
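As a sketch of where that draft ended up: it shipped in browsers as the RTCDataChannel API (SCTP over DTLS, rather than raw UDP). The channel labels and handlers below are illustrative assumptions, not anything the API requires:

```typescript
// Sketch of the data channel as it shipped in browsers: the RTCDataChannel
// API (SCTP over DTLS, rather than raw UDP). Channel labels and handlers
// below are illustrative assumptions.
const pc = new RTCPeerConnection();
const chat = pc.createDataChannel("chat");

chat.onopen = () => {
  chat.send("hello over a direct peer-to-peer channel");
};

chat.onmessage = (event: MessageEvent) => {
  console.log("received:", event.data);
};

// The UDP-like behaviour (no retransmits, so no head-of-line blocking) is
// opt-in per channel:
const telemetry = pc.createDataChannel("telemetry", {
  ordered: false,
  maxRetransmits: 0,
});
```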
WebRTC uses RTP (a UDP-based protocol) for media transport, but it requires an out-of-band signaling channel to set up the communication. One option for the signaling channel is WebSocket.
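To make that split concrete, here is a rough sketch of such a signaling channel over WebSocket. The wss:// URL, the message shape, and the STUN server are assumptions rather than anything WebRTC prescribes, and the RTP media itself never travels over this socket:

```typescript
// Sketch of a signaling channel over WebSocket: SDP offers/answers and ICE
// candidates travel over the socket, while the RTP media flows peer-to-peer.
// The wss:// URL, message shape, and STUN server are assumptions, not
// anything WebRTC itself prescribes.
const signaling = new WebSocket("wss://example.com/room/42");
const pc = new RTCPeerConnection({
  iceServers: [{ urls: "stun:stun.l.google.com:19302" }],
});

// Trickle ICE: forward each network candidate as the browser discovers it.
pc.onicecandidate = ({ candidate }) => {
  if (candidate) {
    signaling.send(JSON.stringify({ type: "candidate", candidate }));
  }
};

signaling.onmessage = async (event: MessageEvent) => {
  const msg = JSON.parse(event.data);
  switch (msg.type) {
    case "offer": {
      await pc.setRemoteDescription({ type: "offer", sdp: msg.sdp });
      const answer = await pc.createAnswer();
      await pc.setLocalDescription(answer);
      signaling.send(JSON.stringify({ type: "answer", sdp: answer.sdp }));
      break;
    }
    case "answer":
      await pc.setRemoteDescription({ type: "answer", sdp: msg.sdp });
      break;
    case "candidate":
      await pc.addIceCandidate(msg.candidate);
      break;
  }
};
```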