[websocket] WebRTC vs WebSockets: If WebRTC can do Video, Audio, and Data, why do I need WebSockets?

WebRTC is designed for high-performance, high-quality communication of video, audio, and arbitrary data. In other words, for apps exactly like what you describe.

WebRTC apps need a service via which they can exchange network and media metadata, a process known as signaling. However, once signaling has taken place, video/audio/data is streamed directly between clients, avoiding the performance cost of streaming via an intermediary server.
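To make that concrete, here is a minimal browser-side sketch (the `sendToRemotePeer` function and the STUN server URL are illustrative assumptions, not part of any particular library): the peer connection produces session and network metadata that has to reach the other peer through some signaling service before direct streaming can begin.

```typescript
function sendToRemotePeer(message: object): void {
  // Hypothetical placeholder: deliver `message` to the other peer through
  // whatever signaling service you run (HTTP polling, WebSocket, etc.).
  console.log("would send via signaling service:", JSON.stringify(message));
}

async function startCall(): Promise<void> {
  const pc = new RTCPeerConnection({
    iceServers: [{ urls: "stun:stun.l.google.com:19302" }], // commonly used public STUN server
  });

  // Arbitrary application data flows over a data channel; audio/video tracks are added similarly.
  const channel = pc.createDataChannel("app-data");
  channel.onopen = () => channel.send("hello directly from peer to peer");

  // Network metadata (ICE candidates) must reach the remote peer via the signaling service.
  pc.onicecandidate = (event) => {
    if (event.candidate) sendToRemotePeer({ type: "candidate", candidate: event.candidate });
  };

  // Session metadata (the SDP offer) travels over the signaling service as well.
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  sendToRemotePeer({ type: "offer", sdp: pc.localDescription });
}

startCall();
```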

WebSocket, on the other hand, is designed for bidirectional communication between client and server. It is possible to stream audio and video over WebSocket (see here for example), but the technology and APIs are not inherently designed for efficient, robust streaming in the way that WebRTC is.
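By way of contrast, a bare-bones WebSocket client looks like this (the URL is a made-up example): you get a persistent, bidirectional message pipe through the server, but nothing that handles codecs, media congestion control, or peer-to-peer transport for you.

```typescript
// Plain WebSocket: every message passes through the server, never peer to peer.
const ws = new WebSocket("wss://example.com/chat"); // made-up endpoint for illustration

ws.onopen = () => {
  ws.send(JSON.stringify({ type: "hello", from: "client" }));
};

ws.onmessage = (event) => {
  console.log("message relayed by the server:", event.data);
};

ws.onclose = () => console.log("connection closed");
```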

As other replies have said, WebSocket can be used for signaling.
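A rough sketch of that pattern, assuming a hypothetical relay endpoint and made-up message shapes (not a standard protocol): the WebSocket only carries the offer/answer and ICE candidates, while media and data still flow directly between peers once the connection is established.

```typescript
// Assumes a tiny signaling server that simply relays messages between the two peers.
const signaling = new WebSocket("wss://example.com/signaling"); // hypothetical endpoint
const pc = new RTCPeerConnection();

// Forward our ICE candidates to the remote peer through the signaling server.
pc.onicecandidate = (event) => {
  if (event.candidate) {
    signaling.send(JSON.stringify({ type: "candidate", candidate: event.candidate }));
  }
};

// Apply whatever the remote peer sends back through the signaling server.
signaling.onmessage = async (event) => {
  const msg = JSON.parse(event.data);
  if (msg.type === "offer") {
    await pc.setRemoteDescription(msg.sdp);
    const answer = await pc.createAnswer();
    await pc.setLocalDescription(answer);
    signaling.send(JSON.stringify({ type: "answer", sdp: pc.localDescription }));
  } else if (msg.type === "answer") {
    await pc.setRemoteDescription(msg.sdp);
  } else if (msg.type === "candidate") {
    await pc.addIceCandidate(msg.candidate);
  }
};
```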

I maintain a list of WebRTC resources: I strongly recommend you start by looking at the 2013 Google I/O presentation about WebRTC.