IceLink and WebRTC
by Anton Venema, on January 16, 2013
IceLink has always been fundamentally compatible with WebRTC. Since its initial release, we have developed the peer connection negotiation algorithm to be completely interoperable with the open standard being developed by the team working on Google Chrome. By working from the same RFC specifications from day one, we have ensured that IceLink keeps as many options open as possible for you when it comes to interoperating with third-party libraries.
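For a sense of what that negotiation agrees on: both peers exchange session descriptions in the SDP format (RFC 4566) using the offer/answer model (RFC 3264). An audio media section from a typical WebRTC offer looks roughly like this (all values here are illustrative, not output from IceLink):

```
m=audio 54321 RTP/SAVPF 111
c=IN IP4 0.0.0.0
a=rtpmap:111 opus/48000/2
a=ice-ufrag:F7gI
a=ice-pwd:x9cml/YzichV2+XlhiMu8g
a=fingerprint:sha-256 D1:2C:...
a=setup:actpass
```

Every line here is something both endpoints must interpret identically: the codec and its payload type, the ICE credentials for connectivity checks, and the DTLS fingerprint for securing the media channel.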
But getting a WebRTC session going involves a lot more. In fact, you need twelve (!) additional components:
- An audio capture engine that can read raw audio samples from the device microphone.
- An audio render engine that can play back audio samples to the device speakers/headset.
- A video capture engine that can grab raw images from the device camera.
- A video render engine that can play back raw images to a visible on-screen container.
- An audio encoding engine that can convert raw audio samples to compressed frames.
- An audio decoding engine that can convert compressed frames back to raw audio samples.
- A video encoding engine that can convert raw images to compressed frames.
- A video decoding engine that can convert compressed frames back to raw images.
- An audio packetizer that can convert compressed frames into a sequence of RTP packets.
- An audio depacketizer that can convert a sequence of RTP packets back to compressed frames.
- A video packetizer that can convert compressed frames into a sequence of RTP packets.
- A video depacketizer that can convert a sequence of RTP packets back to compressed frames.
Basically, the sequence looks like this:
Mic > Capture > Encode > Packetize > Network > Depacketize > Decode > Render > Speakers
Camera > Capture > Encode > Packetize > Network > Depacketize > Decode > Render > Screen
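To make the packetize/depacketize stages concrete, here is a minimal sketch in Java. The class and method names are hypothetical (this is not IceLink's actual API), and the "packets" carry payload only; a real RTP packetizer also prepends a header to each chunk, and a real pipeline would have codec stages on either side.

```java
import java.io.ByteArrayOutputStream;
import java.util.ArrayList;
import java.util.List;

public class MediaPipeline {
    // Split an encoded frame into fixed-size chunks -- a stand-in for RTP
    // packetization, which keeps each packet under the network MTU.
    static List<byte[]> packetize(byte[] frame, int maxPayload) {
        List<byte[]> packets = new ArrayList<>();
        for (int off = 0; off < frame.length; off += maxPayload) {
            int len = Math.min(maxPayload, frame.length - off);
            byte[] packet = new byte[len];
            System.arraycopy(frame, off, packet, 0, len);
            packets.add(packet);
        }
        return packets;
    }

    // Reassemble the original encoded frame from its packets (the
    // depacketizer's job on the receiving side, before decoding).
    static byte[] depacketize(List<byte[]> packets) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        for (byte[] packet : packets) {
            out.write(packet, 0, packet.length);
        }
        return out.toByteArray();
    }

    public static void main(String[] args) {
        byte[] frame = new byte[3000]; // pretend: one encoded video frame
        for (int i = 0; i < frame.length; i++) {
            frame[i] = (byte) i;
        }
        // 1200 bytes is a common payload budget that stays under typical MTUs.
        List<byte[]> packets = packetize(frame, 1200);
        byte[] rebuilt = depacketize(packets);
        System.out.println(packets.size());                          // 3
        System.out.println(java.util.Arrays.equals(frame, rebuilt)); // true
    }
}
```

The round trip only works because both sides agree on how frames map to packets, which is exactly why the next point matters.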
Phew! That's an exhaustive list, but it doesn't end there. The encoding, decoding, packetizing, and depacketizing have to line up exactly with other implementations if you expect them to talk to each other. Unless the packet format and payload contents are constructed the same way, cross-communication between libraries is impossible.
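The RTP header itself is a good illustration of how little room there is for disagreement. Here is a Java sketch of the fixed 12-byte header defined in RFC 3550 (the payload type 111 for Opus is just an example; dynamic payload types are assigned during the SDP exchange, and this is not IceLink's API):

```java
import java.nio.ByteBuffer;

public class RtpHeader {
    // Build the fixed 12-byte RTP header from RFC 3550. If two libraries
    // disagree on any of these fields -- or on how the codec payload that
    // follows is laid out -- their media streams cannot interoperate.
    static byte[] build(int payloadType, int sequence, long timestamp,
                        long ssrc, boolean marker) {
        ByteBuffer buf = ByteBuffer.allocate(12); // big-endian by default
        buf.put((byte) 0x80);                     // V=2, P=0, X=0, CC=0
        buf.put((byte) ((marker ? 0x80 : 0) | (payloadType & 0x7F)));
        buf.putShort((short) sequence);           // sequence number
        buf.putInt((int) timestamp);              // media clock timestamp
        buf.putInt((int) ssrc);                   // synchronization source id
        return buf.array();
    }

    public static void main(String[] args) {
        // Example values: payload type 111, first packet, 160-sample step.
        byte[] header = build(111, 1, 160, 0x12345678L, false);
        System.out.println(header.length);                        // 12
        System.out.println(Integer.toHexString(header[0] & 0xFF)); // 80
    }
}
```

Even a one-bit difference in how two implementations fill these fields (or in how they fragment a frame across payloads) is enough to break decoding on the far side.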
Other platforms are coming soon (mobile devices are a priority), including a drop-in Java .jar file for web browsers that will let you use native WebRTC functionality when supported and otherwise gracefully fall back to the Java plugin.
The latest downloads for IceLink and IceLink+WebSync include a WebRTC example that lets you fire up a video chat between a desktop application and a web browser with just a few (very) simple lines of code. Check it out and let us know what you think!