Host Arin Sime was live from the RTC.ON 2025 conference in Krakow, Poland, where he had short discussions with three of the event speakers:
- Will Law, Chief Architect / Akamai. Conference presentation: A QUIC update on MOQ and WebTransport
- Cezary Siwek, Staff Engineer / Stream. Conference presentation: Challenges in Realtime livestreaming at 4k / 60FPS
- Christoph Guttandin, Developer / Media Codings. Conference presentation: Designing a media container library for the web
Read our conference wrap up: WebRTC.ventures Visits RTC.ON 2025
Key insights and episode highlights below.
Watch Episode 105!
Key Insights
⚡ MoQ is the next-generation foundation for real-time streaming and data transfer. WebRTC has proven that real-time video can work for anyone, anywhere, but as we push its limits, MoQ steps in with significant advantages, including tunable latency, caching, and payload flexibility. Will puts it nicely,
“WebRTC is a fantastically robust ecosystem. It doesn’t feel like it when you’re working in the innards of it. But the reality is, I go to any browser and we take for granted that I can just chat with my parents on the other side of the world. They’re not technically advanced, so the system is robust enough and simple enough that we can have video conferences with anyone, anywhere. And I think that’s full credit to WebRTC. So it’s from there, as we start to push the boundaries and we want to decouple the transport from the codecs from the signaling, tease those apart, and optimize them in different ways, that’s where MoQ allows you to do that, and that’s where it starts getting interesting.”
⚡ The demand for 4K/60 FPS live streaming is rising, but it comes with some big challenges. Delivering 4K video at scale isn’t just about higher resolution; it’s about solving problems across the entire streaming pipeline. As Cezary explains,
“There are lots of challenges in terms of 4K. First of all, many of the devices, especially the devices you want to watch it on, are not ready for 4K. But this is something that we have to deal with; we need to provide them an additional way to watch it. I think this is the most common issue that people have: they want to watch 4K, but they are not able to. Bandwidth is another issue that has to be solved. To provide good-quality video, you need a lot of bandwidth. From the production side, there are issues with generating video at high quality and streaming it to us so we can redistribute it. It’s quite complex. Also, scale: if we’re talking about live streaming and you want the stream to be watchable by thousands of customers without any interruption, that’s another challenge.”
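The bandwidth problem Cezary mentions is easy to put rough numbers on. The sketch below is a back-of-the-envelope estimate only; the bits-per-pixel factor is an assumption (roughly 0.05–0.10 bpp is a common ballpark for modern codecs on live content), not a figure from the episode.

```python
# Back-of-the-envelope bitrate estimate for a live stream.
# The bits-per-pixel (bpp) factor is an assumption: roughly 0.05-0.10
# bpp is a typical ballpark for modern codecs on live content.

def estimate_bitrate_mbps(width: int, height: int, fps: int, bpp: float) -> float:
    """Return an estimated video bitrate in megabits per second."""
    return width * height * fps * bpp / 1_000_000

# 4K (2160p) at 60 fps with an assumed 0.06 bpp:
uhd = estimate_bitrate_mbps(3840, 2160, 60, 0.06)   # ~29.9 Mbps

# Compare with 1080p at 30 fps under the same assumption:
fhd = estimate_bitrate_mbps(1920, 1080, 30, 0.06)   # ~3.7 Mbps

print(f"4K/60 fps:    {uhd:.1f} Mbps")
print(f"1080p/30 fps: {fhd:.1f} Mbps")
```

Under these assumptions, 4K/60 needs roughly eight times the bandwidth of 1080p/30, which is why both the viewer's connection and the distribution network become bottlenecks at scale.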
⚡ The current state of the WebCodecs API. As developers explore the possibilities of in-browser media processing, understanding the current capabilities and limitations of the WebCodecs API becomes crucial. Christoph shares his perspective on where the API stands today and what could make it even more powerful: “When WebCodecs came along, you could reduce the FFmpeg build to just handle the parsing of the media containers and do the actual processing with WebCodecs, and that has been done as well. But still, I think it’s not flexible enough for the use cases that I had in mind, at least, and that’s why I thought it would be much better to build a dedicated library for it, which does all that in JavaScript and works nicely with today’s bundlers and frontend frameworks. That was kind of my wish list.”
Episode Highlights
MoQ is no longer just about media.
MoQ isn’t a replacement for WebRTC but an adjacent technology that extends real-time data distribution far beyond audio and video. Will explains,
“I think if you’re coming from a WebRTC world, it’s a very interesting adjacent technology. And I’m not here by any means to say, drop WebRTC and move to MoQ. I’m saying MoQ brings you a lot of tools and features that you will find very interesting. And if you come from a WebRTC background, you will appreciate them far faster than some people who have no idea what I’m talking about. So yeah, the name Media over QUIC limits it to media, but it doesn’t have to be media. It’s payload agnostic. So it’s moving arbitrary binary payloads in a Pub/Sub network very efficiently and smoothly. And video is an excellent candidate for payload, but it can be database updates. It can be GPS coordinates of race cars around a track. It can be security system updates. There’s a lot of linear timeline information that I want to move between different nodes, and I think that’s the market application of MoQ as it extends beyond audio and video.”
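The payload-agnostic pub/sub model Will describes can be sketched in a few lines. To be clear, this is not the MoQ wire protocol (which runs over QUIC with object, group, and track semantics defined in the IETF drafts); it is only a toy, in-process illustration of a relay forwarding arbitrary binary payloads on named tracks, with track names invented for the example.

```python
# Toy in-process sketch of a payload-agnostic pub/sub relay.
# NOT the MoQ protocol itself; it only illustrates the model of
# subscribers receiving arbitrary binary objects on named tracks.

from collections import defaultdict
from typing import Callable

class Relay:
    def __init__(self) -> None:
        self._subs: dict[str, list[Callable[[bytes], None]]] = defaultdict(list)

    def subscribe(self, track: str, on_object: Callable[[bytes], None]) -> None:
        self._subs[track].append(on_object)

    def publish(self, track: str, payload: bytes) -> None:
        # The relay never inspects the payload: video frames, GPS
        # coordinates, and database updates all travel the same way.
        for deliver in self._subs[track]:
            deliver(payload)

relay = Relay()
received: list[bytes] = []
relay.subscribe("race/car-7/gps", received.append)  # hypothetical track name

relay.publish("race/car-7/gps", b'{"lat": 50.06, "lon": 19.94}')
relay.publish("stadium/cam-1/video", b"\x00\x01")  # no subscriber on this track

print(received)  # only the GPS object was delivered
```

The point of the sketch is the one Will makes: because the relay treats every object as opaque bytes, the same distribution fabric carries video, telemetry, or database updates without changes.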
Do you really need 4K for live streaming?
When you’re just getting started with real-time streaming, it’s tempting to think 4K/60 FPS is the obvious choice. But is it always the right one? That’s the question Arin asked Cezary, a streaming expert, at the RTC.ON conference: “For someone who is new to streaming media in real time, what sort of advice would you share with them? If they’re doing it for the first time, should they be asking, ‘Do I need 4K/60 FPS?’”
Cezary explains, “They should really think about whether they really need 4K. I think it’s still early; we’ve been with 1080p for many years, and it worked well. You probably remember when 4K TVs started appearing, and it was a novelty. Everyone tried to use it, but it turned out most of the content wasn’t 4K anyway. So we still need to think about whether 4K is required. If it is, the content producer needs to be ready for it: you need good equipment and good internet connectivity. And you need to think about the audience.”
What is the WebCodecs API and why is it important for real-time apps?
Real-time web developers now have a powerful tool at their fingertips: the WebCodecs API. Arin asked Christoph about it: “Let’s start, for those not familiar with it, can you give us a quick description of the WebCodecs API and why a real-time developer might be interested in using that?”
Christoph explains, “It’s access to the low-level encoding and decoding capabilities in the browser that have always been there, but somewhere buried in the video element or in the WebRTC stack. Now you can get access to them and pipe raw samples or raw pixels from the canvas into it and get encoded data, or the other way round. It’s a bit like FFmpeg in the browser.”
Up Next! WebRTC Live #106:
Rearchitecting Your WebRTC Application to Integrate LLMs
Wednesday, October 22 at 12:30 pm Eastern