Every so often I see the following question:
How can I stream a webcam to a Jitsi Meet room?
Wait, isn’t this just like “joining” a room? Not quite. One may want to have some dedicated contraption streaming a webcam to a room. Think of a homemade security camera system or something similar.
This probably gets asked for other WebRTC conferencing services too. The way I see it, there are 2 ways to go about it:
- Build a way (on the server) to accept a stream sent with ffmpeg or gstreamer, and broadcast that.
- Use a browser.
I know what you’re thinking: “Saúl, running an entire browser is overkill!”. Maybe. It will consume more RAM, yeah, but I argue the dominating factor here is video capture and encoding, not JavaScript execution, the DOM and other shenanigans.
Also, when all you have is a hammer, everything looks like a nail.
With the advent of headless mode in both Chrome and Firefox, this option looks more enticing than ever, so let’s roll up our sleeves and give it a shot.
I’m going to use Google’s puppeteer library, which drives headless Chrome, to join a Jitsi Meet room. Since this is a headless client, we can cut down some of Jitsi Meet’s features in order to reduce the required resources:
- No need to receive video
- Disable simulcast (only encode video once)
- No audio levels
I could probably add some more, but those should be enough to make a difference. The astute amongst you may think “but Saúl, disabling simulcast means every streamer will send an HD stream, I can’t cope with so many!”. Great point! Here we rely on adaptivity, so there is no need to worry: if a client can only receive a single HD stream, the rest will be suspended, and you can still switch between them just fine!
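As an illustration, Jitsi Meet lets you override config options through the URL fragment, so the cuts above can be expressed when building the room URL. A minimal sketch (the exact option names are assumptions here; check them against your deployment’s config.js):

```javascript
// Build a Jitsi Meet room URL with config overrides in the URL fragment.
// The option names used below are assumptions and must match what the
// deployment's config.js actually supports.
function buildRoomUrl(baseUrl, room, overrides) {
  const params = Object.entries(overrides)
    .map(([key, value]) => `config.${key}=${value}`)
    .join('&');
  return `${baseUrl}/${room}#${params}`;
}

// 'mycamroom' is a hypothetical room name.
const url = buildRoomUrl('https://meet.jit.si', 'mycamroom', {
  channelLastN: 0,          // don't receive any remote video
  disableSimulcast: true,   // only encode video once
  disableAudioLevels: true  // skip audio level computation
});
// → https://meet.jit.si/mycamroom#config.channelLastN=0&config.disableSimulcast=true&config.disableAudioLevels=true
```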
Here is the code (I also got to play with async / await for the first time, which is pretty cool):
https://gist.github.com/saghul/179feba3df9f12ddf316decd0181b03e
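The gist above has the full details; the core of the approach boils down to something like this sketch (the room URL is a hypothetical placeholder, and `--use-fake-ui-for-media-stream` is the Chrome switch that skips the camera/microphone permission prompt):

```javascript
// Chrome switches for unattended media capture: without the fake-UI
// switch, getUserMedia would block waiting on a permission prompt
// nobody can click in a headless session.
const chromeArgs = [
  '--use-fake-ui-for-media-stream'
];

async function streamToRoom(roomUrl) {
  // Lazy require so the sketch stays self-contained;
  // puppeteer must be installed (npm install puppeteer).
  const puppeteer = require('puppeteer');
  const browser = await puppeteer.launch({ headless: true, args: chromeArgs });
  const page = await browser.newPage();
  await page.goto(roomUrl, { waitUntil: 'networkidle2' });
  return browser; // keep the browser alive; closing it leaves the room
}

// streamToRoom('https://meet.jit.si/mycamroom#config.disableSimulcast=true');
```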
My original intent here was to use some inexpensive (and not very powerful) device such as the Raspberry Pi, but alas puppeteer doesn’t yet support ARM devices 🙁
Happy streaming!