Proof-of-Concept Mobile LiveStreaming with a Raspberry Pi 4

Early 2023 Edit: since I wrote this post, I discovered the Nvidia Jetson series of systems, and found a real winner with the Jetson Nano. I’ve been meaning to document some of my journey with the Jetson Nano, but haven’t gotten it finished yet, and a few people asked if I was going to update things.

Original Post:

I’m an avid birder, and I enjoy evangelizing birding to basically anyone who will listen. I also like technology, so the intersection of birding and technology is something I especially like. What does this have to do with livestreaming from a Raspberry Pi? That requires a little bit of backstory.

PAX West (formerly PAX Prime, formerly PAX) has traditionally been very bad for cellular connectivity at the main Expo Hall. Having 60,000 people in one spot tends to do that to North American cellular infrastructure. While at PAX South in 2017, I started talking to a Twitch employee about the difficulties of livestreaming in busy environments or in places with marginal service. They liked the idea, mentioned one of their team was working on it, and gave me their card. Unfortunately, my email went unanswered and my interests shifted to birding, which I discovered shortly after PAX South.

In the fall of 2018, I started wondering if livestreaming birding would be technically feasible without expensive (and heavy) broadcast-grade equipment, because it seemed like a great way to share a hobby I love with a wider audience. I decided that I’d need at least two cameras: one for closeups on birds and a wider one to show the area, plus possibly a camera showing what I saw through binoculars. I started doing some research and found that other people had had the same sort of idea (GunRun’s IRL Backpack and DiY versions), but the $2,500 (USD) to get a single camera using a LiveU Solo with bandwidth via UnlimitedIRL was way out of my price range, and I’d still need some way to connect two (or more) cameras to it.

Do It Yourself (DiY)

So, what requirements did I settle on?

  • Two cameras: one wide to show the area I’m birding in and one telephoto camera to get nice closeups of birds.
  • 1080p video running at 30fps and up to 6Mb/s bitrate.
  • Ability to switch between the two cameras locally.
  • Not $2,500+.
  • Light enough that I can carry it around with me for a full day of birding.
  • Resilient network connection.

Why these requirements?
Two cameras is obvious: one shoulder-mounted (or similar) camera for wide views of the area I’m birding and a zoom/telephoto camera to get nice, close-in views of the birds.
1080p because birding is going to require a lot of fine detail, and 720p just isn’t high enough resolution. 30 FPS is the absolute bare minimum for video to not look jerky, and 6Mb/s is as high as Twitch will ingest.
Being able to switch between the two cameras locally means that I won’t have to stream 12Mb/s out at all times or worry about switching streams on a remote OBS setup. This also saves on bandwidth costs and doesn’t require as much upstream bandwidth.
Low cost is also important, because I definitely didn’t want to spend a bunch of money on this.
Weight is also a factor, as my birding outings can easily last 4 hours and every gram of weight counts.
Resilient network connection because I don’t want to have to stop and restart the stream. It’d also be nice if the stream could go over multiple connections, due to handoffs between towers not being as “seamless” as they’re supposed to be.

The Setup

With these requirements in mind, I decided on the following general hardware setup:

  • Zoom Camera: Nikon P1000. A camera almost purpose-built for taking pictures of birds. It has clean HDMI output, which is needed to capture its video without on-screen overlays. I already had the camera when I started this. One downside is that the P1000’s screen cannot be used in HDMI-out mode, requiring an external monitor.
  • Wide/shoulder Camera: GoPro Hero7 Black. Again chosen because I already had access to it. I don’t know if it’s a better choice than the usual Sony action cam; we’ll see. It only allows 720p streaming, so it’s useful as a proof of concept, and I can upgrade to the Hero8 Black’s 1080p streaming if it works out.
  • Capture Device: Avermedia Live Gamer Mini (GC311). I’d initially looked at various HDMI→USB capture devices, including the AverMedia ExtremeCap UVC as well as a bunch of inexpensive AliExpress specials, but I ruled most of them out for not supporting Linux. Anything that supports UVC should support Linux, but I didn’t want to take my chances given my limited budget. Why this specific one? It (unofficially) works fine on Linux, and it has a built-in H.264 encoder capable of 1080p30. I also tested the Avermedia Live Gamer Portable 2 Plus (GC513), which worked about the same, but is significantly larger and heavier, a more awkward shape, and draws slightly more power.
  • Encoder, Switcher and Network Control: Raspberry Pi 4. Low power, generally well supported, inexpensive, and with a hardware encoder rated for 1080p30. I’d initially tried the Raspberry Pi 3, but its encoder proved unable to work at higher than roughly 720p24 and 1.5Mb/s, which was well under my minimum requirements.
  • Operating System: Arch Linux ARM. I’m very familiar with Arch Linux, as well as the ARM version, and I suspected that I’d need to build some of the tools from source, which Arch Linux makes (relatively) easy to do.
My P1000 with HDMI monitor, connected to the GC513 capture device running from an RPi4 powered by a battery bank over USB. It works!

Putting it all Together

Getting the capture device recognized by Linux was very straightforward. I plugged it in and connected an HDMI input, and Linux instantly recognized it. Getting useful output out of it was a different matter.
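If you want to see what the kernel recognized, `v4l2-ctl` (from the `v4l-utils` package) can list the device and the formats it exposes. The device node below is illustrative; yours may enumerate differently.

```shell
# List the V4L2 devices the kernel registered (node names will vary).
v4l2-ctl --list-devices

# Show the formats the capture device offers; a card like the GC311
# should expose a compressed H.264 format alongside raw ones.
v4l2-ctl -d /dev/video0 --list-formats-ext
```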

The first problem was that the ffmpeg packages Arch Linux ARM provides don’t support the RPi’s built-in video encoders, so I needed to compile the package myself with some out-of-tree patches (that are hopefully due to be merged in soon).
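For reference, a from-source build with the Pi’s hardware codec paths enabled looks roughly like this; the configure flag names come from ffmpeg’s own configure script, and the out-of-tree patches mentioned above would still need to be applied to the tree first.

```shell
# Fetch ffmpeg and enable the RPi's hardware H.264 paths:
# the Broadcom OpenMAX encoder (h264_omx) and V4L2 mem2mem codecs.
git clone https://git.ffmpeg.org/ffmpeg.git
cd ffmpeg
# (apply the out-of-tree encoder patches here)
./configure --enable-gpl \
    --enable-omx --enable-omx-rpi \
    --enable-v4l2-m2m
make -j4
sudo make install
```

On Arch Linux ARM specifically, the cleaner route is to add these flags to the ffmpeg PKGBUILD and rebuild with `makepkg` so pacman keeps tracking the package.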

With a lot of trial and error, I managed to get the capture device working with the Raspberry Pi at 1080p30, the maximum the RPi4 is rated for. It just barely keeps up, but trans-rating (like transcoding, except changing the bitrate without fully re-encoding the video) could potentially speed things up. I managed to run a stream for hours without any dropped frames, other than those caused by WiFi glitches.
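Since the capture card encodes H.264 in hardware, the Pi can pass the compressed stream straight through rather than re-encoding it. A sketch of that, assuming the card enumerates as `/dev/video0` and exposes an `h264` input format:

```shell
# Pull the capture card's hardware-encoded H.264 and remux it to a file,
# with no re-encoding on the Pi itself (-c:v copy).
ffmpeg -f v4l2 -input_format h264 \
    -framerate 30 -video_size 1920x1080 \
    -i /dev/video0 \
    -c:v copy -f matroska test-capture.mkv
```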

I tried to get RTMP running well, but it choked on a cellular connection, and even on a good wired Ethernet connection the latency made it unusable for live, real-time video. One thing I did experiment with successfully is SRT (Secure Reliable Transport) for streaming. Bonus: SRT will be adding bonding soon, reportedly this month (Feb. 2020).
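A minimal SRT pipeline looks something like the following, assuming an ffmpeg build with SRT support (`--enable-libsrt`); the hostname, port, and latency value are placeholders.

```shell
# On the Pi: wrap the already-encoded H.264 in MPEG-TS and push it over SRT.
# The srt protocol's latency option is in microseconds (2000000 = 2 s of
# recovery buffer for retransmissions on a lossy link).
ffmpeg -f v4l2 -input_format h264 -framerate 30 -video_size 1920x1080 \
    -i /dev/video0 -c:v copy -f mpegts \
    "srt://receiver.example:9000?mode=caller&latency=2000000"

# On the receiving machine: listen for the incoming stream and play it.
ffplay "srt://0.0.0.0:9000?mode=listener"
```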

It lives! SRT streaming my dining room wall to VLC on my laptop via my main Linux environment. Dropped frames are from the WiFi, not on the sending end.

Other Things

The streaming on the GoPro Hero7 Black is unusable. It stops after around an hour no matter what I do, and it’s very annoying to get connected/reconnected, since it has to go through the app. There is a community project to implement an unofficial API, but it doesn’t look like it’ll fix the streaming issue. According to GoPro support it’s a hardware issue, which they won’t fix, instead telling me to buy a Hero8 Black to replace my broken but in-warranty Hero7. Boo.

HDMI switchers, meant to switch inputs on TVs, are a bad idea. I bought one model that wasn’t the cheapest, but at best I got about a half second of black screen when it switched over, and at worst it locked up the HDMI capture device, necessitating a replug of the USB connection or even a power-cycle of the RPi4. I looked at higher-end HDMI switchers, but didn’t really find anything mobile, and definitely didn’t find anything inexpensive.

One, Two, N…

There’s a saying in computer circles that getting two of something working is just as much work as getting one of something working. And once you have two working, it’s as much work again to generalize to n.

And the RPi4 doesn’t have enough decoding power to decode two streams, switch between them, and re-encode the result. It just can’t handle that.

Next I tried a custom build of ffmpeg that allows switching between two streams, something stock ffmpeg doesn’t really support. The results were, now that I’m writing this in hindsight, entirely expected: garbled, unwatchable video for a few seconds until the next I-frame shows up, a consequence of smashing unrelated video frames together.

No More Raspberry Pi?

It’s painful to spend so much time on something and realize that it just won’t work out, but between the RPi4’s encoder not being powerful enough to handle two streams and a major new issue I uncovered, I’m looking at alternatives.

What major new issue? Well, I was wondering if I could send two 1080p30 streams at once and calculated how much the data would cost me. We’re talking in excess of $80 per hour at Canadian cell phone rates. Oof, so that’s not viable, and even $40 an hour for a single stream is getting expensive.
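The back-of-the-envelope math, assuming a hypothetical $15/GB overage rate (actual Canadian pricing varies by carrier and plan):

```shell
#!/bin/sh
# Two 6 Mb/s streams = 12 Mb/s of upstream data.
bitrate_mbps=12
mb_per_hour=$((bitrate_mbps * 3600 / 8))      # 12 Mb/s -> 5400 MB per hour
cost_per_hour=$((mb_per_hour * 15 / 1000))    # at an assumed $15/GB -> $81
echo "${mb_per_hour} MB/hour, roughly \$${cost_per_hour}/hour"
```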

I realized that H.264 is years old at this point, and the industry has generally moved to H.265 (AKA HEVC). It claims double the efficiency or better, so I should be able to target a 3Mb/s bitrate for the same quality as H.264’s 6Mb/s. With that in mind, I’m trying to find SBCs that support H.265 encoding and have enough decoding performance to work with two streams. It might also be easier to get two raw capture devices, such as Elgato’s Cam Link 4K, but that would require a beefy system to do the encoding.
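For comparison, the software H.265 equivalent of the current encode would look like this; libx265 is far too heavy for the Pi itself, so this is effectively the job whatever SBC replaces it would need to do in hardware. Hostname and device node are placeholders.

```shell
# Re-encode the capture feed to H.265 at half the H.264 bitrate.
# On a capable SBC this would map to a hardware HEVC encoder instead
# of the software libx265 encoder shown here.
ffmpeg -f v4l2 -framerate 30 -video_size 1920x1080 -i /dev/video0 \
    -c:v libx265 -preset fast -b:v 3M \
    -f mpegts "srt://receiver.example:9000?mode=caller"
```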

I’ve looked at hundreds of boards at this point. Boards based on the RK3288 may work, but the driver situation looks awful. Another potential option is the LattePanda Delta, but it looks too hot and power-hungry to be a good contender. There are other similar systems, such as the Atomic Pi, but they look like they may have similar issues with power usage.

So that’s where things are right now. I’m a bit stuck. Sorry RPi4, you’re good for a lot of things, but video isn’t one of them.

So close and yet so far…