The problem
I wanted to stream video of myself and my screen at the same time. My plan was to put the video of myself on my screen and stream the entire screen, but I do not have a camera on my desktop. On the other hand, I do have a smartphone with a camera, so I needed a way to show the video from my phone's camera on my desktop's screen.
There are a few Android apps that promise to do so, but none of the ones I tried worked very well. But I know that video chat works just fine on my phone, including in a web browser using WebRTC which supports peer-to-peer video chat between two web browsers, so it should be easy to show the video from my phone's camera in a browser window on my desktop. Unfortunately, I couldn't find any straightforward solution for setting up just a peer-to-peer video link.
The solution
Open minimal-webrtc on the computer you want the video streamed to. A QR code will appear; scan it with your smartphone, and after you approve access to the camera, the video should appear in the first browser window. This is intended for local connections, so it may not work if the two devices are not on the same network. Only the signaling to set up the connection goes through the minimal-webrtc server; the actual video is sent peer-to-peer over the local network.
To get just the video as a bare window with no decorations, use chromium --app=uri to get rid of the address bar, etc., and this script to remove the rest.
To host it yourself, download the source code[1] and use the included run_daphne.sh script (which assumes daphne is installed) and nginx configuration. As WebRTC requires HTTPS, to run it on your local network, you may need to set up a self-signed certificate.
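Generating such a certificate is a one-liner with openssl. This is a hedged example; the hostname, filenames, and validity period are placeholders for whatever your setup uses:

```shell
# Generate a self-signed certificate and key for a hypothetical local
# hostname, valid for one year. Point nginx's ssl_certificate and
# ssl_certificate_key directives at the resulting files. Browsers will
# still warn about the certificate until you explicitly trust it.
openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
    -keyout minimal-webrtc.key -out minimal-webrtc.crt \
    -subj "/CN=desktop.local"
```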
The details
What's wrong with existing solutions?
Short answer: nothing.
I could have just as well used Jitsi Meet. It will use a peer-to-peer connection for this scenario. Jitsi is open-source, so I could even self-host it if I wanted to keep the signaling local as well. Writing my own did give me more control over the UI, but Jitsi Meet even has an API for embedding it in your own site.
So, in the end, I mainly did it because I wanted experience with the WebRTC API and was bothered I couldn't find a WebRTC demo app that did what I wanted.
No signaling server
There are a few WebRTC demos that omit a signaling server entirely. After all, the server is just needed to exchange a couple of relatively short JSON strings; doing so manually isn't a lot of effort and is a cool trick. serverless-webrtc is one such demo, but it only uses the connection for text, not video. Another is html5-video-chat, which uses the simple-peer library.
Playing around with those, I found copying the JSON manually to be overly clumsy.[2] I looked for ones that included a very simple signaling server and I couldn't find any.
Simplest possible signaling server
As all the signaling server needs to do is pass JSON strings between the two devices, it's very simple. In fact, the server side of minimal-webrtc is a minor modification of the tutorial for Django Channels, which is a very simple text chat app.
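minimal-webrtc's real server is that tweaked Django Channels app, but the relay logic it performs is tiny in any language. As an illustration only (the names here are mine, not the project's), here is a framework-free sketch of that logic:

```javascript
// Hedged sketch of what a signaling server does: forward each peer's
// JSON string (SDP offer/answer or ICE candidate), unmodified, to the
// other peer(s) in the same room. This is not minimal-webrtc's actual
// code, just the core idea with the WebSocket plumbing stripped away.
class Room {
  constructor() {
    this.peers = new Set();
  }

  // A peer joins with a callback for receiving relayed messages.
  join(onMessage) {
    const peer = { onMessage };
    this.peers.add(peer);
    return peer;
  }

  // Deliver `message` to every peer in the room except the sender.
  relay(sender, message) {
    for (const peer of this.peers) {
      if (peer !== sender) peer.onMessage(message);
    }
  }
}
```

A real server would keep one such room per URL and hook join/relay up to WebSocket connect and receive events; the JSON payloads pass through untouched, which is why the server can stay so small.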
The main complication was that WebRTC requires HTTPS for access to the camera and microphone, so to host the signaling inside my local network, I had to set up a self-signed certificate.
Using WebRTC
The actual WebRTC code is modified from these two tutorials. My changes were mainly to hook in my own signaling and to add logging messages so I could see what was happening on my phone, where I didn't have access to web developer tools.
ICE candidates
Since I wanted the connection to only go over my local network, I thought I wouldn't need any of the code around ICE at all, since it's about dealing with NAT. But it turns out that even when not specifying a STUN server, the same mechanism is used to enumerate the multiple addresses a device can determine for itself without asking a server. Without that list, it often would have no available address or just localhost, which may work when testing with two different browsers on the same computer, but doesn't work to connect two separate devices.
As there's no server to wait on, I modified the code from the documentation on canTrickleIceCandidates to not send the offer until gathering the candidates is complete, which reduces the signaling to a single message in each direction.
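The pattern boils down to waiting for iceGatheringState to reach "complete" before reading localDescription. A hedged sketch (the helper name and usage below are mine; pc stands for an RTCPeerConnection, though anything with the same event interface works):

```javascript
// Resolve once ICE gathering on `pc` has finished, so localDescription
// already contains every candidate and can be sent as one message.
function waitForIceGathering(pc) {
  return new Promise((resolve) => {
    if (pc.iceGatheringState === 'complete') {
      resolve();
      return;
    }
    pc.addEventListener('icegatheringstatechange', function check() {
      if (pc.iceGatheringState === 'complete') {
        pc.removeEventListener('icegatheringstatechange', check);
        resolve();
      }
    });
  });
}

// Usage in a browser (illustrative names, not minimal-webrtc's code):
// const pc = new RTCPeerConnection();
// await pc.setLocalDescription(await pc.createOffer());
// await waitForIceGathering(pc);
// sendToSignalingServer(JSON.stringify(pc.localDescription));
```

The trade-off is a short delay before the offer goes out, which is fine on a local network where gathering finishes quickly.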
Is it really local?
How can you be sure the video really is being routed locally and not over the internet?
In firefox, about:webrtc shows details about WebRTC connections. Click "show details" on the connection and the top information is an "ICE Stats" table with one row highlighted in green showing which connection is being used. For my connection between my desktop and my phone, the "Local Candidate" in that row ends with "[non-proxied]" and the "Remote Candidate" is an IPv6 address. traceroute tells me it is a single hop away, so the connection is going over the local network, not the internet. Under "show raw candidates" just below that table, you can also see which IP address is being used for the local device.
In chromium (or Google Chrome), chrome://webrtc-internals/ gives much more detail, but the part we care about isn't presented as nicely. There's a list of objects of different types associated with the WebRTC connection. One of them is named RTCTransport_0_1, which has information on how the connection is actually made. Its details include a selectedCandidatePairId of RTCIceCandidatePair_a1b2c3d4_z9y8x7w6 (the IDs at the end vary each time). Elsewhere in the list, I see the corresponding entries RTCIceCandidate_a1b2c3d4 and RTCIceCandidate_z9y8x7w6. For the connection I'm currently looking at, RTCIceCandidate_z9y8x7w6 is labeled as a "remote-candidate" and has the 192.168.x.x IP address of my phone on my local network, indicating the connection is in fact local. (I have no idea why chromium ended up using IPv4 when I looked this time; I've seen IPv6 addresses other times I've looked there.)
Video playing intermittently
Once I got everything wired up, it was working only every third time or so, with no apparent pattern. It turned out the problem was the browser policy forbidding videos from autoplaying.
Since I was only streaming video and not audio, the solution was straightforward: mute the audio by adding the muted attribute to the <video> tag, as muted videos are allowed to autoplay.
Audio is optional; if you do enable it, minimal-webrtc will bring up an "unmute" button to ensure there is a user interaction before playing audio.
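For reference, the receiving element just needs that one attribute. A hedged example (the id and the playsinline attribute are my additions, not necessarily what minimal-webrtc uses):

```html
<!-- muted lets the browser autoplay the incoming stream without a user
     gesture; playsinline avoids fullscreen takeover on mobile browsers. -->
<video id="remoteVideo" autoplay muted playsinline></video>
```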
[1] See the master branch for changes made since this blog post was completed. ↩

[2] Since writing this post, I did add a serverless mode to minimal-webrtc; it will be the topic of next week's blog post. (EDIT: Now posted.) ↩