A Weird Imagination

Virtual microphone using GStreamer and PulseAudio

The problem#

My previous post got the video from my smartphone to show up as a camera device on my desktop, but for a video chat, we probably also want audio. So now the question is how to build GStreamer pipelines that let minimal-webrtc-gstreamer expose virtual microphone and speaker devices I can point a voice/video chat application at, so I can use my smartphone's microphone and speaker with applications on my desktop.

The solution#

The following requires that you are using PulseAudio as your sound server and have downloaded minimal-webrtc-gstreamer:

pactl load-module module-null-sink sink_name=virtspk \
    sink_properties=device.description=Virtual_Speaker
pactl load-module module-null-sink sink_name=virtmic \
    sink_properties=device.description=Virtual_Microphone_Sink
pactl load-module module-remap-source \
    master=virtmic.monitor source_name=virtmic \
    source_properties=device.description=Virtual_Microphone
./minimal-webrtc-host.py\
    --url "https://apps.aweirdimagination.net/camera/"\
    --receiveAudioTo device=virtmic\
    --sendAudio "pulsesrc device=virtspk.monitor"\
    --sendVideo false --receiveVideo false
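
To check that the virtual devices were created, you can list PulseAudio's sinks and sources; the new devices should show up under the names given above:

pactl list short sinks    # should include virtspk and virtmic
pactl list short sources  # should include virtmic and the corresponding .monitor sources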

You can reset your PulseAudio configuration by killing PulseAudio:

pulseaudio -k
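
Alternatively, to remove just the virtual devices without restarting PulseAudio, unloading the modules loaded above should work (unloading by module name removes every instance of that module, so unload the remap first since it depends on the virtmic sink's monitor):

pactl unload-module module-remap-source
pactl unload-module module-null-sink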

You can make the PulseAudio settings permanent by following these instructions to put them in your default.pa file.
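
For reference, the default.pa syntax is the same as the pactl commands minus the pactl prefix, so the permanent version would look something like this sketch of the commands above:

load-module module-null-sink sink_name=virtspk sink_properties=device.description=Virtual_Speaker
load-module module-null-sink sink_name=virtmic sink_properties=device.description=Virtual_Microphone_Sink
load-module module-remap-source master=virtmic.monitor source_name=virtmic source_properties=device.description=Virtual_Microphone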

The details#

Read more…

Virtual web cam using GStreamer and v4l2loopback

The problem#

I want to make my smartphone's camera appear as an actual camera device on my desktop so any application (primarily Discord) can use it as if it were a normal USB web cam.

My previous post introduced minimal-webrtc-gstreamer, which got as far as getting the video stream from any web browser into a GStreamer pipeline, which reduces the problem to outputting a GStreamer pipeline into a virtual web cam device.
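
One way to do that last step with GStreamer is the v4l2sink element. As a rough sketch (assuming a v4l2loopback device already exists at /dev/video42, as created in the solution below), writing a test pattern to it looks like:

gst-launch-1.0 videotestsrc ! videoconvert ! v4l2sink device=/dev/video42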

The solution#

Download minimal-webrtc-gstreamer and install v4l2loopback. Then run

sudo modprobe v4l2loopback video_nr="42"\
    'card_label=virtcam'\
    exclusive_caps=1 max_buffers=2
./minimal-webrtc-host.py\
    --url "https://apps.aweirdimagination.net/camera/"\
    --receiveVideoTo /dev/video42\
    --sendAudio false

You can test by watching the stream with

gst-launch-1.0 v4l2src device=/dev/video42 ! autovideosink
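
If you have v4l-utils installed, you can also inspect the loopback device directly to confirm it exists and see which formats it is currently advertising:

v4l2-ctl --device=/dev/video42 --all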

Note that some applications, including the current desktop release of Discord, may not support the virtual camera, showing a solid black square or failing to connect to it at all. It should work in the latest Chromium/Chrome browser, including for the Discord web app.

When done, remove the virtual camera device:

sudo modprobe -r v4l2loopback

The details#

Read more…

GStreamer WebRTC

The problem#

In my previous posts on minimal-webrtc, I set up a peer-to-peer connection between the web browsers on two different devices. For more flexibility, including making the remote camera and microphone appear as local camera and microphone devices, we need to handle the WebRTC connection outside of a web browser.

The solution#

minimal-webrtc-gstreamer is a command-line client for minimal-webrtc written in Python using the GStreamer library. It's mostly a modification of the webrtc-sendrecv.py demo script, changed to use minimal-webrtc as the signaling server to make it easier for me to tinker with.

Run as follows:

./minimal-webrtc-host.py\
    --url "https://apps.aweirdimagination.net/camera/"\
    --receiveAudio --receiveVideo any

It will output a URL as text and QR code for the other device to connect to. With those options, the output from that device's camera will be shown on screen and the output from its microphone will be played through your speakers. That device will be sent video and audio test patterns. See ./minimal-webrtc-host.py --help for more information.
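
By analogy with the --sendAudio option shown in the virtual microphone post above, --sendVideo appears to accept a GStreamer source description in place of the test pattern, so something like the following (with /dev/video0 as a stand-in for a real local camera) should send an actual webcam instead; check ./minimal-webrtc-host.py --help to confirm the exact syntax:

./minimal-webrtc-host.py\
    --url "https://apps.aweirdimagination.net/camera/"\
    --sendVideo "v4l2src device=/dev/video0"\
    --receiveAudio --receiveVideo any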

The details#

Read more…

Serverless WebRTC

The problem#

While I said in my last post that serverless WebRTC was too cumbersome, I wanted to see just how streamlined I could make the process. While researching, I did find a variant of serverless-webrtc called serverless-webrtc-qrcode, as well as a similar demo, webrtc-qr, both of which use QR codes as an easy way to transmit the offer strings. But both require that both sides have a camera to scan QR codes, while my use case is a WebRTC connection between my desktop, which has no camera, and my smartphone.

The solution#

minimal-webrtc now has a checkbox to enable serverless mode. In that mode, the QR code shown by the host is a much longer URL that includes the initial WebRTC offer. Opening that URL on another device (or in another browser window) will show another QR code along with a "Copy" button. With the first device, either press the "Scan QR code" button and point it at the QR code or use some other mechanism to copy the text and paste it into the text area labeled "Paste offer here:".

To run it locally, download the source code and run a web server to serve the wwwroot/ directory. If both devices can run a web server, then you can just access it via localhost on each, but, as before, because WebRTC requires HTTPS, you may need to set up a self-signed certificate to run it on your local network.
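
If you do need a self-signed certificate, a minimal sketch is to generate one with openssl and point your web server at the resulting key.pem and cert.pem:

# 192.168.1.100 is a placeholder; use your desktop's address on the local network
openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
    -keyout key.pem -out cert.pem -subj "/CN=192.168.1.100"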

The details#

Read more…

Minimal WebRTC

The problem#

I wanted to stream video of myself and my screen at the same time. My plan was to put the video of myself on my screen and stream the entire screen, but I do not have a camera on my desktop. On the other hand, I do have a smartphone with a camera, so I needed a way to show the video from my phone's camera on my desktop's screen.

There are a few Android apps that promise to do so, but none of the ones I tried worked very well. But I know that video chat works just fine on my phone, including in a web browser using WebRTC, which supports peer-to-peer video chat between two web browsers, so it should be easy to show the video from my phone's camera in a browser window on my desktop. Unfortunately, I couldn't find any straightforward solution for setting up just a peer-to-peer video link.

The solution#

Open minimal-webrtc on the computer you want the video streamed to. A QR code will appear; use your smartphone to read it, and after approving access to the camera, the video should appear in the first browser window. This is intended to be used for local connections, so it may not work if the two devices are not on the same network. Only the signaling to set up the connection goes to the minimal-webrtc server; the actual video is sent peer-to-peer over the local network.

To get just the video as a bare window with no decorations, use chromium --app=uri to get rid of the address bar, etc., and this script to remove the rest.
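
For example, using the hosted instance's URL from the other posts (substitute your own URL if self-hosting):

chromium --app="https://apps.aweirdimagination.net/camera/"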

To host it yourself, download the source code and use the included run_daphne.sh script (which assumes daphne is installed) and nginx configuration. As WebRTC requires HTTPS, you may need to set up a self-signed certificate to run it on your local network.

The details#

Read more…