GStreamer can do a lot. Most importantly, it provides exactly the functionality I was looking for when I wanted to play a stereoscopic video on the Oculus Rift: decoding a video stream and applying a GLSL fragment shader to it.
I found a few existing solutions that try to achieve this goal, but they were very unsatisfactory. Mostly they failed to decode the video stream or didn’t start for other reasons. They are not maintained very well, since they are recent one-man projects with binary-only releases posted on forums. And worst of all, they only support Windows.
Surprisingly, I got the best results with OculusOverlay and VLC, which transforms a hardcoded part of your desktop in a very hacky way with XNA. It also works with YouTube.
VideoPal is a player written in Java, using JOGL. In theory it could work on Linux, but:
Exception in thread “main” java.lang.UnsatisfiedLinkError: Could not load SWT library. Reasons:
no swt-win32-3740 in java.library.path
Yeah… no time for JAR reverse engineering, and there is no link to the source. I was able to run it on Windows, but it couldn’t open an H.264 video.
There is also OculusPlayer, which uses libvlc but does not release its source. The idea is good, but it didn’t work either.
VR Player is a GPLv2-licensed player written in C#. It also couldn’t decode the stream.
Another player is VR Cinema 3D, which does not play stereo videos but instead simulates a virtual cinema showing a 2D film. Funny idea.
Get some Stereo Videos
You can search for stereoscopic videos on YouTube with the 3D search filter. There are tons of stereoscopic videos, like this video of Piranhas.
Download the YouTube video with the YouTube downloader of your choice, one that supports 3D videos, for example PwnYouTube.
For convenient use in the terminal, you should rename the file to something short and without spaces.
The minimal GStreamer pipeline for playing the video stream of an MP4 file (QuickTime / H.264 / AAC) looks like this:
$ gst-launch-1.0 filesrc location=piranhas.mp4 ! qtdemux ! avdec_h264 ! autovideosink
It consists of the GStreamer elements for the file source, the QuickTime demuxer, the H.264 decoder, and the automatic video sink.
If you want more information on one of the elements, try gst-inspect:
$ gst-inspect-1.0 qtdemux
If you want audio, you need to name the demuxer and add an audio branch with a queue, an AAC decoder, and an audio sink.
$ gst-launch-1.0 filesrc location=piranhas.mp4 ! qtdemux name=dmux ! avdec_h264 ! autovideosink dmux. ! queue ! faad ! autoaudiosink
Let’s add the Oculus Rift distortion now. We will use a GLSL fragment shader and the glshader element from the gst-plugins-gl package for that. Since the GStreamer GL plugins are not released yet, you need to build them yourself. You could use my Arch Linux AUR package or the GStreamer SDK build system cerbero. Here is a tutorial on how to build GStreamer with cerbero.
In the GLSL shader you can change some parameters like the video resolution, eye distance, scale, and the kappa distortion coefficients. These could also be exposed as uniforms and set from a GUI.
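To give you an idea, here is a sketch of what distortion.frag could look like. It assumes that glshader binds the video frame to a rectangle texture uniform named tex (the exact uniform name and texture target depend on your gst-plugins-gl version), and all constants — resolution, scale, and the kappa polynomial coefficients — are illustrative values you would tune for your video and your Rift:

```glsl
#extension GL_ARB_texture_rectangle : enable
uniform sampler2DRect tex;  /* video frame, bound by glshader (assumed name) */

/* Illustrative constants -- tune for your video and your Rift */
const vec2  resolution = vec2(1920.0, 1080.0);       /* video resolution  */
const float scale      = 0.8;                        /* zoom compensation */
const vec4  kappa      = vec4(1.0, 0.22, 0.24, 0.0); /* distortion coeffs */

void main() {
  vec2 uv = gl_TexCoord[0].xy / resolution;   /* normalize to 0..1 */

  /* Each half of the side-by-side frame is warped around its own
     eye center. */
  float eye = step(0.5, uv.x);                /* 0 = left, 1 = right */
  vec2 center = vec2(0.25 + 0.5 * eye, 0.5);

  vec2 d = (uv - center) / scale;             /* vector from eye center */
  float r2 = dot(d, d);
  /* Polynomial barrel distortion, as in the Oculus SDK:
     r' = r * (k0 + k1*r^2 + k2*r^4 + k3*r^6) */
  float f = kappa.x + r2 * (kappa.y + r2 * (kappa.z + r2 * kappa.w));
  vec2 uv2 = center + d * f * scale;

  /* Black out samples that leave the current eye's half */
  if (uv2.x < 0.5 * eye || uv2.x > 0.5 + 0.5 * eye ||
      uv2.y < 0.0 || uv2.y > 1.0)
    gl_FragColor = vec4(0.0);
  else
    gl_FragColor = texture2DRect(tex, uv2 * resolution);
}
```

The barrel distortion pre-compensates the pincushion distortion of the Rift lenses; the kappa values determine how strongly the image is warped towards the edges.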
The final GStreamer pipeline looks like this. Since we are using a GL plugin, we need glimagesink as the video sink.
$ gst-launch-1.0 filesrc location=piranhas.mp4 ! qtdemux name=dmux ! avdec_h264 ! glshader location=distortion.frag ! glimagesink dmux. ! queue ! faad ! autoaudiosink
Seeking and full-screen mode are features that could be implemented in a GStreamer Python application.
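As a starting point, a minimal sketch of such a Python application could look like the following. It assumes PyGObject with the GStreamer 1.0 introspection bindings is installed; the pipeline string is the video branch from above, and the two-second delay and 30-second seek target are just example values:

```python
#!/usr/bin/env python
# Sketch: minimal GStreamer player with seeking (assumes PyGObject
# and the GStreamer 1.0 bindings are installed).
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst, GLib

Gst.init(None)

# Same video pipeline as on the command line.
pipeline = Gst.parse_launch(
    "filesrc location=piranhas.mp4 ! qtdemux ! avdec_h264 "
    "! glshader location=distortion.frag ! glimagesink")
pipeline.set_state(Gst.State.PLAYING)

def seek_to(seconds):
    """Flushing seek to an absolute position."""
    pipeline.seek_simple(
        Gst.Format.TIME,
        Gst.SeekFlags.FLUSH | Gst.SeekFlags.KEY_UNIT,
        seconds * Gst.SECOND)

def on_timeout():
    # Example: jump to 0:30 after two seconds of playback.
    seek_to(30)
    return False  # returning False runs the timeout only once

GLib.timeout_add_seconds(2, on_timeout)

# Quit the main loop on end-of-stream or error.
loop = GLib.MainLoop()
bus = pipeline.get_bus()
bus.add_signal_watch()
bus.connect("message::eos", lambda *args: loop.quit())
bus.connect("message::error", lambda *args: loop.quit())
try:
    loop.run()
finally:
    pipeline.set_state(Gst.State.NULL)
```

Hooking seek_to up to keyboard input and toggling a full-screen video window would be the next steps from here.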