In July 2015 I showed a VR demo at the CV Tag at the University of Koblenz, which uses two Arch Linux PCs with two Oculus Rift DK2s and two Kinect v2s. It utilizes ROS's capability to stream topics over the network.

To run Holochat you first need to set up libfreenect2 and ROS, as explained in Viewing Kinect v2 Point Clouds with ROS in Arch Linux.

ROS Topics

You can list the ROS topics with the following command:

rostopic list

You will see that kinect2_bridge provides topics in three different resolutions [ hd | qhd | sd ], as well as in different formats and with an option for compression. The IR image is only available in sd, due to the sensor size.
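
An abridged example of the topics you might see (only those referenced in this article; the exact list depends on your kinect2_bridge configuration):

/kinect2/hd/image_color
/kinect2/qhd/image_color_rect
/kinect2/qhd/image_depth_rect
/kinect2/sd/image_depth_rect
/kinect2/sd/image_ir_rect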

To test their bandwidth you can use

$ rostopic bw /kinect2/hd/image_color
subscribed to [/kinect2/hd/image_color]
average: 101.23MB/s
 mean: 6.22MB min: 6.22MB max: 6.22MB window: 14
average: 96.92MB/s
 mean: 6.22MB min: 6.22MB max: 6.22MB window: 29

You will notice that the uncompressed color image takes ~101.23 MB/s, the calculated uncompressed depth image ~125.66 MB/s, and the uncompressed IR image only ~12.75 MB/s.

By default kinect2_viewer accesses /kinect2/qhd/image_color_rect and /kinect2/qhd/image_depth_rect. The IR mode needs less bandwidth: sd/image_ir_rect and sd/image_depth_rect combined require only ~28 MB/s, and ~17 MB/s compressed, which should be achievable over a gigabit LAN.
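
For comparison you can also measure the compressed variants. I am assuming the standard image_transport sub-topic naming here, so the exact topic names may differ on your setup:

$ rostopic bw /kinect2/sd/image_ir_rect/compressed
$ rostopic bw /kinect2/sd/image_depth_rect/compressed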

The depth buffer + IR point cloud will look like this

[Image: selfie in the IR/depth point cloud]

You can set topic options as explained in the help output:

$ rosrun kinect2_viewer kinect2_viewer -h
/opt/ros/jade/lib/kinect2_viewer/kinect2_viewer [options]
 name: 'any string' equals to the kinect2_bridge topic base name
 mode: 'qhd', 'hd', 'sd' or 'ir'
 visualization: 'image', 'cloud' or 'both'
 options:
 'compressed' use compressed instead of raw topics
 'approx' use approximate time synchronization
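
For example, an invocation along the following lines should show the compressed IR point cloud (the arguments follow the help output above, though I have not verified every combination):

$ rosrun kinect2_viewer kinect2_viewer ir cloud compressed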

Let's add some VR support to the viewer

In order to make this holographic, I patched the kinect2_viewer with Oculus SDK support. Since the SDK is no longer maintained, I recommend the modified version from jherico, which can be found in oculus-rift-sdk-jherico-git on the AUR.

At the time I wrote the patches, OpenHMD lacked head-tracking support, and libraries like Valve's OpenVR and Razer's OSVR were not around yet. They are still not really usable with the DK2 at the time I am writing this article.

My patched iai-kinect branch can be found on GitHub. I made an AUR package with the viewer and my VR patches.

$ pacaur -S ros-jade-kinect2-viewer-oculus

To run the patched viewer you need to have ovrd running. It autostarts with your session, but does not exit if it lacks the required udev rights, so make sure you have oculus-udev installed. If it still does not work, try killing it and running it as root.
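
Restarting it as root can look like this:

$ killall ovrd
$ sudo ovrd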

Make sure your headset is available in

$ OculusConfigUtil

When the viewer starts, you need to manually maximize it on the VR headset 😉 The user base of the demo (me) thought this was sufficient.

ROS Networking

ROS prints the host name and port when you start roscore:

$ roscore
...
started roslaunch server http://kinecthost:43272/

If you have the above setup on two machines, you can run kinect2_bridge on the host as usual. On the client you need to provide the host’s ROS_MASTER_URI when running the viewer.

$ ROS_MASTER_URI=http://kinecthost:11311/ rosrun kinect2_viewer kinect2_viewer ir
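
If the host names do not resolve on both machines, you may additionally need to set ROS_IP (or ROS_HOSTNAME) to an address the other machine can reach; the address below is just a placeholder:

$ ROS_MASTER_URI=http://kinecthost:11311/ ROS_IP=192.168.0.42 rosrun kinect2_viewer kinect2_viewer ir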

Adding a simple audio stream using GStreamer

To run the following pipelines, you need to install the GStreamer Good Plugins. The pipelines use your PulseAudio default devices for recording and playback; you can set these, for example, in GNOME's audio settings.
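
You can check which devices PulseAudio currently uses as defaults with pactl:

$ pactl info | grep Default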

On the host, run the following pipeline to receive the audio stream and play it back:

gst-launch-1.0 udpsrc port=5000 \
                      caps='application/x-rtp, media=(string)audio, clock-rate=(int)44100, encoding-name=(string)L16, encoding-params=(string)1, channels=(int)1, payload=(int)256' ! \
    rtpL16depay ! \
    pulsesink sync=false

On the client, run the following pipeline to capture and send your audio. You need to change soundhost to the host name / IP of the machine running the receiving pipeline.

gst-launch-1.0 pulsesrc ! audioconvert ! \
    audio/x-raw, channels=1, rate=44100 ! \
    rtpL16pay ! udpsink host=soundhost port=5000

Putting everything together

Now you just need two DK2s, two Kinect v2s, and this setup replicated on two machines, and you can have holographic video conferences. Have fun with that.
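
As a rough sketch, assuming the two machines are called alice and bob (placeholder names), each side runs its own roscore and kinect2_bridge and views the other side's point cloud; the GStreamer pipelines from above run on both machines as well, pointed at the respective other host:

alice$ roscore                                    # each command in its own terminal
alice$ rosrun kinect2_bridge kinect2_bridge
alice$ ROS_MASTER_URI=http://bob:11311/ rosrun kinect2_viewer kinect2_viewer ir

bob runs the same commands, with alice in the ROS_MASTER_URI.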
