Using libfreenect2 and the IAI Kinect v2 ROS modules, you can easily view a point cloud generated by your Kinect v2 sensor.

You need a Kinect for Windows v2, which is an Xbox One Kinect with an adapter from Microsoft’s proprietary connector to standard USB 3, and hence a PC with a USB 3 port.
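If you are unsure whether a port is actually USB 3, you can check the link speed with lsusb. This is just a quick sanity check; the Kinect v2 needs a SuperSpeed (5000M) port.

$ lsusb -t | grep 5000M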

$ pacaur -S ros-jade-kinect2

My ros-jade-kinect2 AUR package pulls in all required dependencies, such as a ton of ROS packages, the Point Cloud Library, and libfreenect2, all of which are available in the Arch User Repository.

Testing libfreenect2

After installing libfreenect2, you can test your Kinect v2 with the following command:

$ Protonect

If everything runs fine, you will get an image like this from Protonect:

[Image: Protonect output]

In the image above, you can see the unprocessed infrared image (top left), the color sensor image mapped to the calculated depth (top right), the unprocessed color image (bottom left), and the calculated depth image (bottom right).

By default, Protonect uses the OpenGL backend to generate the depth image. To test libfreenect2’s different DepthPacketProcessor backends, pass the backend name as an argument:

$ Protonect cl

Possible backends are: [gl | cl | cuda | cpu]
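If you want to compare them, a small shell loop like this works; close each Protonect window to continue with the next backend. Backends that were not compiled into your libfreenect2 build will simply fail to start.

$ for backend in gl cl cuda cpu; do Protonect "$backend"; done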

A possible error is insufficient permissions for the USB device:

[Error] [Freenect2Impl] failed to open Kinect v2: @2:9 LIBUSB_ERROR_ACCESS Access denied (insufficient permissions)

libfreenect2 provides a /usr/lib/udev/rules.d/90-kinect2.rules file that gives the Kinect the udev tag uaccess to grant user access. The error appears when this rule has not taken effect. A relogin fixes it; udevadm control -R did not seem to work for me. Running Protonect with sudo also helps temporarily.
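To check whether the rule took effect, you can inspect the device node mentioned in the error message (@2:9 means bus 2, device 9; adjust the numbers to your own output). 045e is Microsoft’s USB vendor ID.

$ lsusb | grep 045e
$ ls -l /dev/bus/usb/002/009
$ udevadm info /dev/bus/usb/002/009 | grep -i tags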

Using ROS

You can enter your ROS environment with

$ source /opt/ros/jade/setup.bash

You probably should create an alias for this environment in your shell config.
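For example, in your ~/.bashrc (the alias name is up to you):

alias rosjade='source /opt/ros/jade/setup.bash'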

Now you can launch roscore and leave it running in a separate shell.

$ roscore

Install ros-jade-rosbash to get rosrun. You can now list the options of the kinect2_bridge module:

$ rosrun kinect2_bridge kinect2_bridge -h

The default options for kinect2_bridge are the OpenCL registration method and the OpenGL depth method. You can start it like this:

$ rosrun kinect2_bridge kinect2_bridge
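Once the bridge is running, you can check from another shell (with the ROS environment sourced) that it is publishing its topics. The kinect2 prefix is the default base name.

$ rostopic list | grep kinect2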

Possible Problems

On NVIDIA, this fails for me with:

[ERROR] [Kinect2Bridge::start] Initialization failed!

This is due to the OpenCL registration method failing to initialize the OpenCL device.

A different error occurs with the beignet OpenCL implementation for Intel graphics. It seems beignet’s kernel compiler cannot build the OpenCL registration method’s kernels:

[ INFO] [DepthRegistrationOpenCL::init] devices:
[ INFO] [DepthRegistrationOpenCL::init] 0: Intel(R) HD Graphics Haswell Ultrabook GT3 Mobile (GPU)[Intel]
[ INFO] [DepthRegistrationOpenCL::init] selected device: Intel(R) HD Graphics Haswell Ultrabook GT3 Mobile (GPU)[Intel]
[ERROR] [DepthRegistrationOpenCL::init] [depth_registration_opencl.cpp](216) data->program.build(options.c_str()) failed: -11
[ERROR] [DepthRegistrationOpenCL::init] failed to build program: -11
[ERROR] [DepthRegistrationOpenCL::init] Build Status: -2
[ERROR] [DepthRegistrationOpenCL::init] Build Options:
[ERROR] [DepthRegistrationOpenCL::init] Build Log: stringInput.cl:190:31: error: call to 'sqrt' is ambiguous

This can be solved by using the CPU registration method:

$ rosrun kinect2_bridge kinect2_bridge _reg_method:=cpu

The OpenCL depth method with beignet produces a black screen. This can be solved by using the OpenGL depth method, which works fine with Mesa:

$ rosrun kinect2_bridge kinect2_bridge _reg_method:=cpu _depth_method:=opengl

Viewing the Point Cloud

Finally, open a shell in the ROS environment and launch the viewer:

$ rosrun kinect2_viewer kinect2_viewer

This will show you a point cloud with the color sensor mapped onto the depth data. The mapping will look slightly shifted; you need to calibrate your Kinect to get a better result.

[Image: color point cloud]
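The IAI Kinect v2 stack ships a kinect2_calibration tool for this. I won’t cover calibration here; the invocation below is only a sketch, assuming a printed chessboard with 5x7 inner corners and 3 cm squares, as used in the package’s examples.

$ rosrun kinect2_calibration kinect2_calibration chess5x7x0.03 record color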

You can also run the viewer in ir mode to see only the depth sensor’s output:

$ rosrun kinect2_viewer kinect2_viewer ir

[Image: ir point cloud]

Congratulations, you can take point cloud selfies now!

[Image: point cloud selfie]

For more information about ROS Jade on Arch Linux, see http://wiki.ros.org/jade/Installation/Arch
