I was able to attend SVVR 2016 last week, where I gathered many insightful VR impressions and saw where the industry is currently heading. I also want to thank Collabora for making this possible. Note that the opinions in this article are my own and not Collabora’s.

Booths

SculptrVR

Since I have always been enthusiastic about voxel graphics, I needed to check out SculptrVR’s HTC Vive demo. It utilizes the Vive’s SteamVR controllers and the Lighthouse full room tracking system to achieve a unique Minecraft-esque modeling experience. The company is a startup dedicated to this one product. The editor is currently capable of creating voxels with different cube sizes and colors, and, more importantly, of destroying them dynamically with rockets. The data structure used is a sparse voxel octree implemented in C++ on top of the Unreal 4 engine. It can export models to an OBJ vertex mesh; export to voxel formats like MagicaVoxel is not yet supported. The prototype was implemented in the Unity engine with Leap Motion input and Oculus DK2 HMD support, but the developer dropped that in favour of the Vive and Unreal 4, which gave him more solid tracking and rendering. Product development will continue in the direction of social model sharing and game support. Their software is available on Steam for $20.
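
To illustrate the kind of data structure SculptrVR describes, here is a minimal sketch of what a sparse voxel octree node might look like in C++. This is my own simplified illustration, not SculptrVR’s code; the names and the fixed-depth insertion are assumptions.

    #include <array>
    #include <cstdint>
    #include <memory>

    // Minimal sparse voxel octree sketch: each node either stores a solid
    // voxel (leaf) or up to eight children covering its octants. Empty
    // octants simply stay null, which is what makes the structure sparse.
    struct SvoNode {
        bool     isLeaf = false;
        uint32_t color  = 0;                       // RGBA of a solid leaf voxel
        std::array<std::unique_ptr<SvoNode>, 8> children;
    };

    // Insert a unit voxel with the given color at integer coordinates (x, y, z)
    // inside a cube of edge length 2^depth, subdividing lazily on the way down.
    void insertVoxel(SvoNode &node, int x, int y, int z, int depth, uint32_t color) {
        if (depth == 0) {                          // reached voxel resolution
            node.isLeaf = true;
            node.color  = color;
            return;
        }
        int half  = 1 << (depth - 1);
        int index = (x >= half ? 1 : 0) | (y >= half ? 2 : 0) | (z >= half ? 4 : 0);
        if (!node.children[index])
            node.children[index] = std::make_unique<SvoNode>();
        insertVoxel(*node.children[index],
                    x % half, y % half, z % half, depth - 1, color);
    }

Destroying voxels with a rocket would then amount to the inverse operation: clearing the affected leaves and pruning subtrees that become empty.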

Whirlwind VR

One of the most curious hardware accessories was presented by Whirlwind VR: basically a fan with a USB connector and a Unity engine plugin. It obviously adds immersion to demos involving rapid movement in an open air vehicle. Other use cases utilize the fan’s heating element to simulate a dragon breathing fire into your face. Whether there is a widespread market for this product is questionable, but I encourage exploring every bit of uncaptured human sense that is left.


Sixense

The creators of the first consumer-available VR controller, the Razer Hydra, were presenting their new full room body tracking system STEM, including VR controllers for your hands and three additional trackers for feet and head. It targets Oculus Rift CV1 customers, who, in contrast to HTC Vive owners, do not get this functionality out of the box. The demo setup was a fun 2-player medieval bow shooting game, created with the Unreal 4 engine. One PC was using the standard Vive Lighthouse tracking, the other an Oculus CV1 with Sixense’s product prototype. Tracking results were very comparable, though I would prefer the Vive due to the controllers’ more finished feel and user experience. Even more so when you consider the $995 price tag for the full five-tracker system, compared to the HTC Vive’s $300 price difference to the Oculus CV1.

mimesys

A personally very interesting demo was presented by Paris-based company mimesys, which specializes in telepresence, or “holographic telecommunication”. They used a Kinect v2 to capture a point cloud, reconstruct a polygon mesh and send it compressed over the internet, which is comparable to ideas in my prior work Holochat. In their live demo you could use the SteamVR controller to draw in the air and “Holoskype” with their colleague, who was live in Paris. The quality of the mesh and texture streaming was at a very early stage from my point of view, knowing there is more potential in a Kinect v2 point cloud. Overall the network latency of the mesh was pretty high, but unnoticeable since voice chat was the primary communication method (which was handled over Skype, by the way). The company’s product is a Unity engine plugin implementing vertex and texture transfers over the internet. Using video codecs for the textures would improve this kind of data transfer, but that was not implemented as far as I could tell.
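
To make the codec point more concrete, here is a rough sketch of what a per-frame payload for such a mesh stream could look like. This is purely my own illustration, not mimesys’ actual protocol; the point is that the geometry changes every frame anyway, so the color texture is the natural candidate for video compression.

    #include <cstdint>
    #include <vector>

    // Hypothetical per-frame payload for streaming a reconstructed mesh.
    // Positions are quantized to 16 bits per axis to save bandwidth; the
    // color texture would ideally be a frame of a video stream (e.g. H.264)
    // instead of a stand-alone image, so the codec can exploit temporal
    // redundancy between frames.
    struct MeshFrame {
        uint32_t frameIndex;
        float    bboxMin[3], bboxMax[3];      // dequantization range for positions
        std::vector<uint16_t> positions;      // 3 values per vertex, quantized
        std::vector<uint16_t> texcoords;      // 2 values per vertex, quantized
        std::vector<uint32_t> indices;        // 3 indices per triangle
        std::vector<uint8_t>  textureFrame;   // encoded color image / video frame
    };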

Tactical Haptics

A very well researched haptic feedback controller prototype was presented by Californian company Tactical Haptics. You notice the academic background in mechanical engineering and haptics when you talk to professor and founder William Provancher. He knows his numbers: haptic feedback needs to run at about 1000 Hz to trick the human skin into perceiving it as real time, in contrast to only about 60 Hz for the human eye. Their physics engine, running at only ~100 Hz (I don’t remember the exact number), was more than sufficient for their first-class haptics system. Since the interactions in the demo were rather coarse, like juggling cubes with Jedi powers or shooting drones with a bow, the latency was more than adequate for an immersive haptic experience that was unique at the expo. Their product Reactive Grip manages to replicate the “skin sensations of actually holding an object” and has a conceivably widespread use in future VR controllers for action experiences and workouts.
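
Those numbers map naturally onto a multirate update loop: physics stepped at ~100 Hz, haptic output recomputed at ~1000 Hz from the most recent physics state. The sketch below is my own illustration of that pattern, not Tactical Haptics’ engine; the stepPhysics and updateHapticForce calls are placeholders.

    #include <chrono>
    #include <thread>

    // Illustrative multirate loop: the haptic update runs at ~1 kHz so the
    // skin never perceives stale feedback, while the physics simulation only
    // needs ~100 Hz. The haptic force is derived from the latest physics
    // state (a real system would interpolate or use a local proxy model).
    int main() {
        using clock = std::chrono::steady_clock;
        const auto physicsStep = std::chrono::milliseconds(10); // ~100 Hz
        const auto hapticStep  = std::chrono::milliseconds(1);  // ~1000 Hz

        auto nextPhysics = clock::now();
        while (true) {
            auto tickStart = clock::now();
            if (tickStart >= nextPhysics) {
                // stepPhysics();            // advance rigid bodies, collisions
                nextPhysics += physicsStep;
            }
            // updateHapticForce();          // sample latest physics state, drive actuators
            std::this_thread::sleep_until(tickStart + hapticStep);
        }
    }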

Ricoh THETA

Japanese hardware vendor Ricoh was presenting their consumer-targeted spherical camera, which is already on the market. It seamlessly stitches the images of its two wide-angle (more than hemispherical) sensors in hardware and sends the result to the phone, providing a very user-friendly interface. A client-agnostic, REST-like interface is provided to control the camera, which the Android and iOS clients use. Streaming video is also possible, but requires a PC with their proprietary driver for Windows and macOS to stitch it in real time. The camera costs only about 400 bucks and will very soon flood the internet with amateur spherical photos and videos. Spherical video and audio were a big topic at the expo, but I have a problem with the marketing term “360° video”, since degrees measure 2D angles, while in 3D we deal with solid angles. So please call it 4π steradian video from now on if you address the full sphere, or 2π sr for the hemisphere.
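
For reference: the solid angle subtended by an area A on a sphere of radius r is Ω = A / r², so the full sphere covers 4πr² / r² = 4π sr and a hemisphere covers half of that, 2π sr.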

Nokia Ozo

Nokia has stopped making unpopular phones and started targeting the professional spherical video artist with their 8-sensor spherical stereoscopic camera, available for an affordable 60 grand. The dual-sensor setup in every direction, separated by the average human eye distance, is said to provide perfect conditions for stereo capture. They also provide the editing software to be used with it. It seems that for editing purposes the raw sensor data is stored as 2×4 circular tiles on a planar video. The video can surely be exported into two spheres (because we have stereo) with the commonly used equirectangular mapping onto a plane, which is more storage efficient since it avoids tons of black borders. Their live demo, where you could view the “live” camera output with a DK2, was rather disappointing because of a latency of 3 s (yes, full seconds) and very noticeable seams. Their software does not target live video processing yet, but there wasn’t a finished rendering available either. The camera looks like a cute android though.
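
For the curious, equirectangular mapping simply spreads longitude and latitude linearly over the image axes. A minimal sketch of mapping a view direction to texture coordinates (my own illustration with an assumed axis convention, not Nokia’s exporter) could look like this:

    #include <cmath>

    // Map a normalized 3D view direction to equirectangular texture
    // coordinates in [0,1]^2: longitude goes to u, latitude goes to v.
    // Every texel is used, unlike the circular fisheye tiles of the raw
    // sensor layout, which waste the corners on black borders.
    void directionToEquirect(float dx, float dy, float dz, float &u, float &v) {
        const float pi = 3.14159265358979f;
        float longitude = std::atan2(dx, -dz);     // [-pi, pi], axis convention assumed
        float latitude  = std::asin(dy);           // [-pi/2, pi/2]
        u = longitude / (2.0f * pi) + 0.5f;
        v = 0.5f - latitude / pi;
    }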


High Fidelity

San Francisco based, future-driven company High Fidelity wants to build for VR what the Apache server is for the web. In the keynote, founder Philip Rosedale talked about the “Hyperlink for VR”. They provide an in-house game engine with multiplayer whiteboard drawing and voice chat support. The client is open source software, as is the server, and both are meant to run on Linux, Windows and macOS. Telepresence is a very important topic for VR, but High Fidelity does not yet offer the possibility of including depth sensors like the Kinect or spherical cameras in its world; it represents humans with virtual models, which puts you straight into the uncanny valley. Especially when the skeleton animation is buggy and your model is doing unhealthy-looking yoga merged into the floor. Great stuff though, I am looking forward to building it myself and fixing some Linux client issues 🙂

OSVR

Razer was presenting their open source middleware, which wraps available drivers into their signal processing framework and does things like sensor fusion using available computer vision algorithms like the ones in OpenCV. They also provide their OSVR-branded headset, marketed as the Hacker Development Kit, which offers the freedom of a replaceable display and other repairable components. The headset does 1920×1080@60Hz, which is a slightly worse frame rate than that of the Oculus DK2. The most remarkable aspect of this headset was the visible lack of a screen door effect at this resolution, achieved by a physical distortion filter on the display. If you don’t own a DK2 and don’t have the € for a Vive, the $300 OSVR is a very solid option. Using OSVR as software also makes it easier to support more than just headset hardware, since it additionally provides support for controllers like the SteamVR controllers and tracking systems like the Leap Motion. You only need to implement against OSVR once, instead of wrapping all HMD and input APIs in your application. Valve’s OpenVR is also an attempt to do that, but it lacked presence at the conference.
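
As a taste of what “implement against OSVR once” looks like, reading the head pose takes only a few lines with the ClientKit C++ wrapper. The snippet follows the ClientKit examples as far as I remember them; treat the exact include path and callback signature as assumptions and check the OSVR documentation before relying on them.

    #include <iostream>
    #include <osvr/ClientKit/ClientKit.h>      // OSVR ClientKit C++ wrapper

    // Called whenever a new head pose report arrives, regardless of which
    // HMD or tracking system actually produced it.
    void onHeadPose(void * /*userdata*/, const OSVR_TimeValue * /*timestamp*/,
                    const OSVR_PoseReport *report) {
        std::cout << "head position: "
                  << report->pose.translation.data[0] << ", "
                  << report->pose.translation.data[1] << ", "
                  << report->pose.translation.data[2] << "\n";
    }

    int main() {
        osvr::clientkit::ClientContext context("com.example.osvrdemo"); // app id is arbitrary
        osvr::clientkit::Interface head = context.getInterface("/me/head");
        head.registerCallback(&onHeadPose, nullptr);
        for (;;) {
            context.update();                  // pump the OSVR client loop
        }
    }

Controllers and other trackers are exposed through the same path-based interface, which is where the write-once benefit comes from.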


NVIDIA

Graphical horsepower could be experienced in both NVIDIA demos, running on current high-end NVIDIA GTX 980 cards. I was a little disappointed they did not bring one of their yet-to-be-released Pascal architecture cards though. NVIDIA had both high-end consumer headsets, the Oculus CV1 and the HTC Vive, on its show floor. The Oculus demo was gaming oriented, and since the Vive is more capable for VR productivity demos, thanks to full room tracking and its VR controllers, it was used for the more creative things.

Oculus CV1 + Eve: Valkyrie

The first demo showcased the Oculus CV1 with the space ship shooter Eve: Valkyrie. The rendering was smooth thanks to the CV1’s 90 Hz refresh rate, and the screen door effect was also eliminated by the HMD’s 2160×1200 resolution. Experienced VR users will very quickly notice the small tracking area, basically limited to a chair in front of a desk, in contrast to the full room tracking of the HTC Vive. The user experience is also very traditional, using only an Xbox game pad. With the lack of a VR controller, users quite unintuitively miss their hands in VR and cannot interact in 3D space. Hands are much better for this than an analogue stick, as we will see further below in this article.


Classical game pads also have the problem that the button order changes with every manufacturer, which makes it complicated even for experienced gamers to know which button means accept and which one means back. This issue also landed me in Eve: Valkyrie’s payment acceptance screen, out of which I needed to be guided by the booth exhibitor.

HTC Vive + Google Tilt Brush

One of the most influential VR experiences for me was Google’s Tilt Brush on the HTC Vive, which, by the way, also runs at 2160×1200@90Hz. The ability to draw freely in 3D space with full room tracking, combined with the Vive’s immersive display capabilities, provides a very unique experience that feels like a new medium for artists. The user interface is very simple, with a palette on the left controller and a brush on the right. Of course you can easily switch hands if you are left-handed. This natural input and camera movement enables the user to be creative in 3D space without the steep learning curve of contemporary “2D interface” CAD software. The creative expression possible with Tilt Brush is by itself a good reason to get an HTC Vive for home already. I am looking forward to the stuff artists can do now.

The other demo at the productivity booth was a point cloud of NVIDIA’s new headquarters’ construction site, recorded by drones with depth sensors. The scene’s resolution and rendering were not remarkable, but it was still fun to navigate with the SteamVR controllers. I can definitely see architects and construction engineers planning their work in VR.


Noitom

Another one of the top three demos I was able to experience was presented by #ProjectAlice under the name Follow the White Rabbit. They utilize Noitom’s tracking system to achieve a remarkable multiplayer VR demo, where real objects like garbage cans are perfectly tracked into the virtual world and can interact with virtual objects. They were using regular Wiimote controllers with plastic markers to showcase the potential of their own tracking. Their demo emphasizes how important interaction is in the virtual world and how natural it needs to feel. The scale of the tracking system is intended more for public installations than home environments, but I would love to see more real/virtual world interaction in consumer products, which can also be achieved with consumer trackers like the HTC Vive’s Lighthouse. Also note the hygienic face masks they offered for the public HMDs. They were the ninjas of VR.



Talks

Keynote

The keynote speakers showed their current products and market visions. AltspaceVR, NVIDIA, Nokia, High Fidelity and SpaceVR gave presentations. SpaceVR will launch a spherical camera into space, which you can stream from home onto your headset. Road to VR editor Benjamin Lang gave his insights into the development of the industry so far and his 100-year forecast for achieving perfect immersion. I think we will be there in less than 30.

Palmer Luckey was also there to drop some game title names from the Oculus store and quickly left the expo grounds after his session to avoid much public interaction.

Light Fields 101

Compared to conventional CMOS sensors, light field cameras like Lytro’s are able to capture photon rays from multiple directions and provide a data set where things like focusing can be done in post production or live in VR environments. Ryan Damm shared insights into his understanding of and research into light field technology, where, according to him, many things happen secretively, and mentioned companies like Magic Leap, which are still in the process of developing their products.
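
The post-capture refocusing mentioned above can be understood through the classic shift-and-add scheme: every sub-aperture view is shifted in proportion to its offset from the central view and the results are averaged, which brings a chosen depth plane into focus. The sketch below is a simplified illustration of that idea, not how Lytro’s pipeline actually works.

    #include <algorithm>
    #include <vector>

    struct Image {                                   // trivial grayscale image
        int w = 0, h = 0;
        std::vector<float> px;                       // row-major, w*h values
        float at(int x, int y) const {
            x = std::min(std::max(x, 0), w - 1);     // clamp at the border
            y = std::min(std::max(y, 0), h - 1);
            return px[y * w + x];
        }
    };

    // Synthetic refocus of a (2r+1)x(2r+1) grid of sub-aperture views.
    // 'slope' selects the depth plane brought into focus: each view is
    // shifted by slope * its (u,v) offset from the central view, then all
    // views are averaged.
    Image refocus(const std::vector<std::vector<Image>> &views, int r, float slope) {
        Image out;
        out.w = views[r][r].w;
        out.h = views[r][r].h;
        out.px.assign(out.w * out.h, 0.0f);
        for (int v = -r; v <= r; ++v)
            for (int u = -r; u <= r; ++u)
                for (int y = 0; y < out.h; ++y)
                    for (int x = 0; x < out.w; ++x)
                        out.px[y * out.w + x] +=
                            views[v + r][u + r].at(int(x + slope * u), int(y + slope * v));
        for (float &p : out.px) p /= float((2 * r + 1) * (2 * r + 1));
        return out;
    }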

How Eye-Interaction Technology Will Transform the VR Experience

Jim Marggraff presented his long professional experience with eye tracking and how necessary it is for natural interfaces. He showed a whack-a-mole game on the Oculus DK2 where head tracking and eye tracking were compared. A random person from the audience could, unsurprisingly, select the moles faster with eye tracking than by moving their head. He also showed a fully featured eye tracking operating system interface where everything could be done with just your eyes, from buying something on Amazon to chatting. Password authentication is also drastically simplified with eye tracking, since you get retina recognition for free with it. I think eye tracking will be as essential as hand tracking in future VR experiences, since human interaction and natural input are the next important steps that need to reach perfection once we have 4k@144Hz HMDs.

WebVR with Mozilla’s A-Frame

Not only is Ben Nolan the right guy to ask if you’re into authentic American craft beer bars in San Jose, he is also a JavaScript developer with enthusiasm for VR in the web browser. He showed A-Frame, an open source JavaScript framework with HMD support, which is already usable in the popular browsers’ nightly builds. The framework provides an XML-based model format and scene graph, plus physics and shading extensibility, utilizing the WebGL standard. The big benefit of having VR support in the browser is clearly ease of distribution and quick content creation. He pointed out that a minimal A-Frame project is as small as 1kB, whereas a Unity web build is at least ~0.5MB. Type aframe.io into your Android phone and try it for yourself.
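
To give an impression of how lightweight that is, a complete A-Frame scene fits in a handful of markup lines. The script URL and version below are only from my memory of that time, so treat them as assumptions:

    <html>
      <head>
        <!-- A-Frame is pulled in as a single script; the scene itself is just markup -->
        <script src="https://aframe.io/releases/0.2.0/aframe.min.js"></script>
      </head>
      <body>
        <a-scene>
          <a-entity geometry="primitive: box" material="color: #4CC3D9"
                    position="0 0.5 -3"></a-entity>
          <a-sky color="#ECECEC"></a-sky>
        </a-scene>
      </body>
    </html>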

Apollo 11 VR – (Designed to Demo, Developed to Educate)

David Whelan of Ireland-based VR Education Ltd pointed out how important VR will be for future education. He made the very good point that experiencing something tends to have a higher impact on memorizing facts than sitting in a classroom. He showed the development process of their Kickstarter-funded Apollo 11 VR demo and their current work Lecture VR, both of which are already available on Steam.

The Missing Link: How Natural Input Creates True Immersion for VR/AR

One of the most spot-on talks was given by Leap Motion co-founder David Holz, who pointed out the necessity of natural input in VR. The Leap Motion is a ~$100 high-FOV infrared camera that has been available since 2013. But their major technological breakthrough has only been observable since this year, after they released the second iteration of their tracking software, code-named Orion. Holz showcased the stability and precision of their hand tracking on stage, which was quite remarkable. But he won the audience over when he showed their current Blocks demo, where objects can be manipulated with a physically plausible interaction engine. Natural gestures are used to create objects, grab them and throw them around. If you didn’t see the demo, try it at home; you just need a DK2 and a Leap Motion sensor. It feels a generation ahead of the ordinary VR demo and shows how much immersion is gained by seeing your hands and even using them for interaction. He also showed user interface designs for VR that are projected onto the body and into the room. Conventional 2D interfaces, where we need to stack windows and tabs, seem very primitive in comparison. He also talked about how VR/AR interfaces will eliminate the need for a work desk and chair, since all meetings and work can be done in the forest or in a lounge.
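
If you want to experiment with this yourself, the Leap Motion C++ SDK exposes per-hand grab and pinch strengths that are already enough for simple object interaction. A rough sketch, based on the v2 SDK as I remember it (verify the names against the official documentation):

    #include <chrono>
    #include <iostream>
    #include <thread>
    #include "Leap.h"                          // Leap Motion C++ SDK header

    int main() {
        Leap::Controller controller;           // connects to the Leap Motion service

        while (true) {
            Leap::Frame frame = controller.frame();     // most recent tracking frame
            for (const Leap::Hand &hand : frame.hands()) {
                // grabStrength() goes from 0 (open hand) to 1 (fist); a simple
                // threshold is enough to decide whether to pick up a virtual object.
                if (hand.grabStrength() > 0.8f) {
                    Leap::Vector p = hand.palmPosition();   // millimeters, sensor space
                    std::cout << (hand.isLeft() ? "left" : "right")
                              << " hand grabbing at "
                              << p.x << ", " << p.y << ", " << p.z << "\n";
                }
            }
            std::this_thread::sleep_for(std::chrono::milliseconds(10));
        }
    }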

Conclusion

The expo showed how important novel human interaction methods are in VR: it is as obvious to replace the keyboard, mouse and game pad with natural body movement tracking as it was to replace conventional displays with HMDs.

A big part of the industry also focuses on spherical video, since it is currently the quickest method of bringing the real world into VR.

All in all, exciting stuff, thanks for reading.

TL;DR: Get your hands on an HTC Vive + Tilt Brush and Leap Motion Blocks.

 
