Monday, August 3, 2015

[MM] Plan for Direct Body Placement in VR

I'm Michael Mara, a graduate student at Stanford University advised by Pat Hanrahan, and a researcher at NVIDIA. For the past week I have been waffling back and forth between a large number of possible projects for the VR Jam. Here is a partial list of what I seriously considered:


  • AR by scanning the room with a Kinect beforehand and importing the model into the VR scene
  • Music Visualization
  • Big screen TV in VR
  • Full virtual machine desktop in VR
  • Godzilla stomping buildings (using a Kinect to track legs and a VR headset for display)
  • Video conferencing in VR (use a Kinect or stereo camera to get color+depth, render a 3D floating head or full body with only the head moving)
  • VR Remote Control car/drone
  • RTS game where you are explicitly a commander using VR to control troops (this plays into the limitations of current VR, but would require many jams' worth of prerequisites: screens in VR, gesture control)
  • Soccer Header Game
  • Short 3D film in VR movie theatre, with something completely coming out of the screen
  • Getting a barbershop haircut
  • VR Cards
  • VR Ping Pong
  • Cast magic spells
  • Dodge bullets in the Matrix (Obviously would be better with bodytracking…)


Searching around earlier today for other projects that might spur me toward one of these ideas, I found this video:


I encourage you to watch the video, but the gist is that this person set up three Microsoft Kinects (color+depth cameras) in an equilateral triangle around his body while using an Oculus Rift. The depth and color data are then used to reconstruct a low-quality mesh of his body every frame.


Blurry, "glitchy", but your actual hands, in VR!

What this gives you is a blurry, glitchy mess of an avatar... but it is your real body, fully articulated, and it has very low latency (just the Kinect's latency to deliver a frame, plus a frame to process it client-side).

This was amazing to me, and I decided to drop my other ideas until I have implemented a rudimentary version of it. Matt Fisher, a postdoc in my research group at Stanford, will be providing some assistance. I also want to take it a bit further and lean into the "glitchy" feel of the resulting avatar by adding rendering effects that play it up and make the experience feel like one from a movie or game. A couple of examples of "glitchy" non-photorealistic rendering (NPR) are below, followed by a rough sketch of one such effect:

Glitch effect from the 2012 movie Wreck-It Ralph
Sample image from ImageGlitcher
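
To make that concrete, here is a minimal CPU-side sketch of the sort of effect I mean: shift whole horizontal bands of the image sideways and pull the color channels slightly apart. In the real project this would live in a G3D shader; the Pixel and Image types below are stand-ins I made up for the sketch, not any particular library's API, and the band count and strength values are arbitrary knobs to tune by eye.

// Rough CPU-side sketch of a "glitch" effect: shift whole horizontal bands
// sideways and pull the color channels slightly apart. Pixel/Image are
// made-up stand-ins for this sketch, not a real library's types.
#include <algorithm>
#include <cstdint>
#include <random>
#include <vector>

struct Pixel { uint8_t r, g, b; };

struct Image {
    int width = 0, height = 0;
    std::vector<Pixel> data;                       // row-major, width * height
    Pixel& at(int x, int y) { return data[y * width + x]; }
};

Image glitch(const Image& src, float strength, uint32_t seed) {
    std::uniform_real_distribution<float> uni(0.0f, 1.0f);

    Image dst = src;
    const int bandHeight   = std::max(1, src.height / 40);        // ~40 horizontal bands
    const int maxBandShift = int(strength * src.width * 0.05f);   // worst-case band offset
    const int chromaShift  = std::max(1, int(strength * 4.0f));   // channel separation in pixels

    for (int y = 0; y < src.height; ++y) {
        // Seed per band so every row in a band shifts by the same amount,
        // and only some bands get displaced at all.
        const int    band = y / bandHeight;
        std::mt19937 bandRng(seed ^ (uint32_t(band) * 2654435761u));
        const bool   glitched = uni(bandRng) < 0.25f;
        const int    shift    = glitched ? int((uni(bandRng) - 0.5f) * 2.0f * maxBandShift) : 0;

        for (int x = 0; x < src.width; ++x) {
            auto clampX = [&](int v) { return std::clamp(v, 0, src.width - 1); };
            // Sample each channel from a slightly different column for a
            // chromatic-aberration look on top of the band displacement.
            dst.at(x, y).r = src.data[y * src.width + clampX(x + shift + chromaShift)].r;
            dst.at(x, y).g = src.data[y * src.width + clampX(x + shift)].g;
            dst.at(x, y).b = src.data[y * src.width + clampX(x + shift - chromaShift)].b;
        }
    }
    return dst;
}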

There are a lot of moving parts in this project (and I will be setting up one of my dev environments from scratch concurrently with the jam), so I will be keeping my goals for the jam modest, with a very healthy list of optional goals that would keep me busy for at least a month in case everything goes amazingly smoothly.

Goal: Render a properly positioned avatar of myself in real time on a DK2 with low-latency, fairly accurate tracking, and non-photorealistic rendering.


Tools: Oculus Rift DK2, a Microsoft Kinect, and G3D (via its VRApp framework).
Mandatory Steps:
  1. Get color and depth data out of a single Kinect within a G3D VRApp.
  2. Render a point cloud based on the Kinect data (steps 2 through 4 are roughly sketched in code after this list).
  3. Remove background points from the point cloud.
  4. Calibrate the transform between the Kinect's camera space and the Oculus's.
  5. Design a simple NPR shader to add "glitchiness" to the now-existent avatar.
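
Since steps 2 through 4 are the mathematical core of the project, here is roughly how I picture them fitting together, as a hedged C++ sketch rather than real G3D or Kinect SDK calls: the depth format (16-bit millimeters), the pinhole intrinsics, the background depth threshold, the axis conventions, and the Kinect-to-tracking-space transform are all assumptions or placeholders that will have to come from the actual SDK and a calibration procedure.

// Hedged sketch of mandatory steps 2-4: back-project a depth frame into a
// point cloud, drop background points with a simple depth threshold, and
// move the surviving points into the Oculus tracking space with a
// calibrated rigid transform. Intrinsics, depth format, and the transform
// are placeholders, not real SDK values; axis conventions are hand-waved.
#include <cstdint>
#include <vector>

struct Vec3 { float x, y, z; };

struct Intrinsics {           // pinhole model of the Kinect depth camera
    float fx, fy;             // focal lengths in pixels
    float cx, cy;             // principal point in pixels
};

struct RigidTransform {       // rotation (row-major 3x3) plus translation
    float R[3][3];
    Vec3  t;
    Vec3 apply(const Vec3& p) const {
        return { R[0][0]*p.x + R[0][1]*p.y + R[0][2]*p.z + t.x,
                 R[1][0]*p.x + R[1][1]*p.y + R[1][2]*p.z + t.y,
                 R[2][0]*p.x + R[2][1]*p.y + R[2][2]*p.z + t.z };
    }
};

// Step 2: back-project every valid depth pixel into Kinect camera space.
// Step 3: reject pixels beyond maxDepthMeters as background.
std::vector<Vec3> depthToPoints(const std::vector<uint16_t>& depthMM,
                                int width, int height,
                                const Intrinsics& K,
                                float maxDepthMeters) {
    std::vector<Vec3> points;
    points.reserve(depthMM.size());
    for (int v = 0; v < height; ++v) {
        for (int u = 0; u < width; ++u) {
            const uint16_t raw = depthMM[v * width + u];
            if (raw == 0) continue;                    // no reading at this pixel
            const float z = raw * 0.001f;              // millimeters -> meters
            if (z > maxDepthMeters) continue;          // crude background removal
            points.push_back({ (u - K.cx) * z / K.fx,  // standard pinhole back-projection
                               (v - K.cy) * z / K.fy,
                               z });
        }
    }
    return points;
}

// Step 4: once the Kinect-to-tracking-space transform has been calibrated
// (e.g. by lining up a few known points), apply it to every point so the
// avatar lands where the Rift thinks my body is.
std::vector<Vec3> toTrackingSpace(const std::vector<Vec3>& kinectPoints,
                                  const RigidTransform& kinectToTracking) {
    std::vector<Vec3> out;
    out.reserve(kinectPoints.size());
    for (const Vec3& p : kinectPoints) out.push_back(kinectToTracking.apply(p));
    return out;
}

The calibration transform itself is the part I expect to be painful; this sketch only shows applying it once it exists.
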
Optional Steps:
  • Add a second Kinect
  • Add a third Kinect
  • Investigate the method used by Oliver Kreylos to calibrate all three Kinects, and implement it
  • Investigate the method used by Oliver Kreylos to construct a mesh from the multiview depth data, and implement it.
  • Test using screen-space raytracing for rendering the avatar directly
  • Use a Kinect v2 or other higher-quality sensor
  • Improve the NPR shader
  • Apply shadowing to the virtual scene using the rendered avatar
  • Apply AO to the virtual scene using the rendered avatar
  • Calculate spherical harmonic (SH) lighting for the real world, subtract it out in the rendering, and apply virtual lighting instead (roughly sketched below)
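
That last optional item is the most speculative, but the capture half of it is well understood: project samples of the real room's lighting onto the first nine real spherical harmonic basis functions (the order-2 representation popularized by Ramamoorthi and Hanrahan). Here is a rough sketch, assuming I can somehow obtain (direction, radiance) samples covering the sphere, perhaps from the Kinect color stream; the sample source is entirely a placeholder.

// Rough sketch of the lighting-capture half of the last optional step:
// project (direction, radiance) samples of the real room onto the first
// nine real spherical harmonic basis functions. Where the samples come
// from is left open; the inputs here are placeholders.
#include <array>
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };              // assumed to be unit-length directions
struct RGB  { float r, g, b; };

// Real SH basis for bands l = 0..2, evaluated at a unit direction.
std::array<float, 9> shBasis(const Vec3& d) {
    return {
        0.282095f,                               // Y(0, 0)
        0.488603f * d.y,                         // Y(1,-1)
        0.488603f * d.z,                         // Y(1, 0)
        0.488603f * d.x,                         // Y(1, 1)
        1.092548f * d.x * d.y,                   // Y(2,-2)
        1.092548f * d.y * d.z,                   // Y(2,-1)
        0.315392f * (3.0f * d.z * d.z - 1.0f),   // Y(2, 0)
        1.092548f * d.x * d.z,                   // Y(2, 1)
        0.546274f * (d.x * d.x - d.y * d.y)      // Y(2, 2)
    };
}

// Monte Carlo projection: average radiance * basis over the sampled
// directions, scaled by the solid angle of the full sphere (4*pi).
// Assumes the sample directions cover the sphere roughly uniformly.
std::array<RGB, 9> projectLighting(const std::vector<Vec3>& dirs,
                                   const std::vector<RGB>& radiance) {
    std::array<RGB, 9> coeffs{};                 // zero-initialized
    const float weight = 4.0f * 3.14159265f / float(dirs.size());
    for (size_t i = 0; i < dirs.size(); ++i) {
        const std::array<float, 9> Y = shBasis(dirs[i]);
        for (int k = 0; k < 9; ++k) {
            coeffs[k].r += radiance[i].r * Y[k] * weight;
            coeffs[k].g += radiance[i].g * Y[k] * weight;
            coeffs[k].b += radiance[i].b * Y[k] * weight;
        }
    }
    return coeffs;
}

Actually dividing that lighting back out of the captured avatar colors, and relighting with the virtual scene's lights, is the harder half and is left for the optional month.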

Several potential problems:

  • I will be setting up a dev environment in parallel with the jam, at a location 15 minutes away from my current dev machine.
  • Just getting the Kinect to play nicely; I tried for a long time today to get the Kinect v2 working with my computer before giving up and going back to the original Kinect
  • Calibration is hard, and it is very hard to get high accuracy; I wonder how accurate it needs to be for presence?
  • NPR shaders are easy to make, but not necessarily easy to make look good.
