Monday, August 3, 2015

[MM] Update, point clouds

Lots of hardware issues and a few software ones mean this update isn't the most exciting, but it is obviously better than before! I am rendering a point cloud generated from one Kinect (which wasn't the one I started with... hardware issues...) stably in world space. Here, I did a programmer dance for you:


Next steps:

  1. Turn off the computer.
  2. Instead of taking a break, do wire management and figure out a decent setup for capture in my apartment.
  3. Manually align the Oculus frame and head position in the point cloud.
  4. Profile and optimize so that we run at 75 Hz.
  5. Make the point cloud flicker in a cooler way.
  6. Claim Victory.
  7. Add automatic calibration between the Oculus and Kinect.
  8. Add a second Kinect.
  9. Add a third Kinect.
  10. Experiment with other rendering techniques.
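
For the manual alignment step, the idea is just a rigid transform: estimate a rotation and translation taking the Kinect's camera frame into the Oculus/world frame, then apply it to every point. A minimal sketch of that (NumPy; the function names and the hand-tuned pose are made up for illustration, this is not the jam code):

```python
import numpy as np

def rigid_transform(yaw_deg, translation):
    """Build a 4x4 homogeneous transform from a yaw rotation (degrees)
    about the vertical (+Y) axis and a translation vector."""
    theta = np.radians(yaw_deg)
    c, s = np.cos(theta), np.sin(theta)
    T = np.eye(4)
    T[:3, :3] = [[c, 0.0, s],
                 [0.0, 1.0, 0.0],
                 [-s, 0.0, c]]
    T[:3, 3] = translation
    return T

def align_points(points, kinect_to_world):
    """Map an (N, 3) point cloud from the Kinect frame into world space."""
    homo = np.hstack([points, np.ones((len(points), 1))])  # (N, 4)
    return (homo @ kinect_to_world.T)[:, :3]

# Hand-tuned guess: Kinect yawed 90 degrees, sitting 2 m to the side.
T = rigid_transform(90.0, [2.0, 0.0, 0.0])
cloud = np.array([[1.0, 0.0, 0.0]])
print(align_points(cloud, T))  # the point, re-expressed in world space
```

Tweaking the yaw and translation by hand until the rendered cloud lines up with where your body feels like it is gets you step 3; step 7 (automatic calibration) would replace the hand-tuned numbers with a solved-for pose.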

2 comments:

  1. Ok, programmer dancing is way better than programmer art. That alone makes this the winning blog post of the day. You should be able to record and play back the point clouds, right? So we could see your dancing in 3D here on the East Coast?

    Hmm...we could even stream it live via G3D networking, right? There would be latency, but we'd see a real holo-avatar of you in realtime then.

  2. Yes! In fact, the project I'm ripping off during this jam was designed for exactly that. They used some clever encoding so that three simultaneous 3D videos (each recorded from three Kinects) worked fairly well. If all you want is playback and not realtime streaming, there's also the new Microsoft work (https://www.youtube.com/watch?v=kZ-XZIV-o8s).
