Microsoft Kinect
Laszlo took the initiative to get an exciting opportunity to play with the Kinect. There seems to be no shortage of gesture-based coding going on, and gesture recognition gets a lot simpler thanks to the Kinect SDK. Laszlo quickly introduced the Kinect and its infra-red projector and camera, as well as the fact that it is extremely easy to register for data-ready notifications. Depth and skeletal vector data are provided at regular intervals.
He went on vacation before I could ask him for code-level details, so I had a look at another project for the details:
Set up the runtime:
r = new Nui.Runtime()
Register callbacks:
r.DepthFrameReady += DepthFrameReady
r.SkeletonFrameReady += SkeletonFrameReady
Ask for notification:
r.Initialize(UseDepth | UseSkeletalTracking)
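Filling in the code-level details myself, here is roughly what that pattern looks like against the beta-era Kinect SDK in C#. This is my own sketch, not Laszlo's code, and the exact class, enum, and event names are as I recall the beta SDK (Microsoft.Research.Kinect.Nui); they may differ in later releases.

    using Microsoft.Research.Kinect.Nui;

    class KinectHookup
    {
        Runtime runtime;

        public void Start()
        {
            // Set up the runtime.
            runtime = new Runtime();

            // Register callbacks for data-ready notifications.
            runtime.DepthFrameReady += OnDepthFrameReady;
            runtime.SkeletonFrameReady += OnSkeletonFrameReady;

            // Ask for notification of the streams we care about.
            runtime.Initialize(RuntimeOptions.UseDepth | RuntimeOptions.UseSkeletalTracking);

            // In the beta SDK the depth stream also has to be opened explicitly.
            runtime.DepthStream.Open(ImageStreamType.Depth, 2,
                ImageResolution.Resolution320x240, ImageType.Depth);
        }

        void OnDepthFrameReady(object sender, ImageFrameReadyEventArgs e)
        {
            // e.ImageFrame.Image.Bits holds the raw depth values for this frame.
        }

        void OnSkeletonFrameReady(object sender, SkeletonFrameReadyEventArgs e)
        {
            // e.SkeletonFrame.Skeletons holds joint positions (vectors) per tracked player.
        }
    }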
Laszlo then demonstrated the limitations of the device (its depth range is roughly 800mm to 4096mm) as well as the limits of what it is suited for (it's not a mouse!).
He is working on a trial-and-error basis to discover which gestures work well and has created a pluggable design so he can switch experiments quickly. He was surprised to discover that some gestures worked better in the large conference room and some worked worse in the visually noisy environment.
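I don't know what Laszlo's pluggable design actually looks like, but a minimal sketch of the idea (the interface and names here are entirely my own invention) could be as simple as a common contract that every gesture experiment implements:

    using System;
    using Microsoft.Research.Kinect.Nui;

    // Hypothetical plug-in contract: each experiment consumes skeleton frames
    // and raises an event when it thinks it saw the gesture.
    interface IGestureExperiment
    {
        string Name { get; }
        void Update(SkeletonFrame frame);      // fed from the SkeletonFrameReady handler
        event Action<string> GestureDetected;  // raised with a description of the hit
    }

Switching experiments is then just a matter of changing which implementation the SkeletonFrameReady handler forwards frames to.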
He demonstrated navigating a cursor within a box and clicking that box (my guess at how that mapping works is sketched after the list below). Beyond that he didn't dream much in the meeting, but I did... I'd like to know some wood-chopping stats:
- How much wood did I chop?
- How many chops before it split?
- How long between chops?
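As for the cursor demo, my guess (purely illustrative, not Laszlo's code) is that the core of it is mapping a tracked hand joint into the box's coordinate space. The reach constant below is an assumption, not something from the talk:

    using Microsoft.Research.Kinect.Nui;

    static class CursorMapping
    {
        // Map the right-hand joint into a 0..1 range over an assumed comfortable
        // reach of +/- 0.35 m around the right shoulder; the caller scales to pixels.
        public static void HandToBox(SkeletonData skeleton, out double x01, out double y01)
        {
            Vector hand = skeleton.Joints[JointID.HandRight].Position;
            Vector shoulder = skeleton.Joints[JointID.ShoulderRight].Position;

            const double reach = 0.35; // metres of arm travel mapped to the full box

            x01 = Clamp((hand.X - shoulder.X) / (2 * reach) + 0.5);
            y01 = Clamp((shoulder.Y - hand.Y) / (2 * reach) + 0.5); // screen Y grows downward

            // A "click" could then be a push gesture: hand.Z noticeably closer to
            // the sensor than shoulder.Z for a few consecutive frames.
        }

        static double Clamp(double v)
        {
            return v < 0 ? 0 : (v > 1 ? 1 : v);
        }
    }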
My take-away was that the Kinect, as-is, cannot replace the mouse or voice input, but it makes a lot of sense as a gaming device, and maybe Laszlo will figure out how to make it reliably click a button.