Multi-touch Grasshopper Interaction with Kinect: Open Source

We have finally cleaned up, documented, and commented our code enough to release our depth-as-touch interaction for the Microsoft Kinect to the public as free, open-source, copyleft, GPL-licensed code.  As our demo interaction was built around our favorite software (Grasshopper), we thought it best to announce it here.  The source is written in C/C++ and linked against Microsoft's Kinect for PC SDK.  We've provided the Visual Studio solution files and our code, but you will have to link against Microsoft's SDK yourself if you want to make changes.  We have, however, released builds of the SurfaceCalibration, ExtentsCalibration, and MissionMode executables...so if you have a Kinect, a projector, Grasshopper, and a finger (it helps to have between 2 and 10!), you can test it out right away.
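For readers curious how "depth-as-touch" works in principle, here is a minimal, hypothetical C++ sketch (not the released code): it compares each live depth sample against a calibrated background depth of the projection surface and flags pixels sitting within a thin band just above that surface as touch pixels. Frame acquisition from the Kinect SDK is left out, and all names here are placeholders rather than the project's actual API.

```cpp
// Hypothetical sketch of depth-as-touch detection (not the released LMN code).
// A calibrated "surface" depth map (millimetres per pixel) is compared against
// each live depth frame; pixels hovering within a thin band above the surface
// are treated as touch pixels.
#include <cstdint>
#include <vector>

struct TouchMask {
    int width  = 0;
    int height = 0;
    std::vector<uint8_t> pixels;   // 1 = touch, 0 = background
};

// surface : per-pixel depth of the empty projection surface (mm), from calibration
// live    : current depth frame (mm), same dimensions
// minLift : ignore anything closer to the surface than this (sensor noise), e.g. 5 mm
// maxLift : ignore anything farther above the surface than this, e.g. 20 mm
TouchMask detectTouches(const std::vector<uint16_t>& surface,
                        const std::vector<uint16_t>& live,
                        int width, int height,
                        uint16_t minLift = 5, uint16_t maxLift = 20)
{
    TouchMask mask;
    mask.width  = width;
    mask.height = height;
    mask.pixels.assign(static_cast<size_t>(width) * height, 0);

    for (size_t i = 0; i < mask.pixels.size(); ++i) {
        uint16_t bg = surface[i];
        uint16_t d  = live[i];
        if (bg == 0 || d == 0)     // 0 = no reading from the sensor
            continue;
        if (d >= bg)               // at or behind the surface
            continue;
        uint16_t lift = static_cast<uint16_t>(bg - d);   // height above surface
        if (lift >= minLift && lift <= maxLift)
            mask.pixels[i] = 1;
    }
    return mask;
}
```

The thin band is what lets a fingertip near the surface register as a touch while a hand hovering higher up is ignored; the thresholds and calibration steps in the actual SurfaceCalibration and MissionMode builds may well differ.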

The source and builds can be downloaded here...

http://lmnts.lmnarchitects.com/interaction/grasshopper-canvas-with-...

The code walkthrough is here...

http://lmnts.lmnarchitects.com/interaction/grasshopper-canvas-with-...

Happy Holidays and New Year!


Comment by lmnts on December 21, 2011 at 11:07am

Thanks Andy!

It would be great to see RGBD sensors get cheaper and more modular...that's likely in the nearish term, right?  We can imagine plugging a little RGBD sensor into an Arduino, grabbing the depth buffer, doing the CV and particle analysis, and then passing events to the OS or application.  It might even fit nicely into Firefly that way (?).  The more accessible, the better.  At the moment, we'd be worried about speed...that's the main reason we implemented this as compiled code.
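As a rough illustration of the "CV and particle analysis" step mentioned above, the sketch below (assumed, not taken from the released source) groups touch pixels from a binary mask into blobs with a simple flood fill and reports each blob's centroid, which a host application could then forward as touch events.

```cpp
// Hypothetical blob ("particle") extraction over a binary touch mask:
// flood-fill connected touch pixels and report each blob's centroid.
#include <cstddef>
#include <cstdint>
#include <vector>

struct Blob {
    float cx = 0.0f;   // centroid x in pixels
    float cy = 0.0f;   // centroid y in pixels
    int   area = 0;    // number of pixels in the blob
};

std::vector<Blob> findBlobs(const std::vector<uint8_t>& mask,
                            int width, int height, int minArea = 10)
{
    std::vector<uint8_t> visited(mask.size(), 0);
    std::vector<Blob> blobs;
    std::vector<size_t> stack;

    for (size_t start = 0; start < mask.size(); ++start) {
        if (!mask[start] || visited[start]) continue;

        Blob blob;
        long sumX = 0, sumY = 0;
        stack.clear();
        stack.push_back(start);
        visited[start] = 1;

        // Iterative flood fill over 4-connected neighbours.
        while (!stack.empty()) {
            size_t idx = stack.back();
            stack.pop_back();
            int x = static_cast<int>(idx % width);
            int y = static_cast<int>(idx / width);
            sumX += x; sumY += y; ++blob.area;

            const int dx[4] = { 1, -1, 0, 0 };
            const int dy[4] = { 0, 0, 1, -1 };
            for (int k = 0; k < 4; ++k) {
                int nx = x + dx[k], ny = y + dy[k];
                if (nx < 0 || ny < 0 || nx >= width || ny >= height) continue;
                size_t nidx = static_cast<size_t>(ny) * width + nx;
                if (mask[nidx] && !visited[nidx]) {
                    visited[nidx] = 1;
                    stack.push_back(nidx);
                }
            }
        }

        if (blob.area >= minArea) {   // drop speckle noise
            blob.cx = static_cast<float>(sumX) / blob.area;
            blob.cy = static_cast<float>(sumY) / blob.area;
            blobs.push_back(blob);
        }
    }
    return blobs;
}
```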

Comment by Andy Payne on December 21, 2011 at 10:42am

Very nice work.  Congrats.
