dp.kinect (Max external for Microsoft Kinect)

Dale Phurrough will soon release the dp.kinect external which can be used within the Cycling ’74 Max development environment. Watch http://hidale.com/dp-kinect/ for details.

  • It is based on the Microsoft Kinect SDK v1.5.
  • It supports multiple Kinects on the same PC.
  • It was developed and tested against Max 6.0.5. It is untested against earlier versions.
  • It is almost 100% backwards compatible with my OpenNI-based Kinect external, jit.openni. Migrating to this object should be very easy.

Licensing

dp.kinect is free for evaluation and non-commercial use. For all other uses, including commercial applications, you need to arrange for a license.

I do support creative endeavors and artists. I am an artist myself. As such, I believe that artists should be compensated for their work. No starving artists! When an artist builds on the work of another, I believe that one artist should recognize the other. That recognition can come in many forms (e.g. attribution, compensation, or a beer). The license is simply the means to formalize that recognition in a durable way.

Here are examples that will likely be granted a no-cost or low-cost license:

  • Installation artist creating a work for a gallery installation
  • VJ performing in a local club

Here are examples which will require a paid license:

  • Bundled as part of a software solution sold to multiple customers
  • Used in the touring show of a performer, band, or DJ

jit.openni (Max and Jitter for OpenNI access to your Kinect)

I have written a rich Max Jitter external called jit.openni which lets you use sensors like the Microsoft Kinect and ASUS Xtion PRO in your patchers. It exposes almost all the functionality of these sensors in an easy-to-use native Max Jitter external for Windows computers. It has support for:

  • Configuration of OpenNI via an OpenNI XML configuration file; see the OpenNI documentation for the format (an example is provided in the install, and a sketch appears after this list)
  • ImageMap of RGB24 output in a 4-plane char jitter matrix
  • DepthMap output in a 1-plane long, float32, or float64 jitter matrix
  • IrMap output in a 1-plane long, float32, or float64 jitter matrix
  • UserPixelMap output in a 1-plane long, float32, or float64 jitter matrix
  • User events (e.g. user seen, user lost, etc.)
  • Center of mass for identified users
  • Floor identification
  • Skeleton joints with optional orientations
  • User events, center of mass, and skeleton joints in a native OSC format, a Max route-friendly format, or the OSCeleton legacy format (as of the 2011-07-25 OSCeleton codebase); a message sketch appears after this list
  • Attributes to filter data based on position or orientation confidence, to toggle display of orientation data, and to smooth skeleton data using OpenNI’s smoothing API
  • Camera field of view
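
For orientation, the OpenNI XML configuration file mentioned above declares which production nodes should be created. The following is only an illustrative sketch (the node set, resolutions, and frame rates shown are assumptions); consult the OpenNI documentation and the example file shipped with the install for the authoritative format:

    <!-- Illustrative OpenNI configuration: creates image, depth, and user generators.
         Resolutions and frame rates are assumptions; adjust them for your sensor. -->
    <OpenNI>
      <ProductionNodes>
        <Node type="Image">
          <Configuration>
            <MapOutputMode xRes="640" yRes="480" FPS="30"/>
          </Configuration>
        </Node>
        <Node type="Depth">
          <Configuration>
            <MapOutputMode xRes="640" yRes="480" FPS="30"/>
          </Configuration>
        </Node>
        <!-- Needed for user events, center of mass, and skeleton joints -->
        <Node type="User"/>
      </ProductionNodes>
    </OpenNI>

And as a sketch of the OSCeleton legacy output option, skeleton data arrives as /joint messages carrying the joint name, user id, and coordinates, alongside user events such as /new_user and /lost_user. The values below are made up for illustration; the exact message set depends on the OSCeleton codebase snapshot noted above:

    /new_user 1
    /joint head 1 0.48 0.31 2.05
    /joint r_hand 1 0.62 0.55 1.87
    /lost_user 1
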
The main page for jit.openni is here. Full wiki documentation and installation instructions are available on GitHub, or you can jump straight to the downloads.