Dale Phurrough http://hidale.com Exploring the creative and professional aspects of interactive multimedia art, software engineering, and participatory experiences

Stereoscopic 3D demo w/ tracking http://hidale.com/2013/05/stereoscopic-3d-demo-w-tracking/ Tue, 21 May 2013 00:46:55 +0000

I finished the first version of a stereoscopic 3D rendering technique that I can use in my interactive installations. The technique uses Kinect tracking to adjust the 3D image based on the real-world position of the viewer, which allows for a richer interactive 3D experience.

You can view this specific demo video without any special red/blue glasses. You only need to:

  • View it fullscreen
  • View it in HD
  • Cross your eyes to view a merged image in the middle
  • Adjust your viewing distance to somewhere between 0.75 and 1.5 meters. The optimal distance depends on the physical width of your monitor and the distance between your eyes.
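Head-coupled stereo rendering of this kind is usually built on an asymmetric (off-axis) viewing frustum recomputed each frame from the tracked eye positions. The sketch below illustrates that idea in Python; it is not my actual patch, and the screen dimensions, eye offset, and interpupillary distance in the example are illustrative assumptions.

```python
# Hedged sketch of head-coupled off-axis projection. The screen is assumed
# to lie in the z=0 plane, centered at the origin, with the viewer at z > 0.
# All values are in meters and are illustrative, not from the actual demo.

def off_axis_frustum(eye, screen_w, screen_h, near):
    """Asymmetric frustum bounds (left, right, bottom, top) at the near
    plane for an eye at (x, y, z) relative to the screen center."""
    ex, ey, ez = eye
    scale = near / ez  # project the screen edges onto the near plane
    left   = (-screen_w / 2 - ex) * scale
    right  = ( screen_w / 2 - ex) * scale
    bottom = (-screen_h / 2 - ey) * scale
    top    = ( screen_h / 2 - ey) * scale
    return left, right, bottom, top

def stereo_frustums(head, ipd, screen_w, screen_h, near=0.1):
    """Frustums for the left/right eyes, offset from the tracked head
    position by half the interpupillary distance (ipd) along x."""
    hx, hy, hz = head
    left_eye  = (hx - ipd / 2, hy, hz)
    right_eye = (hx + ipd / 2, hy, hz)
    return (off_axis_frustum(left_eye,  screen_w, screen_h, near),
            off_axis_frustum(right_eye, screen_w, screen_h, near))

# Example: viewer centered, 1 m from a 0.5 m x 0.3 m screen, 63 mm IPD.
left_f, right_f = stereo_frustums((0.0, 0.0, 1.0), 0.063, 0.5, 0.3)
```

In an OpenGL pipeline the two bounds tuples would feed a frustum call (e.g. glFrustum-style parameters) for each eye's render pass; as the Kinect reports a new head position, the frustums shift so the on-screen scene appears fixed in real space.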

Codemotion Berlin 2013 Speaker http://hidale.com/2013/05/codemotion-berlin-2013-speaker/ Wed, 08 May 2013 13:28:54 +0000

I have been selected as a speaker for this year’s Codemotion Berlin 2013. My talk is “Interaction and OpenGL 3D graphics using Microsoft Kinect and Cycling ’74 Max”. A 25% discount is available by buying your ticket with this link.

During this university-style interactive session, you will learn easy methods for connecting to the Microsoft Kinect 3D depth sensor using patching tools like Cycling ’74 Max. After briefly covering the basics, you will see how two streams of data from the Kinect can be easily manipulated in Max and displayed in an interactive OpenGL 3D graphics environment.

You will leave the session with access to the slides and example patches, which you can learn from and reuse. Prior knowledge of the Kinect, Max, or OpenGL graphics is not required. Questions from you, the participants, are encouraged.

Slides and Patches

The slides and patches are available for download. The patches in this ZIP file require the dp.kinect external, which provides the connection between Max and the Microsoft Kinect. On the Mac OS X platform, these patches also work by using the jit.openni external.
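One of the two Kinect data streams covered in the session is the depth map, which arrives as raw millimeter distances in a single-plane matrix. A typical first manipulation before display is scaling those values into a 0..1 range. The actual work is done in Max/Jitter patches; the Python sketch below only illustrates the arithmetic, and the sensor range constants are illustrative assumptions.

```python
# Hedged sketch in Python of a manipulation the Max patches perform on the
# Kinect depth stream: scaling raw millimeter readings into 0.0..1.0 for
# display. The near/far limits below are assumed values, not sensor specs.

NEAR_MM = 500    # assumed usable near limit, in millimeters
FAR_MM  = 4000   # assumed usable far limit, in millimeters

def normalize_depth(depth_mm):
    """Map raw depth readings (mm) to 0.0..1.0. A reading of 0 means
    'no data' and is kept at 0.0 so those pixels render black."""
    span = FAR_MM - NEAR_MM
    out = []
    for d in depth_mm:
        if d <= 0:
            out.append(0.0)  # dropout pixel: no depth measured
        else:
            clamped = min(max(d, NEAR_MM), FAR_MM)
            out.append((clamped - NEAR_MM) / span)
    return out

# Example: one row of readings with a dropout (0) and an out-of-range value.
print(normalize_depth([0, 500, 2250, 4000, 9999]))
# → [0.0, 0.0, 0.5, 1.0, 1.0]
```

In Jitter the same scaling would be applied across the whole matrix at once rather than per-element in a loop.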

jit.openni (Max and Jitter for OpenNI access to your Kinect) http://hidale.com/2011/10/jit-openni-max-and-jitter-for-openni-access-to-your-kinect/ Tue, 18 Oct 2011 23:56:05 +0000

I have written a rich Max Jitter external called jit.openni which allows the use of sensors like the Microsoft Kinect and ASUS Xtion PRO in your patchers. It exposes almost all the functionality of sensors like the Kinect in an easy-to-use native Max Jitter external for Windows computers. It has support for:

  • Configuration of OpenNI by an OpenNI XML configuration file; see OpenNI documentation for format (an example is provided in the install)
  • ImageMap of RGB24 output in a 4-plane char jitter matrix
  • DepthMap output in a 1-plane long, float32, or float64 jitter matrix
  • IrMap output in a 1-plane long, float32, or float64 jitter matrix
  • UserPixelMap output in a 1-plane long, float32, or float64 jitter matrix
  • User events (e.g. user seen, user lost, etc.)
  • Center of mass for identified users
  • Floor identification
  • Skeleton joints with optional orientations
  • User events, center of mass, and skeleton joints in a native OSC format, a Max route-friendly format, or the legacy OSCeleton format (as of the 2011-07-25 OSCeleton codebase)
  • Attributes to filter data based on position or orientation confidence, to show or hide orientation data, and to smooth skeleton data using OpenNI’s smoothing API
  • Camera field of view
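The OSC output formats above are binary OSC messages: a padded address string, a type-tag string, then big-endian arguments. As a rough illustration, the pure-Python sketch below round-trips a skeleton-joint-style message. The `/joint` address and the (joint name, user id, x, y, z) argument layout follow the OSCeleton convention mentioned above, but they are illustrative assumptions here, not jit.openni's exact wire format; see the wiki documentation for that.

```python
import struct

# Hedged sketch: encoding/decoding a binary OSC message shaped like a
# skeleton-joint message. The "/joint" address and argument layout are
# illustrative assumptions, not the external's documented format.

def _read_padded_string(data, pos):
    """Read a null-terminated OSC string padded to a 4-byte boundary."""
    end = data.index(b"\x00", pos)
    s = data[pos:end].decode("ascii")
    pos = (end + 4) & ~3  # skip nulls, land on the next 4-byte boundary
    return s, pos

def parse_osc_message(data):
    """Parse one OSC message; supports string (s), int32 (i), float32 (f)."""
    address, pos = _read_padded_string(data, 0)
    typetags, pos = _read_padded_string(data, pos)
    args = []
    for tag in typetags.lstrip(","):
        if tag == "s":
            s, pos = _read_padded_string(data, pos)
            args.append(s)
        elif tag == "i":
            args.append(struct.unpack(">i", data[pos:pos + 4])[0])
            pos += 4
        elif tag == "f":
            args.append(struct.unpack(">f", data[pos:pos + 4])[0])
            pos += 4
    return address, args

def build_osc_message(address, args):
    """Inverse of parse_osc_message, useful for round-trip testing."""
    def pad(b):
        # OSC strings: at least one null, total length a multiple of 4
        return b + b"\x00" * (4 - len(b) % 4)
    tags, body = ",", b""
    for a in args:
        if isinstance(a, str):
            tags += "s"; body += pad(a.encode("ascii"))
        elif isinstance(a, int):
            tags += "i"; body += struct.pack(">i", a)
        else:
            tags += "f"; body += struct.pack(">f", a)
    return pad(address.encode("ascii")) + pad(tags.encode("ascii")) + body

# Example round trip of a hypothetical joint message.
msg = build_osc_message("/joint", ["head", 1, 0.2, 1.1, 2.4])
addr, args = parse_osc_message(msg)
```

In practice a Max patcher would never parse bytes by hand; objects like udpreceive and route handle this, which is what makes the Max route-friendly format convenient.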
The main page for jit.openni is here. Full wiki documentation and installation instructions are available on GitHub. Or jump straight to the downloads.