The original movement-sensing system, first called EYES, was conceived in the mid-1970s by Assistant Dance Professor Gram DeFreitus as a music program driven by human movement. The goal was to have a dancer create music by moving in front of a video camera. With $2000 from a faculty grant-in-aid program at ASU, DeFreitus and engineer Mark Goldstein built a system that could sense light and dark and send signals to an analog synthesizer. In 1979 the work was continued by Dance Professor David Gregory, funded by the Arizona Commission on the Arts over three consecutive years. In this form the system ran on an Apple IIe with a Dithesizer digitizing board (resolution 26x27 pixels, 32 levels of gray, 20 fps) and could play a set score through the computer's speakers when a person stepped in front of the camera.
In the 1990s Robb ported EYES to a Silicon Graphics workstation using SGI's frame-grabbing technology (Galileo, Video Studio, and built-in video). This version retained the basic movement and presence sensing. During this period Robb added support for multiple cameras to provide three-dimensional information. In one mode, the system uses multiple triggers in two image planes to construct a "two-and-a-half-dimensional" sensor. In another, it extracts objects from two image planes and matches them using assumptions made about the environment; the two matched objects are then combined to calculate a three-dimensional location and size. During this period the system also became sensitive to color information in the environment.
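The original two-camera code is not available, but the core idea of combining an object matched in two image planes into a 3D location is classic stereo triangulation. The sketch below is a minimal illustration, not the system's actual method: it assumes rectified, horizontally offset cameras and illustrative values for the focal length and baseline.

```python
def triangulate(x_left, x_right, y, focal_px, baseline_m):
    """Return an (X, Y, Z) location in meters for an object matched
    in two rectified stereo images.

    x_left, x_right: horizontal pixel coordinates of the same object
    in the left and right images; y: shared vertical pixel coordinate;
    focal_px: focal length in pixels; baseline_m: camera separation in
    meters. All parameter names here are illustrative, not from EYES.
    """
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("matched object must have positive disparity")
    z = focal_px * baseline_m / disparity   # depth from disparity
    x = x_left * z / focal_px               # back-project to world X
    y_world = y * z / focal_px              # back-project to world Y
    return (x, y_world, z)

# Example: an object at pixel 110 in the left image and 100 in the
# right, with a 500-pixel focal length and 0.1 m baseline, sits 5 m away.
location = triangulate(110, 100, 50, 500.0, 0.1)
```

Object size can be recovered the same way: once depth z is known, a width of w pixels in the image corresponds to roughly w * z / focal_px meters in the scene.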
Toward the end of 1999, the system was ported to run within the MAX environment. It is now available through SquishedEyeball.com.