This paper describes a sensing application tailored for use in live performance.  It contains several key components that such applications need in order to be integrated into the live performance context.

 

The primary focus of the program is to extract and provide meaningful information about what is happening in the performance space, based on the sensor used, the context of its use, and the techniques the application exposes to the user.  However, an equally essential component is the integration of the application into the environment in which it operates, which requires the application to be adaptable to many protocols and interfaces.  EyeCon targets Ethernet, MIDI, and OSC as the protocols needed to link to applications such as Director, Max/MSP, Reaktor, and Isadora.
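
As a rough illustration of this kind of linkage (not EyeCon's actual implementation, which is a Windows application), the sketch below uses the python-osc library to send one sensed value over OSC to a patching environment such as Max/MSP or Isadora listening on a UDP port.  The address /eyecon/height and port 8000 are assumptions made here for the example.

    # Illustrative sketch only: sending a sensing result over OSC, the kind of
    # link EyeCon exposes to applications such as Max/MSP or Isadora.
    # Requires the python-osc package; the address and port are assumptions.
    from pythonosc.udp_client import SimpleUDPClient

    client = SimpleUDPClient("127.0.0.1", 8000)   # host/port of the receiving patch

    def send_sensed_height(height_normalized: float) -> None:
        """Send one normalized sensing value (0.0-1.0) to the listening application."""
        client.send_message("/eyecon/height", height_normalized)

    if __name__ == "__main__":
        send_sensed_height(0.42)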

 

Another important concept is that the application does not treat itself as the primary controller but as a social element, able both to receive commands from and to send commands to other applications.
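
A minimal sketch of that "social element" idea is shown below, again using python-osc: the program listens for an incoming scene-change command while also sending its own messages back out, so control flows in both directions.  The addresses, ports, and message names are illustrative assumptions, not EyeCon's actual protocol.

    # Illustrative sketch: the application both receives commands (a scene cue)
    # and sends information out to other applications. python-osc assumed.
    from pythonosc.dispatcher import Dispatcher
    from pythonosc.osc_server import BlockingOSCUDPServer
    from pythonosc.udp_client import SimpleUDPClient

    outgoing = SimpleUDPClient("127.0.0.1", 9000)      # e.g. a Max/MSP patch

    def on_scene_command(address, scene_number):
        # Command received from another application.
        print(f"received {address} -> switch to scene {scene_number}")
        # Report the change back out, so control flows both ways.
        outgoing.send_message("/sensing/scene_active", scene_number)

    dispatcher = Dispatcher()
    dispatcher.map("/sensing/scene", on_scene_command)

    server = BlockingOSCUDPServer(("127.0.0.1", 8001), dispatcher)
    server.serve_forever()                              # blocks; Ctrl-C to stop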

 

Another useful concept is dividing the problem into two parts: sensing and actions.  This separation is critical for structuring interactions in an organized way.
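
One way to picture the sensing/actions split is the sketch below: sensing elements reduce what the sensor reports to values, actions turn values into output, and an interaction is simply a link between the two.  The class names and structure are invented here for illustration and are not EyeCon's internal design.

    # Illustrative sketch of the sensing/actions split.
    from dataclasses import dataclass
    from typing import Callable

    @dataclass
    class Sensor:
        name: str
        measure: Callable[[], float]          # returns a normalized value 0.0-1.0

    @dataclass
    class Action:
        name: str
        perform: Callable[[float], None]      # consumes a sensed value

    @dataclass
    class Link:
        sensor: Sensor
        action: Action

    def run_scene(links: list[Link]) -> None:
        """One update pass: evaluate every sensor and drive its linked action."""
        for link in links:
            link.action.perform(link.sensor.measure())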

 

The ability to visualize the results of sensing, and to see how the sensing is functioning, is critical for debugging interactions.
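
The sketch below suggests what such a debugging view can look like for camera-based sensing: simple frame differencing with the detected motion regions drawn back onto the live image, so the operator can see what the sensing "sees".  It uses OpenCV (opencv-python) and arbitrary threshold values chosen for illustration; it is not EyeCon's algorithm.

    # Illustrative sketch: overlay detected motion regions for debugging.
    import cv2

    cap = cv2.VideoCapture(0)
    ok, previous = cap.read()
    if not ok:
        raise SystemExit("no camera available")
    prev_gray = cv2.cvtColor(previous, cv2.COLOR_BGR2GRAY)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        diff = cv2.absdiff(gray, prev_gray)
        _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        for c in contours:
            if cv2.contourArea(c) > 500:              # ignore small noise
                x, y, w, h = cv2.boundingRect(c)
                cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.imshow("sensing debug view", frame)       # the overlay is the debugging aid
        prev_gray = gray
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

    cap.release()
    cv2.destroyAllWindows()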

 

A scripting feature that allows interactive scenes to be sequenced is critical to integrating sensing into live performance, because it lets the sensing configuration transition from moment to moment and scene to scene.
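
A scene script can be as simple as an ordered list of scenes, each naming the sensing/action links it activates, with a "go" cue advancing to the next one.  The sketch below shows that shape; the scene names and link contents are invented examples, not an actual EyeCon script.

    # Illustrative sketch: sequencing interactive scenes with a "go" cue.
    scenes = [
        {"name": "opening", "links": ["stage_left_zone -> note_on 60"]},
        {"name": "duet",    "links": ["dancer_height -> filter_cutoff",
                                      "center_line_cross -> video_cue 3"]},
        {"name": "finale",  "links": ["overall_activity -> master_volume"]},
    ]

    current = -1

    def go() -> None:
        """Advance to the next scene, triggered by the operator or another application."""
        global current
        current = (current + 1) % len(scenes)
        scene = scenes[current]
        print(f"scene '{scene['name']}': activating {scene['links']}")

    if __name__ == "__main__":
        for _ in scenes:
            go()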

 

A disadvantage of the system is that, unlike ConMan, Max, or Isadora, it has a fixed-capability architecture rather than a freely patchable one.  This kind of sensing is explored in EyesWeb and ARIA.