Eitan Mendelowitz and Jeff Burke, "Kolo and Nebesko: A Distributed Media Control Framework for the Arts." First Intl. Conference on Distributed Frameworks for Multimedia Applications (DFMA ’05), February 6-9, 2005, Besançon, France.


Abstract: This paper describes the architecture of a new control system and associated scripting language currently under development in a collaboration between computer scientists, engineers, and artists. The system is designed to facilitate the creation of real-time relationships between people and media elements in live performance and installation artworks. It draws on the experience of the UCLA HyperMedia Studio in producing media-rich artistic works and suggests an approach also useful for prototyping “interactive” and “smart” spaces for entertainment and education.

Key words: Live performance, interactivity, theater, scripting, control, smart rooms, intelligent environments.




This paper describes a scripting-language approach to creating and controlling live interactive performance.  It introduces the system and describes "Macbett", an example production in which it was used: a theater piece in which lighting and sound were controlled through performer-centered interactions.


The paper notes that digital media types are separated by differing conventions and protocols, and that the limiting factor is the available software rather than the sensors or control technologies.  These software limitations are where the most time is spent, building bridges between technologies; the paper refers to this as "gluing together" the components of the interactive system.


Environments the authors have experimented with for making these connections:  Visual Basic, Max/MSP, C/C++, Macromedia Director. (They do not mention Isadora.)  They propose that the differing internal approaches and configuration interfaces of these environments were a problem for integration.  Their solution is to provide a common control system and scripting language.


Three requirements for the control system:


1.  Flexible - must be able to incorporate new hardware interfaces.

2.  Malleable - must be modifiable at run time, so the system can be changed during rehearsal to meet the requirements of the process.

3.  Understandable - usable by technically minded non-programmers.


Note: They formally recognize the traditional idea of grouping for synchronization.


Implementation high points:


Organization/Creation side: (called organizational objects)

- Networked attribute-value style nodes; each node controls some aspect of the system.
- Path-based naming convention.
- A registry keeps track of the configuration.
- Grouping of objects.
- Synchronization need recognized.
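The organizational side described above can be sketched roughly as follows. This is my own illustration (the class and method names are assumptions, not the paper's API): attribute-value nodes addressed by path-based names, tracked in a registry that also supports prefix-based grouping.

```python
class Node:
    """An attribute-value node controlling one aspect of the system."""
    def __init__(self, path):
        self.path = path          # path-based name, e.g. "/stage/light1"
        self.attributes = {}      # attribute-value pairs

    def set(self, attr, value):
        self.attributes[attr] = value

    def get(self, attr):
        return self.attributes.get(attr)


class Registry:
    """Keeps track of the current configuration of nodes by path."""
    def __init__(self):
        self._nodes = {}

    def register(self, path):
        node = Node(path)
        self._nodes[path] = node
        return node

    def lookup(self, path):
        return self._nodes[path]

    def group(self, prefix):
        """Arbitrary grouping: all nodes whose path falls under a prefix."""
        return [n for p, n in self._nodes.items() if p.startswith(prefix)]


registry = Registry()
light = registry.register("/stage/light1")
light.set("intensity", 0.8)
stage_group = registry.group("/stage")   # grouping for synchronized control
```

Grouping by path prefix is one plausible reading of the paper's "grouping of objects"; the paper itself does not specify the grouping mechanism.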

Control/Glue side: (called control objects)

- Control objects modify nodes: they take sensor data and map it into a relationship with the organizational nodes.
- The problem of conflict resolution for control resources is identified.
- Arbitration objects resolve conflicts.

Sensor/Input side: no representation of uncertainty around sensor data.
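A minimal sketch of the control/glue side, under my own assumptions (the paper does not give this code, and priority-based arbitration is just one plausible conflict-resolution policy): control objects map sensor readings onto a node attribute, and an arbitration object decides among conflicting writes.

```python
class ControlObject:
    """Maps a sensor reading into a proposed value for a node attribute."""
    def __init__(self, name, priority, mapping):
        self.name = name
        self.priority = priority
        self.mapping = mapping    # function: sensor reading -> attribute value

    def propose(self, sensor_value):
        return (self.priority, self.mapping(sensor_value))


class Arbiter:
    """Resolves conflicts when several control objects target one attribute."""
    def resolve(self, proposals):
        # Hypothetical policy: the highest-priority proposal wins.
        return max(proposals, key=lambda p: p[0])[1]


# Two control objects competing for the same light's intensity.
audience_ctrl = ControlObject("audience", priority=1,
                              mapping=lambda size: size / 100.0)
actor_ctrl = ControlObject("actor", priority=2,
                           mapping=lambda dist: 1.0 / max(dist, 0.1))

arbiter = Arbiter()
proposals = [audience_ctrl.propose(80), actor_ctrl.propose(2.0)]
intensity = arbiter.resolve(proposals)   # actor control wins (priority 2)
```

Note that the sensor values pass through untouched: as the notes above observe, nothing in this scheme represents the uncertainty of the sensor data itself.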


Scripting language implemented as a finite state machine.


Scripting language example given:

    (script controlLight
        (state lightAudience ()                              ; start state
            light.intensity = audience.size                  ; adds relationship
            (((hamlet.pos.x - audience.boundingRect.downstage) < 1)
                (state lightActor ()
                    light.intensity =
                        1/(hamlet.pos.x - audience.boundingRect.downstage) ; adds relationship
                ))))
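To make the finite-state-machine reading of the example script concrete, here is a small Python sketch of my own (the state and attribute names mirror the example, but the implementation is an illustration, not the paper's runtime): each state installs a relationship that drives the light's intensity, and a condition triggers the transition between states.

```python
class Script:
    """FSM model of the controlLight script: states install relationships,
    and a condition on the actor's position triggers a state transition."""

    def __init__(self):
        self.state = "lightAudience"   # start state

    def step(self, audience_size, hamlet_x, downstage_x):
        dist = hamlet_x - downstage_x
        if self.state == "lightAudience":
            intensity = audience_size      # light.intensity = audience.size
            if dist < 1:                   # transition condition from the script
                self.state = "lightActor"
        else:                              # lightActor
            intensity = 1 / dist           # brighter as the actor nears downstage
        return intensity


script = Script()
i1 = script.step(audience_size=40, hamlet_x=5.0, downstage_x=0.0)  # stays in start state
i2 = script.step(audience_size=40, hamlet_x=0.5, downstage_x=0.0)  # condition met, transitions
i3 = script.step(audience_size=40, hamlet_x=0.5, downstage_x=0.0)  # now in lightActor
```

Even this toy version exposes the questions raised below: nothing here says how two such scripts would share or hand off control of the same light.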



My Thoughts:


            This paper presents some of the fundamental issues around using interactive media in live performance.  It also addresses issues around how a live authoring and presenting system's requirements differ from other kinds of authoring and presenting systems.  Ultimately, the paper is about the creation of a scripting language that will allow non-programmers to author real-time interactions with media in live performance situations.

            However, Jeff and Eitan do not demonstrate how replacing one programming or scripting language with another will allow non-programmers access to this difficult programming space. But within a framework where more technically capable programmers are involved, this avenue could prove fruitful.

            Distributing objects across machines is a definite positive aspect and so is allowing arbitrary groupings of media of differing types for synchronization.    Another good idea is the categorization of functional components into object types that have specific purposes or functionality.

While the paper describes how control scripts interact with organizational nodes, it does not say how control scripts are combined or how they interact with one another.  Implementation details are thin.  For instance, how does one script take control from another?  How is sequencing of control accomplished?  How is time visualized and represented?  How are sensor data sources identified and entered into the system?

            The use of a finite state machine as the model for a show's design gives the user considerable power.  But that flexibility also introduces complexity that must be visualized and visually manipulated if it is to be usable by non-programmers.

            Another shortfall is the treatment of uncertainty within the system.  Sensor data is inherently noisy and inconsistent, yet the system is designed as if sensors were as reliable as a media controller.