This paper describes a toolkit that helps with authoring augmented reality experiences. The system is based on the Director paradigm but extends it for use with sensors and three-dimensional graphics. The Director model mixes visual layout, direct manipulation of objects, and scripting. This allows two levels of authoring: simple graphics-based ordering of events, and in-depth creation of complex interactions using the scripting mechanisms.
Director is primarily based on the timeline model of multimedia control, with a stage area for previewing and rendering the experience. Sensing input comes from the specific types of technologies prevalent in augmented reality systems. Most of the extraction of meaning from raw sensor data happens in code before the data reaches the authoring system. Integration of multiple channels of sensing is accomplished within the Lingo scripting mechanism of Director.
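DART itself does this integration in Lingo; as a rough illustration of the idea only (script-level handlers fusing pre-processed readings from multiple sensing channels into application events), here is a minimal Python sketch. All names (`SensorBus`, the channel names, the event string) are hypothetical, not DART's API.

```python
from typing import Callable

class SensorBus:
    """Toy event bus: sensing channels publish pre-processed readings,
    and script-level handlers combine them into application events.
    Hypothetical sketch of the idea, not DART's Lingo mechanism."""

    def __init__(self):
        self._handlers: dict[str, list[Callable]] = {}
        self.latest: dict[str, object] = {}  # last reading seen per channel

    def subscribe(self, channel, handler):
        self._handlers.setdefault(channel, []).append(handler)

    def publish(self, channel, reading):
        self.latest[channel] = reading
        for handler in self._handlers.get(channel, []):
            handler(reading, self.latest)

events = []

def on_gaze(reading, latest):
    # Fuse two channels: the gaze channel says the user is looking at
    # the statue AND the location channel places them in the courtyard.
    if reading == "statue" and latest.get("location") == "courtyard":
        events.append("show_statue_overlay")

bus = SensorBus()
bus.subscribe("gaze", on_gaze)
bus.publish("location", "courtyard")
bus.publish("gaze", "statue")  # handler fires: both conditions hold
```

The point of the sketch is that each channel arrives independently, already reduced to meaningful values, and the fusion logic lives in a small script-level handler rather than in the sensing code.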
Useful notes:
On preparing media:
As Landay and his colleagues have shown (in a variety of domains),
the key to supporting creativity and design exploration is to encourage a
gradual transition from informal (i.e., sketched) to formal content as the
design is explored and refined (e.g., [8]). We view 3D content creation as one
step in the shift from initial design ideas and storyboards to working
experiences, and (following Landay) believe it is important to support sketched
content during early design.
On the difficulty of using tracking technologies:
Sensing and reasoning
technologies are expensive (in time and money) to create and deploy. Most interaction in AR is implicit and
dependent on the specific application (e.g., is the user looking at the
statue?).
Real-time debugging:
Working in real-time is
difficult. When many things are
happening in a split second, and when interactions are based on possibly
noisy sensor data, debugging and understanding an experience can be difficult
or impossible.
All time-based scripts in DART are based on the DARTClock, an abstract wrapper around the Director
clock. This wrapper lets the designer control time: pausing
experience time, changing its speed, or jumping backwards or forwards
to a specified point. While live video and sensor data cannot be paused or
stepped through, the rest of the experience can. When recorded data is being
used, everything (video, tracking and actor scripts) can be paused and stepped
through at whatever pace the designer needs.
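The key idea behind such a clock wrapper is to decouple "experience time" from wall-clock time so it can be paused, scaled, or jumped. A minimal Python sketch of that idea follows; the class and method names are hypothetical, and DART's actual Lingo interface is not reproduced here.

```python
import time

class ControllableClock:
    """Sketch of a DARTClock-style wrapper: experience time that can be
    paused, run faster or slower, or jumped to an arbitrary point,
    independent of wall-clock time. Names here are illustrative only."""

    def __init__(self, time_source=time.monotonic):
        self._source = time_source          # injectable for testing/replay
        self._anchor_wall = time_source()   # wall time at last state change
        self._anchor_exp = 0.0              # experience time at that moment
        self._speed = 1.0
        self._paused = False

    def now(self):
        """Current experience time in seconds."""
        if self._paused:
            return self._anchor_exp
        elapsed = self._source() - self._anchor_wall
        return self._anchor_exp + elapsed * self._speed

    def _rebase(self):
        # Fold elapsed time into the anchors before changing state.
        self._anchor_exp = self.now()
        self._anchor_wall = self._source()

    def pause(self):
        self._rebase()
        self._paused = True

    def resume(self):
        self._anchor_wall = self._source()
        self._paused = False

    def set_speed(self, speed):
        """Run experience time faster or slower than wall time."""
        self._rebase()
        self._speed = speed

    def seek(self, t):
        """Jump experience time backwards or forwards to t."""
        self._anchor_wall = self._source()
        self._anchor_exp = t
```

Because the time source is injectable, the same clock can be driven from recorded data during replay, which is what makes stepping through an experience at any pace possible.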
The need for site testing and debugging:
Having to develop in the physical world can be prohibitively difficult. An often-overlooked impediment to
developing AR experiences is the need to be physically present during the
development cycle in the environment being augmented, and to get up and move
around the physical space during testing.
Schell and Shochet
have shown (in the context of mixed-reality
theme park rides)
that design ideas for experiences that integrate physical and virtual worlds
must be tested in the target site as soon and as often as possible [14].