<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Unit: Modular Development of Distributed Interaction Techniques for Highly Interactive User Interfaces</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Alex Olwal</string-name>
          <email>aolwal@cs.columbia.edu</email>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Steven Feiner</string-name>
          <email>feiner@cs.columbia.edu</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Department of Computer Science, Columbia University, New York</institution>
          ,
          <country country="US">USA</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Department of Numerical Analysis and Computer Science, Royal Institute of Technology, Stockholm</institution>
          ,
          <country country="SE">Sweden</country>
        </aff>
      </contrib-group>
      <abstract>
        <p>The Unit framework uses a dataflow programming language to describe interaction techniques for highly interactive environments, such as augmented, mixed, and virtual reality. Unit places interaction techniques in an abstraction layer between the input devices and the application, which allows the application developer to separate application functionality from interaction techniques and behavior.</p>
      </abstract>
      <kwd-group>
        <kwd>interaction techniques</kwd>
        <kwd>dataflow programming</kwd>
        <kwd>visual programming</kwd>
        <kwd>augmented reality</kwd>
        <kwd>mixed reality</kwd>
        <kwd>virtual reality</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title/>
      <p>Unit's modular approach leads to the design of reusable
application-independent interaction control components,
portions of which can be distributed across different
machines. Unit makes it possible at run time to experiment
with interaction technique behavior, as well as to switch
among different input device configurations. We provide
both a visual interface and a programming API for the
specification of the dataflow. To demonstrate how Unit
works and to show the benefits to the interaction design
process, we describe a few interaction techniques
implemented using Unit. We also show how Unit's distribution
mechanism can offload CPU-intensive operations, as well
as avoid costly special-purpose hardware in experimental
setups.</p>
      <p>
        INTRODUCTION
Despite tremendous improvements in computer systems
over the past several decades, designing and developing
interaction techniques is still a difficult task, especially for
highly interactive immersive 3D environments, such as
Augmented Reality (AR), Mixed Reality (MR) and Virtual
Reality (VR). Interaction in immersive environments
involves many different types of user input and many devices
with which that input is provided, such as position and
orientation trackers, voice input, and haptic devices, in
addition to conventional mice, trackballs, touch screens, and
keyboards. While there is an increasing number of well-known
metaphors for immersive interaction, such as the
virtual hand [3], ray pointer [3], and flashlight pointer
[
        <xref ref-type="bibr" rid="ref13">12</xref>
        ], there is still much variation in how these metaphors
are implemented.
      </p>
      <p>
        Interaction techniques involve the mapping of data from
input devices to application semantics. Therefore, we find
it particularly attractive to use a dataflow approach to the
design of interaction techniques, in which data is processed
through a customizable network. We introduce the Unit
framework [
        <xref ref-type="bibr" rid="ref17">16</xref>
        ], which allows users to specify 2D and 3D
interaction techniques as dataflows and to modify them in
running programs. We use our framework to abstract
interaction techniques from applications that use them, as well
as from input devices that control them. This allows users
to flexibly configure and dynamically change interaction
technique behavior, independent of both input devices and
applications. Furthermore, in our framework, a dataflow
can be easily distributed over multiple machines to create
distributed interaction techniques, as well as distributed
applications. As shown in Figure 1, we use a
direct-manipulation, visual-programming representation to
specify the behavior of the dataflow in the Unit User Interface
(Unit UI), which itself is implemented with Unit.
In the remainder of this paper, we first present related work
in Section 2, followed by brief introductions to the Unit
framework and the prototype Unit UI in Sections 3 and 4.
To explain how Unit can be used, we describe some
example interaction techniques that we have developed with it in
Sections 5, 6 and 7: a novel flexible pointer for selecting
objects in 3D environments (Figure 2) [
        <xref ref-type="bibr" rid="ref16">15</xref>
        ], an
experimental setup for analyzing non-verbal features of speech [
        <xref ref-type="bibr" rid="ref18">17</xref>
        ],
and a quickly prototyped rotationally sensitive mouse
(Figure 3), created from a pair of conventional mice. We
describe our implementation in Section 8, and present our
conclusions and future work in Section 9.
      </p>
      <p>
        RELATED WORK
Data flow programming and directed-graphbased visual
programming languages have been used together by a
number of researchers to design 2D UIs and interaction
techniques. Projects that take this approach have included
Smiths InterCONS [
        <xref ref-type="bibr" rid="ref20">19</xref>
        ], Bornings ThingLab [2], and
Maloney and Smiths Morphic user interface framework [
        <xref ref-type="bibr" rid="ref15">14</xref>
        ]
for Self [
        <xref ref-type="bibr" rid="ref22">21</xref>
        ]. A key issue here is the recognition that
interaction techniques essentially map the outputs (and inputs)
of interaction devices to the inputs (and outputs) of
applications; this observation has long been an underlying theme
of work on building formal models of abstract graphical
input devices, which in turn can be composed together in
graphs to create hierarchical input devices [1,4].
Most 3D interaction techniques can be conceptualized this
way, and 3D toolkits that embody these techniques (e.g.,
[
        <xref ref-type="bibr" rid="ref11 ref23 ref24 ref25">10,22,23,24</xref>
        ]) typically use abstract input devices. We
have chosen to abstract the interaction technique
components in a similar fashion to BodyElectric [
        <xref ref-type="bibr" rid="ref12">11</xref>
        ], ICON [5]
and InTml [6] (see Figure 4). As in these systems, our
components are dataflow graphs, which are assembled into
customized interaction techniques. In contrast to InTml,
whose developers emphasize their XML-based
specification language, we have chosen to focus on an interactive
design process in which interaction techniques can be
modified at runtime. We have also been more concerned
with design issues that are typical for highly interactive
distributed environments, such as AR/MR/VR, which
distinguishes our framework from systems targeted for a user
on a single computer, such as ICON. While BodyElectric
also addressed 3D virtual environments, its dataflow
operated on only a single machine (a Macintosh that controlled
one or more SGI workstations). In contrast, Unit's dataflow
graph can be spread across multiple machines.
      </p>
      <p>
        Our approach allows the development of flexible
interaction techniques, portions of which can be distributed, as
well as replaced and remapped at runtime, and we provide
a user interface for visual dataflow programming of these
behaviors, as well as a programming API. While the
direct-manipulation creation of 3D widgets is an appealing
approach [
        <xref ref-type="bibr" rid="ref26">25</xref>
        ], we have chosen a dataflow language for its
clear depiction of the mapping between device outputs and
application inputs. While our work on Unit primarily
applies this approach to the design of experimental 3D
interaction techniques, we also believe it is of significant
relevance to the design of new input devices (e.g.,
[
        <xref ref-type="bibr" rid="ref14 ref19 ref21">7,8,20,13,18,9</xref>
        ]).
      </p>
      <p>THE UNIT FRAMEWORK
The Unit framework uses the concept of units to represent
the nodes in the dataflow. Each unit has any number of
properties and any number of connections to properties in
other units, as shown in Figure 5. When a property is
updated, a special method is called, which by default updates
all connected properties. Customized units typically
override this method with their own data processing and,
when done, redirect to the default method. These
connections can also be made over the network, allowing each
unit in the dataflow to share its properties and
listen to property changes anywhere on the network.</p>
      <p>These simple rules allow the design of flexible and
customizable units that specify the desired behavior through their
combination into Unit Graphs, much like electrical circuits.
Units
As mentioned above, the key components of a unit are its
properties and the ability to maintain and update
connections to properties in other units.</p>
      <p>Properties
Properties are attribute-value pairs, in which the value can
be a pointer to any Java Object.</p>
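The property-propagation rule above can be sketched as a minimal Java class. This is our own sketch under stated assumptions: the class name `SketchUnit`, the `connect` signature, and the property storage are hypothetical; only the idea of named properties whose updates propagate to connected peer properties comes from the text.

```java
import java.util.*;

// Minimal sketch of a Unit-style dataflow node (names are assumptions,
// not the actual Unit API): each unit holds named properties and, on
// update, propagates the new value to connected peer properties.
class SketchUnit {
    private final Map<String, Object> properties = new HashMap<>();
    // connections: local property name -> list of (target unit, target property)
    private final Map<String, List<Object[]>> connections = new HashMap<>();

    public Object get(String name) { return properties.get(name); }

    public void connect(String srcProp, SketchUnit target, String dstProp) {
        connections.computeIfAbsent(srcProp, k -> new ArrayList<>())
                   .add(new Object[]{target, dstProp});
    }

    // Called on every property update; customized units would override
    // this with their own processing, then delegate back here to get
    // the default downstream propagation. (No cycle handling in this sketch.)
    public void changeProperty(String name, Object value) {
        properties.put(name, value);
        for (Object[] c : connections.getOrDefault(name, List.of())) {
            ((SketchUnit) c[0]).changeProperty((String) c[1], value);
        }
    }
}
```

Updating a property on one unit then flows to every connected property, which is the "electrical circuit" behavior the Unit Graph metaphor describes.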
      <p>Connections
Connectivity is peer-to-peer: only the involved units
are aware of their connections and are solely responsible
for administering their relationships. Connections between
two units are typically accomplished by reference, with the
data pointer of the source property initially copied to the
target property. When an update is made, the target unit is
notified that the data has changed. We also provide the
ability to make connections by value, in which each update
replicates the data in the source unit; however, connections
by reference are typically preferred for efficiency. In
addition, connections can be created through a chain of
references to fields and methods in the property value, provided
that the resulting source and target values are of the same
data type. These references are more expensive since they
require dereferencing, evaluation, and replication.
Figure 6 shows a simple example in which two mice are
used to provide six-degree-of-freedom control of an
application.</p>
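The difference between by-reference and by-value connections can be shown in a few lines. This is an illustrative sketch of the semantics, not the actual Unit API: by reference, source and target share one data object; by value, each update replicates the data.

```java
// Illustrative sketch of the two connection semantics described above
// (class and method names are our assumptions, not Unit's API).
class ConnectionDemo {
    // by reference: target receives the same pointer as the source
    static Object byReference(Object src) { return src; }
    // by value: each update replicates the data in the source
    static double[] byValue(double[] src) { return src.clone(); }
}
```

A mutation of the source data is visible through a reference connection but not through a value connection, which is why reference connections are cheaper yet require care with shared state.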
      <p>Distribution
Connections can be transparently distributed over Java
Remote Method Invocation (RMI), with the addition of the
hostname in the specification of the connection. Following
the peer-to-peer approach, each unit is individually
distributed through the RMI registry and directly accessible to
other units.</p>
      <p>This distribution approach allows parts of Unit Graphs to
be distributed over an arbitrary number of applications
running on an arbitrary number of machines. A common
problem in immersive environments is the use of hardware,
such as six-degree-of-freedom trackers, that has a
permanent physical connection to a single machine. Unit not only
allows cross-platform access to platform-specific devices,
but also simplifies the sharing of machine-specific devices.
Our framework allows several Unit Graphs, running on
different machines or within different programs on the same
machine, to communicate as a single graph, providing
transparent access to data and flow control from anywhere.</p>
      <p>Core components
Unit, the core class and the superclass of all units,
provides the general unit functionality, most
importantly the handling of properties, connections, and
distribution. Two units are connected by specifying the source
unit, source property, target unit, and target property, and
an optional host name (for remote connections).</p>
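The five-part connection specification described above might be modeled as follows. This is a hypothetical sketch: the paper names the parameters but not their types or any class, so everything here is an assumption.

```java
// Hypothetical model of a Unit connection specification: source unit,
// source property, target unit, target property, and an optional host
// name for remote (RMI) connections. Names and types are assumptions.
class ConnectionSpec {
    final String srcUnit, srcProp, dstUnit, dstProp;
    final String host;  // null for local connections

    ConnectionSpec(String srcUnit, String srcProp,
                   String dstUnit, String dstProp, String host) {
        this.srcUnit = srcUnit; this.srcProp = srcProp;
        this.dstUnit = dstUnit; this.dstProp = dstProp; this.host = host;
    }

    // Remote connections are distinguished only by the presence of a host.
    boolean isRemote() { return host != null; }
}
```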
      <p>We have also implemented a set of core units that provide
additional functionality. These include units for flow
control, such as switches and multiplexers, units for scalar and
vector operations, and units for I/O (multiple mice,
keyboards, six-degree-of-freedom sensors, speech
recognition/synthesis).</p>
      <p>The units are arranged in a class hierarchy under the Unit
superclass. It is easy to implement new units, which
typically involves overriding the changeProperty method that
is called on every property update. Most core units have a
set of reserved property names that are used for their
specific input and output properties.</p>
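Subclassing along these lines might look like the following minimal sketch. Only the `changeProperty` override pattern and the idea of reserved property names come from the text; `BaseUnit`, `ScaleUnit`, and the `"in"`/`"out"` names are our assumptions.

```java
import java.util.*;

// Sketch of implementing a new unit: override changeProperty to
// process data, then redirect to the default behavior. All class and
// property names here are illustrative assumptions.
class BaseUnit {
    protected final Map<String, Object> props = new HashMap<>();
    public Object get(String n) { return props.get(n); }
    public void changeProperty(String n, Object v) { props.put(n, v); }
}

class ScaleUnit extends BaseUnit {
    private final double factor;
    ScaleUnit(double factor) { this.factor = factor; }

    @Override
    public void changeProperty(String name, Object value) {
        // reserved input property "in": scale it and expose it as "out"
        if (name.equals("in")) {
            super.changeProperty("out", ((Number) value).doubleValue() * factor);
        }
        super.changeProperty(name, value);  // default behavior
    }
}
```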
      <p>THE UNIT USER INTERFACE
We created the Unit UI, shown in Figure 6, to allow users
to design, manipulate, and visualize the dataflow in a Unit
Graph.</p>
      <p>The Unit UI lets the user add, modify and delete units,
properties and connections, as well as load and save Unit
Graphs. Besides visually displaying the dataflow as
directed graphs with units, properties and connections, live
data propagation is visualized by highlighting a property
for a short time after it is updated. The user can also switch
to live views of values of interest.</p>
      <p>
        The Unit UI was implemented with units, using the same
dataflow approach as the interaction techniques it
manipulates, demonstrating that our framework does not restrict
itself to interaction technique specification, but also applies
to traditional application logic. Unit UI is a 3D application,
and can thus coexist in the immersive environment,
side-by-side with the interaction techniques whose behavior it
controls. However, because of the limited field-of-view and
low (800×600) resolution of our head-worn displays, we
find it more productive to interact with the Unit UI in 2.5D
on high-resolution (1920×1200) 24-inch desktop displays.
In the following sections, we describe our experience with
Unit by presenting some of the experimental interaction
techniques that we have developed with it.</p>
      <p>THE FLEXIBLE POINTER
We have implemented a novel interaction technique, called
the Flexible Pointer [
        <xref ref-type="bibr" rid="ref16">15</xref>
        ], which is an extension of existing
ray-casting techniques for selection in immersive
environments. The flexible pointer allows the user to point around
objects, with a curved arrow, for selection of fully or
partially obscured objects, as well as to more clearly point out
objects of interest to other users in a collaborative
environment. The flexible pointer, shown in Figure 8, reduces
ambiguity by avoiding obscuring objects, which would
have been selected with traditional ray-casting techniques.
The flexible pointer also offers a visual advantage in
situations in which it can point out an object without
obstructing the object of interest, while still providing a
continuous line from the user to the target.
      </p>
      <p>The problems that we address with the Unit framework are
how users can control the pointer, and how we can
interactively modify and tweak this mapping, at runtime and
during the design phase.</p>
      <p>Implementation
First, we have to decide on a representation for the
geometry of the flexible pointer. We choose a quadratic Bézier
spline, in which the position, length, and curvature of the pointer
are controlled by three points in space.</p>
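The quadratic Bézier curve through control points p0, p1, p2 is B(t) = (1−t)²·p0 + 2(1−t)t·p1 + t²·p2 for t in [0, 1]; the middle point p1 bends the curve without lying on it. A minimal evaluation sketch (class and method names are illustrative):

```java
// Quadratic Bézier evaluation, the geometric core of the flexible
// pointer's representation: B(t) = (1-t)^2 p0 + 2(1-t)t p1 + t^2 p2.
class Bezier {
    static double[] quadratic(double[] p0, double[] p1, double[] p2, double t) {
        double a = (1 - t) * (1 - t);  // weight of the start point
        double b = 2 * (1 - t) * t;    // weight of the bending control point
        double c = t * t;              // weight of the end point
        double[] out = new double[3];
        for (int i = 0; i < 3; i++) out[i] = a * p0[i] + b * p1[i] + c * p2[i];
        return out;
    }
}
```

At t = 0 and t = 1 the curve passes exactly through p0 and p2, so the pointer always starts at the user and ends at the tip, while moving p1 curves it around obstacles.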
      <p>Secondly, we implement a corresponding, customized unit
that listens to changes in its position, end point, and control
point properties, and updates the geometry accordingly. We
now have a mechanism for listening to, and updating the
values of this unit, both locally and over the network. Any
component in our framework is thus able to listen to
changes or update the geometry of the pointer, by accessing
these properties. For increased precision, our prototype
flexible pointer utilizes a two-handed approach, in which the
hands are tracked with two six-degree-of-freedom trackers,
the distance between the hands maps to the length of the
pointer, and the relative bending of the hands determines
the curvature characteristics of the pointer. We implement
this control behavior as a separate Unit Graph that updates
the properties of the above-mentioned unit that is
controlling the geometry.</p>
      <p>Design Process
One of the hardest tasks in interaction technique design is
the assignment of appropriate values to constants, and as
with most interaction techniques, there are several such
constants for the flexible pointer (e.g., the scale factor for
the mapping of the distance between the user's hands to the
length of the pointer).</p>
      <p>Thanks to Units modularity, separate Unit Graphs can be
used for interactive tweaking and debugging of the running
interactive technique. We constructed a new graph that
takes input from a small handheld presentation mouse with
a thumb-controlled joystick. A button click alternates
among the constants to be modified, and pushing the
joystick up or down increases or decreases the value of the current
constant, as shown in Figure 7. Although we could place
the graph in the same program as the flexible pointer,
avoiding the mix of interaction technique and tweaking
code seemed reasonable, and we found it more
advantageous to run it in a separate program. In fact, the ease of
distribution led us to place it on a separate machine, which
gave us a dedicated environment for developing the
tweaking code, as shown in Figure 9. The behavior of our
interaction technique can be modified in real time as soon
as the graph is connected to the flexible pointer. More
importantly, we can have the flexible pointer running
constantly, while modifying, recompiling, and restarting the
tweaking code. When satisfied with the behavior of the
interaction technique, the tweaking code is removed,
simply by not running it. This example shows how we can use
Unit to abstract the interaction techniques from the input
devices and the application, and also how two interaction
techniques (the flexible pointer and the tweaking code) can
be abstracted from each other.
</p>
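The button-and-joystick tweaking logic described above can be sketched as follows. The class and method names are our assumptions; only the behavior (a button cycles through constants, the joystick nudges the selected one) comes from the text.

```java
// Sketch of the tweaking graph's logic: a button press cycles through
// the tunable constants; pushing the joystick up (+1) or down (-1)
// nudges the currently selected constant by a fixed step.
class Tweaker {
    private final double[] constants;
    private final double step;
    private int selected = 0;

    Tweaker(double[] constants, double step) {
        this.constants = constants;
        this.step = step;
    }

    void buttonClick() { selected = (selected + 1) % constants.length; }

    void joystick(int direction) { constants[selected] += direction * step; }

    double value(int i) { return constants[i]; }
}
```

Because this graph runs as its own program, it can be recompiled and restarted freely while the interaction technique it tunes keeps running.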
      <p>
        DISTRIBUTED SPEECH RECOGNITION,
ANALYSIS, AND LOCALIZATION
We found Unit very useful in a recent experimental setup
for a user interface based on speech analysis and audio
localization [
        <xref ref-type="bibr" rid="ref18">17</xref>
        ]. We intended to explore the use of
non-verbal features of the user's speech for implicit or explicit
program control. Additionally, we planned to use multiple
microphones to approximate the user's head position by
comparing the audio from the different microphones.
Running CPU-intensive speech recognition on multiple
microphones
First, we needed a mechanism for getting input from
multiple microphones, so we considered the following
approaches:
      </p>
      <p>1) Using multiple general-purpose sound cards on one
computer. One would have to be careful not to run into
hardware conflicts, since an ordinary PC is not
designed to have many simultaneously active sound
cards.</p>
      <p>2) Using a special-purpose sound card with multiple
audio inputs. One of these cards would be too expensive
for our low-budget experimental setup.</p>
      <p>3) Using a special-purpose array microphone for audio
localization, where the signal processing is done in
hardware. The few such inexpensive consumer-level
microphones we found did not provide programming
API access to inferred positional data. These
microphones also impose restrictions on the setup, limited by the
characteristics of the microphone, and we found it
neither feasible nor cost-effective to build our own
microphone.</p>
      <p>Second, speech recognition is CPU intensive, and running
several instances of speech recognition software on the
same machine used for the visualization would
significantly affect the frame rate.
Realizing that we had many available machines in our lab,
equipped with standard sound cards, we decided to take
advantage of Units distribution mechanism to offload the
CPU-intensive speech recognition to other machines on the
network. Each of these machines could then support one
microphone, without the need for any special-purpose
hardware or alteration of the hardware configuration.
We designed our Unit Graph so that the speech is
analyzed locally on each speech server, with the recognized
speech and the extracted speech features communicated
over Ethernet to the application server. The Unit dataflow
in the application server fuses the input and adjusts the
behavior of the application accordingly. Our experimental
setup is shown in Figure 10.</p>
      <p>It might sound contradictory that we find it more
costefficient and convenient to use a separate computer, instead
of a special-purpose sound card, to host a microphone.
However, the important point here is that Unit allowed us
to use our currently available general-purpose hardware for
rapid prototyping of an experimental user interface, without
having to deal with the hardware-related issues that would
play a central role in designing a practical product. While
Unit made it possible to easily develop a distributed
dataflow for our purposes, its transparent distribution
mechanism also makes it straightforward to
reconfigure the application to run on a single machine (e.g., with
multiple sound cards or a multi-input sound card).</p>
      <p>COMPOSITE INPUT DEVICES
The Unit framework has made it easy for us to develop
rudimentary prototype input devices, assembled from
arrangements of two or more input devices. Figure 3 shows
one of the simplest examples of a composite input device: a
three-degree-of-freedom mouse created from two
off-the-shelf wireless optical mice that are rigidly attached to
provide an additional degree of freedom (rotational
acceleration in the plane of the surface on which they are used).
Unit provides simple means for specifying the relations
between the two mouse sensors, and thus allows the
behavior of this composite input device to be visually
programmed, completely in software, as shown in Figure 1.
Unit makes it possible to build composite input devices that
consist of hierarchies of different input devices and
interaction techniques, while providing unified application-level
APIs to these devices.
</p>
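One plausible mapping from the two mouse sensors to the extra rotational degree of freedom can be sketched as follows. This is our own sketch under the assumption of a fixed baseline between the rigidly attached sensors; the paper's actual Unit Graph for this device is programmed visually, as shown in Figure 1.

```java
// Sketch of deriving in-plane rotation from two rigidly attached mice
// (our assumptions, not Unit's actual graph): with the sensors mounted
// a fixed baseline apart, opposite per-frame motion along the axis
// perpendicular to the baseline approximates a rotation.
class TwoMouseRotation {
    // dy1, dy2: per-frame y deltas of the two sensors;
    // baseline: distance between the sensors in the same units
    static double rotationRadians(double dy1, double dy2, double baseline) {
        return Math.atan2(dy1 - dy2, baseline);
    }

    // translation of the composite device is the mean of the two deltas
    static double translation(double d1, double d2) {
        return (d1 + d2) / 2.0;
    }
}
```

Equal deltas on both sensors thus yield pure translation, while opposite deltas yield rotation with no net translation, giving the composite device its third degree of freedom entirely in software.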
      <p>IMPLEMENTATION
The Unit framework is implemented with Java and Java3D,
and thus runs across multiple platforms. Unit's current
implementation supports conventional pointing devices (e.g.,
mice, trackballs, touchpads, trackpoints, and touchscreens)
and keyboards, as well as several sixdegree-of-freedom
sensors (Ascension Flock of Birds, InterSense IS600 Mark
2 Plus, and InterSense IS900) and speech recognition and
speech synthesis (through the Java Speech API and IBM
ViaVoice). RMI is used for distribution over TCP/IP. We
have used a heterogeneous machine pool during the
development, with machines ranging from a Celeron 400 MHz,
with 192 MB RAM, running Windows 98, to a Dual Xeon
2.8 GHz, with 1 GB RAM, running Windows XP. The
low-end machines can be used for running Unit Graphs and
input device handling, while the more powerful machines
with 3D acceleration hardware are needed for 3D graphics.</p>
      <p>CONCLUSIONS AND FUTURE WORK
We have presented Unit, a system that uses a visual
dataflow programming language to support designing and
experimenting with 2D and 3D interaction techniques. We are
actively using Unit to develop interaction techniques, and
have demonstrated its utility through a set of examples
created with the system.</p>
      <p>As we have showed, Unit allows the flexible specification
of interaction techniques, while effectively avoiding
problems related to specific hardware setups in experimental
systems through a peer-to-peer distribution mechanism.
Besides abstracting interaction techniques from input
devices and applications, Unit's modularity has also proven
convenient, since it allows debugging components to be
developed in a stand-alone fashion outside the interaction
technique of interest.</p>
      <p>ACKNOWLEDGMENTS
This research was funded in part by Office of Naval
Research Contracts N00014-99-1-0249 and
N00014-99-1-0394, NSF Grants IIS-00-82961 and IIS-01-21239, and gifts
from Intel, Microsoft Research, and Alias|Wavefront.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          <string-name>
            <surname>Anson</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          <article-title>The Device Model of Interaction</article-title>
          .
          <source>Proc. SIGGRAPH 82 (ACM Comp. Graph.</source>
          ,
          <volume>16</volume>
          (
          <issue>3</issue>
          ),
          <year>July 1982</year>
          ), Boston, MA, July
          <volume>26</volume>
          
          <fpage>30</fpage>
          ,
          <year>1982</year>
          ,
          <volume>107</volume>
          
          <fpage>114</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          <string-name>
            <surname>Borning</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          <article-title>The Programming Language Aspects of ThingLab, a Constraint-Oriented Simulation Laboratory</article-title>
          .
          <source>ACM Trans. on Prog. Langs. and Sys</source>
          ,
          <volume>3</volume>
          (
          <issue>4</issue>
          ),
          <year>October 1981</year>
          ,
          <fpage>343</fpage>
          -
          <lpage>387</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          <string-name>
            <surname>Bowman</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Hodges</surname>
            ,
            <given-names>L.F.</given-names>
          </string-name>
          <article-title>An Evaluation of Techniques for Grabbing and Manipulating Remote Objects in Immersive Virtual Environments</article-title>
          .
          <source>Proc. Symp. on Interactive 3D Graph.</source>
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          <string-name>
            <surname>Duce</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>van Liere</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>ten Hagen</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          <article-title>An Approach to Hierarchical Input Devices</article-title>
          .
          <source>Comp. Graph. Forum</source>
          ,
          <volume>9</volume>
          (
          <issue>1</issue>
          ),
          <fpage>15</fpage>
          -
          <lpage>26</lpage>
          .
          <year>1990</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          <string-name>
            <surname>Dragicevic</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          and
          <string-name>
            <surname>Fekete</surname>
            ,
            <given-names>J.D.</given-names>
          </string-name>
          <article-title>Input Device Selection andInteraction Configuration with ICON</article-title>
          .
          <source>Proc. IHM-HCI 2001, Frontiers, Lille, France, Springer Verlag</source>
          ,
          <fpage>543</fpage>
          -
          <lpage>448</lpage>
          .
          <year>2001</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          <string-name>
            <surname>Figueroa</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Green</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Hoover</surname>
            ,
            <given-names>H.J.</given-names>
          </string-name>
          <article-title>InTml: A Description Language for VR Applications</article-title>
          .
          <source>Proc. 3D Web Technology.</source>
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          <string-name>
            <surname>Fitzmaurice</surname>
            ,
            <given-names>G.W.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ishii</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Buxton</surname>
            ,
            <given-names>W.</given-names>
          </string-name>
          <article-title>Bricks: Laying the Foundations for Graspable User Interfaces</article-title>
          .
          <source>Proc. Human Factors in Comp. Sys. (CHI '95)</source>
          .
          <fpage>442</fpage>
          -
          <lpage>449</lpage>
          .
          <year>1995</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          <string-name>
            <surname>Greenberg</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          and
          <string-name>
            <surname>Fitchett</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          <article-title>Phidgets: Easy Development of Physical Interfaces Through Physical Widgets</article-title>
          .
          <source>Proc. ACM Symp. on User Interface Software and Tech. (UIST 01)</source>
          , Orlando, FL,
          <year>2001</year>
          ,
          <fpage>209</fpage>
          -
          <lpage>218</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          <string-name>
            <surname>Hinckley</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          and
          <string-name>
            <surname>Sinclair</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          .
          <article-title>Touch-Sensing Input Devices</article-title>
          .
          <source>Proc. Conf. on Human Factors in Comp. Sys. (CHI 99)</source>
          ,
          <fpage>223</fpage>
          -
          <lpage>230</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          <string-name>
            <surname>Kessler</surname>
            ,
            <given-names>G.D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kooper</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Verlinden</surname>
            ,
            <given-names>J.C.</given-names>
          </string-name>
          and
          <string-name>
            <surname>Hodges</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          ,
          <source>The Simple Virtual Environment Library, Version 2.0, User's Guide</source>
          , http://www.cc.gatech.edu/gvu/virtual/SVE/docV2.0/sve.book_1.html.
          <source>Technical Report</source>
          , Graphics, Visualization, and Usability Center, Georgia Institute of Technology,
          <year>1997</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          <string-name>
            <surname>Lanier</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Grimaud</surname>
            ,
            <given-names>J.-J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Harvill</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Lasko-Harvill</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Blanchard</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Oberman</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          and
          <string-name>
            <surname>Teitel</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          .
          <article-title>Method and system for generating objects for a multi-person virtual world using data flow networks</article-title>
          .
          <source>United States Patent 5588139</source>
          .
          <year>1993</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          <string-name>
            <surname>Liang</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Green</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          <article-title>JDCAD: A Highly Interactive 3D Modeling System</article-title>
          .
          <source>Comp. and Graph</source>
          .,
          <volume>18</volume>
          (
          <issue>4</issue>
          ),
          <fpage>499</fpage>
          -
          <lpage>506</lpage>
          .
          <year>1994</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          <string-name>
            <surname>MacKenzie</surname>
            ,
            <given-names>I. S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Soukoreff</surname>
            ,
            <given-names>R. W.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Pal</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          <article-title>A Two-ball Mouse Affords Three Degrees of Freedom</article-title>
          .
          <source>Extended Abstracts of Human Factors in Comp. Sys. (CHI 97)</source>
          ,
          <fpage>303</fpage>
          -
          <lpage>304</lpage>
          .
          <year>1997</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          <string-name>
            <surname>Maloney</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          and
          <string-name>
            <surname>Smith</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          .
          <article-title>Directness and Liveness in the Morphic User Interface Construction Environment</article-title>
          .
          <source>Proc. ACM Symp. on User Interface Software and Tech. (UIST 95)</source>
          , Pittsburgh, PA,
          <year>1995</year>
          ,
          <fpage>21</fpage>
          -
          <lpage>28</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          <string-name>
            <surname>Olwal</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          and
          <string-name>
            <surname>Feiner</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          <article-title>The Flexible PointerAn Interaction Technique for Selection in Augmented and Virtual Reality</article-title>
          . To appear
          <source>in Extended Abstracts of ACM Symp. on User Interface Software and Tech. (UIST 03)</source>
          , Vancouver, BC.
          <year>2003</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          <string-name>
            <surname>Olwal</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          <article-title>UnitA Modular Framework for Interaction Technique Design, Development and Implementation</article-title>
          .
          <source>MS Thesis</source>
          , Dept. of Num. Anal. and Comp. Sci.,
          <source>Royal Inst. of Tech</source>
          ., Stockholm, Sweden.
          <year>2002</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          <string-name>
            <surname>Olwal</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          and
          <string-name>
            <surname>Feiner</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          .
          <article-title>Using Prosodic Features of Speech and Audio Localization in Graphical User Interfaces</article-title>
          .
          <source>Technical Report CUCS-016-03</source>
          , Department of Computer Science, Columbia University, New York, NY, June 26,
          <year>2003</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>
          <string-name>
            <surname>Resnick</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          <article-title>Behavior Construction Kits</article-title>
          .
          <source>Communications of the ACM</source>
          ,
          <volume>36</volume>
          (
          <issue>7</issue>
          ),
          <fpage>64</fpage>
          -
          <lpage>71</lpage>
          .
          <year>July 1993</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref20">
        <mixed-citation>
          <string-name>
            <surname>Smith</surname>
            ,
            <given-names>D.N.</given-names>
          </string-name>
          <article-title>Building Interfaces Interactively</article-title>
          .
          <source>Proc. ACM SIGGRAPH Symp. on User Interface Software</source>
          , Banff, Alberta, October 17-19,
          <year>1988</year>
          ,
          <fpage>144</fpage>
          -
          <lpage>151</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref21">
        <mixed-citation>
          <string-name>
            <surname>Suzuki</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kato</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          <article-title>AlgoBlock: A Tangible Programming Language, A Tool for Collaborative Learning</article-title>
          .
          <source>Proc. 4th European Logo Conf., August</source>
          <year>1993</year>
          ,
          <fpage>297</fpage>
          -
          <lpage>303</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref22">
        <mixed-citation>
          <string-name>
            <surname>Ungar</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          and
          <string-name>
            <surname>Smith</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          <article-title>Self: The Power of Simplicity</article-title>
          .
          <source>Proc. OOPSLA 87</source>
          , Orlando, FL,
          <year>October 1987</year>
          ,
          <fpage>227</fpage>
          -
          <lpage>241</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref23">
        <mixed-citation>
          Virtual Reality Consulting (VRCO), Inc.,
          <source>CaveLib</source>
          . Chicago, IL. http://www.vrco.com/products/cavelib/cavelib.html
        </mixed-citation>
      </ref>
      <ref id="ref24">
        <mixed-citation>
          <source>Virtual Reality Peripheral Network (VRPN)</source>
          , UNC, Chapel Hill, http://www.cs.unc.edu/Research/vrpn.
        </mixed-citation>
      </ref>
      <ref id="ref25">
        <mixed-citation>
          <source>VR Juggler: Open Source Virtual Reality Tools</source>
          , Virtual Reality Applications Center, Iowa State University, http://www.vrjuggler.org/.
        </mixed-citation>
      </ref>
      <ref id="ref26">
        <mixed-citation>
          <string-name>
            <surname>Zeleznik</surname>
            ,
            <given-names>R. C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Herndon</surname>
            ,
            <given-names>K. P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Robbins</surname>
            ,
            <given-names>D. C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Huang</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Meyer</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Parker</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Hughes</surname>
            ,
            <given-names>J.F.</given-names>
          </string-name>
          <article-title>An Interactive 3D Toolkit for Constructing 3D Widgets</article-title>
          .
          <source>Proc. SIGGRAPH '93. 81 84</source>
          .
          <year>1993</year>
          .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>