<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>On Applying Sonification Methods to Convey Business Process Data</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Tobias Hildebrandt</string-name>
          <email>tobias.hildebrandt@univie.ac.at</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Simone Kriglstein</string-name>
          <email>SKriglstein@sba-research.at</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Stefanie Rinderle-Ma</string-name>
          <email>stefanie.rinderle-ma@univie.ac.at</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>SBA Research</institution>
          ,
          <addr-line>Vienna</addr-line>
          ,
          <country country="AT">Austria</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>University of Vienna, Austria, Faculty of Computer Science</institution>
        </aff>
      </contrib-group>
      <abstract>
        <p>The visualization of business process models with large numbers of process activities and running instances in different execution states shows various limitations. Hence, using sonification methods becomes a promising idea to enable users to gain insight into process information that cannot be conveyed by purely visual means. This visionary short paper aims to envision the usage of sonification methods to represent business process-related data in all phases of the process life cycle. Sonification methods are presented and analyzed in terms of their potential suitability for representations of process data. Overall, this paper aims at breaking new ground for designing and applying multimodal approaches for making process information more accessible to users.</p>
      </abstract>
      <kwd-group>
        <kwd>Business Process Management</kwd>
        <kwd>Sonification</kwd>
        <kwd>Process Representation</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>Introduction</title>
      <p>During the individual life cycle phases of business processes (design &amp; simulation,
operation, and analysis), different kinds of process model- and process
instance-related data accumulate. For all phases, especially the design &amp; simulation
phase, graphical user interfaces and visualization methods are widespread.
However, visualization techniques can reach their limits. As an example, process
models in the design and simulation phase can have a huge number of events
and activities, which can make it difficult to visually identify deadlocks or
weaknesses in process models. In the operation and evaluation phases, processes can
have thousands of simultaneously running process instances, which can make
it hard to find deviations from regular process execution paths or to monitor the
health of processes and process instances using only visual means.</p>
      <p>
        The use of data sonification as an enhancement to process visualizations
might be able to tackle some of these challenges. Sonification can be defined
as the "presentation of data using sound" [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ]. This presentation is usually
intended to help the listener or user gain new insights into the presented data.
Although many reasons suggest applying sonification to represent business
process-related data, only very few approaches have addressed this issue so far
(e.g., [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ]). Gregory Kramer et al. [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ] found that auditory perception is
especially sensitive to temporal change. Furthermore, a sonification, in contrast to
a static visualization, can only exist in time. As process instances by definition
can only exist in time as well, sonification naturally lends itself to this area. Georg
Spehr [
        <xref ref-type="bibr" rid="ref16">16</xref>
        ] further concludes that sonification is more suitable than visualization
for complex, irregular, or even chaotic data. This promises advances when
trying to convey process exceptions and changes to users. Moreover, sonifications
can very well be recognized, remembered, and recalled later on [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ]. Studies, such
as the one conducted by Salvador et al. [
        <xref ref-type="bibr" rid="ref14">14</xref>
        ], point out that sonification can,
under certain conditions, yield better results than visualization in terms of the
accuracy and efficiency of data exploration, while interfaces that combine both
modalities yield significantly better results than either modality alone.
Besides its use to enhance visual means, sonification is also applied in
situations where visual focus and attention are needed elsewhere (e.g., in
cockpits or operating rooms) or to support blind or visually impaired people.
      </p>
      <p>This visionary short paper provides an analysis of sonification methods with
respect to their suitability in the area of business processes. Existing basic
sonification methods are discussed, and their potential to convey information in
consideration of the business process life cycle is analyzed. This is a first step towards a
multi-modal approach for making information related to business processes
more accessible to users.</p>
      <p>The paper is structured as follows. Section 2 presents basic sonification
methods. Possible solutions regarding how sonification methods can be used to make
process-related data more accessible to users are discussed in Section 3. Finally,
Section 4 concludes the paper and gives an outlook on future work.</p>
    </sec>
    <sec id="sec-2">
      <title>Basic Sonification Approaches</title>
      <p>
        Nowadays, there is a growing amount of research on the application areas,
methods, and perception of sonification. Regarding applications,
there is, among others, research in the fields of astronomy, volcano
activity, glaciers, RNA structures, brain activity, and weather data. Other
examples are sonifications of software code and sonically enhanced data mining.
Furthermore, there are a few examples in the social sciences: sonifications
of population developments and election outcomes, sport sciences, and economics
(e.g., the sonification of stock market data [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ]). McKinney et al. [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ] describe a
system that aurally and visually presents peer-to-peer networking traffic. Several
research efforts (e.g., [
        <xref ref-type="bibr" rid="ref1 ref13 ref6">13,1,6</xref>
        ]) investigate methods for interactive
sonification. Other research aims at multi-modal sonification (systems that
combine sonification with graphical user interfaces and visualization techniques)
in different application areas (e.g., [
        <xref ref-type="bibr" rid="ref7 ref9">9,7</xref>
        ]).
      </p>
      <p>
        After this brief introduction to the field of sonification, the
four probably most widely used and researched sonification methods (audification,
auditory icons, earcons, and parameter mapping) are introduced in the following.
Audification. An audification is the direct conversion of data points into sound.
It interprets data sequences as an audio waveform by mapping the data to sound
pressure levels. Consequently, a very high number of data points is needed to
produce audible results, which limits the field of possible modes of operation
[
        <xref ref-type="bibr" rid="ref5">5</xref>
        ].
Auditory Icons. Auditory icons are everyday sounds that directly represent
the events that are being sonified [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ]. One example is the sound of a paper basket
being emptied, which is played back upon emptying the metaphorical
paper basket in the Windows operating system. Pure auditory icons convey only
the information that certain events have occurred, not other quantitative data
that might be connected to those events.
      </p>
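      <p>To illustrate the direct data-to-waveform mapping of audification, the following sketch scales a numeric series into 16-bit samples and writes them as a WAV file. The function name, sample rate, and normalization are illustrative assumptions, not taken from the cited literature.</p>

```python
import wave

def audify(values, path="audification.wav", rate=8000):
    """Audification sketch: interpret each data point directly as one
    sample of the audio waveform (i.e., a sound pressure level)."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero for constant data
    frames = bytearray()
    for v in values:
        # normalize the data point into the signed 16-bit sample range
        sample = int(((v - lo) / span * 2.0 - 1.0) * 32767)
        frames += sample.to_bytes(2, "little", signed=True)
    with wave.open(path, "wb") as w:
        w.setnchannels(1)   # mono
        w.setsampwidth(2)   # 16-bit samples
        w.setframerate(rate)
        w.writeframes(bytes(frames))
    return len(values) / rate  # resulting duration in seconds
```

      <p>At 8000 samples per second, even 24000 data points yield only three seconds of audio, which illustrates why audification requires very large data series.</p>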
      <p>
        Earcons. Earcons are non-verbal audio messages consisting of motives, which
are short rhythmic sequences of pitched tones with variable timbre, pitch, and
amplitude. Timbre describes the basic properties of sounds and is the subjective
characteristic that enables the differentiation of two sounds even though they
might have the same loudness and pitch [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ].
      </p>
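      <p>A minimal sketch of how an earcon motive could be rendered follows; all names and parameters are illustrative assumptions, and a plain sine wave stands in for a real timbre.</p>

```python
import math

def earcon_motive(notes, rate=8000):
    """Render a short rhythmic motive as a list of samples. Each note
    is a tuple (frequency_hz, duration_s, amplitude), so the pitch,
    rhythm, and amplitude of the motive can all be varied."""
    samples = []
    for freq, dur, amp in notes:
        for i in range(int(dur * rate)):
            # a plain sine timbre; varying the waveform would vary timbre
            samples.append(amp * math.sin(2.0 * math.pi * freq * i / rate))
    return samples

# e.g., a rising three-note motive
motive = earcon_motive([(440, 0.1, 0.8), (550, 0.1, 0.8), (660, 0.2, 1.0)])
```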
      <p>The concept of earcons is similar to that of auditory icons, with the difference
that auditory icons are everyday sounds that directly represent the event that
is being sonified, whereas earcons can be abstract symbols that need not resemble
the real-world sound of the represented event or object.</p>
      <p>
        Parameter Mapping. Parameter mapping is the mapping (either directly or
by scaling) of data values to specific attributes of sound. These attributes are
typically volume, pitch, panning (the position of a sound in the stereo field),
or timbre. Other possibilities are to map data to repetitions and
pauses between distinct sound events in loops, or to filter
out specific frequency ranges according to the data. Due to these characteristics,
parameter mapping is often described as the sonic counterpart of a scatter plot
[
        <xref ref-type="bibr" rid="ref5">5</xref>
        ].
Parameterized Auditory Icons and Earcons. Parameterized auditory icons
and earcons are mixtures of parameter mapping and auditory icons/earcons:
they combine the simple event-occurrence method of auditory icons/earcons with
parameter mapping. Sounds convey the occurrence of events, but at the
same time quantitative data can be mapped to these sounds, analogously
to the way parameter mapping maps data to sound attributes.
Parameterized earcons usually provide more extensive means to map data to
sound attributes than parameterized auditory icons [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ].
      </p>
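      <p>The linear scaling that parameter mapping performs can be sketched as follows; the attribute ranges are illustrative assumptions, and each data value is mapped to a pitch and a volume.</p>

```python
def map_parameters(values, pitch_range=(220.0, 880.0), vol_range=(0.2, 1.0)):
    """Parameter mapping sketch: scale each data value linearly onto
    sound attributes, here pitch (Hz) and volume (0..1)."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0
    mapped = []
    for v in values:
        t = (v - lo) / span  # position of the value within its range
        pitch = pitch_range[0] + t * (pitch_range[1] - pitch_range[0])
        vol = vol_range[0] + t * (vol_range[1] - vol_range[0])
        mapped.append((pitch, vol))
    return mapped
```

      <p>Just as a scatter plot assigns each data point a position on two axes, each value here is assigned a position in two sound attributes.</p>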
    </sec>
    <sec id="sec-3">
      <title>Applying Sonification Methods to Business Processes</title>
      <p>
        Even though a substantial amount of sonification research has accumulated, so
far there seems to be no research concerning the application of sonification in the
different life cycle phases of business processes. One of the few examples of the
use of sonification for processes in business environments is the Grooving
Factory [
        <xref ref-type="bibr" rid="ref17">17</xref>
        ], which explored the sonification of production process-related data in
order to find bottlenecks and improve logistics. An evaluation showed that the
developed prototypes fulfilled these requirements. Gaver et al. [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ] explore with
their "ARKOLA simulation" the production processes of a bottling plant in a
multi-modal representation that combines visual and auditory means. It sonifies
events during the production process (such as spills of liquid) using real-world
recordings of such events. They concluded that the auditory feedback helped
in diagnosing problems in the production process. There are a few publications
that explored the sonification of different process data (e.g., [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ]). Besides the
mentioned "Grooving Factory" and "ARKOLA simulation" projects, there
seem, however, to be no publications that deal with sonifications of process data in
a business environment. This suggests that there is a substantial
amount of untapped research potential in this area.
      </p>
      <p>This section discusses possible solutions in terms of how sonification can
be used to make process-related data more accessible to users and to reduce the
potential limitations of process visualizations. To this end, the respective process
life cycle phases are analyzed in terms of which sonification techniques might be
best suited to support the tasks users typically have to perform in those phases.
Sonification in Process Design. Audification relies on a huge amount of
quantitative data, which makes this technique seem unsuitable in cases where
little or no quantitative data needs to be sonified, but merely events.
Auditory icons, on the other hand, seem very suitable for sonifications in the
process design phase: for process instances that are created during the
simulation of process models, the sonic counterparts of the involved activities and events
could be played back upon their occurrence. Depending on the industry and the
type of processes, there is often a variety of self-explanatory sounds that can be
used to sonify the respective events and activities. In such
sonifications, the relevant events and activities could be sonified using sounds
that naturally represent them as accurately as possible. As
an example, the sound of a shopkeeper's bell could signify the reception of a new
order. Analogously, the process event "customer has paid his invoice" could be
conveyed by playing the sound of a cash register being opened, while the activity
"delivery" could be sonified with motor sounds. Fig. 1 shows a schematic
overview of how such a sonification of a process instance could be realized. The
x-axis is the time axis, whereas each row on the y-axis contains a sound file that
represents one activity or event. These sound files are played back sequentially
from left to right.</p>
      <p>In this example, the sounds that convey events have been assigned a fixed
length of 1.5 seconds (as the time axis in the lower part of the figure shows). The
lengths of the audio signals that convey activities, on the other hand, represent
the actual durations of the represented activities. Thus, a motor sound with a
duration of three seconds could, depending on the scaling, signify
that the transport took three days. Analogously, silence between activities and
events could signify a waiting period. If, for example, there is a gap of two seconds
between the playback of the sounds that represent the activities "production"
and "packaging", one can conclude that there has been a waiting period of two
days between the production and the packaging of the goods. This could be
a hint that there are inefficiencies in the process.</p>
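      <p>The time scaling described above can be sketched as a simple schedule computation. The entry format, the names, and the scale of one playback second per day are illustrative assumptions.</p>

```python
def schedule_playback(log, seconds_per_day=1.0, event_len=1.5):
    """Map an ordered instance log onto a playback timeline. Each log
    entry is (name, kind, start_day, duration_days), where kind is
    "event" or "activity". Events get a fixed playback length, while
    activity lengths scale with their real duration; gaps between
    entries become audible silence."""
    schedule = []
    for name, kind, start, dur in log:
        length = event_len if kind == "event" else dur * seconds_per_day
        schedule.append((name, start * seconds_per_day, length))
    return schedule
```

      <p>For instance, a "production" activity ending on day two and a "packaging" activity starting on day four leave two seconds of silence in the playback, hinting at the two-day waiting period.</p>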
      <p>Audio files of three different instance sonifications of this example process
are available online1. The audio file "Example one - average process instance"2
is a sonification of an average process instance. The audio file "Example
two - no payment"3 is a sonification of a process instance in which no incoming
payment has been registered. The audio file "Example three - production and
transport delayed"4 is a sonification of a process instance in which the
activities production and transport have been delayed (which can be recognized by
the pauses before the respective audio signals). This simple example tries to
show that auditory icons are able to point out deviations in process instances.
1 http://soundcloud.com/tobias_hildebrandt/
2 direct link: http://soundcloud.com/tobias_hildebrandt/business-process-sonification
3 direct link: http://soundcloud.com/tobias_hildebrandt/business-process
4 direct link: http://soundcloud.com/tobias_hildebrandt/business-process-1
However, due to its simplicity, this example cannot serve as a comprehensive
demonstration of the strengths of sonification (such as its ability to convey complex
or irregular data). Further prototypes that combine visual and auditory means
and are based on more complex process models will help in evaluating whether the inherent
features of auditory perception (such as the sensitivity to rhythm and the ability
to recognize even small changes in sounds over time) can help to convey process
instance-related information better than purely visual means.</p>
      <p>Earcons are suitable for process data sonifications in a similar fashion, but
are more flexible. For some process events it could prove difficult to find
real-world sonic analogies. For example, it could be a challenge to find sounds that are sonic
analogies to the states "customer is already registered" and "new customer".
This differentiation would therefore be hard to convey using auditory icons, so
the use of earcons might solve that problem (even though studies suggest that
earcons are harder to recognize than auditory icons). By using parameterized
auditory icons or earcons, not only can the information be conveyed that a
certain event has occurred, but also one or several quantitative data attributes
that are connected to that event. For example, one could imagine an auditory
icon that conveys the occurrence of an event "incoming payment", while the sum
of the payment is mapped to the pitch of that auditory icon.</p>
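      <p>Such a mapping of a payment sum to the pitch of a parameterized auditory icon could look as follows. This is a hedged sketch: the base pitch, the reference amount, and the octave-per-doubling rule are our own illustrative choices.</p>

```python
import math

def payment_icon_pitch(amount, base_pitch=440.0, ref_amount=1000.0):
    """Pitch for a parameterized "incoming payment" icon: the icon
    still marks the event itself, while its pitch rises by one octave
    for every doubling of the payment sum."""
    semitones = 12.0 * math.log2(max(amount, 1.0) / ref_amount)
    return base_pitch * 2.0 ** (semitones / 12.0)
```

      <p>A payment of the reference amount plays the icon at its base pitch; a payment twice as large plays it an octave higher.</p>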
      <p>Parameter mapping might not be the most obvious choice for the process
design phase: parameter mapping relies on quantitative data that varies over
time, rather than on information about events and their sequences of occurrence,
as is typically the case for process instance-related data.</p>
      <p>Sonification in Process Operation and Analysis. Sonifications that aim
to assist users during the process operation phase and the process analysis phase
probably have to fulfill similar requirements. In both, potential users might want
to obtain aggregated information about processes and associated instances and
analyze conspicuous phenomena in detail.</p>
      <p>Audification does not seem very suitable for the sonification of data
related to the process operation and analysis phases, as it is inflexible in
terms of sound design and of the structure and format of the input data (high amounts
of quantitative data that lie within a specific range).</p>
      <p>While monitoring and analyzing individual process instances, auditory icons
should make it possible to recognize deviations of process
instances from the process model by the fact that the respective sounds are
played in a different order or a different rhythm. The same is true for (parameterized)
earcons, analogously to what has been said for the process design phase.</p>
      <p>Parameter-mapping sonifications might be especially useful during the
process operation and process analysis phases. During these phases,
quantitative data usually accumulates that might be mapped to one or several sound
streams. These sound streams might then, for example in the process operation
phase, be played back continuously, which should make it feasible for the user to
recognize patterns and changes as well as to get an overview of the general
health of individual processes or a complete system. During process operation,
an advantage over purely visual means would be that users would
not need to focus their visual attention on specific displays. Thus, they would
be able to work on other things while at the same time being informed
about background activities. Of course, such a sonification would have to be
designed in a non-disruptive way.</p>
      <p>In the process analysis phase one could imagine a sonic summary of a certain
time period (for example a shortened sonification of the last 24 hours). With such
a sonification it should, after a learning phase, be possible to detect deviations
or critical situations during the execution of process instances.</p>
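      <p>Such a sonic summary amounts to compressing the timestamps of a period into a short playback window while preserving relative timing. The period and summary lengths below are illustrative assumptions.</p>

```python
def compress_timestamps(events, period_s=86400.0, summary_s=60.0):
    """Sonic summary sketch: rescale event timestamps from a long
    period (default: 24 hours) into a short summary (default: one
    minute), keeping their relative temporal pattern intact."""
    factor = summary_s / period_s
    return [(name, t * factor) for name, t in events]
```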
    </sec>
    <sec id="sec-4">
      <title>Conclusion</title>
      <p>Sonification is gaining more and more importance in various disciplines. Due to
the existing limitations of visualizations (e.g., keeping track of high numbers of
running process instances in different execution states), the authors of this paper
propose to introduce sonification as an enhancement to visualization methods for conveying
business process information. To achieve this goal it is first necessary to
understand how sonification methods can be used to represent business process-related
data. The motivation of this paper was to give a first overview of different
sonification methods. Possible directions and solutions concerning how sonification
can be used to make process-related data more accessible to users were discussed.</p>
      <p>Of the presented sonification methods, parameterized earcons and
parameterized auditory icons seem best suited for sonifications during the process
design phase, while monitoring in the operation phase and analysis in the
evaluation phase seem well suited for parameter-mapping sonification.</p>
      <p>A multi-modal combination of visualization and sonification should consider
the individual strengths and weaknesses of both methods. In general, it should
use the ability of visualization to convey exact information and that of sonification
to convey changes in temporal developments.</p>
      <p>In future work, we will develop sonification methods and combine them with
visualization methods, particularly for scenarios where visualization techniques
reach their limits, into an integrated multi-modal
approach for each phase of the business process life cycle. Further, we plan to
conduct user studies to evaluate the design of the multi-modal approach; the
findings of these evaluations will influence the further design process and improve
our approach iteratively.</p>
    </sec>
    <sec id="sec-5">
      <title>Acknowledgments</title>
      <p>The research was partly funded by COMET K1, FFG - Austrian Research
Promotion Agency.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          1.
          <string-name>
            <surname>Beilharz</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ferguson</surname>
            ,
            <given-names>S.:</given-names>
          </string-name>
          <article-title>An interface and framework design for interactive aesthetic sonification</article-title>
          . In: Jensen,
          <string-name>
            <given-names>M.A.</given-names>
            ,
            <surname>Kronland-Martinet</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            ,
            <surname>Ystad</surname>
          </string-name>
          ,
          <string-name>
            <surname>S.</surname>
          </string-name>
          ,
          <source>Kristoffer (eds.) Proc. of the 15th International Conference on Auditory Display (ICAD2009)</source>
          .
          <source>Re:New Digital Arts Forum</source>
          , Copenhagen, Denmark (
          <year>2009</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          2.
          <string-name>
            <surname>Ciardi</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          :
          <article-title>sMax: A multimodal toolkit for stock market data sonification</article-title>
          .
          <source>In: Proc. of the 10th International Conference on Auditory Display</source>
          . ICAD, International Community for Auditory Display (
          <year>2004</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          3.
          <string-name>
            <surname>Gaver</surname>
          </string-name>
          , W.W.:
          <article-title>Using and creating auditory icons</article-title>
          . SFI studies in the sciences of complexity, Addison Wesley Longman (
          <year>1992</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          4.
          <string-name>
            <surname>Gaver</surname>
            ,
            <given-names>W.W.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Smith</surname>
            ,
            <given-names>R.B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>O'Shea</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          :
          <article-title>Effective sounds in complex systems: the ARKOLA simulation</article-title>
          .
          <source>In: Proc. of the SIGCHI Conference on Human Factors in Computing Systems: Reaching through technology (CHI'91)</source>
          . pp.
          <fpage>85</fpage>
          -
          <lpage>90</lpage>
          . ACM
          (
          <year>1991</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          5. Hermann, T.:
          <article-title>Sonification for Exploratory Data Analysis (PhD Thesis)</article-title>
          .
          <source>Universitat Bielefeld</source>
          (
          <year>2002</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          6. Hermann,
          <string-name>
            <surname>T.</surname>
          </string-name>
          :
          <article-title>An introduction to interactive sonification</article-title>
          .
          <source>IEEE MultiMedia 12(2)</source>
          ,
          <fpage>20</fpage>
          -
          <lpage>24</lpage>
          (
          <year>2005</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          7. Hermann,
          <string-name>
            <given-names>T.</given-names>
            ,
            <surname>Hansen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            ,
            <surname>Ritter</surname>
          </string-name>
          , H.:
          <article-title>Combining visual and auditory data exploration for finding structure in high-dimensional data</article-title>
          .
          <source>Technical Report on MCMC sonifications</source>
          (
          <year>2001</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          8. Hermann,
          <string-name>
            <given-names>T.</given-names>
            ,
            <surname>Niehus</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            ,
            <surname>Ritter</surname>
          </string-name>
          , H.:
          <article-title>Interactive visualization and sonification for monitoring complex processes</article-title>
          .
          <source>In: Proc. of the 2003 International Conference on Auditory Display</source>
          . ICAD, International Community for Auditory Display (
          <year>2003</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          9.
          <string-name>
            <surname>Kasakevich</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Boulanger</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Bischof</surname>
            ,
            <given-names>W.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Garcia</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          :
          <article-title>Augmentation of visualisation using sonification: A case study in computational fluid dynamics</article-title>
          .
          <source>In: Proc. of the IPT-EGVE Symposium. The Eurographics Association</source>
          (
          <year>2007</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          10.
          <string-name>
            <surname>Kramer</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Walker</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Bonebright</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Cook</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Flowers</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Miner</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Neuhoff</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Bargar</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Barrass</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Berger</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Evreinov</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Fitch</surname>
            ,
            <given-names>W.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Grohn</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Handel</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kaper</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Levkowitz</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Lodha</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Shinn-Cunningham</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Simoni</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Tipei</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          :
          <article-title>Sonification report: Status of the field and research agenda - report prepared for the National Science Foundation by members of the International Community for Auditory Display (</article-title>
          <year>1999</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          11.
          <string-name>
            <surname>Loeb</surname>
            ,
            <given-names>R.G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Fitch</surname>
            ,
            <given-names>W.T.</given-names>
          </string-name>
          :
          <article-title>A laboratory evaluation of an auditory display designed to enhance intraoperative monitoring</article-title>
          .
          <source>Anesthesia &amp; Analgesia</source>
          <volume>94</volume>
          (
          <issue>2</issue>
          ),
          <fpage>362</fpage>
          -
          <lpage>368</lpage>
          (
          <year>2002</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          12.
          <string-name>
            <surname>McKinney</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Renaud</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          :
          <article-title>Leech: BitTorrent and music piracy sonification</article-title>
          .
          <source>In: SMC 2011: 8th Sound and Music Computing Conference</source>
          ,
          6-9 July 2011, Università di Padova, Padova, Italy (
          <year>2011</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          13.
          <string-name>
            <surname>Pauletto</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Hunt</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          :
          <article-title>A toolkit for interactive sonification</article-title>
          .
          <source>In: Proc. of the 10th International Conference on Auditory Display. International Community for Auditory Display (ICAD)</source>
          (
          <year>2004</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          14.
          <string-name>
            <surname>Salvador</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Minghim</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Levkowitz</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          :
          <article-title>User evaluations of interactive multimodal data presentation</article-title>
          .
          <source>In: Ninth International Conference on Information Visualisation</source>
          ,
          <year>2005</year>
          . Proceedings. pp.
          <fpage>11</fpage>
          -
          <lpage>16</lpage>
          . IEEE
          (
          <year>2005</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          15.
          <string-name>
            <surname>Smith</surname>
            ,
            <given-names>D.R.</given-names>
          </string-name>
          :
          <article-title>Effects of Training and Context on Human Performance in a Point Estimation Sonification Task (Master's Thesis)</article-title>
          .
          <source>Georgia Institute of Technology</source>
          (
          <year>2003</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          16.
          <string-name>
            <surname>Spehr</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          :
          <article-title>Funktionale Klänge: Mehr als ein Ping</article-title>
          . transcript Verlag (
          <year>2008</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          17.
          <string-name>
            <surname>Windt</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Iber</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Klein</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          :
          <article-title>Grooving factory - bottleneck control in production logistics through auditory display</article-title>
          . In:
          <string-name>
            <surname>Brazil</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          (ed.)
          <source>Proc. of the 2010 International Conference on Auditory Display (ICAD)</source>
          .
          <source>International Community for Auditory Display</source>
          , Washington, D.C., USA
          (
          <year>2010</year>
          )
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>