<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Automation Space: Towards a Design Space for Everyday Automation</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Peter Fröhlich</string-name>
          <email>peter.froehlich@ait.ac.at</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Jessica Bongard</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Matthias Baldauf</string-name>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>AIT Austrian Institute of Technology</institution>
          , Vienna,
          <country country="AT">Austria</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>ZHAW Zurich University of Applied Sciences</institution>
          , Zurich,
          <country country="CH">Switzerland</country>
        </aff>
        <aff id="aff2">
          <label>2</label>
          <institution>FHS St.Gallen University of Applied Sciences</institution>
          , St.Gallen,
          <country country="CH">Switzerland</country>
        </aff>
      </contrib-group>
      <fpage>43</fpage>
      <lpage>48</lpage>
      <abstract>
        <p>No longer are only experts confronted with (semi-)automated systems; automation has found its way into our everyday lives in various forms and applications. In this paper, we introduce our ongoing work towards a design space for “Everyday Automation” to uncover the dimensions of respective approaches, to identify research gaps and promising future applications, and to allow for transferring experiences and knowledge between different types of automated systems. Based on a literature review, we derived first dimensions for such a dedicated design space, such as the domain, the task type, the type of user interaction, and the automation level. For a visual presentation of this “Automation Space”, we propose a so-called morphological box, which might provide a suitable tool for overviewing the diverse manifestations of automation in everyday life and for supporting the ideation of novel approaches.</p>
      </abstract>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>-</title>
      <p>Workshop proceedings: Automation Experience across Domains.
In conjunction with CHI'20, April 26th, 2020, Honolulu, HI, USA.
Copyright © 2020 for this paper by its authors. Use permitted under
Creative Commons License Attribution 4.0 International (CC BY 4.0).
Website: http://everyday-automation.tech-experience.at</p>
    </sec>
    <sec id="sec-2">
      <title>Author Keywords</title>
      <p>Automation; design space; automation domain; automation
level.</p>
    </sec>
    <sec id="sec-3">
      <title>CCS Concepts</title>
      <p>• Human-centered computing → Human-computer
interaction (HCI);</p>
    </sec>
    <sec id="sec-4">
      <title>Introduction</title>
      <p>
        Whether it is a fully automatic vacuum cleaner in the living room
or a self-sufficient service for municipal information and
applications: automation appears in numerous forms in our
everyday life and is constantly evolving. Everyday
Automation [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ] is a very broad and complex topic, driven in
particular by recent advances in Artificial Intelligence (AI)
and “smart” devices at affordable prices.
      
      </p>
      <p>
        Everyday automation can be understood as a union of the
definitions of automation and everyday life: a process
in which individual functions or entire activities are
transferred from humans to machines [
        <xref ref-type="bibr" rid="ref15">15</xref>
        ], with a focus
on a person in their immediate, everyday environment. This
immediate everyday environment is determined in particular
by routines and habits, but also, for example, by mobility and
social interactions.
      
      </p>
      <p>
        From a scientific perspective, a categorization of the
numerous appearances of automation in our everyday life
is relevant in order to provide a comprehensive overview
and structure and to uncover possible gaps in research and
promising future applications. By analyzing existing
automation approaches for everyday tasks and by identifying
potential variants, we strive to unfold the so-called “design
space” of everyday automation. Design spaces have a long
history in HCI research. Examples include the work by
Buxton who introduced a taxonomy of input devices [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ] and the
work by Ballagas et al. [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ] who presented a design space
for using smartphones for ubiquitous input. Other examples
include a design space for driver-based automotive user
interfaces by Kern and Schmidt [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ] and a design space for
interactive public displays by Müller et al. [
        <xref ref-type="bibr" rid="ref19">19</xref>
        ].
      </p>
      <p>In the following, we introduce our ongoing work on
creating the Automation Space, a design space for Everyday
Automation. We report on the method, preliminary core
dimensions identified so far as well as a promising
visualization approach.</p>
    </sec>
    <sec id="sec-5">
      <title>Method</title>
      <p>
        To determine this design space and the dimensions
of Everyday Automation, we started to conduct a literature
review. Sources and search engines consulted for scientific
works include the ACM Digital Library, the IEEE Xplore
Digital Library, the AIS eLibrary, as well as ResearchGate and
Google Scholar. At the center of this review are keywords
and keyword combinations which were derived from
contributions to last year’s CHI workshop on Everyday
Automation Experience [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ] and included “everyday automation”,
“smart technology”, “smart devices”, “everyday interaction”,
“digital assistants”, “home automation” and “smart city”. The
search terms were iteratively expanded with newly
acquired knowledge. The main inclusion criterion was
that a source must contain recent approaches and
examples for everyday automation.
      
      </p>
      <p>
        The documents were analyzed according to
Qualitative Content Analysis (cf. [
        <xref ref-type="bibr" rid="ref16">16</xref>
        ]). This method is
particularly appropriate as it aims at
structuring certain themes and contents and at filtering out
and summarizing aspects of the material.
      
      </p>
    </sec>
    <sec id="sec-6">
      <title>Preliminary Dimensions</title>
      <p>From this literature review, we identified and selected five
preliminary core dimensions for a design space of
Everyday Automation. In the following, we briefly introduce these
dimensions and present corresponding examples from the
literature.</p>
      <sec id="sec-6-1">
        <title>Presence of the System</title>
        <p>
          Everyday Automation applications can be differentiated
according to the presence of a physical system. Based on
the analysis of the examples, the applications can be divided
into those with a virtual and those with a physical presence.
Examples of virtual systems are smart assistance systems
in cars [
          <xref ref-type="bibr" rid="ref11">11</xref>
          ], digital representatives [
          <xref ref-type="bibr" rid="ref17">17</xref>
          ] or virtual reality
indoor navigation systems [
          <xref ref-type="bibr" rid="ref9">9</xref>
          ]. Autonomous drones [
          <xref ref-type="bibr" rid="ref18">18</xref>
          ],
fully-automated coffee makers [
          <xref ref-type="bibr" rid="ref7">7</xref>
          ] and automatic vacuum
cleaner robots [
          <xref ref-type="bibr" rid="ref14">14</xref>
          ] are examples of physical systems, i.e.
physical representations of the automated object.
        </p>
      </sec>
      <sec id="sec-6-2">
        <title>Domain</title>
        <p>
          The following application domains of Everyday Automation
were identified from the literature examples: Education,
Health and Sports, Shopping and Restaurant,
Transportation, Home, Security, and Government. Frequently, several
different areas of application are mentioned for the same
example. For instance, food recognition in smart
refrigerators (e.g., for automating ordering processes) can be
used at home, but also in restaurants (e.g., [
          <xref ref-type="bibr" rid="ref6">6</xref>
          ]).
Furthermore, gaze-based interaction with displays can
serve as a public information display in a museum, but
can also be used as a game in hospital waiting
areas (e.g., [
          <xref ref-type="bibr" rid="ref24">24</xref>
          ]).
        
        </p>
      </sec>
      <sec id="sec-6-3">
        <title>Automated Task</title>
        <p>
          The analysis of the Everyday Automation
examples showed that the keywords identified for the
automated task are covered by the dimensions suggested by
Parasuraman et al. [
          <xref ref-type="bibr" rid="ref21">21</xref>
          ]: information acquisition, information
analysis, decision and action selection, and action
implementation. While information acquisition describes purely
sensory functions for capturing data from the environment,
information analysis deals with processing the captured
data. Decision and action selection deals with the
derivation of further action steps, and action implementation
comprises the actual execution of a selected action,
usually replacing the hand or voice of a person. For
example, an autonomous delivery droid (e.g., [
          <xref ref-type="bibr" rid="ref10">10</xref>
          ]) takes over
the complete delivery of orders, while a sports wearable
(e.g., [
          <xref ref-type="bibr" rid="ref8">8</xref>
          ]) only signals flow state feedback and
recommendations for further activity to the wearer.
        </p>
      </sec>
      <sec id="sec-6-4">
        <title>User Interaction</title>
        <p>
          Six different user interaction modalities for Everyday
Automation applications were identified: stationary or mobile
external device, hardware buttons, touch interface, hand
gestures, voice interface, and eye gestures. Stationary or mobile
external devices include in particular computers, tablets, and
smartphones, as well as hardware controllers, cameras,
and wearables. Examples of the various interaction
modalities include automated passport control at a stationary
gate (e.g., [
          <xref ref-type="bibr" rid="ref4">4</xref>
          ]), autonomous drone control by hand
gestures (e.g., [
          <xref ref-type="bibr" rid="ref18">18</xref>
          ]), and automatic language translation via a
voice interface (e.g., [
          <xref ref-type="bibr" rid="ref20">20</xref>
          ]).
        </p>
      </sec>
      <sec id="sec-6-5">
        <title>Automation Level</title>
        <p>
          According to Parasuraman et al. [
          <xref ref-type="bibr" rid="ref21">21</xref>
          ], the degree of
automation is divided into three levels: fully manual,
semi-automated, and fully automated. Manual means
that a task is carried out exclusively by humans; this level is
listed only for the sake of completeness.
Semi-automated means that a task is carried out by combining
the advantages of human skills with the advantages of the
machine [
          <xref ref-type="bibr" rid="ref13">13</xref>
          ]. In an AR-based system helping patients to
test their blood at home, such a combination of human action
and machine support takes place (e.g., [
          <xref ref-type="bibr" rid="ref3">3</xref>
          ]). Fully automated
means that a task is completed entirely and exclusively
by the machine. At a checkout-free grocery store such as
Amazon Go, the scanning of the items and the payment
process are carried out completely automatically (e.g., [
          <xref ref-type="bibr" rid="ref22">22</xref>
          ]).
        </p>
      </sec>
    </sec>
    <sec id="sec-7">
      <title>Visualization</title>
      <p>
        Everyday Automation covers a diverse and complex range
of applications. It is therefore not trivial to find an
appropriate form for visualizing the corresponding
design space. We propose a representation of the design
space based on the concept of the morphological box,
which has its origin in creativity techniques. This form
of visualization is based on dividing a subject into its
elementary components, determining the manifestations of
each component, and ultimately displaying combinations of
these elements [
        <xref ref-type="bibr" rid="ref23">23</xref>
        ]. These aspects parallel the central elements of a design
space. This form of visualization is therefore considered
particularly suitable for compactly visualizing a design
space with many dimensions and manifestations.
      </p>
      <p>
        In the morphological box, the red line symbolizes an automatic vacuum cleaner
robot [
        <xref ref-type="bibr" rid="ref14">14</xref>
        ], the green line an AR-based system helping
patients to test their blood at home [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ] and the blue line a
virtual reality indoor navigation system [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ]. Each additional
path through the dimensions might inspire a novel Everyday
Automation application.
      
      </p>
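      <p>The combinatorial structure of the morphological box can be sketched as a simple data structure. The following Python sketch uses the dimension names and manifestations from this paper (with condensed wording); the encoding of the vacuum cleaner robot as a concrete path is an illustrative assumption, not an assignment made in the paper.</p>

```python
# Sketch of the morphological box: each dimension maps to its
# possible manifestations. Dimension names follow the paper;
# option wording is condensed.
AUTOMATION_SPACE = {
    "Presence of the System": ["virtual", "physical"],
    "Domain": ["Education", "Health and Sports", "Shopping and Restaurant",
               "Transportation", "Home", "Security", "Government"],
    "Automated Task": ["information acquisition", "information analysis",
                       "decision and action selection", "action implementation"],
    "User Interaction": ["external device", "hardware buttons", "touch interface",
                         "hand gestures", "voice interface", "eye gestures"],
    "Automation Level": ["fully manual", "semi-automated", "fully automated"],
}

def is_valid_path(path: dict) -> bool:
    """A path through the box picks exactly one manifestation per dimension."""
    return (path.keys() == AUTOMATION_SPACE.keys()
            and all(choice in AUTOMATION_SPACE[dim]
                    for dim, choice in path.items()))

# The "red line" (vacuum cleaner robot) encoded as one assumed path:
vacuum_robot = {
    "Presence of the System": "physical",
    "Domain": "Home",
    "Automated Task": "action implementation",
    "User Interaction": "external device",
    "Automation Level": "fully automated",
}
```

      <p>Enumerating all other combinations of manifestations (the Cartesian product of the dimensions) yields the candidate paths that might inspire novel applications.</p>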
    </sec>
    <sec id="sec-8">
      <title>Conclusion and Outlook</title>
      <p>In this paper, we presented our ongoing work on creating
a design space for “Everyday Automation”. From a
literature review, we identified a first set of core dimensions: presence
of the system, domain, automated task, user interaction,
and automation level. For visualizing these dimensions and
their various existing manifestations, we proposed a
morphological box. This approach provides a compact overview
of manifestations and particularly supports the ideation of
novel applications.</p>
      <p>In future work, we will extend this first version of the
Automation Space with further dimensions. Additionally, we
plan to evaluate complementary and alternative visualization
approaches beyond the currently used morphological box.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>R.</given-names>
            <surname>Ballagas</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Borchers</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Rohs</surname>
          </string-name>
          , and
          <string-name>
            <given-names>J. G.</given-names>
            <surname>Sheridan</surname>
          </string-name>
          .
          <year>2006</year>
          .
          <article-title>The smart phone: a ubiquitous input device</article-title>
          .
          <source>IEEE Pervasive Computing</source>
          <volume>5</volume>
          ,
          <issue>1</issue>
          (
          <year>2006</year>
          ),
          <fpage>70</fpage>
          -
          <lpage>77</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>William</given-names>
            <surname>Buxton</surname>
          </string-name>
          .
          <year>1983</year>
          .
          <article-title>Lexical and Pragmatic Considerations of Input Structures</article-title>
          .
          <source>SIGGRAPH Comput. Graph</source>
          .
          <volume>17</volume>
          ,
          <issue>1</issue>
          (Jan.
          <year>1983</year>
          ),
          <fpage>31</fpage>
          -
          <lpage>37</lpage>
          . DOI: http://dx.doi.org/10.1145/988584.988586
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>Tom</given-names>
            <surname>Djajadiningrat</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Pei-Yin</given-names>
            <surname>Chao</surname>
          </string-name>
          , SeYoung Kim, Marleen Van Leengoed, and
          <string-name>
            <given-names>Jeroen</given-names>
            <surname>Raijmakers</surname>
          </string-name>
          .
          <year>2016</year>
          .
          <article-title>Mime: An AR-based System Helping Patients to Test their Blood at Home</article-title>
          .
          <source>In Proceedings of the 2016 ACM Conference on Designing Interactive Systems</source>
          .
          <fpage>347</fpage>
          -
          <lpage>359</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>Abdulsalam</given-names>
            <surname>Dukyil</surname>
          </string-name>
          , Ahmed Mohammed, and
          <string-name>
            <given-names>Mohamed</given-names>
            <surname>Darwish</surname>
          </string-name>
          .
          <year>2016</year>
          .
          <article-title>An optimization approach for a RFID-enabled passport tracking system</article-title>
          .
          <source>In Proceedings of the 4th International Conference on Control, Mechatronics and Automation</source>
          .
          <fpage>189</fpage>
          -
          <lpage>194</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>Peter</given-names>
            <surname>Fröhlich</surname>
          </string-name>
          , Matthias Baldauf, Thomas Meneweger, Ingrid Erickson, Manfred Tscheligi, Thomas Gable, Boris de Ruyter, and
          <string-name>
            <given-names>Fabio</given-names>
            <surname>Paternò</surname>
          </string-name>
          .
          <year>2019</year>
          .
          <article-title>Everyday Automation Experience: Non-Expert Users Encountering Ubiquitous Automated Systems</article-title>
          .
          <source>In Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems (CHI EA '19)</source>
          .
          Association for Computing Machinery, New York, NY, USA, Paper W25, 8 pages
          . DOI: http://dx.doi.org/10.1145/3290607.3299013
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>Xiaoyan</given-names>
            <surname>Gao</surname>
          </string-name>
          , Xiangqian Ding, Ruichun Hou, and
          <string-name>
            <given-names>Ye</given-names>
            <surname>Tao</surname>
          </string-name>
          .
          <year>2019</year>
          .
          <article-title>Research on Food Recognition of Smart Refrigerator Based on SSD Target Detection Algorithm</article-title>
          .
          <source>In Proceedings of the 2019 International Conference on Artificial Intelligence and Computer Science</source>
          .
          <fpage>303</fpage>
          -
          <lpage>308</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>Marc</given-names>
            <surname>Hassenzahl</surname>
          </string-name>
          and
          <string-name>
            <given-names>Holger</given-names>
            <surname>Klapperich</surname>
          </string-name>
          .
          <year>2014</year>
          .
          <article-title>Convenient, clean, and efficient?: the experiential costs of everyday automation</article-title>
          .
          <source>In Proceedings of the 8th Nordic Conference on Human-Computer Interaction: Fun, Fast, Foundational</source>
          . ACM,
          <fpage>21</fpage>
          -
          <lpage>30</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>Hayati</given-names>
            <surname>Havlucu</surname>
          </string-name>
          , Terry Eskenazi, Barış Akgün, Mehmet Cengiz Onbaşlı, Aykut Coşkun, and Oğuzhan Özcan.
          <year>2018</year>
          .
          <article-title>Flow state feedback through sports wearables: A case study on tennis</article-title>
          .
          <source>In Proceedings of the 2018 Designing Interactive Systems Conference</source>
          .
          <fpage>1025</fpage>
          -
          <lpage>1039</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <given-names>Darlis</given-names>
            <surname>Herumurti</surname>
          </string-name>
          , Anny Yuniarti, Imam Kuswardayan, Wijayanti Nurul Khotimah, and
          <string-name>
            <given-names>Wahyu</given-names>
            <surname>Widyananda</surname>
          </string-name>
          .
          <year>2017</year>
          .
          <article-title>Virtual reality navigation system in virtual mall environment</article-title>
          .
          <source>In Proceedings of the 3rd International Conference on Communication and Information Processing</source>
          .
          <fpage>209</fpage>
          -
          <lpage>213</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <given-names>Elin</given-names>
            <surname>Janebäck</surname>
          </string-name>
          and
          <string-name>
            <given-names>Matilda</given-names>
            <surname>Kristiansson</surname>
          </string-name>
          .
          <year>2019</year>
          .
          <article-title>Friendly robot delivery: Designing an autonomous delivery droid for collaborative consumption</article-title>
          .
          <source>Master's thesis.</source>
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <given-names>Katharina</given-names>
            <surname>Keller</surname>
          </string-name>
          , Kim Valerie Carl, Hendrik Jöntgen, Benjamin M. Abdel-Karim,
          <string-name>
            <given-names>Max</given-names>
            <surname>Mühlhäuser</surname>
          </string-name>
          , and
          <string-name>
            <given-names>Oliver</given-names>
            <surname>Hinz</surname>
          </string-name>
          .
          <year>2019</year>
          .
          <article-title>“KITT, where are you?” Why smart assistance systems in cars enrich people's lives</article-title>
          .
          <source>In Adjunct Proceedings of the 2019 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2019 ACM International Symposium on Wearable Computers</source>
          .
          <fpage>1120</fpage>
          -
          <lpage>1132</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <given-names>Dagmar</given-names>
            <surname>Kern</surname>
          </string-name>
          and
          <string-name>
            <given-names>Albrecht</given-names>
            <surname>Schmidt</surname>
          </string-name>
          .
          <year>2009</year>
          .
          <article-title>Design Space for Driver-Based Automotive User Interfaces</article-title>
          .
          <source>In Proceedings of the 1st International Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI '09)</source>
          .
          Association for Computing Machinery, New York, NY, USA,
          <fpage>3</fpage>
          -
          <lpage>10</lpage>
          . DOI:http://dx.doi.org/10.1145/1620509.1620511
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [13]
          <string-name>
            <given-names>Marcel</given-names>
            <surname>Langer</surname>
          </string-name>
          and
          <string-name>
            <given-names>Dirk</given-names>
            <surname>Söffker</surname>
          </string-name>
          .
          <year>2011</year>
          .
          <article-title>Human guidance and supervision of a manufacturing system for semi-automated production</article-title>
          .
          <source>In 2011 IEEE Jordan Conference on Applied Electrical Engineering and Computing Technologies (AEECT)</source>
          .
          IEEE,
          <fpage>1</fpage>
          -
          <lpage>6</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [14]
          <string-name>
            <given-names>Hyunsoo</given-names>
            <surname>Lee</surname>
          </string-name>
          and
          <string-name>
            <given-names>Amarnath</given-names>
            <surname>Banerjee</surname>
          </string-name>
          .
          <year>2015</year>
          .
          <article-title>Intelligent scheduling and motion control for household vacuum cleaning robot system using simulation based optimization</article-title>
          .
          <source>In 2015 Winter Simulation Conference (WSC)</source>
          . IEEE,
          <fpage>1163</fpage>
          -
          <lpage>1171</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          [15]
          <string-name>
            <given-names>Dietrich</given-names>
            <surname>Manzey</surname>
          </string-name>
          .
          <year>2012</year>
          . Systemgestaltung und Automatisierung. Springer Berlin Heidelberg, Berlin, Heidelberg,
          <fpage>333</fpage>
          -
          <lpage>352</lpage>
          . DOI: http://dx.doi.org/10.1007/978-3-642-19886-1_19
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          [16]
          <string-name>
            <given-names>Philipp</given-names>
            <surname>Mayring</surname>
          </string-name>
          and
          <string-name>
            <given-names>Thomas</given-names>
            <surname>Fenzl</surname>
          </string-name>
          .
          <year>2014</year>
          . Qualitative Inhaltsanalyse. Springer Fachmedien Wiesbaden, Wiesbaden,
          <fpage>543</fpage>
          -
          <lpage>556</lpage>
          . DOI: http://dx.doi.org/10.1007/978-3-531-18939-0_38
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          [17]
          <string-name>
            <given-names>Hila</given-names>
            <surname>Mehr</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H</given-names>
            <surname>Ash</surname>
          </string-name>
          , and
          <string-name>
            <given-names>D</given-names>
            <surname>Fellow</surname>
          </string-name>
          .
          <year>2017</year>
          .
          <article-title>Artificial intelligence for citizen services and government</article-title>
          .
          <source>Ash Center for Democratic Governance and Innovation, Harvard Kennedy School</source>
          , August
          <year>2017</year>
          ,
          <fpage>1</fpage>
          -
          <lpage>12</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          [18]
          <string-name>
            <given-names>Silvia</given-names>
            <surname>Mirri</surname>
          </string-name>
          , Catia Prandi, and
          <string-name>
            <given-names>Paola</given-names>
            <surname>Salomoni</surname>
          </string-name>
          .
          <year>2019</year>
          .
          <article-title>Human-Drone Interaction: state of the art, open issues and challenges</article-title>
          .
          <source>In Proceedings of the ACM SIGCOMM 2019 Workshop on Mobile AirGround Edge</source>
        </mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>
          [19]
          <string-name>
            <given-names>Jörg</given-names>
            <surname>Müller</surname>
          </string-name>
          , Florian Alt, Daniel Michelis, and
          <string-name>
            <given-names>Albrecht</given-names>
            <surname>Schmidt</surname>
          </string-name>
          .
          <year>2010</year>
          .
          <article-title>Requirements and Design Space for Interactive Public Displays</article-title>
          .
          <source>In Proceedings of the 18th ACM International Conference on Multimedia (MM '10)</source>
          .
          Association for Computing Machinery, New York, NY, USA,
          <fpage>1285</fpage>
          -
          <lpage>1294</lpage>
          . DOI: http://dx.doi.org/10.1145/1873951.1874203
        </mixed-citation>
      </ref>
      <ref id="ref20">
        <mixed-citation>
          [20]
          <string-name>
            <given-names>Hermann</given-names>
            <surname>Ney</surname>
          </string-name>
          .
          <year>2001</year>
          .
          <article-title>Stochastic modelling: from pattern classification to language translation</article-title>
          .
          <source>In Proceedings of the workshop on Data-driven methods in machine translation-Volume 14. Association for Computational Linguistics</source>
          ,
          <fpage>1</fpage>
          -
          <lpage>5</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref21">
        <mixed-citation>
          [21]
          <string-name>
            <given-names>Raja</given-names>
            <surname>Parasuraman</surname>
          </string-name>
          , Thomas B Sheridan, and
          <string-name>
            <given-names>Christopher D</given-names>
            <surname>Wickens</surname>
          </string-name>
          .
          <year>2000</year>
          .
          <article-title>A model for types and levels of human interaction with automation</article-title>
          .
          <source>IEEE Transactions on systems, man, and cybernetics-Part A: Systems and Humans</source>
          <volume>30</volume>
          ,
          <issue>3</issue>
          (
          <year>2000</year>
          ),
          <fpage>286</fpage>
          -
          <lpage>297</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref22">
        <mixed-citation>
          [22]
          <string-name>
            <given-names>Alex</given-names>
            <surname>Polacco</surname>
          </string-name>
          and
          <string-name>
            <given-names>Kayla</given-names>
            <surname>Backes</surname>
          </string-name>
          .
          <year>2018</year>
          .
          <article-title>The amazon go concept: Implications, applications, and sustainability</article-title>
          .
          <source>Journal of Business and Management</source>
          <volume>24</volume>
          ,
          <issue>1</issue>
          (
          <year>2018</year>
          ),
          <fpage>79</fpage>
          -
          <lpage>92</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref23">
        <mixed-citation>
          [23]
          <string-name>
            <given-names>Christian</given-names>
            <surname>Schawel</surname>
          </string-name>
          and
          <string-name>
            <given-names>Fabian</given-names>
            <surname>Billing</surname>
          </string-name>
          .
          <year>2018</year>
          . Morphologischer Kasten. Springer Fachmedien Wiesbaden, Wiesbaden,
          <fpage>219</fpage>
          -
          <lpage>221</lpage>
          . DOI: http://dx.doi.org/10.1007/978-3-
          <fpage>658</fpage>
          -18917-4_
          <fpage>57</fpage>
        </mixed-citation>
      </ref>
      <ref id="ref24">
        <mixed-citation>
          [24]
          <string-name>
            <given-names>Mélodie</given-names>
            <surname>Vidal</surname>
          </string-name>
          , Andreas Bulling, and
          <string-name>
            <given-names>Hans</given-names>
            <surname>Gellersen</surname>
          </string-name>
          .
          <year>2013</year>
          .
          <article-title>Pursuits: spontaneous interaction with displays based on smooth pursuit eye movement and moving targets</article-title>
          .
          <source>In Proceedings of the 2013 ACM international joint conference on Pervasive and ubiquitous computing</source>
          .
          <fpage>439</fpage>
          -
          <lpage>448</lpage>
          .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>