<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Not Too Close and Not Too Far: Comfort-Distance Towards Virtual Humans and Anthropomorphic Robot</article-title>
      </title-group>
      <contrib-group>
        <aff id="aff0">
          <label>0</label>
          <institution>Laboratory of Cognitive Science and Immersive Virtual Reality, CS-IVR, Department of Psychology, University of Campania Luigi Vanvitelli</institution>
          ,
          <addr-line>Viale Ellittico, 31, 81100 Caserta</addr-line>
          ,
          <country country="IT">Italy</country>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2020</year>
      </pub-date>
      <abstract>
        <p>Nowadays, machines like humanoid and non-humanoid robots no longer belong to an imagined futuristic scenario but to many real contexts of our time. This calls for a reconsideration of the relationship between humans and machines from a human-centered perspective. Indeed, although evidence has shown that people perceive robots as social beings and that robots are designed to ensure a pleasant and positive atmosphere, little is known about human comfort (i.e., positive or negative proxemic preferences) in interacting with human-like machines. More specifically, here we wondered whether the robot's appearance could serve as a socio-emotional cue influencing individuals' proxemic preferences. The present study addressed this issue by considering the Interpersonal-comfort space (IPS) as a reliable measure of the quality of social interactions. To this end, participants were asked to provide comfort-distance judgments while being approached by an anthropomorphic robot, a non-anthropomorphic robot, and male and female virtual confederates showing positive or negative facial expressions. Results suggest that participants' comfort distance can be ideally ordered along a line from a negative pole, with the non-anthropomorphic robot and angry humans, to a positive pole, with neutral and happy humans. Interestingly, the comfort distance from the anthropomorphic robot lies between these two poles.</p>
      </abstract>
      <kwd-group>
        <kwd>Human-Robot Interaction</kwd>
        <kwd>Comfort space</kwd>
        <kwd>IVR</kwd>
        <kwd>Anthropomorphic Characteristics</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>Introduction</title>
      <p>
        The rapid and widespread adoption of modern technologies and solutions (e.g., virtual assistants,
chatbots, machine learning, AI models, etc.) in our daily life confronts us with an
unprecedented scenario of new advantages, possibilities, limits and changes [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]. This
becomes all the more relevant given the evidence that entities like robots, both
anthropomorphic and non-anthropomorphic, no longer belong to an imagined futuristic scenario,
but to many real contexts of our time [
        <xref ref-type="bibr" rid="ref2 ref3">2, 3</xref>
        ]. Nowadays, indeed, robots are used in
industries, offices, and domestic environments, and they also serve a welcoming role in a
variety of services including retail, hotels, public transport and so forth [
        <xref ref-type="bibr" rid="ref4 ref5 ref6">4–6</xref>
        ]. As a
consequence, the co-existence of humans and robots in the same environment has to be
taken into account and reviewed from a human-centered perspective, because people
and robots have to interact effectively (and pleasantly) [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ]. (Copyright © 2020 for this paper by its authors. Use permitted under Creative
Commons License Attribution 4.0 International (CC BY 4.0).)
      </p>
      <p>
        According to proxemics studies, the use of space during interactions is a good
measure of the quality of social interactions [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ]. The interpersonal space (IPS) represents the optimal distance
between ourselves and others, that is, our emotional ‘private space’ [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ]. In the social
psychology literature, a typical task to assess the size of interpersonal space is based on
comfort-distance judgments provided through the ‘stop-distance’ paradigm:
participants have to stop the interactant at the point where they still feel comfortable with the
other’s proximity [
        <xref ref-type="bibr" rid="ref10 ref8 ref9">8–10</xref>
        ]. Usually, people react by increasing the IPS in
uncomfortable/threatening situations and by reducing the IPS in comfortable/safe situations [
        <xref ref-type="bibr" rid="ref11 ref8">8, 11</xref>
        ].
In line with proxemics studies, evidence has demonstrated that IPS is modulated by the
socio-emotional characteristics of the interactant such as age, gender, and facial
expression, even when the interactant is a virtual character (e.g. [
        <xref ref-type="bibr" rid="ref12 ref13">12, 13</xref>
        ]). Similar to IPS among
humans, people tend to maintain their personal/private zone when interacting with
robots [
        <xref ref-type="bibr" rid="ref14 ref15">14, 15</xref>
        ] or virtual robots [
        <xref ref-type="bibr" rid="ref16 ref17">16, 17</xref>
        ]. Although several human-related and
robot-related factors can affect proxemics in Human-Robot interaction (HRI) (e.g. individuals’
familiarity with robots and the robot’s form), previous studies have emphasized the role
of robot appearance in proxemics preferences (e.g. [
        <xref ref-type="bibr" rid="ref15 ref18">15, 18</xref>
        ]). Indeed, a higher degree
of anthropomorphism is linked to higher expectations of adherence to human proxemics
norms (e.g. [
        <xref ref-type="bibr" rid="ref18">18</xref>
        ]). Although there are many studies on robots [
        <xref ref-type="bibr" rid="ref18 ref19">18, 19</xref>
        ], to our knowledge
no study has investigated how much the anthropomorphic appearance in itself can
represent a socio-emotional value in proxemics terms.
      </p>
      <p>
        Here we wondered whether, based on the anthropomorphic appearance, individuals tend to
consider robots as friendly interlocutors to get in touch with or as potentially
unpleasant/threatening interlocutors to stay away from. A possible way to address this issue is
to compare individuals' comfort-distance when interacting with human confederates
showing positive, negative, or neutral emotions and with anthropomorphic and
non-anthropomorphic robots. Comparing emotionally expressive humans with the two
different robots allowed us to better understand how people conceive of the
anthropomorphic appearance in proxemic terms. To this end, we devised an IVR study in which
participants were asked to determine the comfort-distance (distance people prefer from other
persons) while being approached by anthropomorphic and non-anthropomorphic robots
and virtual confederates with happy, angry or neutral facial expressions. The IPS
sensitivity to social cues could be revealing of subjective dispositions towards the
interactant (virtual human, robot) [
        <xref ref-type="bibr" rid="ref13 ref20">13, 20</xref>
        ].
      </p>
      <p>
        We expect that the humanoid appearance should induce participants to reduce the
distance compared to the non-anthropomorphic robot. However, we cannot hypothesize
which socio-emotional value (facial expression) the anthropomorphic appearance may
be considered similar to (e.g. [
        <xref ref-type="bibr" rid="ref13">13</xref>
        ]).
      </p>
    </sec>
    <sec id="sec-2">
      <title>Materials and Methods</title>
      <sec id="sec-2-1">
        <title>Participants</title>
        <p>Thirty-one participants (16 females) were recruited (mean age = 23.03, SD = 3.03). All
participants had normal or corrected-to-normal vision. Nobody reported
discomfort or vertigo during the immersive virtual reality (IVR) experience. Participants
gave their written consent to take part in this study. Experiment and testing were in
conformity with the 2013 Declaration of Helsinki and in accordance with the criteria
established by the Local Ethics Committee (Dept. of Psychology, University of
Campania L. Vanvitelli).</p>
      </sec>
      <sec id="sec-2-2">
        <title>Immersive Virtual Reality (IVR) equipment and Setting</title>
        <p>The experiment was conducted in the Laboratory of Cognitive Science and Immersive
Virtual Reality (CS-IVR), Department of Psychology, University of Campania L.
Vanvitelli-Caserta (Italy). The IVR system was installed in a rectangular room (5 m x 4 m x 3 m)
and included the Vizard Virtual Reality Software Toolkit 4.10 (WorldViz, LLC, USA)
with the Oculus Rift DK2 as head-mounted display (HMD), which has two OLED displays
for stereoscopic depth (resolution = 1920 x 1080; field of view 90° horizontally, 110° diagonally). The
IVR system allowed for continuous tracking and recording of the participant's position
by means of a marker placed on the HMD; visual information was updated in real time.
All virtual stimuli were modelled with the free 3D Google SketchUp
7.0 software and 3DS Max (Autodesk). The position and orientation tracking
systems allowed the participants to realistically experience dynamic and stereoscopic
visuo-motor input as if they were in front of natural stimuli.</p>
      </sec>
      <sec id="sec-2-3">
        <title>Virtual stimuli and Virtual environment</title>
        <p>The virtual room (3 x 2.4 x 3 m) consisted of green walls, white ceiling, and a grey
floor with a 3 m white dashed line from the initial position of the participants to the end
of the virtual room.</p>
        <p>
          A total of six confederates (half female) with angry, happy, and neutral facial
expressions were selected from a set of highly realistic virtual humans (Vizard
Complete Characters, WorldViz, USA). The emotional expression of the face was obtained
by modelling the virtual faces with 3DS Max (Autodesk) following the KDEF database
[
          <xref ref-type="bibr" rid="ref21">21</xref>
          ]. The sample of virtual confederates was selected on the basis of a previous pilot
study [
          <xref ref-type="bibr" rid="ref13">13</xref>
          ] in which 14 participants rated, on a 9-point Likert scale, how much the faces
presented on a PC screen appeared happy/unhappy, friendly/threatening, angry/peaceful,
and annoying/quiet. Following this evaluation, twelve virtual confederates were
selected whose facial expressions were: happy (two males and two females), angry (two
males and two females), and neutral (two males and two females) (Fig. 1).
        </p>
        <p>
          The selected virtual confederates represented male and female adults aged about 30
years, perceived as representative of typical Italian citizens. The virtual
confederates kept their arms extended along the body (see Fig. 1). An anthropomorphic robot
and a non-anthropomorphic robot were also used (Fig. 1). The height of all virtual
stimuli (male and female virtual confederates,
anthropomorphic/non-anthropomorphic robots) was 175 cm. Walking speed (0.5 m/s) and approach trajectory were
constant for all virtual stimuli [
          <xref ref-type="bibr" rid="ref22">22</xref>
          ].
        </p>
        <p>After the presentation of the IVR devices and an initial exploration of the virtual world to
familiarize themselves with the IVR equipment and the environment, all participants received
written instructions about the comfort task. These instructions were then also orally
repeated by the experimenter. Participants wore the HMD and were invited to freely explore
the virtual room. Through the HMD, participants could see the virtual stimuli fully
immersed in the virtual scene. After this familiarization phase, participants were guided
by the experimenter to a pre-marked starting position, holding a key-press device in
their dominant hand. Throughout the entire experimental session, participants kept
their arms extended along the body, matching the posture assumed by the virtual
confederates and the anthropomorphic robot.</p>
        <p>The experimental flow comprised a four-trial training session to allow the participants
to familiarize themselves with the task. After that, the testing phase began with a short
presentation of the instructions (2 s) followed by a fixation cross (300 ms); then a virtual
stimulus (i.e. a male/female virtual confederate, an anthropomorphic or a
non-anthropomorphic robot) appeared. Participants stood still and saw each virtual stimulus moving
towards them at a constant speed, until they stopped it by pressing the button. Indeed,
the comfort-task instructions were: ''Press the button as soon as the distance between
you and the virtual stimulus makes you feel uncomfortable''. After the button press, the
virtual stimulus disappeared, and the next trial was presented. Each virtual stimulus was
randomly presented (i.e., virtual human confederates showing happy, angry, and neutral
facial expressions 4 times each, anthropomorphic and non-anthropomorphic robots 4
times each; total of 32 trials).</p>
      </sec>
    </sec>
    <sec id="sec-3">
      <title>Data analysis</title>
      <p>The distance (in cm) at which the participants stopped the virtual stimuli was measured.
Each participant's arm length was then subtracted from the average distance.</p>
      <p>A one-way ANOVA for within-subject designs was used to analyze the mean distance
(cm) between the participants and the virtual stimuli, with Virtual stimulus as a 5-level factor
(Non-anthropomorphic Robot, Anthropomorphic Robot, Angry, Neutral, Happy). Data points outside
M ± 2.5 SD (0.04%) were discarded. The Tukey post-hoc test was used. The magnitude
of significant effects was expressed by partial eta-squared (η²p).</p>
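      <p>The M ± 2.5 SD outlier criterion can be illustrated with a minimal pure-Python sketch. The helper below and its numbers are fabricated for illustration; this is not the authors' analysis script.</p>

```python
import statistics

# Discard data points lying outside mean ± k standard deviations (k = 2.5 in
# the study). Assumed helper with fabricated values, not the authors' code.
def trim_outliers(values, k=2.5):
    m = statistics.mean(values)
    sd = statistics.stdev(values)
    lo, hi = m - k * sd, m + k * sd
    kept = [v for v in values if hi >= v >= lo]
    return kept, len(values) - len(kept)

# Twenty plausible comfort distances (cm) plus one extreme value.
values = [150.0] * 20 + [1000.0]
kept, discarded = trim_outliers(values)
print(discarded)  # 1
```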
    </sec>
    <sec id="sec-4">
      <title>Results</title>
      <p>The results showed a significant main effect of Virtual stimuli, F(4, 30) = 4.50, P&lt;0.01,
η²p = 0.14. The Tukey post-hoc test showed that participants preferred a significantly
larger comfort distance from the non-anthropomorphic robot than from virtual humans
looking happy (P=0.01) or with a neutral expression (P&lt;0.05). Similarly, they preferred a
larger comfort distance from virtual humans looking angry than happy (P=0.01) or with
a neutral expression (P&lt;0.05). There was no significant difference between angry virtual
humans and the non-anthropomorphic robot (P=1). The only virtual stimulus that showed
no significant difference from either the non-anthropomorphic robot or the humans was the
anthropomorphic robot. As shown by the related means, the comfort distance can be
ideally ordered along a line going from a negative pole with the non-anthropomorphic
robot (mean = 169.23, SD = 50.96) and angry humans (mean = 168.89, SD = 32.13) to a
positive pole with neutral (mean = 142.94, SD = 53.80) and happy (mean = 137.48, SD =
24.05) humans. As also shown in Fig. 2, the comfort distance from the anthropomorphic
robot lies in between these two poles (mean = 154.07, SD = 65.52).</p>
      <p>For explorative purposes, four t-tests for dependent samples (corrected for multiple
comparisons, α = 0.0125) were performed contrasting the factor “Robot” with all the others.
The analyses showed that the only comparison approaching significance was
between the anthropomorphic and the non-anthropomorphic robot (t(30) = -0.545, P =
0.016). In contrast, no significant difference from the human stimuli emerged.</p>
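      <p>The explorative contrasts above rely on dependent-samples t-tests with a Bonferroni-adjusted threshold (α = 0.05 / 4 = 0.0125). A minimal sketch of the statistic follows, using the standard paired-t formula rather than the authors' software; the data passed in are assumed to be per-participant mean distances per condition.</p>

```python
import math
import statistics

# Dependent-samples t statistic (n - 1 degrees of freedom). Standard textbook
# formula; illustrative sketch only, not the authors' analysis code.
def paired_t(x, y):
    diffs = [a - b for a, b in zip(x, y)]
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)
    return mean_d / (sd_d / math.sqrt(len(diffs)))

# Bonferroni correction for the four "Robot vs. other" comparisons.
bonferroni_alpha = 0.05 / 4
print(bonferroni_alpha)  # 0.0125
```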
    </sec>
    <sec id="sec-5">
      <title>Discussion</title>
      <p>
        Although Human-Robot Interaction (HRI) researchers have
studied a variety of opening encounters and factors that can influence human proxemics
preferences towards robots, to our knowledge no study has investigated how much the anthropomorphic
characteristic can represent a socio-emotional value in proxemics terms [
        <xref ref-type="bibr" rid="ref1 ref2 ref23 ref24">1, 2, 23, 24</xref>
        ].
Namely, here we wondered whether, on the basis of their anthropomorphic appearance, people treat robots as pleasant and friendly interactants or
as disturbing/unpleasant interactants when sharing the same social environment. Therefore, we considered the IPS
comfort-distance as a reliable measure of the quality of social interactions [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ], measured by
asking participants to determine interpersonal comfort-distance while interacting with
an anthropomorphic robot, virtual confederates expressing happy, angry or neutral
facial expressions and a non-anthropomorphic robot.
      </p>
      <p>
        In line with the literature, the results showed that the comfort-distance was larger
with angry virtual confederates and the non-anthropomorphic robot compared to happy
and neutral ones [
        <xref ref-type="bibr" rid="ref13 ref16">13, 16</xref>
        ]. More interestingly for our purposes, results revealed that
participants' preferred comfort-distance towards the anthropomorphic robot lies
between angry confederates, non-anthropomorphic robot and the virtual humans
exhibiting neutral and happy facial expressions. In other words, in proxemics terms
participants kept the anthropomorphic robot to a distance in between negative (i.e. angry
confederates and non-anthropomorphic robot) and positive (i.e. happy and neutral
confederates) virtual stimuli. Therefore, our data reflect the social and safety components of
the IPS [
        <xref ref-type="bibr" rid="ref13 ref16">13, 16</xref>
        ]. Compared to the non-anthropomorphic robot, the humanoid appearance
evoked a more human-like interaction [
        <xref ref-type="bibr" rid="ref15">15</xref>
        ]. Indeed, humanizing the
robot reflects the tendency of individuals to see non-human agents as human beings
and to attribute to them intentions, motivations, or goals similar to human ones. This
strongly influences the way people treat these agents [
        <xref ref-type="bibr" rid="ref25">25</xref>
        ].
      </p>
      <p>
        This pattern of results suggests that each social interaction implies approach and
avoidance behaviors that drive the optimal regulation of interpersonal distance [
        <xref ref-type="bibr" rid="ref26">26</xref>
        ].
In line with the goal of the present study, our results indicate that during
human-robot interaction the anthropomorphic appearance plays a relevant socio-emotional role
in regulating the optimal IPS distance. Just as in human-human interaction it is common
for individuals to form quick impressions of an unknown interactant, in
Human-Robot interaction people tend to form the same quick impressions, and the
robot’s appearance has proved particularly important [
        <xref ref-type="bibr" rid="ref18">18</xref>
        ]. Indeed, the anthropomorphic
characteristics increase the robot’s familiarity and the perception to be in touch with a
“human-social entity” [
        <xref ref-type="bibr" rid="ref27">27</xref>
        ].
      </p>
      <p>
        To conclude, the present findings may contribute to a better understanding of the human
tendency to create “socially interactive robots” with human-like characteristics [
        <xref ref-type="bibr" rid="ref28">28</xref>
        ].
      </p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>1. Li, R., van Almkerk, M., van Waveren, S., Carter, E., Leite, I.: Comparing Human-Robot Proxemics between Virtual Reality and the Real World. In: ACM/IEEE International Conference on Human-Robot Interaction (2019). https://doi.org/10.1109/HRI.2019.8673116.</mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>2. Dautenhahn, K., Walters, M., Woods, S., Koay, K.L., Nehaniv, C.L., Sisbot, E.A., Alami, R., Siméon, T.: How may I serve you? A robot companion approaching a seated person in a helping context. In: HRI 2006: Proceedings of the 2006 ACM Conference on Human-Robot Interaction (2006). https://doi.org/10.1145/1121241.1121272.</mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>3. Sadka, O., Giron, J., Friedman, D., Zuckerman, O., Erel, H.: Virtual-reality as a Simulation Tool for Non-humanoid Social Robots. In: Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems (2020). https://doi.org/10.1145/3334480.3382893.</mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>4. Anderson-Bashan, L., Megidish, B., Erel, H., Wald, I., Hoffman, G., Zuckerman, O., Grishko, A.: The Greeting Machine: An Abstract Robotic Object for Opening Encounters. In: RO-MAN 2018 - 27th IEEE International Symposium on Robot and Human Interactive Communication (2018). https://doi.org/10.1109/ROMAN.2018.8525516.</mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>5. Ivanov, S.H., Webster, C., Berezina, K.: Adoption of robots and service automation by tourism and hospitality companies. Revista Turismo &amp; Desenvolvimento 27, 1501-1517 (2018).</mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>6. Kanda, T., Shiomi, M., Miyashita, Z., Ishiguro, H., Hagita, N.: An affective guide robot in a shopping mall. In: Proceedings of the 4th ACM/IEEE International Conference on Human-Robot Interaction, HRI'09 (2008). https://doi.org/10.1145/1514095.1514127.</mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>7. Spencer, R.E.: Book Reviews: The Hidden Dimension by Edward T. Hall. New York: Doubleday and Company, Inc., 1966. Pp. xii + 193. Educational and Psychological Measurement (1966). https://doi.org/10.1177/001316446602600462.</mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>8. Hayduk, L.A.: Personal space: Where we now stand. Psychological Bulletin (1983). https://doi.org/10.1037/0033-2909.94.2.293.</mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>9. Aiello, J.R.: Human Spatial Behavior. In: Stokols, D., Altman, I. (eds.) Handbook of Environmental Psychology, pp. 389-504. John Wiley and Sons, New York (1987).</mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>10. Dosey, M.A., Meisels, M.: Personal space and self-protection. Journal of Personality and Social Psychology 11 (1969). https://doi.org/10.1037/h0027040.</mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>11. Kennedy, D.P., Gläscher, J., Tyszka, J.M., Adolphs, R.: Personal space regulation by the human amygdala. Nature Neuroscience (2009). https://doi.org/10.1038/nn.2381.</mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          12.
          <string-name>
            <surname>Iachini</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Coello</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Frassinetti</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Senese</surname>
            ,
            <given-names>V.P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Galante</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ruggiero</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          :
          <article-title>Peripersonal and interpersonal space in virtual and real environments: Effects of gender and age</article-title>
          .
          <source>Journal of Environmental Psychology</source>
          . (
          <year>2016</year>
          ). https://doi.org/10.1016/j.jenvp.2016.01.004.
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          13.
          <string-name>
            <surname>Ruggiero</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Frassinetti</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Coello</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Rapuano</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>di Cola</surname>
            ,
            <given-names>A.S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Iachini</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          :
          <article-title>The effect of facial expressions on peripersonal and interpersonal spaces</article-title>
          .
          <source>Psychological Research</source>
          .
          <volume>81</volume>
          , (
          <year>2017</year>
          ). https://doi.org/10.1007/s00426-016-0806-x.
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          14.
          <string-name>
            <surname>Sardar</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Joosse</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Weiss</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Evers</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          :
          <article-title>Don't stand so close to me: Users' attitudinal and behavioral responses to personal space invasion by robots</article-title>
          .
          <source>In: HRI'12 - Proceedings of the 7th Annual ACM/IEEE International Conference on Human-Robot Interaction</source>
          (
          <year>2012</year>
          ). https://doi.org/10.1145/2157689.2157769.
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          15.
          <string-name>
            <surname>Takayama</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Pantofaru</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          :
          <article-title>Influences on proxemic behaviors in human-robot interaction</article-title>
          .
          <source>In: 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2009</source>
          (
          <year>2009</year>
          ). https://doi.org/10.1109/IROS.2009.5354145.
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          16.
          <string-name>
            <surname>Iachini</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Coello</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Frassinetti</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ruggiero</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          :
          <article-title>Body space in social interactions: A comparison of reaching and comfort distance in immersive virtual reality</article-title>
          .
          <source>PLoS ONE</source>
          .
          <volume>9</volume>
          , (
          <year>2014</year>
          ). https://doi.org/10.1371/journal.pone.0111511.
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          17.
          <string-name>
            <surname>Peters</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Yang</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Saikia</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Li</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Skantze</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          :
          <article-title>Towards the use of mixed reality for HRI design via virtual robots</article-title>
          .
          <source>In: Proceedings of the 1st International Workshop on Virtual, Augmented, and Mixed Reality for HRI (VAM-HRI)</source>
          (
          <year>2018</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          18.
          <string-name>
            <surname>Syrdal</surname>
            ,
            <given-names>D.S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Dautenhahn</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Walters</surname>
            ,
            <given-names>M.L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Koay</surname>
            ,
            <given-names>K.L.</given-names>
          </string-name>
          :
          <article-title>Sharing spaces with robots in a home scenario - Anthropomorphic attributions and their effect on proxemic expectations and evaluations in a live HRI trial</article-title>
          .
          <source>In: Proceedings of the AAAI Fall Symposium</source>
          (
          <year>2008</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>
          19.
          <string-name>
            <surname>Koay</surname>
            ,
            <given-names>K.L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Syrdal</surname>
            ,
            <given-names>D.S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ashgari-Oskoei</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Walters</surname>
            ,
            <given-names>M.L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Dautenhahn</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          :
          <article-title>Social Roles and Baseline Proxemic Preferences for a Domestic Service Robot</article-title>
          .
          <source>International Journal of Social Robotics</source>
          . (
          <year>2014</year>
          ). https://doi.org/10.1007/s12369-014-0232-4.
        </mixed-citation>
      </ref>
      <ref id="ref20">
        <mixed-citation>
          20.
          <string-name>
            <surname>Cartaud</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ruggiero</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ott</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Iachini</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Coello</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          :
          <article-title>Physiological response to facial expressions in peripersonal space determines interpersonal distance in a social interaction context</article-title>
          .
          <source>Frontiers in Psychology</source>
          .
          <volume>9</volume>
          , (
          <year>2018</year>
          ). https://doi.org/10.3389/fpsyg.2018.00657.
        </mixed-citation>
      </ref>
      <ref id="ref21">
        <mixed-citation>
          21.
          <string-name>
            <surname>Lundqvist</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Flykt</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Öhman</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          :
          <article-title>The Karolinska directed emotional faces (KDEF)</article-title>
          , (
          <year>1998</year>
          ). https://doi.org/10.1017/S0048577299971664.
        </mixed-citation>
      </ref>
      <ref id="ref22">
        <mixed-citation>
          22.
          <string-name>
            <surname>Bailenson</surname>
            ,
            <given-names>J.N.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Blascovich</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Beall</surname>
            ,
            <given-names>A.C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Loomis</surname>
            ,
            <given-names>J.M.</given-names>
          </string-name>
          :
          <article-title>Interpersonal distance in immersive virtual environments</article-title>
          , (
          <year>2003</year>
          ). https://doi.org/10.1177/0146167203029007002.
        </mixed-citation>
      </ref>
      <ref id="ref23">
        <mixed-citation>
          23.
          <string-name>
            <surname>Kulić</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Croft</surname>
            ,
            <given-names>E.A.</given-names>
          </string-name>
          :
          <article-title>Safe planning for human-robot interaction</article-title>
          .
          <source>Journal of Robotic Systems</source>
          . (
          <year>2005</year>
          ). https://doi.org/10.1002/rob.20073.
        </mixed-citation>
      </ref>
      <ref id="ref24">
        <mixed-citation>
          24.
          <string-name>
            <surname>Kulić</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Croft</surname>
            ,
            <given-names>E.A.</given-names>
          </string-name>
          :
          <article-title>Strategies for Safety in Human-Robot Interaction</article-title>
          .
          <source>In: IEEE International Conference on Advanced Robotics</source>
          (
          <year>2003</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref25">
        <mixed-citation>
          25.
          <string-name>
            <surname>Leichtmann</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Nitsch</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          :
          <article-title>How much distance do humans keep toward robots? Literature review, meta-analysis, and theoretical considerations on personal space in human-robot interaction</article-title>
          , (
          <year>2020</year>
          ). https://doi.org/10.1016/j.jenvp.2019.101386.
        </mixed-citation>
      </ref>
      <ref id="ref26">
        <mixed-citation>
          26.
          <string-name>
            <surname>Argyle</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Dean</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          :
          <article-title>Eye-Contact, Distance and Affiliation</article-title>
          .
          <source>Sociometry</source>
          . (
          <year>1965</year>
          ). https://doi.org/10.2307/2786027.
        </mixed-citation>
      </ref>
      <ref id="ref27">
        <mixed-citation>
          27.
          <string-name>
            <surname>Fink</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          :
          <article-title>Anthropomorphism and Human Likeness in the Design of Robots and Human-Robot Interaction</article-title>
          . (
          <year>2012</year>
          ). https://doi.org/10.1007/978-3-642-34103-8_20.
        </mixed-citation>
      </ref>
      <ref id="ref28">
        <mixed-citation>
          28.
          <string-name>
            <surname>Walters</surname>
            ,
            <given-names>M.L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Dautenhahn</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Koay</surname>
            ,
            <given-names>K.L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kaouri</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Boekhorst</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Nehaniv</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Werry</surname>
            ,
            <given-names>I.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Lee</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          :
          <article-title>Close encounters: Spatial distances between people and a robot of mechanistic appearance</article-title>
          .
          <source>In: Proceedings of 2005 5th IEEE-RAS International Conference on Humanoid Robots</source>
          (
          <year>2005</year>
          ). https://doi.org/10.1109/ICHR.2005.1573608.
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>