<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Investigating the relationship between empathy and attribution of mental states to robots</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Alberto Lillo</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Alessandro Saracco</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Elena Siletto</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Claudio Mattutino</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Cristina Gena</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Department of Computer Science, University of Turin</institution>
          ,
          <country country="IT">Italy</country>
        </aff>
      </contrib-group>
      <abstract>
<p>This paper describes an experimental evaluation aimed at detecting users' perception of a robot's empathic abilities during a conversation. The results were then analyzed to search for a possible relationship between perceived empathy and the attribution of mental states to the robot, namely the user's perception of the robot's mental qualities as compared to humans. The sample consisted of 68 subjects: 34 adults and 34 teenagers and children. Conducting the experiment with both adult and child participants made it possible to compare the results obtained from each group and identify any differences in perception between the age groups.</p>
      </abstract>
      <kwd-group>
<kwd>Human-Robot Interaction</kwd>
        <kwd>Empathy</kwd>
        <kwd>Mental State Attribution</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
<p>The advent of Artificial Intelligence and robotic technologies has ushered in an era of extraordinary potential for humanity, radically transforming the way we live, work, and interact. Within this context, Human-Robot Interaction (HRI) [1] emerges as a critically important field of study, at the crossroads of technological innovation and humanistic understanding. As technological progress leaps forward, crucial issues arise not only about how robots can assist humans in their daily activities, such as work [2], school [3], home [4], cleaning and caring for vulnerable persons [5], but also about how they can harmoniously integrate into the social dynamics that characterise our existence. Thus, the primary challenge of HRI is not merely technical but also profoundly relational: how to design robots that are not perceived as mere machines, but as social companions, capable of empathy and meaningful interaction with human beings [6, 7], supporting them in their daily lives and their preferential choices [8].</p>
      <p>This question opens the way for a broader reflection on the meaning of empathy in the robotic domain [9, 10] and the role it can play in facilitating an effective and positive social integration of robots. Empathy, traditionally understood as the ability to comprehend and share the feelings of others, becomes a desirable quality for robots as well, especially in areas where the human-robot relationship is crucial, such as elderly care, education, and therapeutic support. The design of empathic robots, however, raises complex questions, not only of a technological nature but also philosophical and ethical: is it possible for a machine to possess a true understanding of human emotions? And how does robotic empathy impact human perception of these new social entities? Simultaneously, advancements in HRI research highlight the importance of developing increasingly intuitive and natural human-robot interfaces [11, 12, 13], enabling fluid and bidirectional communication. In this sense, robot design cannot overlook a holistic approach that considers not only functional aspects but also emotional and relational ones, designing machines capable of "understanding" and adapting to the human context in which they are inserted.</p>
      <p>Workshop Robots for Humans 2024, Advanced Visual Interfaces, Arenzano, June 3rd, 2024. © 2024 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).</p>
    </sec>
    <sec id="sec-soa">
      <title>2. State of the art</title>
      <p>Empathy, the intrinsically human capacity to perceive and respond to others' emotions, represents one of the fundamental pillars in the advancement of Human-Robot Interaction (HRI). Historically confined to human interactions, the concept of empathy has progressively extended into the HRI field, aiming to make machines not only more intelligent but also more sensitive to the human emotional context. In scientific and technological literature, empathy in robots has emerged as a crucial area of research, reflecting a paradigm shift from mere functional efficiency towards the socio-emotional integration of robots into society.</p>
      <p>Advances in artificial intelligence, particularly in machine learning and computer vision, have led to the creation of systems capable of recognizing certain emotional states, paving the way for more natural and engaging robot-human interactions. However, deeply understanding and genuinely responding to complex emotional dynamics remain ambitious goals, given the heterogeneity and subtlety of human emotional expressions. Enabling robots to empathize presents significant challenges, such as accurately interpreting human emotional signals, including facial expressions, gestures, and vocal prosody, and generating appropriate behavioral responses.</p>
      <p>Studies have shown that robots that can adjust their behaviour according to the affective state or personality of a user are more accepted as interaction partners [14] and are seen as more friendly, caring, sympathetic, supportive and trustworthy. Therefore, several empathic models for social robots have been proposed [15, 16]. Succeeding in this daunting challenge can have profound positive effects on users' attitudes towards social robots. Responding to the user's affective experience in a socially appropriate manner is considered crucial to achieving user trust and satisfaction. In one experiment it was found that the ability of a robot to respond with its empathic system in a situationally appropriate manner is more important for comforting the user than a sophisticated and detailed recognition of affect [17].</p>
      <p>However, reacting empathically requires the robot's recognition of the user's emotional state. This knowledge is challenging to obtain, as it requires an evaluation of a deeply personal and individual experience, and for these reasons errors are likely to occur. This reinforces the importance of understanding how people respond to empathic capabilities when a robot behaves incongruently with the user's emotional experience. Inaccurate emotional responses may indeed have negative consequences on users' evaluations of an agent. Furthermore, virtual agents that display emotions incongruent with the situation are also less appreciated by users than those that do not express any emotion at all [18]. Research on the subject, however, has yet to fully uncover the effects of empathic behaviour in different situations, including possible inaccurate responses [18].</p>
      <p>According to several neurological and psychological studies [19, 20], the mirror neuron system is implicated in neurocognitive functions such as social cognition, language, empathy and the Theory of Mind (ToM) [21, 22], a human-specific ability to attribute mental states - intentions, thoughts, desires and emotions - to oneself and others in order to explain and predict behaviour. Specifically, attribution of mental states (AMS) has been defined as "the cognitive ability to reflect on one's own and others' mental states, such as beliefs, desires, feelings, and intentions" [23]. In [24], the authors presented an experimental study showing that the humanoid robot NAO is able to stimulate the attribution of mental states towards itself when it stimulates empathy. This result suggests a possible correlation between empathy toward the robot and humans' attribution of mental states to it.</p>
    </sec>
    <sec id="sec-exp">
      <title>3. The experiment</title>
      <p>In this Section we introduce an experimental evaluation aimed at detecting users' perception of the robot's empathic abilities during a conversation. The results were then analyzed to search for a possible relationship between perceived empathy and the attribution of mental states to the robot, namely the user's perception of the robot's mental qualities as compared to humans. The sample consisted of 68 subjects: 34 adults and 34 teenagers and children. Conducting the experiment with both adult and child participants made it possible to compare the results obtained from each group and identify any differences in perception between the age groups.</p>
      <p>Methodology and experimental design. To achieve the most significant results, it's crucial to adhere to a clear and well-defined methodology. The experiment's organization was meticulously planned to ensure its effective and efficient execution, leading to valid and reliable outcomes. The planning process began with identifying the user categories participating in the experiment. The 'adults' category includes users aged 18 and above, while the 'children and young people' category comprises users aged 17 and below. Each user category will be split into two groups: the control group and the experimental group. The control group won't be subjected to any changes in the independent variable, serving as a key reference point. This setup allows for the evaluation of the manipulation's impact by comparing the results of the two groups.</p>
      <p>Experimental group. Users in this group will interact with an expressive robot that can respond to the user's emotions and express its own state. The robot will also make movements during the conversation to facilitate non-verbal interaction.</p>
      <p>Control group. Users in the control group will interact with an apathetic robot, which is programmed to complete tasks without showing empathy towards the user's emotions. The robot will exhibit a less enthusiastic and more static demeanor, with no specific movements to aid non-verbal interaction.</p>
      <p>Independent variable. The independent variable is the social and emotional skill level implemented in the NAO virtual robot. Specifically, there are two conditions for the conduct of the experiment: i) emotional and empathic robot; ii) apathetic robot. In the course of the experiment, this variable will be manipulated in order to test the interaction with the two types of robots and to record the differences in users' perceptions.</p>
<p>Dependent variables. The primary dependent variable is the users' reactions to the different experimental conditions, encompassing all measurements and observations of users' responses post-interaction with the robot. These responses will be primarily gauged through a structured interview. Another potential dependent variable is task performance, with a focus on whether the user successfully completes the intended task. External variables that could impact the results should also be considered. Analyzing the dependent variables' data will enable the assessment of the independent variable manipulation's effect on the research.</p>
<p>Sample selection. When conducting experiments, it's crucial to choose a representative sample to prevent bias. However, for this experiment, the sample wasn't randomly selected but was chosen from a readily available and willing group. In particular, most children in the sample were from a cooperating dance school. Therefore, while the sample isn't fully representative of all user categories, it provides a solid foundation for future research.</p>
      <p>Measurement and instruments. The data collection for the dependent variables will utilize quantitative measurements, which allow for numerical data collection and subsequent statistical analysis. Initially, a questionnaire was selected as the most appropriate method for collecting this data. However, considering the administration method, a structured interview was introduced instead. This method, unlike questionnaires, provides the opportunity to clarify questions, aiding participants, including children, to fully understand and comfortably participate in data collection. The interview process involves asking the user to score each question on a rating scale. Two specific tools were used for this process.</p>
      <p>Batson's self-assessment: this assessment [25] asks participants to rate their experience of specific emotions on a scale from 1 to 5, from 'Not at all' to 'Totally'. The assessment evaluates 23 emotions, expressed through the following adjectives: frightened, suffering, sympathetic, sensitive, agitated, cordial, worried, stressed, sad, compassionate, upset, tender, distressed, impressed, downhearted, depressed, afflicted, annoyed, kind, melancholic, moved, and uncomfortable.</p>
      <p>AMS-Q questionnaire: the administration of this questionnaire [26, 27] will allow us to gauge the degree to which users attribute mental states to the NAO robot. The test consists of 25 questions that ask users to rate whether they think the robot can do certain things (e.g. "can it understand?", "can it decide?", "can it tell a lie?", "can it try to do something?").</p>
      <sec id="sec-exp-plan">
        <title>3.1. Experimental plan</title>
        <p>Each participant, numbered 1 to 34 based on their participation order, will be randomly assigned to either the experimental or control group. A random number generator was used for this assignment, with the first 17 numbers allocated to the experimental group and the remaining 17 to the control group. This procedure applies to both the 'adults' and the 'children and young people' categories. The test will proceed in multiple stages. Each of these steps has a precise objective and is designed to collect valid and reliable data:</p>
        <p>Introduction. In this initial stage, the user will be greeted and given a brief introduction. Only essential details for interacting with NAO will be provided at this time, while answers to additional inquiries will be deferred until the experiment's conclusion.</p>
        <p>Interaction and task. In this stage, the user will interact with the NAO virtual robot. The session will start with an introduction between the robot and the user. Following topics will include interests and family, culminating in the final task.</p>
        <p>Gathering quantitative information. Once the test has been completed, the user will immediately be given a questionnaire containing the necessary quantitative measurements, namely, the Batson self-assessment [25] and the AMS questionnaire [26, 27].</p>
        <p>Closing. In this last phase we will move on to the final greetings and thanks, answering users' questions and curiosities.</p>
      </sec>
      <sec id="sec-exp-analysis">
        <title>3.2. Data Analysis</title>
        <p>Upon completion of the experiment and data collection, various quantitative analyses will be conducted using Excel. For the Batson self-assessment results [25], the mean score and standard deviation for each emotion will be calculated for each group. A T-test will then be performed to determine if there's a significant difference between the means of the experimental and control groups. The AMS questionnaire [26, 27] results will undergo a similar analysis, but will first be divided into five mental state categories. T-tests will be calculated for both the experimental and control groups as a whole, and separately for children and adults. This allows for data interpretation across both groups and distinct age categories.</p>
      </sec>
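<p>The per-emotion analysis described above (group means and an independent-samples T-test with df=32 for two groups of 17) can be sketched as follows; the rating data shown are invented for illustration:</p>

```python
# Minimal sketch of the analysis: Student's independent-samples T-test
# with pooled variance, as implied by the reported df = 17 + 17 - 2 = 32.
from statistics import mean


def pooled_t_test(a, b):
    """Return (t_statistic, degrees_of_freedom) for two samples."""
    na, nb = len(a), len(b)
    ma, mb = mean(a), mean(b)
    ssa = sum((x - ma) ** 2 for x in a)  # sum of squared deviations
    ssb = sum((x - mb) ** 2 for x in b)
    df = na + nb - 2
    sp2 = (ssa + ssb) / df               # pooled variance
    t_stat = (ma - mb) / (sp2 * (1 / na + 1 / nb)) ** 0.5
    return t_stat, df


# Hypothetical 1-5 ratings for one emotion, 17 users per group
experimental = [4, 3, 5, 4, 4, 3, 5, 4, 3, 4, 5, 4, 3, 4, 5, 4, 4]
control = [1, 2, 1, 3, 2, 1, 2, 2, 1, 3, 2, 1, 2, 3, 1, 2, 2]
t_stat, df = pooled_t_test(experimental, control)  # df is 32 here
```

<p>The resulting t statistic is then compared against the critical value for df=32 at the chosen significance level (p&lt;0.05), mirroring the Excel-based procedure.</p>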
    </sec>
    <sec id="sec-2">
<title>4. Creating the personality of the virtual robot NAO</title>
      <p>The process of defining the personality of NAO constituted the first step necessary for the implementation of the robot. For this implementation, we focused on the creation of a Personas. This step then made it possible to program the robot and write dialogues that were consistent with each other and with the robot's personality. In order to describe and frame the desired personality of the robot, the Big Five [28] test was performed. Thanks to this test, it was possible to optimise and reason about character traits. In a first step, the job that NAO could hypothetically perform was chosen. The professional figure identified was the teacher. This, it was assessed, would fit well with the envisaged personality and the subsequent task. During the following phase, the main characteristics to be attributed to the robot were chosen, which were considered fundamental, such as the traits: calm, patient and wise. Once these elements had been identified, the test was completed. The results obtained are as follows.</p>
      <p>Emotional Stability: 37 out of 120. The robot was chosen to have a very positive attitude and to rarely experience negative emotions. It is characterised by a sunny and patient manner. These traits were also favoured with a view to safe interaction.</p>
      <p>Extraversion: 88 out of 120. As discussed extensively in the introduction, robots need to be able to communicate and engage in interaction with ease in order to be recognised as social agents. This character trait emerged from this reflection. In particular, high scores in this category describe a sociable and assertive personality. This could lead NAO to make friends very quickly and relate to users.</p>
      <p>Openness to experience: 88 out of 120. In order to model a robot that would be credible when tested with users, it was decided not to attribute characteristics to NAO that could be considered unthinkable. Its character was therefore calibrated to be grounded in facts.</p>
      <p>Conscientiousness: 106 out of 120. This is the category where NAO scored the highest, since it has been designed to be a responsible, organised and disciplined robot. Furthermore, questions concerning levels of confidence in one's own abilities were always answered with 'agree' and 'strongly agree'.</p>
      <p>Agreeableness: 101 out of 120. In terms of values, sincerity and the spirit of cooperation were favoured. Those who score high on this trait are also characterised by kindness and altruism.</p>
    </sec>
    <sec id="sec-3">
      <title>5. Results and Comparisons</title>
      <sec id="sec-res-children">
        <title>5.1. 'Children and Youth' category</title>
        <p>Comparing the Batson self-assessment results, both the experimental and control groups reported minimal negative emotions, with over 88% stating they felt 'Not at all' distressed, worried, stressed, sad, upset, downhearted, depressed, and annoyed. However, differences emerged in positive emotions, with 100% of the experimental group feeling 'Totally' nice and kind, compared to 76.5% in the control group. The T-test showed minimal significance levels for 'Sympathetic' (t-stat=-2.135, df=32, p&lt;0.05) and 'Kind' (t-stat=-2.063, df=32, p&lt;0.05). Regarding the AMS questionnaire, similar average scores were observed in all dimensions for both groups (control vs. experimental): epistemic (7.5 vs. 8.9), emotional (7.5 vs. 7.7), desires and intentions (8.8 vs. 9.7), imagination (7.6 vs. 8.8), and perceptual (6.8 vs. 6.7). The T-test did not reveal any significance, but it's noteworthy that high average scores were obtained in each dimension for both groups.</p>
      </sec>
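<p>Before the T-tests, the 25 AMS-Q item ratings are grouped into the five mental-state dimensions (epistemic, emotional, desires and intentions, imagination, perceptual). A minimal aggregation sketch follows; the item-to-dimension mapping is purely illustrative, as the published AMS-Q scoring key is not reproduced here:</p>

```python
# Illustrative sketch: aggregate 25 AMS-Q item ratings into five
# mental-state dimensions. The grouping (5 consecutive items per
# dimension) is a placeholder, not the validated AMS-Q key.
DIMENSIONS = [
    "epistemic",
    "emotional",
    "desires_and_intentions",
    "imagination",
    "perceptual",
]


def dimension_scores(ratings):
    """ratings: list of 25 item scores, grouped 5 per dimension."""
    if len(ratings) != 25:
        raise ValueError("AMS-Q has 25 items")
    return {
        dim: sum(ratings[i * 5:(i + 1) * 5])
        for i, dim in enumerate(DIMENSIONS)
    }
```

<p>Group-level dimension means (e.g. epistemic 8.9 for children in the experimental group) are then averages of these per-user scores, to which the same T-test procedure is applied.</p>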
<sec id="sec-res-adults">
        <title>5.2. 'Adult' category</title>
        <p>Comparing the Batson self-assessment results, both the experimental and control groups reported minimal negative emotions, with over 82% stating they felt 'Not at all' scared, hurt, sad, upset, distressed, downhearted, depressed, annoyed, melancholic, and uncomfortable. However, differences emerged in positive emotions. The experimental group reported total sympathy for the robot at 64.75%, while the control group reported 41.25%. The experimental group, which interacted with a more emotional robot, reported feeling somewhat more sensitive, with only 23.5% saying 'Not at all', compared to 53% in the control group. This difference was significant.</p>
        <p>The experimental group also reported feeling more friendly towards the robot, with 82.4% reporting 'Totally', compared to only 29.5% in the control group. This emotion was found to be significant (t-stat=-2.212, df=32, p&lt;0.05). For the feeling 'impressed', no subjects in the experimental group reported 'Not at all' and 41.25% indicated 'Very much'. In contrast, 64.7% of the subjects in the control group answered 'Not at all' and none indicated 'Totally'. This emotion was also found to be significant (t-stat=-5.944, df=32, p=0.000001). Regarding the AMS questionnaire, divergent averages were noted for all dimensions between the control and experimental group: epistemic (5.5 vs. 8.2), emotional (1.9 vs. 7.5), desires and intentions (4.0 vs. 7.9), imagination (2.0 vs. 6.9), and perceptual (2.4 vs. 5.6). Upon applying the T-test, the dimensions epistemic (t-stat=-2.81, df=32,</p>
      </sec>
      <sec id="sec-res-comparison">
        <title>5.3. Comparison between 'Children and Young People' and 'Adults'</title>
        <p>Comparing the results between the two categories, children and adults, the Batson test showed that both groups empathized with the robot, with higher averages for positive emotions. However, adults in the experimental group reported higher averages for positive emotions than the control group. In the AMS questionnaire, no clear differences emerged between the experimental and control groups for children. However, adults in the experimental group attributed higher scores to the robot in all dimensions. For the epistemic dimension, children in the experimental group had a mean of 8.9 and adults 8.2, while the control group had 7.5 for children and 5.5 for adults. For the emotional dimension, the experimental group averaged 7.7 for children and 7.5 for adults, while the control group had 7.5 for children and 1.9 for adults. For the dimension of desires and intentions, the experimental group averaged 9.7 for children and 7.9 for adults, while the control group had 8.8 for children and 4.0 for adults. For the imagination dimension, the experimental group had a mean of 8.8 for children and 6.9 for adults, while the control group had 7.6 for children and 2.0 for adults. Finally, for the perceptual dimension, the experimental group averaged 6.7 for children and 5.6 for adults, while the control group had 6.8 for children and 2.4 for adults.</p>
      </sec>
    </sec>
    <sec id="sec-disc">
      <title>6. Discussion and Conclusion</title>
      <p>The paper examined the relationship between Human-Robot Interaction (HRI) and empathy, and the role of social robots. It used a semi-structured interview method for data collection, which was effective in maintaining user concentration. The study found that the personal background of users, especially adults, significantly influenced their responses. The results revealed that adults in the experimental group attributed more mental states to the robot, suggesting that the robot's additional features, such as movements and positive mood, made it appear more capable. However, these features had no impact on younger users, who were consistently enthusiastic and perceived the robot as almost limitless. In conclusion, the paper highlighted the importance of emotions and empathy in HRI and suggested that these factors will be crucial in the future development of social robots.</p>
    </sec>
    <sec id="sec-refs">
      <title>References</title>
      <p>[1] M. A. Goodrich, A. C. Schultz, et al., Human-robot interaction: a survey, Foundations and Trends® in Human-Computer Interaction 1 (2008) 203-275.</p>
      <p>[2] D. Brunetti, C. Gena, F. Vernero, Smart interactive technologies in the human-centric factory 5.0: a survey, Applied Sciences 12 (2022) 7965.</p>
      <p>[3] S. Anwar, N. A. Bascou, M. Menekse, A. Kardgar, A systematic review of studies on educational robotics, Journal of Pre-College Engineering Education Research (J-PEER) 9 (2019) 2.</p>
      <p>[4] C. D. Kidd, C. Breazeal, Robots at home: Understanding long-term human-robot interaction, in: 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems, IEEE, 2008, pp. 3230-3235.</p>
      <p>[5] O. Pino, G. Palestra, R. Trevino, B. D. Carolis, The humanoid robot NAO as trainer in a memory program for elderly people with mild cognitive impairment, Int. J. Soc. Robotics 12 (2020) 21-33. URL: https://doi.org/10.1007/s12369-019-00533-y. doi:10.1007/s12369-019-00533-y.</p>
      <p>[6] C. Breazeal, Designing sociable robots, MIT Press, 2004.</p>
      <p>[7] C. Breazeal, K. Dautenhahn, T. Kanda, Social robotics, Springer Handbook of Robotics (2016) 1935-1972.</p>
      <p>[8] A. Jameson, S. Gabrielli, P. O. Kristensson, K. Reinecke, F. Cena, C. Gena, F. Vernero, How can we support users' preferential choice?, in: CHI '11 Extended Abstracts on Human Factors in Computing Systems, CHI EA '11, Association for Computing Machinery, New York, NY, USA, 2011, pp. 409-418. URL: https://doi.org/10.1145/1979742.1979620. doi:10.1145/1979742.1979620.</p>
      <p>[9] I. Leite, A. Pereira, S. Mascarenhas, C. Martinho, R. Prada, A. Paiva, The influence of empathy in human-robot relations, International Journal of Human-Computer Studies 71 (2013) 250-260.</p>
      <p>[10] A. Paiva, I. Leite, H. Boukricha, I. Wachsmuth, Empathy in virtual agents and robots: A survey, ACM Transactions on Interactive Intelligent Systems (TiiS) 7 (2017) 1-40.</p>
      <p>[11] D. Perzanowski, A. C. Schultz, W. Adams, E. Marsh, M. Bugajska, Building a multimodal human-robot interface, IEEE Intelligent Systems 16 (2001) 16-21.</p>
      <p>[12] T. Salter, K. Dautenhahn, R. Te Boekhorst, Learning about natural human-robot interaction styles, Robotics and Autonomous Systems 54 (2006) 127-134.</p>
      <p>[13] R. Stiefelhagen, C. Fugen, R. Gieselmann, H. Holzapfel, K. Nickel, A. Waibel, Natural human-robot interaction using speech, head pose and gestures, in: 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (IEEE Cat. No. 04CH37566), volume 3, IEEE, 2004, pp. 2422-2427.</p>
      <p>[14] P. Bucci, Building believable robots: an exploration of how to make simple robots look, move, and feel right, Ph.D. thesis, University of British Columbia, 2017. URL: https://open.library.ubc.ca/collections/ubctheses/24/items/1.0355204. doi:10.14288/1.0355204.</p>
      <p>[15] E. Bagheri, O. Roesler, B. Vanderborght, Toward a reinforcement learning based framework for learning cognitive empathy in human-robot interactions, 2020.</p>
      <p>[16] L. Battisti, S. Fagioli, A. Ferrato, C. Limongelli, S. Mastandrea, M. Mezzini, D. Nardo, G. Sansonetti, Towards empathetic social robots: Investigating the interplay between facial expressions and brain activity (short paper), in: A. Soto, E. Zangerle (Eds.), Joint Proceedings of the ACM IUI 2024 Workshops co-located with the 29th Annual ACM Conference on Intelligent User Interfaces (IUI 2024), Greenville, South Carolina, USA, March 18, 2024, volume 3660 of CEUR Workshop Proceedings, CEUR-WS.org, 2024. URL: https://ceur-ws.org/Vol-3660/paper22.pdf.</p>
      <p>[17] T. Bickmore, D. Schulman, Practical approaches to comforting users with relational agents, in: CHI '07 Extended Abstracts on Human Factors in Computing Systems, CHI EA '07, Association for Computing Machinery, New York, NY, USA, 2007, pp. 2291-2296. URL: https://doi.org/10.1145/1240866.1240996. doi:10.1145/1240866.1240996.</p>
      <p>[18] H. Cramer, J. Goddijn, B. Wielinga, V. Evers, Effects of (in)accurate empathy and situational valence on attitudes towards robots, in: 2010 5th ACM/IEEE International Conference on Human-Robot Interaction (HRI), 2010, pp. 141-142. doi:10.1109/HRI.2010.5453224.</p>
      <p>[19] L. Cattaneo, G. Rizzolatti, The mirror neuron system, Archives of Neurology 66 (2009) 557-560.</p>
      <p>[20] M. Iacoboni, Imitation, empathy, and mirror neurons, Annual Review of Psychology 60 (2009) 653-670. URL: https://www.annualreviews.org/content/journals/10.1146/annurev.psych.60.110707.163604. doi:10.1146/annurev.psych.60.110707.163604.</p>
      <p>[21] H. M. Wellman, The child's theory of mind, The MIT Press, 1992.</p>
      <p>[22] S. M. Carlson, M. A. Koenig, M. B. Harms, Theory of mind, Wiley Interdisciplinary Reviews: Cognitive Science 4 (2013) 391-402.</p>
      <p>[23] M. Brüne, M. Abdel-Hamid, C. Lehmkämper, C. Sonntag, Mental state attribution, neurocognitive functioning, and psychopathology: what predicts poor social competence in schizophrenia best?, Schizophrenia Research 92 (2007) 151-159.</p>
      <p>[24] C. Gena, F. Manini, A. Lieto, A. Lillo, F. Vernero, Can empathy affect the attribution of mental states to robots?, in: Proceedings of the 25th International Conference on Multimodal Interaction, ICMI '23, Association for Computing Machinery, New York, NY, USA, 2023, pp. 94-103. URL: https://doi.org/10.1145/3577190.3614167. doi:10.1145/3577190.3614167.</p>
      <p>[25] C. D. Batson, M. P. Polycarpou, E. Harmon-Jones, H. J. Imhof, E. C. Mitchener, L. L. Bednar, T. R. Klein, L. Highberger, Empathy and attitudes: can feeling for a member of a stigmatized group improve feelings toward the group?, J. Pers. Soc. Psychol. 72 (1997) 105-118.</p>
      <p>[26] L. Miraglia, G. Peretti, F. Manzi, C. Di Dio, D. Massaro, A. Marchetti, Development and validation of the attribution of mental states questionnaire (AMS-Q): A reference tool for assessing anthropomorphism, Frontiers in Psychology 14 (2023).</p>
      <p>[27] C. Di Dio, F. Manzi, S. Itakura, T. Kanda, H. Ishiguro, D. Massaro, A. Marchetti, It does not matter who you are: Fairness in pre-schoolers interacting with human and robotic partners, Int. J. Soc. Robot. 12 (2020) 1045-1059.</p>
      <p>[28] S. D. Gosling, P. J. Rentfrow, W. B. Swann Jr, A very brief measure of the big-five personality domains, Journal of Research in Personality 37 (2003) 504-528.</p>
    </sec>
  </body>
  <back>
    <ref-list />
  </back>
</article>