<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>A Modular REST-Based Framework for Human-in-the-Loop Robot-Assisted Personalized Rehabilitation in Neurodevelopmental Disorders</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Marco De Luca</string-name>
          <email>marco.deluca2@unina.it</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Davide De Tommaso</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Davide Ghiglino</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Anna Rita Fasolino</string-name>
          <email>fasolino@unina.it</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Porfirio Tramontana</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Agnieszka Wykowska</string-name>
          <email>agnieszka.wykowska@iit.it</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Istituto Italiano di Tecnologia</institution>
          ,
          <addr-line>Genova</addr-line>
          ,
          <country country="IT">Italy</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>University of Naples "Federico II"</institution>
          ,
          <addr-line>Napoli</addr-line>
          ,
          <country country="IT">Italy</country>
        </aff>
      </contrib-group>
      <abstract>
        <p>Socially Assistive Robots (SARs) are increasingly used in therapeutic settings to support cognitive, social, and motor development through personalized, interactive engagement. However, deploying SARs in real-world clinical environments requires software frameworks that are both adaptable to individual patient needs and accessible to non-technical users. This work presents a modular full-stack REST framework designed to operationalize robot-assisted therapy with the iCub humanoid robot. The system supports the rapid deployment of personalized therapeutic workflows, referred to as iCub Applications, by composing modular robot actions into reusable templates. These applications are executed through a Wizard-of-Oz (WoZ) paradigm, enabling a human operator to teleoperate the robot online and maintain adaptive, patient-centered control. An intuitive, web-based frontend allows clinicians and researchers to configure and supervise sessions without requiring programming expertise. The interface dynamically generates its content based on the backend configuration exposed via a REST API, allowing multiple applications to be supported without modifying the frontend code, since all application logic and structure are defined on the backend. By integrating flexibility, modularity, and human-in-the-loop interaction, the framework bridges the gap between low-level robotic control and clinical usability, contributing to the advancement of socially-aware, human-centered AI in assistive and rehabilitation contexts.</p>
      </abstract>
      <kwd-group>
        <kwd>human-robot interaction</kwd>
        <kwd>software architecture</kwd>
        <kwd>socially assistive robots</kwd>
        <kwd>neurodevelopmental disorders</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>
        Integrating robotics into therapeutic care has opened new opportunities to deliver engaging, adaptive,
and personalized support for individuals with developmental, cognitive, and motor challenges. In
particular, Socially Assistive Robots (SARs) have shown promise in domains such as autism therapy,
motor rehabilitation, and cognitive training [
        <xref ref-type="bibr" rid="ref1 ref2">1, 2</xref>
        ]. Through their physical embodiment and ability
to communicate using multiple channels—speech, gesture, facial expressions—SARs can foster social
interaction, increase motivation, and enhance patient involvement.
      </p>
      <p>
        However, the effectiveness of SAR-based interventions depends heavily on the underlying software
architecture. Simply deploying robot hardware is not sufficient; what is needed is a system that
enables real-time control and adaptation of robot behavior according to individual user responses. This
highlights the central role of Human–Robot Interaction (HRI) design, particularly in clinical settings
that require personalization, transparency, and therapist supervision [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ].
      </p>
      <p>
        In this context, the iCub humanoid robot [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ] stands out as a valuable platform. Its anthropomorphic
form, child-like dimensions, and comprehensive sensorimotor capabilities make it particularly
well-suited for working with children and vulnerable populations. The software infrastructure behind iCub
is built on YARP (Yet Another Robot Platform) [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ], a modular middleware designed to enable distributed,
real-time communication among robot components. This architecture supports integration of features
such as speech synthesis, gaze control, and gesture coordination.
      </p>
      <p>Despite this potential, developing therapeutic sessions with iCub is a challenge for non-technical
users. Therapists and clinical staff typically require programming skills to define and execute
interaction scenarios, which limits the scalability and day-to-day usability of the platform in healthcare
environments. This creates a gap between what the robot is capable of and what clinicians are able to
deploy independently.</p>
      <p>
        To address this limitation, we propose a new software framework that allows the configuration and
execution of structured iCub Applications: customizable therapeutic workflows built from modular robot
actions. These applications are modeled using Finite State Machines (FSMs) and follow a Wizard-of-Oz
(WoZ) execution model [
        <xref ref-type="bibr" rid="ref6 ref7">6, 7</xref>
        ], where a human operator guides the robot’s behavior during the session.
This hybrid approach maintains session structure while enabling online adaptation to patient behavior.
      </p>
      <p>The system is built on a full-stack REST architecture that cleanly separates the backend—responsible
for protocol logic, application definition, and robot coordination—from the frontend, which provides
an intuitive web-based interface for therapists and researchers. Notably, the frontend dynamically
generates its content based on definitions received via the REST API, allowing new applications to be
deployed or reconfigured entirely from the backend without requiring changes to the frontend code.
This design supports accessibility, scalability, and adaptability, making the platform well-suited for both
clinical and research settings.</p>
    </sec>
    <sec id="sec-2">
      <title>2. Adaptive Therapeutic Sessions with the iCub Robot Platform</title>
      <p>
        The iCub humanoid robot is a flexible research platform for Human–Robot Interaction (HRI), cognitive
robotics, and therapeutic settings [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ]. Its child-sized body, 53 degrees of freedom, and rich sensor
suite—including stereo vision, tactile skin, microphones, and force sensors—make it especially suitable
for engaging with children and individuals with neurodevelopmental disorders. These capabilities
enable natural, multimodal interactions through gaze, speech, gestures, and touch.
      </p>
      <p>
        iCub’s software is built on the modular YARP middleware [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ], which separates low-level hardware
control from high-level behaviors. This design exposes core functionalities—like speech output, gaze
shifts, or motor control—as independent services that can be flexibly combined. Therapists can therefore
activate only the robot components needed for each session, making it easier to adapt interactions to
specific therapeutic goals.
      </p>
      <p>Within this framework, we define an iCub Application as a structured sequence of robot actions
designed to support a specific therapeutic activity. Each application orchestrates the robot’s multimodal
capabilities into a coherent interaction flow that aligns with clinical objectives, such as promoting joint
attention, encouraging verbal interaction, or guiding motor tasks.</p>
      <p>
        Applications are executed using a Wizard-of-Oz (WoZ) approach [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ], where the robot appears
autonomous to the patient but is in fact remotely controlled by a human operator. This model offers
both flexibility and safety—particularly important in pediatric or cognitively sensitive contexts—while
maintaining the illusion of autonomy that is often crucial for engagement. In a typical session setup
(Figure 1), the therapist and patient are seated across from iCub, with the therapist operating the
session from a separate interface, out of the patient’s view. The therapist selects an iCub Application
and configures its session parameters according to the patient’s profile. Throughout the session, the
therapist assesses the patient’s responses and selects the appropriate robot actions, ensuring that the
experience is personalized, engaging, and therapeutically relevant.
      </p>
      <p>This framework supports a human-in-the-loop interaction model in which robot behavior is not
pre-scripted but flexibly directed by the therapist’s clinical judgment. It offers a balance between
repeatability—important for structured therapy—and real-time adaptability—essential for personalization.
As such, iCub serves not just as a programmable robot, but as a responsive therapeutic partner capable
of supporting individualized, interactive interventions.</p>
    </sec>
    <sec id="sec-3">
      <title>3. Proposed Solution</title>
      <p>To support the execution of structured yet flexible robot-assisted therapy sessions using the iCub
humanoid platform, we introduce a modular, full-stack REST software framework. The proposed
framework is designed to abstract the technical complexity of robotic control and provide an accessible
interface for clinicians and technical operators to define, configure, and supervise therapeutic protocols
without requiring robotics or programming expertise.</p>
      <p>At the core of our framework is the support for executing iCub Applications, reusable, high-level
therapy programs defined as structured sequences of robot actions. Each action corresponds to a
capability of the robot, such as gaze control, speech synthesis, gesture execution, or facial expression
rendering. These applications are executed under the WoZ paradigm, which places a human operator
in the control loop, enabling real-time guidance of robot behavior based on the patient’s engagement
and responses. This approach supports patient-specific adaptation while maintaining the safety and
repeatability required in clinical settings.</p>
      <p>To support the operator during the execution of each application, the framework incorporates three
key design choices. First, each iCub Application is modeled as a Finite State Machine (FSM), where
individual states correspond to specific robot actions and transitions represent the flow of interaction.
According to the WoZ paradigm, these transitions are triggered by the therapist in real time, based on
their observation of the patient’s behavior. This approach ensures that clinicians retain full control over
the session while allowing for dynamic, personalized adjustments. An example of an FSM representation
of an iCub Application is illustrated in Figure 2.</p>
      <p>[Figure 2: FSM representation of an example iCub Application, with action states ShakeHand, AskQuestion, HappyEmotion, SadEmotion, GrabCube, RotateCube, DropOffCube, and SayGoodbye.]</p>
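      <p>The FSM structure described above can be sketched in a few lines of Python. This is a minimal, hypothetical model, not the actual Pyicub API: states are named after robot actions (a subset of those in Figure 2), and transitions are fired by the operator from the dashboard rather than by the robot itself.</p>

```python
# Minimal sketch of an iCub Application as a therapist-driven FSM.
# State and trigger names are illustrative, not part of the framework.

class WoZStateMachine:
    def __init__(self, initial):
        self.state = initial
        self.transitions = {}  # (state, trigger) -> next state

    def add_transition(self, src, trigger, dst):
        self.transitions[(src, trigger)] = dst

    def fire(self, trigger):
        """Advance the FSM; in the WoZ setup the trigger comes from the
        operator's dashboard, based on the observed patient behavior."""
        key = (self.state, trigger)
        if key not in self.transitions:
            raise ValueError(f"no transition {trigger!r} from state {self.state!r}")
        self.state = self.transitions[key]
        return self.state

# Compose a greeting application from modular robot actions.
app = WoZStateMachine("ShakeHand")
app.add_transition("ShakeHand", "next", "AskQuestion")
app.add_transition("AskQuestion", "correct_answer", "HappyEmotion")
app.add_transition("AskQuestion", "wrong_answer", "SadEmotion")
app.add_transition("HappyEmotion", "next", "SayGoodbye")
app.add_transition("SadEmotion", "next", "SayGoodbye")
```

      <p>Each state would map to a robot action exposed by the backend, so firing a transition both updates the FSM visualizer and triggers the corresponding behavior.</p>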
      <p>Second, to facilitate therapist interaction with the application during execution, the system provides
a web-based dashboard based on the selected iCub Application. This dashboard is composed of modular
graphical widgets, as shown in Figure 3. The main widget consists of an FSM visualizer that can be
used by the operator to track and control the execution of the application. Each of the other dashboard
widgets corresponds to a specific robot function or protocol control, such as Text-to-Speech control, emotion
display configuration, and video stream monitoring.</p>
      <p>Third, the frontend dashboard is dynamically generated at runtime based on definitions retrieved from
the backend via the REST API. This design allows the addition or customization of therapy applications
solely through backend configuration, without requiring any changes to the frontend code. A key
feature of this approach is the ability to define configurable session arguments—parameters that can
be selected by the user at the start of the session (e.g., task difficulty, target emotion, or number of
repetitions). The structure and allowed values of these arguments are fully defined in the backend and
automatically reflected in the user interface at runtime. Moreover, the layout of the dashboard widgets
is highly customizable and can be modified, imported, or exported as a JSON configuration file. This
enables flexible adaptation of the interface to specific clinical needs or user preferences, supporting a
variety of use cases and session types. Finally, custom Python methods defined in the main application
backend are automatically exposed as REST endpoints and can be selectively included in the dashboard
on request. This allows developers to quickly integrate new robot capabilities or protocol logic without
additional frontend development, further enhancing the modularity and extensibility of the framework.</p>
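      <p>The automatic exposure of backend methods can be pictured with a small introspection-based sketch. The class and route names below are assumptions for illustration, and a real deployment would bind the routes to an HTTP server; the point is that adding a public Python method is all that is needed to obtain a new endpoint.</p>

```python
# Sketch of automatic REST-style registration of application methods.
# Names are hypothetical; Pyicub's actual mechanism may differ.
import inspect

class GreetingApp:
    def shake_hand(self):
        return "iCub performs the hand-shake gesture"

    def say_goodbye(self):
        return "iCub waves and says goodbye"

def expose_as_rest(app_instance):
    """Map every public method of the application to a REST-style route,
    mimicking the backend's automatic endpoint registration."""
    routes = {}
    for name, method in inspect.getmembers(app_instance, inspect.ismethod):
        if not name.startswith("_"):
            routes[f"/{type(app_instance).__name__}/{name}"] = method
    return routes

routes = expose_as_rest(GreetingApp())
# routes["/GreetingApp/shake_hand"]() dispatches the corresponding action
```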
      <p>At the start of a rehabilitation session, the operator can select one of the available iCub Applications and,
therefore, the frontend automatically configures the dashboard to reflect the structure and parameters of
the selected FSM. Therapists can further personalize the interface by rearranging, resizing, or disabling
widgets based on the therapeutic goals and interaction context. Through this configurable dashboard,
the therapist remains central to the interaction, manually triggering state transitions in response to
the patient’s real-time behavior. This human-in-the-loop approach ensures that sessions are not only
responsive and personalized, but also maintain the consistency and structure necessary for clinical
reliability and experimental reproducibility.</p>
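      <p>The backend-defined session arguments mentioned above can be pictured as a small schema that the frontend renders into form controls at session start. The parameter names and validation logic below are illustrative assumptions, not the framework's actual schema format.</p>

```python
# Hypothetical backend schema for configurable session arguments.
SESSION_ARGS = {
    "difficulty":     {"type": "choice", "values": ["easy", "medium", "hard"], "default": "easy"},
    "target_emotion": {"type": "choice", "values": ["happy", "sad", "surprised"], "default": "happy"},
    "repetitions":    {"type": "int", "min": 1, "max": 10, "default": 3},
}

def validate_session_args(selection):
    """Merge a therapist's selection with defaults and validate it
    against the backend schema before the session starts."""
    session = {}
    for name, spec in SESSION_ARGS.items():
        value = selection.get(name, spec["default"])
        if spec["type"] == "choice" and value not in spec["values"]:
            raise ValueError(f"{name}: {value!r} not among {spec['values']}")
        if spec["type"] == "int" and value not in range(spec["min"], spec["max"] + 1):
            raise ValueError(f"{name}: {value} out of range")
        session[name] = value
    return session
```

      <p>Because the schema lives entirely on the backend, changing an allowed value or adding a parameter is immediately reflected in the generated interface.</p>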
      <p>The overall software architecture adopts a client–server model, as shown in Figure 4. The backend
component, named Pyicub, handles robot coordination, FSM-driven session execution, and service
orchestration. It exposes REST APIs to the frontend, enabling seamless, asynchronous communication
between user-facing components and the robot’s control logic.</p>
      <p>[Figure 4: Client–server architecture. The frontend (desktop and mobile) comprises an Angular HTTP Client and a Dashboard Configuration Manager, communicating over HTTP/REST with the PyiCub backend, whose components (Services Registry, Robots Registry, iCub Application, iCub Controller) drive the iCub robot via YARP.]</p>
      <p>
        The backend is structured into four main modules that handle robot communication and therapeutic
session execution. The iCub Controller abstracts low-level interactions with the robot through the
YARP middleware [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ], a soft real-time, distributed communication framework based on named Ports. It
exposes robot functionalities (e.g., gaze control, arm movement, speech output) as RESTful services. This
abstraction allows the backend to trigger complex robot behaviors through simple HTTP calls, enabling
hardware-independent control across deployments. The iCub Application module manages the
logic of each therapeutic session, orchestrating robot actions by invoking the REST endpoints according
to the protocol’s flow. By composing sessions from modular, atomic actions, this component facilitates
both reuse and rapid prototyping of new iCub Applications, aligning with the system’s goal of flexibility
and adaptability. The RobotsRegistry automatically detects available iCub units on the network at
runtime, while the ServiceRegistry maintains a dynamic mapping between robot instances and
their supported iCub Applications. Together, these components ensure correct routing of requests and
enable scalable, distributed execution across multiple robots without requiring manual reconfiguration.
      </p>
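      <p>The interplay of the two registries can be sketched as a simple routing table: robots discovered on the network are registered with a base URL, applications are associated with the robots that support them, and incoming requests are routed accordingly. Class, robot, and URL names here are illustrative assumptions, not the actual Pyicub implementation.</p>

```python
# Minimal sketch of the RobotsRegistry / ServiceRegistry routing idea.
class Registry:
    def __init__(self):
        self.robots = {}    # robot name -> base URL (RobotsRegistry)
        self.services = {}  # robot name -> supported applications (ServiceRegistry)

    def register_robot(self, name, base_url):
        self.robots[name] = base_url
        self.services.setdefault(name, set())

    def register_service(self, robot, application):
        self.services[robot].add(application)

    def route(self, application):
        """Return the base URL of a robot unit supporting the application."""
        for robot, apps in self.services.items():
            if application in apps:
                return self.robots[robot]
        raise LookupError(f"no registered robot offers {application!r}")

registry = Registry()
registry.register_robot("icub-01", "http://icub-01.local:9000")
registry.register_service("icub-01", "GreetingApp")
```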
      <p>The frontend is a responsive Single Page Application (SPA) built with Angular. At its core, the HTTP
Client handles all asynchronous REST communications with the backend and user-triggered events,
keeping the interface synchronized with system logic. The Dashboard Configuration Manager
controls the layout and widget setup, allowing the interface to adapt flexibly to different therapeutic
sessions while ensuring an intuitive user experience.</p>
      <p>In summary, the proposed system delivers a cohesive full-stack REST architecture that bridges the gap
between iCub’s technical capabilities and the real-world needs of clinicians and researchers. By
modularizing robot functionality, supporting dynamic session configuration, and providing a user-friendly
frontend, the system enables accessible, repeatable, and adaptive robot-assisted therapy—advancing the
practical integration of socially assistive robotics in clinical practice.</p>
    </sec>
    <sec id="sec-4">
      <title>4. Conclusion and Future Work</title>
      <p>This work introduced a modular, full-stack REST framework that supports the configuration and
execution of structured, adaptable therapeutic sessions using the iCub robot. By abstracting low-level
control through a unified interface and adopting the Wizard-of-Oz paradigm, the system allows
therapists to configure and execute iCub Applications—reusable therapeutic protocols composed of modular
robot actions—directly via a web-based frontend. This approach enables real-time personalization
based on patient needs, without requiring any programming skills, while preserving the structure
of clinical workflows. The backend, built on the YARP middleware, ensures scalability, reusability,
and hardware abstraction via RESTful APIs. Meanwhile, the frontend offers a dynamic, customizable
dashboard that supports human-in-the-loop interaction. Overall, the proposed system advances the
integration of socially assistive robotics by aligning technical flexibility with the practical needs of
adaptive, human-centered care.</p>
      <p>The proposed solution was preliminarily validated through a case study involving technical operators
and therapists who used the web application to conduct therapy sessions with actual patients and a
single robot. Their feedback provided valuable insights into the usability and clinical applicability of
the system, confirming its potential to support personalized, structured robot-assisted interventions in
everyday therapeutic practice.</p>
      <p>As future work, we plan to extend the experimental validation by involving a broader range of
clinicians and therapists to evaluate the system’s effectiveness in real-world therapeutic workflows.
Furthermore, we intend to develop a graphical user interface that will enable users to define new
iCub Applications directly from the browser. This tool will guide the composition of new FSM-based
protocols by selecting and chaining available iCub actions—exposed as REST services through the
YARP middleware—thus making application creation more accessible to non-technical users. Finally,
we are exploring the integration of semi-autonomous behavior layers to complement the
Wizard-of-Oz approach, enabling a more balanced interaction between clinician control and robot initiative in
long-term interventions.</p>
    </sec>
    <sec id="sec-5">
      <title>5. Declaration on Generative AI</title>
      <p>The author(s) has not employed any Generative AI tool during the preparation of this work.</p>
      <p>This work was supported by the Italian Ministry of Research, under the complementary actions to the
NRRP “Fit4MedRob - Fit for Medical Robotics” Grant (# PNC0000007).</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>A.</given-names>
            <surname>Triantafyllidis</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Alexiadis</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Votis</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Tzovaras</surname>
          </string-name>
          ,
          <article-title>Social robot interventions for child healthcare: A systematic review of the literature</article-title>
          ,
          <source>Computer Methods and Programs in Biomedicine Update</source>
          <volume>3</volume>
          (
          <year>2023</year>
          )
          <fpage>100108</fpage>
          . doi:10.1016/j.cmpbup.2023.100108
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>M.</given-names>
            <surname>Kyrarini</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Lygerakis</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Rajavenkatanarayanan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Sevastopoulos</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H. R.</given-names>
            <surname>Nambiappan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K. K.</given-names>
            <surname>Chaitanya</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A. R.</given-names>
            <surname>Babu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Mathew</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Makedon</surname>
          </string-name>
          ,
          <article-title>A survey of robots in healthcare</article-title>
          ,
          <source>Technologies</source>
          <volume>9</volume>
          (
          <year>2021</year>
          ). doi:10.3390/technologies9010008.
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>J. D.</given-names>
            <surname>Kumar</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Bansal</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Kaushik</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Sethi</surname>
          </string-name>
          ,
          <article-title>Human-robot interaction: Designing effective interfaces for collaborative tasks</article-title>
          ,
          in:
          <source>2024 1st International Conference on Advances in Computing, Communication and Networking (ICAC2N)</source>
          ,
          <year>2024</year>
          , pp.
          <fpage>356</fpage>
          -
          <lpage>360</lpage>
          . doi:10.1109/ICAC2N63387.2024.10895441
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>G.</given-names>
            <surname>Metta</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Natale</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Nori</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G.</given-names>
            <surname>Sandini</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Vernon</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Fadiga</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>von Hofsten</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Rosander</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Lopes</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Santos-Victor</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Bernardino</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Montesano</surname>
          </string-name>
          ,
          <article-title>The iCub humanoid robot: An open-systems platform for research in cognitive development</article-title>
          ,
          <source>Neural Networks</source>
          <volume>23</volume>
          (
          <year>2010</year>
          )
          <fpage>1125</fpage>
          -
          <lpage>1134</lpage>
          . doi:10.1016/j.neunet.2010.08.010. Special issue: Social Cognition: From Babies to Robots.
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>G.</given-names>
            <surname>Metta</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Fitzpatrick</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Natale</surname>
          </string-name>
          ,
          <article-title>YARP: Yet another robot platform</article-title>
          ,
          <source>International Journal of Advanced Robotic Systems</source>
          <volume>3</volume>
          (
          <year>2006</year>
          )
          <fpage>8</fpage>
          . doi:10.5772/5761.
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>J. F.</given-names>
            <surname>Kelley</surname>
          </string-name>
          ,
          <article-title>An iterative design methodology for user-friendly natural language office information applications</article-title>
          ,
          <source>ACM Trans. Inf. Syst.</source>
          <volume>2</volume>
          (
          <year>1984</year>
          )
          <fpage>26</fpage>
          -
          <lpage>41</lpage>
          . doi:10.1145/357417.357420.
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>A.</given-names>
            <surname>Steinfeld</surname>
          </string-name>
          ,
          <string-name>
            <given-names>O. C.</given-names>
            <surname>Jenkins</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            <surname>Scassellati</surname>
          </string-name>
          ,
          <article-title>The oz of wizard: simulating the human for interaction research</article-title>
          ,
          in:
          <source>Proceedings of the 4th ACM/IEEE International Conference on Human Robot Interaction</source>
          , HRI '09, Association for Computing Machinery, New York, NY, USA,
          <year>2009</year>
          , pp.
          <fpage>101</fpage>
          -
          <lpage>108</lpage>
          . doi:10.1145/1514095.1514115.
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>L. D.</given-names>
            <surname>Riek</surname>
          </string-name>
          ,
          <article-title>Wizard of oz studies in hri: a systematic review and new reporting guidelines</article-title>
          ,
          <source>J. Hum.-Robot Interact.</source>
          <volume>1</volume>
          (
          <year>2012</year>
          )
          <fpage>119</fpage>
          -
          <lpage>136</lpage>
          . doi:10.5898/JHRI.1.1.Riek.
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>