<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Automatic Measurements of a Leisure Activity for People with Profound Disabilities</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Robby van Delden</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Dennis Reidsma</string-name>
          <email>d.reidsma@utwente.nl</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Human Media Interaction, University of Twente</institution>
          ,
          <addr-line>Enschede, the Netherlands</addr-line>
        </aff>
      </contrib-group>
      <fpage>48</fpage>
      <lpage>59</lpage>
      <abstract>
        <p>We report on challenges we encountered when using automatic measurements for a longer-term exploratory study (8-10 sessions, 9 participants) with people with Profound Intellectual and Multiple Disabilities (PIMD). In the overall study, which we will publish elsewhere, one element we investigated was whether we were able to persuade users of this target group to move more by providing interaction with an interactive ball. This paper focuses on the challenges we had regarding the use of our method for automatic measurement of movement based on camera recordings during this study. With this paper we would like to remind researchers not to rely blindly on the outcome of automatic measurements and instead to analyze measures in depth, which can become difficult when using extensive sets of data.</p>
      </abstract>
      <kwd-group>
        <kwd>Behavior Change Support Systems</kwd>
        <kwd>profound disabilities</kwd>
        <kwd>PIMD</kwd>
        <kwd>interactive ball</kwd>
        <kwd>automatic measurements</kwd>
        <kwd>leisure activity</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1 Introduction</title>
      <p>
        Not all people have the combination of cognitive and/or physical capabilities to be able
to enjoy modern sources of leisure such as computer games, watching TV, or interactive
theme-park rides. Especially for people with Profound Intellectual and Multiple
Disabilities (PIMD) there is a limited amount of suitable entertainment [
        <xref ref-type="bibr" rid="ref1 ref16 ref17">1, 15, 16</xref>
        ]. While there is a rise of persuasive technology incorporating gamification elements and other
entertaining ways of feedback and persuasion, we see very little done for people with
special needs, as illustrated by the number of results in March 2018 when searching
the past Persuasive Technology proceedings for ‘special needs’ (1 hit, a paper on people
with autism) compared to ‘gamification’ (16 hits).
      </p>
      <p>
        (Footnote: the measurements discussed in this paper contributed to the results of two comparative studies
which are currently under review in two other (journal) manuscripts, tentatively titled
‘Evaluating a newly developed interactive activity for people with profound intellectual and multiple
disabilities’ and ‘Do we get your attention?! Looking into alertness, movement, and affective
behaviour of people with PIMD upon introduction of a playful interactive product’. The
concept for the ball has been described by van Delden et al. in 2014 [
        <xref ref-type="bibr" rid="ref16">15</xref>
        ], and parts of the current
paper are also presented in a PhD thesis [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ]. The current paper presents additional rationale
and details of the ball and insights on applying automatic measurements that are beyond the
(non-technological) scope of the two manuscripts under review. Please cite the two other
papers for everything that is reported in those papers, and especially for the target group description,
interactive concept, goals, and study design (and, obviously, the study results).)
      </p>
    </sec>
    <sec id="sec-2">
      <title>1.1 A PIMD Target Group</title>
      <p>
        People with PIMD are dependent on their caregivers, have low intellectual capabilities
(immeasurable), and have disabilities that reinforce each other and can not be
compensated for [
        <xref ref-type="bibr" rid="ref12">11</xref>
        ]. Unlike healthy volunteers used in many other projects on persuasive
technology, people with PIMD are unable to verbally communicate their preferences, their
suggestions for improvement, or self-reports of their experience. Instead, they mainly
communicate with the caregivers through body movement [
        <xref ref-type="bibr" rid="ref5 ref6 ref7">5, 6</xref>
        ]. People with PIMD
form a very heterogeneous user group due to the variety and combinations of
cognitive, sensory and physical disabilities. Even their everyday caregivers often
need to discuss among themselves and take their time to establish interpretations of these
users' actions and preferences. This complicates finding appropriate measurements.
      </p>
    </sec>
    <sec id="sec-3">
      <title>1.2 Interaction with an Interactive Ball to Promote Movement</title>
      <p>
        In our project, which was instigated by the care organization Dichterbij, we emphasized
the opportunities offered by new interactive technologies. For example, the Kinect depth
camera or the Arduino prototyping platform can help develop suitable activating leisure
activities that persuade users from this target group in a pleasurable way to exhibit
certain wanted behavior. Larsen and Hedvall showed that interactive technology makes it possible to
create new pleasurable experiences tailored to people with PIMD [
        <xref ref-type="bibr" rid="ref10">9</xref>
        ]. We have built
upon this work and combined it with insights from our own work. We created an
interactive ball responding to sounds and movement of a participant with PIMD. For the
interaction, a facilitator watched the participant and remote-controlled the ball
according to a tailored protocol; this interaction between ball and user can be seen in Figure 1.
Several possible benefits of providing such an interactive leisure activity
were taken into account; in the context of the current paper we only go into promoting
movement.
      </p>
    </sec>
    <sec id="sec-4">
      <title>1.3 Focus and Goal of the Paper</title>
      <p>
        One approach which seemed promising was to make use of automatic measurements,
both for measuring study outcomes as well as for triggering the interaction patterns,
for instance applying automatic measurements to indicate preferences by measuring
movements that are almost invisible to the eye [
        <xref ref-type="bibr" rid="ref9">8</xref>
        ]. The outcome and findings of our
final study will be reported elsewhere. In this paper we focus on sharing our insights
regarding the simple automatic measurement method we used for measuring movement,
which we called Simplified Motion Energy Analysis.
      </p>
      <p>With this paper, we want to achieve two things: 1) we want to inspire our research
community to also make people with special needs benefit from persuasive technology
and share some of our process and choices to this end, and, more importantly, 2) remind
the reader that unforeseen side-effects and incorrect values in automatic measurements
challenge us in using extensive data for this kind of system. This holds for evaluations,
especially for the analysis shown in this paper, but in truly automated interventions
also for the performance of the system during the intervention itself.</p>
      <p>The remainder of this paper is structured as follows. We will continue with related
work on the user group and (related) interactive entertainment for people with special
needs. In the third section we will briefly describe our prototype and underlying design
principles. In the fourth section we will then explain our automated measurement of
movement and our experiences with it. We will conclude by discussing some of our
vision and the challenges that become apparent when working with this target group
while attempting to use automatic measurements over longer periods of time.</p>
      <sec id="sec-4-1">
        <title>2 Leisure Activities for People with PIMD</title>
        <p>
          To our knowledge there are only a few examples available of research that develops
truly interactive technology for people with PIMD [
          <xref ref-type="bibr" rid="ref1 ref10 ref16 ref4">1, 4, 9, 15</xref>
          ]. We see truly interactive
systems as presenting a developing dialogue of interaction between man and machine.
This goes beyond ‘switches’, pushing a button in order to get a repeating constant
response [
          <xref ref-type="bibr" rid="ref1 ref16">1, 15</xref>
          ]. Nonetheless, explorations into the latter can also provide additional
entertainment opportunities. For instance, a modified switch controlled ride-on toy car for
young children with severe physical disabilities could provide an additional activity in
which postural control might be trained in a motivating and fun way [
          <xref ref-type="bibr" rid="ref8">7</xref>
          ]. However, there
is a limited amount of entertainment in general and especially a lack of non-sedentary
activities for people with PIMD [
          <xref ref-type="bibr" rid="ref18">17</xref>
          ]. Others have also indicated a lack of interactive
entertainment for people who are severely disabled [
          <xref ref-type="bibr" rid="ref1 ref15 ref16">1, 14, 15</xref>
          ].
        </p>
        <p>
          The related work that can be found seems to point towards personalized and tailored
interactions, which also fits the Persuasive System Design model [
          <xref ref-type="bibr" rid="ref13">12</xref>
          ]. For instance,
Thaller et al. developed an interactive Radio Frequency (RF) controlled toy with a
mouth operated joystick [
          <xref ref-type="bibr" rid="ref15">14</xref>
          ]. To allow their ‘4D-joystick’ to be used by people with
some tremors, they could set personalized ‘dead zones’ to filter the input, and they planned to
allow for rearranging mappings between input and output on an individual level [
          <xref ref-type="bibr" rid="ref15">14</xref>
          ].
Others have also indicated the importance of these personalized and tailored
interactions for this heterogeneous user group of people with PIMD [
          <xref ref-type="bibr" rid="ref11 ref3">3, 10</xref>
          ]. Providing such
appropriate (personalized) sensory stimuli to people with PIMD and interpreting their
responses has been reported to be fairly difficult. Analyses might therefore benefit from
automatic measurements, including electrodermal responses, motion energy analysis
and heart-rate [
          <xref ref-type="bibr" rid="ref11 ref9">8, 10</xref>
          ].
        </p>
      </sec>
      <sec id="sec-4-2">
        <title>3 An Interactive Ball</title>
        <p>
          In our design efforts we built on our earlier work and that of Caltenco et al., also
using interactive, body-controlled, physically present objects [
          <xref ref-type="bibr" rid="ref1 ref16">1, 15</xref>
          ]. With our object, the
interactive ball, we tried to create an enjoyable experience that motivates the targeted
users to move. We believe that using truly interactive systems for this user group can
generate a pleasurable and activating leisure activity.
        </p>
        <p>We developed an interactive ball that can be remotely controlled based on gross body
movements and vocalizations, see Figure 1. During our development there were two
different versions of this interactive system. Version 1 was controlled fully
automatically on the basis of recognized body movement using a Kinect sensor. We only used this
version with students and at the start of a pilot with 5 participants from the target group.
In version 2 the ball was remotely controlled by a facilitator; this was the ‘final’ version
that we used for the longer-term (8-10 sessions) study.
The ball moves based on changing the center of gravity by rotating two weighted arms
connected to a servo inside the ball. This allows for a gentle movement, unlike the more
direct and quick movements of the commercially available interactive ball Sphero. With
a WeFly wifi-hotspot, simple string based commands were sent to the ball. We
programmed a simple Graphical User Interface (GUI) in C++ (using the Qt framework)
that allowed the facilitator to move the ball with the keyboard cursors and to use some
fields to set the speed. Key inputs could be used to play 17 different sounds. Most of
these sounds were made using free virtual (synthesizer) instruments, 6 were
recordings from an online audio database (mainly animal sounds and bells). The sounds were
played in front of the user over standard PC speakers. We painted the ball in highly
contrasting blue and yellow colors. After some sessions we also added a rattle, so the
ball made more noise. This could make it easier to follow the ball, which is especially
important for people with more limited visual capabilities.</p>
        <p>(Footnote: currently we also have a simple tablet application to control the ball's movement, change the LEDs, and play sounds from within the ball.)</p>
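        <p>The control path described above, where a facilitator's key presses are translated into simple string commands for the ball, can be sketched as follows. The command strings and key bindings are our illustrative assumptions, not the actual wire format of the C++/Qt GUI.</p>
        <preformat>
```python
# Sketch of the facilitator-side control mapping: cursor keys move the
# ball and 'soundN' key bindings trigger one of the 17 sounds. The
# command strings are illustrative assumptions; the actual GUI was
# written in C++ (Qt) and its wire format is not documented here.

MOVE_COMMANDS = {
    "left": "MOVE L",
    "right": "MOVE R",
    "forward": "MOVE F",
    "backward": "MOVE B",
}

def build_command(key, speed=50):
    """Translate a key press plus the GUI's speed field into a string
    command, as would be sent to the ball over the wifi link."""
    if key in MOVE_COMMANDS:
        return "{} {}".format(MOVE_COMMANDS[key], speed)
    if key.startswith("sound") and key[5:].isdigit():
        n = int(key[5:])
        if n in range(1, 18):  # 17 different sounds
            return "PLAY {}".format(n)
    raise ValueError("unmapped key: " + key)
```
        </preformat>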
        <p>As stated above our first concept was based on an automated detection of the user’s
movements with the Kinect depth camera and software. At the same time we used
a webcam and a simple background subtraction method to track the position of the
ball. We created an interaction in which the ball’s position (left/right on a predefined
path) was dependent on the position of the head. To test the technical feasibility and
to improve this interaction we tested this with 40 students. During these tests we made
some small technical improvements (e.g. in the implementation of sending commands)
but the system worked reliably enough at the end of these tests.</p>
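        <p>The simple background subtraction used to track the ball's position can be sketched as follows. This pure-Python version over nested lists of gray values stands in for the actual webcam implementation, and the threshold value is an illustrative assumption.</p>
        <preformat>
```python
def track_ball(background, frame, threshold=30):
    """Track the ball with simple background subtraction: return the
    centroid (row, col) of pixels differing from the background by more
    than `threshold`, or None if no pixel changed. Frames are grayscale
    images given as nested lists of ints in 0-255."""
    row_sum, col_sum, count = 0, 0, 0
    for r, (bg_row, fr_row) in enumerate(zip(background, frame)):
        for c, (bg, fr) in enumerate(zip(bg_row, fr_row)):
            if abs(fr - bg) > threshold:
                row_sum += r
                col_sum += c
                count += 1
    if count == 0:
        return None
    return (row_sum / count, col_sum / count)
```
        </preformat>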
        <p>However, during the first tests with people from the actual target group we realized
that one fully automated interactive system was not flexible enough to deal with the
heterogeneity of the user group. For instance, we planned to use the head positions as
the main method of interaction, but already the first PIMD-user did not move her head
left/right often enough, so this data could not be used as the main input. Besides lacking
adaptability to the user's abilities and interaction methods, the first concept was also not
tailored enough to the preferences of the user. Solving this technically could become quite
hard. Therefore, we switched to a Wizard of Oz approach as it seemed to be a good first
step to see whether the interactive ball in whatever way could be beneficial for people
with PIMD.</p>
      </sec>
    </sec>
    <sec id="sec-5">
      <title>3.1 Interaction and Design Rationale</title>
      <p>Before designing the interaction we spoke with several caregivers and a therapist
about the project and conducted a literature review. We also watched play sessions, visited
daily living environments, and were informed about a variety of activities for this user
group. We believe this is essential in coming to a suitable interaction. However, no
overgeneralization should take place: things seen for one user cannot simply be transferred
to another. Therefore, we tailored the interaction patterns of the ball in an iterative and
individualized way. We continued to work closely together with caregivers, in order to
analyze the user’s behavior and experience and to help improve the interactive system,
during a first set of habituation sessions for each participant.</p>
      <p>We chose a ball as this is a shape often used in traditional playful
interactions; furthermore, we saw one person from the target group, with relatively high
capabilities, trying to push a big ball during an observational session. We think a big
physical object has benefits over showing objects on a screen, as we expect people with
visual disabilities to be more likely to be able to identify and follow this. As a start of the
interaction we preferably use the movements of the user that were likely to occur, such
as the rocking behavior of the upper body (see Figure 1), using their ‘vocalizations’ or
the fiddling of their hands. After this (unconscious) initiation of interaction, the user has
to learn over time the link between action and response, something we knew beforehand
not all users would be capable of. Furthermore, we were not even sure whether some users
would recognize the ball and its behavior. To increase chances for these kinds of
recognition, sessions were held several times. To further improve chances of success we also
tailored to the possible actions of the user, as well as the responses that needed to be
stimulating enough for the user to make a cause-effect link more plausible. We added
sound feedback as we knew that some people from the user group are more sensitive to
this than merely visual feedback.</p>
      <p>The exact interaction protocol depended on the user. More info on this protocol will be
published elsewhere; here we only briefly describe the general way of interaction. In
this protocol, actions were described like: user action: leans to our left; ball: move ball
to our left; sound: play sound type 1. For the implemented protocol for the longer-term
study we started from the premises of what could be expected from a user as well as
what we designed that the interactive ball could do: move, make sounds, and change
appearance using bright LEDs. This resulted in some links that might not be as intuitive
as they could have been. We made the ball respond to upper body movement and/or
focus of attention. The ball was not placed within reach of the user, to trigger them to
show alertness farther away than their close encounters, as well as to make the ball less
likely to bump against a person or to harm the participant in any other way. When the
user moved his upper body, the ball was remote controlled by a facilitator to start to
roll. The ball also made sounds when an attempt for interaction was made or if the ball
was kept in the user’s visual focus for some time. Furthermore, for some users the ball
moved from side to side when it had not been interacted with for some time. This was
done in order to (re)gain the attention of the user. Sounds were also played to further
grab the attention of the user, and some categories of sounds were removed or played
more often depending on whether the user appeared to (dis)like them.</p>
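      <p>The general shape of such a tailored protocol can be sketched as a small per-participant rule table. The action names and responses below are illustrative stand-ins only, not the actual protocol (which will be published elsewhere).</p>
      <preformat>
```python
# Per-participant rule table: observed user action -> (ball action, sound).
# All entries are illustrative; the real protocol was tailored per
# participant and executed by the facilitator, not by software.
PROTOCOL = {
    "leans_left":  ("move_left",  "sound_type_1"),
    "leans_right": ("move_right", "sound_type_2"),
    "vocalizes":   ("wiggle",     "sound_type_3"),
}

IDLE_RESPONSE = ("side_to_side", "attention_sound")  # after inactivity

def respond(action, idle_seconds=0, idle_limit=30):
    """Look up the facilitator's response to an observed user action;
    after `idle_limit` seconds without interaction, try to (re)gain
    the user's attention."""
    if action in PROTOCOL:
        return PROTOCOL[action]
    if idle_seconds >= idle_limit:
        return IDLE_RESPONSE
    return None  # no response: wait for the user
```
      </preformat>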
      <sec id="sec-5-1">
        <title>Automatic Measurement of Movement: Simplified Motion</title>
      </sec>
      <sec id="sec-5-2">
        <title>Energy Analysis (SMEA)</title>
        <p>For one of our outcome measures (the only one we discuss in this paper) we used
computer vision techniques to automatically detect movement responses. This allowed us to
make use of the benefits of automatic measurements without burdening the participants
too much. For this target group it is important to consider downsides of for instance
placing body worn sensors, as this might have detrimental effects for overly sensitive
people. In our study design we also took into account an extended time for getting
accustomed to a new situation.</p>
        <p>We carried out a pilot with five people from the target group, followed by the final study
with 9 participants. One of the researchers was present to control the ball, one other
researcher was taking notes of noticeable and relevant behavior and (for a set number of
sessions) conducting a structured interview, and one caregiver was present to help the
participant if needed (e.g. one participant had phlegm that resulted in severe coughing and
several times needed to be helped and calmed). We used three sessions to tailor the
interaction protocol to each participant, and then had up to two more additional sessions,
where these 3-5 sessions were used to get the user acquainted with the people present,
the ball, and the room. We then used the last 5 sessions to do our measurements. We
compared the movement during interaction with periods where the ball was not present;
exact details are discussed in the other papers. We compared on a person-to-person
basis and used the data only for descriptive statistics including whether they moved more,
less, or showed about the same amount of movement. Besides these automatic measures,
there were also other measures for the final study which are outside the scope of this
paper.
</p>
      </sec>
    </sec>
    <sec id="sec-6">
      <title>4.1 SMEA</title>
      <p>
        To have a measure for movement we used a fairly simple method. It is a slightly simpler
version of Motion Energy Analysis as used by Ramseyer &amp; Tschacher [
        <xref ref-type="bibr" rid="ref14">13</xref>
        ] and shows
resemblance to the motion history also applied for a similar participant by Iwabuchi et al.
[
        <xref ref-type="bibr" rid="ref9">8</xref>
        ]. In the Simplified Motion Energy Analysis we measured the number of pixels that
changed beyond a certain threshold. This seems especially fitting for
measurements in a planar frame with respect to the camera, which in turn fits an interaction of
leaning left and right. Our Simplified Motion Energy Analysis method consists of the
following steps, see Figure 2: 1) grab the video frames, 2) convert them to gray scale,
3) crop the image to only the area of interest (the number of cropped pixels depended
on the position of the wheelchair and the presence of people around), 4) copy the image and
delay it for the next step, 5) subtract the delayed frame from the current frame and
provide either a) the absolute pixel differences (as in the figure)
or b) a binary version with the number of pixels that changed (as explained and used for
our study), 6) sum all pixels (or differences), and 7) save them to a text file. We had to
analyze quite a lot of footage: 5 sessions, 9 participants, and 30 minutes of interaction,
which sums up to 1350 minutes, where each second consists of numerous frames that
had to be subtracted from each other. To do this in a timely manner, we used a computer
vision framework (Parlevision) from our research group written in C++ (using the Qt
framework), built on OpenCV and providing a GUI that allowed settings to be altered
at runtime. This allowed us to run the analysis faster than real time.
      </p>
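      <p>The core of these steps, cropping, thresholded subtraction of a delayed frame, and summing (the binary variant 5b), can be sketched as follows. This pure-Python version over nested lists of gray values stands in for the actual C++ Parlevision/OpenCV implementation; the threshold and crop values are illustrative assumptions.</p>
      <preformat>
```python
def smea_frame(prev, cur, crop, threshold=30):
    """Steps 3 and 5b for one frame pair: crop to the region of interest
    and count pixels whose gray value changed beyond the threshold."""
    top, bottom, left, right = crop
    changed = 0
    for r in range(top, bottom):
        for c in range(left, right):
            if abs(cur[r][c] - prev[r][c]) > threshold:
                changed += 1
    return changed

def smea_video(frames, crop, threshold=30):
    """Steps 4-7: pair each frame with a one-frame-delayed copy and
    collect the per-pair counts (which would then go to a text file)."""
    return [smea_frame(prev, cur, crop, threshold)
            for prev, cur in zip(frames, frames[1:])]
```
      </preformat>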
    </sec>
    <sec id="sec-7">
      <title>4.2 Analysis of the SMEA Results</title>
      <p>We investigated the resulting measurements thoroughly. Figure 3 shows post-filtered
results. We used Matlab R2012a to generate the graphs and filter out noise-polluted
results, which we will discuss next. The meaning of the results regarding the outcome
measure will be published elsewhere.</p>
      <p>Using similar graphs we saw numerous peaks of measured movement for participants
where we did not remember seeing such movement during the session. For instance, one
participant slept through the entire session but still we saw several peaks in the measured
movement. This made it abundantly clear that there was noise interfering with the
measurements. Therefore, we synced the measured frames with the video recordings
and saw several reasons for this occurring; we had not noticed these aspects in the
pilots.</p>
      <p>(Footnote: this measured movement is represented as the number of pixels that differed beyond a threshold, where we subtracted the average movement of the entire session of the day for representation purposes.)</p>
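      <p>The representation step from the footnote (subtracting the session average) and a way of flagging suspicious peaks for manual inspection can be sketched as follows. The peak criterion is our illustrative assumption; in the study, peaks were inspected manually against the video in Matlab.</p>
      <preformat>
```python
import statistics

def center_on_session_mean(values):
    """Subtract the session average from each frame's value, as done
    for representation purposes in the graphs."""
    mean = statistics.fmean(values)
    return [v - mean for v in values]

def flag_peaks(values, k=2.0):
    """Return indices of frames whose value exceeds mean + k * stdev;
    these are candidates for manual inspection against the video.
    The criterion itself is an illustrative assumption."""
    mean = statistics.fmean(values)
    sd = statistics.pstdev(values)
    return [i for i, v in enumerate(values) if v > mean + k * sd]
```
      </preformat>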
    </sec>
    <sec id="sec-8">
      <title>4.3 Unexpected Sources of Noise</title>
      <p>
        First, the camera apparently made use of auto-focus. At certain points the camera would
focus differently, which resulted in measured peaks: the focus changed and, often
a second or so later, changed back again; we hypothesize this could have
been related to the camera’s facial recognition. Second, in a similar fashion we noticed
something that seemed like a color filter change; combined with our own experience,
this seemed to occur when sunlight intensity changed (even though we equipped the
room with curtains), see Figure 4. Third, the camera sometimes shook, resulting in
short peaks; this occurred when the ball was moved in a certain rhythm back and forth.
Fourth, at certain moments the feet of people present entered the footage; we cropped the
images to prevent this as much as possible, see Figure 4. Fifth, although very limited,
the reflections of the moving ball could be seen in one of the wheelchairs. Sixth, we
knew from previous work that the clothing of the day can influence the number of
pixels changed (compare a checkered blouse with a black sweater). This mainly made
the results harder to compare between days, which was less of an issue with our study
setup. Seventh, the wheelchair also moved due to the heavy movements of some of the
participants, which might influence the results when looking only at the averages.
Although we did not anticipate any of this, to a certain extent our context thus violated
the static camera and light conditions prerequisites of MEA [
        <xref ref-type="bibr" rid="ref14">13</xref>
        ], which required us to
manually inspect the peaks and video, and filter out these sources of noise.</p>
      <p>(Footnote: in a normal population the sleeping participant mentioned above might be considered an outlier and could be withdrawn from analysis. For our main study, we kept this participant because such “outliers” can in fact occur very regularly with this target group.)</p>
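      <p>Several of these noise sources (auto-focus, color filter changes) affect the whole frame at once, whereas genuine participant movement is local. One simple heuristic, our suggestion rather than the method used in the study, is to flag frames in which an implausibly large fraction of the cropped region changed:</p>
      <preformat>
```python
def is_global_change(changed_pixels, total_pixels, max_fraction=0.5):
    """Heuristic: if more than `max_fraction` of the cropped region
    changed at once, treat the frame as a camera artifact (auto-focus,
    color filter change) rather than participant movement. The cutoff
    is an illustrative assumption."""
    return changed_pixels / total_pixels > max_fraction

def filter_smea(counts, total_pixels, max_fraction=0.5):
    """Replace artifact frames with None so they can be reviewed
    manually against the video recording."""
    return [None if is_global_change(c, total_pixels, max_fraction) else c
            for c in counts]
```
      </preformat>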
      <sec id="sec-8-1">
        <title>5 Discussion and Conclusion</title>
        <p>For our project we were motivated by being confronted with how limited the
leisure activities available to the target group were. The observations of the target group
and conversations with caregivers helped us clarify that for some people of the target
group even a small increase in movement can be of added value. Early on in the project, we
realized that statistical results over the generalized population, a population
which although small in size is very heterogeneous, might be too much to expect. Even
targeting a specific sub-group selected beforehand can be hard, as it was not always
predictable for which users it might or might not be beneficial. We tried to reflect this
heterogeneity also in our research approach, placing a lot of emphasis on analysis of
individual cases. In short, we knew it would be time-consuming research where there
would not be ‘a cure for all’ of a large population but that we could inspire others to
also create more interactive leisure activities for this target group.</p>
        <p>In our study design we took into account the variability within and between participants
for this target group. This also helped us deal with some of the shortcomings of
the measurement tool (e.g. the impact of clothing changes on SMEA). Furthermore, we did
a pilot study but refrained from in-depth analysis of the entire sessions (the pilots were not
an effect study after all); it is, however, good to mention that this analysis
might have prevented some of the issues (e.g. auto-focus) from occurring. We do
believe an analysis of any ‘new’ automatic measurement tool should be done thoroughly
as unexpected events can influence results. Although there were severe shortcomings of
our approach it did give an interesting and useful result, so we still argue that automatic
measures can be an effective tool.</p>
        <p>However, in our case the measurement ‘errors’ could be related quite easily to outliers,
but what happens if results are not so easily recognizable as outliers: would they be
able to impact results in an unnoticeable way? Furthermore, when data is gathered over
an even longer period of time, would it not be impossible to investigate these outliers one
by one?</p>
        <p>In our case we knew that habituation for this target group was important. However,
in many cases we (can) only measure persuasive systems over a short term.
Furthermore, the ethical considerations for this target group and several health-related issues,
especially when measuring over a longer time period, can also take up a long time.
Therefore, is it conceivable that systems can be beneficial for end-users and can, from a
technological point of view, be developed in a research setting, but are less likely to be
researched as evaluations might become too complicated and time-consuming?</p>
        <p>In this target group some are able to crawl while others had a hard time even moving their
head. It is important that the automatic measurement tool can deal with both. However,
even if the tool can measure both, the same kind of data might have a different meaning
from person to person in our specific context. How can we take into account such large
differences in capabilities, and in interpreting their data, between participants?</p>
        <p>In short we would like to conclude: 1) that developing persuasive entertainment for people
with PIMD is challenging but seems worthwhile to investigate, and 2) that automatic
measurements can be a useful tool for longer-term measurements but that their shortcomings
should not be overlooked and might require a thorough analysis and manual filtering
step.</p>
        <p>Acknowledgements. This research was supported by the Dutch national program
COMMIT, and received additional funding from Dichterbij, the health care organization
involved. We would like to thank Wietske van Oorsouw and Petri Embregts of University of
Tilburg, as well as Sophie Wintels, for jointly setting up and carrying out the study
reported elsewhere and for the hours and hours of manual annotation. We would also like to
thank our professors Dirk Heylen and Vanessa Evers for their time and valuable input
in the project, and Kitt Engineering for developing the hardware for the ball. Most
importantly, we thank all the participants and the involved family members, as well as all
the caregivers and other employees at Dichterbij who made time in their already very busy
schedules; these are the people who actually made this project possible.</p>
      </sec>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          1.
          <string-name>
            <surname>Caltenco</surname>
            ,
            <given-names>H.A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Larsen</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Hedvall</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          :
          <article-title>Enhancing multisensory environments with design artifacts for tangible interaction</article-title>
          .
          <source>In: Proceedings of The Seventh International Workshop on Haptic and Audio Interaction Design (HAID)</source>
          . pp.
          <fpage>45</fpage>
          -
          <lpage>47</lpage>
          (
          <year>2012</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          2. van Delden,
          <string-name>
            <surname>R.</surname>
          </string-name>
          :
          <article-title>(Steering) interactive play behavior</article-title>
          .
          <source>Phd thesis</source>
          , University of Twente, the
          <string-name>
            <surname>Netherlands</surname>
          </string-name>
          (
          <year>2017</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          3.
          <string-name>
            <surname>Fowler</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          :
          <article-title>Multisensory Rooms and Environments: Controlled Sensory Experiences for People with Profound and Multiple Disabilities</article-title>
          . Jessica Kingsley Publishers (
          <year>2008</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          4.
          <string-name>
            <surname>Hedvall</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Larsen</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Caltenco</surname>
            ,
            <given-names>H.A.</given-names>
          </string-name>
          :
          <article-title>Inclusion through design - engaging children with disabilities in development of multi-sensory environments</article-title>
          .
          <source>In: Assistive Technology: From Research to Practice</source>
          . pp.
          <fpage>628</fpage>
          -
          <lpage>633</lpage>
          (
          <year>2013</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          5.
          <string-name>
            <surname>Hogg</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Reeves</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Roberts</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Mudford</surname>
            ,
            <given-names>O.C.</given-names>
          </string-name>
          :
          <article-title>Consistency, context and confidence in judgements of affective communication in adults with profound intellectual and multiple disabilities</article-title>
          .
          <source>J Intellect Disabil Res</source>
          <volume>45</volume>
          (
          <issue>1</issue>
          ),
          <fpage>18</fpage>
          -
          <lpage>29</lpage>
          (
          <year>2001</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          The study was approved by the Medical Ethical Committee of the MST (regional hospital in Enschede, the Netherlands),
          <source>dossier NL 48070.044</source>
          , and the internal science advisory board of the University of Twente
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          6.
          <string-name>
            <surname>Hostyn</surname>
            ,
            <given-names>I.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Maes</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          :
          <article-title>Interaction between persons with profound intellectual and multiple disabilities and their partners: A literature review</article-title>
          .
          <source>Journal of Intellectual and Developmental Disability</source>
          <volume>34</volume>
          (
          <issue>4</issue>
          ),
          <fpage>296</fpage>
          -
          <lpage>312</lpage>
          (
          <year>2009</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          7.
          <string-name>
            <surname>Huang</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Galloway</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          :
          <article-title>Modified ride-on toy cars for early power mobility: a technical report</article-title>
          .
          <source>Pediatr Phys Ther</source>
          .
          <volume>24</volume>
          (
          <issue>2</issue>
          ),
          <fpage>149</fpage>
          -
          <lpage>154</lpage>
          (
          <year>2012</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          8.
          <string-name>
            <surname>Iwabuchi</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Yang</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Taniguchi</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Sano</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Aoki</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Nakamura</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          :
          <article-title>Visualizing motion history for investigating the voluntary movement and cognition of people with severe and multiple disabilities</article-title>
          .
          <source>In: Computers Helping People with Special Needs</source>
          . pp.
          <fpage>238</fpage>
          -
          <lpage>243</lpage>
          (
          <year>2014</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          9.
          <string-name>
            <surname>Larsen</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Hedvall</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          :
          <article-title>Ideation and ability: When actions speak louder than words</article-title>
          .
          <source>In: Proceedings of the 12th Participatory Design Conference (PDC)</source>
          . pp.
          <fpage>37</fpage>
          -
          <lpage>40</lpage>
          (
          <year>2012</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          10.
          <string-name>
            <surname>Lima</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Silva</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Magalhaes</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Amaral</surname>
            ,
            <given-names>I.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Pestana</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>de Sousa</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          :
          <article-title>Can you know me better? an exploratory study combining behavioural and physiological measurements for an objective assessment of sensory responsiveness in a child with profound intellectual and multiple disabilities</article-title>
          .
          <source>Journal of Applied Research in Intellectual Disabilities (JARID)</source>
          <volume>25</volume>
          (
          <issue>6</issue>
          )
          ,
          <fpage>522</fpage>
          -
          <lpage>530</lpage>
          (
          <year>2012</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          11.
          <string-name>
            <surname>Nakken</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Vlaskamp</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          :
          <article-title>Joining forces: supporting individuals with profound multiple learning disabilities</article-title>
          .
          <source>Tizard Learning Disabil Rev</source>
          <volume>7</volume>
          (
          <issue>3</issue>
          ),
          <fpage>10</fpage>
          -
          <lpage>16</lpage>
          (
          <year>2002</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          12.
          <string-name>
            <surname>Oinas-Kukkonen</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Harjumaa</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          :
          <article-title>Persuasive systems design: Key issues, process model, and system features</article-title>
          .
          <source>Communications of the Association for Information Systems</source>
          <volume>24</volume>
          (
          <issue>28</issue>
          ),
          <fpage>485</fpage>
          -
          <lpage>500</lpage>
          (
          <year>2009</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          13.
          <string-name>
            <surname>Ramseyer</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Tschacher</surname>
            ,
            <given-names>W.</given-names>
          </string-name>
          :
          <article-title>Nonverbal synchrony in psychotherapy: coordinated body movement reflects relationship quality and outcome</article-title>
          .
          <source>Journal of Consulting and Clinical Psychology</source>
          <volume>79</volume>
          (
          <issue>3</issue>
          ),
          <fpage>284</fpage>
          -
          <lpage>295</lpage>
          (
          <year>2011</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          14.
          <string-name>
            <surname>Thaller</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Nussbaum</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Parker</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          :
          <article-title>Accessible 4D-joystick for remote controlled models</article-title>
          .
          <source>In: Computers Helping People with Special Needs</source>
          . pp.
          <fpage>218</fpage>
          -
          <lpage>225</lpage>
          (
          <year>2014</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          15.
          <string-name>
            <surname>van Delden</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Reidsma</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Oorsouw</surname>
            ,
            <given-names>W.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Poppe</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Vos</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Lohmeijer</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Embregts</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Evers</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Heylen</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          :
          <article-title>Towards an interactive leisure activity for people with PIMD</article-title>
          .
          <source>In: 14th International Conference on Computers Helping People with Special Needs, ICCHP 2014</source>
          . pp.
          <fpage>276</fpage>
          -
          <lpage>282</lpage>
          (
          <year>2014</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          16.
          <string-name>
            <surname>Vlaskamp</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>de Geeter</surname>
            ,
            <given-names>K.I.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Huijsmans</surname>
            ,
            <given-names>L.M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Smit</surname>
            ,
            <given-names>I.</given-names>
          </string-name>
          :
          <article-title>Passive activities: the effectiveness of multisensory environments on the level of activity of individuals with profound multiple disabilities</article-title>
          .
          <source>Journal of Applied Research in Intellectual Disabilities</source>
          <volume>16</volume>
          (
          <issue>2</issue>
          ),
          <fpage>135</fpage>
          -
          <lpage>143</lpage>
          (
          <year>2003</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          17.
          <string-name>
            <surname>Zijlstra</surname>
            ,
            <given-names>H.P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Vlaskamp</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          :
          <article-title>Leisure provision for persons with profound intellectual and multiple disabilities: quality time or killing time?</article-title>
          .
          <source>Journal of Intellectual Disability Research</source>
          <volume>49</volume>
          (
          <issue>6</issue>
          ),
          <fpage>434</fpage>
          -
          <lpage>448</lpage>
          (
          <year>2005</year>
          )
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>