<?xml version="1.0" encoding="UTF-8"?>
<TEI xml:space="preserve" xmlns="http://www.tei-c.org/ns/1.0" 
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
xsi:schemaLocation="http://www.tei-c.org/ns/1.0 https://raw.githubusercontent.com/kermitt2/grobid/master/grobid-home/schemas/xsd/Grobid.xsd"
 xmlns:xlink="http://www.w3.org/1999/xlink">
	<teiHeader xml:lang="en">
		<fileDesc>
			<titleStmt>
				<title level="a" type="main">Automatic Measurements of a Leisure Activity for People with Profound Disabilities</title>
			</titleStmt>
			<publicationStmt>
				<publisher/>
				<availability status="unknown"><licence/></availability>
			</publicationStmt>
			<sourceDesc>
				<biblStruct>
					<analytic>
						<author>
							<persName><forename type="first">Robby</forename><surname>Van Delden</surname></persName>
							<affiliation key="aff0">
								<orgName type="laboratory">Human Media Interaction</orgName>
								<orgName type="institution">University of Twente</orgName>
								<address>
									<settlement>Enschede</settlement>
									<country key="NL">the Netherlands</country>
								</address>
							</affiliation>
						</author>
						<author role="corresp">
							<persName><forename type="first">Dennis</forename><surname>Reidsma</surname></persName>
							<email>d.reidsma@utwente.nl</email>
							<affiliation key="aff0">
								<orgName type="laboratory">Human Media Interaction</orgName>
								<orgName type="institution">University of Twente</orgName>
								<address>
									<settlement>Enschede</settlement>
									<country key="NL">the Netherlands</country>
								</address>
							</affiliation>
						</author>
						<title level="a" type="main">Automatic Measurements of a Leisure Activity for People with Profound Disabilities</title>
					</analytic>
					<monogr>
						<imprint>
							<date/>
						</imprint>
					</monogr>
					<idno type="MD5">325DAC556EE3F5CF26588AC172F85FAB</idno>
				</biblStruct>
			</sourceDesc>
		</fileDesc>
		<encodingDesc>
			<appInfo>
				<application version="0.7.2" ident="GROBID" when="2023-03-24T15:08+0000">
					<desc>GROBID - A machine learning software for extracting information from scholarly documents</desc>
					<ref target="https://github.com/kermitt2/grobid"/>
				</application>
			</appInfo>
		</encodingDesc>
		<profileDesc>
			<textClass>
				<keywords>
					<term>Behavior Change Support Systems</term>
					<term>profound disabilities</term>
					<term>PIMD</term>
					<term>interactive ball</term>
					<term>automatic measurements</term>
					<term>leisure activity</term>
				</keywords>
			</textClass>
			<abstract>
<div xmlns="http://www.tei-c.org/ns/1.0"><p>We report on challenges we encountered when using automatic measurements in a longer-term exploratory study (8-10 sessions, 9 participants) with people with Profound Intellectual and Multiple Disabilities (PIMD). In the overall study, which we will publish elsewhere, one element we investigated was whether we could persuade users of this target group to move more by providing interaction with an interactive ball. This paper focuses on the challenges we encountered in using our method for automatic measurement of movement, based on camera recordings, during this study. With this paper we would like to remind researchers not to rely blindly on the outcome of automatic measurements but instead to analyze measures in depth, which can become difficult when using extensive sets of data.</p></div>
			</abstract>
		</profileDesc>
	</teiHeader>
	<text xml:lang="en">
		<body>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="1">Introduction</head><p>Not all people have the combination of cognitive and/or physical capabilities needed to enjoy modern sources of leisure such as computer games, watching TV, or interactive theme-park rides. Especially for people with Profound Intellectual and Multiple Disabilities (PIMD) there is a limited amount of suitable entertainment <ref type="bibr" target="#b0">[1,</ref><ref type="bibr" target="#b14">15,</ref><ref type="bibr" target="#b15">16]</ref>. While there is a rise of persuasive technology incorporating gamification elements and other entertaining ways of feedback and persuasion, we see very little done for people with special needs, as illustrated by the number of results in March 2018 when searching the past Persuasive Technology proceedings for 'special needs' (1 hit, a paper on people with autism) compared to 'gamification' (16 hits).</p><p>The measurements discussed in this paper contributed to the results of two comparative studies which are currently under review in two other (journal) manuscripts, tentatively titled 'Evaluating a newly developed interactive activity for people with profound intellectual and multiple disabilities' and 'Do we get your attention?! Looking into alertness, movement, and affective behaviour of people with PIMD upon introduction of a playful interactive product'. The concept for the ball was described by van Delden et al. in 2014 <ref type="bibr" target="#b14">[15]</ref>, and parts of the current paper are also presented in a PhD thesis <ref type="bibr" target="#b1">[2]</ref>. The current paper presents additional rationale and details of the ball, as well as insights on applying automatic measurements, that are beyond the (non-technological) scope of the two manuscripts under review. Please cite the two other papers for everything that is reported in those papers, especially the target group description, interactive concept, goals, study design, and, obviously, the study results.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="1.1">A PIMD Target Group</head><p>People with PIMD are dependent on their caregivers, have low intellectual capabilities (immeasurable), and have disabilities that reinforce each other and cannot be compensated for <ref type="bibr" target="#b10">[11]</ref>. Unlike the healthy volunteers used in many other projects on persuasive technology, people with PIMD are unable to verbally communicate their preferences, their suggestions for improvement, or self-reports of their experience. Instead, they mainly communicate with their caregivers through body movement <ref type="bibr" target="#b4">[5,</ref><ref type="bibr" target="#b5">6]</ref>. People with PIMD form a very heterogeneous user group due to the variety and combinations of cognitive, sensory, and physical disabilities. Even their everyday caregivers often need to confer with each other and take their time to establish interpretations of their actions and preferences. This complicates finding appropriate measurements.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="1.2">Interaction with an Interactive Ball to Promote Movement</head><p>In our project, which was instigated by the care organization Dichterbij, we emphasized the opportunities offered by new interactive technologies. For example, the Kinect depth camera or the Arduino prototyping platform can help develop suitable activating leisure activities that persuade users from this target group, in a pleasurable way, to exhibit certain desired behavior. Larsen and Hedvall showed that interactive technology makes it possible to create new pleasurable experiences tailored to people with PIMD <ref type="bibr" target="#b8">[9]</ref>. We built upon this work and combined it with insights from our own work. We created an interactive ball responding to the sounds and movement of a participant with PIMD. For the interaction, a facilitator observed the participant and remotely controlled the ball according to a tailored protocol; this interaction between ball and user can be seen in Figure <ref type="figure" target="#fig_0">1</ref>. Several possible benefits of providing such an interactive leisure activity were taken into account; in the context of the current paper we only go into promoting movement.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="1.3">Focus and Goal of the Paper</head><p>One approach that seemed promising was to make use of automatic measurements, both for measuring study outcomes and for triggering the interaction patterns, for instance applying automatic measurements to indicate preferences by measuring movements that are almost invisible to the eye <ref type="bibr" target="#b7">[8]</ref>. The outcome and findings of our final study will be reported elsewhere. In this paper we focus on sharing our insights regarding the simple automatic measurement method we used for measuring movement, which we called Simplified Motion Energy Analysis.</p><p>Sixth International Workshop on Behavior Change Support Systems (BCSS18): Automatic Measurements of a Leisure Activity for People with PIMD</p><p>With this paper, we want to achieve two things: 1) we want to inspire our research community to also make people with special needs benefit from persuasive technology, and we share some of our process and choices to this end; and, more importantly, 2) we want to remind the reader that unforeseen side-effects and incorrect values in automatic measurements challenge us in using extensive data for this kind of system. This holds for evaluations, especially for the analysis as shown in this paper, but in truly automated interventions it also holds for the performance of the system during the intervention itself.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="1.4">Paper Outline</head><p>The remainder of this paper is structured as follows. We continue with related work on the user group and (related) interactive entertainment for people with special needs. In the third section we briefly describe our prototype and its underlying design principles. In the fourth section we explain our automated measurement of movement and our experiences with it. We conclude by discussing some of our vision and the challenges that become apparent when attempting to use automatic measurements over longer periods of time with this target group.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="2">Leisure Activities for People with PIMD</head><p>To our knowledge there are only a few examples of research that develops truly interactive technology for people with PIMD <ref type="bibr" target="#b0">[1,</ref><ref type="bibr" target="#b3">4,</ref><ref type="bibr" target="#b8">9,</ref><ref type="bibr" target="#b14">15]</ref>. We see truly interactive systems as presenting a developing dialogue of interaction between human and machine. This goes beyond 'switches', i.e., pushing a button in order to get a constant, repeating response <ref type="bibr" target="#b0">[1,</ref><ref type="bibr" target="#b14">15]</ref>. Nonetheless, explorations into the latter can also provide additional entertainment opportunities. For instance, a modified switch-controlled ride-on toy car for young children with severe physical disabilities could provide an additional activity in which postural control might be trained in a motivating and fun way <ref type="bibr" target="#b6">[7]</ref>. However, there is a limited amount of entertainment in general and especially a lack of non-sedentary activities for people with PIMD <ref type="bibr" target="#b16">[17]</ref>. Others have also indicated a lack of interactive entertainment for people who are severely disabled <ref type="bibr" target="#b0">[1,</ref><ref type="bibr" target="#b13">14,</ref><ref type="bibr" target="#b14">15]</ref>.</p><p>The related work that can be found seems to point towards personalized and tailored interactions, which also fits the Persuasive System Design model <ref type="bibr" target="#b11">[12]</ref>. For instance, Thaller et al. developed an interactive Radio Frequency (RF) controlled toy with a mouth-operated joystick <ref type="bibr" target="#b13">[14]</ref>. 
To allow their '4D-joystick' to be used by people with tremors, they could set personalized 'dead zones' to filter the input, and they planned to allow input-output mappings to be rearranged on an individual level <ref type="bibr" target="#b13">[14]</ref>.</p><p>Others have also indicated the importance of such personalized and tailored interactions for this heterogeneous user group of people with PIMD <ref type="bibr" target="#b2">[3,</ref><ref type="bibr" target="#b9">10]</ref>. Providing appropriate (personalized) sensory stimuli to people with PIMD and interpreting their responses has been reported to be fairly difficult. Analyses might therefore benefit from automatic measurements, including electrodermal responses, motion energy analysis, and heart rate <ref type="bibr" target="#b7">[8,</ref><ref type="bibr" target="#b9">10]</ref>.</p></div>
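The 'dead zone' idea can be illustrated with a minimal sketch. The thresholds, the normalization to [-1, 1], and the rescaling below are our own illustrative assumptions, not the actual 4D-joystick implementation of Thaller et al.:

```python
def apply_dead_zone(x, y, dead_x=0.15, dead_y=0.15):
    """Suppress small joystick deflections (e.g. due to tremors).

    x, y are normalized joystick deflections in [-1, 1]; dead_x and dead_y
    are hypothetical per-axis, per-user thresholds. Deflections inside the
    dead zone map to 0; deflections outside it are rescaled so the output
    magnitude stays continuous from 0 up to 1.
    """
    def filter_axis(v, dead):
        if abs(v) >= dead:
            # Rescale the remaining range back to a [0, 1] magnitude.
            sign = 1.0 if v > 0 else -1.0
            return sign * (abs(v) - dead) / (1.0 - dead)
        return 0.0

    return filter_axis(x, dead_x), filter_axis(y, dead_y)
```

Setting the thresholds per user is what makes such a filter a personalization mechanism: a user with stronger tremors simply gets a larger dead zone.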
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="3">An Interactive Ball</head><p>In our design efforts we built on our earlier work and that of Caltenco et al., also using interactive, body-controlled, physically present objects <ref type="bibr" target="#b0">[1,</ref><ref type="bibr" target="#b14">15]</ref>. With our object, the interactive ball, we tried to create an enjoyable experience that motivates the targeted users to move. We believe that using truly interactive systems for this user group can generate a pleasurable and activating leisure activity.</p><p>We developed an interactive ball that can be remotely controlled based on gross body movements and vocalizations, see Figure <ref type="figure" target="#fig_0">1</ref>. During our development there were two different versions of this interactive system. Version 1 was controlled fully automatically on the basis of body movement recognized with a Kinect sensor. We only used this version with students and at the start of a pilot with 5 participants from the target group. In version 2 the ball was remotely controlled by a facilitator; this was the 'final' version that we used for the longer-term (8-10 sessions) study.<ref type="foot" target="#foot_0">2</ref> The ball moves by shifting its center of gravity: a servo inside the ball rotates two weighted arms. This allows for a gentle movement, unlike the more direct and quick movements of the commercially available interactive ball Sphero. Via a WeFly wifi-hotspot, simple string-based commands were sent to the ball. We programmed a simple Graphical User Interface (GUI) in C++ (using the Qt framework) that allowed the facilitator to move the ball with the keyboard cursor keys and to set the speed in a few input fields. Key inputs could be used to play 17 different sounds. Most of these sounds were made using free virtual (synthesizer) instruments; 6 were recordings from an online audio database (mainly animal sounds and bells). 
The sounds were played in front of the user over standard PC speakers. We painted the ball in highly contrasting blue and yellow colors. After some sessions we also added a rattle, so the ball made more noise. This could make it easier to follow the ball, which is especially important for people with more limited visual capabilities.</p><p>As stated above, our first concept was based on automated detection of the user's movements with the Kinect depth camera and software. At the same time we used a webcam and a simple background subtraction method to track the position of the ball. We created an interaction in which the ball's position (left/right on a predefined path) depended on the position of the user's head. To test the technical feasibility and to improve this interaction we tested it with 40 students. During these tests we made some small technical improvements (e.g. in the implementation of sending commands), and the system worked reliably enough at the end of these tests.</p><p>However, during the first tests with people from the actual target group we realized that one fully automated interactive system was not flexible enough to deal with the heterogeneity of the user group. For instance, we planned to use head position as the main method of interaction, but already the first PIMD user did not move her head left/right often enough, so this data could not be used as the main input. Besides lacking adaptability to the user's abilities and interaction methods, the first concept was also not tailored enough to the preferences of the user. Solving this technically could become quite hard. Therefore, we switched to a Wizard of Oz approach, as it seemed to be a good first step to see whether the interactive ball could, in whatever way, be beneficial for people with PIMD.</p></div>
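The facilitator-to-ball link can be sketched roughly as follows. The command strings, host address, port, and the choice of UDP are purely illustrative assumptions; the paper does not document the ball firmware's actual wire protocol:

```python
import socket

VALID_DIRECTIONS = ("LEFT", "RIGHT", "FORWARD", "BACKWARD", "STOP")

def make_command(direction, speed):
    """Encode a movement command as a simple string, e.g. 'MOVE LEFT 40'.

    The 'MOVE direction speed' format is a hypothetical stand-in for the
    simple string-based commands mentioned in the text.
    """
    assert direction in VALID_DIRECTIONS
    assert speed in range(0, 101)  # speed as a percentage
    return "MOVE {} {}".format(direction, speed)

def send_command(cmd, host="192.168.4.1", port=5000):
    """Send one command string to the ball over the WiFi hotspot.

    Host and port are placeholders for whatever address the hotspot
    assigns to the ball's controller.
    """
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.sendto(cmd.encode("ascii"), (host, port))
```

In the actual system, key presses in the Qt GUI would translate to such strings, so the facilitator never types commands directly.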
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="3.1">Interaction and Design Rationale</head><p>Before designing the interaction we spoke with several caregivers and a therapist about the project and did a literature review. We also watched play sessions, visited daily living environments, and were informed about a variety of activities for this user group. We believe this is essential in arriving at a suitable interaction. However, no overgeneralization should take place: things seen for one user cannot simply be transferred to another. Therefore, we tailored the interaction patterns of the ball in an iterative and individualized way. During a first set of habituation sessions for each participant, we continued to work closely together with caregivers in order to analyze the user's behavior and experience and to help improve the interactive system.</p><p>We chose a ball as this is a general shape often used in traditional playful interactions; furthermore, we saw one person from the target group (with relatively high capabilities) trying to push a big ball during an observational session. We think a big physical object has benefits over showing objects on a screen, as we expect people with visual disabilities to be more likely to be able to identify and follow it. To start the interaction we preferably used movements of the user that were likely to occur anyway, such as rocking of the upper body (see Figure <ref type="figure" target="#fig_0">1</ref>), their 'vocalizations', or the fiddling of their hands. After this (unconscious) initiation of interaction, the user has to learn over time the link between action and response, something we knew beforehand that not all users would be capable of. Furthermore, we were not even sure whether some users would recognize the ball and its behavior. To increase the chances of such recognition, sessions were held several times. 
To further improve the chances of success we also tailored the possible actions of the user, as well as the responses, which needed to be stimulating enough for the user to make a cause-effect link plausible. We added sound feedback as we knew that some people from the user group are more sensitive to this than to merely visual feedback.</p><p>The exact interaction protocol depended on the user. More information on this protocol will be published elsewhere; here we only briefly describe the general way of interaction. In this protocol, actions were described as, for example: user action: leans to our left; ball: move ball to our left; sound: play sound type 1. For the protocol implemented in the longer-term study, we started from the premise of what could be expected from a user as well as what we designed the interactive ball to do: move, make sounds, and change appearance using bright LEDs. This resulted in some links that might not be as intuitive as they could have been. We made the ball respond to upper body movement and/or focus of attention. The ball was not placed within reach of the user, to trigger them to show alertness to things farther away than their close surroundings, as well as to make the ball less likely to bump against a person or to harm the participant in any other way. When the user moved his or her upper body, the ball was remotely controlled by the facilitator to start rolling. The ball also made sounds when an attempt at interaction was made or if the ball was kept in the user's visual focus for some time. Furthermore, for some users the ball moved from side to side when it had not been interacted with for some time. This was done in order to (re)gain the attention of the user. Sounds were also played to further grab the attention of the user, and some categories of sounds were removed or played more often depending on whether the user appeared to like or dislike them.</p></div>
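A per-user protocol of action-response links could be represented, for instance, as a simple lookup table. The entries below are illustrative placeholders loosely based on the one example in the text ('leans to our left' leads to moving the ball left and playing sound type 1), not any participant's actual protocol:

```python
# Illustrative per-user interaction protocol (placeholder entries).
# In the study, the facilitator applied such a script by hand.
PROTOCOL = {
    "leans_left":    {"ball": "move_left",    "sound": 1},
    "leans_right":   {"ball": "move_right",   "sound": 2},
    "vocalizes":     {"ball": "wiggle",       "sound": 3},
    # After a period without interaction, move side to side to
    # (re)gain the user's attention, without a sound.
    "no_action_30s": {"ball": "side_to_side", "sound": None},
}

def respond(observed_action):
    """Look up the scripted response for an observed user action."""
    entry = PROTOCOL.get(observed_action)
    if entry is None:
        return {"ball": "idle", "sound": None}
    return entry
```

Tailoring per participant then amounts to editing this table, e.g. removing a sound category the user appears to dislike.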
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="4">Automatic Measurement of Movement: Simplified Motion Energy Analysis (SMEA)</head><p>For one of our outcome measures (the only one we discuss in this paper) we used computer vision techniques to automatically detect movement responses. This allowed us to make use of the benefits of automatic measurements without burdening the participants too much. For this target group it is important to consider the downsides of, for instance, body-worn sensors, as these might have detrimental effects for overly sensitive people. In our study design we also took into account an extended time for getting accustomed to a new situation.</p><p>We carried out a pilot with five people from the target group, followed by the final study with 9 participants. One researcher was present to control the ball, another researcher took notes of noticeable and relevant behavior and (for a set number of sessions) conducted a structured interview, and one caregiver was present to help the participant if needed (e.g. one participant had phlegm that resulted in severe coughing and several times needed to be helped and calmed). We used three sessions to tailor the interaction protocol to each participant, and then had up to two additional sessions, where these 3-5 sessions were used to get the user acquainted with the people present, the ball, and the room. We then used the last 5 sessions to do our measurements. We compared the movement during interaction with periods where the ball was not present; exact details are discussed in the other papers. We compared on a person-to-person basis and used the data only for descriptive statistics, including whether participants moved more, less, or showed about the same amount of movement. 
Besides these automatic measures there were also other measures in the final study, which are outside the scope of this paper.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="4.1">SMEA</head><p>To obtain a measure of movement we used a fairly simple method. It is a slightly simpler version of the Motion Energy Analysis used by Ramseyer &amp; Tschacher <ref type="bibr" target="#b12">[13]</ref> and resembles the motion history approach applied to a similar participant by Iwabuchi et al. <ref type="bibr" target="#b7">[8]</ref>. In Simplified Motion Energy Analysis we measured the number of pixels that changed beyond a certain threshold. This seems especially fitting for measurements in a plane parallel to the camera, which in turn fits an interaction of leaning left and right. Our Simplified Motion Energy Analysis method consists of the following steps, see Figure <ref type="figure" target="#fig_1">2</ref>: 1) grab the video frames, 2) convert them to gray scale, 3) crop the image to only the area of interest, where the number of cropped pixels depended on the position of the wheelchair and the presence of people around, 4) copy the image and delay it for the next step, 5) subtract the delayed frame from the current frame with background subtraction and obtain either a) absolute pixel differences (as in the figure) or b) a binary version with the number of pixels that changed (as explained above and used for our study), 6) sum all pixels (or differences), and 7) save them to a text file. We had to analyze quite a lot of footage: 5 sessions, 9 participants, and 30 minutes of interaction, which sums to 1350 minutes where each second consists of numerous frames that had to be subtracted from each other. To do this in a timely manner, we used a computer vision framework from our research group (Parlevision), written in C++ (using the Qt framework) and built on OpenCV, which provides a GUI that allowed us to alter settings at runtime. This allowed us to run the analysis faster than real time.</p></div>
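The steps above can be sketched in a few lines of Python. This is a minimal, unoptimized illustration of the binary variant (5b); the actual Parlevision pipeline was written in C++ on OpenCV, and the threshold value here is an arbitrary placeholder:

```python
def smea(frames, crop, threshold=15, delay=1):
    """Simplified Motion Energy Analysis on grayscale frames.

    frames:    list of 2D lists of gray values in 0-255; grabbing and
               grayscale conversion (steps 1-2) are assumed done.
    crop:      (top, bottom, left, right) region of interest (step 3).
    threshold: minimum absolute per-pixel difference to count as change.
    delay:     how many frames back to compare against (step 4).

    Returns one count per frame: the number of pixels in the cropped
    region whose absolute difference with the frame `delay` steps earlier
    exceeds `threshold` (steps 5b and 6). Saving the counts to a text
    file (step 7) is left to the caller.
    """
    top, bottom, left, right = crop
    counts = []
    for i in range(delay, len(frames)):
        changed = 0
        for r in range(top, bottom):
            row_now = frames[i][r]
            row_then = frames[i - delay][r]
            for c in range(left, right):
                if abs(row_now[c] - row_then[c]) > threshold:
                    changed += 1
        counts.append(changed)
    return counts
```

Running faster than real time, as in the study, is then mainly a matter of replacing the inner loops with vectorized image operations.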
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="4.2">Analysis of the SMEA Results</head><p>We investigated the resulting measurements thoroughly. Figure <ref type="figure">3</ref> shows post-filtered results. We used Matlab R2012a to generate the graphs and filter out noise-polluted results, which we discuss next. The meaning of the results regarding the outcome measure will be published elsewhere.</p><p>Using similar graphs we saw numerous peaks of measured movement for participants<ref type="foot" target="#foot_1">3</ref> where we did not remember seeing such movement during the session. For instance, one participant slept through the entire session, but we still saw several peaks in the measured movement<ref type="foot" target="#foot_2">4</ref>. This made it abundantly clear that there was noise interfering with the measurements. Therefore, we tried to sync the measured frames with the video recordings, and we found several causes, none of which we had noticed during the pilots.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="4.3">Unexpected Sources of Noise</head><p>First, the camera apparently made use of auto-focus. At certain points the camera would focus differently, and this resulted in measured peaks when the focus changed and, often a second or so later, changed back to the original focus; we hypothesize this could have been related to the camera's facial recognition. Second, in a similar fashion we noticed something that seemed like a color filter change; combined with our own experience, this seemed to occur when the sunlight intensity changed (even though we equipped the room with curtains), see Figure <ref type="figure" target="#fig_2">4</ref>. Third, the camera sometimes shook, resulting in short peaks; this occurred when the ball was moved back and forth in a certain rhythm. Fourth, at certain moments the feet of people present entered the footage; we cropped the images to prevent this as much as possible, see Figure <ref type="figure" target="#fig_2">4</ref>. Fifth, although very limited, reflections of the moving ball could be seen in one of the wheelchairs. Sixth, we knew from previous work that the clothing of the day can influence the number of pixels changed (compare a checkered blouse with a black sweater). This mainly made the results harder to compare between days, which was less of an issue with our study setup. Seventh, the wheelchair also moved due to the heavy movements of some of the participants, which might influence the results when looking only at averages.</p><p>Figure <ref type="figure">3</ref> shows a not yet published sub-selection of the movement measure of clients (with no clear effect). The vertical lines represent the start and end of the interaction with the ball. A more detailed discussion of what the SMEA results mean and how they can be interpreted will be published elsewhere; for now we refer to <ref type="bibr" target="#b1">[2]</ref>.</p><p>Although we did not anticipate any of this, to a certain extent our context thus violated the static-camera and constant-lighting prerequisites of MEA <ref type="bibr" target="#b12">[13]</ref>, which required us to manually inspect the peaks and video, and filter out these sources of noise.</p></div>
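The manual inspection of such noise peaks can be supported by first flagging suspect frames automatically. A crude sketch: the median-based threshold and factor below are our own illustrative choices, and every flagged index still has to be checked against the video by hand:

```python
def flag_suspect_peaks(counts, factor=5.0):
    """Flag frame indices whose motion count far exceeds the session median.

    counts: per-frame changed-pixel counts (e.g. SMEA output).
    factor: hypothetical multiple of the median above which a count is
            considered a suspect peak (auto-focus hunts, color-filter
            switches, and camera shake tend to produce large, isolated
            jumps relative to the session's typical motion level).
    """
    ordered = sorted(counts)
    median = ordered[len(ordered) // 2]
    # Guard against an all-quiet session (e.g. a sleeping participant)
    # where the median would be zero.
    baseline = max(median, 1)
    return [i for i, c in enumerate(counts) if c > factor * baseline]
```

The median is used rather than the mean precisely because the peaks being hunted would otherwise inflate the baseline they are compared against.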
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="5">Discussion and Conclusion</head><p>For our project we were motivated by being confronted with how limited the possible leisure activities for the target group are. Observations of the target group and conversations with caregivers helped clarify that for some people of the target group even a small increase in movement can be of added value. Early on in the project, we realized that expecting statistical results over the generalized population, a population which, although small in size, is very heterogeneous, might be too much to expect. Even targeting a specific sub-group selected beforehand can be hard, as it was not always predictable for which users the activity might or might not be beneficial. We tried to reflect this heterogeneity in our research approach, placing a lot of emphasis on the analysis of individual cases. In short, we knew it would be time-consuming research where there would not be 'a cure for all' of a large population, but that we could inspire others to also create more interactive leisure activities for this target group.</p><p>In our study design we took into account the variability within and between participants for this target group. This also had advantages in dealing with some of the shortcomings of the measurement tool (e.g. the impact of clothing changes on SMEA). Furthermore, we did a pilot study but refrained from in-depth analysis of the entire sessions (it was not an effect study, after all); it is, however, worth mentioning that if we had done this analysis, it might have prevented some of the issues (e.g. auto-focus) from occurring. We do believe an analysis of any 'new' automatic measurement tool should be done thoroughly, as unexpected events can influence results. 
Although there were severe shortcomings in our approach, it did give interesting and useful results, so we still argue that automatic measures can be an effective tool.</p><p>However, in our case the measurement 'errors' could be related quite easily to outliers; what happens if results are not so easily recognizable as outliers? Would they be able to impact results in an unnoticeable way? Furthermore, when data is gathered over an even longer period of time, would it not become infeasible to investigate these outliers one by one?</p><p>In our case we knew that habituation for this target group was important. However, in many cases we (can) only measure persuasive systems over a short term. Furthermore, the ethical considerations for this target group and several health-related issues, especially when measuring over a longer time period, can also take up a long time<ref type="foot" target="#foot_3">5</ref>. Is it therefore conceivable that systems can be beneficial for end-users and technologically feasible to develop in a research setting, yet are less likely to be researched because evaluations might become too complicated and time consuming?</p><p>In this target group, some people are able to crawl while others have a hard time even moving their head. It is important that the automatic measurement tool can deal with both. However, even if the tool can measure both, the same kind of data might have a different meaning from person to person in our specific context. 
How can we take into account such large differences between participants, both in their capabilities and in the interpretation of their data?</p><p>In short, we conclude: 1) that developing persuasive entertainment for people with PIMD is challenging but seems worthwhile to investigate, and 2) that automatic measurements can be a useful tool for longer-term measurements, but their shortcomings should not be overlooked and might require a thorough analysis and manual filtering step.</p></div><figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_0"><head>Fig. 1 .</head><label>1</label><figDesc>Fig. 1. The interactive ball. On the left, the inner workings. On the right, an anonymized participant of the pilot study interacting with the ball.</figDesc><graphic coords="3,137.10,115.84,131.42,122.74" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_1"><head>Fig. 2 .</head><label>2</label><figDesc>Fig. 2. An example of the process of Simplified Motion Energy Analysis, where we can see the movement of a caregiver on the right. The blocks from the Parlevision pipeline (green) perform similar steps as written out in the blue blocks.</figDesc><graphic coords="8,134.77,115.83,345.80,160.36" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_2"><head>Fig. 4 .</head><label>4</label><figDesc>Fig. 4. The anonymized uncropped footage of one of the clients, showing a clear unexpected artifact introduced by the camera: seemingly an automatic change in color filter.</figDesc><graphic coords="10,186.64,115.83,242.07,277.37" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0"><head></head><label></label><figDesc></figDesc><graphic coords="9,134.77,115.83,345.83,154.52" type="bitmap" /></figure>
			<note xmlns="http://www.tei-c.org/ns/1.0" place="foot" n="2" xml:id="foot_0">Currently we also have a simple tablet application to control the ball's movement, change the LEDs, and play sounds from within the ball.</note>
			<note xmlns="http://www.tei-c.org/ns/1.0" place="foot" n="3" xml:id="foot_1">This measured movement is represented as the number of pixels that differed beyond a threshold, from which we subtracted the average movement of the entire session of that day for representation purposes.</note>
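The movement measure described in this footnote can be sketched as a minimal frame-differencing routine. This is only an illustrative sketch: the function names, the threshold value, and the use of NumPy are our assumptions, not the paper's actual implementation (which ran as a Parlevision pipeline).

```python
import numpy as np

def motion_energy(prev_frame: np.ndarray, frame: np.ndarray, threshold: int = 25) -> int:
    """Count pixels whose intensity changed beyond a threshold between frames.

    Mirrors the simplified motion energy analysis as described: absolute
    frame difference, binary thresholding, then a pixel count. Frames are
    expected as 2-D uint8 grayscale arrays of equal shape.
    """
    diff = np.abs(prev_frame.astype(np.int16) - frame.astype(np.int16))
    return int(np.count_nonzero(diff > threshold))

def session_movement(frames, threshold: int = 25):
    """Per-frame motion energy with the session mean subtracted,
    as done for representation purposes in the footnote above."""
    raw = [motion_energy(a, b, threshold) for a, b in zip(frames, frames[1:])]
    mean = sum(raw) / len(raw)
    return [v - mean for v in raw]
```

A session is then a list of grayscale frames; the returned values are centered around zero, so peaks above zero indicate above-average movement for that day's session.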
			<note xmlns="http://www.tei-c.org/ns/1.0" place="foot" n="4" xml:id="foot_2">In a normal population this sleeping participant might be considered an outlier and could be withdrawn from analysis. For our main study, we kept this participant because such "outliers" can in fact occur very regularly with this target group.</note>
			<note xmlns="http://www.tei-c.org/ns/1.0" place="foot" n="5" xml:id="foot_3">The study was approved by the Medical Ethical Committee of the MST (regional hospital in Enschede, the Netherlands), dossier NL 48070.044.14, and the internal science advisory board of the University of Twente.</note>
		</body>
		<back>

			<div type="acknowledgement">
<div xmlns="http://www.tei-c.org/ns/1.0"><p>Acknowledgements This research was supported by the Dutch national program COMMIT and received additional funding from Dichterbij, the health care organization involved. We would like to thank Wietske van Oorsouw and Petri Embregts of the University of Tilburg, as well as Sophie Wintels, for jointly setting up and carrying out the study reported elsewhere, and for the hours and hours of manual annotation. We also thank our professors Dirk Heylen and Vanessa Evers for their time and valuable input in the project, and Kitt Engineering for developing the hardware for the ball. Most importantly, we thank all the participants and the involved family members, as well as all the caregivers and other employees at Dichterbij who made time in their already very busy schedules; these are the people who actually made this project possible.</p></div>
			</div>

			<div type="references">

				<listBibl>

<biblStruct xml:id="b0">
	<analytic>
		<title level="a" type="main">Enhancing multisensory environments with design artifacts for tangible interaction</title>
		<author>
			<persName><forename type="first">H</forename><forename type="middle">A</forename><surname>Caltenco</surname></persName>
		</author>
		<author>
			<persName><forename type="first">H</forename><surname>Larsen</surname></persName>
		</author>
		<author>
			<persName><forename type="first">P</forename><surname>Hedvall</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">Proceedings of the Seventh International Workshop on Haptic and Audio Interaction Design (HAID)</title>
				<meeting>The Seventh International Workshop on Haptic and Audio Interaction Design (HAID)</meeting>
		<imprint>
			<date type="published" when="2012">2012</date>
			<biblScope unit="page" from="45" to="47" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b1">
	<monogr>
		<title level="m" type="main">(Steering) interactive play behavior</title>
		<author>
			<persName><forename type="first">R</forename><surname>Van Delden</surname></persName>
		</author>
		<imprint>
			<date type="published" when="2017">2017</date>
		</imprint>
		<respStmt>
			<orgName>University of Twente, the Netherlands</orgName>
		</respStmt>
	</monogr>
	<note type="report_type">PhD thesis</note>
</biblStruct>

<biblStruct xml:id="b2">
	<monogr>
		<title level="m" type="main">Multisensory Rooms and Environments Controlled Sensory Experiences for People with Profound and Multiple Disabilities</title>
		<author>
			<persName><forename type="first">S</forename><surname>Fowler</surname></persName>
		</author>
		<imprint>
			<date type="published" when="2008">2008</date>
			<publisher>Jessica Kingsley Publishers</publisher>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b3">
	<analytic>
		<title level="a" type="main">Inclusion through design -engaging children with disabilities in development of multi-sensory environments</title>
		<author>
			<persName><forename type="first">P</forename><surname>Hedvall</surname></persName>
		</author>
		<author>
			<persName><forename type="first">H</forename><surname>Larsen</surname></persName>
		</author>
		<author>
			<persName><forename type="first">H</forename><forename type="middle">A</forename><surname>Caltenco</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">Assistive Technology-From Research to Practice</title>
				<imprint>
			<date type="published" when="2013">2013</date>
			<biblScope unit="page" from="628" to="633" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b4">
	<analytic>
		<title level="a" type="main">Consistency, context and confidence in judgements of affective communication in adults with profound intellectual and multiple disabilities</title>
		<author>
			<persName><forename type="first">J</forename><surname>Hogg</surname></persName>
		</author>
		<author>
			<persName><forename type="first">D</forename><surname>Reeves</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Roberts</surname></persName>
		</author>
		<author>
			<persName><forename type="first">O</forename><forename type="middle">C</forename><surname>Mudford</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">J Intellect Disabil Res</title>
		<imprint>
			<biblScope unit="volume">45</biblScope>
			<biblScope unit="issue">1</biblScope>
			<biblScope unit="page" from="18" to="29" />
			<date type="published" when="2001">2001</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b5">
	<analytic>
		<title level="a" type="main">Interaction between persons with profound intellectual and multiple disabilities and their partners: A literature review</title>
		<author>
			<persName><forename type="first">I</forename><surname>Hostyn</surname></persName>
		</author>
		<author>
			<persName><forename type="first">B</forename><surname>Maes</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Journal of Intellectual and Developmental Disability</title>
		<imprint>
			<biblScope unit="volume">34</biblScope>
			<biblScope unit="issue">4</biblScope>
			<biblScope unit="page" from="296" to="312" />
			<date type="published" when="2009">2009</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b6">
	<analytic>
		<title level="a" type="main">Modified ride-on toy cars for early power mobility: a technical report</title>
		<author>
			<persName><forename type="first">H</forename><surname>Huang</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Galloway</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Pediatr Phys Ther</title>
		<imprint>
			<biblScope unit="volume">24</biblScope>
			<biblScope unit="issue">2</biblScope>
			<biblScope unit="page" from="149" to="154" />
			<date type="published" when="2012">2012</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b7">
	<analytic>
		<title level="a" type="main">Visualizing motion history for investigating the voluntary movement and cognition of people with severe and multiple disabilities</title>
		<author>
			<persName><forename type="first">M</forename><surname>Iwabuchi</surname></persName>
		</author>
		<author>
			<persName><forename type="first">G</forename><surname>Yang</surname></persName>
		</author>
		<author>
			<persName><forename type="first">K</forename><surname>Taniguchi</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Sano</surname></persName>
		</author>
		<author>
			<persName><forename type="first">T</forename><surname>Aoki</surname></persName>
		</author>
		<author>
			<persName><forename type="first">K</forename><surname>Nakamura</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Computers Helping People with Special Needs</title>
		<imprint>
			<biblScope unit="page" from="238" to="243" />
			<date type="published" when="2014">2014</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b8">
	<analytic>
		<title level="a" type="main">Ideation and ability: When actions speak louder than words</title>
		<author>
			<persName><forename type="first">H</forename><surname>Larsen</surname></persName>
		</author>
		<author>
			<persName><forename type="first">P</forename><surname>Hedvall</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">Proceedings of the 12th Participatory Design Conference (PDC)</title>
				<meeting>the 12th Participatory Design Conference (PDC)</meeting>
		<imprint>
			<date type="published" when="2012">2012</date>
			<biblScope unit="page" from="37" to="40" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b9">
	<analytic>
		<title level="a" type="main">Can you know me better? an exploratory study combining behavioural and physiological measurements for an objective assessment of sensory responsiveness in a child with profound intellectual and multiple disabilities</title>
		<author>
			<persName><forename type="first">M</forename><surname>Lima</surname></persName>
		</author>
		<author>
			<persName><forename type="first">K</forename><surname>Silva</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Magalhaes</surname></persName>
		</author>
		<author>
			<persName><forename type="first">I</forename><surname>Amaral</surname></persName>
		</author>
		<author>
			<persName><forename type="first">H</forename><surname>Pestana</surname></persName>
		</author>
		<author>
			<persName><forename type="first">L</forename><surname>De Sousa</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Journal of Applied Research in Intellectual Disabilities (JARID)</title>
		<imprint>
			<biblScope unit="volume">25</biblScope>
			<biblScope unit="issue">6</biblScope>
			<biblScope unit="page" from="522" to="530" />
			<date type="published" when="2012">2012</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b10">
	<analytic>
		<title level="a" type="main">Joining forces: supporting individuals with profound multiple learning disabilities</title>
		<author>
			<persName><forename type="first">H</forename><surname>Nakken</surname></persName>
		</author>
		<author>
			<persName><forename type="first">C</forename><surname>Vlaskamp</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Tizard Learning Disabil Rev</title>
		<imprint>
			<biblScope unit="volume">7</biblScope>
			<biblScope unit="issue">3</biblScope>
			<biblScope unit="page" from="10" to="16" />
			<date type="published" when="2002">2002</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b11">
	<analytic>
		<title level="a" type="main">Persuasive systems design: Key issues, process model, and system features</title>
		<author>
			<persName><forename type="first">H</forename><surname>Oinas-Kukkonen</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Harjumaa</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Communications of the Association for Information Systems</title>
		<imprint>
			<biblScope unit="volume">24</biblScope>
			<biblScope unit="issue">28</biblScope>
			<biblScope unit="page" from="485" to="500" />
			<date type="published" when="2009">2009</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b12">
	<analytic>
		<title level="a" type="main">Nonverbal synchrony in psychotherapy: coordinated body movement reflects relationship quality and outcome</title>
		<author>
			<persName><forename type="first">F</forename><surname>Ramseyer</surname></persName>
		</author>
		<author>
			<persName><forename type="first">W</forename><surname>Tschacher</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Journal of Consulting and Clinical Psychology</title>
		<imprint>
			<biblScope unit="volume">79</biblScope>
			<biblScope unit="issue">3</biblScope>
			<biblScope unit="page" from="284" to="295" />
			<date type="published" when="2011">2011</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b13">
	<analytic>
		<title level="a" type="main">Accessible 4d-joystick for remote controlled models</title>
		<author>
			<persName><forename type="first">D</forename><surname>Thaller</surname></persName>
		</author>
		<author>
			<persName><forename type="first">G</forename><surname>Nussbaum</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Parker</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">Computers Helping People with Special Needs</title>
				<imprint>
			<date type="published" when="2014">2014</date>
			<biblScope unit="page" from="218" to="225" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b14">
	<analytic>
		<title level="a" type="main">Towards an interactive leisure activity for people with PIMD</title>
		<author>
			<persName><forename type="first">R</forename><surname>Van Delden</surname></persName>
		</author>
		<author>
			<persName><forename type="first">D</forename><surname>Reidsma</surname></persName>
		</author>
		<author>
			<persName><forename type="first">W</forename><surname>Oorsouw</surname></persName>
		</author>
		<author>
			<persName><forename type="first">R</forename><surname>Poppe</surname></persName>
		</author>
		<author>
			<persName><forename type="first">P</forename><surname>Vos</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Lohmeijer</surname></persName>
		</author>
		<author>
			<persName><forename type="first">P</forename><surname>Embregts</surname></persName>
		</author>
		<author>
			<persName><forename type="first">V</forename><surname>Evers</surname></persName>
		</author>
		<author>
			<persName><forename type="first">D</forename><surname>Heylen</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">14th International Conference on Computers Helping People with Special Needs</title>
				<meeting><address><addrLine>ICCHP</addrLine></address></meeting>
		<imprint>
			<date type="published" when="2014">2014</date>
			<biblScope unit="page" from="276" to="282" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b15">
	<analytic>
		<title level="a" type="main">Passive activities: the effectiveness of multisensory environments on the level of activity of individuals with profound multiple disabilities</title>
		<author>
			<persName><forename type="first">C</forename><surname>Vlaskamp</surname></persName>
		</author>
		<author>
			<persName><forename type="first">K</forename><forename type="middle">I</forename><surname>De Geeter</surname></persName>
		</author>
		<author>
			<persName><forename type="first">L</forename><forename type="middle">M</forename><surname>Huijsmans</surname></persName>
		</author>
		<author>
			<persName><forename type="first">I</forename><surname>Smit</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Journal of Applied Research in Intellectual Disabilities</title>
		<imprint>
			<biblScope unit="volume">16</biblScope>
			<biblScope unit="issue">2</biblScope>
			<biblScope unit="page" from="135" to="143" />
			<date type="published" when="2003">2003</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b16">
	<analytic>
		<title level="a" type="main">Leisure provision for persons with profound intellectual and multiple disabilities: quality time or killing time?</title>
		<author>
			<persName><forename type="first">H</forename><forename type="middle">P</forename><surname>Zijlstra</surname></persName>
		</author>
		<author>
			<persName><forename type="first">C</forename><surname>Vlaskamp</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Journal of Intellectual Disability Research</title>
		<imprint>
			<biblScope unit="volume">49</biblScope>
			<biblScope unit="issue">6</biblScope>
			<biblScope unit="page" from="434" to="448" />
			<date type="published" when="2005">2005</date>
		</imprint>
	</monogr>
</biblStruct>

				</listBibl>
			</div>
		</back>
	</text>
</TEI>
