Ambient Reflection: Towards self-explaining devices

Daniel Burmeister, Bashar Altakrouri, Andreas Schrader
University of Lübeck, Institute of Telematics
burmeister@itm.uni-luebeck.de, altakrouri@itm.uni-luebeck.de, schrader@itm.uni-luebeck.de

ABSTRACT
In the course of ubiquitous and pervasive computing, a variety of smart devices are being developed and are entering our everyday life. These devices increasingly rely on novel interaction modalities from the field of Natural Interaction, such as gesture control. Common concepts to explain and illustrate devices' interaction possibilities cannot be applied to these interaction techniques, due to the embedding of devices and the consequently disappearing interfaces, as well as the distribution of functionality among device ensembles in terms of IoT, AAL and Smart Home. These emerging and currently existing problems in accessing devices' interaction possibilities present users with new challenges. In addition, current possibilities for device documentation provide only a limited viable option for learning devices. Hence, a general documentation for interconnected devices and their functionality cannot be created manually. In order to counteract these problems, we present an approach for in-situ generation of an ambient manual for interconnected smart devices.

Author Keywords
Ambient Computing; Human Computer Interaction; Guidance

ACM Classification Keywords
H.5.2. User Interfaces: Training, help, and documentation

Workshop on Large-scale and model-based Interactive Systems: Approaches and Challenges, June 23, 2015, Duisburg, Germany. Copyright © 2015 for the individual papers by the papers' authors. Copying permitted only for private and academic purposes. This volume is published and copyrighted by its editors.

INTRODUCTION
As a result of the ongoing research and development in the areas of ubiquitous and pervasive computing, a variety of heterogeneous commercial devices with rich functionalities and novel interaction techniques arise. Particularly with regard to the fields of Ambient Assisted Living (AAL), Internet of Things (IoT) and Smart Home, users are increasingly faced with natural interaction. While the bulk of HCI research strives to create interaction techniques that are easy to learn, natural, self-explaining, and novel, documentation of interaction techniques is generally an underestimated and ignored issue, or simply considered a luxury and unnecessary. Considering the progressive complexity in ambient scenarios containing heterogeneous devices and interaction techniques, this results in an increasing gap between the users' ability to learn and remember these techniques and the provided functionality.

Currently, documentation of smart devices' interaction techniques in ambient space scenarios is usually spatially distributed and highly eco-centric, if accessible at all. Concerning the ongoing interconnection of devices and interaction behavior in terms of IoT, such documentation cannot be created manually. In total, the variety of current Smart Objects challenges users in accessing and operating them. It is reasonable to assume that these interaction challenges will increase significantly, caused by complexity and unpredictable interconnections in ambient spaces.

SMART AMBIENT SPACES
Ambient spaces are manifested by an expanding world of interconnected Smart Objects full of rich interaction capabilities, driven by ubiquitous and pervasive technologies. Research and industrial development in this area have resulted in a vast increase in the number of smart commodity devices and objects (hereafter called Smart Objects) seamlessly interweaving in a wide range of inhabited environments (e.g., households). A recent study conducted by BITKOM (Federal Association for Information Technology, Telecommunications and New Media) revealed that every household in Germany owns at least 50 electrical devices, with an increasing tendency towards more devices [4], and half of all household devices are expected to be connected as part of a network by 2018, as reported by RWE Effizienz GmbH [25].

While users are currently familiar with handling normal physical objects and with interacting with a simple and often limited number of electrical devices [25], the variety and diversity of functions and handling of Smart Objects pose new challenges, especially in enabling and familiarizing users with interaction possibilities in ambient spaces [5, 28].

The increasing number of devices as well as the increasing diversity of offered functions imposes serious learning issues on the user, according to Poppe et al. [20]. In one of his articles, Norman argued that this may easily lead to long-term usability obstacles and inflate problematic and irrational use of devices [16]. For instance, in case of a time change, different clocks in households offer inconsistent ways and interaction modalities for modifying the time. Hence, even this simple operation normally challenges the user [11]. Such challenges easily evolve with more emphasis on the required implicit knowledge of users and the lack of adequate documentation [30]. Sometimes devices cannot even be controlled without the use of additional material [27].

The interaction challenges and difficulties with current and future smart devices and artifacts were also the subject of Norman's book Living with Complexity [17]. In this book, Norman drew a clear distinction between complexity and complication. While complexity refers to the form of presentation of possible interaction states and transitions, complication denotes the psychological state of a person who tries to learn an interaction with an object. Hence, complex objects and artifacts are not necessarily complicated to interact with. Complication barriers can appear for different reasons, including changes in the environment and simultaneously changing artifacts. We believe that ambient spaces may result in various complication barriers due to three inherent characteristics reported by Pruvost et al. [23], namely heterogeneity and distributivity (containing a variety of devices with various capabilities), dynamic media mobility (interaction capabilities are highly dynamic, as interaction devices may join and leave the ambient space at any time), and user mobility (challenging users to attend to interaction needs). This very commonly leads to a missing natural mapping between offered functionality and adequate interaction modalities [15, p. 12], as well as to hindering the user from building a correct mental model of the system.
OPPORTUNITIES AND CHALLENGES OF NATURAL USER INTERFACES
Recent advancements in HCI research have revealed new and novel interaction techniques to operate and control devices in ambient spaces by using Natural User Interfaces (NUIs), such as multi-touch gestures, motion gestures, gaze interaction, etc. [8]. In the literature, different definitions of interactions with NUIs have been elaborated, and most of them refer to the user's natural abilities, practices, and activities to control interactive systems. Many of those interactions are caused and characterized by motion and movement activities, ranging from pointing, clicking, grasping, walking and balancing to dancing, as discussed in [1].

In the last 10 years, NUIs using touch- and motion-enabled technologies found their way into commercial products and became widely accessible to the end user. Moreover, users are becoming more acquainted with using different body parts to interact with applications such as gaming (e.g., motion-controlled active play with Microsoft Kinect or the Wii system), data browsing, navigation scenarios (e.g., tilting for scrolling photos as in iOS and Android devices), and many more. This has encouraged the HCI community to continuously expand towards the NUI paradigm, and various new calls have arisen to explore new potential in designing for the whole body in motion [7, 9]. Despite the efforts towards intuitive and simple interfaces, the NUI paradigm is challenged by an expanding user population and diversity with respect to age and physical abilities, as discussed in [1]. On the one hand, the naturalness of NUIs does not imply that interaction techniques are simple to recall and use [18]. On the other hand, utilizing the human body and its parts for interaction comes with its own set of complexities. Simple commands, like "raise your arm", may have very different interpretations. Different aspects are important to consider for correctly executing such a simple command, for instance movement direction, involved body parts, timing information, etc.
GUIDANCE IN AMBIENT SPACES
In order to correctly use simple or complex technologies, the availability and accessibility of relevant information are essential for the user. Therefore, Norman [14] coined the term affordances with respect to objects' self-revealing interaction possibilities, which easily enable users to interact with them. The same applies to interaction in ambient spaces; however, the current concept of affordances does not apply to the ongoing embedding of devices and, accordingly, of their interaction possibilities [26, 22]. The dynamic nature of ambient spaces imposes different learning and affordance challenges on users. In this regard, relying solely on the visual appearance and affordances of a smart object to explain its logic and function is not enough [26]. Hence, adequate documentation and presentation of interaction possibilities and of the utility of an object are an essential part of learning ambient spaces, which aims at correct usage of devices and optimized user mental models.

In ambient spaces, documentation is not only vital for the use of objects but also for the design process itself and for a successful sharing and exchange of components and knowledge. Although different device manufacturers pay attention to the consistency of interaction patterns and product descriptions, there are currently no consistent and unified standards for describing smart objects and their offered interaction possibilities in ambient spaces. Because of this, users have to repeatedly remember how to interact with such devices [24]. This recurring state of knowledge between beginner and expert in interacting with a device is called the perpetual intermediate [6, p. 42].

Our previous work on reviewing existing documentation-related tools for NUIs revealed four general observations or shortcomings, namely the lack of widely adopted tools for NUI designers, the absence of dedicated NUI documentation tools, the lack of end-user support, and the lack of support and consideration of body movements and postures as part of the interaction descriptions (if found at all). Furthermore, the review revealed that there is a lack of formalized languages and notations for generic motion documentation [1, 3]. For the previously mentioned reasons, and potentially more, people turn to other learning approaches and methods. Trial and error is a very common practice used to unveil adequate system interactions. However, it is not necessarily the most effective approach in many cases. This was the subject of many research studies in the area of safety and critical environments. A study has revealed that 70% of surgeons and 50% of nurses demonstrated problems dealing with medical devices in operating theaters [12], where 40% of the respondents indicated that ignorance of adequate operation guidelines for medical devices has resulted in repeatedly occurring hazards. In a previous study [1], the majority of respondents to a questionnaire (more than 90%) relied on trial and error to learn interaction techniques of personal interactive devices (e.g., smart phones, interactive TVs, handhelds, and game consoles). This can be due to the limited range and simplicity of interaction features currently available in the users' commodity devices (e.g., swipe, shake, and pan). Nonetheless, there is strong evidence that learning and memorizing interaction techniques will become more complex due to the vast growth of multi-touch- and motion-based interactions in terms of, but not limited to, the number of interactions proposed, the increasing complexity of interaction techniques, the expanding diversity of interaction types, involved body parts, involved actions, and runtime ensembles of interaction techniques [7, 9, 13, 2]. This clearly advocates the need for reference documentation of interaction techniques as a necessity and an aid for users [19]. In fact, interactivity in ambient spaces is becoming increasingly dynamic (interaction environments are becoming increasingly heterogeneous and dynamic and are no longer static and closed [23]), adaptive (required for sustainable utility and usability), and multi-modal. Hence, interactive ambient spaces are created in an ad-hoc manner, where multiple interaction techniques are grouped together to adapt the available interaction resources and possibilities to the user's physical context and abilities. This shift towards an evolving world of interactivity (smart spaces, user mobility, anthropomorphic abilities and disabilities, preferences, etc.) requires new dissemination, deployment, and adaptation mechanisms for NUIs. For these reasons, documentation for training, demonstration, and reference purposes plays a major role in setting the limits and boundaries for NUI deployment and adoption in interactive ambient spaces.
A FRAMEWORK FOR AMBIENT REFLECTION

Figure 1. Ambient Reflection Framework

In order to offer a possibility to compensate for the previously mentioned emerging problems in interaction and documentation, we strive to develop a three-part framework for Ambient Reflection as an integral component of reflective systems' self-x properties [21, p. 322 et seq.]. By providing this framework as a feasible solution, we foster the multi-modal self-description of (interconnected) devices regarding their interaction possibilities. In total, our envisioned framework consists of three building blocks, namely an Ambient Reflective Documentation Language, Documentation Fusion and Presentation Oriented Publishing. In the following paragraphs, these components are described in detail.
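As a minimal sketch of how the three building blocks could interact, the following Python listing composes them into an in-situ pipeline. All names (describe, fuse, publish) and data shapes are invented for illustration; they do not represent an existing implementation of the framework.

    from typing import Dict, List

    def describe(device_id: str) -> Dict:
        """Ambient Reflective Documentation: a presentation-neutral description."""
        return {"device": device_id, "interactions": []}

    def fuse(entities: List[Dict], context: Dict) -> Dict:
        """Documentation Fusion: merge the entities of one device ensemble."""
        return {"ensemble": [e["device"] for e in entities], "context": context}

    def publish(manual: Dict, modality: str) -> str:
        """Presentation Oriented Publishing: render for one concrete modality."""
        return f"[{modality}] manual covering: " + ", ".join(manual["ensemble"])

    ensemble = [describe("smart-object-1"), describe("smart-object-2")]
    manual = fuse(ensemble, context={"user": "prefers large text"})
    print(publish(manual, modality="html"))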
Ambient Reflective Documentation Language
Current possibilities for technical documentation are limited to unstandardized media entities, i.e., each device is described in different modalities using different types of media in various formats. Due to this diversity, automated processing is not possible. In order to enable such processing, a unified, extensible documentation language for ambient spaces should be provided, covering a structured description of devices' specifications and interaction possibilities at a high granularity (further referred to as the micro-level). Moreover, a documentation's content should be decoupled from its presentation in order to achieve more flexibility for further processing, which has already been done successfully (e.g., by [29]). This approach of describing Smart Objects in a presentation-neutral way may guarantee distributivity, extensibility and further presentation-oriented processing.
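To illustrate what such a micro-level, presentation-neutral description could look like, the following sketch encodes one hypothetical gesture-controlled lamp. The field names are assumptions made purely for illustration; the concrete language and its schema remain future work.

    import json

    # Hypothetical micro-level description of a single Smart Object.
    lamp_documentation = {
        "device": {"id": "living-room-lamp", "category": "lighting"},
        "capabilities": ["power", "brightness"],
        "interactions": [
            {
                "name": "toggle-power",
                "modality": "mid-air gesture",
                "movement": {
                    "body_parts": ["right arm"],
                    "direction": "upwards",
                    "duration_ms": 800,
                },
                "effect": "switches the lamp on or off",
            }
        ],
    }

    # Content stays decoupled from presentation: the same structure could be
    # serialized to XML (cf. DocBook [29]), JSON or another carrier format.
    print(json.dumps(lamp_documentation, indent=2))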
Documentation Fusion
Given the assumption that the relevant devices in an ambient space scenario are described by an ambient reflective documentation, documentation fusion can take place. In the following, a device with access to its remotely located or attached documentation is called a documentation entity. Taking the current context and environmental state into account, the fusion step performs in-situ processing and merging of distributed documentation entities. As a result, it generates a presentation-neutral, adaptive ambient space manual for interconnected ensembles of Smart Objects in ubiquitous and pervasive environments, considering only the involved devices and interaction possibilities.

Distributing the generated material to a dedicated coordinating engine will enable a guidance system to offer further instructions regarding the interaction of device ensembles at the time the user needs or asks for support. Nonetheless, the fusion step might be skipped in order to provide single-device interaction guidance as well.
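The fusion step could be sketched as follows, under the assumption that every involved device either attaches its documentation or links to a remotely hosted copy. All names, structures and the example URL are invented for illustration and do not prescribe an API.

    from typing import Dict, List

    def resolve(entity: Dict) -> Dict:
        """Return attached documentation, or a stub for a remote reference."""
        if "documentation" in entity:
            return entity["documentation"]
        # A real system would dereference entity["documentation_url"] here.
        return {"device": entity["id"], "interactions": []}

    def fuse(ensemble: List[Dict], context: Dict) -> Dict:
        """Merge only the documentation of the devices actually involved."""
        involved = [e for e in ensemble if e["id"] in context["involved_devices"]]
        return {
            "kind": "presentation-neutral ambient manual",
            "devices": [e["id"] for e in involved],
            "sections": [resolve(e) for e in involved],
        }

    manual = fuse(
        [{"id": "thermostat", "documentation": {"device": "thermostat"}},
         {"id": "lamp", "documentation_url": "http://lamp.local/docs"}],
        context={"involved_devices": ["thermostat", "lamp"]},
    )
    print(manual["devices"])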
Presentation Oriented Publishing
Using concepts of presentation oriented publishing for markup languages [10] adds an additional abstraction layer between the generated manual and the final presentation of instructions to the user. The previously mentioned decoupling properties of presentation-neutral documentation facilitate the translation of the documentation language into other renderable languages. Furthermore, the inclusion of a standardized style description for the translated language, concerning presentational aspects, offers a possible solution for adaptivity in presentation. Finally, the fused documentation should be deployed to appropriate rendering devices. For example, a fused documentation might be translated into the Scalable Vector Graphics format using different color sets for color-blind users and visualized by an internet browser's rendering engine.

EXEMPLARY SCENARIO
In order to further illustrate the framework's working process, the following scenario is assumed (see Figure 1): A user resides in an environment containing n Smart Objects. Since a device ensemble consisting of Smart Objects 1 and 2 was built, the user needs guidance in its usage. Object 1 is already described using the Ambient Reflective Documentation Language, whereas Object 2 still needs to be described. Therefore, an adapter is used to translate the existing documentation, written by the manufacturer, into the documentation language. It is likewise conceivable that an ambient reflective documentation is remotely located, while the device just provides its destination (as done by Smart Object n). The Fusion Engine fetches and merges the documentation of the involved objects and applies a stylesheet based on the user's preferences. Finally, the generated manual is deployed to rendering devices 2 and k and thereby delivered to the user. It should be noted that a rendering device might be equal to a Smart Object in the environment and thus might also be documented. Hence, a set of documented rendering devices might form a device ensemble with other Smart Objects and are thus by definition documentation entities.
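The scenario could be traced in code roughly as follows. Names such as adapt_manufacturer_docs, apply_stylesheet and RenderingDevice are hypothetical helpers introduced only for this sketch; they are not components of an existing system.

    from typing import Dict

    def adapt_manufacturer_docs(raw_docs: str, device_id: str) -> Dict:
        """Adapter translating existing manufacturer documentation."""
        return {"device": device_id, "interactions": [], "source": raw_docs}

    def apply_stylesheet(manual: Dict, preferences: Dict) -> Dict:
        """Attach presentational hints, e.g. a color-blind safe palette."""
        return {**manual, "style": preferences}

    class RenderingDevice:
        def __init__(self, name: str) -> None:
            self.name = name

        def render(self, styled_manual: Dict) -> None:
            print(f"{self.name} renders a manual for {styled_manual['devices']}")

    # Object 1 is already described, Object 2 is translated by the adapter,
    # and further objects could merely point to a remotely hosted description.
    object_1 = {"device": "smart-object-1", "interactions": []}
    object_2 = adapt_manufacturer_docs("vendor manual text", "smart-object-2")

    manual = {"devices": ["smart-object-1", "smart-object-2"],
              "sections": [object_1, object_2]}
    styled = apply_stylesheet(manual, {"palette": "color-blind safe"})

    for target in (RenderingDevice("rendering-device-2"),
                   RenderingDevice("rendering-device-k")):
        target.render(styled)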
CONCLUSION
The current developments in the areas of IoT, AAL and Smart Home face users with new challenges in terms of Human Computer Interaction. Devices and their functionality are progressively interconnected and embedded, and new interaction techniques within the scope of Natural Interaction arise. As a result of the ongoing disappearance of user interfaces as well as the emerging usage of gesture control, the current concept of affordances may not apply to these developments. Existing difficulties in HCI will increase in the areas of ubiquitous and pervasive computing, caused by the environments' high complexity, heterogeneity and dynamics. Beyond this, present technical documentation neither follows a common pattern nor is adapted to the user's needs. As a possible solution to tackle these problems, we presented the approach of a conceptual ambient reflection framework consisting of three major components: an Ambient Reflective Documentation Language for describing interaction possibilities of Smart Objects on a micro-level, Documentation Fusion using the description language to merge documentation entities of interconnected devices in order to generate an ambient manual tailored to the user's context and needs, as well as Presentation Oriented Publishing for multi-modal rendering of the manual in-situ. In total, we strongly believe that emerging interaction problems in ambient space scenarios can be counteracted by further investigating this framework.

FUTURE WORK
Next, we aim to carry out a study to identify different contexts and, therefore, the needs of users with respect to documentation and guidance in interaction. Based on these findings and further research regarding description languages, we will develop a unified Ambient Reflective Documentation Language for Smart Objects and apply it to a representative set of smart devices composed of different device categories. In addition, we will try to determine a set of generic rules and processes in order to achieve a consistent manual generated by the fusion engine. Upon this, the development of interweaving style descriptions and documentation and the delivery to rendering devices should enable Smart Objects to describe themselves. Finally, we plan to evaluate our framework by carrying out a scenario-based evaluation to determine the precision of the fusion itself as well as the usefulness of our provided guidance for the user.

REFERENCES
1. Altakrouri, B. Ambient Assisted Living with Dynamic Interaction Ensembles. PhD thesis, University of Lübeck, Department of Computer Sciences/Engineering, published by Zentrale Hochschulbibliothek Lübeck, August 2014.
2. Altakrouri, B., and Schrader, A. Towards dynamic natural interaction ensembles. In Fourth International Workshop on Physicality (Physicality 2012), co-located with the British HCI 2012 conference, A. Dix, D. Ramduny-Ellis, and S. Gill, Eds. (Birmingham, UK, September 2012).
3. Altakrouri, B., and Schrader, A. Describing movements for motion gestures. In 1st International Workshop on Engineering Gestures for Multimodal Interfaces (EGMI 2014) at the sixth ACM SIGCHI Symposium on Engineering Interactive Computing Systems (EICS 2014) (Rome, Italy, June 2014).
4. BITKOM Bundesverband Informationswirtschaft, Telekommunikation und neue Medien e. V. Leitfaden zur Heimvernetzung, Band 2, 2011.
5. Bongers, B. Interacting with the disappeared computer. In Mobile HCI, Physical Interaction Workshop on Real World User Interfaces (Udine, Italy, 2003).
6. Cooper, A., Reimann, R., and Cronin, D. About Face 3: The Essentials of Interaction Design. Wiley, 2007.
7. England, D. Whole body interaction: An introduction. In Whole Body Interaction. Springer, 2011, 1–5.
8. Estrada-Martinez, P. E., and Garcia-Macias, J. A. Semantic interactions in the internet of things. International Journal of Ad Hoc and Ubiquitous Computing 13, 3 (2013), 167–175.
9. Fogtmann, M. H., Fritsch, J., and Kortbek, K. J. Kinesthetic interaction: revealing the bodily potential in interaction design. In Proceedings of the 20th Australasian Conference on Computer-Human Interaction: Designing for Habitus and Habitat, ACM (2008), 89–96.
10. Goldfarb, C. F., and Prescod, P. The XML Handbook. Prentice-Hall, Upper Saddle River, New Jersey, 1998.
11. Leitner, G., Hitz, M., Fercher, A. J., and Brown, J. N. A. Aspekte der Human Computer Interaction im Smart Home. HMD – Praxis der Wirtschaftsinformatik 294 (2013).
12. Matern, U., Koneczny, S., Scherrer, M., and Gerlings, T. Arbeitsbedingungen und Sicherheit am Arbeitsplatz OP. Deutsches Ärzteblatt 103, 47 (November 2006), A 3187–3192.
13. Navarre, D., Palanque, P., Ladry, J.-F., and Barboni, E. ICOs: A model-based user interface description technique dedicated to interactive systems addressing usability, reliability and scalability. ACM Transactions on Computer-Human Interaction (TOCHI) 16, 4 (2009), 18.
14. Norman, D. A. Affordance, conventions, and design. interactions 6, 3 (1999), 38–43.
15. Norman, D. A. The Design of Everyday Things. Basic Books, 2002.
16. Norman, D. A. Simplicity is not the answer. interactions 15, 5 (2008), 45–46.
17. Norman, D. A. Living with Complexity. MIT Press, 2010.
18. Norman, D. A., and Nielsen, J. Gestural interfaces: A step backward in usability. interactions 17, 5 (September 2010), 46–49.
19. Pham, D. T., Dimov, S., and Setchi, R. Intelligent product manuals. Proceedings of the Institution of Mechanical Engineers, Part I: Journal of Systems and Control Engineering 213, 1 (1999), 65–76.
20. Poppe, R., Rienks, R., and van Dijk, B. Evaluating the future of HCI: challenges for the evaluation of emerging applications. In Artificial Intelligence for Human Computing. Springer, 2007, 234–250.
21. Poslad, S. Ubiquitous Computing: Smart Devices, Environments and Interactions. Wiley, 2009.
22. Preim, B., and Dachselt, R. Die Interaktion mit Alltagsgeräten. In Interaktive Systeme. Springer, 2010, 135–161.
23. Pruvost, G., Heinroth, T., Bellik, Y., and Minker, W. User interaction adaptation within ambient environments. In Next Generation Intelligent Environments. Springer, 2011, 153–194.
24. Quesenbery, W. Balancing the 5Es of usability. Cutter IT Journal 17, 2 (2004), 4–11.
25. RWE Effizienz GmbH. Wendepunkte der Energiewirtschaft. Online, 2014. http://www.rwe.com/app/Pressecenter/Download.aspx?pmid=4012118&datei=1.
26. Streitz, N., Prante, T., Röcker, C., Van Alphen, D., Stenzel, R., Magerkurth, C., Lahlou, S., Nosulenko, V., Jegou, F., Sonder, F., et al. Smart artefacts as affordances for awareness in distributed teams. In The Disappearing Computer: Interaction Design, System Infrastructures and Applications for Smart Environments. Springer (2007), 3–29.
27. Thimbleby, H., and Addison, M. Intelligent adaptive assistance and its automatic generation. Interacting with Computers 8, 1 (1996), 51–68.
28. van der Vlist, B. J., Niezen, G., Hu, J., and Feijs, L. M. Semantic connections: Exploring and manipulating connections in smart spaces. In Proceedings of the 15th IEEE Symposium on Computers and Communications, ISCC 2010, IEEE (June 2010), 1–4.
29. Walsh, N. DocBook 5: The Definitive Guide. Online, 2010. http://docbook.org/tdg5/.
30. Zandanel, A. Users and households appliances: Design suggestions for a better, sustainable interaction. In Proceedings of the 9th ACM SIGCHI Italian Chapter International Conference on Computer-Human Interaction: Facing Complexity, CHItaly, ACM (New York, NY, USA, 2011), 96–100.