Modiquitous 2011 Proceedings

Ubiquitous Alignment

Florian Haag, Michael Raschke
Visualization and Interactive Systems Institute (VIS)
University of Stuttgart
Universitätsstraße 38, 70569 Stuttgart, Germany
{haag, raschke}@vis.uni-stuttgart.de

Thomas Schlegel
Institute of Software and Multimedia Technology
Technische Universität Dresden
Nöthnitzer Straße 46, 01062 Dresden, Germany
Thomas.Schlegel@tu-dresden.de

ABSTRACT
Ubiquitous thinking means designing human-computer interaction in a fundamentally new way. The perceived distance between ubiquitous systems and their users decreases while the heterogeneity of modalities increases compared to classical human-computer interaction. The initiation of work phases becomes less formalized: instead of explicitly declaring the start of an interaction by activating a computing device, the interaction starts gradually and sometimes implicitly, based on an estimation of the user's needs. To bridge this gap of initiation, we present the ubiquitous interaction concept "Ubiquitous Alignment" in this article. It comprises the three steps recognition, sparking interest and start of collaboration. The Ubiquitous Alignment concept is based on a comparison between traditional human-computer and human-human interaction. Finally, two examples show the applicability of the Ubiquitous Alignment concept.

Keywords
Ubiquitous systems, interaction concepts, mixed initiative, interaction initiation, Ubiquitous Alignment

INTRODUCTION
Traditionally, the boundaries of human-computer interaction are clearly defined. The interface to the computer is defined by peripheral input and output devices and does not extend beyond them. Work with a computing device starts after it has been switched on by the user. When the work is done, the user switches the device off again. This applies to desktop computer systems just as much as to any other technical device. From a more semantic point of view, the interaction starts when the user turns toward the terminal or concentrates on it in any way, and it stops when the user leaves the workplace or concentrates on something else.

The next step in the development of computing systems is ubiquitous computing – an environment where computing devices are, often seamlessly, integrated into everyday objects and activities in such a way that users do not need to be aware of them in order to interact [23]. There are two important aspects of how to approach that objective.

One is the integration of computational capabilities into objects that are usually used for non-computing purposes. This can refer to both fixed and portable objects. In the case of fixed objects, the computing equipment can, for example, be built into buildings or parts thereof. Whole buildings may be equipped with linked computers and sensors for specific goals, such as minimizing energy consumption [19], or for general-purpose support in a variety of tasks performed by the people within the building [11]. Likewise, parts of buildings, such as the floor [14] or doorplates [20], can be enhanced with ubiquitous computing technology. Computing and sensor devices in portable objects can take the form of so-called smart furniture [12] or wearable computing [18], amongst others. They can be used for similar tasks or even be linked with devices embedded in fixed objects using wireless networks.

The other important aspect to consider in ubiquitous interfaces is how the interaction begins. The necessity to use specific computing devices should be avoided, which is achieved by integrating computing equipment into everyday objects. Still, computer-specific tasks such as activating a device or looking at (after possibly walking to) a display to gather some information pose an obstacle to a natural interaction with the systems [13]. In particular, the devices should act proactively in certain situations while still appearing unobtrusive. For this purpose, we introduce a concept of how interaction between a human user and a system in a ubiquitous computing environment can be initiated. This refers both to the first contact with the ubiquitous technology and to later, single interaction sessions. The goal of this concept is a description of how the system gradually approaches the user and gets his or her attention. System and user align themselves to each other in order to communicate and collaborate without any obstacles. Therefore, we refer to this concept as Ubiquitous Alignment.

After discussing related work, we will first analyze how interaction between humans and other humans (human-human interaction) as well as between humans and computers (human-computer interaction) usually starts, then highlight differences between the two situations. Subsequently, we will describe our concept of Ubiquitous Alignment, explain where it differs from human-computer interaction and identify parallels with human-human interaction. Finally, we will present two example scenarios in which the concept can be applied.

F. Haag, M. Raschke, T. Schlegel: Ubiquitous Alignment. Proc. of 1st International Workshop on Model-based Interactive Ubiquitous System 2011, Pisa, Italy, June 13, 2011, http://ceur-ws.org/Vol-787
RELATED WORK
There has been a large amount of work on providing computing devices with additional sensors to perceive an arbitrary range of signals from the outside world. To name only a few, sensor systems to detect human means of expression, such as pointing at something with the hand [8] or showing different facial expressions [24], are being developed. Sensors for other contextual parameters, such as the room temperature [17], are also being integrated into computer systems. To further improve the evaluation of data received from sensors, some research tries to recognize or model human emotions, which may give systems a better understanding of the intentions of users [4][16]. On a wider scale, Pantic and Rothkrantz present their ideas of how a great number of sensors can enable multiple modalities of input [15]. Similarly, the concept of Perceptual User Interfaces aims for a natural interaction between users and devices [21]. The EasyLiving project focuses on the technical side of coupling a variety of sensors and other devices to form a complete system [2]. In a general treatment of ubiquitous computing, Rhodes points out some design objectives for wearable computing [18] that are also useful in other types of ubiquitous systems. Works such as the classroom-related scenarios described by Bravo et al. assume that the system is already there and do not take into account a phase during which users get acquainted with and used to it [1].

The behavior of user interfaces that sometimes act proactively and sometimes leave the control to the user is called mixed initiative. There has been much research on this topic over the course of the past few decades, focusing on ways to achieve and employ mixed initiative [10][22][9]. With its close resemblance to human-human interaction, mixed-initiative systems sometimes aim at generating a verbal dialogue between user and system which works the same way as a conversation between people [6]. Chu-Carroll and Brown distinguish dialogue and task initiative, which allows for a more accurate dialogue model, as it distinguishes which interaction partner is guiding the current interaction and which one is deciding what will be done [3].

TRADITIONAL INTERACTION STYLES
This section describes two examples of starting an interaction, between humans and other humans and between humans and today's computers, respectively. Both examples are chosen such that the participants of the interaction do not have any prior knowledge about each other or are not yet collaborating. Thus, the situations are analogous to the interaction between a user and ubiquitous devices as described further below.

Human-Human Interaction
As an example of human-human interaction, we have chosen a customer in a self-service store and a sales assistant. This scenario was selected because it closely resembles a situation where ubiquitous technology might come into play. Basically, the customer could manage well without any additional help. Generally, for a comfortable shopping experience, the sales assistant should not behave in a pushy way by insisting on helping the customer against his wish. The assistant may, however, indicate that she is available and ready to help if help is required, and the customer may decide on his own to start a more thorough interaction.

Initially, the customer is examining the items on a shelf, reading the information given on the labels and the price tags. The sales assistant is waiting nearby. In order not to appear obtrusive, she should not wait right next to the customer or in front of the shelf, as this might make the customer feel controlled. Still, it is important not to express disinterest or lack of attention. This can be achieved by displaying an initial sign of responsiveness, such as greeting the customer when he enters the store, or by explicitly offering help when the customer has been browsing the products for a while.

At the latest when the customer has picked up some items and placed them in his shopping cart, the sales assistant may carefully indicate that she is willing to start a conversation. This can be expressed by a single casual remark about one of the products or by pointing out an alternative. At this point, it is important that the information given is not among that which the customer has likely already seen. Instead, he may be pointed to a feature that is not evident from the labels or to an item that is not currently located on the same shelf. In this way, the customer will perceive the assistant as helpful rather than as merely reiterating known facts.

If the customer desires to receive more information afterward, he will ask the assistant. The assistant can inform the customer about what information she is able to provide, while the customer gets a feeling for how reliable the information received from that assistant is.

Human-Computer Interaction
Due to the lack of proactivity in most of today's software, the gradual start of an interaction as seen in the example of human-human interaction cannot customarily be found in human-computer interaction. Assuming that the computer is already switched on and the user has logged into her account, she starts the new application she would like to use for the first time.

The application displays a default set of options and commands. Guides to the most important features can be provided, for example on a welcome screen. Nevertheless, the user has to start exploring the interface right away. After the user has taken a few steps in the user interface, the application tries to estimate what she is trying to do and displays hints accordingly. The application can only evaluate the user input and does not possess any additional sensors. Hence, it cannot take into consideration any contextual information about the user and her environment. The estimation of the user's intentions is accordingly imprecise; therefore, the displayed hints occasionally fail to be of any help, which in turn makes the user dissatisfied with the application.

Once the user has gathered some experience with the application, she will actively customize the user interface and create templates and macros for repeating tasks. Due to bad experiences with the automatic input analysis, she might eventually choose to completely disable the automatic adaptation of application behavior. Even though this means some additional effort in that some settings have to be made manually, she values the absence of distracting hints that do not provide any helpful information higher than saving some time by allowing the software to adapt itself automatically.

Comparison
Despite being basically equivalent scenarios of starting an interaction with a previously unknown partner, these two descriptions of human-human and human-computer interaction exhibit substantial differences. First of all, in human-computer interaction the user has to know and launch the application she wants to use. The application is not just there and ready by default, as is the case with the sales assistant. By launching that application, the computer user also explicitly declares the start of the interaction, as opposed to the gradual process found in the interaction between the customer and the sales assistant.

Fig. 1. User and ubiquitous system gradually intensify their interaction in the three steps of the Ubiquitous Alignment concept: at first, they barely know of each other; then they start to interact and intensify that interaction further on.

As mentioned above, the lack of variety of input channels available to the system results in a lack of knowledge about the overall behavior and context of the user. Thus, any estimation of the current intentions of the user can at most be a rough guess. Accordingly, helpful clues can only be given based on experience with average users or by trying to find repeating patterns in the behavior of the current user. The same applies to input interfaces such as menus: even though some software manufacturers have attempted to automatically restructure menus, a user study suggests that any such change is likely to confuse the user rather than support her [5]. That missing additional information about the user is, however, available to the assistant concerning her customer, as she can see and consider where the customer is located and what he is doing. Thus, she is also capable of quite reliably assessing the customer's current intentions and wishes. This enables her to take over or give away the initiative in the interaction process at the right time. The computer application is not able to provide this degree of mixed initiative in the described scenario.
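One way to read this comparison is as a design rule: proactive hints should only be surfaced when the system's estimate of the user's intention is reliable, and it is contextual sensor data that makes such an estimate reliable. The following is a minimal hypothetical sketch of that gating rule; the class, function names, penalty factor and 0.8 threshold are our own illustrative assumptions, not part of any system described in this paper.

```python
from dataclasses import dataclass

@dataclass
class IntentEstimate:
    """A guessed user intention with a confidence in [0, 1]."""
    intent: str
    confidence: float

def should_offer_hint(estimate: IntentEstimate,
                      has_sensor_context: bool,
                      threshold: float = 0.8) -> bool:
    """Surface a proactive hint only when the estimate is reliable."""
    effective = estimate.confidence
    if not has_sensor_context:
        # Input-only estimates count for less: the system shows
        # fewer, but more reliable, hints.
        effective *= 0.5
    return effective >= threshold

# The same raw guess is suppressed without sensor context
# (0.9 * 0.5 = 0.45 < 0.8) but shown with it (0.9 >= 0.8).
print(should_offer_hint(IntentEstimate("find_item", 0.9), False))  # False
print(should_offer_hint(IntentEstimate("find_item", 0.9), True))   # True
```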
UBIQUITOUS ALIGNMENT
In order to make human-computer interaction more like human-human interaction, one can take advantage of the special capabilities of ubiquitous computing technology. The additional data gathered by the sensors in a ubiquitous computing environment allows for a more natural initiation of collaboration between users and systems [2].

The Ubiquitous Alignment concept assumes that a user is going to perform a particular task in an environment equipped with ubiquitous computing devices. The user does not yet have sufficient knowledge about those devices to explicitly trigger any operations. He may or may not be aware that his environment is equipped with ubiquitous computing systems at all. In order to achieve collaboration, the three steps recognition, sparking interest and start of collaboration are performed (cf. Fig. 1).

Step 1: Recognition
The user recognizes the system in a pleasant rather than a pushy way. A pleasant ubiquitous system remains in the background until the user wishes to start an interaction. In this phase, the system is still largely ignorant of what the user intends to do. This matches the behavior of the assistant from the human-human interaction example, who remains passive; any other behavior might annoy or scare away the other person or the user, respectively. The ubiquitous system visibly exhibits a certain level of proactivity only when it is absolutely certain that an intervention is desired by the user. Otherwise it remains largely invisible, except for some unobtrusive hints that it is there and ready.

Any other operations the ubiquitous system performs go unnoticed by the user, who gets the impression that he is just using everyday objects. Automatic locks that secure the lids of boxes unless the user actually attempts to open them, the adaptation of fridge power to cool down newly inserted warm items, or the temporary dimming of lights while the user is not in the room do not require any active input. That is also why no new interface concepts need to be considered at this point: the ubiquitous devices do not have any new input controls and can be used just like their non-ubiquitous counterparts.

Step 2: Sparking Interest
User and system begin to communicate with each other. As the user finds out what the system is or is not capable of, the system output at this point must particularly strive for high reliability. This concerns both the information provided and the estimations made. This step corresponds to the customer becoming acquainted with the sales assistant and vice versa.

In order not to appear overzealous by communicating with the user where no communication is desired, a good strategy is to continue giving small hints of the presence and features of the system, just as the sales assistant will try to be supportive without flooding the customer with information. In particular, those hints should spark the interest of the user/customer and motivate him or her to find out what kind of support can be obtained. At the same time, the ubiquitous devices may be able to catch some clues as to how the user behaves or reacts and what kind of output inspires him to further interact with the system, just as the sales assistant will adapt her behavior to suit the customer's preferences to a certain degree.

Step 3: Start of Collaboration
After the computer system has been recognized by the user and he has indicated that he is willing to collaborate with it, the system can become more active. As the user has become interested in the system, he is likely to try to explore further capabilities of the ubiquitous devices. This behavior can be encouraged by facilitating the exploration process. Amongst others, options related to the current operation that have not yet been employed by the user can be recommended to him. Likewise, any means of discovering and learning about unknown system features must be easy to find. A human sales assistant will, too, in a comparable situation, express what information she is able to provide to the customer.

While interacting with the user, a system that follows the Ubiquitous Alignment approach has gathered and is still gathering an increasing amount of information about the user and his behavior. This allows for better estimates of the current intentions of the user and thus provides the system with the means to support the user in an optimized way.

How Ubiquitous Alignment Helps Improve HCI
Ubiquitous Alignment reflects the gradual process used to establish contact between two humans with the goal of collaboration. These parallels hold true both for situations where the actors do not have any prior knowledge about each other and for cases where they do. In the former case, the steps explained serve for the initial contact between two strangers, just as for the initial contact between a future user and a network of ubiquitous devices. In the latter case, the participating persons already know each other, so the objective is collaboration on a given task. The actors do not yet know whether the collaboration will actually turn out to be beneficial, which is why they use the same approach of gradually initiating their interaction. Likewise, a user might already know some parts of a ubiquitous system but not yet be sure whether the system is helpful with a new kind of task. At the same time, the system should not behave in a paternalistic way and insist on collaboration in this particular new task just because the user makes frequent use of the system on other occasions.

To sum up, the main advantages of ubiquitous computing systems over traditional computing systems at approximating human-human interaction are their greater variety of input channels and their integration into everyday objects. The additional input channels in the form of a variety of sensors allow for a more accurate and complete perception and evaluation of the user, his behavior and his context. By integrating system parts into appliances previously known to the user, the handling of the ubiquitous system does not have to be learned right from the start; instead, some features can be used by manipulating the appliances the usual way, so the prospective user can gradually extend his knowledge to encompass the additional system features that require special input.

POSSIBLE APPLICATIONS OF THE UBIQUITOUS ALIGNMENT CONCEPT
To underline that the Ubiquitous Alignment concept can indeed be used in ubiquitous computing scenarios, we describe two example scenarios in which it is applied.

One example of a ubiquitous system that uses mobile computing devices is the ActiveClass system described by Griswold et al. [7]. ActiveClass allows students' mobile devices to connect to a central component while in a lecture hall. Using the ActiveClass system, students can publicly and anonymously ask questions. Without it, both the size of the lecture hall and the lack of anonymity may pose obstacles to actually asking questions. When applying the Ubiquitous Alignment approach to a situation where a student does not yet know the ActiveClass system, the first step might present unobtrusive hints about the system. For example, the student's mobile device might display an access icon for the ActiveClass system in the main menu while the student is attending a lecture. In the second step, ActiveClass might display a button for posting a question whenever the student starts searching for an explanation of something which is being discussed by the lecturer right then. In the third step of Ubiquitous Alignment, which starts once the student has begun to actively use ActiveClass, the system provides access to its options menu, where the other features such as polls, class feedback and votes can be found.
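The phase progression described in the three steps above can be modeled as a small state machine. The sketch below is hypothetical: the phase names follow the paper's three steps, but the user signals driving the transitions and the back-off rule are our own illustrative assumptions.

```python
from enum import Enum, auto

class Phase(Enum):
    RECOGNITION = auto()        # step 1: unobtrusive background presence
    SPARKING_INTEREST = auto()  # step 2: small, highly reliable hints
    COLLABORATION = auto()      # step 3: active, optimized support

class AlignmentSession:
    """Tracks which Ubiquitous Alignment phase an interaction is in."""

    def __init__(self) -> None:
        self.phase = Phase.RECOGNITION

    def observe(self, signal: str) -> None:
        """Advance or retreat based on an observed user signal."""
        if signal == "ignored_repeatedly":
            # Back off: the system never escalates against the user's wishes.
            self.phase = Phase.RECOGNITION
        elif self.phase is Phase.RECOGNITION and signal == "noticed_system":
            self.phase = Phase.SPARKING_INTEREST
        elif self.phase is Phase.SPARKING_INTEREST and signal == "responded_to_hint":
            self.phase = Phase.COLLABORATION

session = AlignmentSession()
session.observe("noticed_system")
session.observe("responded_to_hint")
print(session.phase)  # Phase.COLLABORATION
```

The point of the one-way escalation plus the unconditional back-off transition is exactly the asymmetry the concept calls for: the system only intensifies the interaction in response to positive user signals, but withdraws on any sign of rejection.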
In the second example, we consider a table that is aware of what objects are placed on the tabletop (cf. Fig. 2). This awareness can be achieved through a variety of means, such as load detection, image analysis or tracking of object locations (assuming that each object is tagged in some way), or a combination thereof. In addition, some means of tracking what the user is doing is available. The knowledge about object positions can be used to guide a user by indicating where on the table to find a particular item. This can be used for workbenches or interactive cookbooks, to name only two examples. In the first step of the Ubiquitous Alignment approach, the user may be using the table just as a table, placing objects on top of it. The system performs its minimal default function, inserting positional indicators such as "on the left" or "next to ..." into the instructions for the user. Only when the user keeps searching for something for a longer time does the system clarify its output, providing more information in an additional message. If the user responds by locating objects faster based on those hints, step two of the Ubiquitous Alignment concept has the system highlight any references to objects in the displayed instructions (or, in the case of voice output, make clear in some other way what is highlighted in the text), pointing the user to the possibility of finding out more about the respective items. Eventually, in the third step, the system may display additional information right away as the user requires it, and offer some options to modify how much and what kinds of information the user wants displayed about objects referred to in the work instructions.

These examples show how the concept of Ubiquitous Alignment can be applied to scenarios where a user starts getting to know a ubiquitous system or one of its features. In all described scenarios and examples, the user had a certain resistance to using the system, or at least was not assumed to spend a lot of initial effort on learning how to use the system. This is where Ubiquitous Alignment is particularly beneficial. Users who take the time to read a manual first do not require the same degree of gradual initiation of interaction. Nonetheless, striving for a display of reliability towards that kind of user, and not annoying them with frequent messages or other possibly undesired output, retain their importance.
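The table scenario's minimal default function, deriving indicators such as "on the left" or "next to ..." from tracked object positions, could look like the following sketch. The coordinate convention, thresholds and wording are illustrative assumptions on our part, not part of any implementation described here.

```python
def positional_hint(target: tuple[float, float],
                    reference: tuple[float, float],
                    near: float = 15.0) -> str:
    """Turn two tracked tabletop positions (in cm, seen from the
    user's side of the table) into a positional indicator."""
    dx = target[0] - reference[0]
    dy = target[1] - reference[1]
    # Close together: a relative phrase is more helpful than a direction.
    if (dx * dx + dy * dy) ** 0.5 <= near:
        return "next to"
    # Otherwise pick the dominant axis of displacement.
    if abs(dx) >= abs(dy):
        return "on the left" if dx < 0 else "on the right"
    return "behind" if dy > 0 else "in front"

print(positional_hint((10.0, 12.0), (14.0, 10.0)))  # next to
print(positional_hint((-40.0, 0.0), (30.0, 5.0)))   # on the left
```

In step one, only the returned phrase would be inserted into the instructions; the later steps would attach progressively more detail to the same reference.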
CONCLUSION AND FUTURE WORK
In this work, we have examined some exemplary situations of human-human and human-computer interaction. In an effort to make human-computer interaction more akin to human-human interaction, we have described the Ubiquitous Alignment concept. It defines how collaboration between a human user and a computer system can be initiated in a way that closely resembles the interaction between humans, taking advantage of the possibilities found in ubiquitous computing devices. As seen in the comparisons of the Ubiquitous Alignment approach with the previous examples of human-human and human-computer interaction, our approach bears a strong resemblance to the former. The main reasons for the differences were found to be the additional sensor input and, similarly, the additional input modalities, which can fully match the normal manipulation of everyday objects, as opposed to handling specialized devices such as mice or keyboards to provide input to traditional computers.

Fig. 2. Ubiquitous devices embedded into a kitchen allow for displaying the current instruction from a recipe in relation to the current state and location of ingredients on the table where the food is being prepared. Depending on the Ubiquitous Alignment phase in which the interaction is taking place, additional information is displayed: hints about the position in step 1, an explicit offer to provide more information in step 2, and further suggestions in step 3.

As this work presents a concept of how a ubiquitous system should behave, a future goal is the implementation of this concept. Thereby, we hope to show how the Ubiquitous Alignment concept works in practice and how it can be implemented in detail. We also expect this to be a starting point for defining processes for the development of ubiquitous software components and for gaining a better understanding of the user's behavior. A model system will not need to incorporate all of the described attributes; with the incorporation of additional sensors, the system could gradually come closer to the ideal form of the Ubiquitous Alignment concept.

ACKNOWLEDGMENTS
This research was funded through the IP-KOM-ÖV project (German Ministry of Economy and Technology (BMWi) grant number 19P10003N). We would also like to thank our students Thomas Bach, Steffen Bold and David Kruzic.

REFERENCES
1. Bravo, J., Hervás, R., and Chavira, G. Ubiquitous Computing in the Classroom: An Approach through Identification Process. Journal of Universal Computer Science, 11, 9 (2005), 1494-1504.
2. Brumitt, B., Meyers, B., Krumm, J., Kern, A., and Shafer, S. EasyLiving: Technologies for Intelligent Environments. Lecture Notes in Computer Science, Springer Berlin/Heidelberg, 2000.
3. Chu-Carroll, J. and Brown, M. K. An Evidential Model for Tracking Initiative in Collaborative Dialogue Interactions. User Modeling and User-Adapted Interaction, 8, 3 (1998), 215-254.
4. Cowie, R., Douglas-Cowie, E., Tsapatsoulis, N., Votsis, G., Kollias, S., Fellenz, W., and Taylor, J. G. Emotion recognition in human-computer interaction. IEEE Signal Processing Magazine, 18, 1 (Jan 2001), 32-80.
5. Findlater, L. and McGrenere, J. A comparison of static, adaptive, and adaptable menus. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (Vienna, Austria, 2004), ACM, 89-96.
6. Graesser, A. C., Chipman, P., Haynes, B. C., and Olney, A. AutoTutor: an intelligent tutoring system with mixed-initiative dialogue. IEEE Transactions on Education, 48, 4 (Nov 2005), 612-618.
7. Griswold, W. G., Shanahan, P., Brown, S. W., Boyer, R., Ratto, M., Shapiro, R. B., and Truong, T. M. ActiveCampus: experiments in community-oriented ubiquitous computing. Computer, 37, 10 (Oct 2004), 73-81.
8. Guan, Y. and Zheng, M. Real-time 3D pointing gesture recognition for natural HCI. In 7th World Congress on Intelligent Control and Automation (2008), 2433-2436.
9. Hearst, M. A., Allen, J. F., Guinn, C. I., and Horvitz, E. Mixed-initiative interaction. IEEE Intelligent Systems and their Applications, 14, 5 (Sept/Oct 1999), 14-23.
10. Horvitz, E. Principles of mixed-initiative user interfaces. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems: the CHI is the Limit (Pittsburgh, Pennsylvania, USA, 1999), ACM, 159-166.
11. Intille, S. S. Designing a Home of the Future. IEEE Pervasive Computing, April-June (2002), 76-82.
12. Ito, M., Iwaya, A., Saito, M., et al. Smart Furniture: Improvising Ubiquitous Hot-Spot Environment. In International Conference on Distributed Computing Systems Workshops (2003), 248.
13. Ley, D. Ubiquitous Computing. Emerging Technologies for Learning, 2 (2007).
14. Orr, R. J. and Abowd, G. D. The smart floor: a mechanism for natural user identification and tracking. In CHI '00 Extended Abstracts on Human Factors in Computing Systems (The Hague, Netherlands, 2000), ACM, 275-276.
15. Pantic, M. and Rothkrantz, L. J. M. Toward an affect-sensitive multimodal human-computer interaction. Proceedings of the IEEE, 91, 9 (Sept 2003), 1370-1390.
16. Picard, R. W. Affective Computing. MIT Press, 1997.
17. Ranganathan, A. and Campbell, R. H. A middleware for context-aware agents in ubiquitous computing environments. In Proceedings of the ACM/IFIP/USENIX 2003 International Conference on Middleware (Rio de Janeiro, Brazil, 2003), 143-161.
18. Rhodes, B. J. The wearable remembrance agent: A system for augmented memory. Personal and Ubiquitous Computing, 1, 4 (1997), 218-224.
19. Schor, L., Sommer, P., and Wattenhofer, R. Towards a zero-configuration wireless sensor network architecture for smart buildings. In Proceedings of the First ACM Workshop on Embedded Sensing Systems for Energy-Efficiency in Buildings (Berkeley, California, USA, 2009), ACM, 31-36.
20. Trumler, W., Bagci, F., Petzold, J., and Ungerer, T. Smart doorplate. Personal and Ubiquitous Computing, 7, 3-4 (July 2003), 221-226.
21. Turk, M. and Robertson, G. Perceptual user interfaces (introduction). Communications of the ACM, 43, 3 (March 2000), 32-34.
22. Walker, M. and Whittaker, S. Mixed initiative in dialogue: an investigation into discourse segmentation. In Proceedings of the 28th Annual Meeting of the Association for Computational Linguistics (Pittsburgh, Pennsylvania, USA, 1990), ACL, 70-78.
23. Weiser, M. Hot topics-ubiquitous computing. Computer, 26, 10 (Oct 1993), 71-72.
24. Zhang, Z., Lyons, M., Schuster, M., and Akamatsu, S. Comparison between geometry-based and Gabor-wavelets-based facial expression recognition using multi-layer perceptron. In Third IEEE International Conference on Automatic Face and Gesture Recognition (1998), 454-459.