Capturing and Linking Human Sensor Observations with YouSense

Tomi Kauppinen1,2, Evgenia Litvinova1, and Jan Kallenbach1

1 Department of Media Technology, Aalto University School of Science, Finland
evgenia.litvinova@aalto.fi, jan.kallenbach@aalto.fi
2 Cognitive Systems Group, University of Bremen, Germany
tomi.kauppinen@aalto.fi

Abstract. Semantic technologies are well suited for gathering human sensor observations. Linked Data supports sharing and accessing not just data but also the vocabularies describing the data. Human sensor observations are often a combination of natural language and categorizable entries, and thus call for semantic treatment. Space and time serve as natural integrators of data, in addition to concepts. In this paper we demonstrate the YouSense tool, which supports gathering experiences about spaces (such as generic buildings or office spaces). Our contribution also includes a vocabulary for describing the experiences as RDF, and tools for visualizing and making sense of the gathered user experiences.

1 Introduction

Understanding how people experience their surroundings (such as office spaces or conference venues) supports further developing spaces and modifying them to meet user needs. This Human Sensor Web approach is rather different from monitoring just technical parameters such as indoor temperature, humidity, or CO2 concentration, which are typically used to assess the performance of buildings.

The promise of Linked Data and semantic technologies in this setting is that they can offer radically novel ways to share and access these human sensor observations. The crucial tasks are to figure out how to describe the experiences of people in a machine-processable way, and how to gather the observations. Finally, Information Visualization is a useful step in understanding whether the gathered data has a story to tell [1,2].

In this paper we demonstrate and present YouSense, a mobile and web application for supporting the sensing of spaces by people.
We also present the EXPERIENCE vocabulary, a Linked Data compatible collection of terms enabling the creation of RDF about the gathered experiences. The resulting history data provides evidence about comfortable and problematic spaces for both users of buildings and building managers. Our contribution also includes a set of visualization tools for making sense of the human sensor observations and their thematic, spatial and temporal characteristics.

The paper is structured as follows. Section 2 presents the tool for gathering user experiences and discusses the vocabulary for encoding them. Section 3 demonstrates the use of the gathered data and outlines questions one can ask with the system. Section 4 provides a future work research agenda and concluding remarks.

2 Gathering Observations for Semantic Descriptions

YouSense3 is a web application that can be used from both mobile and desktop browsers. In YouSense, a user composes a sentence describing how he or she feels in a space (see Figure 1 for an example). The structure of the sentence is defined with the help of the EXPERIENCE4 Vocabulary. EXPERIENCE is a lightweight vocabulary providing terms for creating descriptions of experiences (as instances of Experience) about the environment, for example how cold or warm one feels in a particular context created by the location, time, and activities an Experiencer is involved with. It thus supports describing user experiences and feelings.

Fig. 1. YouSense in action.

The structure of EXPERIENCE was designed to include the things people generally use when they describe situations in spaces. We conducted a diary study in which people were asked to report their experiences about spaces in free form. By analyzing the results of the user study we designed the set of terms to be included in EXPERIENCE.
For instance, while we supposed people would like to relate the experiences to spaces and times, it was rather surprising that they also wanted to guess the reason for an experience.

3 Demonstration also available online at http://yousense.aalto.fi
4 http://linkedearth.org/experience/ns/

Below is a simple example use of the EXPERIENCE Vocabulary, in line with Figure 1 but completed. In this example a person (Laura) has the experiences (VeryCold) and (FreshAir) in her office (Room3156) in June 2014 while performing a certain activity (Sitting). The observer has also communicated a (possible) reason ("Window is open") for the experience (VeryCold). We also gather the action that the experiencer plans to do next (Work).

@prefix experience:  <http://linkedearth.org/experience/ns/> .
@prefix dbpedia:     <http://dbpedia.org/resource/> .
# The example: and feelingdata: namespace URIs below are illustrative placeholders.
@prefix example:     <http://example.org/experience-demo/> .
@prefix feelingdata: <http://example.org/feelingdata/> .

example:exampleExperience_34 a experience:Experience ;
    experience:hasExperiencer     feelingdata:Laura ;
    experience:hasEventExperience feelingdata:VeryCold, feelingdata:FreshAir ;
    experience:hasLocation        feelingdata:Room3156 ;
    experience:hasTime            "2014-06-20T10:00+02:00" ;
    experience:hasActivity        dbpedia:Sitting ;
    experience:hasReason          "Window is open" ;
    experience:hasFollowingAction dbpedia:Work .

3 Experimenting with YouSense in Concrete Use Cases

The sensemaking part of YouSense creates diagrams for each of the message parts (reasons, locations, times, people, spaces, activities) with reasoning support (partonomy, concept hierarchy, temporal abstraction). For instance, the adjectives people reported for certain experiences (such as cold or warm) about spaces support the understanding of those spaces. Figure 2 depicts the approach by presenting a set of experiences about spaces as a bubble visualization. The experiences are aggregated into positive (green) and negative (red) ones. The floor plan visualization on the right shows a heat map of these aggregated experiences by room.
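The aggregation behind the bubble and heat map visualizations can be sketched with a few lines of code. This is a minimal stdlib-only illustration, not the actual YouSense implementation; the polarity lexicon, room identifiers, and experience terms are assumptions chosen for the example.

```python
from collections import Counter

# Hypothetical polarity lexicon: which experience terms count as
# positive (green) and which as negative (red) in the heat map.
POLARITY = {
    "FreshAir": "positive", "Comfortable": "positive",
    "VeryCold": "negative", "Stuffy": "negative",
}

def aggregate_by_room(observations):
    """Count positive/negative experiences per room.

    observations: iterable of (room, experience_term) pairs.
    Returns {room: Counter({'positive': n, 'negative': m})}.
    """
    rooms = {}
    for room, term in observations:
        polarity = POLARITY.get(term)
        if polarity is None:
            continue  # skip terms with no polarity assigned
        rooms.setdefault(room, Counter())[polarity] += 1
    return rooms

data = [("Room3156", "VeryCold"), ("Room3156", "FreshAir"),
        ("Room3158", "Comfortable"), ("Room3158", "Comfortable")]
print(aggregate_by_room(data))
```

Counts of this shape can then drive a room-level color scale, e.g. coloring a room by the ratio of positive to total polarized observations.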
Zooming in on a room shows more detailed information about the collected experiences and the spatial configuration of the room. The idea is to retrieve patterns like "rooms facing the sea generally have more positive experiences than ones facing the inner yard", thus helping to reveal the causes of the experiences.

We have experimented with YouSense in selected spaces at Aalto University to evaluate its usefulness. These spaces include the Design Factory5, the Media Factory6 and the spaces of the Department of Media Technology and the Department of Automation and Systems Technology. According to the experiments and discussions with building managers, the following types of questions arose as the ones that need to be answered by making sense of the gathered data:

– what is the air quality of this space?
– do people feel comfortable in this space?
– do people stay in this space to work and study?
– or do they prefer to work and study somewhere else? where?
– do people need some additional services or activities in this space?
– what kind of things (furniture, games, ...) would people like to have in spaces, like in the lobby area?
– do people feel comfortable in this space and why is it so?

5 http://www.aaltodesignfactory.fi
6 http://mediafactory.aalto.fi

Fig. 2. Example of visualizing user experiences about spaces

4 Conclusions

We argued that gathering user experiences and other human sensor observations is a good use case for Linked Data and semantic technologies. As we demonstrated, the EXPERIENCE vocabulary supports describing user experiences about spaces and linking them to reusable terms from DBpedia. The YouSense app enables gathering experiences via a mobile/web compliant interface and storing them in a queryable triple store. We also illustrated the use of YouSense for supporting the understanding of spaces with visualizations. As we showed, visualizations support getting an overview of the gathered data.
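The pipeline summarized above (structured sentence, then RDF, then triple store) can be sketched programmatically. The following is a minimal stdlib-only sketch of serializing one observation as Turtle, in the shape of the example in Section 2; the example: and feelingdata: namespace URIs are illustrative assumptions, and only the experience: namespace comes from the vocabulary's published URI.

```python
# Minimal sketch: serialize one YouSense-style observation as Turtle.
# The example: and feelingdata: URIs are illustrative placeholders.

PREFIXES = """\
@prefix experience:  <http://linkedearth.org/experience/ns/> .
@prefix dbpedia:     <http://dbpedia.org/resource/> .
@prefix example:     <http://example.org/experience-demo/> .
@prefix feelingdata: <http://example.org/feelingdata/> .
"""

def experience_to_turtle(exp_id, experiencer, events, location,
                         time, activity, reason, next_action):
    """Build a Turtle description of a single Experience instance."""
    event_list = ", ".join(f"feelingdata:{e}" for e in events)
    return (
        f"example:{exp_id} a experience:Experience ;\n"
        f"    experience:hasExperiencer     feelingdata:{experiencer} ;\n"
        f"    experience:hasEventExperience {event_list} ;\n"
        f"    experience:hasLocation        feelingdata:{location} ;\n"
        f'    experience:hasTime            "{time}" ;\n'
        f"    experience:hasActivity        dbpedia:{activity} ;\n"
        f'    experience:hasReason          "{reason}" ;\n'
        f"    experience:hasFollowingAction dbpedia:{next_action} .\n"
    )

turtle = PREFIXES + experience_to_turtle(
    "exampleExperience_34", "Laura", ["VeryCold", "FreshAir"],
    "Room3156", "2014-06-20T10:00+02:00", "Sitting",
    "Window is open", "Work")
print(turtle)
```

A document built this way could then be loaded into the triple store and queried, for example to retrieve all experiences reported for a given room.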
The visualizations also raised a set of questions, partially already answered by the gathered data. There were also interesting new questions which call for answers in the research agenda for the coming months. We are particularly interested in studying what recurring, interesting patterns can be found in observation feeds.

References

1. D. A. Keim. Information visualization and visual data mining. IEEE Transactions on Visualization and Computer Graphics, 8(1):1–8, Jan/Mar 2002.
2. Edward Segel and Jeffrey Heer. Narrative visualization: Telling stories with data. IEEE Transactions on Visualization and Computer Graphics, 16(6):1139–1148, 2010.