Towards an Information-Theoretic Framework for Quantifying Wayfinding Information in Virtual Environments

Rohit K. Dubey¹, Mubbasir Kapadia², Tyler Thrash¹, Victor R. Schinazi¹, and Christoph Hoelscher¹
¹ETH-Zurich, ²Rutgers University
dubey@arch.ethz.ch

Abstract

Signage systems are critical for communicating environmental information. Signage that is visible and properly located can assist individuals in making efficient navigation decisions during wayfinding. Drawing upon concepts from information theory, we propose a framework to quantify the wayfinding information available in a virtual environment. Towards this end, we calculate and visualize the uncertainty in the information available to agents for individual signs. In addition, we expand on the influence of new signs on the overall information (e.g., joint entropy, conditional entropy, and mutual information). The proposed framework can serve as the backbone for an evaluation tool that helps architects analyze the efficiency of a signage system during different stages of the design process.

1 Introduction

Information theory is the branch of mathematics used to describe how uncertainty can be quantified, manipulated, and represented [Ghahramani, 2007]. According to Shannon and Weaver [Shannon and Weaver, 1949], a communication system can be characterized with respect to this uncertainty and the information being transmitted. Modeling the exchange of information within a system essentially involves representations of uncertainty and can facilitate an understanding of the behavior and properties of its individual elements. This approach has been applied to several different types of systems, including those from computer science, philosophy, physics, and cognitive science [Smyth and Goodman, 1992; Floridi, 2002; Still, 2009; Resnik, 1996]. For example, information may be transferred within and between internal and external representations of space [Craik and Masani, 1967; Montello et al., 2004]. In the present work, we propose an information-theoretic approach to quantify the spatial information provided by individual signs or sets of signs for wayfinding in a virtual environment.

Navigation is a process by which internal spatial representations are obtained. In turn, these representations act as the basis of future navigation decisions. In unfamiliar environments (i.e., before any initial representation), individuals often have to rely on knowledge that is immediately available in the environment [Golledge, 1999]. However, this relevant information must be separated from irrelevant information and noise. There are a variety of visual cues in the environment that can help individuals find their way (e.g., signage, maps, landmarks, and building structure) [Montello, 2005; Arthur and Passini, 1992]. Signs may be particularly easy to interpret (i.e., they require less abstraction than a map), to adapt (i.e., they can accommodate changes to the environment), and to quantify (i.e., they allow for the measurement of relevant information). An efficient signage system can drastically reduce the complexity of the built environment and improve wayfinding. In contrast, an inefficient signage system or a lack of signage can render a simple built space complex and stressful for patrons. Indeed, signs are typically used to guide unfamiliar patrons to specific locations within shopping centers and airports [Becker-Asano et al., 2014]. An efficient signage system can drastically reduce the navigational complexity of such environments, but this reduction in complexity or uncertainty (i.e., information) is not always evident to the architect using existing measures (e.g., space syntax; [Hillier and Hanson, 1984]). This is critical for both leisurely shopping trips and emergency evacuations, in which the consequences of inefficient signage can range from getting lost to becoming injured. Individual signs must be placed at an optimal height and be sufficiently salient in different lighting conditions [Jouellette, 1988]. In addition, different signs within a wayfinding system must continuously provide complementary information rather than information that conflicts with other elements and confuses users. Hence, the foundation of such a signage design tool should be grounded in research that investigates human perception and cognition.
In this paper, we first review previous research on information theory, human wayfinding, and signage systems. Next, we introduce our framework for quantifying information and uncertainty for systems of signs in complex virtual environments and apply these measures to signs within a virtual airport. The results are discussed with respect to the development of a novel application to aid architects in the (re)design of real environments.

2 Background and Prior Work

This section briefly describes Shannon's basic measures of information and summarizes research on navigation and signage.

2.1 Information theory

Information theory was developed in the 1940s and 1950s as a framework for studying the fundamental questions of the communication process, the efficiency of information representation, and the limits of reliable communication [Atick, 1992; Shannon and Weaver, 1949]. In information theory, entropy represents the amount of uncertainty in a random variable as a probability distribution. The Shannon entropy of a discrete random variable X with alphabet \chi and probability mass function p(x), x \in \chi, is defined as

H(X) = -\sum_{x \in \chi} p(x) \log_2 p(x)    (1)

The probability of x satisfies p(x) \in [0, 1], and -\log_2 p(x) represents the information associated with a single occurrence of x. Entropy is always non-negative and represents the average number of bits required to describe the random variable. Entropy is a measure of information such that higher entropy represents more information.
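As a concrete illustration of Equation 1 (our addition, not code from the original framework), the following Python sketch computes the Shannon entropy of a discrete probability mass function; the function name and the example distributions are ours.

    import math

    def shannon_entropy(pmf):
        # Equation 1: H(X) = -sum p(x) * log2 p(x), in bits.
        # pmf is a list of probabilities summing to 1; terms with p = 0
        # contribute nothing (0 * log 0 is taken as 0).
        return -sum(p * math.log2(p) for p in pmf if p > 0.0)

    # A sign that is equally likely to be perceived or missed carries 1 bit
    # of uncertainty; a sign that is almost always perceived carries less.
    print(shannon_entropy([0.5, 0.5]))   # 1.0
    print(shannon_entropy([0.9, 0.1]))   # ~0.469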
There are several measures that combine information from two random variables [Cover and Thomas, 1991]. Joint entropy represents the information provided by either one random variable or a second random variable in a system. The joint entropy of a pair of random variables (X, Y) with a joint probability distribution p(x, y) can be represented as

H(X, Y) = -\sum_{x \in \chi} \sum_{y \in \gamma} p(x, y) \log_2 p(x, y)    (2)

In addition, conditional entropy H(X|Y) is the entropy of one random variable given knowledge of a second random variable. The average decrease in the randomness of X given Y is the average information that Y provides about X. The conditional entropy of a pair of random variables (X, Y) with a joint probability distribution p(x, y) can be represented as

H(X \mid Y) = \sum_{x \in \chi} \sum_{y \in \gamma} p(x, y) \log_2 \frac{p(y)}{p(x, y)}    (3)

Mutual information quantifies the amount of information shared by the two random variables. The mutual information I(X; Y) between the random variables X and Y is given as

I(X; Y) = \sum_{x \in \chi} \sum_{y \in \gamma} p(x, y) \log_2 \frac{p(x, y)}{p(x)\, p(y)}    (4)

In Section 3, we will describe how the aforementioned measures can be used to quantify the information provided by two (or more) signs in an environment.

2.2 Navigation and signage

In an unfamiliar environment, navigation largely depends on picking up ecologically relevant information (i.e., affordances; [Gibson, 1977]). In such cases, optic flow can be used to guide locomotion by distinguishing between self-movement (relevant for navigation) and object movement [Fajen, 2013]. Signs are also capable of providing ecologically relevant information but need to be visible and interpretable [Becker-Asano et al., 2014]. Indeed, Norman [1988] differentiates between knowledge in the world (e.g., information presented on signs) and knowledge in the head (e.g., the interpretation of signs).

The communication of wayfinding information from the world to the individual observer can be facilitated by an appropriate signage system or map [Arthur and Passini, 1990; Allen, 1999]. Previous research has demonstrated that signage has distinct advantages over maps for navigating the built environment [Holscher et al., 2007]. O'Neill [1991] also found that textual signage led to a reduction in incorrect turns and an overall increase in wayfinding efficiency compared to graphic signage (i.e., an arrow; see also [Wener and Kaminoff, 1983]). Both simulations and user experiments have suggested that the focus of visual attention can be improved with signage redesigns [Becker-Asano et al., 2014; Buechner et al., 2012]. In addition, signage can improve simulated evacuation times [Xie et al., 2012] and reduce the perception of crowding and its negative effects [Wener and Kaminoff, 1983].

Visual Catchment Area. One existing way of quantifying the visibility of a sign is its visual catchment area (VCA; [Galea et al., 2001]). The VCA represents the area from which a sign is visible when the observer faces the sign. Xie and colleagues [Xie et al., 2007] consider the visibility of a sign as a binary value: the sign is visible within the VCA and not visible outside of it. The VCA of a sign is calculated using the location of the sign, the heights of the occupant and the sign above the floor, the viewing angle, and the maximum distance from which the sign can be seen based on the size of its lettering. According to the National Fire Protection Association (NFPA) Life Safety Code Handbook, signs with a lettering height of 152 mm are legible for up to 30 m [NFPA et al., 1997]. The boundary of a sign's VCA is described by

\left( \frac{b}{\sin o} \right)^2 = x^2 + \left( y - \frac{b}{\tan o} \right)^2    (5)

Here, o is the angular separation between the sign and the viewer, b is half of the width of the sign's surface, and (x, y) is the viewer's location, as shown in Figure 1. The VCA is therefore a circle centered at (0, b / \tan o) with a radius of b / \sin o.

Figure 1: The circular visual catchment area (VCA) of a sign A. The sign is considered visible if an occupant is standing inside the circular area located at the tangent to the surface of the sign.
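The following Python sketch (our illustration, not code from the cited work) applies Equation 5 as a binary VCA test under an assumed coordinate frame in which the sign is centred at the origin; the function name, the parameter names, and the additional cap at the NFPA legibility distance are ours.

    import math

    def inside_vca(x, y, half_width_b, view_angle_o, max_legible_dist=30.0):
        # Binary VCA test based on Equation 5 (our reading of the geometry).
        # The sign is centred at the origin with half-width b; o is the
        # threshold viewing angle in radians. The VCA boundary is the circle
        # of radius b/sin(o) centred at (0, b/tan(o)); as an extra assumption
        # we also cap visibility at the NFPA legibility distance
        # (30 m for 152 mm lettering).
        radius = half_width_b / math.sin(view_angle_o)
        cy = half_width_b / math.tan(view_angle_o)
        in_circle = x**2 + (y - cy)**2 <= radius**2
        within_legible_range = math.hypot(x, y) <= max_legible_dist
        return in_circle and within_legible_range

    print(inside_vca(1.0, 2.0, half_width_b=0.5, view_angle_o=math.radians(20)))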
3 Proposed Framework

In this section, we apply the principles of information theory to quantify the information provided by signage in a virtual environment. While there are a number of physical and psychological factors that influence the effectiveness of signage systems (e.g., color and contrast, interpretability, and attentiveness), we focus exclusively on signage visibility. Rather than considering sign visibility as a binary value [Xie et al., 2007], we model sign visibility as a continuous function that depends on the distance and direction from the observer. We model the entropy of a sign's visible information P(l, s) as a measure of the navigation-relevant information that is available to an agent at location l from sign s. Let X(l, s_a) be a random variable that represents a particular piece of information at a location l and sign s_a. The probability of a particular value of X(l, s_a) depends on the distance of sign s_a from location l and the relative angle between location l and sign s_a. The probability distribution is generated by sampling information X from sign s_a at l 1000 times. Based on our experiments, we found 1000 samples to provide a reasonable trade-off between the granularity of the calculations and compute time. Further investigation is needed to determine the sensitivity of our calculations to this parameter.

The uncertainty function U(l, s_a) represents the likelihood of viewing information from a sign s_a at location l as

U(l, s_a) = N(\mu, \sigma)    (6)

where N is a normal distribution with mean \mu and standard deviation \sigma. Here, \mu is directly proportional to the distance and relative angle between sign s_a and location l: larger distances and relative angles result in higher values for \mu (i.e., closer to 1), while \sigma represents the range of uncertainty values and is held constant. Specifically, \mu depends on the mean of the normalized distance d_n (over 30 m) and the normalized relative direction r_{an} (over 180 degrees; see Equation 7), where \Delta weights the relative direction term in the sum:

\mu = (d_n + \Delta r_{an}) / 2    (7)

The work in [Filippidis et al., 2006] assumes a relationship between the relative direction of the observer with respect to the sign and the probability of visibility (see Figure 2). We use this relationship in the calculation of \mu (see Equation 7) and add our own assumed relationship between the probability of visibility and the distance between the observer and the sign (see Figure 3).

Figure 2: Proposed hypothetical detection probability according to the orientation between the occupant's travel direction and the sign's directional vector.

Figure 3: Proposed hypothetical detection probability according to the distance between the occupant and the sign.

These two relationships form the basis of I(s_a) (i.e., the actual information contained in sign s_a) and can be combined with Equation 6 to calculate Noise:

P(l, s_a) = \mathrm{Noise}(I(s_a), U(l, s_a))    (8)

We can then substitute P(l, s_a) for p(x) in Shannon's entropy equation (Equation 1) and use Equation 8 to obtain a measure of entropy for a sign from the observer's location.
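Because the Noise(·) operator in Equation 8 is not fully specified here, the following Python sketch should be read only as one possible realisation of Equations 6-8: it computes \mu from Equation 7, draws 1000 samples of the uncertainty U(l, s_a), and stands in a simple symbol-corruption model for Noise(·). The alphabet size, the value of \Delta and \sigma, and all function names are our assumptions.

    import math, random
    from collections import Counter

    ALPHABET = 128          # hypothetical number of distinct wayfinding messages
    TRUE_SYMBOL = 0         # the message actually printed on the sign

    def mu(distance_m, rel_angle_deg, delta=1.0):
        # Equation 7: mean of the normalised distance (over 30 m) and the
        # normalised relative direction (over 180 degrees), weighted by delta.
        d_n = min(distance_m / 30.0, 1.0)
        r_an = min(rel_angle_deg / 180.0, 1.0)
        return (d_n + delta * r_an) / 2.0

    def sample_sign(distance_m, rel_angle_deg, sigma=0.05, n_samples=1000):
        # Placeholder reading of Equations 6-8: each sample draws an
        # uncertainty value u from N(mu, sigma) (Equation 6), clips it to
        # [0, 1], and with probability 1 - u the agent reads the sign's true
        # message; otherwise it reads a random symbol (the "noise").
        m = mu(distance_m, rel_angle_deg)
        counts = Counter()
        for _ in range(n_samples):
            u = min(max(random.gauss(m, sigma), 0.0), 1.0)
            symbol = TRUE_SYMBOL if random.random() < 1.0 - u else random.randrange(ALPHABET)
            counts[symbol] += 1
        return {s: c / n_samples for s, c in counts.items()}

    def entropy(pmf):
        # Equation 1 applied to the empirical distribution.
        return -sum(p * math.log2(p) for p in pmf.values() if p > 0.0)

    # Near and head-on -> low mu -> distribution peaked on the true message
    # (low entropy); far and oblique -> high mu -> near-uniform noise (high entropy).
    print(entropy(sample_sign(3.0, 10.0)), entropy(sample_sign(28.0, 150.0)))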
Information measures that describe signage systems can also be extended to combinations of two or more signs. Another random variable Y(l, s_b) can be used to represent the amount of navigation-relevant information available to an agent at location l from a second sign s_b. An uncertainty function U(l, s_{a,b}) can represent the likelihood of viewing information from two signs (s_a and s_b) at location l:

U(l, s_{a,b}) = N((\mu_a + \mu_b)/2, \sigma)    (9)

Finally, the joint probability distribution can be computed by sampling information X from sign s_a and sign s_b at l several times:

P_{a,b}(l, s_{a,b}) = \mathrm{Noise}(I(s_a), U(l, s_{a,b}))    (10)

For all locations from which both signs are visible, we can calculate joint entropy, conditional entropy, and mutual information. The joint entropy of the visible information from signs s_a and s_b refers to the amount of information contained in either of the two random variables X and Y. For two mutually independent variables X and Y (i.e., when viewing one sign provides no information about viewing the other), joint entropy is the sum of the individual entropies H(X) and H(Y) for each sign. When the two variables are not mutually independent, the joint entropy H(X, Y) can be calculated using Equation 2, in which the joint probability distribution P_{a,b} is defined by Equation 10. In the case of signage, joint entropy indicates the extent to which an observer may navigate from one sign to another towards a goal location.

Conditional entropy is the reduction in uncertainty (i.e., information from sign s_a) due to the presence of another sign s_b, and vice versa. For example, suppose an observer is located between signs s_a and s_b but closer to s_b, and both signs indicate the same destination along the same route. The probability of viewing the information from sign s_a is low because the individual entropy of s_a is high. At the same time, the probability of viewing information from s_b is high because the individual entropy of s_b is low. Because the two random variables are not mutually independent, the conditional entropy of s_a given s_b is lower than the individual entropy of s_a, and the conditional entropy of s_b given s_a is lower than the individual entropy of s_b. In other words, each sign becomes more visible in the presence of the other sign. In addition, the conditional entropy of s_b given s_a is lower than the conditional entropy of s_a given s_b. Because entropy is inversely related to the probability of viewing each sign, s_b is more visible than s_a from this location.

Mutual information measures the correlation of the two random variables X (information from sign s_a) and Y (information from sign s_b). It quantifies the amount of information known about sign A by knowing sign B, and vice versa. Mutual information can be calculated as the difference between the individual entropy of a sign and its conditional entropy given the other sign. Higher mutual information represents higher redundancy, which may result in improvements in navigation performance. However, increases in redundancy may not be linearly related to improvements in navigation performance. The information measures presented here provide one method for estimating the expected increase in performance for each additional sign. In Figure 9, the mutual information can be computed by subtracting the conditional entropy of sign A given sign B from the individual entropy of sign A.
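For locations where both signs are visible, these two-sign measures can be computed directly from the sampled joint distribution. The Python sketch below (our illustration) implements Equations 2-4 via the chain rule for an arbitrary joint probability mass function; the example joint distribution over the messages read from two signs is hypothetical.

    import math

    def entropy(pmf):
        # Equation 1 over any discrete distribution (values must sum to 1).
        return -sum(p * math.log2(p) for p in pmf.values() if p > 0.0)

    def marginal(pxy, axis):
        # Marginal distribution of X (axis=0) or Y (axis=1).
        out = {}
        for key, p in pxy.items():
            out[key[axis]] = out.get(key[axis], 0.0) + p
        return out

    def joint_entropy(pxy):
        # Equation 2: H(X, Y) = -sum p(x, y) log2 p(x, y).
        return entropy(pxy)

    def conditional_entropy(pxy):
        # Equation 3 via the chain rule: H(X | Y) = H(X, Y) - H(Y).
        return joint_entropy(pxy) - entropy(marginal(pxy, 1))

    def mutual_information(pxy):
        # Equation 4: I(X; Y) = H(X) + H(Y) - H(X, Y).
        return entropy(marginal(pxy, 0)) + entropy(marginal(pxy, 1)) - joint_entropy(pxy)

    # Hypothetical joint distribution over (message read from sign A,
    # message read from sign B) at one grid cell where the two signs are correlated.
    pxy = {(1, 1): 0.45, (1, 0): 0.15, (0, 1): 0.15, (0, 0): 0.25}
    print(joint_entropy(pxy), conditional_entropy(pxy), mutual_information(pxy))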
4 Experiments and Results

In this section, we use a simplified building information model (BIM) of a virtual airport (Figure 4) to illustrate the information-theoretic approach to signage systems. The model was created using Autodesk Revit [Revit, 2012] and then imported into Unity 3D [Unity3D, 2016] to perform simulations. In this example, the 3D model of the building includes two signs placed at different locations and pointing towards a common destination (represented by A and B in Figure 4). The walkable region of the floor of the 3D built environment was divided into n square grid cells of 0.5 m x 0.5 m. This grid cell size approximates the space occupied by an average human.

Figure 4: Top-down view of a virtual airport. Letters A and B indicate the locations of two signs in a section of the airport.

Figure 5(a) illustrates the probabilities of viewing the wayfinding information provided by sign A from several locations (represented as density maps). These uncertainties are based on the distance and relative direction of each grid cell from the sign according to the functions in Figures 2 and 3 as well as Equations 6, 7, and 8. Agents at grid cells in a lighter shade of gray have a higher probability of viewing the information provided by the sign than agents at grid cells in darker shades of gray. Agents at grid cells at greater distances or larger relative directions from the sign have smaller probabilities of viewing the information than agents at grid cells at smaller distances or relative directions. Black grid cells do not provide any information to an agent. Figures 5(b) and (c) show the first-person views of an agent at a location with high information (the yellow star in Figure 5(a)) and an agent at a location with low information (the yellow circle in Figure 5(a)). The text on the signage is visible from the high-information location because the distance between the agent and the signage is small and the relative angle between them is acute, which results in a lower entropy value and a higher probability of perceiving the information. This is not the case for the location outside the VCA (the yellow circle in Figure 5(a)): there, the relative angle between the agent and the sign is large, which creates more noise and increases the entropy of visibility. We show the same effect for sign B in Figures 6(a), (b), and (c).

Figure 5: (a) Visualization of the probability of perceiving the information from sign A. The yellow star and circle represent two locations where information is high and low, respectively, with respect to sign A. (b) First-person view from the yellow star, where information is high with respect to sign A. (c) First-person view from the yellow circle, where information is low with respect to sign A.

Figure 6: (a) Visualization of the probability of perceiving the information from sign B. The yellow star and circle represent two locations where information is high and low, respectively, with respect to sign B. (b) First-person view from the yellow star, where information is high with respect to sign B. (c) First-person view from the yellow circle, where information is low with respect to sign B.
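A density map such as the one in Figure 5(a) can be approximated by evaluating a per-cell viewing probability over the 0.5 m grid. The following Python sketch is a simplified stand-in for that step: it ignores occlusion from the BIM geometry and uses 1 - \mu (Equation 7) as a placeholder for the detection-probability curves of Figures 2 and 3; the grid dimensions, sign position, and facing-angle convention are our assumptions.

    import math

    CELL = 0.5   # grid cell size in metres (approximate footprint of a person)

    def view_probability(distance_m, rel_angle_deg, delta=1.0):
        # 1 - mu from Equation 7: a simple stand-in for the hypothetical
        # detection probabilities sketched in Figures 2 and 3.
        d_n = min(distance_m / 30.0, 1.0)
        r_an = min(rel_angle_deg / 180.0, 1.0)
        return 1.0 - (d_n + delta * r_an) / 2.0

    def probability_map(width_m, depth_m, sign_xy, sign_facing_deg):
        # Per-cell probability of perceiving one sign over a rectangular floor.
        # A real implementation would first mask out non-walkable cells and
        # occluded lines of sight using the BIM/Unity geometry.
        sx, sy = sign_xy
        rows = []
        for j in range(int(depth_m / CELL)):
            row = []
            for i in range(int(width_m / CELL)):
                x, y = (i + 0.5) * CELL, (j + 0.5) * CELL
                dist = math.hypot(x - sx, y - sy)
                bearing = math.degrees(math.atan2(y - sy, x - sx))
                rel_angle = abs((bearing - sign_facing_deg + 180) % 360 - 180)
                row.append(0.0 if dist > 30.0 else view_probability(dist, rel_angle))
            rows.append(row)
        return rows

    grid = probability_map(20.0, 10.0, sign_xy=(10.0, 0.0), sign_facing_deg=90.0)
    print(len(grid), len(grid[0]), max(max(r) for r in grid))   # 20 rows x 40 columns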
Figure 7 illustrates the individual and joint probabilities of viewing the information provided by signs A and B from each grid cell. For the common grid cells, the probability of viewing information from both signs (i.e., mutual information) is higher than the probability of viewing the information provided by either sign in isolation.

Figure 8 demonstrates the relationship between the information viewed from sign A, the distance of the observer from sign A, and the relative direction of the observer from sign A. An increase in entropy indicates higher uncertainty in viewing the information provided by sign A. Finally, mutual information can also be visualized in Figure 9 as the difference between the individual and conditional entropies. Here, the green line represents the individual entropy of sign A, and the black line represents the conditional entropy of sign A given sign B.

Figure 7: Visualization of the probability of perceiving the information from signs A and B and the mutual information (lighter grids) between them.

Figure 8: Entropy of the information perceived at varying distances and relative angles from sign A.

Figure 9: Entropy of the information perceived from sign A (in green) and conditional entropy of sign A given sign B (in black), plotted in bits over the common grid cells. The difference between the two curves captures the mutual information between the information from sign A and sign B.

5 Conclusions and Future Work

The quantification of the information provided by a system of signs can be beneficial to architects attempting to improve the navigation of building patrons. This approach may be particularly useful for buildings that are especially complex and require redesigns of signage. For this paper, we adapted Shannon's entropy measures in order to study the information provided by two signs in a 3D virtual environment. We then visualized these entropy measures in a way that is understandable for practitioners, including architects and engineers. The benefit of using a gaming engine (Unity 3D) is that these visualizations can be dynamically updated during navigation.

For simplicity, we have focused on the distance and relative direction between the observer and one or two signs. Future work will extend this framework to include additional physical and psychological factors (e.g., color and contrast, interpretability, and attentiveness) and additional signs (i.e., more than two). The former addition will provide the groundwork for a cognitively inspired, agent-based model of navigation behavior. Additional signs will allow us to address some of the difficulties associated with decomposing complex systems in terms of information theory (see [Griffith and Koch, 2014; Griffith and Ho, 2015]).

We also plan to further inform this framework with at least two empirical studies with human participants in virtual reality. The first study will investigate the relationship between the visibility of a sign and its distance and relative direction at a finer granularity than previous work. The second study will test different signage systems with respect to their effect on wayfinding behavior in complex virtual buildings. Together, these studies will provide the necessary groundwork for incorporating research on human perception and cognition into evidence-based design.
References

[Allen, 1999] Gary L. Allen. Spatial abilities, cognitive maps, and wayfinding. Wayfinding Behavior: Cognitive Mapping and Other Spatial Processes, pages 46–80, 1999.

[Arthur and Passini, 1990] P. Arthur and R. Passini. 1-2-3 evaluation and design guide to wayfinding. Public Works Canada, Technical Report, 1990.

[Arthur and Passini, 1992] Paul Arthur and Romedi Passini. Wayfinding: People, Signs, and Architecture. McGraw-Hill, 1992.

[Atick, 1992] Joseph J. Atick. Could information theory provide an ecological theory of sensory processing? Network: Computation in Neural Systems, 3(2):213–251, 1992.

[Becker-Asano et al., 2014] Christian Becker-Asano, Felix Ruzzoli, Christoph Hölscher, and Bernhard Nebel. A multi-agent system based on Unity 4 for virtual perception and wayfinding. Transportation Research Procedia, 2:452–455, 2014.

[Buechner et al., 2012] Simon J. Buechner, Jan Wiener, and Christoph Hölscher. Methodological triangulation to assess sign placement. In Proceedings of the Symposium on Eye Tracking Research and Applications, pages 185–188. ACM, 2012.

[Cover and Thomas, 1991] Thomas M. Cover and Joy A. Thomas. Entropy, relative entropy and mutual information. Elements of Information Theory, 2:1–55, 1991.

[Craik and Masani, 1967] F. I. M. Craik and P. A. Masani. Age differences in the temporal integration of language. British Journal of Psychology, 58(3-4):291–299, 1967.

[Fajen, 2013] Brett R. Fajen. Guiding locomotion in complex, dynamic environments. Frontiers in Behavioral Neuroscience, 7:85, 2013.

[Filippidis et al., 2006] Lazaros Filippidis, Edwin R. Galea, Steve Gwynne, and Peter J. Lawrence. Representing the influence of signage on evacuation behavior within an evacuation model. Journal of Fire Protection Engineering, 16(1):37–73, 2006.

[Floridi, 2002] Luciano Floridi. What is the philosophy of information? Metaphilosophy, 33(1-2):123–145, 2002.

[Galea et al., 2001] E. Galea, L. Filippidis, P. Lawrence, S. Gwynne, et al. Visibility catchment area of exits and signs, volume 2. Interscience Communications Ltd., 2001.

[Ghahramani, 2007] Zoubin Ghahramani. Entropy and mutual information. 2007.

[Gibson, 1977] James J. Gibson. Perceiving, acting, and knowing: Toward an ecological psychology. The Theory of Affordances, pages 67–82, 1977.

[Golledge, 1999] Reginald G. Golledge. Human wayfinding and cognitive maps. Wayfinding Behavior: Cognitive Mapping and Other Spatial Processes, pages 5–45, 1999.

[Griffith and Ho, 2015] Virgil Griffith and Tracey Ho. Quantifying redundant information in predicting a target random variable. Entropy, 17(7):4644–4653, 2015.

[Griffith and Koch, 2014] Virgil Griffith and Christof Koch. Quantifying synergistic mutual information. In Guided Self-Organization: Inception, pages 159–190. Springer, 2014.

[Hillier and Hanson, 1984] Bill Hillier and Julienne Hanson. The Social Logic of Space. Press Syndicate of the University of Cambridge, Cambridge, 1984.

[Holscher et al., 2007] Christoph Holscher, Simon J. Buchner, Martin Brosamle, Tobias Meilinger, and Gerhard Strube. Signs and maps – cognitive economy in the use of external aids for indoor navigation. In Proceedings of the Cognitive Science Society, volume 29, pages 377–382, 2007.

[Jouellette, 1988] Michael Jouellette. Exit signs in smoke: design parameters for greater visibility. Lighting Research & Technology, 20(4):155–160, 1988.

[Montello et al., 2004] Daniel R. Montello, David Waller, Mary Hegarty, and Anthony E. Richardson. Spatial memory of real environments, virtual environments, and maps. Human Spatial Memory: Remembering Where, pages 251–285, 2004.

[Montello, 2005] Daniel R. Montello. Navigation. Cambridge University Press, 2005.

[NFPA et al., 1997] National Fire Protection Association et al. Life Safety Code Handbook. NFPA, Quincy, 1997.

[Norman, 1988] Donald A. Norman. The Psychology of Everyday Things. Basic Books, 1988.

[O'Neill, 1991] Michael J. O'Neill. Effects of signage and floor plan configuration on wayfinding accuracy. Environment and Behavior, 23(5):553–574, 1991.

[Resnik, 1996] Philip Resnik. Selectional constraints: An information-theoretic model and its computational realization. Cognition, 61(1):127–159, 1996.

[Revit, 2012] Revit, 2012. https://www.autodesk.com/products/revit-family/overview.

[Shannon and Weaver, 1949] Claude E. Shannon and Warren Weaver. The Mathematical Theory of Communication. Urbana, 1949.

[Smyth and Goodman, 1992] Padhraic Smyth and Rodney M. Goodman. An information theoretic approach to rule induction from databases. IEEE Transactions on Knowledge and Data Engineering, 4(4):301–316, 1992.

[Still, 2009] Susanne Still. Information-theoretic approach to interactive learning. EPL (Europhysics Letters), 85(2):28005, 2009.

[Unity3D, 2016] Unity3D, 2016. https://unity3d.com/.

[Wener and Kaminoff, 1983] Richard E. Wener and Robert D. Kaminoff. Improving environmental information: Effects of signs on perceived crowding and behavior. Environment and Behavior, 15(1):3–20, 1983.

[Xie et al., 2007] H. Xie, L. Filippidis, S. Gwynne, E. R. Galea, D. Blackshields, and P. J. Lawrence. Signage legibility distances as a function of observation angle. Journal of Fire Protection Engineering, 17(1):41–64, 2007.

[Xie et al., 2012] Hui Xie, Lazaros Filippidis, Edwin R. Galea, Darren Blackshields, and Peter J. Lawrence. Experimental analysis of the effectiveness of emergency signage and its implementation in evacuation simulation. Fire and Materials, 36(5-6):367–382, 2012.