International Symposium on Ubiquitous VR 2007

CAMAR Core Platform

Dongpyo Hong and Woontack Woo

Abstract—In this paper, we propose a software architecture for context-aware mobile augmented reality (AR) application development that allows developers to utilize contextual information. In addition, we discuss how contextual information can enhance traditional AR approaches and virtual contents, and the possibilities it opens.

Index Terms—Context-awareness, Mobile AR, Framework

I. INTRODUCTION

As concepts such as Virtual Reality (VR), ubiquitous computing, and context-aware computing have matured, relevant development tools have also emerged. Meanwhile, Augmented Reality is one of the most promising technologies for bringing virtual reality systems into our daily life, because it is lighter-weight than VR. In ubiquitous virtual reality, we are able to interact with any relevant information or contents anytime and anywhere. As a means of realizing ubiquitous virtual reality, we combine context-aware mechanisms with an AR framework.

In this paper, we propose the Context-Aware Mobile AR (CAMAR) Core Platform as a foundational research and development tool that helps AR application developers utilize various contextual cues in their applications. The ultimate goal of the proposed platform is to provide context to AR development tools. In addition, the platform supports context-aware mechanisms such as a context model, selective collaboration, and personalized augmentation.

II. CAMAR CORE PLATFORM

The proposed software architecture mainly consists of two parts: a context-aware framework for a mobile user (i.e., wear-UCAM [1]) and an AR application toolkit based on OpenSceneGraph (i.e., osgART [2]). As shown in Fig. 1, the context-aware part includes a sensor interface, which is a wrapper class, and a service that processes and manages contextual information. In particular, we exploit the ServiceProvider interface, which is also an abstract class, to deliver contextual cues to AR applications. The AR application part, on the other hand, is straightforward because it only includes the ServiceProvider interface in its main rendering and tracking procedures.

Of course, we need a wrapper class for the tracking module other than the one provided by osgART in order to enhance image analysis-based tracking performance with contextual cues. For example, a dynamic threshold is important for recognizing or tracking a marker regardless of lighting conditions. To measure the intensity of lighting, images are usually analyzed frame by frame. However, we can use the intensity value directly if we have a sensor that observes lighting. This is only a simple example of the usefulness of the proposed software architecture.

Fig. 1. CAMAR Core Platform Structure and Context Flows

III. DISCUSSION AND FUTURE WORK

In this paper, we proposed the CAMAR Core Platform, which helps AR application developers easily utilize contextual information in their applications. To show the feasibility of the proposed architecture, however, we should consider the following: how many types of sensors, and how many sensors, we can support; the performance of context processing from sensors to applications, because most AR applications require intensive image processing; the performance of marker tracking; and improvement of rendering. As future work, we will evaluate the proposed software architecture and implement a pragmatic application using the proposed library.

This research is supported by the UCN Project, the MIC 21st Century Frontier R&D Program in Korea.

Dongpyo Hong and Woontack Woo are with GIST U-VR Lab, 500-712, Gwangju, Korea (e-mail: {dhong, wwoo}@gist.ac.kr).

REFERENCES

[1] D. Hong, Y. Suh, A. Choi, U. Rashid, and W. Woo, “wear-UCAM: A toolkit for mobile user interactions in smart environments,” IFIP International Conference on Embedded and Ubiquitous Computing, LNCS 4096:1047–1057, 2006.
[2] osgART, http://www.artoolworks.com/community/osgart/
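As an illustration of the ServiceProvider pattern and the dynamic-threshold example described in Section II, the following C++ sketch shows how a tracking wrapper might pull a light-intensity cue through an abstract provider instead of analyzing each frame. All class and function names here are hypothetical; the paper does not specify the platform's actual API, so this is only a minimal sketch of the idea.

```cpp
#include <algorithm>

// Hypothetical sketch; names are illustrative, not the CAMAR platform's API.

// Abstract interface through which AR code pulls a contextual cue.
class ServiceProvider {
public:
    virtual ~ServiceProvider() = default;
    // Latest value of the contextual cue (e.g., light intensity in [0, 1]).
    virtual double contextValue() const = 0;
};

// Wrapper class around a lighting sensor; here it simply stores a reading.
class LightSensorProvider : public ServiceProvider {
public:
    explicit LightSensorProvider(double intensity) : intensity_(intensity) {}
    void update(double intensity) { intensity_ = intensity; }
    double contextValue() const override { return intensity_; }
private:
    double intensity_;  // normalized ambient light intensity in [0, 1]
};

// Dynamic binarization threshold for marker tracking: instead of
// estimating brightness frame by frame, derive the threshold directly
// from the sensed light intensity supplied by the provider.
int dynamicThreshold(const ServiceProvider& light) {
    double v = std::clamp(light.contextValue(), 0.0, 1.0);
    // Map darker scenes to lower thresholds, brighter scenes to higher
    // ones; the mapping yields a threshold in [64, 192].
    return static_cast<int>(64 + v * 128);
}
```

A tracking wrapper could call `dynamicThreshold` once per frame before binarizing the camera image, so the marker detector adapts to lighting without any extra per-frame image analysis.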