An Ontological Approach to Integrating Task Representations in Sensor Networks

Konrad Borowiecki and Alun Preece
Cardiff University, School of Computer Science, 5 The Parade, Cardiff, UK
{k.borowiecki,a.d.preece}@cs.cf.ac.uk

ABSTRACT
The sensor tasking problem in sensor networks involves the representation of users' tasks (user-level tasking) in a form from which a sensor network can perform the required operations (sensor-level tasking), leading to satisfaction of the tasks. We analysed four approaches to task representation (TR) in sensor networks: the Open Geospatial Consortium's Sensor Web Enablement, Goal Lattices, Semantic Streams, and Sensor Assignment to Missions. Each approach considers a distinct aspect of the sensor tasking problem. We used the Web Ontology Language, OWL, to define the features of each TR, which then enabled us to identify mappings between them. These mappings allow us to combine the TRs into one hybrid task representation (HTR) that addresses both user-level and sensor-level tasking, thus providing a more complete, integrated approach to the sensor tasking problem. In this paper (presented as both poster and demonstration) we introduce our HTR integrated into a system working on a sensor network. It shows how a rich semantic representation of a task, such as the HTR, can be used to automatically control system operations on the network, making it more adaptable to changes in the state of the network's resources (e.g. a sensor malfunction) or tasks (e.g. a change of a task's requirements).

1. INTRODUCTION
Sensor networks are becoming increasingly important in many domains, for example, environmental monitoring, emergency response, and military operations. There is considerable and growing interest in developing approaches that allow sensors to be treated as information-providing resources, integrated within Web and Semantic Web information architectures (for example, [1, 4, 5]). A key issue in this is making these networks more flexible, so they can more easily be deployed to meet the needs of new tasks. We identify two aspects of the sensor tasking problem: user-level tasking involves the representation of a user's tasks in a form that determines the operations a sensor network needs to perform; sensor-level tasking involves the specification of those required operations, leading to satisfaction of the tasks. For example, user-level tasking is concerned with tasks such as the detection of vehicles or the identification of people, whereas sensor-level tasking is concerned with operations such as collecting video or audio data of a particular quality. Another way of looking at this is to say that user-level tasking focuses on issues of "what" whereas sensor-level tasking focuses on issues of "how". In practice, a complete task specification needs to include both aspects, because users will be concerned with both what they want to know and how they get the supporting sensor data [4, 2]. Therefore, we see a need for a task representation (TR) that captures user-level tasking requests and links these to sensor-level tasking requests. Such a TR would provide all the necessary input to a system that would operationalise a user's request in terms of the necessary "what" and "how" requirements.

2. TASK REPRESENTATIONS
Following a literature review, we identified four existing representations addressing aspects of user-level and/or sensor-level tasking for which there were reasonably detailed descriptions of the TR formalism. Open Geospatial Consortium Sensor Web Enablement (SWE) enables tasking at the sensor level, allowing for the discovery, access and setting of sensor parameters through Web service standards [1]. Goal Lattices (GL) assist in user-level tasking, during task planning, by defining a lattice of goals and weights, where sub-goals contribute to the satisfaction of super-goals in terms of their relative weight, allowing for goal prioritisation [3]. Semantic Streams (SS) are useful for both user- and sensor-level tasking, as they enable the creation of streams representing the flow of sensor-generated information and the processing required to satisfy a task's information requirements [4]. Sensor Assignment to Missions (SAM) connects user- and sensor-level tasking, as it enables matching between tasks and sensor types by mapping a task's information requirements to a set of sensor capabilities satisfying them [2].

For each TR we created an ontology in the Web Ontology Language (OWL¹). The hybrid task representation (HTR) model was created through alignment of these ontologies, integrating the aforementioned capabilities of the four TRs. Fig. 1 shows the mappings which create the HTR ontology. Here we use the following notation: classes are depicted as ovals; subclass relations are shown as unlabelled solid arcs; the OWL sameAs property is represented by solid bidirectional arcs; all other properties are shown as labelled dashed arcs; and we use namespace notation to indicate which TR ontology each concept is from. The ontologies are available online². The role of a task representation in a system is to capture the "what" and "how" requirements of a user's task in a machine-processable form which lets the system determine how to satisfy the task's needs. Thanks to the combination of user- and sensor-level TRs, and the creation of mappings between their concepts using Semantic Web technologies, we have obtained an HTR able to express a user's task in human-readable terms (e.g. detect a vehicle) that is visible to the sensor-level task representations dealing with the setting, collection and processing of sensor information (e.g. camera, radar or acoustic sensor).

Figure 1: Mappings of ontologies creating the HTR.

3. THE APPLICATION
Fig. 2 shows the interface of a system using the HTR. The top right tree exposes the SAM TR functionality, where the user can select from currently defined National Imagery Interpretability Rating Scale³ tasks (e.g. detect vehicle) and specify Required Sensing Capabilities in terms of intelligence types (e.g. acoustic, imagery or radar). The bottom tree presents bundles that satisfy a task, i.e. platforms with sensors mounted on them, whose combined capabilities satisfy the requirements of the task. The map serves three purposes: to allow a user to specify the area of a task, to present the location of assets providing some of the capabilities required by a task, and to deliver processed sensor information where appropriate (the jeep icon, representing a detected vehicle). The top tab uses the GL TR's capabilities to express relations between tasks, thus prioritising the assignment of resources accordingly. The SS and SWE TRs' capabilities, since they have more to do with sensor-level than user-level tasking, are not exposed to the user. The role of the SS TR is the processing of incoming data from sensors, e.g. from the acoustic array mounted on the PackBot platform, P2, thus pinpointing a detected vehicle (the jeep icon) on the map, while the exploited functionality of the SWE TR is the discovery, configuration and use of a network's resources. Other elements of the interface are the mission tab, used to switch between missions, and the options tab, with the application's settings.

Figure 2: Interface of a system utilizing the HTR.

This interface is presented for a vehicle detection task. It tells the story of what happened during the execution of this task. The small visible window shows interrupted output that was coming from the camera mounted on the Reaper Unmanned Aerial Vehicle, P1, from the first listed bundle. At the moment the signal was interrupted, the system automatically switched to an alternative solution, reassigning the resources by taking the next available bundle, with platform P2 containing an acoustic array.

4. SUMMARY
The demo shows that through the use of a rich semantic representation of a user's task, which allows a user to state their needs while capturing all the information required to operate the underlying technologies, it is possible to automatically control a system working on a sensor network. As a result we obtain a system that is responsive, adaptive and useful in situations or sensor networks where changes of sensor and/or task state are expected.

Demo Requirements: The demo runs on a self-contained laptop but requires an Internet connection.

Acknowledgement: This research was sponsored by the U.S. Army Research Laboratory and the U.K. Ministry of Defence and was accomplished under Agreement Number W911NF-06-3-0001. The views and conclusions contained in this document are those of the author(s) and should not be interpreted as representing the official policies, either expressed or implied, of the U.S. Army Research Laboratory, the U.S. Government, the U.K. Ministry of Defence or the U.K. Government. The U.S. and U.K. Governments are authorized to reproduce and distribute reprints for Government purposes notwithstanding any copyright notation hereon.

5. REFERENCES
[1] M. Botts, G. Percivall, C. Reed, and J. Davidson. OGC Sensor Web Enablement: Overview and High Level Architecture. In GeoSensor Networks, pages 175–190, 2008.
[2] M. Gomez, A. Preece, M. Johnson, G. de Mel, W. Vasconcelos, C. Gibson, A. Bar-Noy, K. Borowiecki, T. L. Porta, G. Pearson, T. Pham, D. Pizzocaro, and H. Rowaihy. An Ontology-Centric Approach to Sensor-Mission Assignment. In EKAW 2008, pages 347–363, 2008.
[3] K. Hintz and M. Henning. Instantiation of dynamic goals based on situation information in sensor management systems. In SPIE, volume 6235, 2006.
[4] J. Liu and F. Zhao. Composing semantic services in open sensor-rich environments. Network, IEEE, 22(4):44–49, 2008.
[5] J. Wright, C. Gibson, F. Bergamaschi, K. Marcus, T. Pham, R. Pressley, and G. Verma. ITA Sensor Fabric. In SPIE, volume 7333, 2009.

¹ http://www.w3.org/TR/owl-ref/
² http://users.cs.cf.ac.uk/K.Borowiecki/Ontologies.zip
³ http://www.fas.org/irp/imint/niirs.htm
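The bundle selection and automatic reassignment behaviour described in Section 3 can be sketched as follows. This is a minimal illustration, not the actual system's API: the bundle contents, the function name `next_bundle`, and the matching rule (a bundle serves a "detect vehicle" task if it provides at least one acceptable intelligence type) are all assumptions made for the example.

```python
# Illustrative sketch (assumed names, not the demo system's real interface).
# Each bundle: (platform, set of intelligence types its mounted sensors provide).
BUNDLES = [
    ("P1", {"imagery"}),    # e.g. camera on the Reaper UAV
    ("P2", {"acoustic"}),   # e.g. acoustic array on the PackBot
]

def next_bundle(acceptable_types, bundles, failed=frozenset()):
    """Return the first bundle able to serve the task, skipping platforms
    whose sensors have malfunctioned; here a bundle qualifies if it provides
    at least one acceptable intelligence type (an assumed matching rule)."""
    for platform, capabilities in bundles:
        if platform not in failed and capabilities & acceptable_types:
            return platform
    return None  # no bundle can currently satisfy the task

task_types = {"imagery", "acoustic"}            # e.g. a 'detect vehicle' task
assert next_bundle(task_types, BUNDLES) == "P1"
# When P1's camera feed is interrupted, selection falls back to P2:
assert next_bundle(task_types, BUNDLES, failed={"P1"}) == "P2"
```

This mirrors the demo scenario: the camera bundle is preferred while available, and the interruption of its feed triggers reassignment to the next satisfying bundle rather than task failure.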