<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>On-Site Monitoring of Environmental Processes using Mobile Augmented Reality (HYDROSYS)</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Ernst Kruijff</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Erick Mendez</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Eduardo Veas</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Thomas Gruenewald</string-name>
          <email>gruenewald@slf.ch</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>TU Graz, Institute for Computer Graphics and Vision</institution>
          ,
          <addr-line>Inffeldgasse 16, 8010, Graz</addr-line>
          ,
          <country country="AT">Austria</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>WSL Institute for Snow and Avalanche Research SLF</institution>
          ,
          <addr-line>Flüelastrasse 11, 7260 Davos Dorf</addr-line>
          ,
          <country country="CH">Switzerland</country>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2010</year>
      </pub-date>
      <abstract>
<p>HYDROSYS is a project targeted at improving the monitoring, understanding, and management of environmental processes. The project introduces innovative concepts of on-site monitoring and event-driven campaigns using mobile interactive visualization systems.</p>
      </abstract>
      <kwd-group>
        <kwd>Mobile applications</kwd>
        <kwd>environmental monitoring</kwd>
        <kwd>augmented reality</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1 Introduction</title>
<p>HYDROSYS aims at analyzing environmental processes where they truly happen,
namely in the field. As such, it advances the current practice of informal observations
in the field and specific analyses done in the office. Environmental monitoring can be
defined as the process of continuously observing and regularly measuring the
environmental parameters of a specific area in order to identify environmental
changes and aid the decision-making process related to the site. In this context, on-site
environmental monitoring comprises all activities that are done in the field, such as
identifying the key variables of the problem, taking measurements, communicating
data and images, understanding the site as a whole, and validating a technical
solution in order to optimize the management of natural areas. At all times, it is
important to understand that on-site monitoring neither replaces nor can be replaced
by remote monitoring: the two are complementary. The system is expected to aid in
better understanding physical processes while they happen, by letting users mentally
combine the results of measurements with the actual site being viewed. Furthermore,
and closely related, the system allows for better communication between various
parties while discussing solutions to mitigate environmental degradation.</p>
<p>The HYDROSYS interactive system deploys large sensor networks set up
at sites in Switzerland and Finland that feed mobile units with live data while users
monitor processes in the field. Additionally, new data acquisition methods are
applied, such as a blimp (zeppelin) that captures optical and thermal data to refine
terrain models and provide detailed textures and thermal image maps. The novelty
of the system lies in the deployment of so-called augmented reality (AR) techniques on
mobile units (Feiner et al 1997). Whereas some solutions exist for outdoor AR in the
field of construction work (Schall et al 2007), the usage of AR in environmental
monitoring is novel. AR merges real-world views captured by video cameras with
synthetic data. It allows the user to walk around and observe the environment,
continuously getting a “correct view” of the sensor data. The task of augmented
reality is to render computer-generated artifacts correctly registered with the real
world in real time. Coupled with the interactive visualization, HYDROSYS enables
communication and exchange of data (e.g. images, measurements, graphs) from an
on-site observer to decision makers, who are generally at the workplace, and vice
versa. In fact, end users are expected to have a better view of the global situation
before, during and after an event thanks to the image transmission and overlay
techniques. Performing on-site and online analysis saves time and money, and users
have a better visualization of the field when they are in the office. In this publication,
we describe the technical framework, the interactive visualization methods, and a
short application example.</p>
    </sec>
    <sec id="sec-2">
      <title>2 Interactive visualization system for sensor data</title>
<p>Data is gathered from sensors, and possibly by a blimp, to generate a dense
information space for a small area. The blimp is equipped with an optical and a
thermal camera: it captures image footage that is used in conjunction with computer
vision methods to generate refined digital terrain models, and thermal imagery that
can be verified with spot measurements. The data is fed over cell phone networks or
WiFi into the Global Sensor Network system (GSN,
http://sourceforge.net/apps/trac/gsn/), where it is processed and stored.</p>
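      <p>As a rough illustration of this data path, the sketch below pulls the latest
readings of one virtual sensor from a GSN server over HTTP. The host, endpoint, and
field names are our own placeholder assumptions, not the actual interface of the
project deployment.</p>
      <preformat>
# Minimal sketch of pulling the latest readings of one virtual sensor from a
# GSN server over HTTP. The host, endpoint, and field names are hypothetical
# placeholders, not the project's actual deployment interface.
import requests

GSN_HOST = "http://gsn.example.org:22001"   # placeholder GSN instance
SENSOR = "davos_meteo_station"              # placeholder virtual sensor name

def latest_readings(sensor):
    """Fetch the most recent row of a virtual sensor as JSON."""
    resp = requests.get(f"{GSN_HOST}/rest/sensors/{sensor}/latest", timeout=10)
    resp.raise_for_status()
    return resp.json()

row = latest_readings(SENSOR)
print(row.get("air_temperature"), row.get("snow_surface_temperature"))
</preformat>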
<p>The simulation and warning system acts as a node of the GSN system, processing
the sensor data using both simple and complex models. When complex simulations
are used, users can trigger them in the office through a web interface and view the
results later in the field. The handheld platforms, namely the augmented reality
system briefly described in this article and cell phones, access the data through the
SmartClient. Where needed, WiFi bridges are used to relay the network at
remote locations. The SmartClient also deals with static data such as terrain models
and other legacy data. The handheld platforms form the front end to the sensor data,
visualizing the various sensor data types and simulation results that are pre-processed
off-site. The user is also able to access various cameras that are located on-site, to
get a better overview from multiple perspectives and aid in understanding the
problem at hand. Experiments have been performed with various user interface
techniques to understand the premises of such a “multi-camera framework”, on a
cognitive and technical level, showing positive results (Veas et al 2010).</p>
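      <p>The division of labor this architecture implies can be sketched as a small
data-access layer: live readings are always fetched fresh, while static data such as
terrain models is cached after the first download. The class, method, and endpoint
names below are our own illustration, not the project's actual API.</p>
      <preformat>
# Sketch of a SmartClient-style access layer: live sensor values are fetched
# fresh on every call, while static data (terrain models, legacy layers) is
# cached locally after the first download. All names are illustrative.
import os
import requests

class SmartClientSketch:
    def __init__(self, server_url, cache_dir="cache"):
        self.server_url = server_url
        self.cache_dir = cache_dir
        os.makedirs(cache_dir, exist_ok=True)

    def live(self, sensor):
        """Live readings are never cached: the field user needs fresh data."""
        resp = requests.get(f"{self.server_url}/live/{sensor}", timeout=10)
        resp.raise_for_status()
        return resp.json()

    def static_layer(self, name):
        """Terrain models and legacy data change rarely, so cache them."""
        path = os.path.join(self.cache_dir, name)
        if not os.path.exists(path):
            resp = requests.get(f"{self.server_url}/static/{name}", timeout=60)
            resp.raise_for_status()
            with open(path, "wb") as f:
                f.write(resp.content)
        with open(path, "rb") as f:
            return f.read()
</preformat>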
<p>The handheld augmented reality system leverages advanced visualization
techniques to render and overlay views of the data depending on the user’s location.
Hardware-wise, the platform consists of a robust outdoor computer (UMPC), a
camera, an orientation sensor, a GPS sensor, and a special casing. Meanwhile, a more
advanced and smaller handheld platform has been designed (compare Figure 2, left,
and Figure 3). The sensors mounted in the casing are important to define the “pose”,
hence the perspective, of the user, which is needed to correctly render the digital
content over the video image: the required accuracy is far beyond what other
platforms, such as the cell phones also used for simple augmented reality
applications, can deliver. An inferior pose is likely to result in interpretation problems
due to mismatches between video and graphics. Nonetheless, the sensor data is
correctly matched to the underlying model (the DTM), meaning that it is always
possible to view the 3D model with the sensor data visualizations. In general, users
make use of “real-world verification” to compare the data on the screen to the real
world. Still, a cell phone application would have difficulty showing the
computationally demanding information, since even the latest platforms have limited
processing and graphics performance. At this point, it is important to mention that
HYDROSYS actually includes cell phones as a second visualization platform: cell
phones are used to run a graphically less demanding application which shows a 3D
model of the environment and associated sensor data. This platform is predominantly
used in several application scenarios that take place in Finland, which fall outside the
scope of this article.</p>
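      <p>The core of this registration step can be sketched as a pinhole projection: the
pose obtained from the GPS and orientation sensors maps a geo-referenced sensor
station into pixel coordinates on the video image. The camera convention and all
numbers below are illustrative assumptions.</p>
      <preformat>
# Sketch of the registration step: project the geo-referenced position of a
# sensor station into the handheld camera image, given the pose estimated
# from the GPS (position) and orientation sensor (rotation). A pinhole
# camera model is assumed: x right, y down, z forward in camera space.
import numpy as np

def project_to_screen(p_world, cam_pos, R_cam, fx, fy, cx, cy):
    """Map a 3D point in local metric coordinates to pixel coordinates."""
    p_cam = R_cam @ (np.asarray(p_world, float) - np.asarray(cam_pos, float))
    if p_cam[2] > 0:                      # only points in front of the camera
        u = fx * p_cam[0] / p_cam[2] + cx
        v = fy * p_cam[1] / p_cam[2] + cy
        return u, v
    return None                           # behind the camera: nothing to draw

# A label anchor 20 m in front of the camera, 2 m to the right, 1 m up:
print(project_to_screen([2.0, -1.0, 20.0], [0.0, 0.0, 0.0],
                        np.eye(3), 800.0, 800.0, 320.0, 240.0))
</preformat>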
<p>Returning to the issue of localization, more refined tracking methods are
currently under development that solve some of the problems related to the
registration of digital content over video imagery, as can also be seen in Figure 2
(isophotes offset from the actual environment). The current state of development
affords 1.5 m accuracy and improved handling of orientation drift and offset, among
others by using ultra-wideband localization mounted on a vehicle setup.</p>
<p>At the handheld, the user can make use of several advanced interface modules to
perform a multitude of actions in the field. First, the user can select sensor data from
the various sensors available in the field. This data is, in general, rendered as
“registered labels” that show the sensor data in numeric format, connected to the
actual location the data is retrieved from (the sensor or sensor station). Users can also
switch to exploration mode: in this mode, the video imagery is replaced by a
fullscreen representation showing the list of sensor types connected to a specific
sensor station, with their latest readings. Users can also analyze sensor readings over
time using plots that are shown in the same mode: these plots can be generated at the
GSN server at predefined intervals and transferred to the handheld upon request. An
additional way of interpreting the sensor data, as well as the simulation data
described hereafter, is to switch to map mode, which provides a top view of the site.
Simulations are started by using a simple web front end that can directly access the
data stored at the sensor network and the physical models that are used for
simulation. After the simulation results have been produced, a semi-automated,
user-supervised step is required to transform the simulation results into a form
appropriate for the handheld software platform.</p>
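      <p>The server-side plot generation can be pictured as a small off-screen rendering
job that runs at a predefined interval and writes an image the handheld can download.
The sensor name, output path, and sample values below are illustrative.</p>
      <preformat>
# Sketch of the GSN-side plot generation: render one sensor's recent readings
# to a small PNG at a predefined interval; the handheld fetches it on request.
import os
import matplotlib
matplotlib.use("Agg")                 # off-screen rendering, no display needed
import matplotlib.pyplot as plt

def render_plot(timestamps, values, sensor_name, out_path):
    """Render a compact time-series plot suitable for a handheld screen."""
    fig, ax = plt.subplots(figsize=(4, 2.5), dpi=100)
    ax.plot(timestamps, values, lw=1)
    ax.set_title(sensor_name)
    ax.set_xlabel("time")
    ax.set_ylabel("reading")
    fig.tight_layout()
    fig.savefig(out_path)
    plt.close(fig)

os.makedirs("plots", exist_ok=True)
# e.g. 24 hourly snow-surface temperature readings (values are made up):
render_plot(range(24), [-6.0 + 0.2 * h for h in range(24)],
            "snow_surface_temperature", "plots/snow_surface_temperature.png")
</preformat>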
<p>Users can select simulation results that are produced by the various simulation
engines available in the system. The pre-processed data is shown as a registered
overlay over the terrain model (and thus the video image), similar to interpolated
maps. As a result, the user can compare various sensor readings and simulation
results at a glance. In addition, users can access further information that is useful in
the field. This information encompasses height information and network coverage.
The latter is particularly useful when installing sensors in rough terrain, where an
initially selected location may turn out not to be ideal once inspected on-site.</p>
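      <p>One simple way to produce such a registered overlay, sketched below under our
own assumptions, is to colormap the pre-processed simulation grid into a partially
transparent texture that is then draped over the terrain model.</p>
      <preformat>
# Sketch of turning a pre-processed 2D simulation grid into an overlay
# texture: colormap the scalar field and keep a constant alpha so the
# terrain (and thus the video image) stays visible underneath.
import numpy as np
from matplotlib import cm
from matplotlib.colors import Normalize

def overlay_rgba(sim_grid, vmin, vmax, alpha=0.5):
    """Return an H x W x 4 float texture in [0, 1] from a 2D scalar field."""
    norm = Normalize(vmin=vmin, vmax=vmax)
    rgba = cm.viridis(norm(np.asarray(sim_grid, dtype=float)))
    rgba[..., 3] = alpha            # semi-transparent, like an interpolated map
    return rgba

# e.g. a simulated snow-surface temperature field between -10 and 0 degrees C:
texture = overlay_rgba(np.random.uniform(-10.0, 0.0, (64, 64)), -10.0, 0.0)
</preformat>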
<p>The information being visualized is separated into 1D/2D and 3D data. The user
interface allows smooth transitions between the various information sources. During
the end-user workshops performed at the start and middle of the project, a clear
tendency could be seen towards the usage of 3D visualizations; still, many users want
to access the 1D and 2D sources too, and some users are skeptical about 3D
information visualization. We hope that by mixing the various information sources in
a direct and easy way, more users will actively make use of 3D visualizations.</p>
<p>Potential issues in interpreting the visuals in outdoor situations are the size of
the screen being watched and perceptual interferences such as bright sunlight and
reflections. Whereas the outdoor computer being used has been optimized for such
conditions, viewing conditions are still limited. The consortium is producing methods
that perceptually optimize visuals to compensate for this issue, for example the usage
of isophotes as shown in Figure 2. Additionally, several other modules are currently
being finalized, which allow for other tasks to be performed. The collaboration
module affords making annotations for noting down problems and ideas, and users
may make use of voice calls to communicate over longer distances once dispersed
over the site. The camera module focuses on providing access to different cameras
located at the site, observing the site from different locations to get a better overview.
These modules actively make use of a WiFi bridge system that has been developed as
part of the project. The WiFi bridge allows high-speed network access at remote sites
and can be set up quickly: once a high-bandwidth network connection is available
within the vicinity of the site being monitored, the network can be relayed
successfully over several kilometers.</p>
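      <p>As an illustration of the isophote idea, the sketch below extracts lines of
constant brightness from a single frame; the actual perceptual optimization used in
HYDROSYS may differ, and the file path is a placeholder.</p>
      <preformat>
# Sketch of extracting isophotes, i.e. contour lines of constant brightness,
# from one video frame; sparse lines of this kind remain readable under
# bright sunlight where dense color overlays wash out.
import numpy as np
from skimage import io, color, measure

frame = io.imread("frame.png")          # current video frame (placeholder path)
luma = color.rgb2gray(frame)            # isophotes are taken on luminance

# One set of polylines per brightness level; each contour is (row, col) pairs.
isophotes = {level: measure.find_contours(luma, level)
             for level in np.linspace(0.2, 0.8, 4)}

for level, contours in isophotes.items():
    print(f"level {level:.1f}: {len(contours)} isophote segment(s)")
</preformat>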
<p>Furthermore, a simulation module is under development that focuses on
segmenting simulation results for better analysis; it connects to a sensor placement
module that is used for taking manual measurements with a sensor connected directly
to the handheld unit. The sensed data can be read and analyzed directly on the unit.</p>
    </sec>
    <sec id="sec-3">
      <title>3 Application example</title>
<p>In mountainous regions, wet-snow avalanches are important natural hazards,
occurring especially in late winter and springtime. They are characterized by a high
frequency and a high degree of potential damage to infrastructure. So far, the
processes which cause the formation and triggering of these avalanches have been
poorly investigated. The Dorfberg (Davos, Switzerland), a new field site for wet-snow
avalanche research, has been equipped with several sensors, including a complete
meteorological station. With its steep slopes and southward aspect, the Dorfberg is a
suitable place for this purpose; frequent small wet-snow avalanches and some big
events have been monitored in the area in recent years.</p>
<p>When on site, the HYDROSYS interactive visualization system can be used
to access data from the sensors (e.g. air temperature, snow surface temperature,
radiation) in almost real time. As a first step, the blimp can be used to monitor the
site, capturing detailed optical and thermal data of the surface. These data can be
queried and displayed with the handheld. The sensor data can be plotted on the
handheld and are used as input for a simulation system running Alpine3D (Lehning et
al. 2006), a physics-based model which has been developed to describe alpine surface
processes. As output, Alpine3D produces area-wide simulations of different
parameters (e.g. snow surface temperature, snow water content) which are important
for understanding the formation of wet-snow avalanches. If deviations between
model results and real observations are observed, a simpler one-dimensional model
can be run with real-time data. This might help to understand the processes which
caused the observed deviations. Altogether, the HYDROSYS system enables the
researcher to get an impression of the current conditions while at the field site. This
can help to understand the manual field observations obtained and the processes
which affect these conditions.</p>
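      <p>The deviation check described above can be pictured as a small decision step;
the threshold, names, and stubbed fallback below are our own illustration and are not
part of Alpine3D.</p>
      <preformat>
# Sketch of the model-checking step: compare the Alpine3D result at a station
# with the live observation; if they diverge too much, fall back to a simpler
# one-dimensional model fed with real-time data.
def check_and_refine(simulated, observed, run_1d_model, threshold=1.5):
    """Return the simulated value, or a 1D re-simulation if it deviates."""
    deviation = abs(simulated - observed)
    if deviation > threshold:
        print(f"deviation {deviation:.2f} exceeds {threshold}; running 1D model")
        return run_1d_model(observed)
    return simulated

# e.g. simulated snow-surface temperature -2.0 C vs. a station reading -4.1 C
# (the 1D model is stubbed out here):
result = check_and_refine(-2.0, -4.1, run_1d_model=lambda obs: obs)
</preformat>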
<p>The HYDROSYS project is in its final stage, with a first range of successful
monitoring actions performed. Whereas system development and integration are not
finished yet, at current there is no system known to us that is as advanced as the one
presented here and can operate in rough and remote locations such as the project test
sites. Still, some improvements are needed, such as the further refinement of the
localization and the integration of all sub-systems into a single prototype. After
finalization, the developed research prototype will likely form a strong basis for
further development and usage by a wider public. Most results will be made available
as open source; information on how to obtain the software will be made available at
the HYDROSYS website (www.hydrosysonline.eu).</p>
    </sec>
    <sec id="sec-4">
      <title>Acknowledgements</title>
<p>This work is partially funded through the EC-funded 7th Framework project
HYDROSYS (224416, DG-INFSO). We thank our partners for providing input to this
article on their relevant developments.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <title>References</title>
      <ref id="ref1">
        <mixed-citation>Feiner, S., et al. A Touring Machine: Prototyping 3D Mobile Augmented Reality Systems for Exploring the Urban Environment. In Proceedings of ISWC'97, 1997.</mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>Schall, G., et al. Handheld Geospatial Augmented Reality Using Urban 3D Models. In Proceedings of the Workshop on Mobile Spatial Interaction, ACM International Conference on Human Factors in Computing Systems (CHI'07), 2007.</mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>