=Paper= {{Paper |id=None |storemode=property |title=SmartReality: Integrating the Web into Augmented Reality |pdfUrl=https://ceur-ws.org/Vol-932/paper10.pdf |volume=Vol-932 |dblpUrl=https://dblp.org/rec/conf/i-semantics/NixonGRS12 }} ==SmartReality: Integrating the Web into Augmented Reality== https://ceur-ws.org/Vol-932/paper10.pdf
                  Proceedings of the I-SEMANTICS 2012 Posters & Demonstrations Track, pp. 48-54, 2012.
                  Copyright © 2012 for the individual papers by the papers' authors. Copying permitted only
                  for private and academic purposes. This volume is published and copyrighted by its editors.




                SmartReality: Integrating the Web into
                        Augmented Reality

       Lyndon Nixon1, Jens Grubert2, Gerhard Reitmayr2, and James Scicluna3
                  1
                      STI International, Neubaugasse 10/15, 1070 Vienna, Austria
                                      lyndon.nixon@sti2.org
       2
           Graz University of Technology, Inffeldgasse 16c, 2. Floor, 8010 Graz, Austria
                           {grubert, reitmayr}@icg.tugraz.at
                       3
                           Seekda GmbH, Grabenweg 68, 6020 Innsbruck, Austria
                                james.scicluna@seekda.com



Abstract. This poster and accompanying demo show how Semantic Web and
Linked Data technologies are incorporated into an Augmented Reality platform
       in the SmartReality project and form the basis for enhanced Augmented Reality
       mobile applications in which information and content in the user’s surroundings
       can be presented in a more meaningful and useful manner. We describe how
       things of interest are described semantically and linked into the Linked Open
       Data cloud. Relevant metadata about things is collected and processed in order
       to identify and retrieve related content and services on the Web. Finally, the us-
       er sees this information intuitively in their Smart Reality view of the reality
       around them.


       Keywords. Augmented Reality, mobile, Semantic Web, Linked Data, Web ser-
       vices, Web APIs, Web content, Things of Interest.


1      Introduction

SmartReality is a project which began in October 2010 as a nationally funded project
in Austria. The participating partners are STI International, Technical University of
Graz, Seekda GmbH and play.fm GmbH. Together they represent expertise and inno-
vation in Augmented Reality (AR), semantic technology, Web services and online
media. In the context of the project, they explore how people may be able to access
relevant information, content and media about things of interest in their vicinity via
AR. The AR is enhanced dynamically by the Web of data and the Web of services.
This enhancement is made possible by a SmartReality platform which mediates be-
tween the client device and the Web-based data and services, focused on using avail-
able metadata to select the most appropriate content and services for display to the
user. This poster and accompanying demonstrator will present the first results and
prototypes of the project. We will explore and demonstrate how semantic technology
and Linked Data can be integrated with AR in order to make a user’s reality “smart-
er”, based on a scenario with street club posters.






2      SmartReality Vision and Scenario

The value of the ideas of Semantic Web and Linked Data to the AR domain has been
reflected in recent position papers, not only from SmartReality [1] but also other
researchers [2]. A mobile application, ‘Mobile cultural heritage guide’ [3], also
explored use of Linked Data and faceted browsing in combination with device GPS
information and cultural artefact metadata. These apps make use of (often imprecise)
GPS positioning and are designed to work in specific domains, with limited ability to
make use of new data sources or adapt the presentation of resulting content around the
objects in the AR focus. The vision of SmartReality is to leverage a richer description
of points of interest (POIs) interlinked with the broad source of concept metadata
found in Linked Data. This forms the basis of enabling dynamic and relevant content
selection and presentation at the AR client. This results in a more useful, aware and
flexible Augmented Reality experience.




                            Fig. 1. SmartReality poster scenario

   For our initial application domain, SmartReality is focusing on music. Music is a
common and increasingly shared experience via the Internet. The user group most
likely to be early adopters of SmartReality solutions are young professionals who are
typically interested in listening to music, discovering artists and attending concerts.
Together with our partner Play.fm GmbH, whose web site and mobile apps offer
access to more than 18,000 DJ mixes and live recordings to 150,000+ users a month,
our goal is to use semantics and Linked Data to dynamically link references to music
around us in our current reality to virtual information, content and services from the
Internet which enhance our experience of music in our current reality. In the
SmartReality use case, for example, we consider a music-conscious person wandering
city streets and seeing the ubiquitous street posters advertising concerts and club nights.
Now, at best, if this person is interested in the content they may have mobile Internet
and can begin to search on aspects like the artist, event or venue. However, this
presupposes they can identify these aspects from the poster, and they still must gather
and link together the information they are interested in across various Web searches,
e.g. find out about the artist, find some audio from the artist to listen to, check where
the venue is at which the artist is playing, find out how to get there, find out how to get
tickets for the concert. With SmartReality, however, they can view an enriched version
of the poster with related content automatically overlaid over the poster's references to
artists, events, venues and other things. This is illustrated in Figure 1.


3      SmartReality Implementation

A SmartReality server handles the interaction between the client and the Web of data
and services. A simplified illustration of the steps taken in SmartReality is given
below (Fig. 2). First, the object in the mobile device's camera view is identified via an
image recognition service (we use Kooaba1), which returns an identifier for the object.
This identifier is linked with a description of a “Thing of Interest” (TOI) in a datastore
we term a “TOI Repository”. The TOI description, enriched by links to concepts
in the Web of Data, is processed in order to select the most appropriate content and
services from the Web. The resulting content is packaged and sent to the client for
display in the AR view.
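The steps above can be sketched in code. This is a minimal illustrative sketch of the server-side workflow, with stubbed services; all function and field names (`identify_object`, `lookup_toi`, the repository shape, the `kooaba:`/`toi:` identifiers) are our own assumptions for illustration, not the project's actual API.

```python
# Sketch of the SmartReality server workflow (Fig. 2), with stubs
# standing in for the recognition service and the TOI Repository.

def identify_object(image_bytes, recognition_service):
    """Step 1: an image recognition service (Kooaba in the paper)
    maps the camera image to an object identifier."""
    return recognition_service(image_bytes)

def lookup_toi(object_id, toi_repository):
    """Step 2: the object identifier is linked to a Thing of
    Interest (TOI) description in the TOI Repository."""
    return toi_repository[object_id]

def enrich_and_select(toi):
    """Steps 3-4: follow the TOI's Linked Data concept links and
    select matching content/services (simplified to a lookup)."""
    return [{"concept": c, "content": f"content-for-{c}"}
            for c in toi["concepts"]]

def handle_query(image_bytes, recognition_service, toi_repository):
    """Full round trip: camera image in, content bundle out."""
    object_id = identify_object(image_bytes, recognition_service)
    toi = lookup_toi(object_id, toi_repository)
    return {"toi": toi["id"], "enrichments": enrich_and_select(toi)}

# Toy run with stubbed services and an invented repository entry:
repo = {"kooaba:42": {"id": "toi:poster-1",
                      "concepts": ["playfm:artist/fabio-almeria"]}}
result = handle_query(b"...", lambda img: "kooaba:42", repo)
print(result["toi"])  # the TOI matched for the recognized poster
```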




                          Fig. 2. SmartReality workflow illustration

1
    http://www.kooaba.com




3.1    Annotation
To ease the process of generating the initial metadata about Things of Interest (TOI),
we implemented a Web-based annotation tool (Fig. 3). The tool currently only
supports selecting street posters from play.fm's image database. Here, the user selects
regions for triggering the appearance of content as well as regions where to actually
display the content over the poster. Instead of adding a concrete link to content from
the selected regions, users rather select a Linked Data URI representing a concept
from an existing conceptual scheme. In this case we use the Linked Data identifiers
for play.fm (artists, events, clubs) and support their addition by allowing free text
entry and Ajax-based concept selection (automatically filling in the full URI of the
concept). The user is also free to use a full URI from any other Linked Data source.
When editing is finished, the annotation tool generates a Thing of Interest (TOI)
annotation for the poster and stores it in a TOI repository. Additionally, the image of the
event poster is uploaded to Kooaba to make it possible to identify the poster at
runtime. The TOI data model has been created in RDF/S specifically for SmartReality
and is published at http://smartreality.at/rdf/toi.




                            Fig. 3. SmartReality annotation tool.





3.2    Server
The server is developed as a set of components which interchange messages to realize
the SmartReality functionality expressed in the above workflow (Fig. 2). The platform
has been developed in Ruby on Rails and exposes a RESTful API to clients. The re-
positories and APIs used by the components to retrieve data into the workflow are
separated from the code of the core components so that different storage and remote
API solutions (including cloud) could be used as required. After parsing the TOI's
metadata (the TOI being identified via the Kooaba identifier included in its
description in the repository), the server provides an initial response to the client which
identifies the TOI's regions of interest for display in the AR view (see below, Fig. 4 left).
Two further functional steps are realized on the server to provide the content bundle
for the enrichment of the TOI’s regions of interest in the AR view with links to con-
tent:
    • Linked Data consumption. The (Linked Data) concepts used to annotate the
TOI’s regions and extracted from the TOI’s metadata are crawled and further, related
concepts extracted as defined in a set of Linked Data crawling rules. The rule syntax
makes use of LDPath2. As a result, a local repository of relevant structured metadata
about the concepts of interest in the TOI has been created. We use mainly play.fm
Linked Data3 in the current demo, while the approach is vocabulary-independent, i.e.
any Linked Data could be used in the annotation and supported in this step. For the
Linked Data step, we make use of the Linked Media Framework (LMF4), which
provides, on top of an RDF repository, the means to directly cache Linked Data
resources and synchronize their metadata with the description on the Web. This means
that for a new annotation the LMF will automatically use the locally cached resource
metadata if available, rather than repeatedly retrieving it from the Web, which can lead
to latency in the platform response.
    • Service selection and execution for content retrieval. Based on this local metada-
ta, a service description repository is queried. Services or APIs are described in terms
of their conceptual inputs and outputs so that, for the given class of a concept in the
annotation, appropriate services can be found to provide content for the enrichments
of the TOI in the AR view. In the current demo, we use an API provided by play.fm
to access audio streams of recordings by artists as well as an API provided by Seekda
to link an event to a booking interface for nearby hotels with rooms available on the
night of the event. Service execution requires querying the concept descriptions to
extract the necessary input values – e.g. for the hotel booking interface, the service
API needs the event's date and its location's longitude and latitude to be passed in the
input request. Likewise, the service response needs to be parsed to return to the
SmartReality platform the content links which can be used to enrich the TOI with respect to
the original concept. For this, we use the concept of “lowering” and “lifting” in the
Linked Services approach [4] where the semantic concept is ‘lowered’ to datatype
values for the input request to the service, and the datatype values from the service

2
  http://code.google.com/p/ldpath/wiki/PathLanguage
3
  http://data.play.fm
4
  http://code.google.com/p/lmf/




output response are ‘lifted’ to new semantic concepts (e.g. from a Place to images of
maps of the place).
   The “lifted” responses are collected and sent as a content bundle to the client in
JSON.
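The lowering/lifting step around a service call can be sketched as follows. This is an illustrative toy under stated assumptions: the field names, coordinate values and booking URL are invented, and the real platform uses semantic service descriptions from the Linked Services approach [4] rather than hand-written functions per service.

```python
# Sketch of "lowering" and "lifting" around a hotel-booking service:
# a semantic event concept is lowered to the plain datatype values
# the service request needs, and the raw service response is lifted
# back into content links tied to the original concept.

def lower_event(event_concept):
    """Lower an event concept to the inputs a hotel-booking API
    would need: the event's date and the venue's coordinates."""
    return {"date": event_concept["date"],
            "lat": event_concept["venue"]["lat"],
            "lon": event_concept["venue"]["lon"]}

def lift_response(service_response, source_concept):
    """Lift the raw service output into content links referencing
    the original concept, ready for the JSON content bundle."""
    return [{"concept": source_concept["@id"], "link": url}
            for url in service_response["booking_urls"]]

# Toy data standing in for a play.fm event and a service response:
event = {"@id": "playfm:event/club-night",
         "date": "2012-09-05",
         "venue": {"lat": 47.26, "lon": 11.39}}
request = lower_event(event)            # input values for the API
fake_response = {"booking_urls": ["http://example.org/book/1"]}
bundle = lift_response(fake_response, event)
```

The point of the split is that the platform core only ever handles concepts and content links; only the lowering/lifting adapters know each service's concrete input and output datatypes.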


3.3    Client
We built a client application prototype for Android smartphones. It leverages an
Augmented Reality interface visualizing relevant information that is spatially
registered on the physical object via Natural Feature Tracking. A user points her
smartphone at the physical thing of interest to initialize a TOI query. After successful
initialization, segments containing relevant information are highlighted through an
Augmented Reality interface on the physical object (Fig. 4, left). The user can
now point towards individual segments and obtain detailed information (Fig. 4,
right). The rendering of content that is spatially registered to segments on the physical
object in 3D space is based on OpenSceneGraph5.
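The client-side interaction flow just described can be sketched in simplified form. The data shapes and handler names below are assumptions made for illustration (the actual client is an Android application); the sketch only mirrors the two user-facing steps: highlight regions after the TOI query, then show a region's content items on selection.

```python
# Toy sketch of the client interaction flow: a TOI query response
# lists regions to highlight; selecting a region reveals its
# associated content items from the content bundle.

def on_recognition(response):
    """After a successful TOI query, collect the region ids to
    highlight in the AR view (cf. Fig. 4, left)."""
    return [region["id"] for region in response["regions"]]

def on_region_selected(response, region_id):
    """When the user points at a region, return its content items
    for display (cf. Fig. 4, middle)."""
    for region in response["regions"]:
        if region["id"] == region_id:
            return region["content_items"]
    return []

# Invented response resembling the poster scenario: one region with
# an audio stream and a hotel-booking web link attached.
response = {"regions": [
    {"id": "r1", "content_items": ["audio:playfm-stream",
                                   "web:seekda-hotel-booking"]}]}
highlighted = on_recognition(response)
items = on_region_selected(response, "r1")
```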
   The SmartReality demo will use two real club event posters with the installed
client on Android smartphones to give visitors the experience of SmartReality for
themselves.




 Fig. 4. SmartReality view in the client. Left: the poster is recognized and the regions with
content are indicated. Middle: on pressing a content region, the available content items are
shown. Here the artist Fabio Almeria is associated with an audio stream from play.fm and a
web link to booking a hotel room for after his next concert from Seekda. Right: following the
web link the user is at a hotel room booking screen (date and location are not input, as they are
known from the event's description).




5
    http://www.openscenegraph.org





4      Future Work

The SmartReality project has focused on a proof of concept with club event posters
and enrichment via LOD from mainly the play.fm database. The infrastructure devel-
oped has been deliberately designed to separate distinct data and content sources from
the workflow which realizes a SmartReality experience; i.e., the use of other objects as
“Things of Interest”, the annotation with other LOD sources, or the linkage to content
from other providers for display in the AR view should be feasible as a configuration
issue, not requiring any changes to the SmartReality platform or client.



5      Acknowledgements

This work has been performed in the Austrian project SmartReality
(http://www.smartreality.at), which is funded by the FFG.



6      References
 1. Nixon, L., Grubert, J. and Reitmayr, G.: Smart Reality and AR Standards. In: 2nd Inter-
    national Augmented Reality Standards Meeting, Barcelona, Spain, 2011.
 2. Reynolds, V., Hausenblas, M., Polleres, A., Hauswirth, M., Hegde, V.: Exploiting Linked
    Open Data for Mobile Augmented Reality. In: W3C Workshop on Augmented Reality on
    the Web, Barcelona, Spain, 2010.
 3. Van Aart, C., Wielinga, B. and van Hage, W.: Mobile cultural heritage guide: location-
    aware semantic search. In: Proceedings of the 17th International Conference on
    Knowledge Engineering and Knowledge Management (EKAW ‘10), Lisbon, Portugal,
    2010.
 4. Pedrinaci, C., Liu, D., Maleshkova, M., Lambert, D., Kopecky, J., Domingue, J.: iServe: a
    linked services publishing platform. In: Ontology Repositories and Editors for the Seman-
    tic Web Workshop, Heraklion, Greece, 2010.



