      Exploration of Grateful Dead Concerts and
          Memorabilia on the Semantic Web

           Thomas Wilmering, Florian Thalmann, and Mark B. Sandler

     Centre for Digital Music (C4DM), School of Computer Science and Electronic
             Engineering, Queen Mary University of London, London, UK
               {t.wilmering, f.thalmann, mark.sandler}@qmul.ac.uk



        Abstract. With the increasing importance attributed to intangible cul-
        tural heritage, of which music performance is an important part, public
        archive collections contain a growing proportion of audio and video ma-
        terial. Commonly used conceptual models have only limited capabilities
        for representing such material. This demo illustrates our proposal for a
        unified ontological model of live music recordings and associated tangible
        artefacts, with a Web application for exploring live music events of the
        Grateful Dead.

        Keywords: live music · cultural heritage · multimedia · Grateful Dead.


1     Overview
With the ubiquitous and immediate availability of recorded music on stream-
ing services and internet platforms, live music events are becoming increasingly
central to the music world, due to the unique and physical musical experience
delivered to the audience [7]. For many decades, music fans have been doc-
umenting such events and accumulating artefacts and memorabilia, many of
which are now being transferred to public archives and often digitised [3]. The
use of Semantic Web technologies presents an opportunity for linking the nu-
merous archives online. However, commonly used conceptual models for this
purpose lack a consistent representation of sound and multimedia objects [4, 5].
In [8] we propose a data model for a unified representation of cultural artefacts
of time-based and non-time-based nature, which allows their alignment along a
hierarchy of timelines. This demo introduces a Web application1 based on this
model for the exploration of Grateful Dead concerts through digitised artefacts
and audio recordings. A particular focus is placed on semantic audio analysis
and the functionality it enables.

2     Data Model
The Web application is based on an application-specific ontological model that
conceptualises different types of live music events and their relationships,
based on the Event Ontology2, together with live-music-related artefacts, both
musical and non-musical. We relate artefacts to events at any scale, e.g. tours,
festivals, or concerts, and place them in a hierarchy of timelines associated
with the events, using the Timeline Ontology3. The ontology allows connecting
any event to Music Ontology4 concepts such as artists, groups, compositions,
etc. An overview of the data model for artefacts is given in Figure 1. In [8]
we propose mappings to concepts of the Music Ontology and FRBRoo5, facilitating
the mediation and interchange of music-related, bibliographic and museum
information, based on the Group 1 entities of the Functional Requirements for
Bibliographic Records (FRBR) entity-relationship model [6].
1
  https://grateful-dead-live.github.io (in development)
2
  http://motools.sourceforge.net/event/event.html
3
  http://motools.sourceforge.net/timeline/timeline.html
4
  http://musicontology.com/
5
  https://www.ifla.org/publications/node/11240

[Figure 1 omitted: class diagram relating Event, CapturingEvent, ArtifactItem,
ArtifactManifestation and ArtifactExpression, with expression subclasses
AudioSignal, VideoSignal, EventPhoto, PromotionArtifact and FanmadeArtifact,
anchored to Timeline Ontology timelines, intervals and instants; terms are
drawn from the Live Music Event Vocabulary, Event Ontology, Music Ontology
and Timeline Ontology.]
Fig. 1: Data model for associating artefacts with the live music event and its
timeline, with examples of artefact expressions.
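As a concrete illustration, the following Python sketch instantiates a small
part of this model with rdflib for a single concert. The Event, Timeline and
Music Ontology namespaces below are the published ones; the vocabulary
namespace (glmv), all resource URIs, and the exact domains and ranges assumed
for the Fig. 1 properties are illustrative and not prescribed by the paper.

from rdflib import Graph, Namespace, URIRef
from rdflib.namespace import RDF

# Published ontology namespaces used by the model.
EVENT = Namespace("http://purl.org/NET/c4dm/event.owl#")
TL = Namespace("http://purl.org/NET/c4dm/timeline.owl#")
MO = Namespace("http://purl.org/ontology/mo/")
# Hypothetical namespace for the proposed live music event vocabulary;
# the paper does not publish its namespace URI.
GLMV = Namespace("http://example.org/glmv#")

g = Graph()
for prefix, ns in [("event", EVENT), ("tl", TL), ("mo", MO), ("glmv", GLMV)]:
    g.bind(prefix, ns)

# Hypothetical URIs for one concert, a taping of it, and two artefacts.
concert = URIRef("http://example.org/event/gd1977-05-08")
taping = URIRef("http://example.org/event/gd1977-05-08/capture/1")
signal = URIRef("http://example.org/expression/gd1977-05-08/audio/1")
poster = URIRef("http://example.org/expression/gd1977-05-08/poster/1")
interval = URIRef("http://example.org/interval/gd1977-05-08")
timeline = URIRef("http://example.org/timeline/gd1977-05-08")

# The concert is an event whose temporal extent is an interval placed on
# a dedicated timeline (Event and Timeline Ontology terms).
g.add((concert, RDF.type, EVENT.Event))
g.add((concert, EVENT.time, interval))
g.add((interval, RDF.type, TL.Interval))
g.add((interval, TL.timeline, timeline))
g.add((timeline, RDF.type, TL.Timeline))

# Events can be connected to Music Ontology concepts, e.g. the performing group.
band = URIRef("http://example.org/artist/grateful_dead")
g.add((band, RDF.type, MO.MusicGroup))
g.add((concert, EVENT.agent, band))

# A capturing event is a sub-event of the concert; it documents the concert
# and produces a time-based artefact expression (an audio signal).
g.add((taping, RDF.type, GLMV.CapturingEvent))
g.add((concert, EVENT.sub_event, taping))
g.add((taping, GLMV.documents, concert))
g.add((taping, GLMV.produced, signal))
g.add((signal, RDF.type, GLMV.AudioSignal))

# A non-time-based artefact (a promotional poster) linked to the same event.
g.add((poster, RDF.type, GLMV.PromotionArtifact))
g.add((poster, GLMV.associated_with, concert))

print(g.serialize(format="turtle"))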

3      Web Application
The Web application focuses on live concerts of the band Grateful Dead. The
band's history continues to attract interest from both fans and scholars, with
respect to both their music and their cultural impact [2]. The application
combines information collected from existing Web resources with automatic
semantic analysis of audio content to infer higher-level musical information,
allowing users to explore the band's concert history through an audiovisual
experience.
[Figure 2 omitted: screenshot of the Web application GUI.]
Fig. 2: GUI of the Web application.

[Figure 3 omitted: block diagram in which the etree triple store, CALMA data
and other Web resources feed a Web parser and audio feature extractors; RDF
generation populates a triple store that the Web application queries through
its API, alongside a Web audio player.]
Fig. 3: System architecture of the Web application.

Knowledge Acquisition and Architecture The Grateful Dead collection of
the Live Music Archive (LMA)6 consists, at the time of writing, of more than
12,000 concert recordings made on over 2,000 dates spanning the years 1965 to
1995. The audio material is accompanied by basic unstructured metadata
covering information such as dates, venues, set lists and recording lineage.
The recordings are digital transfers of fan-made audience recordings,
encouraged by the band for non-commercial use, and of recordings taken from
the audio engineers' mixing desks. The application combines audio recordings
with data from several Semantic Web resources, including Live Music Archive
Linked Data7 and DBpedia8, for information about venues, cities, artists, etc.
Data taken from other Web resources related to Grateful Dead concerts includes
scans of artefacts such as tickets and posters, as well as setlist and lineup
details. Data from these sites has been aggregated using dedicated scripts
which parse the information and automatically generate RDF data (sketched
below). Audio feature extraction results are accessible via links to the
Computational Analysis of the Live Music Archive (CALMA) dataset [1]. These
features are, for instance, used to construct a reference timeline for
aligning the different recordings of a given concert, which the archive
provides as separate files with inconsistent segment boundaries and varying
completeness, sourced from tape recordings of varying speeds. Figure 3
illustrates the system architecture and a high-level representation of the
linking process. The Web application uses its own API accepting SPARQL
queries, as well as a Web Audio player which recombines and streams audio
from the LMA.
6
  https://archive.org/details/GratefulDead
7
  http://etree.linkedmusic.org/about/
8
  https://wiki.dbpedia.org/
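As an example of this generation step, the sketch below turns one parsed
concert record into RDF with rdflib. The record layout, the glmv vocabulary
terms and all URIs are illustrative assumptions; the application's actual
scripts and graph schema are not detailed here.

from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF, XSD

EVENT = Namespace("http://purl.org/NET/c4dm/event.owl#")
GLMV = Namespace("http://example.org/glmv#")  # hypothetical vocabulary namespace

def concert_to_rdf(record: dict) -> Graph:
    """Convert one parsed concert record (e.g. scraped setlist data) to RDF."""
    g = Graph()
    g.bind("event", EVENT)
    g.bind("glmv", GLMV)
    concert = URIRef(f"http://example.org/event/gd{record['date']}")
    g.add((concert, RDF.type, EVENT.Event))
    g.add((concert, GLMV.date, Literal(record["date"], datatype=XSD.date)))
    g.add((concert, GLMV.venue, Literal(record["venue"])))
    # Each setlist entry becomes a performance sub-event of the concert.
    for position, song in enumerate(record["setlist"], start=1):
        performance = URIRef(f"{concert}/performance/{position}")
        g.add((concert, EVENT.sub_event, performance))
        g.add((performance, GLMV.song_title, Literal(song)))
        g.add((performance, GLMV.setlist_position, Literal(position)))
    return g

# Example record, as a Web parser might emit it.
g = concert_to_rdf({
    "date": "1977-05-08",
    "venue": "Barton Hall, Cornell University",
    "setlist": ["New Minglewood Blues", "Loser", "Scarlet Begonias"],
})
print(g.serialize(format="turtle"))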

Functionality and Frontend Figure 2 shows a screenshot of the prototype
with a view of a particular concert, which juxtaposes the information,
artefacts, and recordings available for that date. Users can listen to the
recordings, read about the venue, the location, the setlist and the
musicians, and find images and historical weather data. Via links on each of
these concepts and on the timeline of the band's career, related concerts can
be reached and explored. For example, users can find any concert at which a
particular song was played directly through such links (a query sketch
follows below). Additionally, a search function allows users to find more
specific information, which is then compiled and visually presented. Enabled
by the audio features stored in the graph, users will also be able to look
for similar versions of songs, compare different aligned recordings of the
same concert, and create playlists and audible collages based on particular
search results.
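The sketch below shows the kind of SPARQL query that could drive such a
song-based lookup, issued with SPARQLWrapper. The endpoint URL is
hypothetical, and the glmv terms follow the illustrative schema sketched
earlier rather than the application's published vocabulary.

from SPARQLWrapper import SPARQLWrapper, JSON

# Hypothetical endpoint; the demo's actual API location is not given here.
endpoint = SPARQLWrapper("http://example.org/grateful-dead/sparql")
endpoint.setReturnFormat(JSON)
endpoint.setQuery("""
    PREFIX event: <http://purl.org/NET/c4dm/event.owl#>
    PREFIX glmv:  <http://example.org/glmv#>

    # All concerts at which a given song was played, ordered by date.
    SELECT ?concert ?date ?venue WHERE {
        ?concert a event:Event ;
                 glmv:date ?date ;
                 glmv:venue ?venue ;
                 event:sub_event ?performance .
        ?performance glmv:song_title "Scarlet Begonias" .
    }
    ORDER BY ?date
""")

results = endpoint.query().convert()
for row in results["results"]["bindings"]:
    print(row["date"]["value"], row["venue"]["value"])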

Further Development Future work includes the evaluation of the application
in a user study conducted in collaboration with the Internet Archive9. The
study will consist of automated analysis of user behaviour, as well as surveys
assessing aspects of the user experience. We are currently working on
improving the functionality based on audio features extracted from live music
recordings, and on exploring new ways of interacting with the audio archive,
continuing the work discussed in [9] on immersive audio applications that use
different recordings of a given concert. Further work on the data model will
lead to an extension of the Music Ontology covering live music events and
associated artefacts.
Acknowledgements This work is supported by EPSRC Grant EP/L019981/1,
Fusing Audio and Semantic Technologies for Intelligent Music Production and
Consumption. Mark B. Sandler acknowledges the support of the Royal Society
as a recipient of a Wolfson Research Merit Award.

References
1. Bechhofer, S., Page, K., Weigl, D., Fazekas, G., Wilmering, T.: Linked data publi-
   cation of live music archives and analyses. In: 16th International Semantic Web
   Conference (ISWC 2017) (2017)
2. Benson, M.: Why the Grateful Dead Matter. ForeEdge Press (2016)
3. Collins, J.: Doing-it-together: Public history-making and activist archiving in online
   popular music community archives. In: Preserving Popular Music Heritage, pp. 91–
   104. Routledge (2015)
4. ICOM/CIDOC: Definition of the CIDOC conceptual reference model version 6.2.2.
   Online: http://www.cidoc-crm.org/sites/default/files/2017-09-30%23CIDOC%20CRM_v6.2.2_esIP.pdf
   (2017)
5. International Federation of Library Associations and Institutions: Definition of
   FRBRoo: A conceptual model for bibliographic information in object-oriented
   formalism. Online: https://www.ifla.org/files/assets/cataloguing/FRBRoo/frbroo_v_2.4.pdf
   (2015)
6. International Federation of Library Associations and Institutions: Functional
   requirements for bibliographic records – final report. Online:
   https://www.ifla.org/files/assets/cataloguing/frbr/frbr_2008.pdf (2009)
7. Kjus, Y.: Live and Recorded: Music Experience in the Digital Millennium. Springer
   (2018)
8. Thalmann, F., Wilmering, T., Sandler, M.B.: Cultural heritage documentation and
   exploration of live music events with linked data. In: Workshop on Semantic
   Applications for Audio and Music (SAAM 2018), Monterey, California, USA (2018)
9. Wilmering, T., Thalmann, F., Sandler, M.B.: Grateful live: Mixing multiple record-
   ings of a Dead performance into an immersive experience. In: Proceedings of the
   Audio Engineering Society Convention 141 (2016)

9
    https://archive.org/