<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Collective user experience: Community-driven story co-authoring in live events</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Omar Niamut</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Jacco Taal</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Nicholas Race</string-name>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Steven Simpson</string-name>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Mu Mu</string-name>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>TNO</institution>
          ,
          <country country="NL">The Netherlands</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Bitnomica</institution>
          ,
          <country country="NL">The Netherlands</country>
        </aff>
        <aff id="aff2">
          <label>2</label>
          <institution>Lancaster University</institution>
          ,
          <addr-line>Lancaster</addr-line>
          ,
          <country country="UK">UK</country>
        </aff>
      </contrib-group>
      <conference>
        <conf-name>3rd International Workshop on Interactive Content Consumption at TVX'15</conf-name>
      </conference>
      <abstract>
        <p>Audio-visual narratives are becoming the most popular medium for information sharing and social storytelling around live events. This paper explores the collective experience of users of an online creative storytelling ecosystem. The system provides an ideal platform to study community-driven story co-authoring supported by social networks and networked media, as highlighted in an event-based user experiment.</p>
      </abstract>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>SYSTEM OVERVIEW</title>
      <p>The eco-system comprises the following four components.</p>
      <p>A mobile application to streamline the capturing, tagging, sharing and browsing of user content. Metadata is essential for social sharing, but manual entry can be cumbersome during a live event, when users prefer to focus on the creative process of capturing content. The application should also capture metadata such as the geographical location of the user device at the time of capture.</p>
      <p>An online editing tool for creative story authoring and sharing. Conventional video editors such as Adobe Premiere provide rich editing tools and effects for offline professional editing. For online storytelling, story editors benefit more from directly referencing and editing videos online, without having to download them first.</p>
      <p>A &#8220;lightweight&#8221; multimedia story-authoring engine suitable for heterogeneous user devices and networks. The biggest challenge of online storytelling with multimedia content is the processing and delivery of audio-visual content for interactive user operations. An online story-authoring system should aim to make online video editing as easy as editing a shared text document, and minimize the processing and network load on user devices [1].</p>
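      <p>As a minimal sketch of the &#8220;as easy as editing a shared text document&#8221; idea: a story can be represented as an edit decision list of references to remote clips, so an edit exchanges only metadata and no video ever needs to be downloaded or re-encoded on the user device. The class and field names below are illustrative, not taken from the system itself.</p>

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Clip:
    """A reference to a segment of a remote video; the media itself is never downloaded."""
    source_url: str
    start_s: float  # in-point within the source video, in seconds
    end_s: float    # out-point, in seconds

@dataclass
class Story:
    """A story is an ordered edit decision list (EDL) of clip references."""
    title: str
    clips: List[Clip] = field(default_factory=list)

    def add_clip(self, clip: Clip, position: Optional[int] = None) -> None:
        # Inserting or reordering touches metadata only, so sharing an edit
        # is as lightweight as sharing a change to a collaborative text document.
        if position is None:
            self.clips.append(clip)
        else:
            self.clips.insert(position, clip)

    def duration_s(self) -> float:
        return sum(c.end_s - c.start_s for c in self.clips)

story = Story("Nightrace highlights")
story.add_clip(Clip("https://example.org/clips/123.mp4", 10.0, 25.0))
story.add_clip(Clip("https://example.org/clips/456.mp4", 0.0, 12.5))
print(story.duration_s())  # 27.5
```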
      <p>Integration of social context. Widely adopted social networks greatly influence user preferences and activities. The rich context information and social atmosphere embedded in social media are crucial to improving the user experience within social applications such as online storytelling, particularly for live events. Figure 3 illustrates the changing attributes and intensity of social interactions during the life-cycle of a live event, and how the integration of social context benefits storytelling at different stages of the event.</p>
      <p>[Figure 3. Social trends over the event life-cycle: (1) start event profiling by registering the location name and user-provided keywords; (2) enrich video metadata by suggesting related tags based on the current location; (3) enhance search by suggesting related keywords based on location or user-provided keywords.]</p>
    </sec>
    <sec id="sec-2">
      <title>USER EXPERIMENTS</title>
      <p>An experiment was arranged at Schladming, Austria during the Nightrace 2014 event to evaluate the storytelling system. The experiment examined how the design of multimedia systems can help facilitate social interaction, and how the integration of social context improves user experiences within such a system. A number of test participants travelled from the UK and the Netherlands to the venue prior to the event to act as the main storytellers, while a few others joined the experiment from various locations in the UK. Caching nodes were also installed in the UK, Italy, and the Netherlands to study the effectiveness of chunk caching for story playback.</p>
      <p>The consensus among the test participants was that storytelling of a group experience of an event is &#8220;a very natural thing to do&#8221;. Most participants found that using the storytelling system for capturing and sharing their own creations throughout the course of a live event made them feel that they were &#8220;telling a live story to their friends&#8221;. They mostly added narration while recording by talking into the microphone. Sometimes a member of a group spontaneously acted like a reporter and let the other group members talk about what had just happened.</p>
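      <p>The paper does not describe the internals of the caching nodes; the following is a minimal sketch of chunk caching for story playback under the assumption of a simple least-recently-used (LRU) policy, with illustrative names throughout.</p>

```python
from collections import OrderedDict

class ChunkCache:
    """A caching node holding recently requested media chunks (LRU eviction)."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self._store = OrderedDict()  # chunk_id -> chunk bytes, oldest first
        self.hits = 0
        self.misses = 0

    def get(self, chunk_id, fetch_from_origin):
        if chunk_id in self._store:
            self._store.move_to_end(chunk_id)  # mark as most recently used
            self.hits += 1
            return self._store[chunk_id]
        # Cache miss: fetch from the origin server and store locally,
        # evicting the least recently used chunk if over capacity.
        self.misses += 1
        data = fetch_from_origin(chunk_id)
        self._store[chunk_id] = data
        if len(self._store) > self.capacity:
            self._store.popitem(last=False)
        return data

cache = ChunkCache(capacity=2)
origin = lambda cid: b"chunk-" + cid.encode()
cache.get("a", origin)  # miss
cache.get("b", origin)  # miss
cache.get("a", origin)  # hit, "a" becomes most recently used
cache.get("c", origin)  # miss, evicts "b"
print(cache.hits, cache.misses)  # 1 3
```

      <p>Stories assembled from clips by many users share chunks heavily, which is why placing such nodes near viewers (UK, Italy, the Netherlands in the experiment) can reduce playback latency and origin load.</p>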
      <p>Social-context integration proved effective during the experiments in improving video annotation and enhancing the search function. Using information from Twitter, an adaptive event profiler (AEP) provides a list of related keywords and a metric quantifying their relevance to user search requests, with lower values representing higher relevance. For example, a search for &#8220;Schladming&#8221; yields a list covering a range of items such as the sport, the location (Planai mountain), and popular competitors. Because of the AEP&#8217;s ability to recognize trending events, the integration of social context presents an ideal solution to the classical &#8220;cold start&#8221; problem in content recommendation: given a user location, the storytelling system may suggest stories related to socially trending keywords nearby.</p>
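      <p>The AEP&#8217;s internals are not given here; as a minimal sketch, one could assume relevance is derived from keyword co-occurrence with the query in recent tweets, with the score defined as the inverse of the co-occurrence count so that lower values mean higher relevance, matching the metric described above. The function and tokenization below are illustrative assumptions, not the system&#8217;s actual algorithm.</p>

```python
from collections import Counter

def related_keywords(query, tweets, top_n=5):
    """Rank keywords co-occurring with the query in recent tweets.

    Score = 1 / co-occurrence count, so lower values indicate
    higher relevance (as in the profiler's reported metric).
    """
    counts = Counter()
    q = query.lower()
    for tweet in tweets:
        # Naive tokenization: split on whitespace, strip hashtag/punctuation marks.
        words = [w.strip("#@.,!").lower() for w in tweet.split()]
        if q in words:
            counts.update(w for w in words if w != q and len(w) > 2)
    return [(w, round(1.0 / c, 3)) for w, c in counts.most_common(top_n)]

tweets = [
    "Amazing slalom tonight in #Schladming on the Planai!",
    "Schladming nightrace crowd is electric",
    "Planai slope looks perfect #schladming",
]
print(related_keywords("schladming", tweets))
```

      <p>With such a profile, cold-start recommendation reduces to intersecting the keywords trending near the user&#8217;s location with the tags of available stories.</p>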
    </sec>
    <sec id="sec-3">
      <title>CONCLUSIONS</title>
      <p>Collaboratively authoring a story by joining shared multimedia content with different perspectives is becoming a popular way to enable a collective user experience at live events. Our online multimedia storytelling eco-system enables the capturing, sharing, and authoring of user stories online with a low footprint. An event-based user experiment highlighted the contribution of social features to the user experience of loosely collaborating groups in content annotation, retrieval and media distribution. We witnessed the unique role of collaborative user creativity in the entire development cycle of engaging social stories, and we have shown that social context derived from social media, location-based services and emerging mobile technologies can greatly improve the creative story capturing and authoring process.</p>
    </sec>
    <sec id="sec-4">
      <title>ACKNOWLEDGMENTS</title>
      <p>The work presented is supported by the European Commission within the FP7 project STEER (grant no. 318343).</p>
    </sec>
    <sec id="sec-5">
      <title>REFERENCES</title>
      <p>1. Mu, M., Simpson, S., Race, N., Niamut, O., Koot, G.,
Kaptein, R., Taal, J. &amp; Mori, L.. “Let’s share a story”:
Socially-enhanced multimedia storytelling, In IEEE
Multimedia. 07/2015</p>
    </sec>
  </body>
  <back>
  </back>
</article>