<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>A Fundamental Element for Narrative Parsing</article-title>
      </title-group>
      <contrib-group>
<contrib contrib-type="author">
          <name>
            <surname>Doust</surname>
            <given-names>Richard</given-names>
          </name>
        </contrib>
        <aff id="aff1">
          <label>1</label>
          <institution>The Open University</institution>
          ,
          <country country="UK">UK</country>
        </aff>
      </contrib-group>
      <abstract>
<p>Computational approaches to complex cultural objects such as narrative have tended to focus on the complete modelling problem, proposing high-level concepts that are manipulated to generate or analyse stories. For example, Tale-Spin [13] uses the characters' goals and Mexica [14] uses a tension curve to represent love, emotion and danger in order to drive the generation process. In this paper, we put forward a different view based on the idea of a fundamental element that differentiates narrative from other forms of human cognition and that we call a narrative thread. Furthermore, this paper supports the position that splitting phenomena such as narrative into constituent parts and researching each one separately, thereby postponing research into their precise interaction, is a pertinent and fertile approach. Our narrative model focusses specifically on suspense and relies on the assumption that the empathetic effects of a given narrative can be isolated from its plot-level description. This assumption has enabled us to build and get feedback from a model that should increase our understanding of the different roles played by event order and empathy in story-telling. We implemented our model computationally for some simple textual narratives and obtained promising evaluation results (N=46) (see [5] and [6]).</p>
      </abstract>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>Introduction</title>
      <p>If we have a pile of bricks, we know that if we put them together in a few standard
ways, we can end up building a wall and maybe even a whole building. But what
are the bricks of narrative? For sentence understanding, the bricks may be words
or perhaps morphemes. For vision, there are edges and surfaces from which we
derive the existence of objects. But what is the right elemental object to capture
the `storiness' of narratives?</p>
      <p>In this paper, we explore the potential explanatory power of one answer
to that question, and show that our particular narrative brick can at least be
used to build a wall, and perhaps soon, a whole building. Of course, as with any
scientific process, the real world will give our model feedback on the accuracy
of the analysis it proposes, pointing to faults and weaknesses and enabling new
constraints and modifications to be imposed.</p>
      <p>
        Additionally, our focus for now will be on one ingredient which we believe
is constitutive of what makes a good story, namely: suspense. As discussed in
[
        <xref ref-type="bibr" rid="ref4">4</xref>
        ], suspense is a pervasive narrative phenomenon that is associated with greater
enjoyment and emotional engagement. We thus ask: what media- and
domain-independent model of suspense can we develop that could be useful in a wide
variety of situations for improving our techniques for creating new narratives?
      </p>
      <p>We frame this article by generalising the development steps leading to our
model with the aim of encouraging similar `fundamental' formal and
computational approaches aimed at understanding complex human cognition. The rest of
this paper details these steps.</p>
      <p>The first step we follow is to look for something similar to a syntax/semantics
distinction. Secondly, we collect the constraints on a potential fundamental
element that could generate the syntax observed. We then create an element that
potentially satisfies them and, with it, derive a variety of theoretical behaviours
and effects which can be tested and modified or refuted. Once the model achieves
a degree of explanatory power for some very specific human experiences, we can
test it further and also use it generatively to create new cultural artefacts.</p>
    </sec>
    <sec id="sec-2">
      <title>Step 1: Making a syntax/semantics distinction</title>
      <sec id="sec-2-1">
        <title>Event segmentation</title>
        <p>
          Before we can start our process, we must make the initial step of dividing up the
`stuff' we have into its basic parts. Stories and the real world have in common that
human cognition divides them up into events. [<xref ref-type="bibr" rid="ref18">18</xref>] suggests that this may reflect
the existence of a general network for understanding event structure. Recent work
by [<xref ref-type="bibr" rid="ref12">12</xref>] has also found very strong convergence on event boundary judgments
across film and textual media. Therefore, the first front-end process we consider
is event segmentation. [<xref ref-type="bibr" rid="ref21">21</xref>] and [<xref ref-type="bibr" rid="ref22">22</xref>] propose an event indexing model which lists
features such as space, time and causality, and we base our event segmentation on
this model.
        </p>
      </sec>
      <sec id="sec-2-2">
        <title>Narrative syntax</title>
        <p>
          In `Why anyone would read a story anyway', [
          <xref ref-type="bibr" rid="ref9">9</xref>
          ] categorised narrative interest
into two kinds: the cognitive interest that arises from the story structure and
the emotional interest that arises from the emotional context of the story.
Consequently, the first step we propose is to separate the narrative content concerned
with emotions and empathy for story characters from the simple sequences of
interlinked events called a plot.
        </p>
        <p>This distinction is, of course, analoguous to the syntax/semantics
distinction prevalent in language understanding. To emphasise the parallels, we can
say the following: syntax/plot is the collection of rules that govern how words/
events are assembled into meaningful sentences/narrative sequences, whilst
semantics/empathy is concerned with the meaning/importance of the words/events
themselves.</p>
        <p>Of course, in stories, creating empathy is essentially linked to the concept of
a character. In our view, perhaps 70% of the sentences of a novel are concerned
primarily with creating and maintaining the basic empathetic link between the
reader and the main characters. However, we postpone work on story semantics
by assuming the existence of an `empathy module' that produces a certain output.
Our focus will be on the narrative parsing of the story plot.</p>
      </sec>
    </sec>
    <sec id="sec-3">
      <title>Step 2: Creating a fundamental building block</title>
      <p>We next postulate the psychological existence and computational utility of
a fundamental element with which we can capture all the narrative-specific
processing that is needed to generate and understand the `story syntax', or plot.
To create such an element, we first identify the available and derivable constraints
such an element must satisfy.1</p>
      <sec id="sec-3-1">
        <title>Collecting the constraints on the building block</title>
        <p>
          Here we combine research in psychology on the requirements for knowledge
structures used for sentence understanding (see for example [
          <xref ref-type="bibr" rid="ref7">7</xref>
          ]), research on the
requirements for narrative understanding by [
          <xref ref-type="bibr" rid="ref1">1</xref>
          ] and some of the requirements
from the Glaive project ([
          <xref ref-type="bibr" rid="ref19">19</xref>
          ]) to provide the basis for a definition of the useful
size and complexity of a fundamental building block for narrative.
        </p>
      </sec>
      <sec id="sec-3-2">
        <title>Starting point and end-point</title>
        <p>– The element must have a clear starting point and end-point.</p>
        <p>This is derived from Brewer and Lichtenstein's psychological theory of
narrative understanding which suggests that three major discourse structures account
for the enjoyment of a large number of stories: surprise, curiosity and suspense.
Their approach is based on the existence of Initiating Events (IE) and the
corresponding Outcome Events (OE) that are triggered by them.</p>
        <p>To produce suspense, the IE and OE must be ordered chronologically. In
addition, `often additional discourse material is placed between the initiating
event and the outcome event, to encourage the build-up of suspense' ([1, p. 17]).</p>
      </sec>
      <sec id="sec-3-3">
        <title>Linearity of events</title>
        <p>– The element must have a clear linear path to completion.</p>
        <p>
          This is derived from the constraints on causal consequences grounded in
psychological research such as [
          <xref ref-type="bibr" rid="ref21">21</xref>
          ], the constructionist and prediction-substantiation
models of narrative comprehension [<xref ref-type="bibr" rid="ref7">7</xref>] and scripts [<xref ref-type="bibr" rid="ref10 ref16">10, 16</xref>]. This psychological
research places the following conditions on ease of recall from long-term memory:
– there is either a strongly supporting context: many different sources point to
the same object,
– or a strongly directive context: very few alternative inferences are possible.
1 A complete account of the derivation and motivation of these constraints is
given in [<xref ref-type="bibr" rid="ref5">5</xref>].
        </p>
        <p>We interpret these conditions together as indicating that our element must
consist of a linear series of events.</p>
      </sec>
      <sec id="sec-3-4">
        <title>Consistency</title>
        <p>– The element must be internally consistent: no event can contradict any other
event in the same narrative thread.</p>
        <p>
          These constraints are inspired by the Glaive project ([
          <xref ref-type="bibr" rid="ref19">19</xref>
          ]) which distinguishes
between causal chains and intentional paths and imposes the following conditions:
– No event in a causal chain can negate the preconditions of another event in
that chain.
– A character must consent to all steps in an intentional path and must intend
the final effect of the last step during all the preceding steps.
        </p>
      </sec>
      <sec id="sec-3-5">
        <title>Interruptibility</title>
        <p>– The element must be interruptible.</p>
        <p>Much as events in the same fundamental element should not be in conflict,
events in different elements must have some potential for conflict in order to
produce the uncertainty necessary for suspense. We therefore need to model
information about the following type of interaction between events: if Event
E occurs in a story, then Event F can no longer occur in this story. This is
analogous to syntactic rules that exclude certain category sequences in sentence
grammar.</p>
      </sec>
      <sec id="sec-3-6">
        <title>Creating the fundamental element</title>
        <p>We now use all these constraints to deduce the fundamental element that will
form the basis of our model of narrative. We call this element a narrative thread.</p>
        <p>Combining the starting-point/end-point constraint with the linearity
constraint above, we extend the simple IE-OE pair to obtain the following:</p>
        <p>IE → Event1 → Event2 → Event3 → … → OE  (1)</p>
        <p>This simple script-form has the necessary clear final result or outcome in the
storyworld that we need for suspense and, according to the constructionist model,
is also habitually and easily generated during narrative comprehension.</p>
        <p>Using the consistency constraints above, we can allow our narrative threads to be built by
combining both causal chains and intentional paths whilst checking for internal
consistency.</p>
        <p>Finally, to create the necessary interruptibility, we postulate
the existence of a set of disallowing pairs (E, F) such that an event E in one
narrative thread can disallow an event F in a different thread.</p>
        <p>
          Here is an example of a narrative thread taken from [
          <xref ref-type="bibr" rid="ref5">5</xref>
          ]:
1. A wants to kill B
2. A plants a bomb in B's car
3. A checks that B gets in the car
4. A triggers a remote control device
5. The countdown starts on the remote control
6. The countdown starts in the car too
7. The countdown goes on for some time
8. The countdown reaches the end
9. The bomb explodes
10. B gets killed
        </p>
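<p>For concreteness, such a narrative thread can be written down as a small data structure. The following Python sketch is purely illustrative (the class name and the disallowing pair shown are our own hypothetical choices, not details of the implementation discussed later):</p>

```python
from dataclasses import dataclass

@dataclass
class NarrativeThread:
    """A linear, internally consistent sequence of events, running from
    an Initiating Event (IE) to an Outcome Event (OE)."""
    name: str
    events: list  # ordered: events[0] is the IE, events[-1] is the OE

    @property
    def initiating_event(self):
        return self.events[0]

    @property
    def outcome_event(self):
        return self.events[-1]

# The car-bomb thread listed above, as plain event labels.
bomb_thread = NarrativeThread("bomb", [
    "A wants to kill B",
    "A plants a bomb in B's car",
    "A checks that B gets in the car",
    "A triggers a remote control device",
    "The countdown starts on the remote control",
    "The countdown starts in the car too",
    "The countdown goes on for some time",
    "The countdown reaches the end",
    "The bomb explodes",
    "B gets killed",
])

# Interruptibility: a set of disallowing pairs (E, F), meaning that once
# event E has been told, event F can no longer occur. The pair below is a
# hypothetical example, not taken from the storyworld used in the study.
disallowing_pairs = {("B's bodyguard finds the bomb", "The bomb explodes")}

print(bomb_thread.initiating_event)  # A wants to kill B
print(bomb_thread.outcome_event)     # B gets killed
```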
        <p>Narrative threads thus model a reader's expectations about what might
happen next in a given story. For a given storyworld, there will be multiple
narrative threads in competition with each other. Their interaction models what
goes on when a reader experiences stories in that storyworld.</p>
      </sec>
      <sec id="sec-3-7">
        <title>Checking for modularity between construction and function</title>
        <p>Narrative threads are simply lists of events that are likely to follow each other. In general,
threads are informed by a variety of inferential and associative mechanisms:
scripts, models of story characters involving beliefs, goals and desires, principles
of naive physics and even previous narrative experiences2.</p>
        <p>Our claim is that it is not necessary to know exactly how different information
sources were used to construct narrative threads for them to be used successfully
to model narrative. The available storyworld information and inferences can
always be translated into the linear structure that we call a narrative thread. We
thus postulate a separation between the inferential sources that build narrative
threads and the structure of the ongoing suspense processes they trigger and
maintain.</p>
        <p>
          Checking for generality We must distinguish certain typical narrative
instantiations from the fundamental narrative processes that underpin them. This is
the token/type distinction. Thus, for example, the fairytale models developed in
[<xref ref-type="bibr" rid="ref15">15</xref>] should be seen as the socially calibrated and sedimented result of
narrative play rather than as fundamental models of narrative creation in themselves.
Moreover, these models should be explainable in terms of narrative threads.
        </p>
        <p>2 Although models of intention such as plans and goal structures may be key elements
for many narrative threads, to use such structures as a fundamental element would
exclude narratives based more on causal interactions with the physical world. We
claim that it is possible to feel suspense about, say, an ice floe melting, where no clear
plans and goals are present.</p>
        <p>Similarly, the structures used to model narrative should not be entangled
with the specific types of narrative for which they create narrative effects. Just
as with syntax, we should be able to create two narratives that have the same
formal structure even though they exist in two completely different storyworlds.</p>
      </sec>
    </sec>
    <sec id="sec-4">
      <title>Step 3: Creating a scaffold for new concept generation</title>
      <p>We now create a theoretical scaffold for story-telling upon which
we can formalise a variety of narrative phenomena.</p>
      <sec id="sec-4-1">
        <title>Modelling world knowledge</title>
        <p>Firstly, to model a given storyworld, we derive the set of narrative threads that
model the causal and intentional links it contains or suggests. We also create a
set of disallowing event-pairs that define the interactions between the threads3.</p>
      </sec>
      <sec id="sec-4-2">
        <title>Modelling the input</title>
        <p>The next step is to model the input to the storyworld. In this case, a story is just
one particular sequence of events chosen from all possible events, that is, from all
the events in the narrative threads used to model the storyworld. Telling a story,
under this account, is then simply to evoke this sequence of events one by one.</p>
      </sec>
      <sec id="sec-4-3">
        <title>Modelling their interaction</title>
        <p>When an event E is told in the story, any narrative thread T that contains
event E is activated. Intuitively, this means that the reader predicts that the
events that follow E in thread T will be told in the story. As a story progresses,
narrative threads are activated and de-activated. Also, some pairs of upcoming
events in different active narrative threads may be in conflict with each other,
that is, they may belong to the set of disallowing event-pairs.</p>
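<p>The activation step just described can be sketched as follows (illustrative Python with our own function names and a toy storyworld; not the implementation evaluated later):</p>

```python
def upcoming_events(events, told):
    """Events of a thread that have not yet been told in the story."""
    return [e for e in events if e not in told]

def tell_event(event, threads, active, told, disallowing_pairs):
    """Telling an event activates every thread containing it, and
    deactivates any active thread with an upcoming event it disallows."""
    told.append(event)
    for name, events in threads.items():
        if event in events:
            active.add(name)  # reader now predicts the rest of this thread
    for name in list(active):
        upcoming = upcoming_events(threads[name], told)
        if any((event, f) in disallowing_pairs for f in upcoming):
            active.discard(name)

# Toy storyworld: a thief trying to cross a border, and the police.
threads = {
    "escape": ["thief runs", "thief reaches border", "thief crosses border"],
    "arrest": ["police chase", "police catch thief"],
}
# Catching the thief and crossing the border are mutually exclusive.
pairs = {("police catch thief", "thief crosses border"),
         ("thief crosses border", "police catch thief")}

active, told = set(), []
tell_event("thief runs", threads, active, told, pairs)
tell_event("police chase", threads, active, told, pairs)
print(sorted(active))  # ['arrest', 'escape']
tell_event("police catch thief", threads, active, told, pairs)
print(sorted(active))  # ['arrest']
```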
      </sec>
    </sec>
    <sec id="sec-5">
      <title>Step 4: Defining new concepts</title>
      <p>
        Using our basic building block and this theoretical story-telling scaffolding, we
can now derive some structurally refined definitions of a number of narrative
concepts. In our case, we make the claim that the following three fundamental
mechanisms occur in stories to create suspense and are usefully defined as distinct
narrative phenomena:
– completion-based suspense
– conflict-based suspense
– revelatory suspense
3 For the moment, the narrative threads used in our computational model are derived
by hand. However, our narrative thread is very similar to the narrative schemas
developed by [<xref ref-type="bibr" rid="ref2">2</xref>] and our disallowing relations are similar to the exclusion relations
derived in [<xref ref-type="bibr" rid="ref11">11</xref>], both of which can be harvested automatically. In future work, we
aim to apply our method to such automatically harvested schemas and extend our
model to different storyworlds and story variants.
      </p>
      <sec id="sec-5-1">
        <title>Completion-based suspense</title>
        <p>Translated into a context of goals and plans, completion-based suspense would
correspond to the achievement of a goal. An example would be the captain of
a football team walking slowly towards the tribune to receive and raise in the
air the prize cup her team has just won. As the outcome event becomes ever
more imminent the suspense increases; we can talk of completion imminence.
There may be absolutely no expectation that the achievement of this goal will
be interrupted and yet we experience a kind of suspense in such cases4.</p>
        <p>Completion imminence is a function of the number of events in the thread
that remain to be told before it is complete. Figure 1 shows a thread with a
completion imminence of 4 events at this stage in the story. We have tagged
these events to show how the cup-raising episode could appear.</p>
        <p>Conflict-based suspense This occurs when at least two active narrative threads
contain upcoming incompatible events of which only one can actually occur.
Furthermore, a big difference in story outcomes is expected depending on which
event actually does occur. An example would be a chase between a policeman
and a thief who is trying to cross a border. As the policeman gets closer and
closer to catching the thief, the imminence of the catching event increases; we
can talk of interruption imminence.</p>
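<p>Both imminence measures reduce to simple counts over untold events. The sketch below is illustrative (our own naming and counting convention: it returns the number of events of thread B still to be told before the interrupting event itself):</p>

```python
def completion_imminence(thread, told):
    """Number of events in a thread that remain to be told."""
    return len([e for e in thread if e not in told])

def interruption_imminence(thread_a, thread_b, told, disallowing_pairs):
    """Events of thread B still to be told before B reaches an event
    that disallows an upcoming event of thread A (None if it cannot)."""
    upcoming_a = [e for e in thread_a if e not in told]
    upcoming_b = [e for e in thread_b if e not in told]
    for steps, e in enumerate(upcoming_b):
        if any((e, f) in disallowing_pairs for f in upcoming_a):
            return steps
    return None

a = ["thief runs", "thief reaches border", "thief crosses border"]
b = ["police spot thief", "police chase", "police catch thief"]
pairs = {("police catch thief", "thief crosses border")}
told = ["thief runs", "police spot thief"]

print(completion_imminence(a, told))              # 2
print(interruption_imminence(a, b, told, pairs))  # 1
```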
        <p>
          The conflict-based suspense of a thread A is related to the smallest number
of events still to be told in some other thread B before an event can be told
which interrupts A. To illustrate conflict-based suspense between two threads,
in Figure 2 we show thread A, which has an interruption imminence of 3 events
due to thread B. We have tagged these events to show how the chase episode
described could appear.
4 Completion-based suspense may be linked to a phenomenon discovered by the Lithuanian
psychologist Bluma Zeigarnik, who observed the effect of interruption on memory
processing. [<xref ref-type="bibr" rid="ref20">20</xref>] proposes that when a task is started, it creates a quasi-need for its
completion, such that when the task's progress is momentarily halted, the subject
remains in a state of tension until it is completed.
        </p>
        <p>Revelatory suspense Curiosity in Brewer and Lichtenstein's model of narrative is about the past; we
wonder what happened before an event. If we see a man in sunglasses smiling as
he sees another man some distance away getting into his car, we might imagine
that it is because he had planned to meet his friend, or because he had planned
some sinister plot. Such situations evoke what we call revelatory suspense, which
is linked to the disambiguation of a story event (smiling) that belongs to more
than one narrative thread.</p>
        <p>To model this, we imagine that in such situations a common event is present
in two different narrative threads, as illustrated in Figure 3. When the common
event is told in the story, both threads become activated as candidates to explain
the event's presence in the story. Subsequent story events may disallow one of
the threads, leaving one thread as the correct `explanation' of the common event.
This epistemological gap-filling process is suspenseful in itself: we know that
every time we find out new information, it is likely to have a major effect on the
set of active narrative threads.</p>
        <p>Relative importance At any one moment in a given story, a variety of
suspenseful situations may be present. Relatively unimportant suspenseful situations
may coexist with life-or-death situations. The importance of these situations will
also often depend on the reader's emotional involvement with them, and this
could be low or high and positively or negatively valenced.</p>
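<p>The common-event mechanism behind revelatory suspense, described above, can be sketched as a simple lookup (illustrative Python; the threads shown are toy versions of the sunglasses example):</p>

```python
def candidate_explanations(event, threads):
    """Threads that contain the event and so could explain its telling."""
    return {name for name, events in threads.items() if event in events}

# The smile belongs to two competing threads; later events disambiguate.
threads = {
    "meeting a friend": ["man smiles", "man greets his friend"],
    "sinister plot":    ["man smiles", "man follows the car"],
}

print(sorted(candidate_explanations("man smiles", threads)))
# ['meeting a friend', 'sinister plot']
print(sorted(candidate_explanations("man follows the car", threads)))
# ['sinister plot']
```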
        <p>In our model, we presuppose the existence of an `empathy module' that ascribes
a relative emotional importance value to all the narrative threads. The value
ascription of a narrative thread is related to the reader's appraisal of the state of
the storyworld when the last event in the thread has been told. This use of a
single value to encompass a multitude of factors enables us to take into account
the modelling of emotions but at the same time keep our focus on the structure
of the information flow and its relation to suspense.</p>
        <p>Often, the importance of events is related to the fate of some story character.
In this case, we can base the importance values on i) the current level of
sympathy (or antipathy) towards a character involved in an event, and ii) the
perceived desirability (or undesirability) of the event in relation to that character.</p>
        <p>Foregroundedness If a given narrative thread T is not evoked for several story
steps during the telling of a story, we assume its effect on suspense will drop
because it is less present in the reader's mind. At the same time, other narrative
threads are of course active and competing for the reader's attention. As soon
as a new event belonging to thread T is told, it comes again to the foreground
and regains its full potential for suspense creation. Therefore, in addition to a
measure of the relative importance of the narrative threads, we also need a
measure for the degree of foregrounding of a thread.5</p>
      </sec>
      <sec id="sec-5-2">
        <title>Defining a complete model of behaviour</title>
        <p>With a suitable heuristic, we can now combine our measures for the three types
of suspense (completion-based, conflict-based and revelatory) with the additional
contextual measures of importance and foregroundedness to obtain values for
the suspense contributions of all active threads. We then combine these values
to create an overall suspensefulness value for each story-step. We now have the
apparatus needed to predict a step-by-step suspense profile for a given story.</p>
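<p>One possible combining heuristic is sketched below. The functional form and weights are our own illustrative assumptions, not the calibrated values used in the experiments: each active thread contributes suspense inversely proportional to its imminence, scaled by its importance and foregrounding.</p>

```python
def thread_suspense(imminence, importance, foregrounding):
    """Suspense contribution of one active thread (toy heuristic)."""
    if not imminence:  # no imminence value, or thread already complete
        return 0.0
    return importance * foregrounding / imminence

def story_step_suspense(active_threads):
    """active_threads: list of (imminence, importance, foregrounding)."""
    return sum(thread_suspense(i, w, f) for i, w, f in active_threads)

# A life-or-death thread close to completion, plus a minor,
# partly backgrounded thread.
print(round(story_step_suspense([(2, 1.0, 1.0), (5, 0.3, 0.5)]), 2))  # 0.53
```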
      </sec>
    </sec>
    <sec id="sec-6">
      <title>Step 5: Testing the conceptual framework</title>
      <p>
        To test this conceptual framework, we first implemented our narrative model
computationally. We then derived storyworld-specific information, in the form
of threads and disallowing pairs of events, for a story about an anti-Mafia judge
driving back home one evening. We then created an online experimental set-up
that used participants' self-reporting of perceived suspense levels whilst reading
step by step through a story. We carried out a first study to calibrate some of the
importance levels of our narrative threads. We then created a story variant, used
our computational model to create a predicted suspense profile for this variant,
and again used our online set-up to gather participants' self-reported suspense
levels for this variant (N=46 for 31 steps).
5 In our model, foregroundedness is roughly equivalent to recency of mention. Recency
has been extensively researched in the psychological field as an important factor
influencing memory (see for example [<xref ref-type="bibr" rid="ref8">8</xref>]).
      </p>
      <p>Analysis of the results using a variety of statistical tests shows that our
narrative model predicts step-by-step fluctuations in suspense levels for short
stories that correlate highly with average self-reported human suspense
judgements6.</p>
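<p>The profile comparison amounts to correlating two sequences over story steps. A minimal, hand-rolled Pearson coefficient is sketched below (the numbers are toy values, not the study data):</p>

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Toy profiles: the model's prediction vs averaged self-reports.
predicted = [0.1, 0.4, 0.6, 0.9, 0.7]
reported = [0.2, 0.3, 0.7, 0.8, 0.6]
print(round(pearson(predicted, reported), 3))  # 0.937
```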
      <p>Our computational implementation is fully independent of the storyworld
information. We thus claim to have created a model of suspense that could be
used for any media or genre.</p>
    </sec>
    <sec id="sec-7">
      <title>Step 6: Extending the model</title>
      <p>We see this work as a signpost towards further development of models of narrative
based on what we see as its fundamental ingredients. Further work should explain
how higher-level narrative concepts, such as certain types of plot and central
characters, naturally include the key ingredients of what is needed to build a
successful story. We now examine other directions for further research.
</p>
      <sec id="sec-7-1">
        <title>Scene-switching: the power of meanwhile</title>
        <p>Scene-switching can be defined as the alternation between different narrative
view-points that show different sequences of events belonging to the same story.
It is ubiquitous in suspenseful films. One possible explanation for its use is that
scene-switching increases the length of time that the suspense generated by a
particular narrative thread is present in the story. If we have two narrative
threads A and B, and we show first A then switch to B, then as long as A does
not get forgotten, its suspensefulness can continue to affect the reader or viewer,
even as we are watching events in narrative thread B7.</p>
      </sec>
      <sec id="sec-7-2">
        <title>The narrative cycle</title>
        <p>
          The inspiration for our model came from work on suspense, curiosity and surprise
introduced in [<xref ref-type="bibr" rid="ref1">1</xref>]. Future work should therefore extend the narrative thread model
to the concepts of curiosity and surprise in order to complete the typology of
story-building elements. Because situations where curiosity is evoked necessarily
involve unconfirmed threads, such situations often evoke surprise.
6 The Pearson Correlation Coefficient was 0.82 and Spearman's Rho Coefficient was
0.79. See [<xref ref-type="bibr" rid="ref5">5</xref>] and [<xref ref-type="bibr" rid="ref6">6</xref>] for the details of the experiment and the results.
7 The power of scene-switching may also be linked to what is called the Zeigarnik
effect ([<xref ref-type="bibr" rid="ref20">20</xref>]), which suggests that sequences that are interrupted have longer-lasting
cognitive effects and, for example, are remembered more precisely. Conversely,
sequences that achieve closure can be quickly forgotten.
        </p>
        <p>Our narrative model can also model surprise. If thread A has many told
events and thread B has no told events, then surprise will occur if an event from
thread B occurs in the story which suddenly disallows thread A.</p>
        <p>In our view, the key moments of many narratives in film or text combine
all three of these entertaining narrative effects in what we dub the `narrative
breathing cycle', illustrated in Figure 4. In this cycle,
– first a suspenseful situation is interrupted by a surprising event
– this event sets up a new phase of revelatory or conflict-based suspense
– another surprising event occurs which interrupts this suspense phase</p>
        <p>Of course, stories differ in the amounts of conflict-based and revelatory
suspense they evoke, and also in the length of time that suspense is maintained
before a new surprising event occurs. We can say that stories have different
suspense and surprise profiles.</p>
      </sec>
      <sec id="sec-7-3">
        <title>A functional theory of narrative</title>
        <p>
          Earlier work on a net-linguistic implementation of an Earley Parser (see [
          <xref ref-type="bibr" rid="ref17">17</xref>
          ])
influenced the development of our model. A clear analogy can be drawn between
linguistic theories on sentence disambiguation and the model of suspense we
propose. Narrative threads are analogous to lists of grammatical categories, and
the disambiguating process between narrative threads is analogous to syntactic
parsing. Revelatory suspense is analogous to words that seem to belong to two
or more linguistic categories. Surprise could be seen as analogous to the effect of
garden-path sentences.
        </p>
        <p>In this light, our research raises the possibility of constructing a complete
functional theory of narrative. Such a theory would postulate that all the story
steps that an author produces must modify, disallow or activate at least one
narrative thread and thus have some effect on the surprise, curiosity or suspense
of the story at that point. Thus, just as we identify the function that each word
plays in a sentence, we could identify the function that each story step plays in a
story. We could derive a functional content-independent map of a story, much
like the syntactic analysis of a sentence.</p>
      </sec>
    </sec>
    <sec id="sec-8">
      <title>Conclusion</title>
      <p>In our quest for a fundamental element underpinning narrative, we have been
able to separate out suspense as an essential ingredient of narrative and also
given a possible theoretical analysis of how it functions. We have also proposed a
typology of suspense. This not only enables us to predict the suspense profiles of
textual or filmic stories, but potentially also of other narrative-like artefacts such
as music, advertising and humour. We think that it should be possible to use our
model to generate suspenseful stories from a pre-defined storyworld ex nihilo.</p>
      <p>In making these contributions, we have followed the position that splitting
phenomena such as narrative into its constituent parts is an approach that
allows a high degree of portability into a variety of domains. We believe that the
increased creativity and inter-disciplinarity that this way of proceeding fosters is
a good measure of its potential.</p>
      <p>
        Furthermore, as readers of suspenseful novels know, suspense makes us focus.
If the balance between skill and challenge is just right for a reader constantly
attempting to understand and predict what will happen next in a suspenseful
story, they may enter into the creativity-boosting state of Flow (see [
        <xref ref-type="bibr" rid="ref3">3</xref>
          ]). Suspense can therefore be seen as having a direct effect on creativity.
      </p>
    </sec>
    <sec id="sec-9">
      <title>Acknowledgments</title>
      <p>Most of this research was undertaken as part of a Ph.D. program at the Open
University, UK under the supervision of Richard Power and Paul Piwek.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          1.
          <string-name>
            <surname>Brewer</surname>
            ,
            <given-names>W.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Lichtenstein</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          :
          <article-title>Stories are to entertain: A structural-affect theory of stories</article-title>
          .
          <source>Journal of Pragmatics</source>
          <volume>6</volume>
          (
          <issue>5-6</issue>
          ) (
          <year>1982</year>
          )
          <fpage>473</fpage>
          -
          <lpage>486</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          2.
          <string-name>
            <surname>Chambers</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Jurafsky</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          :
          <article-title>Unsupervised learning of narrative schemas and their participants</article-title>
          .
          <source>In: Proceedings of the Joint Conference of the 47th Annual Meeting of the ACL and the 4th International Joint Conference on Natural Language Processing of the AFNLP</source>
          . Volume
          <volume>2</volume>
          . Association for Computational Linguistics (
          <year>2009</year>
          )
          <fpage>602</fpage>
          -
          <lpage>610</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          3.
          <string-name>
            <surname>Csikszentmihalyi</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          :
          <article-title>Flow: The psychology of optimal experience</article-title>
          . Volume
          <volume>41</volume>
          . HarperPerennial, New York (
          <year>1991</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          4.
          <string-name>
            <surname>Delatorre</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Arfe</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Gervas</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Palomo-Duarte</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          :
          <article-title>A component-based architecture for suspense modelling</article-title>
          .
          <source>Proceedings of AISB, 3rd International Symposium on Computational Creativity (CC2016)</source>
          (
          <year>2016</year>
          )
          <fpage>32</fpage>
          -
          <lpage>39</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          5.
          <string-name>
            <surname>Doust</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          :
          <article-title>A domain-independent model of suspense in narrative</article-title>
          .
          <source>PhD thesis</source>
          , The Open University (
          <year>2015</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          6.
          <string-name>
            <surname>Doust</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Piwek</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          :
          <article-title>A model of suspense for narrative generation</article-title>
          .
          <source>In: International Natural Language Generation (INLG2017)</source>
          . (
          <year>2017</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          7.
          <string-name>
            <surname>Graesser</surname>
            ,
            <given-names>A.C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Singer</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Trabasso</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          :
          <article-title>Constructing inferences during narrative text comprehension</article-title>
          .
          <source>Psychological Review</source>
          <volume>101</volume>
          (
          <issue>3</issue>
          )
          (
          <year>1994</year>
          )
          <fpage>371</fpage>
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          8.
          <string-name>
            <surname>Jones</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Love</surname>
            ,
            <given-names>B.C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Maddox</surname>
            ,
            <given-names>W.T.</given-names>
          </string-name>
          :
          <article-title>Recency effects as a window to generalization: separating decisional and perceptual sequential effects in category learning</article-title>
          .
          <source>Journal of Experimental Psychology: Learning, Memory, and Cognition</source>
          <volume>32</volume>
          (
          <issue>2</issue>
          ) (
          <year>2006</year>
          )
          <fpage>316</fpage>
          -
          <lpage>332</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          9.
          <string-name>
            <surname>Kintsch</surname>
            ,
            <given-names>W.</given-names>
          </string-name>
          :
          <article-title>Learning from text, levels of comprehension, or: Why anyone would read a story anyway</article-title>
          .
          <source>Poetics</source>
          <volume>9</volume>
          (
          <issue>1</issue>
          ) (
          <year>1980</year>
          )
          <fpage>87</fpage>
          -
          <lpage>98</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          10.
          <string-name>
            <surname>Lebowitz</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          :
          <article-title>Story-telling as planning and learning</article-title>
          .
          <source>Poetics</source>
          <volume>14</volume>
          (
          <issue>6</issue>
          ) (
          <year>1985</year>
          )
          <fpage>483</fpage>
          -
          <lpage>502</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          11.
          <string-name>
            <surname>Li</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Lee-Urban</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Johnston</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Riedl</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          :
          <article-title>Story generation with crowdsourced plot graphs</article-title>
          .
          <source>In: AAAI</source>
          . (
          <year>2013</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          12.
          <string-name>
            <surname>Magliano</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kopp</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>McNerney</surname>
            ,
            <given-names>M.W.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Radvansky</surname>
            ,
            <given-names>G.A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zacks</surname>
            ,
            <given-names>J.M.</given-names>
          </string-name>
          :
          <article-title>Aging and perceived event structure as a function of modality</article-title>
          .
          <source>Aging, Neuropsychology, and Cognition</source>
          <volume>19</volume>
          (
          <issue>1-2</issue>
          ) (
          <year>2012</year>
          )
          <fpage>264</fpage>
          -
          <lpage>282</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          13.
          <string-name>
            <surname>Meehan</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          :
          <article-title>Tale-Spin, an interactive program that writes stories</article-title>
          .
          <source>In: Proceedings of the Fifth International Joint Conference on Artificial Intelligence (IJCAI)</source>
          , Cambridge, MA, USA. Volume
          <volume>1</volume>
          . (
          <year>1977</year>
          )
          <fpage>91</fpage>
          -
          <lpage>98</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          14.
          <string-name>
            <surname>Perez y Perez</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Sharples</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          :
          <article-title>Mexica: A computer model of a cognitive account of creative writing</article-title>
          .
          <source>Journal of Experimental and Theoretical Artificial Intelligence</source>
          <volume>13</volume>
          (
          <issue>2</issue>
          ) (
          <year>2001</year>
          )
          <fpage>119</fpage>
          -
          <lpage>139</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          15.
          <string-name>
            <surname>Propp</surname>
            ,
            <given-names>V.I.</given-names>
          </string-name>
          :
          <article-title>Morphology of the Folktale</article-title>
          . Volume
          <volume>9</volume>
          . University of Texas Press (
          <year>1968</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          16.
          <string-name>
            <surname>Schank</surname>
            ,
            <given-names>R.C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Abelson</surname>
            ,
            <given-names>R.P.</given-names>
          </string-name>
          :
          <article-title>Scripts, plans, goals and understanding: an inquiry into human knowledge structures</article-title>
          . Lawrence Erlbaum Associates Publishers, Hillsdale, NJ (
          <year>1977</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          17.
          <string-name>
            <surname>Schnelle</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Doust</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          :
          <article-title>A Net-Linguistic "Earley" Parser</article-title>
          .
          <source>In: Connectionist Approaches to Language Processing</source>
          . Routledge (
          <year>1992</year>
          )
          <fpage>170</fpage>
          -
          <lpage>205</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          18.
          <string-name>
            <surname>Speer</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zacks</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Reynolds</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          :
          <article-title>Perceiving narrated events</article-title>
          .
          <source>In: Proceedings of the 26th Annual Meeting of the Cognitive Science Society</source>
          , Chicago, IL. (
          <year>2004</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>
          19.
          <string-name>
            <surname>Ware</surname>
            ,
            <given-names>S.G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Young</surname>
            ,
            <given-names>R.M.</given-names>
          </string-name>
          :
          <article-title>Glaive: A state-space narrative planner supporting intentionality and conflict</article-title>
          .
          <source>In: Proceedings of the 10th International Conference on Artificial Intelligence and Interactive Digital Entertainment, AIIDE2014</source>
          , North Carolina State University, Raleigh, NC, USA (
          <year>2014</year>
          )
          <fpage>80</fpage>
          -
          <lpage>86</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref20">
        <mixed-citation>
          20.
          <string-name>
            <surname>Zeigarnik</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          :
          <article-title>On finished and unfinished tasks</article-title>
          .
          <source>A Source Book of Gestalt Psychology</source>
          <volume>1</volume>
          (
          <year>1938</year>
          )
          <fpage>300</fpage>
          -
          <lpage>314</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref21">
        <mixed-citation>
          21.
          <string-name>
            <surname>Zwaan</surname>
            ,
            <given-names>R.A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Langston</surname>
            ,
            <given-names>M.C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Graesser</surname>
            ,
            <given-names>A.C.</given-names>
          </string-name>
          :
          <article-title>The construction of situation models in narrative comprehension: An event-indexing model</article-title>
          .
          <source>Psychological Science</source>
          <volume>6</volume>
          (
          <issue>5</issue>
          ) (
          <year>1995</year>
          )
          <fpage>292</fpage>
          -
          <lpage>297</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref22">
        <mixed-citation>
          22.
          <string-name>
            <surname>Zwaan</surname>
            ,
            <given-names>R.A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Radvansky</surname>
            ,
            <given-names>G.A.</given-names>
          </string-name>
          :
          <article-title>Situation models in language comprehension and memory</article-title>
          .
          <source>Psychological Bulletin</source>
          <volume>123</volume>
          (
          <issue>2</issue>
          )
          (
          <year>1998</year>
          )
          <fpage>162</fpage>
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>