<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Modeling User Interface Adaptation for Customer-Experience Optimization</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Christian Märtin</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Christian Herdin</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Bärbel Bissinger</string-name>
          <email>baerbelbissinger@web.de</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Augsburg University of Applied Sciences Faculty of Computer Science Augsburg</institution>
          ,
          <country country="DE">Germany</country>
        </aff>
      </contrib-group>
      <fpage>68</fpage>
      <lpage>72</lpage>
      <abstract>
        <p>The customer journey in digital marketing defines several touch points, where interested users can directly interact with an e-business platform. In order to convert a user into a buyer, persona-based a priori adaptations of the user interface can be combined with dynamic adaptations at runtime, with the goal of optimizing the individual customer experience and guiding task accomplishment. This paper examines customer experience optimization with the SitAdapt 2.0 system for scenarios from a cosmetics industry e-business portal. Dynamic adaptations are triggered by situation rules based on the continuous analysis of the users' varying cognitive and emotional situations during a session. The model-based adaptation process exploits models and patterns for the rapid generation of user interface modifications.</p>
      </abstract>
      <kwd-group>
        <kwd>customer journey</kwd>
        <kwd>user experience</kwd>
        <kwd>customer experience</kwd>
        <kwd>situation analytics</kwd>
        <kwd>situation rules</kwd>
        <kwd>emotion recognition</kwd>
        <kwd>eye-tracking</kwd>
        <kwd>HCI-patterns</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>I. INTRODUCTION AND RELATED WORK</title>
      <p>
        Digitalization in marketing can be seen as a straightforward
approach to designing and implementing IT-based solutions for
the generic steps of the customer journey. A customer journey
is a customer’s interaction at several touch points with a
service or several services of one or more service providers in
order to achieve a specific goal [9]. Focused more on
purchasing a product, the customer journey can be defined as
an iterative process that includes touch-point-based interactions
with a provider or a business during a pre-purchase, a purchase,
and a post-purchase phase [
        <xref ref-type="bibr" rid="ref13">13</xref>
        ]. The journey may include
experiences from earlier purchases and affect future purchases.
In this view no fixed a priori purchase goal is necessary; rather, the
service provider tries to arouse the interest of potential
customers in the pre-purchase phase. At all touch points
between the provider and the customer, one has to distinguish
between the customer view and the provider view. It must be
the provider’s goal at every touch point to create a situation
that leads to an optimal user experience (UX) for the potential
customer.
      </p>
      <p>
        UX during the customer journey is often described as
customer experience. As an extract and synthesis of earlier
research efforts, customer experience can be seen as “a
multidimensional construct” that focuses on “a customer’s
cognitive, emotional, behavioral, sensorial, and social”
reactions to the offerings of a provider or a business “during
the customer’s entire purchase journey” [
        <xref ref-type="bibr" rid="ref13">13</xref>
        ].
      </p>
      <p>
        With SitAdapt [
        <xref ref-type="bibr" rid="ref14">14</xref>
        ], [
        <xref ref-type="bibr" rid="ref15">15</xref>
        ] we have developed a software
architecture for situation analytics and for integrating adaptive
behavior into web- or app-based interactive applications.
SitAdapt fulfills the requirements for automating essential parts
of the customer experience optimization process as well as for
various other domains from medical monitoring to driver
assistance systems. Possible adaptations are modeled within the
PaMGIS MBUID framework [5], [6]. They are triggered by
situation rules and generated by activating and exploiting
domain-dependent and independent HCI-patterns. In this paper
we present our preliminary lab-based results for using the
current implementation, SitAdapt 2.0, with a new rule editor and
an advanced situation interpreter in the e-commerce
domain.¹
      </p>
      <p>The paper includes the following main contributions:</p>
      <p>
        • Discussion of a new model-based approach [
        <xref ref-type="bibr" rid="ref17">17</xref>
        ] for automating customer experience optimization
• Defining the potential for software adaptation [
        <xref ref-type="bibr" rid="ref24">24</xref>
        ], [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ], [
        <xref ref-type="bibr" rid="ref19">19</xref>
        ], [
        <xref ref-type="bibr" rid="ref20">20</xref>
        ] based on situation analytics [3],
context-awareness [
        <xref ref-type="bibr" rid="ref22">22</xref>
        ], and situation-awareness [7]
• Demonstrating the suitability of emotion recognition
and bio-signal tracking for triggering user interface
modifications [8], [
        <xref ref-type="bibr" rid="ref19">19</xref>
        ], [
        <xref ref-type="bibr" rid="ref21">21</xref>
        ], [
        <xref ref-type="bibr" rid="ref23">23</xref>
        ].
      </p>
      <p>• Detailing the adaptation process and workflow for the
e-business domain.</p>
      <p>The remainder of the paper is structured as follows:
Chapter II introduces the SitAdapt 2.0 system with its new
rule editor. Chapter III first introduces possible adaptation
features and defines example scenarios for generic and
individual situations that occur in different phases of
the customer journey when visiting a cosmetics business portal.
Some of the possible SitAdapt 2.0 use cases are demonstrated.
After this, the chapter discusses the modeling and generation of
adaptations. Chapter IV concludes the paper.</p>
      <p>¹ Part of this work was carried out in cooperation with Dr.
Grandel GmbH, Augsburg, Germany. We gratefully
acknowledge the opportunity to run the SitAdapt 2.0 tools and
user tests on their enterprise e-business platform.</p>
    </sec>
    <sec id="sec-3">
      <title>II. SITADAPT 2.0</title>
      <p>The SitAdapt 2.0 runtime environment is integrated into the
PaMGIS (Pattern-based Modeling and Generation of
Interactive Systems) development framework. The framework
allows for modeling and generating responsive behavior in the
user interface and has now been enhanced towards dynamic
adaptation by situation interpretation at runtime.</p>
      <p>The architecture (Fig. 1) consists of the following parts:</p>
      <p>The data interfaces from the different devices (Tobii
eye-tracker², Empatica wristband³, Noldus Facereader⁴,
metadata from the application)</p>
      <p>
        The recording component synchronizes the different
input records with a timestamp, records the user’s eye- and
gaze-tracking signal, and tracks the facial expressions in
the video as a combination of the six basic
emotions (happy, sad, scared, disgusted, surprised, and
angry) based on Ekman’s model [4]. Other recorded
data about the user include, e.g., age range and gender [
        <xref ref-type="bibr" rid="ref15">15</xref>
        ].
The stress level and other biometric data are recorded in
real time by a wristband. In addition, mouse movements
and keyboard logs are protocolled [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ].
      </p>
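<p>The reduction of the six emotion scores to a single emotional state can be sketched as follows; the activation threshold and field names are illustrative assumptions, not the actual SitAdapt 2.0 implementation.</p>
<p>
```python
# Minimal sketch: reduce a six-value Ekman emotion vector (as produced by a
# facial-expression analyzer) to a single dominant emotional state.
# Threshold and naming are illustrative assumptions, not SitAdapt internals.

EKMAN_EMOTIONS = ["happy", "sad", "scared", "disgusted", "surprised", "angry"]

def dominant_emotion(scores, activation_level=0.5):
    """Return the strongest basic emotion, or 'neutral' if none
    surpasses the activation level."""
    best = max(EKMAN_EMOTIONS, key=lambda e: scores.get(e, 0.0))
    if scores.get(best, 0.0) >= activation_level:
        return best
    return "neutral"

frame = {"happy": 0.81, "sad": 0.05, "surprised": 0.10}
print(dominant_emotion(frame))  # happy
```
</p>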
      <p>The database writer stores the data from the recording
component and from the browser in the database in the
form of discrete raw situations and manages the
communication with the rule editor. Raw situations are
generated at each tick of a predefined time frame
varying from 1/60s to 1s.</p>
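<p>A raw situation can thus be pictured as a timestamped snapshot emitted at every tick. The following minimal sketch assumes a hypothetical record layout; the real SitAdapt 2.0 database schema is not shown here.</p>
<p>
```python
# Hypothetical sketch of a raw situation record, snapshotted once per tick
# (tick length configurable between 1/60 s and 1 s). Field names are
# illustrative, not the actual database schema.
from dataclasses import dataclass, field

@dataclass
class RawSituation:
    timestamp: float                              # synchronization timestamp
    gaze: tuple = (0, 0)                          # eye-tracker fixation point
    emotions: dict = field(default_factory=dict)  # six Ekman emotion scores
    pulse: int = 0                                # wristband biometric signal
    url: str = ""                                 # metadata from the application

def record_session(n_ticks, tick_s=1.0):
    """Emit one raw situation per tick; a real recorder would fill each
    field from the corresponding device interface."""
    assert tick_s >= 1/60 and 1.0 >= tick_s, "tick between 1/60 s and 1 s"
    return [RawSituation(timestamp=i * tick_s) for i in range(n_ticks)]

session = record_session(n_ticks=5, tick_s=0.25)
print(len(session), session[-1].timestamp)  # 5 1.0
```
</p>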
      <p>The rule editor allows the definition and modification
of situation rules, e.g., for specifying the different user
states (e.g., if a happy state is observed, it only
becomes relevant if the state lasts more than five
seconds and the grade of the emotion surpasses a certain
activation level). For experimenting with rule heuristics
and observing users we built a prototypical web
application for long-distance travel booking. In addition,
we used results from our cosmetics industry user study
[1] for finding plausible situation rules. Fig. 2 shows the
creation of a simple situation rule with two conditions
and one action. In this case only a dialog with the user
is created. However, situation rules can also activate
HCI-patterns in the PaMGIS pattern repository. These
patterns are exploited at runtime to generate user
interface adaptations from predefined UI-, task-, or
domain-model fragments.
² www.tobii.com
³ https://www.empatica.com/en-eu/research/e4/
⁴ www.noldus.com/human-behavior-research/products/facereader</p>
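<p>A situation rule of the kind shown in Fig. 2 (two conditions, one action) could be represented roughly as follows; the condition names, the five-second threshold handling, and the dialog text are hypothetical illustrations.</p>
<p>
```python
# Sketch of a situation rule with two conditions and one action, in the
# spirit of Fig. 2. Condition fields and the action are hypothetical.

def happy_observed(profile):
    # Condition 1: the happy emotion is observed
    return profile.get("emotional_state") == "happy"

def duration_exceeded(profile, min_s=5.0):
    # Condition 2: the state lasts more than five seconds
    return profile.get("state_duration_s", 0.0) > min_s

def show_winter_products_dialog(profile):
    # Action: in this case only a dialog with the user is created
    return "dialog: other winter products you may like"

RULE = {"conditions": [happy_observed, duration_exceeded],
        "actions": [show_winter_products_dialog]}

def fire(rule, profile):
    """Run the actions only when all conditions hold for the profile."""
    if all(cond(profile) for cond in rule["conditions"]):
        return [act(profile) for act in rule["actions"]]
    return []

print(fire(RULE, {"emotional_state": "happy", "state_duration_s": 8.0}))
```
</p>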
      <p>The situation analytics component analyzes the
sequences of raw situations with their parameters
varying over time and condenses them into a situation
profile holding the most significant information about
the currently prevailing situations. Typical situations can
be described in the form of situation patterns. The
situation analytics component matches the raw
sequences to such situation patterns. A set of typical
domain-dependent and independent situation patterns is
available in the PaMGIS pattern repository. Such
situation patterns can serve as templates for creating
situation rules with the rule editor, where an action part
with one or more actions is added. New situation
patterns can be discovered by running offline data
mining tools, e.g., RapidMiner⁵, on the raw situation
sequences recorded during multiple sessions.</p>
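<p>The condensation of a raw situation sequence into a situation profile can be illustrated by the following toy sketch, which aggregates only the dominant emotion and its duration; the field names are assumptions for illustration.</p>
<p>
```python
# Toy sketch: condense a sequence of raw situations into a situation
# profile holding the most significant information (here: the dominant
# emotion and its duration). Field names are illustrative assumptions.
from collections import Counter

def condense(raw_sequence, tick_s):
    """Aggregate one emotion reading per tick into a compact profile."""
    emotions = [r["emotion"] for r in raw_sequence]
    dominant, count = Counter(emotions).most_common(1)[0]
    return {"emotional_state": dominant,
            "state_duration_s": count * tick_s}

raw = [{"emotion": "happy"}] * 12 + [{"emotion": "surprised"}] * 3
print(condense(raw, tick_s=0.5))
# {'emotional_state': 'happy', 'state_duration_s': 6.0}
```
</p>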
      <p>The evaluation and adaptation component uses the
situation profile provided by the situation analytics
component to decide whether an adaptation of the user
interface is meaningful and necessary at a specific
moment. For this purpose the component evaluates the
given situation rules. Whether an adaptation is
meaningful depends on the predefined purpose of the
situation-aware target application. Goals to meet can
range from successful marketing activities in
e-business, e.g., having the user buy an item from the
e-shop or letting her or him browse through the latest
special offers, to improved customer experience levels,
or to meeting user desires defined by the hidden mental
states of the user. The adaptation component finally
generates the necessary modifications of the interactive
target application.</p>
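<p>Whether an adaptation is meaningful can be pictured as matching candidate adaptations against the application’s predefined purpose. The sketch below uses an assumed candidate structure and goal names, not the actual SitAdapt 2.0 data model.</p>
<p>
```python
# Sketch: decide whether an adaptation is meaningful by checking which
# candidate adaptations serve the predefined purpose of the target
# application. Structure and goal names are illustrative assumptions.

def choose_adaptation(candidates, goal):
    """Return the name of the highest-priority candidate serving the
    goal, or None when no adaptation is meaningful right now."""
    relevant = [c for c in candidates if goal in c["serves"]]
    if not relevant:
        return None
    return max(relevant, key=lambda c: c["priority"])["name"]

candidates = [
    {"name": "show special offers", "serves": {"purchase"}, "priority": 2},
    {"name": "open help chat", "serves": {"task_support"}, "priority": 1},
]
print(choose_adaptation(candidates, goal="purchase"))  # show special offers
```
</p>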
      <p>These architectural components are necessary for enabling
the PaMGIS framework to support automated adaptive user
interfaces. In the user interface construction process, the
SitAdapt 2.0 evaluation and adaptation component cooperates
with the models of the interactive application (abstract,
concrete and final user interface model, context of use models,
task and concept model) and can also access the HCI-patterns
(not to be confused with the situation patterns) residing in the
PaMGIS repositories to build the necessary modifications of
the user interface at runtime.</p>
      <p>III. AUTOMATING CUSTOMER EXPERIENCE OPTIMIZATION IN
E-BUSINESS</p>
      <p>As a promising candidate domain for exploring situation
analytics and situation-aware adaptation we have selected the
e-business and e-commerce fields. In our current project we
focus on a commercial cosmetics e-business portal.</p>
      <sec id="sec-3-1">
        <title>A. Dynamic Adaptation Features</title>
        <p>We have implemented dynamic adaptation features for
pre-session, first-session, and recurring-session adaptation. Typical
adaptation features are related to the following areas:
Visual appearance of the application
• Gender- or age-specific coloring
• Gender- or age-specific image selections
• Soothing image or color selection
• Age-specific element size
• Element ordering or widget selection dependent on age
or emotional state
• Screen contrast dependent on clock time, bio-physical
or emotional user state
• Font type and font size dependent on age, clock time,
bio-physical or emotional user state
New user interface or content elements
• Tutorial offering at first session or dependent on user
age
• Help functionality, e.g., chat window, help menu item,
tool tips, UI element tips dependent on user behavior
• Personalized fields and panes (user- and
behavior-specific advertisements)</p>
        <p>Content-based adaptation
• Personalized product offers or suggestions
• Voucher offering dependent on user behavior
• User feedback functionality dependent on user behavior</p>
      </sec>
      <sec id="sec-3-3">
        <title>B. Complex Situation Examples for a Cosmetics Portal</title>
        <p>In a comprehensive user study with nine female test persons
we tested the usability, user experience, and emotional behavior
for several scenarios when interacting with a real-world
cosmetics industry web portal [1]. These tests served as the
basis for finding domain-dependent situations and formulating
situation rules. Due to limited space we can only discuss
some of the most interesting findings in this section.</p>
        <p>In order to illustrate the potential of situation-aware
adaptation we present some real-world situation examples and
possible adaptive reactions. In the first example (Fig. 3) a test
person is searching for a specific winter skin cream. Upon
reading the detailed description of the product Winter Silk
Crème, the user’s emotional state significantly changes to
happy. A situation rule could now exploit this knowledge to
give additional information about other winter products. The
improved customer experience near the purchase touch point
can directly lead to a purchase of this and similar products.</p>
        <p>In the next example (Fig. 4 and 5), the system has gathered
a priori knowledge about the varying gaze behavior of test
persons who are known customers of the business and of those
who visit for the first time, by distinguishing between the
lab-created heat maps. The gaze behavior with respect to this
image can be used to categorize anonymous users, and the
customer experience during the pre-purchase phase can be
improved. When the system assumes a returning customer, the
focus of her further customer journey will be put on showing
aesthetic images, while in the other case more descriptive
information will be given during the rest of the customer
journey.</p>
        <p>Another application area for using situation analytics in the
e-business field is the evaluation and fine-tuning of pre-defined
customer personas, which are used for pre-runtime adaptations
and configurations of an application. Focusing on personas for
a priori adaptation of the cosmetics portal can e.g. affect the
visual appearance, the product content structure, the level of
the product description language, the appearance of special
advertisements, or the gaming and social media orientation of
the website. Are test persons behaving like their respective
personas or are there significant deviations from the expected
behavior? This can be evaluated by comparing the situation
profiles that come up during persona-adapted user tests with
the typical situation profiles specified during the persona
definition process. Vice versa, SitAdapt 2.0 can classify
unknown customers or first-time visitors into one of the given
persona categories by analyzing the situations appearing during
the session and the users’ behavior after
situation-rule-triggered adaptations.</p>
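<p>Such a classification of visitors into persona categories by their situation profiles can be sketched as a nearest-profile comparison; the feature names, persona profiles, and distance measure below are illustrative assumptions, not the actual classification method.</p>
<p>
```python
# Sketch: classify an anonymous visitor into a persona by comparing the
# session's situation profile with each persona's typical profile.
# Features and the distance measure are illustrative assumptions.

def distance(profile_a, profile_b):
    """Sum of absolute feature differences over all observed features."""
    keys = set(profile_a) | set(profile_b)
    return sum(abs(profile_a.get(k, 0.0) - profile_b.get(k, 0.0)) for k in keys)

def classify(session_profile, personas):
    """Return the persona whose typical profile is closest."""
    return min(personas, key=lambda name: distance(session_profile, personas[name]))

personas = {
    "goal_oriented": {"scan_speed": 0.9, "dwell_on_images": 0.1},
    "explorative":   {"scan_speed": 0.3, "dwell_on_images": 0.8},
}
session = {"scan_speed": 0.8, "dwell_on_images": 0.2}
print(classify(session, personas))  # goal_oriented
```
</p>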
        <p>All of these user observations and behavior evaluations as
well as the adaptations of the interactive software are currently
done in our situation analytics lab environment. The rapid
evolution of visual and biophysical user tracking and
monitoring technology will enable situation-aware individual
adaptations for the end user in the near future.</p>
      </sec>
      <sec id="sec-3-4">
        <title>C. Adaptation Modeling and Adaptation Process</title>
        <p>
          By applying our MBUID approach, the modeling,
generation and adaptation of the target website is done with the
help of the PaMGIS framework and the integrated SitAdapt 2.0
system (Fig. 1). The PaMGIS framework is based on the
Cameleon Reference Framework (CRF) [2], [
          <xref ref-type="bibr" rid="ref18">18</xref>
          ]. In the
construction process, the abstract user interface
model (AUI) is first generated from the information contained in
the domain model of the application, which includes a task model
and the concept model (i.e., business model), and defines
abstract user interface objects that are still independent of the
context of use. This AUI model can then be transformed into a
concrete user interface model (CUI) that already exploits the
context of use model that includes the user, device, UI-Toolkit
and environment model, and the dialog model, which is
responsible for the dynamic user interface behavior. In the next
step, the final user interface model (FUI) is generated by
parsing the CUI model and by exploiting the context of use
model and the layout model [
          <xref ref-type="bibr" rid="ref15">15</xref>
          ].
        </p>
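<p>The staged CRF transformation described above can be sketched as a chain of functions from the domain model to the FUI; the model contents below are placeholders, only the staging mirrors the text.</p>
<p>
```python
# Sketch of the CRF-style transformation chain: domain model to abstract
# (AUI), concrete (CUI), and final (FUI) user interface. Model contents
# are hypothetical placeholders; only the staging follows the text.

def to_aui(domain_model):
    # abstract UI objects, still independent of the context of use
    return [{"abstract": task} for task in domain_model["tasks"]]

def to_cui(aui, context_of_use, dialog_model):
    # bind abstract objects to concrete widgets for this context of use
    widget = "touch_button" if context_of_use["device"] == "tablet" else "button"
    return [{"widget": widget, "for": obj["abstract"], "dialog": dialog_model}
            for obj in aui]

def to_fui(cui, layout_model):
    # parse the CUI and emit renderable elements using the layout model
    return [f"{layout_model}:{e['widget']}({e['for']})" for e in cui]

domain = {"tasks": ["search_product", "view_product"]}
cui = to_cui(to_aui(domain), {"device": "desktop"}, dialog_model="d1")
print(to_fui(cui, layout_model="grid"))
# ['grid:button(search_product)', 'grid:button(view_product)']
```
</p>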
        <p>The first displayed version of the product e-commerce
website is already adapted to the user, for example by using
the age and target device information from the context of use
model. The SitAdapt 2.0 system permanently monitors the user
and recognizes the situations she or he is experiencing while
viewing the webpage and interacting with the user interface.
The evaluation component recognizes in the first example (Fig.
3), that the user reads the text attentively and that the level of
the happy emotion surpasses a given minimum duration (e.g.
more than 5 seconds). These data come from the raw situation
sequences stored in the database by the recording component.
The various inputs from the Facereader (emotion) and eye
tracking (text screen field) and the metadata of the website
(URL) were evaluated just in time by the situation analytics
component that has created the following situation profile of
the current situation:
&lt;SituationProfile&gt;
  &lt;TargetApplication&gt;Desktop_PC
  ...
  &lt;Situation_product_view&gt;
    &lt;FUI_link&gt; product_view
    &lt;AUI_link&gt; model_AUI_product_view_1
    &lt;CUI_link&gt; model_CUI_product_view_1
    &lt;Dialog_link&gt; product_view
    &lt;Task_link&gt; product_view
    &lt;Concept_link&gt; model_concept_Product_view_1
    &lt;Eye_Tracking&gt; Product_Textbox_Product_1(10 s)
    &lt;USRUA_Age_Range&gt; 30-50
    &lt;USRUA_Gender&gt; female
    &lt;EmotionalState&gt; happy
    &lt;UserPsychologicalState&gt;
    &lt;BiometricState&gt;
      &lt;Pulse&gt; normal
      &lt;StressLevel&gt; green
    ...
  &lt;/Situation_product_view&gt;
&lt;/SituationProfile&gt;</p>
        <p>The evaluation and adaptation component examines the
situation profile to decide if an adaptation can take place. This
is usually achieved by activating the responsible sub-set of
situation rules in the rule editor (Fig. 2). Alternatively, the
programmer or web designer can directly provide code for
interpreting the situation profile in the web application client or
server, which is triggered when the user interacts with specific
elements of the user interface.</p>
        <p>In the concrete example, the situation rules specify that
additional information about other winter cosmetic products
should be displayed in this particular situation. The decision
can be refined by also taking into account the user persona, if it
is already known. For a strictly goal-oriented persona, a new
window with additional product information may be shown.
For a more cautious persona, a question text may appear,
whether additional product information about winter products
is welcome.</p>
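<p>This persona-dependent refinement can be sketched as a simple dispatch on the persona category; the persona names and action texts are illustrative assumptions.</p>
<p>
```python
# Sketch: refine the adaptation decision with the user persona, as in the
# winter-products example. Persona names and actions are illustrative.

def refine_action(persona):
    if persona == "goal_oriented":
        # strictly goal-oriented: show the information right away
        return "open window with additional winter product information"
    if persona == "cautious":
        # cautious: ask before showing additional content
        return "ask: would you like to see more winter products?"
    # persona unknown: fall back to the plain rule action
    return "apply default rule action"

print(refine_action("cautious"))
```
</p>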
        <p>The evaluation and adaptation component now starts the
adaptation process, which leads to the generation of a modified
user interface. A new CUI and subsequently a FUI is generated
and displayed to the user. The PaMGIS modeling environment
must provide all the models, model variants, and
model fragments necessary for user interface modifications.
User interface models may contain links to HCI-patterns that
can facilitate user interface code generation. More complex
adaptations may also activate different tasks specified in the
task model and require the activation of non-UI service code.</p>
        <p>By observing the users’ emotional behavior after such
adaptations, the quality of the situation rules and the respective
adaptations can be evaluated and rated. Such information can
be used offline for refining the situation rule set for later use.</p>
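<p>The offline rating of a situation rule by the emotional reactions observed after its adaptations can be sketched as follows; the scoring scheme is an assumption for illustration.</p>
<p>
```python
# Sketch: rate a situation rule offline by the users' emotional reactions
# observed after its adaptation fired. The scoring scheme is an assumption.

def rate_rule(reactions):
    """reactions: list of emotional states observed after the adaptation.
    Returns a score in [-1, 1]; negative scores suggest refining the rule."""
    positive = sum(1 for r in reactions if r == "happy")
    negative = sum(1 for r in reactions if r in ("angry", "disgusted"))
    return (positive - negative) / len(reactions)

observed = ["happy", "happy", "neutral", "angry", "happy"]
print(round(rate_rule(observed), 2))  # 0.4
```
</p>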
      </sec>
    </sec>
    <sec id="sec-4">
      <title>IV. CONCLUSION</title>
      <p>SitAdapt 2.0 is an advanced architecture for automating
user interface adaptation for responsive web applications
constructed with the PaMGIS MBUID framework. This
paper has demonstrated the flexibility and versatility of this
new approach by testing it with different scenarios and touch
points of the customer journey on a commercial e-business
platform for cosmetics products. It could be demonstrated that
emotion recognition combined with eye- and gaze-tracking can
be a powerful method for assessing situations and finding
possible adaptations at application runtime. By the lab-based
observation of users through multiple visual and physical
channels we could establish a basis for improving the customer
experience in the pre-purchase and post-purchase phases,
because the gathered knowledge can be used for optimizing a
priori and persona-based adaptations and can lead to improved
situation rules. A follow-up study that is currently under way
will evaluate the effectiveness of the persona-based a priori and
situation-aware runtime adaptations for the perceived
individual customer experience.</p>
      <p>In the future we will combine SitAdapt 2.0 functionality
with web and big-data analytics to further improve the
customer experience of e-business applications and to evaluate
which of the applied user monitoring technologies can be
helpful in situation-aware end-user environments.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          <string-name>
            <surname>Bissinger</surname>
            ,
            <given-names>B.C.</given-names>
          </string-name>
          :
          <article-title>Messung und Analyse von bio-physischen und visuellen Daten zur Optimierung der User Experience und des Digitalmarketings</article-title>
          ,
          <source>Master Thesis</source>
          , Business Information Systems, Augsburg University of Applied Sciences,
          <year>2018</year>
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          <string-name>
            <surname>Calvary</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Coutaz</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Bouillon</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          et al.,
          <year>2002</year>
          . “
          <article-title>The CAMELEON Reference Framework”</article-title>
          .
          <source>Retrieved August 25</source>
          ,
          <year>2016</year>
          from http://giove.isti.cnr.it/projects/cameleon/pdf/CAMELEON%20D1.1RefFramework.pdf
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          <string-name>
            <surname>Chang</surname>
            ,
            <given-names>C.K.</given-names>
          </string-name>
          :
          <article-title>Situation Analytics: A Foundation for a New Software Engineering Paradigm</article-title>
          , IEEE Computer, Jan.
          <year>2016</year>
          , pp.
          <fpage>24</fpage>
          -
          <lpage>33</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          <string-name>
            <given-names>P.</given-names>
            <surname>Ekman</surname>
          </string-name>
          , “
          <article-title>An argument for basic emotions</article-title>
          ,” Cogn. Emot., vol.
          <volume>6</volume>
          , no.
          <issue>3-4</issue>
          , pp.
          <fpage>169</fpage>
          -
          <lpage>200</lpage>
          ,
          <year>1992</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          <string-name>
            <surname>Engel</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Märtin</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Forbrig</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          :
          <article-title>Practical Aspects of Pattern-supported Model-driven User Interface Generation</article-title>
          .
          <source>Proc. HCI International</source>
          <year>2017</year>
          , Vancouver, Canada,
          <fpage>9</fpage>
          -
          <issue>14</issue>
          <year>July</year>
          , Vol. I,
          <string-name>
            <surname>Springer</surname>
            <given-names>LNCS</given-names>
          </string-name>
          ,
          <year>2017</year>
          , pp.
          <fpage>397</fpage>
          -
          <lpage>414</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          <string-name>
            <surname>Engel</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Märtin</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Forbrig</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          :
          <article-title>A Concerted Model-driven and Pattern-based Framework for Developing User Interfaces of Interactive Ubiquitous Applications</article-title>
          ,
          <source>Proc. First Int. Workshop on Large-scale and Model-based Interactive Systems</source>
          , Duisburg, (
          <year>2015</year>
          ), pp.
          <fpage>35</fpage>
          -
          <lpage>41</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          <string-name>
            <surname>Flach</surname>
            ,
            <given-names>J.M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Mulder</surname>
            , M, Van Paassen,
            <given-names>M.M.:</given-names>
          </string-name>
          <article-title>The Concept of the Situation in Psychology</article-title>
          , in: Banbury,
          <string-name>
            <given-names>S.</given-names>
            and
            <surname>Tremblay</surname>
          </string-name>
          , S. (eds):
          <article-title>A Cognitive Approach to Situation Awareness: Theory and Applications</article-title>
          , Ashgate Publishing,
          <source>Oxon (UK)</source>
          , (
          <year>2004</year>
          ), pp.
          <fpage>42</fpage>
          -
          <lpage>60</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          <string-name>
            <surname>Galindo</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          et al.:
          <article-title>Using user emotions to trigger UI adaptation</article-title>
          ,
          <source>Proc. 12th Int. Conf. on Research Challenges in Information Science (RCIS)</source>
          , (
          <year>2018</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          <string-name>
            <surname>Halvorsrud</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kvale</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Følstad</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          :
          <article-title>Improving Service Quality through Customer Journey Analysis</article-title>
          ,
          <source>J. of service theory and practice</source>
          , vol.
          <volume>26</volume>
          ,
          <issue>6</issue>
          , (
          <year>2016</year>
          ), pp.
          <fpage>840</fpage>
          -
          <lpage>867</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <surname>Herdin</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Märtin</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Forbrig</surname>
            ,
            <given-names>P.:</given-names>
          </string-name>
          <article-title>SitAdapt: An architecture for situation-aware runtime adaptation of interactive systems</article-title>
          .
          <source>Proc. HCI International</source>
          <year>2017</year>
          , Vancouver, Canada,
          <fpage>9</fpage>
          -
          <issue>14</issue>
          <year>July</year>
          , Vol. I,
          <string-name>
            <surname>Springer</surname>
            <given-names>LNCS</given-names>
          </string-name>
          ,
          <year>2017</year>
          , pp.
          <fpage>447</fpage>
          -
          <lpage>455</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <surname>Hibbeln</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Jenkins</surname>
            ,
            <given-names>J.L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Schneider</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Valacich</surname>
            ,
            <given-names>J.S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Weinmann</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          :
          <article-title>How Is Your User Feeling? Inferring Emotion Through Human-Computer Interaction Devices</article-title>
          ,
          <source>MIS Quarterly</source>
          , vol.
          <volume>41</volume>
          ,
          <issue>1</issue>
          , (
          <year>2017</year>
          ), pp.
          <fpage>1</fpage>
          -
          <lpage>21</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <surname>Hudlicka</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          and
          <string-name>
            <surname>McNeese</surname>
            ,
            <given-names>M.D.</given-names>
          </string-name>
          :
          <article-title>Assessment of user affective and belief states for interface adaptation: Application to an Air Force pilot task</article-title>
          ,
          <source>User Model. User-Adapt. Interact.</source>
          , vol.
          <volume>12</volume>
          , no.
          <issue>1</issue>
          , pp.
          <fpage>1</fpage>
          -
          <lpage>47</lpage>
          ,
          <year>2002</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [13]
          <string-name>
            <surname>Lemon</surname>
            ,
            <given-names>K.N.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Verhoef</surname>
            ,
            <given-names>P.C.</given-names>
          </string-name>
          :
          <article-title>Understanding Customer Experience Throughout the Customer Journey</article-title>
          ,
          <source>J. of Marketing: AMA/MSI Special Issue</source>
          , Vol.
          <volume>80</volume>
          (
          <issue>Nov</issue>
          .
          <year>2016</year>
          ), pp.
          <fpage>69</fpage>
          -
          <lpage>97</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [14]
          <string-name>
            <surname>Märtin</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Herdin</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          :
          <article-title>Enabling Decision-Making for Situation-Aware Adaptations of Interactive Systems</article-title>
          ,
          <source>Proc. 10th Forum Media Technology, FMT</source>
          <year>2017</year>
          , 29-30 Nov., St. Pölten, Austria, (
          <year>2017</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          [15]
          <string-name>
            <surname>Märtin</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Herdin</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Engel</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          :
          <article-title>Model-based User-Interface Adaptation by Exploiting Situations, Emotions and Software Patterns</article-title>
          ,
          <source>Proc. CHIRA</source>
          <year>2017</year>
          , Funchal, Madeira, Portugal, 31 October - 2 November, SCITEPRESS (
          <year>2017</year>
          ), pp.
          <fpage>50</fpage>
          -
          <lpage>59</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          [16]
          <string-name>
            <surname>Märtin</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Rashid</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Herdin</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          :
          <article-title>Designing Responsive Interactive Applications by Emotion-Tracking and Pattern-based Dynamic User Interface Adaptation</article-title>
          .
          <source>Proc. of HCII</source>
          <year>2016</year>
          , Toronto, 17-22 July, Vol. III, Springer LNCS,
          <year>2016</year>
          , pp.
          <fpage>28</fpage>
          -
          <lpage>36</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          [17]
          <string-name>
            <surname>Meixner</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Calvary</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Coutaz</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          :
          <article-title>Introduction to model-based user interfaces</article-title>
          .
          <source>W3C Working Group Note 07 January</source>
          <year>2014</year>
          . http://www.w3.org/TR/mbui-intero/. Accessed 27 May 2015
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          [18]
          <string-name>
            <surname>Melchior</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Vanderdonckt</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Van Roy</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          :
          <article-title>A Model-Based Approach for Distributed User Interfaces</article-title>
          ,
          <source>Proc. EICS</source>
          <year>2011</year>
          , ACM (
          <year>2011</year>
          ), pp.
          <fpage>11</fpage>
          -
          <lpage>20</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>
          [19]
          <string-name>
            <surname>Meudt</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          et al.:
          <article-title>Going Further in Affective Computing: How Emotion Recognition Can Improve Adaptive User Interaction</article-title>
          , in: A. Esposito and L.C. Jain (eds.),
          <source>Toward Robotic Socially Believable Behaving Systems - Vol. 1, Intelligent Systems Reference Library 105</source>
          , Springer Int. Publishing Switzerland
          (
          <year>2016</year>
          ), pp.
          <fpage>73</fpage>
          -
          <lpage>103</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref20">
        <mixed-citation>
          [20]
          <string-name>
            <surname>Nasoz</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          :
          <article-title>Adaptive intelligent user interfaces with emotion recognition</article-title>
          , University of Central Florida, Orlando, Florida,
          <year>2004</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref21">
        <mixed-citation>
          [21]
          <string-name>
            <surname>Picard</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          :
          <article-title>Recognizing Stress, Engagement, and Positive Emotion</article-title>
          ,
          <source>Proc. IUI</source>
          <year>2015</year>
          , March 29-April 1, Atlanta, GA
          , USA, pp.
          <fpage>3</fpage>
          -
          <lpage>4</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref22">
        <mixed-citation>
          [22]
          <string-name>
            <surname>Schilit</surname>
            ,
            <given-names>B.N.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Theimer</surname>
            ,
            <given-names>M.M.</given-names>
          </string-name>
          :
          <article-title>Disseminating Active Map Information to Mobile Hosts</article-title>
          ,
          <source>IEEE Network</source>
          , vol.
          <volume>8</volume>
          , no.
          <issue>5</issue>
          , pp.
          <fpage>22</fpage>
          -
          <lpage>32</lpage>
          , (
          <year>1994</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref23">
        <mixed-citation>
          [23]
          <string-name>
            <surname>Schmidt</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          :
          <article-title>Biosignals in Human-Computer Interaction</article-title>
          ,
          <source>Interactions</source>
          , Jan-Feb, (
          <year>2016</year>
          ), pp.
          <fpage>76</fpage>
          -
          <lpage>79</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref24">
        <mixed-citation>
          [24]
          <string-name>
            <surname>Yigitbas</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Sauer</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Engels</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          :
          <article-title>A Model-Based Framework for Multi-Adaptive Migratory User Interfaces</article-title>
          . In:
          <source>Proceedings of the HCI</source>
          <year>2015</year>
          , Part II, LNCS 9170, Springer (
          <year>2015</year>
          ), pp.
          <fpage>563</fpage>
          -
          <lpage>572</lpage>
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>