Methodology to Automate Intelligent User Interfaces’
Development
Alberto Gaspar Villalba
Escola Tècnica Superior d'Enginyeria, Departament d’Informàtica
Universitat de València,
Avenida de la Universidad, s/n, 46100, Burjassot, València, Spain

                               Abstract
The key to reaching high software acceptability lies in how end-users interact with systems. In this line, Intelligent User Interfaces have emerged as a solution that aims to learn about users' behavior and adapt the interfaces to the users' characteristics, likes, and preferences. The design of Intelligent User Interfaces involves a series of challenges, including enhancing the usability and adaptability of these interfaces and maintaining a high-quality user experience. To overcome these challenges, Intelligent User Interfaces rely on automatic learning to adapt widgets according to the users' characteristics. The main contribution of this thesis is to design a methodology to automate the process of collecting and classifying user characteristics as much as possible and to customize, at runtime, the widgets to such characteristics. These customizations will affect both the user who is currently interacting and future users who share the same profile as the current user. The proposal is based on the use of conceptual models to: 1) define a user model to represent user characteristics abstractly, and 2) specify, in an interaction model, the widgets that can be modified and the adaptation rules to modify them. The methodology used in the thesis is Design Science, which includes the definition of the artifacts and their evaluation.

Keywords
Intelligent user interfaces, user characteristics, methodology, adaptation, Design Science.


1. Context, problem statement and Research Questions

One of the current challenges in software development is quality, which is a software characteristic with
several subcharacteristics [1]. One of these subcharacteristics is usability, which is strongly related to
the subjective preferences of end-users. Among the different attributes that contribute to usable systems
is customization. This means that systems can be adapted to end-user preferences, goals, interests, etc.
However, how the systems are adapted is a research challenge, as adaptations should be performed in a
subtle way while enhancing usability. For this purpose, the concept of Intelligent User
Interfaces (IUI) has arisen in recent years to improve human-computer interaction [2]. IUIs are defined
as a subfield of Human Computer Interaction (HCI) that relies on adaptivity at runtime, modifying
interaction features based on the user’s current usage [3]. To develop IUIs, the following questions must
be answered: 1) What is considered intelligent?, 2) How is this intelligence characterized?, and 3) What
abilities are attributed to an intelligent entity? [4]. In the process of implementing IUIs, the system must
detect the characteristics of the end-user and the context to customize the Graphical User Interface
(GUI) to them. These interfaces require the support of an artificial intelligence (AI) algorithm
to learn about the user profile and save the user's interaction preferences.
    Designing IUIs involves several challenges that motivate the work of the current thesis:


Proceedings of the Doctoral Consortium Papers Presented at the 35th International Conference on Advanced Information Systems Engineering
(CAiSE 2023), June 12–16, 2023, Zaragoza, Spain
EMAIL: alberto.gaspar@uv.es (A. Gaspar)
ORCID: 0000-0002-9236-4391 (A. Gaspar)
                            © 2023 Copyright for this paper by its authors.
                            Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).
                            CEUR Workshop Proceedings (CEUR-WS.org)

     1. Definition of a user model to abstractly represent end-user characteristics and context.
         This challenge aims to analyze existing user models and propose a new conceptual model that
         gathers as many end-user characteristics as possible. This new user model can include elements
          not present in previous user models if necessary [5, 6]. This new user model should
         identify which of these end-user’s characteristics are necessary to customize the GUI,
         distinguishing between static characteristics (such as gender or age) and dynamic
         characteristics (such as context or emotional status) [7].
     2. Definition of an artificial intelligence algorithm to learn end-user preferences while the
         user interacts with the system. This algorithm is intended to learn about user’s characteristics
         to determine the user’s profile. This algorithm must detect when the model misidentifies the
          user profile; in other words, it must detect false positives and false negatives [8]. Moreover, it should
          reduce the time needed to detect and learn about the user profile [9], and identify the end-users'
          target when using the system. This target may involve different ways of interaction [10].
     3. Definition of an algorithm to customize the GUI’s widgets to be compliant with the user
         profile. Based on the preferences stored in the user model and the end-user’s target, the system
         must automatically adapt the GUI by customizing its widgets [11]. It is also necessary to
         establish a mechanism that allows the user to modify the GUI’s customization performed by
         the system.
    This thesis aims to overcome all these challenges from a perspective of conceptual models. The
problem statement of the thesis is to define a methodology to automate the customization of IUIs
using conceptual models. This methodology starts with the definition of a user model that contains the
static and dynamic characteristics of users. Next, the methodology uses an interaction model that
abstractly represents the GUIs, such as UsiXML [12], to identify the metrics that collect the user
characteristics to complete an instance of the user model. The interaction model is based on abstract
representations of the GUI. Specifically, there is an Abstract Model to represent GUIs independently of
the platform and a Concrete Model that specifies a GUI for a specific platform. Once the user model has
been populated, an artificial intelligence algorithm is used to dynamically modify the interaction models
to customize the widgets to end-user preferences. This methodology enables the automation of the
customization process for IUIs at runtime, reducing the need for manual intervention and enhancing
the user experience. For example, the GUI of an autonomous car will hide some components while the
user is driving and display them again when the car is stopped. Similarly, the GUI of an e-commerce
application will display graphical components at a higher resolution when the user interacts from a
personal computer and reduce their resolution when the user interacts from a mobile device.
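    To illustrate how such runtime adaptations could be expressed, the following Python sketch encodes the
two scenarios above as simple condition-action rules. The names (Widget, Context, AdaptationRule) and the
rule structure are illustrative assumptions for this example, not the notation proposed in this thesis.

from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Widget:
    # Concrete widget with the presentation properties the rules may change.
    name: str
    visible: bool = True
    resolution: str = "high"


@dataclass
class Context:
    # Dynamic context captured while the user interacts (illustrative fields).
    driving: bool = False
    device: str = "desktop"


@dataclass
class AdaptationRule:
    # Applies an action to the named widgets when the condition holds.
    condition: Callable[[Context], bool]
    targets: List[str]
    action: Callable[[Widget], None]


def adapt_gui(widgets: List[Widget], rules: List[AdaptationRule], ctx: Context) -> None:
    # Apply every rule whose condition matches the current context.
    for rule in rules:
        if rule.condition(ctx):
            for widget in widgets:
                if widget.name in rule.targets:
                    rule.action(widget)


# Rules mirroring the two scenarios above (hypothetical widget names).
rules = [
    AdaptationRule(lambda c: c.driving, ["media_player"],
                   lambda w: setattr(w, "visible", False)),
    AdaptationRule(lambda c: c.device == "mobile", ["product_gallery"],
                   lambda w: setattr(w, "resolution", "low")),
]

widgets = [Widget("media_player"), Widget("product_gallery")]
adapt_gui(widgets, rules, Context(driving=True, device="mobile"))
print(widgets)  # media_player hidden, product_gallery shown at low resolution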
    To address the problem statement above, we propose the following high-level research
question: How can we specify intelligent user interfaces based on conceptual models? To answer
this question, we decompose it into several sub-questions:
    RQ1: What is the user model that defines the most important user’s characteristics?
    RQ2: How to collect user characteristics and how to analyze these user characteristics?
    RQ3: Which widgets should be customized to the user and how to conduct the customization?
    This work is structured as follows. Section 2 provides an overview of the previous literature on IUIs.
Section 3 presents the methodology followed in this thesis. Section 4 describes the proposed
objectives. Finally, Section 5 presents the conclusions.

2. Related works

In the development of Intelligent User Interfaces (IUI), researchers have explored various approaches
to make interfaces more adaptive and personalized to users' needs. Some of the related works in the
development of IUI include user modeling and profiling techniques, which involve collecting
information about users and using it to personalize their experience. In this field, Al Seraj et al. [13]
provide a survey of existing user modeling techniques, including both rule-based and data-driven
approaches. Zanker et al. [14] explore a machine learning approach to user modeling, showing how this
approach can be used to provide better recommendations. The limitations of these works include the


use of limited data sets, the difficulty of obtaining accurate user information, and the challenge of
developing algorithms that can effectively model user behavior and context.
    Other related works in the development of IUI deal with adaptive interfaces, which can change and
evolve based on user feedback and behavior. In this line, the Adaptive User Interfaces Toolkit (AUIT)
[15] focuses on the scaling and rotation of different widgets. AUIT includes sensors that detect user
movements, the direction in which the user is looking, the distance between the user and a device,
and whether there is an obstruction between the user and the system. Mayer et al. [16] present a study in which
they compare different adaptable user interfaces based on user context and define the difference
between adaptable and adaptive user interfaces. The study aims to evaluate the differences and
similarities of adaptive interfaces to analyze the applications, limitations, and possibilities of
harmonization. The authors claim that each system implements adaptive widgets differently, making it
difficult to propose a general method for designing any IUI. In the same line, Pfeuffer et al. [17] conduct
a study to determine the user experience and design parameters in augmented reality (AR) applications
based on user’s gaze. According to the authors, the user’s gaze can provide information about the world
without overloading the user. They conclude that the design of AR adaptive interfaces has several
difficulties caused by many underlying factors, ranging from the scenarios and interaction tasks to the
dynamic characteristics of the users. All these works highlight the complexity of designing and
implementing IUI, and the need for user-centered design and evaluation methods.
    Furthermore, there have been many prior studies on the use of IUI. In the field of recommendation
systems, there are frameworks such as the Personality and Emotion Integrated Attentive (PEIA)
framework [18]. This framework considers user's likes, emotions, and preferences to provide music
recommendations. MemoMusic [19] uses the user's current emotional state and previous musical tastes
to rank songs and suggest the one that is aligned with their current emotional state. Kim et al. [20]
present a music recommendation system based on human activity measured by real-time sensors such
as accelerometers and gyroscopes. However, the main limitation of these works is that they are all focused
on user’s emotional status and preferences, and these data are difficult to obtain automatically.
    Finally, there are other works focused on developing IUI from conceptual models. Woodward et
al. [21] present a conceptual model for developing IUI that improves children's cognitive abilities
and motor skills. This model identifies seven major high-level themes that are specific for children.
Three of these themes, User Behavior, User Input, and User Context, are present in users; three others,
System Output, System Behavior, and System Context, are present in the system; and the System
Intelligence theme serves as a link between the user and the system. In the same line, the User Interface-Dynamic
Software Product Line (UI-DSPL) methodology [22] combines the final user interface model defined
in Model-Based User Interface Development (MBUID) and the real-time adaptation defined in
Dynamic Software Product Line (DSPL) to define the different rules to adapt the IUI. UI-DSPL
describes two types of adaptation: the context constraint, where the relationship between the user context
and the system context is described, and the aspect constraint, where the modifications of the graphical
components are defined. Yigitbas et al. [23] present a Model-Driven Development approach to define
Self-Adaptive User Interfaces (SAUIs) with real-time adaptation. This approach is based on the IFML
language and combines ContextML (to adjust to the context-of-use) with AdaptML (to adapt the
graphical components). The limitation of these works is that they are focused on the user’s context,
overlooking other important user characteristics such as emotional status or disabilities.
    The main limitations of previous research on IUI include the use of limited datasets, the difficulty
of obtaining accurate information from the user automatically, the challenge of developing algorithms
that can effectively model user behavior and context, and the lack of a systematic approach to customize
and personalize GUIs at runtime. This thesis aims to overcome these limitations by providing a more
sophisticated user model to capture relevant user characteristics and context, a machine learning
algorithm to analyze the diverse user characteristics extracted, and a methodology to systematize and
automate the customization of GUIs.

3. Research Methodology

We use the Design Science Research (DSR) approach [24, 25] to answer the research questions. This
methodology is based on the design and research of the different artifacts within a specific context. It

consists of two main phases: a design phase, in which an analysis is carried out with the aim of making a
real-world change, and a research phase, in which a real or hypothetical analysis of the real world is
conducted with the goal of changing certain aspects. Multiple solutions can be proposed by the designers
in the project, and these solutions are evaluated according to their utility in achieving the project
objectives. These phases iterate cyclically, defining a series of questions in the design phase and solving
them in the research phase.
For this thesis, in the design phase, we will analyze which user characteristics are the most important for
establishing the different groups of users; which characteristics should be collected automatically by the
system and which should be defined manually by the end-user; how users are classified according
to these characteristics; which widgets abstractly represented in an interaction model can be customized
to the end-user's profile; and how to customize widgets dynamically from interfaces defined in
interaction models. In the research phase, we plan to conduct several empirical experiments with
students to analyze whether they consider these user characteristics as relevant for the GUI
customization. Moreover, we plan to conduct an experiment to evaluate whether the metrics to collect users'
characteristics work properly. Finally, we plan to evaluate whether the customized GUI meets the users'
usability expectations. This evaluation will be done using the System Usability Scale (SUS) [26], a
ten-item questionnaire answered on a 5-point Likert scale.
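    As a reference for this planned evaluation, the standard SUS scoring defined by Brooke [26] maps the
ten 1-5 responses onto a 0-100 scale. The following Python sketch shows the computation; the example
responses are hypothetical.

from typing import Sequence


def sus_score(responses: Sequence[int]) -> float:
    # Standard SUS scoring: odd-numbered items contribute (response - 1),
    # even-numbered items contribute (5 - response); the sum is scaled by 2.5
    # to obtain a value between 0 and 100.
    if len(responses) != 10 or any(not 1 <= r <= 5 for r in responses):
        raise ValueError("SUS expects ten responses in the range 1-5")
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # index 0 corresponds to item 1
        for i, r in enumerate(responses)
    ]
    return sum(contributions) * 2.5


# Example questionnaire (hypothetical responses): yields a score of 80.0.
print(sus_score([4, 2, 4, 1, 5, 2, 4, 2, 4, 2]))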




   Figure 1: Definition of the work packages formulated in this thesis.

    To implement these phases, we will design a series of work packages, represented in Figure 1.
Additionally, we estimate the time for each work package. For the design phase, we define the
following packages:
        • WP1: To define a user model with the most relevant user characteristics. (Month 1 to
           Month 6)
        • WP2: To conduct a survey to test the level of acceptance of the chosen user
           characteristics. (Month 7 to Month 8)
        • WP3: To define metrics to collect the user characteristics. (Month 9 to Month 14)
        • WP5: To define an algorithm to analyze the user characteristics and identify the user
           profile. (Month 18 to Month 24)
       • WP6: To define the widgets that are going to be customized. (Month 25 to Month 30)
   For the research phase, we design the following work packages:
       • WP4: To conduct an experiment to check the correctness of the proposed metrics.
           (Month 14 to Month 17)
        • WP7: To conduct an experiment to validate users' satisfaction with the customization
           and the usability of the IUI after applying our proposed method. (Month 30 to Month 35)

4. Research Contributions of this work and current status

The main contribution of this thesis is to provide a methodology to automate the process of
developing intelligent user interfaces using conceptual models. To achieve this objective, more
specific subobjectives have been defined:

4.1 Define a user model with the user’s characteristics.

This subobjective aims to determine which user characteristics must be included in the user model to
customize the GUI according to the user’s profile. We have reviewed the existing literature to identify
the characteristics that need to be included in a user model to represent different users’ profiles. All
these characteristics were gathered in the proposed user model. To validate this new user model, we
conducted a survey where users chose which user characteristics they considered relevant for the
customization of GUIs in different contexts. The results were compared with the characteristics most
highly rated in existing works. Finally, we conducted a study to look for a correlation between the demographic
characteristics of the users and the characteristics expressed in the user model. Currently, the user model
has been proposed and validated with 62 subjects. The results yielded that users consider interests,
disabilities, and language as the most important characteristics to customize the GUI.
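    To illustrate the shape such a user model can take, the following Python sketch groups static and dynamic
characteristics. The attribute names (language, disabilities, interests, emotional_state, context, device) are
assumptions drawn from the characteristics discussed above, not the final metamodel of the thesis.

from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class StaticCharacteristics:
    # Characteristics that rarely change during the interaction.
    age: Optional[int] = None
    gender: Optional[str] = None
    language: Optional[str] = None
    disabilities: List[str] = field(default_factory=list)
    interests: List[str] = field(default_factory=list)


@dataclass
class DynamicCharacteristics:
    # Characteristics that evolve while the user interacts with the system.
    emotional_state: Optional[str] = None
    context: Optional[str] = None   # e.g. "driving", "at the office"
    device: Optional[str] = None    # e.g. "mobile", "desktop"


@dataclass
class UserModel:
    # Abstract representation of an end-user; users with the same profile_id
    # share the same GUI customizations.
    profile_id: str
    static: StaticCharacteristics = field(default_factory=StaticCharacteristics)
    dynamic: DynamicCharacteristics = field(default_factory=DynamicCharacteristics)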

4.2 Definition of metrics to extract the user’s profile.

This subobjective aims to define metrics that specify how the user characteristics expressed in the user
model are extracted from the interaction between the user and the GUI. Some user characteristics must
be manually defined by the user in a form, and some others can be extracted automatically considering
the interaction actions. All the collected characteristics will be used to learn about the user profile. Users
that share the same profile will have the same GUI customizations.
    The metrics that can be applied automatically will be defined using conceptual primitives of an
interaction model that represents GUIs abstractly. These metrics must specify the source widgets used
to extract the users' characteristics. For example, to learn a user's preferences, we can specify a metric
that counts the number of clicks on a conceptual primitive that represents an action in the GUI. The
more clicks this action receives, the higher the preference for this widget within that user's profile. As
the interaction model, we plan to use UsiXML [12], which is widely used in the Model-Driven context. This
model divides the GUI into different abstraction levels. Our approach focuses on two of its models: the
abstract model and the concrete model. The abstract model specifies which elements are required
in the GUI, while the concrete model specifies how the widgets are displayed in the GUI.
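    As a minimal sketch of such an automatic metric, the following Python fragment counts clicks per abstract
interaction element and derives a preference score as each element's share of the total clicks. The element
identifiers and the preference computation are illustrative assumptions, since the concrete metric definitions
are still under design.

from collections import Counter
from typing import Dict


class ClickPreferenceMetric:
    # Counts clicks per abstract interaction element and derives a preference
    # score. The identifiers are assumed to come from the abstract level of the
    # interaction model; the mapping from concrete widget events to abstract
    # elements is not shown here.

    def __init__(self) -> None:
        self._clicks: Counter = Counter()

    def record_click(self, abstract_element_id: str) -> None:
        # Register one interaction with the given conceptual primitive.
        self._clicks[abstract_element_id] += 1

    def preferences(self) -> Dict[str, float]:
        # Return each element's share of the total clicks (0.0 to 1.0).
        total = sum(self._clicks.values())
        if total == 0:
            return {}
        return {elem: count / total for elem, count in self._clicks.items()}


# The more clicks an abstract action receives, the higher its preference score.
metric = ClickPreferenceMetric()
for _ in range(7):
    metric.record_click("search_products")
for _ in range(3):
    metric.record_click("browse_offers")
print(metric.preferences())  # {'search_products': 0.7, 'browse_offers': 0.3}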
    The metrics that depend on information provided by the user require a form that the user must
fill in manually. An example of this type of metric is the choice of hobbies from a list. The proposed
metrics will be evaluated by comparing the information extracted from the user with the real information.
We need to study the actions performed on the GUI to analyze the automatic metrics and to study the
information provided in the forms for the manual metrics. After defining these metrics, we will define
an AI algorithm to analyze the user characteristics proposed in the user model. Currently, we have
defined which metrics must be entered manually by the user and which metrics can be applied by the
system automatically. The definition of the AI algorithm to analyze user characteristics has not been
started yet.


4.3 Specify the widgets' adaptation automatically.

This subobjective aims to specify how to dynamically customize widgets based on the information
saved in the user model for users with the same profile as the user currently interacting with the system.
To achieve this, we need to analyze previous works that deal with how user characteristics may affect
widgets. We plan to use the UsiXML language to model the GUI and specify an algorithm to customize
widgets at runtime based on the user’s profile. We also aim to conduct an empirical experiment to
compare the customized GUI after applying our proposed method with the original GUI to analyze the
usability. Usability will be measured in terms of effectiveness, efficiency, and satisfaction to analyze
whether the customized GUI improves usability. Once the widgets have been adapted automatically,
the users can correct the adaptations according to their preferences. These preferences will take
priority in the adaptation over the characteristics expressed in the user model. This subobjective has not
been started yet.
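    The intended precedence (explicit user corrections take priority over the profile-driven adaptation) can
be sketched as follows in Python; the property names and values are hypothetical, since this subobjective is
still unimplemented.

from typing import Dict


def resolve_widget_properties(profile_adaptation: Dict[str, str],
                              user_overrides: Dict[str, str]) -> Dict[str, str]:
    # Merge profile-driven adaptations with explicit user corrections;
    # whatever the user corrected wins, everything else comes from the profile.
    resolved = dict(profile_adaptation)
    resolved.update(user_overrides)
    return resolved


# The profile hides the sidebar, but this user asked to keep it visible.
profile_adaptation = {"sidebar": "hidden", "font_size": "large"}
user_overrides = {"sidebar": "visible"}
print(resolve_widget_properties(profile_adaptation, user_overrides))
# {'sidebar': 'visible', 'font_size': 'large'}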

5. Conclusion

This thesis aims to define a method to customize GUIs to users' preferences using conceptual models.
We aim to automate the customization process as much as possible, considering that the results of the
customization must improve usability. We have defined a user model to gather the characteristics of users
who share the same profile. As future work, we plan to define metrics to extract users' characteristics
and an algorithm to customize widgets in real-time. This thesis is in the context of Model-Driven
Development, where models are used to express system features.

Acknowledgements

I would like to thank Profs. Jose Ignacio Panach Navarrete (joigpana@uv.es) and Miriam Gil Pascual
(Miriam.gil@uv.es) of the Universitat de València for their supervision and revision of this thesis. This work was
developed with the support of the Generalitat Valenciana with GENI (CIAICO/2022/229) and the
support of the Spanish Ministry of Science and Innovation co-financed by FEDER in the project SREC
(PID2021-123824OB-I00).

References

[1] ISO/IEC: ISO/IEC 25000 - Software engineering - Software product Quality Requirements and
     Evaluation (SQuaRE) - Guide to SQuaRE (2010)
[2] Helldin, T., Bae, J., Alklind Taylor, A.-S.: Intelligent User Interfaces: Trends and application areas.
    University of Skövde (2019)
[3] Miraz, M.H., Ali, M., Excell, P.S.: Adaptive user interfaces and universal usability through
    plasticity of user interface design. Comput. Sci. Rev. 40, pp. 100363 (2021).
    10.1016/j.cosrev.2021.100363
[4] Völkel, S.T., Schneegass, C., Eiband, M., Buschek, D.: What is "intelligent" in intelligent user
    interfaces? a meta-analysis of 25 years of IUI. Proceedings of the 25th International Conference
    on Intelligent User Interfaces, pp. 477–487, Cagliari, Italy (2020)
[5] Abrahão, S., Insfran, E., Sluÿters, A., Vanderdonckt, J.: Model-based intelligent user interface
    adaptation: challenges and future directions. Software Systems Modeling 20, pp. 1335-1349
    (2021).
[6] Jalil, N.: Introduction to Intelligent User Interfaces (IUIs). Software Usability. IntechOpen (2021)
[7] Levy, S., Kraut, R.E., Yu, J.A., Altenburger, K.M., Wang, Y.-C.: Understanding Conflicts in
    Online Conversations. pp. 2592-2602 (2022)
[8] Rottmann, M., Maag, K., Chan, R., Hüger, F., Schlicht, P., Gottschalk, H.: Detection of False
    Positive and False Negative Samples in Semantic Segmentation. pp. 1351-1356 (2020)

[9] Tyagi, A., Rekha, G.: Challenges of Applying Deep Learning in Real-World Applications. pp. 92-
     118 (2020)
[10] Vázquez Ingelmo, A., García-Peñalvo, F.J., Therón Sánchez, R., Conde González, M.Á.:
     Extending a dashboard meta-model to account for users’ characteristics and goals for enhancing
     personalization. In: Proceedings of LASI-SPAIN 2019. Learning Analytics Summer Institute
     Spain 2019: Learning Analytics in Higher Education. CEUR Workshop Proceedings Series, pp.
     35-42. M. Caeiro-Rodríguez, Á. Hernández-García and PJ Muñoz-Merino, Eds, (2019)
[11] Bhardwaj, T., Upadhyay, H., Sharma, S.C.: Framework for Quality Ranking of Components in
     Cloud Computing: Regressive Rank. In: 2020 10th International Conference on Cloud Computing,
     Data Science & Engineering (Confluence), pp. 598-604. (2020)
[12] USIXML, last access: 02/20/2022, http://www.usixml.org/?download=usiXML-documentation-
     draft.pdf
[13] Seraj, M.S.A., Pastel, R., Al-Hasan, M.: A survey on User modeling In HCI. Computer
     Applications: An International Journal 5, pp. 21-28 (2018).
[14] Zanker, M., Rook, L., Jannach, D.: Measuring the impact of online personalisation: Past, present
     and future. Int. J. Hum. Comput. Stud. 131, pp. 160-168 (2019). 10.1016/j.ijhcs.2019.06.006
[15] Belo, J.M.E., Lystbæk, M.N., Feit, A.M., Pfeuffer, K., Kán, P., Oulasvirta, A., Grønbæk, K.: AUIT
     - the Adaptive User Interfaces Toolkit for Designing XR Applications. pp. 48:41-48:16 (2022)
[16] Mayer, C.C., Zimmermann, G., Grguric, A., Alexandersson, J., Sili, M., Strobbe, C.: A
     comparative study of systems for the design of flexible user interfaces. J. Ambient Intell. Smart
     Environ. 8, pp. 125-148 (2016). 10.3233/AIS-160370
[17] Pfeuffer, K., Abdrabou, Y., Esteves, A., Rivu, R., Abdelrahman, Y., Meitner, S., Saadi, A., Alt,
     F.: ARtention: A design space for gaze-adaptive user interfaces in augmented reality. Comput.
     Graph. 95, pp. 1-12 (2021). 10.1016/j.cag.2021.01.001
[18] Shen, T., Jia, J., Li, Y., Ma, Y., Bu, Y., Wang, H., Chen, B., Chua, T.-S., Hall, W.: Peia: Personality
     and emotion integrated attentive model for music recommendation on social media platforms. In:
     Proceedings of the AAAI conference on artificial intelligence, pp. 206-213. (2020)
[19] Mou, L., Li, J., Li, J., Gao, F., Jain, R.C., Yin, B.: MemoMusic: A Personalized Music
     Recommendation Framework Based on Emotion and Memory. pp. 341-347 (2021)
[20] Kim, H.-G., Kim, G.Y., Kim, J.Y.: Music Recommendation System Using Human Activity
     Recognition From Accelerometer Data. IEEE Trans. Consumer Electron. 65, pp. 349-358 (2019).
     10.1109/TCE.2019.2924177
[21] Woodward, J., McFadden, Z., Shiver, N., Ben-hayon, A., Yip, J.C., Anthony, L.: Using Co-Design
     to Examine How Children Conceptualize Intelligent Interfaces. pp. 575 (2018)
[22] Sboui, T., Ayed, M.B., Alimi, A.M.: A UI-DSPL Approach for the Development of Context-
     Adaptable User Interfaces. IEEE Access 6, pp. 7066-7081 (2018).
     10.1109/ACCESS.2017.2782880
[23] Yigitbas, E., Jovanovikj, I., Biermeier, K., Sauer, S., Engels, G.: Integrated model-driven
     development of self-adaptive user interfaces. Softw. Syst. Model. 19, pp. 1057-1081 (2020).
     10.1007/s10270-020-00777-7
[24] Tsolas, I.E., Charles, V., Gherman, T.: Supporting better practice benchmarking: A DEA-ANN
     approach to bank branch performance assessment. Expert Systems with Applications 160, pp.
     113599 (2020).
[25] Carstensen, A.-K., Bernhard, J.: Design science research–a powerful tool for improving methods
     in engineering education research. European Journal of Engineering Education 44, pp. 85-102
     (2019).
[26] Brooke, J.: SUS: A "quick and dirty" usability scale. In: Jordan, P.W., Thomas, B., Weerdmeester,
     B.A., McClelland, I.L. (eds.) Usability Evaluation in Industry. Taylor & Francis, London, UK
     (1996)



