=Paper= {{Paper |id=Vol-3124/paper21 |storemode=property |title=Towards Understanding the Transparency of Automations in Daily Environments |pdfUrl=https://ceur-ws.org/Vol-3124/paper21.pdf |volume=Vol-3124 |authors=Fabio Paternò,Simone Gallo,Marco Manca,Andrea Mattioli,Carmen Santoro |dblpUrl=https://dblp.org/rec/conf/iui/PaternoGMMS22 }} ==Towards Understanding the Transparency of Automations in Daily Environments== https://ceur-ws.org/Vol-3124/paper21.pdf
Towards Understanding the Transparency of Automations in
Daily Environments
Fabio Paternò, Simone Gallo, Marco Manca, Andrea Mattioli, Carmen Santoro
CNR-ISTI, HIIS Laboratory, Pisa, Italy


                  Abstract
                  This paper outlines a proposal for how to address transparency of automations in daily
                  environments, such as smart homes, based on experiences carried out in previous projects. The
                  trigger-action programming paradigm has been used to describe and implement such
                  automations in both commercial and research tools. Such automations can be generated through
                  machine learning techniques or directly by the end users or through an interaction between an
                  intelligent agent and the user. When they are executed, however, the resulting behaviour
                  does not always match the desired outcome, and users may have difficulties in understanding and controlling
                  them. Thus, there is a need for design criteria and associated tools that help people to understand
                  and control what happens with the automations active in the environments where they live, and
                  explain how they work and can be modified to better meet their needs.

                  Keywords
                  End-user development, Everyday automation, Internet of Things


IUI Workshops, March 2022, Helsinki, Finland
EMAIL: {fabio.paterno, simone.gallo, marco.manca, andrea.mattioli, carmen.santoro}@isti.cnr.it
© 2020 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).
CEUR Workshop Proceedings (CEUR-WS.org)

1. Introduction

How people interact with digital technologies is currently caught between the Internet of Things (IoT), where objects are continuously increasing their technological capabilities in terms of functionalities and connectivity, and Artificial Intelligence, which is penetrating many areas of daily life by supporting their increasing ability to autonomously activate functionalities based on collected data and statistically-based forecasts. In both trends human control over technology is jeopardized, and little is happening in terms of innovating how we think about and control automations.

We live more and more in environments with dynamic sets of objects, devices, services, people, and intelligent support. This opens up great opportunities and new possibilities, but there are also risks and new problems. The available automations can be created through machine learning techniques [18, 21] and activated or recommended [15, 18] to users, or can even be directly created by them. Trigger-action programming [8, 19] has often been used to describe and implement automations in environments rich in connected objects, devices, and services. It is based on sets of rules that connect dynamic events and/or conditions with the expected reactions, without requiring complex programming structures, and it has been used in several domains, such as home automation [1, 16, 19], ambient assisted living [14], robots [11], and finance [6]. However, when automations are automatically generated, some problems can occur if the end user's viewpoint is not sufficiently considered. For example, the study reported in [20] describes how a learning system can fail to adapt to recent user changes, and the difficulty users have in understanding what information the system requires in order to be trained to generate the desired behaviour. Likewise, a survey-based study with participants who have smart devices in their own home [9] reported difficulties in avoiding false alarms, communicating complex schedules, and resolving conflicting preferences. Such issues highlight the importance of providing conceptual and technological support for improving the transparency of such automations. Thus, there is a need for novel solutions able to support what we refer to as "humanations": automations that users can understand and modify.
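To make the trigger-action paradigm concrete, the following minimal Python sketch shows one possible representation of such rules; the class names, context keys, and action identifiers are hypothetical illustrations, not taken from any of the tools discussed in this paper.

```python
from dataclasses import dataclass
from typing import Any, Callable, Dict, List

Context = Dict[str, Any]

@dataclass
class Trigger:
    """One trigger element: an instantaneous "event" or a lasting "condition"."""
    kind: str  # "event" or "condition"
    check: Callable[[Context], bool]

@dataclass
class Rule:
    """A trigger-action rule: it fires when all trigger elements hold."""
    name: str
    triggers: List[Trigger]
    actions: List[str]  # abstract action identifiers, e.g. "heating.on"

    def fires(self, ctx: Context) -> bool:
        return all(t.check(ctx) for t in self.triggers)

# Hypothetical rule: activate the heating when the user arrives home,
# while the temperature is below 17 degrees and the time is after 5 pm.
heating = Rule(
    name="evening heating",
    triggers=[
        Trigger("event", lambda c: c.get("user_arrived_home", False)),
        Trigger("condition", lambda c: c.get("temperature", 99) < 17),
        Trigger("condition", lambda c: c.get("hour", 0) >= 17),
    ],
    actions=["heating.on"],
)

print(heating.fires({"user_arrived_home": True, "temperature": 15, "hour": 18}))  # True
```

The point of the sketch is that a rule remains a transparent object that a tool can inspect and explain, rather than opaque learned behaviour.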
2. Conceptual Dimensions

We can better address automation transparency if we identify the set of dimensions that characterise this concept. Design spaces for understanding automations have been proposed in previous work [3, 17], but we find that design criteria for their transparency have not been sufficiently addressed. For this purpose, the first important point to clarify concerns the possible desired levels of user control. We can identify at least four levels: perception (users are able to perceive that some automation is active and working), understanding (users are able to understand how the automation works, so some level of explainability is supported), predictability (users are able to foresee what will happen in the future with the currently active automations), and modification (users are enabled to change something in the automations when their results are not satisfying).

For example, we can consider a smart home where the heating system is automatically activated when the user is at home, the temperature is below 17 degrees Celsius, and the time is after 5 pm. The first level of control indicates that the user is able to detect that some evenings the heating system is activated automatically (automation perception). In order to ensure that users understand an automation, they must be able to know what elements are necessary to trigger it (in this example, user location, temperature, and time), when they actually trigger the rule, and what the corresponding action is. Predictability is achieved when the user is able to understand the future behaviour of the smart home [5]; for example, the user is able to indicate whether the heating system will be active or not at a given time (e.g. 4 pm). Lastly, if the user is able to modify the automation, for example to activate the heating system at a different time and with a different temperature, then the modification level is reached.

Another relevant dimension is the granularity of the set of objects involved in the considered automations. The focus can be a single object which for some reason is of interest to the user: for example, there is a lamp in front of the user, who may be interested in the automations that control it. The focus can also be a group of objects (e.g. the lamps that are nearby), or it can be more general and consider all the connected objects in a given space (e.g. a room or an entire flat).

One further dimension is represented by the temporal aspects of automations [4, 10], which can be composed of triggers and actions, both of which have different temporal aspects. Triggers can be composed of events and conditions: events are instantaneous changes in some contextual element, while conditions are associated with the state of some elements and can last for some time. Likewise, the effects of the actions can be instantaneous (e.g. sending a notification) or have a longer duration (e.g. turning a light on). Thus, the combination of triggers and actions can determine different types of situations depending on the temporal aspects of the constituent elements, which should be clearly expressed to allow users to fully understand and, if needed, modify the automations of interest.

One further aspect to consider for automation transparency is analytics, in other words support for analysing the data on how automations have been used. Automations go through three stages: creation, enabling, and execution. Regarding creation, it is interesting to know what agent created them and when. Then, it can be useful to know the periods of time during which they have been enabled, meaning executable. Another aspect of interest is when and how many times they have been executed. This information is also useful to understand whether an automation is working as expected, whether it is executed at the wrong times, or whether there are correlations between executions and specific contexts of use.
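The three stages just listed could be tracked by a small per-automation analytics record, as in the following sketch; the class and field names are hypothetical illustrations, not an actual implementation from our tools.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional, Tuple

@dataclass
class AutomationLog:
    """Records the three stages of an automation: creation, enabling, execution."""
    name: str
    created_by: str  # which agent created it (end user, recommender, learning system, ...)
    created_at: datetime
    # (start, end) intervals during which the automation was executable; end is None while still enabled
    enabled_periods: List[Tuple[datetime, Optional[datetime]]] = field(default_factory=list)
    executions: List[datetime] = field(default_factory=list)

    def enable(self, when: datetime) -> None:
        self.enabled_periods.append((when, None))

    def disable(self, when: datetime) -> None:
        start, _ = self.enabled_periods[-1]
        self.enabled_periods[-1] = (start, when)

    def record_execution(self, when: datetime) -> None:
        self.executions.append(when)

    def summary(self) -> str:
        return (f"{self.name}: created by {self.created_by} on {self.created_at:%Y-%m-%d}, "
                f"enabled {len(self.enabled_periods)} time(s), "
                f"executed {len(self.executions)} time(s)")

log = AutomationLog("evening heating", "end user", datetime(2022, 1, 10))
log.enable(datetime(2022, 1, 10, 18))
log.record_execution(datetime(2022, 1, 11, 17, 5))
log.record_execution(datetime(2022, 1, 12, 17, 2))
print(log.summary())
```

Comparing the execution timestamps with the contexts in which they occurred is what would allow a tool to surface executions at the wrong times or correlations with specific contexts of use.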
3. Tool Support

If we want to provide tool support for the transparency of daily automations, we need something that can be used frequently, in many locations and situations, with limited effort. In addition, it should allow us to immediately interact with the variety of connected objects and sensors that may be involved in the automations. For this purpose, we can consider two possible directions. One is the use of conversational agents, where users can ask in natural language what the current automations are and why they are active, and modify them if they are not completely satisfactory, by using devices such as Alexa or Google Home or their smartphone [7]. Another possibility is an augmented reality smartphone-based application, which seems a relevant direction to investigate since the smartphone is the device that people most often have with them, and it is immediate for them to frame the surrounding objects of interest to receive relevant information through its camera. Augmented Reality is a technology that has nowadays reached widespread application in many domains for its ability to connect virtual and physical elements. However, so far in IoT applications it has mainly been used to superimpose digital information about smart objects available in the current user context, primarily concerning their state and capabilities [1]. We need to better exploit this technology to support automation transparency, in order to make the intelligence at work in the surrounding environment perceivable, so that users can know what automations involving the nearby objects are active, and modify them if necessary.

Regarding the levels of user control, relevant solutions should be able to highlight whether the surrounding objects shown on the smartphone screen are involved in active automations. They should be able to explain on request what automations are active, and also allow users to modify them, even providing suggestions, if they do not meet their needs. In order to support the granularity dimension, the tool should be able to provide information not only on the automations involving a single framed object but also on those related to groups of objects, for example a group graphically selected in the smartphone camera view, or the entire current environment where the user is located (e.g. a kitchen). This implies that the solution includes a connection with some indoor localization technology.

To support the temporal dimension, one key aspect is to provide explicit indications of whether the elements composing the trigger side are events or conditions. For this purpose, it is possible to use different keywords (e.g. "when" for events, "if" or "while" for conditions). One further support is to avoid the creation of automations whose components contain erroneous temporal relations. For example, a trigger defined by the composition of two events with an AND logical operator is almost impossible to satisfy, since it is very unlikely that the two events occur at exactly the same time. Another example of a problematic situation is when the trigger is a condition and the action is instantaneous. Since the condition can last for some time, when should the action be performed? If we assume that instantaneous actions should be performed only once, then the trigger should instead indicate an event to identify when the action is to be performed.

One initial solution addressing such aspects has been proposed in [2] with the SAC app.

Figure 1: The SAC app (from [2])

Figure 1 shows the types of interactions and representations that it supports: (left) information on the current room (Living Room) and the framed sensor; (centre) the rules created for the current room; (right) the support for creating new rules. A first user study gathered positive feedback, but in order to fully support transparency, a richer set of information should be provided, and augmented reality can be better exploited. The Vuforia functionalities were used to support object recognition. They worked sufficiently well, but in some cases the sensors had to be manually marked to facilitate their recognition (see an example in Figure 1, left), and users had to be sufficiently close, with the camera focused on them for some seconds, in order to perform the recognition. Thus, a solution based on a computer vision technique exploiting Convolutional Neural Networks could be more efficient, if adequately trained.
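The two erroneous temporal compositions discussed above can be detected mechanically at rule-creation time. The sketch below assumes a hypothetical representation of an AND-composed trigger as a list of "event"/"condition" labels; it is an illustration of the check, not the SAC app's actual implementation.

```python
from typing import List

def check_temporal_issues(trigger_kinds: List[str], action_instantaneous: bool) -> List[str]:
    """Flag temporally problematic trigger-action compositions.

    trigger_kinds: the kind ("event" or "condition") of each trigger element
                   composed with a logical AND.
    action_instantaneous: whether the rule's action is instantaneous
                          (e.g. a notification) rather than sustained.
    """
    warnings = []
    # Two instantaneous events are very unlikely to occur at exactly the same time,
    # so their AND composition is almost impossible to satisfy.
    if trigger_kinds.count("event") >= 2:
        warnings.append("Two events composed with AND are unlikely to occur "
                        "at exactly the same time.")
    # A condition can last for some time, so a condition-only trigger leaves
    # the execution time of an instantaneous action undefined.
    if action_instantaneous and "event" not in trigger_kinds:
        warnings.append("A condition-only trigger with an instantaneous action "
                        "leaves the execution time unclear; use an event to "
                        "identify when the action should be performed.")
    return warnings

print(check_temporal_issues(["event", "event"], False))
print(check_temporal_issues(["condition"], True))
print(check_temporal_issues(["event", "condition"], True))  # no issues: []
```

A rule editor could run such checks while the user composes a trigger and show the warnings inline, preventing automations with erroneous temporal relations from being saved.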
Another relevant experience has been carried out in the AAL PETAL project, where a prototype platform (TAREME) has been designed and developed for supporting caregiver management of automations in the homes of older adults with mild cognitive impairments, in order to provide personalised support in their daily activities. In order to allow caregivers to better understand the automations, the tool was extended [13] to allow them to indicate a possible context of use and some automations; it then provided feedback on which automations would have been triggered in that context, with the possibility to receive an explanation in natural language of why or why not they would have been executed. The platform also includes functionalities for remote monitoring and analytics of the automations [14]. Figure 2 shows some of the information that it is able to display.

Figure 2: The TAREME display of some automation analytics (from [14])

The platform is able to monitor automations from multiple sites at the same time. In the example reported in the figure there are six active trial sites, and the display shows at the top the total number of rules created, how many times they have been triggered, and how many are currently active. The tool also supports filtering the displayed information for one specific site. In addition, as the figure shows, the tool categorizes the triggers depending on whether they are related to user behaviour, environmental aspects, or some device, and indicates how many triggers belong to each category. Likewise, the numbers of associated actions are displayed, classified depending on whether they are performed on some appliance or are reminders or alarms.

4. Conclusions

In this paper we introduce the concept of transparency of automations in daily environments, and some logical dimensions that characterise it. Such dimensions are provided at a conceptual design level, and we also report and discuss how we have addressed them with some tools in previous projects.

Future work will be dedicated to extending and validating the identified design aspects, and to providing improved associated tool support, for example with a more thorough treatment of explainability aspects [9].

5. Acknowledgments

Support from the PRIN EMPATHY project (http://www.empathy-project.eu/) is gratefully acknowledged.

6. References

[1] S. Aghaee and A. F. Blackwell. 2015. IoT programming needs deixis. In Proceedings of the CHI 2015 Workshop on End User Development in the Internet of Things Era (EUDITE).
[2] R. Ariano, M. Manca, F. Paternò, C. Santoro. 2022. Smartphone-based Augmented Reality for End-User Creation of Home Automations. Behaviour & Information Technology. https://www.tandfonline.com/doi/full/10.1080/0144929X.2021.2017482
[3] J. Bongard, M. Baldauf, and P. Fröhlich. 2020. Grasping Everyday Automation – A Design Space for Ubiquitous Automated Systems. In 19th International Conference on Mobile and Ubiquitous Multimedia (MUM 2020), November 22–25, 2020, Essen, Germany. ACM.
[4] W. Brackenbury, A. Deora, J. Ritchey, J. Vallee, W. He, G. Wang, M. L. Littman, B. Ur. 2019. How Users Interpret Bugs in Trigger-Action Programming. CHI 2019: 552.
[5] S. Coppers, D. Vanacken, K. Luyten. 2020. FORTNIoT: Intelligible Predictions to Improve User Understanding of Smart Home Behavior. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 4(4), Article 124, 1–24.
[6] C. Elsden, T. Feltwell, S. W. Lawson, J. Vines. 2019. Recipes for Programmable Money. CHI 2019: 251.
[7] S. Gallo, M. Manca, A. Mattioli, F. Paternò, C. Santoro. 2021. Comparative Analysis of Composition Paradigms for Personalization Rules in IoT Settings. End-User Development, IS-EUD 2021, Lecture Notes in Computer Science, vol 12724. Springer.
[8] G. Ghiani, M. Manca, F. Paternò, and C. Santoro. 2017. Personalization of Context-dependent Applications through Trigger-Action Rules. ACM Transactions on Computer-Human Interaction, 24(2), Article 14.
[9] W. He, J. Martinez, R. Padhi, L. Zhang, B. Ur. 2019. When smart devices are stupid: negative experiences using home smart devices. 2019 IEEE Security and Privacy Workshops (SPW), 150–155.
[10] J. Huang and M. Cakmak. 2015. Supporting mental model accuracy in trigger-action programming. In Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp '15). ACM, New York, NY, USA, 215–225. https://doi.org/10.1145/2750858.2805830
[11] N. Leonardi, M. Manca, F. Paternò, C. Santoro. 2019. Trigger-Action Programming for Personalising Humanoid Robot Behaviour. ACM Conference on Human Factors in Computing Systems (CHI '19), Glasgow, Paper 445.
[12] Q. V. Liao, D. M. Gruen, S. Miller. 2020. Questioning the AI: Informing Design Practices for Explainable AI User Experiences. CHI 2020.
[13] M. Manca, F. Paternò, C. Santoro, L. Corcella. Supporting end-user debugging of trigger-action rules for IoT applications. International Journal of Human-Computer Studies, 123, 56–69.
[14] M. Manca, F. Paternò, C. Santoro. 2021. Remote Monitoring of End-User Created Automations in Field Trials. Journal of Ambient Intelligence and Humanized Computing.
[15] A. Mattioli and F. Paternò. 2021. Recommendations for creating trigger-action rules in a block-based environment. Behaviour & Information Technology, 40(10), 1024–1034. https://doi.org/10.1080/0144929X.2021.1900396
[16] A. Salovaara, A. Bellucci, A. Vianello, and G. Jacucci. 2021. Programmable Smart Home Toolkits Should Better Address Households' Social Needs. In CHI '21, May 8–13, 2021, Yokohama, Japan. ACM, 14 pages.
[17] B. Shneiderman. 2020. Human-Centered Artificial Intelligence: Reliable, Safe & Trustworthy. International Journal of Human–Computer Interaction, 36(6), 495–504.
[18] V. Srinivasan, C. Koehler, and H. Jin. 2018. RuleSelector: Selecting Conditional Action Rules from User Behavior Patterns. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol., 2(1), Article 35 (March 2018), 34 pages.
[19] B. Ur, E. A. McManus, M. Pak Yong, M. L. Littman. 2014. Practical trigger-action programming in the smart home. Proceedings of CHI '14. ACM, 803–812. https://doi.org/10.1145/2556288.2557420
[20] R. Yang and M. W. Newman. 2013. Learning from a learning thermostat: lessons for intelligent systems for the home. ACM International Joint Conference on Pervasive and Ubiquitous Computing, 93–102.
[21] L. Zhang, W. He, O. Morkved, V. Zhao, M. L. Littman, S. Lu, and B. Ur. 2020. Trace2TAP: Synthesizing Trigger-Action Programs from Traces of Behavior. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol., 4(3), Article 104 (September 2020), 26 pages. https://doi.org/10.1145/3411838