From Human-Human Computer Mediated Communication to Human-Automation Collaboration in the light
of Large Civil Aircraft Workplace


Elodie Bouzekri, Célia Martinie and Philippe Palanque
ICS-IRIT, Université Paul Sabatier – Toulouse III, Toulouse, France, palanque@irit.fr

This position paper proposes the use of human-human collaboration models to study human-automation collaboration. We first
present some of the most prominent models from psychology and HCI and project their content to identify design rules that could
be used to design and evaluate human-automation collaboration. We apply these principles to the cockpit workplace of large
civil aircraft.

• Human-centered computing → Human computer interaction (HCI); Interactive systems and tools;

Additional Keywords and Phrases: Automation, collaboration, task modelling, aircraft cockpits

ACM Reference Format:
Palanque, P., Bouzekri, E., Martinie, C. 2021. From Human-Human Computer Mediated Communication to Human-
Automation Collaboration in the light of Large Civil Aircraft Workplace. In Proceedings of AutomationXP'21: Workshop on
Automation Experience at the Workplace. In conjunction with CHI'21, May 07, 2021.


1 INTRODUCTION
The evolution of computer use from one computer for several persons to many computers for one person could
have been the end of multi-user computing. However, the widespread adoption of the internet and the rise of social
computing have demonstrated that dealing only with single-user applications is nowadays a thing of the past. Designing
interactive systems thus requires, most of the time, addressing the needs of groups of users involved in common tasks for
which communication, coordination and production are mediated by computers. Despite this undeniable
situation, most research contributions in the area of interactive systems engineering still focus on single-user
applications. This is easily understandable, as multi-user applications are far more difficult to build than
single-user ones. This difficulty comes from several sources:

            •     The difficulty of gathering and understanding the requirements as well as the needs of the users;
            •     The difficulty of providing the required communication infrastructures to allow both
                  synchronous and asynchronous communication between the users;
            •     The difficulty of ensuring the usability of these applications, which are used jointly by different
                  users (with different characteristics and needs) and under different environmental conditions
                  (time zones, seasons, light, sound, …);
            •     The difficulty of ensuring the reliability of these computing systems, which involve underlying
                  communication mechanisms and networks, and whose testing and validation is even more
                  complex as they involve multiple, diverse software and hardware entities.
These problems are even more prominent when it comes to largely distributed worldwide teamwork such as
international collision avoidance systems for satellites [7].
Workshop proceedings Automation Experience at the Workplace
In conjunction with CHI'21, May 7th, 2021, Yokohama, Japan
Copyright © 2021 for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).
Website: http://everyday-automation.tech-experience.at
When it comes to automation, there is a very important difference to highlight: automation can be
considered as co-located with the user, thus removing the issues of latency, time zones and reliability of
communication means. However, if the human-automation team is controlling a remote entity (e.g. the Automated
Transfer Vehicle for refueling the International Space Station, or satellite ground segments), some of these problems
become valid again [5].
This position paper aims at positioning human-human computer-mediated communication knowledge and
principles in the perspective of human-automation collaboration. Conversely, we also revisit work on
human-automation teaming such as [1]. Beyond that, as the workshop focuses on automation, experience and the workplace,
we concentrate on the specific work context of large civil aircraft.

2 HUMAN-HUMAN COLLABORATION
   Aircraft cockpits are workplaces where several persons, named crewmembers, collaborate to achieve a common
goal: to bring the passengers of the aircraft from one location to another.

2.1 Collaboration in large civil aircraft cockpit workplace
   To pilot a large civil aircraft, the crewmembers operate the avionics systems through human-machine
interfaces, called command and control systems, located in the cockpit. The avionics systems include all the physical
systems (engines, sensors, flaps, …) and the electrical, electronic and computer systems (autopilot, digital flight
controls, …) embedded in the aircraft. The cockpit embeds screens that support the display of
information coming from several avionics systems on the same screen. To control the avionics systems, the
crewmembers use physical buttons placed beside the displays.
The main goals of the crewmembers, in order of importance, are to fly, navigate, communicate and manage
systems. Nowadays, flying crews are composed of two crewmembers, each having one of the two following
roles: Pilot Flying (PF) and Pilot Monitoring (PM). The PF is in charge of flying the aircraft and of stabilizing its pitch,
altitude, bank angle, and vertical and horizontal flight plans. The PM backs up the PF by monitoring and
making call-outs. In the early days of commercial aviation, a third crewmember (see the example in Figure 1) was in
charge of managing the avionics systems and of recalculating the flight plan in case of adverse events or of a new
clearance from the air traffic controller [11].




                Figure 1. Cockpit of an Airbus A300 with Pilot Flying, Pilot Monitoring and Flight Engineer




2.2 Main principles of description of human-human collaboration
The description of human-human collaboration requires the explicit identification of the tasks that each human
has to perform, in order to understand and allocate the work between them. It also requires the identification of
specific aspects of the collaboration. Cooperative work may be dedicated to one or more of the following types of
collaborative activities: production, coordination and communication. It is then possible to associate one or more
properties amongst this set with each task. For example, Figure 2 a) shows a task dedicated to coordination, whereas
Figure 2 b) shows a task dedicated to both coordination and communication.




                            a)                               b)
                     Figure 2. Example of cooperative task properties from a “functional clover” [3]

Cooperative tasks may be performed within various space-time constraints (local/distant,
synchronous/asynchronous) [4]. Table 1 presents these different space-time constraints and illustrates the
different possibilities with an example.

Table 1. Time Space Matrix from [4]

                             Same Time                                             Different Times
 Same Place                  Face to face interaction                              Asynchronous interaction
                             Ex: Collaboration between pilots in the cockpit       Ex: Post-it
 Different Places            Synchronous distributed interaction                   Asynchronous distributed
                             Ex: Collaboration between pilot and air traffic       interaction
                             controller                                            Ex: Mail
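As an illustration, the four quadrants of this time-space matrix can be encoded as a small lookup table. The following Python sketch is our own illustration (the enum and function names are not part of any notation discussed in this paper); each quadrant carries the interaction style and the paper's example.

```python
from enum import Enum

class Place(Enum):
    SAME = "local"
    DIFFERENT = "distant"

class Time(Enum):
    SAME = "synchronous"
    DIFFERENT = "asynchronous"

# The four quadrants of the time-space matrix of Ellis et al. [4],
# each illustrated with the example used in Table 1.
TIME_SPACE_MATRIX = {
    (Place.SAME, Time.SAME): ("Face to face interaction", "pilots in the cockpit"),
    (Place.SAME, Time.DIFFERENT): ("Asynchronous interaction", "post-it"),
    (Place.DIFFERENT, Time.SAME): ("Synchronous distributed interaction",
                                   "pilot and air traffic controller"),
    (Place.DIFFERENT, Time.DIFFERENT): ("Asynchronous distributed interaction", "mail"),
}

def classify(place: Place, time: Time) -> str:
    """Return the interaction style for a given time-space quadrant."""
    return TIME_SPACE_MATRIX[(place, time)][0]
```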

2.3 Notation for the description of human-human collaboration
To apply these main principles of description, we use the HAMSTERS-XL notation [7] for describing collaborative
tasks, i.e. tasks in which a group of users tries to achieve common goals. HAMSTERS-XL provides elements to describe
abstract group tasks as well as individual cooperative tasks. Several persons are involved in collaborative
work, each one having a role in the achievement of the common goals. Collaborative work can be described at different
abstraction levels: at the group level and at the individual level. A group task is a set of tasks that a group has to carry
out in order to achieve a common goal [5], whereas a cooperative task is an individual task performed by a person
in order to contribute to the achievement of the common goal [9]. Table 2 presents the main types of tasks for
describing individual and collaborative tasks in task models.




        Table 2. Individual and human-human computer mediated collaboration task types in HAMSTERS-XL




To identify time-space constraints, dedicated notation elements are available for cooperative tasks, as
illustrated in Table 3.
                              Table 3. Time and space properties for cooperative tasks

                                                       Local                        Distant
                     Synchronous



                     Asynchronous




2.4 Example: ‘Modify flight plan’ group task with three crewmembers
Figure 3 presents an example of the usage of this notation through task models. These task models describe
a group task: “Modify flight plan”. This group task describes the recalculation of the flight plan by the flight
engineer on the pilot’s request, due to adverse weather events. In this section, we focus on two roles of the crew:
the pilot flying and the flight engineer.

After the detection of adverse weather, the pilot flying asks the flight engineer for a new flight plan. “Ask
for a new flight plan” (P1) is a cooperative motor task linked with “Listen to the request for a new flight plan” (FE1),
a cooperative perception task of the flight engineer. This cooperation is local (in the cockpit) and synchronous
(oral communication). These tasks describe a coordination between the pilot flying and the flight engineer:
the pilot allocates the next task (calculate a new flight plan) to the flight engineer. Then, the pilot flying
and the flight engineer discuss and choose a new flight plan until one of the options is confirmed. The flight
engineer communicates his progression (P2 and FE2) while calculating a new flight plan. Afterwards, the flight
engineer proposes a new flight plan (P3 and FE3). After a validation (P5 and FE5 user cooperative tasks) or
a rejection (P4 and FE4 user cooperative tasks) of this new flight plan, the flight engineer either calculates another
flight plan or applies the new one. All these cooperative tasks are local to the cockpit and synchronous.




These tasks describe communication between the pilot flying and the flight engineer in order to produce a new
flight plan.
Table 4 presents the associations between the cooperative tasks of the pilot and flight engineer roles.




          Figure 3. Recalculation of the flight plan by the flight engineer on the pilot flying’s request due to adverse weather events

Table 4. Cooperative tasks table.

    Cooperative task                                    Linked to
    “Ask for a new flight plan” (P1)                    “Listen to the request for a new flight plan” (FE1)
    “Communicate progression” (FE2)                     “Listen to flight engineer’s report on his progression” (P2)
    “Communicate a new flight plan” (FE3)               “Listen to flight engineer’s recommendation” (P3)
    “Request for another route” (P4)                    “Hear and acknowledge other route request” (FE4)
    “Confirm the route” (P5)                            “Hear and acknowledge confirmation of the route” (FE5)
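The pairings in Table 4 can be captured as a small lookup structure. The following Python sketch is our own illustration: the task identifiers follow Table 4, while the dictionary and the helper function are hypothetical and not part of the HAMSTERS-XL notation.

```python
# Pairing of cooperative tasks between the pilot flying (P*) and
# the flight engineer (FE*) roles, as listed in Table 4.
COOPERATIVE_LINKS = {
    "P1": "FE1",   # ask for a new flight plan -> listen to the request
    "FE2": "P2",   # communicate progression -> listen to the report
    "FE3": "P3",   # communicate a new flight plan -> listen to the recommendation
    "P4": "FE4",   # request another route -> hear and acknowledge the request
    "P5": "FE5",   # confirm the route -> hear and acknowledge the confirmation
}

def counterpart(task_id: str) -> str:
    """Return the cooperative task linked to a given task, in either direction."""
    inverse = {v: k for k, v in COOPERATIVE_LINKS.items()}
    return COOPERATIVE_LINKS.get(task_id) or inverse[task_id]
```

Such a mapping makes explicit that every cooperative task of one role must have a counterpart in the other role, which is exactly what the table conveys.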


3 HUMAN AUTOMATION TEAMING
Aircraft cockpits are nowadays workplaces where crewmembers interact with each other and with automation.

3.1 Automation of the flight management
Several decades ago, the two-pilot forward-facing cockpit was introduced, for instance with the A300¹, and reduced the crew to
two members: Pilot Flying and Pilot Monitoring. The “Manage systems” task was partly automated with an alerting
system, and the Pilot Monitoring is now in charge of monitoring this alerting system to manage the systems. The flight
path calculation has also been automated with the Flight Management System (FMS), and an auto-pilot system
has been introduced to reduce workload during critical phases of flight. Figure 4 depicts the three loops of
automation: the direct interaction between the Pilot Flying and the flight commands (no automation), the
interaction between the Pilot Flying and the auto-pilot (automation of the task “Fly”), and the interaction with the FMS
(automation of the task “Navigate”).


1
    https://www.airbus.com/company/history/aircraft-history/1970-1972/a300.html




                                Figure 4. Three levels of automation of the flight management task

3.2 Main principles of description of human-automation collaboration
   When dealing with human-automation collaboration, it becomes insufficient to describe only the program of the
system. Indeed, human-automation teaming requires transparency (mutual understanding), bi-directional
communication and operator-directed execution (the responsibility of the final decision lies with the human) [1]. The description
of human-automation teaming requires describing [10]: the actors (both human and automation), the role allocations of
the actors and their relationships. In addition, it is required to identify the cooperative tasks of the ‘automation’ actor
that are actually implemented by one or several functions, as well as the types of collaborative tasks between the
‘automation’ actor and the human actor. Cummings and Bruni [2] defined a taxonomy of human-automation
collaboration based on the tasks that the human and the automation perform: data acquisition tasks, analysis tasks,
decision tasks and motor (action) tasks (see Figure 5). For example, the Decider is the actor (or actors) that makes the final
decision.




   Figure 5. High-level view of the three collaborative decision-making process roles: moderator, generator, and decider from [2]

3.3 Notation for the description of human-automation collaboration
To apply these main principles of description, we use the HAMSTERS-XL notation [7], which provides elements for
describing the cooperative tasks of both human actors and ‘automation’ actors. The cooperative human
tasks are the same as the ones presented in Table 2. Figure 6 presents the system task types. The input and
output tasks are cooperative tasks that receive or send information.




                                                   Figure 6. System task types

A separate model describes the tasks performed by a role of an ‘automation’ actor. In order to describe the
associations between automation cooperative tasks and human cooperative tasks, and the flow of information
between them, we define cooperation protocols. A cooperation protocol describes information sharing between
several cooperative tasks, which contributes to transparency. Five attributes compose a cooperation protocol: the
(main) type of collaborative activity (see Figure 2), the localization and the time (see Table 3), the cardinality (broadcast,
unicast, groupcast or anycast) and the information shared.
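A cooperation protocol can thus be thought of as a record with these five attributes. The following Python sketch is our own illustration (the type names and the example values are our reading of the flight-plan confirmation protocol discussed in Section 3.4, not part of the notation):

```python
from dataclasses import dataclass
from enum import Enum

class Activity(Enum):
    PRODUCTION = "production"
    COORDINATION = "coordination"
    COMMUNICATION = "communication"

class Cardinality(Enum):
    BROADCAST = "broadcast"
    UNICAST = "unicast"
    GROUPCAST = "groupcast"
    ANYCAST = "anycast"

@dataclass(frozen=True)
class CooperationProtocol:
    """One cooperation protocol: the five attributes listed in the text."""
    activity: Activity      # (main) type of collaborative activity
    local: bool             # localization: local (True) or distant (False)
    synchronous: bool       # time: synchronous (True) or asynchronous (False)
    cardinality: Cardinality
    information: str        # the information shared

# Hypothetical instance, modelled on the pilots confirming the
# flight plan proposed by the FMS (CP5 in Section 3.4).
cp5 = CooperationProtocol(
    activity=Activity.COMMUNICATION,
    local=True,
    synchronous=True,
    cardinality=Cardinality.UNICAST,
    information="confirmation of the proposed flight plan",
)
```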

3.4 Example: ‘Modify flight plan’ group task with a two-pilot crew and automation of flight management
   Figure 7 presents an example of the usage of this notation. These task models describe the same group task as
before. However, the flight engineer role is automated. We consider an envisioned FMS that can
recommend flight plans to the pilots. This FMS performs tasks similar to those of the flight engineer, but automated: it receives
the request (CP1), calculates a possible new flight plan and displays its progression (CP2), then decides on a flight plan to
propose to the pilots (CP3) and iterates (if the pilots reject it (CP4)) until the proposed flight plan is confirmed
(CP5). Finally, the FMS displays the new flight plan to the pilots (CP6).
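The iterative behaviour of this envisioned FMS can be sketched as a simple loop. The following Python code is our own illustration, not an actual FMS implementation; all callables are hypothetical placeholders standing for the CP1–CP6 tasks.

```python
def fms_modify_flight_plan(request, calculate, propose, pilots_confirm, display):
    """Sketch of the envisioned FMS behaviour for the 'Modify flight plan'
    group task (CP1..CP6). The five arguments are placeholder callables."""
    rejected = set()
    req = request()                       # CP1: receive the pilots' request
    while True:
        plan = calculate(req, rejected)   # CP2: calculate a candidate flight plan
        propose(plan)                     # CP3: propose it to the pilots
        if pilots_confirm(plan):          # CP5: pilots confirm the proposal
            break
        rejected.add(plan)                # CP4: pilots reject it; iterate
    display(plan)                         # CP6: display the new flight plan
    return plan
```

Driving this sketch with stub callables (e.g. a fixed sequence of candidate routes and a predicate accepting the second one) exercises the reject-then-confirm iteration described above.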




                 Figure 7. Recalculation of the flight plan due to adverse weather events with the envisioned FMS

   Figure 8 presents the cooperation protocols between the pilot and FMS roles. For example, CP4 and CP5 are
cooperation protocols describing the communication of the decision made by the pilots on the proposed flight plan
(reject or confirm). The pilots are the deciders of the flight plan selection. These cooperative tasks are synchronous,
local to the cockpit and unicast (from the pilot to the FMS).




                                        Figure 8. Cooperation protocols properties

Despite the fact that the envisioned FMS aims to implement some of the tasks that the flight engineer used to
perform, the collaboration will be of a different nature. For instance, the flight engineer might take time between
receiving a request and processing it, and might even forget it. The automation, in contrast, will process
information immediately (at least, this will be perceived as such by the pilot). Indeed, the interaction with
automation might look like a standard interaction with a button: the user presses the mouse button while the cursor
is on it (this corresponds to sending a request), and the immediate graphical feedback showing the button pressed
corresponds to the acknowledgement. This fine-grained interaction will have to be designed, standardized and
implemented for any kind of cooperation, or the interaction with automation will not be transparent [12]. This might
be even more complex due to the multiple objectives of automation [14]. Even though some design rules have been
proposed, we are far from having standards defining such interactions [13].

4 CONCLUSION AND TAKE AWAY MESSAGE
Integration of automation in the workplace started several decades ago. This paper highlights how automation has
been integrated in commercial aircraft cockpits to automate part of the flight management tasks. Nowadays, there
is a rising interest in AI technologies to automate additional human tasks. This raises the question of how humans will
interact with this new kind of automation. Should this interaction be minimal, as suggested by the interior of a
Google car (see Figure 9 a))? Or should this interaction be very complete, providing humans with information
about what the system is collecting, doing and proposing, and with ways to control this intelligence, as in
command and control rooms (see Figure 9 b))? The design and development of workplaces that exploit automation
and AI will still require notations, techniques and tools to describe precisely the human-automation collaboration.
Furthermore, in workplaces relying on critical systems, such descriptions will be mandatory to support the certification
process of these systems.




                                       a)                                                          b)

                        Figure 9. Cockpit of the a) Google car b) control room of the Large Hadron Collider at CERN

ACKNOWLEDGMENTS
This work is partly funded by IKKY research project grant from Airbus.

REFERENCES
[1]   Battiste V., Lachter J., Brandt S., Alvarez A., Strybel T.Z., Vu KP.L. (2018) Human-Automation Teaming: Lessons Learned and Future Directions.
      In: Yamamoto S., Mori H. (eds) Human Interface and the Management of Information. Information in Applications and Services. HIMI 2018.
      Lecture Notes in Computer Science, vol 10905. Springer, Cham. https://doi.org/10.1007/978-3-319-92046-7_40
[2]   Mary Cummings and Sylvain Bruni. (2009). Collaborative Human–Automation Decision Making. In Springer Handbook of Automation, edited
      by Shimon Y. Nof, 437‑47. Springer Handbooks. Berlin, Heidelberg: Springer. https://doi.org/10.1007/978-3-540-78831-7_26.
[3]   Calvary, G., Coutaz, J., Nigay, L. From single-user architectural design to PAC*: a generic software architecture model for CSCW. In Proc. of CHI
      '97. ACM, NY, USA, 242-249.
[4]   Ellis, C. A., Gibbs, S. J., & Rein, G. (1991). Groupware: Some issues and experiences. Communications of the ACM, 34(1), 39–58.
      https://doi.org/10.1145/99977.99987
[5]   Michael Feary, Célia Martinie, Philippe A. Palanque, Manfred Tscheligi. Multiple Views on Safety-Critical Automation: Aircrafts, Autonomous
      Vehicles, Air Traffic Management and Satellite Ground Segments Perspectives. CHI Extended Abstracts 2016: 1069-1072
[6]   McGrath J. E. Groups: Interaction and Performance. Prentice Hall, Inc., Englewood Cliffs, 1984.
[7]   Célia Martinie, Eric Barboni, David Navarre, Philippe A. Palanque, Racim Fahssi, Erwann Poupart, Eliane Cubero-Castan. Multi-models-based
      engineering of collaborative systems: application to collision avoidance operations for spacecraft. EICS 2014: 85-94
[8]   Célia Martinie, Philippe Palanque, Elodie Bouzekri, Andy Cockburn, Alexandre Canny, and Eric Barboni. 2019. Analysing and Demonstrating
      Tool-Supported Customizable Task Notations. Proc. ACM Hum.-Comput. Interact. 3, EICS, Article 12 (June 2019), 26 pages. DOI:
      https://doi.org/10.1145/3331154
[9]   Roschelle, J., and Teasley, S. D. (1995). The construction of shared knowledge in collaborative problem solving. In C. E. O'Malley (Ed.),
      Computer-supported collaborative learning (pp. 69-197).
[10] Axel Schulte and Diana Donath. (2018). A Design and Description Method for Human-Autonomy Teaming Systems. In Intelligent Human
     Systems Integration, edited by Waldemar Karwowski and Tareq Ahram, 3-9. Advances in Intelligent Systems and Computing. Cham: Springer
     International Publishing. https://doi.org/10.1007/978-3-319-73888-8_1.
[11] Wiener, Earl L. 1989. “Human factors of advanced technology (glass cockpit) transport aircraft”. NASA.
[12] R. Bernhaupt, M. Cronel, F. Manciet, C. Martinie, and P. Palanque. 2015. Transparent Automation for Assessing and Designing better
     Interactions between Operators and Partly-Autonomous Interactive Systems. In Proceedings of the 5th International Conference on
     Application and Theory of Automation in Command and Control Systems (ATACCS '15). Association for Computing Machinery, New York, NY,
     USA, 129–139. DOI:https://doi.org/10.1145/2899361.2899375
[13] Philippe Palanque. 2020. Ten Objectives and Ten Rules for Designing Automations in Interaction Techniques, User Interfaces and Interactive
     Systems. In Proceedings of the International Conference on Advanced Visual Interfaces (AVI '20). Association for Computing Machinery, New
     York, NY, USA, Article 2, 1–10. DOI:https://doi.org/10.1145/3399715.3400872
[14] Philippe Palanque. 2018. Engineering Automations: From a Human Factor Perspective to Design, Implementation and Validation Challenges.
     In Proceedings of the ACM SIGCHI Symposium on Engineering Interactive Computing Systems (EICS '18). Association for Computing
     Machinery, New York, NY, USA, Article 2, 1–2. DOI:https://doi.org/10.1145/3220134.3223044



