=Paper= {{Paper |id=Vol-2009/fmt-proceedings-2017-paper13 |storemode=property |title=Interaction Concepts for Collaborative Visual Analysis of Scatterplots on Large Vertically-Mounted High-Resolution Multi-Touch Displays |pdfUrl=https://ceur-ws.org/Vol-2009/fmt-proceedings-2017-paper13.pdf |volume=Vol-2009 |authors=Mohammad Chegini,Shao Lin,Dirk Joachim Lehmann,Keith Andrews,Tobias Schreck |dblpUrl=https://dblp.org/rec/conf/fmt/CheginiLLAS17 }} ==Interaction Concepts for Collaborative Visual Analysis of Scatterplots on Large Vertically-Mounted High-Resolution Multi-Touch Displays== https://ceur-ws.org/Vol-2009/fmt-proceedings-2017-paper13.pdf
Interaction Concepts for Collaborative Visual Analysis of Scatterplots on Large Vertically-Mounted High-Resolution Multi-Touch Displays
              Mohammad Chegini∗ , Lin Shao∗ , Dirk J. Lehmann† , Keith Andrews‡ and Tobias Schreck∗
            ∗ Institute of Computer Graphics and Knowledge Visualisation, Graz University of Technology, Austria

                                         Email: {m.chegini, l.shao, t.schreck}@cgv.tugraz.at
                                                 † University of Magdeburg, Germany

                                                Email: dirk@isg.cs.uni-magdeburg.de
                   ‡ Institute of Interactive Systems and Data Science, Graz University of Technology, Austria

                                                      Email: kandrews@tugraz.at


   Abstract—Large vertically-mounted high-resolution multi-touch displays are becoming increasingly available for interactive data visualisation. Such devices are well-suited to small-team collaborative visual analysis. In particular, the visual analysis of large high-dimensional datasets can benefit from high-resolution displays capable of showing multiple coordinated views.
   This paper identifies some of the advantages of using large, high-resolution displays for visual analytics in general, and introduces a set of interactions to explore high-dimensional datasets on large vertically-mounted high-resolution multi-touch displays using scatterplots. A set of touch interactions for collaborative visual analysis of scatterplots has been implemented and is presented. Finally, three perception-based level-of-detail techniques are introduced for such displays as a concept for further implementation.

                       I. INTRODUCTION

   Large high-resolution displays are becoming an affordable option for the visualisation of data [1]. Large displays have proved to be effective for tasks such as comparative genomics analysis [2], graph topology exploration [3], and sensemaking [4]. Large vertically-mounted (landscape-orientation) high-resolution multi-touch displays are particularly effective for collaborative analysis by small teams. However, previous research has often focused on horizontally-mounted tabletop surfaces or on vertically-mounted displays with more distant interaction [5]. In this paper, a set of user interactions to support scatterplot matrix analysis on vertically-mounted displays is introduced. These techniques help analysts to efficiently select a scatterplot from a scatterplot matrix and explore it collaboratively.
   Some physical and virtual interactions with large displays have been described in the previous literature. Modalities range from natural interactions like speech, body tracking, gaze, and gestures to the use of secondary control devices like mobile phones, tablets, or Wii controllers [6]. Of these, multi-touch interactions provide a fluid and intuitive interface suitable for up-close interaction in front of the display by small groups. Although there are studies about collaborative interaction with large displays (e.g. [7], [8]), they usually focus on single-user interaction [9]. Since typical multi-touch interactions do not support collaboration, more research needs to be done on cooperative gestures, modalities, and the dynamics of group work around these devices. Cooperative gestures are known to enhance the sense of teamwork and to increase the participation of team members [10].
   Screen size and resolution are particularly important for information visualisation of multivariate datasets. Having a large display allows multiple linked views, such as scatterplot matrices and parallel coordinates [11], to be provided simultaneously. If the screen is not high-resolution, the user experience of near-distance interaction decreases significantly. For instance, on screens with fewer than sixty pixels per inch, the user is not able to read from the screen up close [12]. Furthermore, users can make more observations with less effort using physical navigation (e.g., walking) rather than virtual navigation [1]. More screen space can be used either to provide a better overview of a dataset or to provide more details of a portion of it. For example, users can see an entire scatterplot matrix, specific scatterplots, and parallel coordinates plots at the same time. As a result, users may have the opportunity to gain more insight into large datasets.
   Previous studies [5] suggest that vertically-mounted displays are more suited to parallel tasks within a group, due to reduced visual distraction and the possibility of sharing information through physical navigation like turning the head or walking. On tabletop displays, if users are not on the same side of the table, the shared view often needs to be reoriented.
   This paper addresses the design gap between standard interaction techniques for large multi-touch displays and advanced interaction techniques and visual feedback for collaborative scatterplot and scatterplot matrix analysis. Design concepts for such interaction techniques have been implemented as a proof of concept and are presented. The techniques include




scatterplot selection from scatterplot matrices, collaborative regression model analysis, and an extension of the Regression Lens [13] to include a floating toolbox. As a proof of concept, the techniques have been developed on a large display.

Fig. 1. Two users collaboratively analyse a dataset on a large vertically-mounted multi-touch screen. User A on the left drags a Regression Lens, while user B on the right adapts the degree of the regression model using the floating toolbox. The display is an Eyevis 84-inch 4K/Ultra-HD 60 Hz multi-touch LCD monitor with a resolution of 3840 × 2160.

   The paper is structured as follows: Section II discusses related work. Several novel interaction designs for collaborative visual analysis of scatterplots on large displays are introduced in Section III. The current implementation of the proposed interaction techniques is described in Section IV, and a use case is presented in Section V. Section VI introduces the concept of perception-based level of visual detail. The paper concludes with a discussion of open problems and future work in Section VII.

                       II. RELATED WORK

   At a high level, information visualisation systems consist of two components: visual representation and interaction. Visual representation concerns the mapping from data to display [14]. Interaction starts with a user's intent to perform a task, followed by a user action. The system then reacts, and feedback is given to the user [15]. It is essential to consider both visual representation and interaction when designing an application for information visualisation.

A. Visualisation on Large Displays

   Researchers in various fields are increasingly confronted with the challenge of visualising and exploring high-dimensional datasets [13], [16]. Keim argues that although many traditional techniques exist to represent data, they are often not scalable to high-dimensional datasets without suitable analytical or interaction design [16].
   With the current size and resolution of typical computer displays, it is challenging to represent entire datasets on one screen using techniques like scatterplot matrices or parallel coordinates. The user is often forced to resort to panning and zooming, leading to frustration and longer task completion times. Ruddle et al. [17] conducted an experiment in which participants searched maps on three different displays for densely or sparsely distributed targets. They concluded that, since the whole dataset fits on a larger display, sparse targets can be found faster.
   Multiple linked views are often used to gain a better understanding of a high-dimensional dataset. Such views are usually connected by techniques such as brushing or combined navigation [18]. Every view occupies space on the display. If more space is available, additional views can be shown simultaneously. Allowing the user to access multiple windows increases performance and satisfaction [19]. Isenberg et al. [20] present hybrid-image visualisation for data analysis, where two images are blended to achieve distance-dependent perception. This concept might be especially helpful for collaborative visual analysis tasks on vertically-mounted displays, where users observe data from various distances.

B. Visual Data Analysis and Multi-Touch Interaction

   Previous researchers have proposed various interaction techniques for large displays and for multi-dimensional datasets on multi-touch displays. Ardito et al. [18] proposed a classification of large display interaction along five dimensions: visualisation technology, display setup, interaction modality, application purpose, and location. Khan presented a survey of interaction techniques and devices for large, high-resolution displays [6]. The survey categorises modalities of interaction into speech, tracking, gestures, mobile phones, haptics, and other technologies such as gaze and facial expression.
   Tsandilas et al. presented SketchSliders [21], a tool that provides a mobile sketching interface for creating sliders which interact with multi-dimensional datasets on a wall display. In comparison, in this paper, interaction is performed directly on the display rather than through a secondary touch device. Zhai et al. [22] introduced gesture interaction for wall displays based on the distance of the user from the screen. The gestures can be performed in far or near mode. Unlike the techniques described in this paper, their proposed gestures are not directly related to visual analytics tasks. Heilig et al. [23] developed a multi-touch scatterplot visualisation on a tabletop display. Sadana and Stasko [24] proposed advanced techniques for scatterplot data selection on smaller touch-based devices, such as tablets and smartphones, whereas this paper focuses on large multi-touch displays.
   MultiLens supports various gestures for fluid multi-touch exploration of graphs [25]. The Regression Lens [13] allows the user to interactively explore local areas of interest in scatterplots by showing the best-fitting regression models inside the lens. The idea of visualising local regression models has also been studied by Matković et al. [26]. Rzeszotarski et al. [27] introduced Kinetica, a tool for exploring multivariate data through physical interactions on multi-touch screens. Kister et al. [25] presented BodyLenses, a promising set of magic lenses for wall displays, which are mostly controlled by body interaction and are therefore suitable for interacting with wall displays from a distance.




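The core loop behind the Regression Lens [13] mentioned above (select the points under a movable lens, then fit a local regression model to them) can be sketched in a few lines. This is an illustrative sketch only, not the authors' Java/JavaFX implementation: the circular lens shape, the function names, and the degree-1 model are assumptions made for brevity (the Regression Lens itself supports higher-degree models).

```python
def points_in_lens(points, cx, cy, r):
    """Return the (x, y) points lying inside a circular lens centred at (cx, cy)."""
    return [(x, y) for (x, y) in points
            if (x - cx) ** 2 + (y - cy) ** 2 <= r ** 2]

def fit_line(points):
    """Ordinary least-squares fit of y = a*x + b over the selected points."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# Dragging the lens simply re-runs selection and fitting, which is what
# produces the live model feedback inside the lens.
scatter = [(-1.0, -1.0), (0.0, 1.0), (1.0, 3.0), (2.0, 5.0), (10.0, 21.0)]
selected = points_in_lens(scatter, 0.0, 1.0, 5.0)
slope, intercept = fit_line(selected)
```

Replacing `fit_line` with a higher-degree polynomial fit would correspond to the degree adjustment offered by the floating toolbox described in Section III.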



Fig. 2. On the left, a user drags a Regression Lens with the right hand while adjusting the lens with the left hand. On the right, a user drags a scatterplot with the right hand while panning through the scatterplot matrix with the left hand.

Fig. 3. On the left, two users collaboratively analyse a scatterplot. Both users create a regression model for a subset of selected data. The created models are displayed in their partner's respective lens as well, supporting comparison of local data models. On the right, one user analyses a scatterplot, while their partner selects interesting plots in the scatterplot matrix and passes them over by holding the background and swiping with the right hand.

   In comparison to this work, the aforementioned studies either focus on a different type of interaction and medium or are not designed for collaborative visual analytics tasks.

C. Collaborative Visualisation

   Large displays are well-suited to collaboration [28], [29]. Jakobsen and Hornbæk [5] conducted an exploratory study to understand group work with high-resolution multi-touch wall displays. The study suggests that using this kind of display helps users to work more efficiently as a group and to change fluidly between parallel and joint work. A large display benefits a group working on a shared task, since users can operate on one common physical medium and share information on it.
   Morris et al. [10] formalised the concept of cooperative gestures as a set of gestures performed by multiple users and interpreted as a single task by the system. Liu et al. developed CoReach [9], a set of gestures for collaboration between two users over large multi-touch displays. Comparing the use of a large vertically-mounted display against two ordinary desktop displays, Prouzeau et al. [30] concluded that groups obtain better results and communicate better on large, vertically-mounted displays.
   An experiment by Pedersen and Hornbæk [31] showed that users prefer horizontal surfaces over vertically-mounted displays, but this result was limited to simple single-user tasks and not collaborative tasks with different dynamics. Vertically-mounted displays allow users to obtain an overview of their data by stepping back from the display and make it possible to interact from afar as well as up close. Badam et al. [32] proposed a system for collaborative analysis on large displays by controlling individual lenses through explicit mid-air gestures.
   Although these studies are not directly related to collaborative scatterplot analysis on large multi-touch displays, they do provide valuable insights into the design process of such systems.

          III. PROPOSED INTERACTION TECHNIQUES

   Current standard multi-touch interaction techniques are not designed for collaboration on vertically-mounted high-resolution displays [9]. Here, both single-user and collaborative interactions are proposed for the analysis of scatterplots and scatterplot matrices on such devices. Some of the interaction techniques are based on the concept of the Regression Lens [13], which supports real-time regression analysis of subsets of a scatterplot through lens selection and manipulation. With the Regression Lens, a user can select a local area in a scatterplot and observe the regression model of the selected points [13]. Shao et al. proposed operations to adjust and manipulate the regression model shown in the Regression Lens, such as changing the degree of the regression model or inverting its axes. Figure 1 illustrates some of the suggested collaborative gestures on an 84-inch 4K/Ultra-HD 60 Hz multi-touch LCD monitor produced by Eyevis [33]. The user on the left finds interesting scatterplots and passes them to the user on the right. The user on the right analyses the plots using the Regression Lens [13]. In the rest of this section, four interaction designs for both collaborative and single-user scatterplot analysis are introduced. Later, in Section IV, an implementation of these techniques is demonstrated.

A. Lens and Floating Toolbox

   Magic lens techniques like DragMagics [34] and BodyLens [35] are used to explore local regions in a visualisation. An extended version of the basic lens concept provides for more fluid interaction with large multi-touch displays. For instance, as shown in Figure 2, after a region of interest has been selected in a scatterplot using the dominant hand (here the right hand), a toolbox appears at the other side of the lens (near the non-dominant hand), where the user can use sliders and touch buttons to adjust the lens. For example, the user can change the degree of the regression model. The lens can be dragged with one hand while being adjusted with the second hand, thus potentially speeding up performance.

B. Two-Handed Interaction with Scatterplot Matrices

   A scatterplot matrix consists of pairwise scatterplots arranged in a matrix, with dimensions typically labelled in the diagonal cells. Since the number of dimensions is usually high, panning and zooming within the scatterplot matrix is almost inevitable. With common multi-touch interactions, the scatterplot or dimension label is dragged to the corner of







the scatterplot matrix for panning. It is not feasible to zoom into or out of a scatterplot matrix while dragging another object. Based on two-handed interaction on tablets [36], a two-handed technique is proposed whereby the dominant hand is responsible for dragging items, while the non-dominant hand performs common operations. As shown on the left side of Figure 2, the user drags a scatterplot around to reorder the plots in the scatterplot matrix. Panning is performed by the non-dominant hand. With this two-handed technique, the interactions needed to reorder scatterplots in a scatterplot matrix can be reduced.

Fig. 4. A user selects a scatterplot of interest from a scatterplot matrix by touching and holding the left hand on the scatterplot. Swiping with the right hand then passes the selected scatterplot to the right-hand side of the display for more detailed analysis.

Fig. 5. A Regression Lens containing a cubic regression model is shown. At the left side of the Regression Lens, a floating toolbox with various options is visible.

C. Collaboration using Gestures

   On large vertically-mounted collaborative displays, it is not always desirable to move from one side of the screen to the other to perform a task. Instead, collaborative gestures can be used to pass objects. Based on the ideas of Liu et al. [9], collaborative gestures on scatterplots are proposed. On the right-hand side of Figure 3, the user on the left analyses a scatterplot, while the user on the right selects another scatterplot of interest. By holding the background of the scatterplot matrix with one hand and swiping with the other hand, the scatterplot is passed over to the partner. The partner can then decide whether or not to load the scatterplot for comparison. This technique can also be used for other tasks. For example, in Figure 4, the user selects a scatterplot of interest from a scatterplot matrix by touching and holding it with one hand (here, the left hand) and swipes the other hand in the direction of the analysis panel to load that scatterplot for more detailed analysis.

D. Collaborative Lens

   In collaborative analysis, visual feedback plays an essential role. When two analysts work on a vertically-mounted display without proper visual feedback, they need to communicate more and turn their heads more often. A collaborative lens can help ameliorate this issue. As illustrated on the left side of Figure 3, the user on the left side of the screen creates a regression lens and regression model in blue. Meanwhile, the user on the right side of the screen creates their regression lens and regression model in red. Both users can see the other user's regression model reflected in their own regression lens.

                    IV. IMPLEMENTATION

   Proof-of-concept interaction techniques for single-user and collaborative analysis of scatterplots and scatterplot matrices have been implemented on a vertically-mounted Eyevis 84-inch multi-touch display with a resolution of 3840 × 2160 pixels and a frame rate of 60 Hz. Figure 1 demonstrates a typical setup of the implemented application with two users working on the screen.
   The prototype application is written in Java, using JavaFX for the user interface and the TUIO [37] and TUIOFX [38] libraries for multi-touch interaction. To enable multiple users to work on the same screen with different widgets and user interface elements at the same time, a concept called focusArea from the TUIOFX library is used [39]. The application follows the widely-used Model-View-Controller (MVC) architecture.

                        V. USE CASE

   The use case for the prototype application is to improve interaction with the Regression Lens on multi-touch screens. The developed interaction techniques were tested with the well-known car dataset from the UCI Machine Learning Repository [40].
   For the interaction technique shown in Figure 1, user A (on the left) and user B (on the right) select two different plots from the shared central area containing the scatterplot matrix. For this technique, the user touches and holds a scatterplot with one hand and swipes to the right or left with the other hand to maximise it. This technique is elaborated in detail in Section III-C. After that, users A and B select an area in the scatterplot separately and toggle the Collaborative Lens option in the Floating Toolbox. As described in Section III-D, each







Fig. 6. The left and right panels are scatterplots for User A (left) and B (right) respectively. The central area of the screen contains a shared scatterplot
matrix. User A on the left draws an arbitrary rectangle and is interested in the quadratic regression model of the selected records, shown in red. User B on
the right chooses to observe the cubic regression model of the selected area, shown in blue. User A can see the cubic regression model of the right panel in
dashed blue and user B can see the left panel regression model in dashed red. Selected scatterplots are highlighted in green in the scatterplot matrix.



user is now able to observe the regression model of the other                   analysis. Three techniques are proposed to apply a perception-
user in their regression lens. Figure 1 shows two users working                 based level of detail to scatterplots on large vertically-mounted
side by side on a large vertically-mounted multi-touch display,                 high-resolution displays.
after creating two separate Regression Lenses and toggling to                       Firstly, the concept of superpixels is similar to image
the Double Lens option. The exact state of the screen is shown                  mosaics. A superpixel consists of a set of pixels in a small
in Figure 6. A single Regression Lens with a floating toolbox                   rectangular area of the screen, for example a regular grid of
is visible in Figure 5.                                                         say 50×50 pixels. The average colour, brightness, and contrast
                                                                                properties of superpixels can be used to visualise data for
    VI. P ERCEPTION - BASED L EVEL OF V ISUAL D ETAIL                           users farther from the screen. At the same time, the individual
             C ONCEPTS FOR S CATTERPLOTS                                        colouring of pixels comprising a superpixel can be used to
                                                                                visualise more detailed information for users who are closer
   Users of large vertically-mounted high-resolution displays                   to the screen.
may take up positions at varying distances from the display,                        Secondly, the concept of a Screen Progressive Visual Glyph
and hence may perceive more or less detail in the display.                      (SPVG) utilises the colour, brightness, and contrast values of
At greater distances from a large high-resolution display, less                 a glyph to encode different secondary information for closer
detail is perceived. Here, perceived pixel density (PPD) is defined as the number of pixels mapped to a single cell on the retina of the user's eye. PPD increases quadratically as the distance to the screen increases. When PPD becomes too large, the human perceptual system tends to average out colour, brightness, and contrast [41]; for example, a red pixel and a green pixel are perceived together as brown.

   The perceptual effect of averaging is well known, for instance in the perception of secondary colours as a mixture of two primary colours, or in the phenomenon of metamerism. Further related effects include simultaneous contrast [42], after-images [43], and the Chubb effect [44]. Without delving too deeply into perception psychology, note that a sophisticated theory of averaging effects is already available and well described. For the purposes of this discussion of large high-resolution displays, it is sufficient to state that the effect of averaging a set of pixels is already exploited in practice by techniques such as image mosaics [41] and halftone techniques [45], as illustrated in Figure 7.

   Since PPD and related averaging effects are a function of distance from the display, screen distance can be seen as an interactive parameter which can be exploited for visual data

users. In Figure 8, the scatterplot on the left visually encodes two different classes (brown and cyan) in the data. This is easily perceivable by a distant user. On the right, a user who is closer can make out an additional level of detail: the dots of the scatterplot in fact contain an additional histogram representing the distribution of the related class in the data. In this case, the circles representing the mapped data points are SPVGs. The difference between SPVGs and superpixels is that SPVGs encode different visual details of the same data at different distances. In this way, they could also be understood as a data filter concept. SPVGs can be placed on the screen on demand and are not restricted to a regular grid, providing greater flexibility.

   Thirdly, variational textures are related to halftone techniques. Structural variations of an underlying texture can be used to visually encode fine data details for users who are very close to the screen, while these details immediately disappear when the user moves further away.

   These proposed approaches to levels of visual detail align well with Shneiderman's mantra for information visualisation [46]: "Overview first, zoom and filter, details on demand". In this case, distance from the screen is an additional degree of
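Since these approaches are presented as concepts only, the distance-dependent switching behind SPVGs can be sketched in a few lines. The following Python sketch is not part of the authors' implementation; the pixel pitch, receptor angle, and PPD threshold are purely illustrative assumptions:

```python
import math

def perceived_pixel_density(distance_m, pixel_pitch_mm=0.485,
                            receptor_angle_deg=0.02):
    """Approximate PPD: the number of display pixels projected onto a
    single retinal cell of fixed angular size. The screen patch covered
    by that cell grows linearly with viewing distance, so the pixel
    count grows quadratically. All parameter values are illustrative."""
    patch_mm = 2.0 * distance_m * 1000.0 * math.tan(
        math.radians(receptor_angle_deg) / 2.0)
    pixels_across = patch_mm / pixel_pitch_mm
    return pixels_across ** 2

def glyph_level_of_detail(distance_m, threshold_ppd=4.0):
    """Hypothetical SPVG rule: once several pixels average out within
    one retinal cell, an embedded histogram is no longer resolvable,
    so the glyph degrades to a simple class-coloured dot."""
    if perceived_pixel_density(distance_m) > threshold_ppd:
        return "dot"            # distant user: overview only
    return "histogram_glyph"    # close user: extra detail visible

# Doubling the distance quadruples the PPD (quadratic growth):
ratio = perceived_pixel_density(2.0) / perceived_pixel_density(1.0)
```

With these assumed values, a user stepping back beyond roughly 2.8 m would see the glyphs collapse into plain dots, matching the behaviour described for Figure 8.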




Fig. 7. On the left, a multi-image mosaic of the Mona Lisa [41]. On the right, an example of halftone dot sampling [45].

Fig. 8. Screen Progressive Visual Glyphs (SPVGs): On the left, dots on a scatterplot representing items belonging to two classes (brown and cyan) are seen by distant users as simple dots. On the right, users who are closer to the screen can perceive an additional histogram showing the distribution of items.



freedom, controlled by each user individually as they move closer to or further away from the display. These approaches are presented as a concept and have not yet been implemented.

                VII. DISCUSSION AND FUTURE WORK

   The concepts described in this paper are first designs of appropriate touch interaction for the visual interactive analysis of scatterplot data on large vertically-mounted high-resolution multi-touch displays. The interactions support small-group collaborative analysis by exchanging patterns or settings from one user's view to the others. The interaction design is currently based on user selections, but is generalisable to other basic techniques. The interaction techniques have been implemented as a proof of concept. They still need to be evaluated with real users and real tasks as part of future work. Mapping out the design space for this combination of visualisation and display device may well yield further interesting interaction designs.

   The idea of exploiting perception-based level of detail for the visualisation of scatterplots on large displays is new. Detailed information can be rendered inside the marks of the plot, becoming perceivable once users move closer to the screen. Again, this is a proof of concept and requires further development and evaluation.

   While large high-resolution displays can improve the exploration of large scatterplot spaces, further data analysis support is needed to scale up with the number of data points and dimensions. Traditional techniques like cluster analysis and aggregation can help with scalability. Another relevant line of improvement is to adapt the view to the user's needs and situation. In [47], the authors propose using eye tracking to infer user interest and using this information to recommend additional relevant but previously unseen views for exploration. While that work was developed as a desktop application, it might be interesting to incorporate eye-tracking support to recommend views for small collaborative teams working on a large display. Moreover, adding group activity recognition, and thereby pro-active interaction, can support collaboration by preventing information overload [48].

                VIII. CONCLUDING REMARKS

   This paper presented challenges and solutions for collaborative and single-task multi-touch interaction on large vertically-mounted high-resolution displays. The techniques presented are well-suited to collaborative analysis tasks with scatterplots and scatterplot matrices. They are potentially generalisable to other data exploration and visual analytics practices, but require further implementation and evaluation. In addition, perception-based visualisation of scatterplots was introduced as a possible direction for further research.

                         REFERENCES

 [1] K. Reda, A. E. Johnson, M. E. Papka, and J. Leigh, "Effects of display size and resolution on user behavior and insight acquisition in visual exploration," in Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems. ACM, 2015, pp. 2759–2768.
 [2] R. A. Ruddle, W. Fateen, D. Treanor, P. Sondergeld, and P. Quirke, "Leveraging wall-sized high-resolution displays for comparative genomics analyses of copy number variation," in Biological Data Visualization (BioVis), 2013 IEEE Symposium on. IEEE, 2013, pp. 89–96.
 [3] A. Prouzeau, A. Bezerianos, and O. Chapuis, "Evaluating multi-user selection for exploring graph topology on wall-displays," IEEE Transactions on Visualization and Computer Graphics, 2016.
 [4] C. Andrews, A. Endert, and C. North, "Space to think: Large high-resolution displays for sensemaking," in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM, 2010, pp. 55–64.
 [5] M. R. Jakobsen and K. Hornbæk, "Up close and personal: Collaborative work on a high-resolution multitouch wall display," ACM Transactions on Computer-Human Interaction (TOCHI), vol. 21, no. 2, p. 11, 2014.
 [6] T. K. Khan, "A survey of interaction techniques and devices for large high resolution displays," in OASIcs-OpenAccess Series in Informatics, vol. 19. Schloss Dagstuhl-Leibniz-Zentrum fuer Informatik, 2011.
 [7] K. Vogt, L. Bradel, C. Andrews, C. North, A. Endert, and D. Hutchings, "Co-located collaborative sensemaking on a large high-resolution display with multiple input devices," Human-Computer Interaction – INTERACT 2011, pp. 589–604, 2011.




 [8] P. Isenberg, S. Carpendale, A. Bezerianos, N. Henry, and J.-D. Fekete, "Coconuttrix: Collaborative retrofitting for information visualization," IEEE Computer Graphics and Applications, vol. 29, no. 5, pp. 44–57, 2009.
 [9] C. Liu, O. Chapuis, M. Beaudouin-Lafon, and E. Lecolinet, "Coreach: Cooperative gestures for data manipulation on wall-sized displays," in Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems. ACM, 2017, pp. 6730–6741.
[10] M. R. Morris, A. Huang, A. Paepcke, and T. Winograd, "Cooperative gestures: Multi-user gestural interactions for co-located groupware," in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM, 2006, pp. 1201–1210.
[11] A. Inselberg, "The plane with parallel coordinates," The Visual Computer, vol. 1, no. 2, pp. 69–91, 1985.
[12] M. Ashdown, P. Tuddenham, and P. Robinson, "High-resolution interactive displays," in Tabletops - Horizontal Interactive Displays, 2010, pp. 71–100.
[13] L. Shao, A. Mahajan, T. Schreck, and D. J. Lehmann, "Interactive regression lens for exploring scatter plots," in Computer Graphics Forum, vol. 36, no. 3. Wiley Online Library, 2017, pp. 157–166.
[14] J. S. Yi, Y. A. Kang, J. Stasko, and J. Jacko, "Toward a deeper understanding of the role of interaction in information visualization," IEEE Transactions on Visualization and Computer Graphics, vol. 13, no. 6, pp. 1224–1231, 2007.
[15] B. Lee, P. Isenberg, N. H. Riche, and S. Carpendale, "Beyond mouse and keyboard: Expanding design considerations for information visualization interactions," IEEE Transactions on Visualization and Computer Graphics, vol. 18, no. 12, pp. 2689–2698, 2012.
[16] D. Keim, "Information visualization and visual data mining," IEEE Transactions on Visualization and Computer Graphics, vol. 8, no. 1, pp. 1–8, 2002.
[17] R. A. Ruddle, R. G. Thomas, R. S. Randell, P. Quirke, and D. Treanor, "Performance and interaction behaviour during visual search on large, high-resolution displays," Information Visualization, vol. 14, no. 2, pp. 137–147, 2015.
[18] J. C. Roberts, "Exploratory visualization with multiple linked views," in Exploring Geovisualization. Amsterdam: Elsevier, 2005, pp. 159–180.
[19] M. Czerwinski, G. Smith, T. Regan, B. Meyers, G. G. Robertson, and G. Starkweather, "Toward characterizing the productivity benefits of very large displays," in Interact, vol. 3, 2003, pp. 9–16.
[20] P. Isenberg, P. Dragicevic, W. Willett, A. Bezerianos, and J.-D. Fekete, "Hybrid-image visualization for large viewing environments," IEEE Transactions on Visualization and Computer Graphics, vol. 19, no. 12, pp. 2346–2355, 2013.
[21] T. Tsandilas, A. Bezerianos, and T. Jacob, "Sketchsliders: Sketching widgets for visual exploration on wall displays," in Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems. ACM, 2015, pp. 3255–3264.
[22] Y. Zhai, G. Zhao, T. Alatalo, J. Heikkilä, T. Ojala, and X. Huang, "Gesture interaction for wall-sized touchscreen display," in Proceedings of the 2013 ACM Conference on Pervasive and Ubiquitous Computing Adjunct Publication. ACM, 2013, pp. 175–178.
[23] M. Heilig, S. Huber, M. Demarmels, and H. Reiterer, "Scattertouch: A multi touch rubber sheet scatter plot visualization for co-located data exploration," in ACM International Conference on Interactive Tabletops and Surfaces. ACM, 2010, pp. 263–264.
[24] R. Sadana and J. Stasko, "Expanding selection for information visualization systems on tablet devices," in Proceedings of the 2016 ACM on Interactive Surfaces and Spaces. ACM, 2016, pp. 149–158.
[25] U. Kister, P. Reipschläger, and R. Dachselt, "Multilens: Fluent interaction with multi-functional multi-touch lenses for information visualization," in Proceedings of the 2016 ACM on Interactive Surfaces and Spaces. ACM, 2016, pp. 139–148.
[26] K. Matković, H. Abraham, M. Jelović, and H. Hauser, "Quantitative externalization of visual data analysis results using local regression models," in International Cross-Domain Conference for Machine Learning and Knowledge Extraction. Springer, 2017, pp. 199–218.
[27] J. M. Rzeszotarski and A. Kittur, "Kinetica: Naturalistic multi-touch data visualization," in Proceedings of the 32nd Annual ACM Conference on Human Factors in Computing Systems. ACM, 2014, pp. 897–906.
[28] C. Andrews, A. Endert, B. Yost, and C. North, "Information visualization on large, high-resolution displays: Issues, challenges, and opportunities," Information Visualization, vol. 10, no. 4, pp. 341–355, 2011.
[29] P. Isenberg, T. Isenberg, T. Hesselmann, B. Lee, U. Von Zadow, and A. Tang, "Data visualization on interactive surfaces: A research agenda," IEEE Computer Graphics and Applications, vol. 33, no. 2, pp. 16–24, 2013.
[30] A. Prouzeau, A. Bezerianos, and O. Chapuis, "Trade-offs between a vertical shared display and two desktops in a collaborative path-finding task," in Proceedings of Graphics Interface 2017, 2017.
[31] E. W. Pedersen and K. Hornbæk, "An experimental comparison of touch interaction on vertical and horizontal surfaces," in Proceedings of the 7th Nordic Conference on Human-Computer Interaction: Making Sense Through Design. ACM, 2012, pp. 370–379.
[32] S. K. Badam, F. Amini, N. Elmqvist, and P. Irani, "Supporting visual exploration for multiple users in large display environments," in Visual Analytics Science and Technology (VAST), 2016 IEEE Conference on. IEEE, 2016, pp. 1–10.
[33] "Eyevis display 84-inch," 2017. [Online]. Available: http://www.eyevis.de/en/products/lcd-solutions/4k-ultra-hd-lcd-monitors/84-inch-4k-uhd-lcd.html
[34] A. Prouzeau, A. Bezerianos, and O. Chapuis, "Towards road traffic management with forecasting on wall displays," in Proceedings of the 2016 ACM on Interactive Surfaces and Spaces. ACM, 2016, pp. 119–128.
[35] U. Kister, P. Reipschläger, F. Matulic, and R. Dachselt, "Bodylenses: Embodied magic lenses and personal territories for wall displays," in Proceedings of the 2015 International Conference on Interactive Tabletops & Surfaces. ACM, 2015, pp. 117–126.
[36] K.-P. Yee, "Two-handed interaction on a tablet display," in CHI'04 Extended Abstracts on Human Factors in Computing Systems. ACM, 2004, pp. 1493–1496.
[37] M. Kaltenbrunner, T. Bovermann, R. Bencina, and E. Costanza, "TUIO: A protocol for table-top tangible user interfaces," in Proc. of the 6th Intl. Workshop on Gesture in Human-Computer Interaction and Simulation, 2005, pp. 1–5.
[38] M. Fetter and D. Bimamisa, "TUIOFX: Toolkit support for the development of JavaFX applications for interactive tabletops," in Human-Computer Interaction. Springer, 2015, pp. 486–489.
[39] M. Fetter, D. Bimamisa, and T. Gross, "TUIOFX: A JavaFX toolkit for shared interactive surfaces," Proceedings of the ACM on Human-Computer Interaction, vol. 1, no. 1, p. 10, 2017.
[40] M. Lichman, "UCI machine learning repository," 2013. [Online]. Available: http://archive.ics.uci.edu/ml
[41] A. Finkelstein and M. Range, "Image mosaics," Princeton University, Computer Science Department, Technical Report TR-574-98, Mar. 1998.
[42] P. Burgh, "Peripheral viewing and simultaneous contrast," The Quarterly Journal of Experimental Psychology, vol. 16, no. 3, pp. 257–263, 1964.
[43] S. Anstis, B. Rogers, and J. Henry, "Interactions between simultaneous contrast and colored afterimages," Vision Research, pp. 899–911, 1978.
[44] C. Chubb, G. Sperling, and J. Solomon, "Texture interactions determine perceived contrast," Proceedings of the National Academy of Sciences, vol. 86, pp. 9631–9635, 1989.
[45] R. Steinbrecher, "Bildverarbeitung in der Praxis," RST-Verlag, München-Wien-Oldenburg, 2005.
[46] B. Shneiderman, "The eyes have it: A task by data type taxonomy for information visualizations," in IEEE Visual Languages, 1996, pp. 336–343.
[47] L. Shao, N. Silva, E. Eggeling, and T. Schreck, "Visual exploration of large scatter plot matrices by pattern recommendation based on eye tracking," in Proceedings of the 2017 ACM Workshop on Exploratory Search and Interactive Data Analytics. ACM, 2017, pp. 9–16.
[48] D. Gordon, J.-H. Hanne, M. Berchtold, A. A. N. Shirehjini, and M. Beigl, "Towards collaborative group activity recognition using mobile devices," Mobile Networks and Applications, vol. 18, no. 3, pp. 326–340, 2013.



