<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta>
      <journal-title-group>
        <journal-title>CEUR Workshop Proceedings</journal-title>
      </journal-title-group>
      <issn pub-type="ppub">1613-0073</issn>
    </journal-meta>
    <article-meta>
      <title-group>
        <article-title>The use of simple graphs and cliques for analysis of cartographic eye-tracking data</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Jitka Dolezalova</string-name>
          <xref ref-type="aff" rid="aff0"/>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Stanislav Popelka</string-name>
          <xref ref-type="aff" rid="aff1"/>
        </contrib>
        <aff id="aff0">
          <institution>Department of Geoinformatics, Palacký University Olomouc</institution>
          , 17. listopadu 50,
          <addr-line>Olomouc</addr-line>
          ,
          <country country="CZ">Czech Republic</country>
        </aff>
        <aff id="aff1">
          <institution>Department of Geoinformatics, Palacký University Olomouc</institution>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2016</year>
      </pub-date>
      <volume>1649</volume>
      <fpage>206</fpage>
      <lpage>211</lpage>
      <abstract>
        <p>Usability testing with the use of eye-tracking technology is now emerging. Measuring the point of gaze is employed in many fields of research and helps to solve real-world problems. One of these areas is cartography. In addition to traditional methods of analysing eye-tracking data, such as attention maps and gaze plots, a more sophisticated method exists: scanpath comparison. Many different approaches to scanpath comparison exist. One of the most frequently used is String Edit Distance, where gaze trajectories are replaced by sequences of visited Areas of Interest. In cartography, these Areas of Interest can be marked around specific parts of maps - map composition elements. We have developed an online tool called ScanGraph, whose output is visualized as a simple graph in which similar groups of sequences are displayed as cliques. ScanGraph uses a modified Levenshtein distance and the Needleman-Wunsch algorithm for calculating the similarities between sequences of visited Areas of Interest. Cliques in the graph are sought with an exhaustive algorithm. ScanGraph functionality is presented on the example of a cartographic study dealing with uncertainty in maps. Stimuli in the study contained several visualization methods of uncertainty, and an eye-tracking experiment with 40 respondents was performed. With the use of ScanGraph, groups of participants with a similar strategy were identified.</p>
      </abstract>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>Introduction</title>
      <p>
        Eye-tracking is one of the most precise and objective
methods used in usability studies. The term usability is defined by
ISO 9241-11 as “the effectiveness, efficiency, and
satisfaction, with which specified users achieve specified
goals in particular environments”. To derive
qualitative or quantitative measures of user attitudes towards
a product, many evaluation methods exist: focus group
studies, interviews, direct observation, think-aloud protocol,
screen capturing and eye-tracking [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]. Each of the methods
for studying usability has its advantages and disadvantages.
Focus groups and interviews use direct contact
with the user. They are based on targeted questioning and
recording of the discussions and reactions of individuals or
groups of respondents to a particular product. Another very
important method of usability assessment is think-aloud:
participants verbally describe the process of solving a
particular task as well as their feelings [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ]. A problem with the above methods is
that participants are not aware of all their cognitive
processes, and not all processes can simply be expressed in
words. The information that respondents communicate
during an interview or fill into a questionnaire may not
correspond to reality, although respondents believe their
answers [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ]. The cognitive load of the respondent during a
think-aloud session can be so large that it affects their interaction
with the map. In contrast, during eye-movement recording,
the cognitive load associated with self-reporting is
eliminated. Eye-tracking can be considered an objective
method because recording eye movements does not rely on
self-reporting [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ]. A combination of different methods is
very often used (e.g. [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ] or [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ]).
      </p>
      <p>
        Hammoud and Mulligan [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ] state that the
scientific study of eye movements began at the end of the 19th
century, when many methods for measuring eye movements
were developed. Some of these methods were based on
a mechanical transmission of the position information of the
eye [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ], while others were based on the study of photographs [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ]. Most of the
modern eye-trackers work on the principle of non-contact
recording of the pupil and corneal reflection [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ].
The eye-tracker is usually located below the monitor displaying
the studied stimuli. This unit incorporates one or more infrared
lights that shine in the direction of the user. The apparatus
also includes a camera that captures the user's eyes. The
center of the pupil and the reflection of the infrared light are found
by image recognition. From the relative positions of these
two points, the device calculates the direction of view (Point
of Regard).
      </p>
      <p>Eye-tracking is used in many areas. The most common are
psychological studies, medicine, HCI (Human-Computer
Interaction), marketing, usability studies and also
cartography.</p>
      <p>
        Although eye-tracking was first used for the
evaluation of maps and cartographic works in the late 1950s
[
        <xref ref-type="bibr" rid="ref11">11</xref>
        ], it has increasingly been used in the last ten to
fifteen years. The reason is the decreasing cost of equipment
and the development of computer technology, which allows
faster and more efficient analysis of the measured data.
Eye-tracking in cartography can be used for the evaluation of
map portals [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ], meteorological maps [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ], for analysis of
text labels on the map [
        <xref ref-type="bibr" rid="ref13">13</xref>
        ] or 3D visualization in
cartography [
        <xref ref-type="bibr" rid="ref14">14</xref>
        ].
      </p>
      <p>In most of the studies, the measured data were evaluated with
the use of statistical analysis of eye-tracking metrics. For
visualization of the data, mostly only basic visualization
methods such as scanpaths or heatmaps were used. In some
cases, a more sophisticated method of analysis is needed.</p>
      <p>
        An example of such a sophisticated method is scanpath
comparison. This method can be used when
the similarity between different participants’ strategies is
investigated. The beginning of interest in distinctive
scanning patterns can be found in the study of Noton and
Stark [
        <xref ref-type="bibr" rid="ref15">15</xref>
        ], who reported a qualitative similarity in
eye-movements when people viewed line drawings on multiple
occasions. The scanpath consists of sequences of alternating
saccades and fixations that repeat themselves when
a respondent is viewing stimuli. Scanpath comparison
methods can be divided into six groups (String Edit
Distance, ScanMatch, Sample-based measures, Linear
distance, MultiMatch and Cross-recurrence quantification
analysis). The comparison of these methods is described in
[
        <xref ref-type="bibr" rid="ref16">16</xref>
        ]. One of the most frequently used methods is String Edit
Distance, which is used to measure the dissimilarity
of character strings. For the use of String Edit Distance, a
grid or Areas of Interest (AOI) have to be marked in the
stimulus. The gaze trajectory (scanpath) is then replaced by
a character string representing the sequence of fixations, with
characters denoting the AOIs they hit. Only 10 percent of the scanpath
duration is taken up by the collective duration of saccadic
eye-movements; fixations take up 90 percent of the total
viewing period [
        <xref ref-type="bibr" rid="ref17">17</xref>
        ].
      </p>
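The replacement of a gaze trajectory by a character string, as described above, can be sketched as follows. The AOI rectangles and fixation coordinates here are hypothetical placeholders; in practice the AOIs come from the stimulus:

```python
# Replace a gaze trajectory by a character string: each fixation is
# mapped to the letter of the Area of Interest (AOI) it falls into.
# Hypothetical axis-aligned AOI rectangles: (x_min, y_min, x_max, y_max).
AOIS = {
    "A": (0, 0, 100, 100),
    "B": (100, 0, 200, 100),
    "C": (0, 100, 100, 200),
}

def fixations_to_string(fixations):
    """Map each (x, y) fixation to its AOI letter; fixations outside all AOIs are skipped."""
    chars = []
    for x, y in fixations:
        for letter, (x0, y0, x1, y1) in AOIS.items():
            if x0 <= x < x1 and y0 <= y < y1:
                chars.append(letter)
                break
    return "".join(chars)

print(fixations_to_string([(50, 50), (150, 20), (160, 40), (30, 150)]))  # ABBC
```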
    </sec>
    <sec id="sec-2">
      <title>Methods</title>
      <p>
        ScanGraph is a web application developed by the authors of this paper.
Its purpose is to analyse similarities between sequences of visited
Areas of Interest from eye-tracking data. It is designed to load data
directly from the open-source application OGAMA [
        <xref ref-type="bibr" rid="ref18">18</xref>
        ], so no
additional data preparation is needed. The motivation for the
creation of the application was the lack of any other tool that would
allow finding groups of participants with a similar strategy of
stimuli observation based on a given degree of similarity. The
interface of ScanGraph is displayed in Figure 1. The application is
freely available at www.eyetracking.upol.cz/scangraph. More
information about the use of the application is available in [
        <xref ref-type="bibr" rid="ref19">19</xref>
        ] and
the necessary principles are described below.
Let d: Σ* × Σ* → ℝ be a distance function measuring the distance
between two given sequences (words) u, v ∈ Σ*. We require d to
have these properties:
d(u, u) = 0 (reflexivity),
d(u, v) = d(v, u) (symmetry),
d(u, w) ≤ d(u, v) + d(v, w) (triangle inequality).
      </p>
      <p>ScanGraph uses two distance functions, based on the Levenshtein
distance and the Needleman-Wunsch algorithm.</p>
      <p>
        Levenshtein distance is named after the Russian scientist Vladimir
Levenshtein [
        <xref ref-type="bibr" rid="ref20">20</xref>
        ]. The Levenshtein distance between two strings
u = u1 u2 … u|u| and v = v1 v2 … v|v| of lengths |u| and |v|
(let us denote it d_L(u, v)) is the minimal number of deletions, insertions, or
substitutions needed to transform the source string u into the target string v.
Hence, d_L(u, v) = 0 if and only if the strings are equal, and
d_L(u, v) = max(|u|, |v|) if and only if there is no
correspondence between the strings. The value of the Levenshtein
distance increases with larger differences between the strings.
The modified Levenshtein distance function d′_L(u, v) used by
ScanGraph is defined by this equation:
d′_L(u, v) = 1 − d_L(u, v) / max(|u|, |v|).
      </p>
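A minimal dynamic-programming sketch of the Levenshtein distance and of the normalized similarity d′_L(u, v) = 1 − d_L(u, v) / max(|u|, |v|) (an illustration, not the ScanGraph source code):

```python
def levenshtein(u: str, v: str) -> int:
    """Minimum number of insertions, deletions and substitutions turning u into v."""
    prev = list(range(len(v) + 1))
    for i, cu in enumerate(u, 1):
        curr = [i]
        for j, cv in enumerate(v, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (cu != cv)))   # substitution
        prev = curr
    return prev[-1]

def levenshtein_similarity(u: str, v: str) -> float:
    """Normalized similarity in [0, 1]: 1 for equal strings, 0 for no correspondence."""
    if not u and not v:
        return 1.0
    return 1 - levenshtein(u, v) / max(len(u), len(v))

print(levenshtein("DABCD", "DABCBD"))             # 1 (one insertion)
print(levenshtein_similarity("DABCD", "DABCBD"))  # ≈ 0.833
```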
      <p>
        The other metric of sequence alignment used is the
Needleman-Wunsch algorithm [
        <xref ref-type="bibr" rid="ref21">21</xref>
        ] with its scoring system.
The Needleman-Wunsch algorithm (let us denote its value
d_NW(u, v)) searches for concordant elements between two strings
u = u1 u2 … u|u| and v = v1 v2 … v|v| of lengths |u| and
|v|. The basic scoring system used for our needs is given by a match
reward equal to 1, a gap cost equal to 0 and a mismatch penalty equal
to 0. Hence, d_NW(u, v) = min(|u|, |v|) when u is a subsequence
of v or v is a subsequence of u. The value of d_NW(u, v) increases
with the similarity between the strings.
      </p>
      <p>The modified Needleman-Wunsch value d′_NW(u, v) used by
ScanGraph is defined by this equation:
d′_NW(u, v) = d_NW(u, v) / min(|u|, |v|).</p>
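Assuming the basic scoring of match reward 1, gap cost 0 and mismatch penalty 0 (consistent with d_NW(u, v) = min(|u|, |v|) when one string is a subsequence of the other), the Needleman-Wunsch score reduces to the length of the longest common subsequence. A sketch under that assumption:

```python
def needleman_wunsch(u: str, v: str, match=1, mismatch=0, gap=0) -> int:
    """Global alignment score; with the (1, 0, 0) scoring this equals the LCS length."""
    score = [[j * gap for j in range(len(v) + 1)]]
    for i in range(1, len(u) + 1):
        row = [i * gap]
        for j in range(1, len(v) + 1):
            diag = score[i - 1][j - 1] + (match if u[i - 1] == v[j - 1] else mismatch)
            row.append(max(diag, score[i - 1][j] + gap, row[j - 1] + gap))
        score.append(row)
    return score[-1][-1]

def nw_similarity(u: str, v: str) -> float:
    """d_NW(u, v) / min(|u|, |v|); equals 1 when one string is a subsequence of the other."""
    return needleman_wunsch(u, v) / min(len(u), len(v))

print(needleman_wunsch("DABCD", "DABCBD"))  # 5
print(nw_similarity("DAD", "DABCD"))        # 1.0
```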
      <p>The values d′_L(u, v), d′_NW(u, v) ∈ [0, 1] express the degree
of similarity: the higher the value, the greater the similarity. The
similarity matrix S formed by these values is constructed. The user
sets a value which represents the minimal desired degree of
similarity. This value is called the parameter and is denoted p.
The adjacency matrix A of the graph G is then created
from the matrix S according to this relation:
a_ij = 1 if s_ij ≥ p,
a_ij = 0 otherwise.</p>
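The thresholding of the similarity matrix by the user-chosen parameter can be sketched as follows (the similarity values below are hypothetical):

```python
def adjacency_from_similarity(S, p):
    """Build a 0/1 adjacency matrix: an edge where the similarity is at least p."""
    n = len(S)
    return [[1 if i != j and S[i][j] >= p else 0 for j in range(n)] for i in range(n)]

# Hypothetical pairwise similarities between three sequences.
S = [
    [1.0, 0.9, 0.2],
    [0.9, 1.0, 0.4],
    [0.2, 0.4, 1.0],
]
print(adjacency_from_similarity(S, 0.8))  # [[0, 1, 0], [1, 0, 0], [0, 0, 0]]
```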
      <p>Groups of sequences with a degree of similarity higher than or
equal to the desired parameter p are equivalent to cliques in the
graph G.</p>
      <p>ScanGraph seeks the cliques as submatrices A_k of
the adjacency matrix A of order k, where a_ij = 1 for all
i, j ∈ {v_1, …, v_k}, i ≠ j, and there does not exist any matrix
A′ ⊃ A_k of order k + 1 satisfying the same condition.</p>
      <p>
        The maximal clique problem is NP-complete [
        <xref ref-type="bibr" rid="ref22">22</xref>
        ]. Hence,
the algorithm does not run in polynomial time. When the
computing time is too long, a greedy heuristic is used.
      </p>
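A tiny exhaustive maximal-clique search over the adjacency matrix can be sketched as follows; like the exhaustive algorithm described above, it enumerates all vertex subsets and therefore does not run in polynomial time (an illustration, not ScanGraph's implementation):

```python
from itertools import combinations

def maximal_cliques(adj):
    """All maximal cliques of the graph given by a 0/1 adjacency matrix (exhaustive)."""
    n = len(adj)
    is_clique = lambda nodes: all(adj[a][b] for a, b in combinations(nodes, 2))
    cliques = [set(c) for k in range(1, n + 1)
               for c in combinations(range(n), k) if is_clique(c)]
    # Keep only cliques that are not contained in a larger clique.
    return [c for c in cliques if not any(c < d for d in cliques)]

# Path graph 0-1-2: edges (0,1) and (1,2).
adj = [[0, 1, 0],
       [1, 0, 1],
       [0, 1, 0]]
print(maximal_cliques(adj))  # [{0, 1}, {1, 2}]
```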
    </sec>
    <sec id="sec-3">
      <title>Example case study</title>
      <p>
        Analysis of recorded eye-tracking data using ScanGraph can be
employed in every case where it is appropriate to compare
different groups of respondents. Although ScanGraph is
quite new, it has already been used in several case studies. An analysis of
differences between cartographers and non-cartographers
observing different map compositions was performed in [
        <xref ref-type="bibr" rid="ref19">19</xref>
        ].
Differences between males and females when searching for a point
symbol in a map were found in [
        <xref ref-type="bibr" rid="ref23">23</xref>
        ]. Snopková [
        <xref ref-type="bibr" rid="ref24">24</xref>
        ] analysed
differences in map reading between participants with normal vision
and colour-blind participants. ScanGraph can apparently also be
used in other fields of research (not only cartography). Pulkrtová
[
        <xref ref-type="bibr" rid="ref25">25</xref>
        ] used it in her psychological thesis dealing with the different
perception of the colour red by males and females. Hájková [
        <xref ref-type="bibr" rid="ref26">26</xref>
        ] used
ScanGraph in a study at the Department of Physiotherapy,
University Hospital Olomouc, with patients after a brain stroke. She
compared a control group with two groups of patients with
different types of stroke.
      </p>
      <p>
        In this paper, the possibilities of ScanGraph will be presented on the
example of a cartographic study dealing with the visualization of
uncertainty in maps. Uncertainty is seen as vagueness or
randomness of conditions or results of particular processes and
phenomena. The concept of uncertainty is also quite often used to
describe little certainty about a particular phenomenon in
maps [
        <xref ref-type="bibr" rid="ref27">27</xref>
        ]. Many approaches and methods of uncertainty
visualization have been developed, based for example on Bertin’s
theory of graphic variables and combining both static and dynamic
elements of visualization [
        <xref ref-type="bibr" rid="ref28">28</xref>
        ].
      </p>
      <p>
        The case study uses data from a master’s thesis [
        <xref ref-type="bibr" rid="ref29">29</xref>
        ]. In this thesis,
sets of cartographic symbols for the visualization of uncertainty of
points, lines and areas were created. Point symbols were set up
according to the study of [
        <xref ref-type="bibr" rid="ref30">30</xref>
        ]. These symbols were placed into
maps, and these maps were used as stimuli for an eye-tracking
experiment and an online questionnaire. The aim of the thesis was to
find which visualizations are the most comprehensible for the map
reader. The experiment was conducted with 40 participants.
Twenty of them were students of cartography; twenty were
respondents with no education in cartography. In the thesis,
eye-tracking metrics (Trial Duration, Gaze Length) and accuracy of
answers were compared for all stimuli. A total of 27 maps with point
symbols were used in the experiment. Thirteen of them
depicted a single phenomenon; eight represented a
combination of several phenomena. The last six maps
showed spatial and temporal uncertainty separately.
      </p>
      <p>At the beginning, user observation of stimulus BK07 was
analysed. In this case, the map contained 16 point symbols
representing the possible occurrence of three animal species (wild
boar, hare, and fox) with different levels of uncertainty. The task was
to find the most probable locality where each animal could be
found. The legend for all three species was located on the right
side of the stimulus. The left part contained an orthophoto map with
point symbols. Areas of Interest were marked around the map field
and each part of the legend (see Figure 2).</p>
      <p>
        Gaze data were converted to strings of characters
according to the position of fixations in the marked Areas of
Interest in the OGAMA software [
        <xref ref-type="bibr" rid="ref18">18</xref>
        ]. The process of conversion
is displayed in Figure 3. From the scanpaths (left side of
Figure 3), the character strings are generated. For the analysis,
collapsed strings (with no consecutively repeated characters - right
part of Figure 3) will be used.
      </p>
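Collapsing a string so that no character repeats consecutively, as in the right part of Figure 3, can be sketched as:

```python
from itertools import groupby

def collapse(sequence: str) -> str:
    """Remove consecutive duplicate characters (e.g. 'DDAABC' -> 'DABC')."""
    return "".join(ch for ch, _ in groupby(sequence))

print(collapse("DDDAAABBCCD"))  # DABCD
```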
      <p>In the ScanGraph interface, the modified Levenshtein
computation method was selected. As mentioned above,
collapsed data were used for the analysis. The parameter (see
above for more information) was set to .
(representing a similarity of at least %). Six non-trivial
cliques were found in the resulting graph (see Figure 4).</p>
      <p>The largest clique contained five sequences (participants).
Three of them were cartographers; two belonged to the
group of non-cartographers. The strategy of these
participants can be described as an ideal one. All of them
started in the center of the screen (AOI D). Then they moved
their gaze to all parts of the legend (AOIs A, B, and C), then
moved back to the map field (AOI D) and sought
the correct answer. Participants P13 and P17 made an
additional fixation in AOI A after looking into AOI B.</p>
      <p>In the clique with four participants, the situation was
similar. In this case, all sequences were “DABCD”. The only
exception was participant P20, who performed an additional
fixation in AOI B at the end of the stimulus observation.</p>
      <p>The rest of the displayed non-trivial cliques contained only
two participants. All of these participants omitted AOI C
while viewing the stimulus. AOI C was marked
around the last part of the legend (representing the possible
occurrence of the fox). Because all parts of the legend looked
similar, these participants decided not to look into the last part.</p>
      <p>The rest of the participants in the experiment were isolated
nodes. That means that their sequence of visited Areas of
Interest was not similar to any other sequence (according to
the parameter . ). Examples of such sequences are
participant P29 with sequence “DABADADADADABDBCBDAD”,
P30 with sequence “DAD”, and P43, who spent
the whole observation time in the map field (sequence “D”).
All three of these participants belonged to the group of
non-cartographers.</p>
      <p>With the use of ScanGraph, we were able to quickly find
the group of participants who observed the stimuli in
a similar way. After examination of the particular sequences,
it was discovered that this sequence was the ideal one.</p>
      <p>The second analysed map from the experiment was
stimulus C03. The map in this case depicted the possible
occurrence of the fox. Unlike in the previous stimulus, spatial
and temporal uncertainty were displayed with two different
map symbols. The task of the respondents was to find the place
where finding a fox was most probable (both spatially and
temporally). Areas of Interest were again
marked in the stimulus (see Figure 5). AOI A represented the
correct answer. AOI B was marked around the symbol of the
fox, which also served as the map title (recorded incidence of
foxes). Two other AOIs were marked around two parts of the
legend (spatial uncertainty - AOI C and temporal uncertainty
- AOI D). The last AOI (E) was marked around the map field
(except the correct answer location).</p>
      <p>The same ScanGraph settings as in the previous
example were used; only the parameter value was set to
. . This example contained more AOIs, so a lower
similarity between sequences could be assumed. When we
tried to use the same value of the parameter as in the previous
example ( . ), only one clique containing two
participants was found. The resulting graph can be seen in
Figure 6.</p>
      <p>
        A total of 13 non-trivial cliques were found in the output of
ScanGraph. The comparison between two
cliques containing three participants is interesting. The first one is
a clique with participants P14, P22, and P27, all of whom belonged
to the group of cartographers. The second clique, with
participants P16, P29 and P43, is highlighted in Figure 6. All
of these participants were non-cartographers. The difference
between these cliques lies in the fact that none of the
non-cartographers observed AOI B, marked around the map
title. Students of cartography are taught to pay attention to
the map title. Similar behaviour was found in another study
comparing respondents’ reading of different map
compositions [
        <xref ref-type="bibr" rid="ref31">31</xref>
        ]. Non-cartographers almost entirely
omitted the map title during a free-viewing task.
      </p>
      <p>The longest common subsequence LCS(u, v) of sequences
(words) u and v is a sequence w of maximal length |w| such
that there exist increasing index sequences i_1, …, i_|w| and
j_1, …, j_|w| such that for all k: u_(i_k) = w_k and v_(j_k) = w_k.</p>
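The longest common subsequence from this definition can be computed by standard dynamic programming (an illustrative sketch, not part of ScanGraph):

```python
def longest_common_subsequence(u: str, v: str) -> str:
    """Return one longest common subsequence of u and v via dynamic programming."""
    # table[i][j] holds an LCS of the prefixes u[:i] and v[:j].
    table = [[""] * (len(v) + 1) for _ in range(len(u) + 1)]
    for i, cu in enumerate(u, 1):
        for j, cv in enumerate(v, 1):
            if cu == cv:
                table[i][j] = table[i - 1][j - 1] + cu
            else:
                table[i][j] = max(table[i - 1][j], table[i][j - 1], key=len)
    return table[-1][-1]

print(longest_common_subsequence("DABCD", "DABCBD"))  # DABCD
```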
      <p>
        The second possible algorithm, the Damerau–
Levenshtein distance [
        <xref ref-type="bibr" rid="ref32">32</xref>
        ], is an enhancement of the Levenshtein
distance algorithm. In addition, it also counts transpositions
of adjacent characters.
      </p>
      <p>The Damerau–Levenshtein distance d_DL(u, v) is the distance
between two sequences u, v given by counting the minimum
number of operations needed to transform one string into the
other, where an operation is defined as an insertion, deletion,
or substitution of a single character, or a transposition of two
adjacent characters.</p>
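A sketch of the restricted Damerau–Levenshtein (optimal string alignment) distance, which extends Levenshtein editing with transpositions of adjacent characters (illustrative only):

```python
def damerau_levenshtein(u: str, v: str) -> int:
    """Edit distance counting insertions, deletions, substitutions and
    transpositions of adjacent characters (restricted / optimal string alignment)."""
    m, n = len(u), len(v)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if u[i - 1] == v[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost) # substitution
            if i > 1 and j > 1 and u[i - 1] == v[j - 2] and u[i - 2] == v[j - 1]:
                d[i][j] = min(d[i][j], d[i - 2][j - 2] + 1)  # transposition
    return d[m][n]

print(damerau_levenshtein("ABDC", "ABCD"))  # 1 (one transposition)
```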
      <p>The final selection of the metric used will depend on the
character of the data and the distribution of AOIs within the stimuli.</p>
      <p>
        When the calculation uses the exhaustive algorithm, it
finds an optimal solution with all non-trivial cliques in the
given graph. The computational time of the exhaustive
algorithm is not polynomial. When the time exceeds a tolerable
limit, a greedy heuristic algorithm is used; the reliability of
its results is arguable. For the purposes of eye-tracking data
analyses, higher values of similarity (hence a lower number
of edges) are investigated. Moreover, the Bron–Kerbosch
algorithm [
        <xref ref-type="bibr" rid="ref33">33</xref>
        ] for the maximal clique problem will be tested and
compared to the currently used algorithm.
      </p>
    </sec>
    <sec id="sec-4">
      <title>Conclusion</title>
      <p>The paper describes a newly developed tool for the
analysis of eye-movement data. Eye-movements are
represented as sequences of fixations recorded in Areas of
Interest marked in the stimuli. The application uses a modified
Levenshtein distance and the Needleman-Wunsch algorithm
and visualizes the result in the form of a simple graph. Groups
of participants with a similar strategy are represented as
cliques of this graph. The paper describes the principles of
the computations. The functionality of the application is
presented on the example of a cartographic case study dealing
with map uncertainty visualization.</p>
      <p>The tool is called ScanGraph and is freely available at
www.eyetracking.upol.cz/scangraph.</p>
    </sec>
    <sec id="sec-5">
      <title>Acknowledgment</title>
      <p>We would like to thank Michal Kučera, whose data were used
for the case study. This paper was supported by a project of the
Operational Program Education for Competitiveness –
European Social Fund (project CZ.1.07/2.3.00/20.0170) of
the Ministry of Education, Youth and Sports of the Czech
Republic and by the student project IGA_PrF_2016_008 of
Palacký University.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <surname>Li</surname>
            ,
            <given-names>X.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Çöltekin</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kraak</surname>
            ,
            <given-names>M.-J.</given-names>
          </string-name>
          (
          <year>2010</year>
          )
          <article-title>Visual exploration of eye movement data using the space-timecube</article-title>
          .
          <source>In Geographic Information Science</source>
          . Springer, pp.
          <fpage>295</fpage>
          -
          <lpage>309</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <surname>Dykes</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>MacEachren</surname>
            ,
            <given-names>A. M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kraak</surname>
            ,
            <given-names>M.-J.</given-names>
          </string-name>
          (
          <year>2005</year>
          )
          <article-title>Exploring geovisualization</article-title>
          .
          <source>Elsevier</source>
          ,
          <volume>710</volume>
          p.
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <surname>Coltekin</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Heil</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Garlandini</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Fabrikant</surname>
            ,
            <given-names>S. I.</given-names>
          </string-name>
          (
          <year>2009</year>
          )
          <article-title>Evaluating the effectiveness of interactive map interface designs: a case study integrating usability metrics with eye-movement analysis</article-title>
          .
          <source>Cartography and Geographic Information Science</source>
          ,
          <volume>36</volume>
          (
          <issue>1</issue>
          ), pp.
          <fpage>5</fpage>
          -
          <lpage>17</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <surname>Goldberg</surname>
            ,
            <given-names>J. H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kotval</surname>
            ,
            <given-names>X. P.</given-names>
          </string-name>
          (
          <year>1999</year>
          )
          <article-title>Computer interface evaluation using eye movements: methods and constructs</article-title>
          .
          <source>International Journal of Industrial Ergonomics</source>
          ,
          <volume>24</volume>
          (
          <issue>6</issue>
          ), pp.
          <fpage>631</fpage>
          -
          <lpage>645</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <surname>Alacam</surname>
            ,
            <given-names>Ö.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Dalci</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          (
          <year>2009</year>
          )
          <article-title>A usability study of WebMaps with eye tracking tool: the effects of iconic representation of information. In Human-Computer interaction</article-title>
          . New trends. Springer, pp.
          <fpage>12</fpage>
          -
          <lpage>21</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <surname>Cutrell</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Guan</surname>
            ,
            <given-names>Z.</given-names>
          </string-name>
          (
          <year>2007</year>
          )
          <article-title>What are you looking for?: an eye-tracking study of information usage in web search</article-title>
          .
          <source>In Proceedings of the Proceedings of the SIGCHI conference on Human factors in computing systems, ACM</source>
          , pp.
          <fpage>407</fpage>
          -
          <lpage>416</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <surname>Hammoud</surname>
            ,
            <given-names>R. I.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Mulligan</surname>
            ,
            <given-names>J. B.</given-names>
          </string-name>
          (
          <year>2008</year>
          )
          <article-title>Introduction to Eye Monitoring</article-title>
          .
          <source>In Passive Eye Monitoring</source>
          . Springer, pp.
          <fpage>1</fpage>
          -
          <lpage>19</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <surname>Delabarre</surname>
            ,
            <given-names>E. B.</given-names>
          </string-name>
          (
          <year>1898</year>
          )
          <article-title>A method of recording eyemovements</article-title>
          .
          <source>The American Journal of Psychology</source>
          ,
          <volume>9</volume>
          (
          <issue>4</issue>
          ), pp.
          <fpage>572</fpage>
          -
          <lpage>574</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <surname>Dodge</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Cline</surname>
            ,
            <given-names>T. S.</given-names>
          </string-name>
          (
          <year>1901</year>
          )
          <article-title>The angle velocity of eye movements</article-title>
          .
          <source>Psychological Review</source>
          ,
          <volume>8</volume>
          (
          <issue>2</issue>
          ), pp.
          <fpage>145</fpage>
          -
          <lpage>157</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <surname>Holmqvist</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Nyström</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Andersson</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Dewhurst</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Jarodzka</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Van De Weijer</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          (
          <year>2011</year>
          )
          <article-title>Eye tracking: A comprehensive guide to methods and measures</article-title>
          . Oxford University Press, 537 p.
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <surname>Enoch</surname>
            ,
            <given-names>J. M.</given-names>
          </string-name>
          (
          <year>1959</year>
          )
          <article-title>Effect of the size of a complex display upon visual search</article-title>
          .
          <source>JOSA</source>
          ,
          <volume>49</volume>
          (
          <issue>3</issue>
          ), pp.
          <fpage>280</fpage>
          -
          <lpage>285</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <surname>Fabrikant</surname>
            ,
            <given-names>S. I.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Hespanha</surname>
            ,
            <given-names>S. R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Hegarty</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          (
          <year>2010</year>
          )
          <article-title>Cognitively inspired and perceptually salient graphic displays for efficient spatial inference making</article-title>
          .
          <source>Annals of the Association of American Geographers</source>
          ,
          <volume>100</volume>
          (
          <issue>1</issue>
          ), pp.
          <fpage>13</fpage>
          -
          <lpage>29</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [13]
          <string-name>
            <surname>Ooms</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>De Maeyer</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Fack</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          (
          <year>2015</year>
          )
          <article-title>Listen to the Map User: Cognition, Memory, and Expertise</article-title>
          .
          <source>The Cartographic Journal</source>
          .
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [14]
          <string-name>
            <surname>Popelka</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Dedkova</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          (
          <year>2014</year>
          )
          <article-title>Extinct village 3D visualization and its evaluation with eye-movement recording</article-title>
          .
          <source>Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics).</source>
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          [15]
          <string-name>
            <surname>Noton</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Stark</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          (
          <year>1971</year>
          )
          <article-title>Scanpaths in saccadic eye movements while viewing and recognizing patterns</article-title>
          .
          <source>Vision Research</source>
          ,
          <volume>11</volume>
          (
          <issue>9</issue>
          ), pp.
          <fpage>928</fpage>
          -
          <lpage>929</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          [16]
          <string-name>
            <surname>Anderson</surname>
            ,
            <given-names>N. C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Anderson</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kingstone</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Bischof</surname>
            ,
            <given-names>W. F.</given-names>
          </string-name>
          (
          <year>2014</year>
          )
          <article-title>A comparison of scanpath comparison methods</article-title>
          .
          <source>Behavior research methods</source>
          , pp.
          <fpage>1</fpage>
          -
          <lpage>16</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          [17]
          <string-name>
            <surname>Bahill</surname>
            ,
            <given-names>A. T.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Stark</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          (
          <year>1979</year>
          )
          <article-title>The trajectories of saccadic eye movements</article-title>
          .
          <source>Scientific American</source>
          ,
          <volume>240</volume>
          (
          <issue>1</issue>
          ), pp.
          <fpage>108</fpage>
          -
          <lpage>117</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          [18]
          <string-name>
            <surname>Voßkühler</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Nordmeier</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kuchinke</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Jacobs</surname>
            ,
            <given-names>A. M.</given-names>
          </string-name>
          (
          <year>2008</year>
          )
          <article-title>OGAMA (Open Gaze and Mouse Analyzer): open-source software designed to analyze eye and mouse movements in slideshow study designs</article-title>
          .
          <source>Behavior research methods</source>
          ,
          <volume>40</volume>
          (
          <issue>4</issue>
          ), pp.
          <fpage>1150</fpage>
          -
          <lpage>1162</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>
          [19]
          <string-name>
            <surname>Dolezalova</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Popelka</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          (
          <year>2016</year>
          )
          <article-title>ScanGraph: A Novel Scanpath Comparison Method Using Visualization of Graph Cliques</article-title>
          .
          <source>Journal of Eye Movement Research</source>
          , In print.
        </mixed-citation>
      </ref>
      <ref id="ref20">
        <mixed-citation>
          [20]
          <string-name>
            <surname>Levenshtein</surname>
            ,
            <given-names>V. I.</given-names>
          </string-name>
          (
          <year>1966</year>
          )
          <article-title>Binary codes capable of correcting deletions, insertions, and reversals</article-title>
          .
          <source>Soviet physics doklady</source>
          ,
          <volume>10</volume>
          (
          <issue>8</issue>
          ), pp.
          <fpage>707</fpage>
          -
          <lpage>710</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref21">
        <mixed-citation>
          [21]
          <string-name>
            <surname>Needleman</surname>
            ,
            <given-names>S. B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Wunsch</surname>
            ,
            <given-names>C. D.</given-names>
          </string-name>
          (
          <year>1970</year>
          )
          <article-title>A general method applicable to the search for similarities in the amino acid sequence of two proteins</article-title>
          .
          <source>Journal of molecular biology</source>
          ,
          <volume>48</volume>
          (
          <issue>3</issue>
          ), pp.
          <fpage>443</fpage>
          -
          <lpage>453</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref22">
        <mixed-citation>
          [22]
          <string-name>
            <surname>Gross</surname>
            ,
            <given-names>J. L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Yellen</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          (
          <year>2005</year>
          )
          <article-title>Graph theory and its applications</article-title>
          . CRC press.
        </mixed-citation>
      </ref>
      <ref id="ref23">
        <mixed-citation>
          [23]
          <string-name>
            <surname>Dolezalova</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Popelka</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          (
          <year>2016</year>
          )
          <article-title>Evaluation of user strategy on 2D and 3D city maps based on novel scanpath comparison method and graph visualization</article-title>
          .
          <source>In Proceedings of the ISPRS</source>
          <year>2016</year>
          , Prague.
        </mixed-citation>
      </ref>
      <ref id="ref24">
        <mixed-citation>
          [24]
          <string-name>
            <surname>Snopková</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          (
          <year>2016</year>
          )
          <article-title>Tvorba a užití map osobami se sníženou schopností rozpoznání barev [Creation and use of maps by persons with impaired colour recognition]</article-title>
          . Brno, Masaryk University.
        </mixed-citation>
      </ref>
      <ref id="ref25">
        <mixed-citation>
          [25]
          <string-name>
            <surname>Pulkrtová</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          (
          <year>2016</year>
          )
          <article-title>Vliv červené barvy na vnímání atraktivity žen [The influence of the colour red on the perception of women's attractiveness]</article-title>
          . Brno, Masaryk University.
        </mixed-citation>
      </ref>
      <ref id="ref26">
        <mixed-citation>
          [26]
          <string-name>
            <surname>Hájková</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          (
          <year>2016</year>
          )
          <article-title>Eye tracking vyšetření predilekce očních pohybů u pacientů po cévní mozkové příhodě [Eye-tracking examination of eye-movement predilection in patients after stroke]</article-title>
          . Olomouc, Palacký University Olomouc.
        </mixed-citation>
      </ref>
      <ref id="ref27">
        <mixed-citation>
          [27]
          <string-name>
            <surname>Brus</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          (
          <year>2013</year>
          )
          <article-title>Uncertainty vs. spatial data quality visualisations: a case study on ecotones</article-title>
          .
          <source>International Multidisciplinary Scientific GeoConference: SGEM: Surveying Geology &amp; Mining Ecology Management</source>
          ,
          <volume>1</volume>
          , pp.
          <fpage>1017</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref28">
        <mixed-citation>
          [28]
          <string-name>
            <surname>Kubíček</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Šašinka</surname>
          </string-name>
          , Č.,
          <string-name>
            <surname>Stachoň</surname>
            ,
            <given-names>Z.</given-names>
          </string-name>
          (
          <year>2012</year>
          )
          <article-title>Uncertainty visualization testing</article-title>
          .
          <source>In Proceedings of the 4th Conference on Cartography and GIS</source>
          , Sofia,
          <string-name>
            <given-names>T.</given-names>
            <surname>Bandrova</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Konečný</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G.</given-names>
            <surname>Zhelezov</surname>
          </string-name>
          , eds.,
          <source>Bulgarian Cartographic Association</source>
          , pp.
          <fpage>247</fpage>
          -
          <lpage>256</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref29">
        <mixed-citation>
          [29]
          <string-name>
            <surname>Kučera</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          (
          <year>2016</year>
          )
          <article-title>Uživatelské testování a optimalizace vizualizací nejistoty prostorových dat [User testing and optimization of spatial data uncertainty visualizations]</article-title>
          . Olomouc, Palacký University Olomouc, 64 p.
        </mixed-citation>
      </ref>
      <ref id="ref30">
        <mixed-citation>
          [30]
          <string-name>
            <surname>MacEachren</surname>
            ,
            <given-names>A. M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Roth</surname>
            ,
            <given-names>R. E.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>O'Brien</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Li</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Swingley</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Gahegan</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          (
          <year>2012</year>
          )
          <article-title>Visual semiotics &amp; uncertainty visualization: An empirical study</article-title>
          .
          <source>IEEE Transactions on Visualization and Computer Graphics</source>
          ,
          <volume>18</volume>
          (
          <issue>12</issue>
          ), pp.
          <fpage>2496</fpage>
          -
          <lpage>2505</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref31">
        <mixed-citation>
          [31]
          <string-name>
            <surname>Brychtova</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Popelka</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Dobesova</surname>
            ,
            <given-names>Z.</given-names>
          </string-name>
          (
          <year>2012</year>
          )
          <article-title>Eye - Tracking methods for investigation of cartographic principles</article-title>
          .
          <source>In Proceedings of the 12th International Multidisciplinary Scientific GeoConference and EXPO</source>
          , Varna, pp.
          <fpage>1041</fpage>
          -
          <lpage>1048</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref32">
        <mixed-citation>
          [32]
          <string-name>
            <surname>Damerau</surname>
            ,
            <given-names>F. J.</given-names>
          </string-name>
          (
          <year>1964</year>
          )
          <article-title>A technique for computer detection and correction of spelling errors</article-title>
          .
          <source>Communications of the ACM</source>
          ,
          <volume>7</volume>
          (
          <issue>3</issue>
          ), pp.
          <fpage>171</fpage>
          -
          <lpage>176</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref33">
        <mixed-citation>
          [33]
          <string-name>
            <surname>Bron</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kerbosch</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          (
          <year>1973</year>
          )
          <article-title>Algorithm 457: finding all cliques of an undirected graph</article-title>
          .
          <source>Communications of the ACM</source>
          ,
          <volume>16</volume>
          (
          <issue>9</issue>
          ), pp.
          <fpage>575</fpage>
          -
          <lpage>577</lpage>
          .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>