           VisualEYEze: A Web-based Solution for Receiving
              Feedback on Artwork Through Eye Tracking

  Bailey Bauman, Regan Gunhouse, Antonia Jones, Willer Da Silva, Shaeeta Sharar, Vijay Rajanna,
                             Josh Cherian, Jung In Koh, Tracy Hammond
                    Sketch Recognition Lab. Dept. of Computer Science & Engineering
                              Texas A&M University, College Station, Texas
            bailey.bauman@tamu.edu, regangunhouse123@gmail.com, amjones503@tamu.edu,
      willerdasilva@tamu.edu, ssharar@tamu.edu, vijay.drajanna@gmail.com, jcherian14@tamu.edu,
                              rhwjddls@gmail.com, thammond@gmail.com
ABSTRACT
Artists value the ability to determine which parts of their composition are most appreciated by viewers. This information normally comes straight from viewers in the form of oral and written feedback; however, due to the lack of participation on the viewers' part, and because much of our visual understanding of artwork can be subconscious and difficult to express verbally, the value of this feedback is limited. Eye tracking technology has been used before to analyze artwork; however, most of this work has been performed in a controlled lab setting, and as such the technology remains largely inaccessible to individual artists who may seek feedback.

To address this issue, we developed a web-based system where artists can upload their artwork to be viewed by viewers on their own computers while a web camera tracks their eye movements. The artist receives feedback in the form of visualized eye tracking data that depicts which areas of the image were looked at the most by viewers. We evaluated our system by having 5 artists upload a total of 17 images, which were subsequently viewed by 20 users. The artists expressed that seeing eye tracking data visualized on their artwork, indicating the areas of interest, is a unique and highly useful way of receiving feedback. They also felt that the platform makes artists more aware of their compositions, something that can especially help inexperienced artists. Furthermore, 90% of the viewers expressed that they were comfortable providing eye movement data as a form of feedback to the artists.

Author Keywords
eye-tracking; visual data; artwork analysis; heatmaps

©2018. Copyright for the individual papers remains with the authors. Copying permitted for private and academic purposes.
WII'18, March 11, 2018, Tokyo, Japan

INTRODUCTION
The intention of an art piece is often as important as the execution. As such, artists value the ability to determine which areas of their composition resonate most with their audience, as it can hint at whether or not the artist's intentions are evident in the execution. In this regard, knowing not only where a viewer first looks within a piece of art but also how their eyes travel through the piece and which parts of the image receive the most attention is crucial to an artist, as this feedback can inform the development of future pieces of art. Traditionally, such feedback comes only in the form of oral and written comments, which can be limited by difficulties in expressing opinions fully in words and/or by the viewer's unwillingness to be overly critical. Furthermore, due to the qualitative nature of verbal feedback, it is difficult to collect and especially to compare very large samples. Eye tracking technology has been used before to analyze artwork and provide feedback for the use of artists. However, most studies that utilize eye tracking as an art analysis tool are performed in a controlled lab setting, and so far this technology is largely inaccessible to individual artists who may seek feedback. To rectify the lack of meaningful and authentic feedback, and the difficulty of acquiring specialized eye tracking hardware, we developed VisualEYEze, a web-based solution that allows artists to upload their work and receive immediate, complete, and unbiased feedback from viewers.

Our solution relies on feedback based on a viewer's eye movements, tracked with a web camera, as they view the artwork. Specifically, we used eye tracking data to quantify where viewers look on a piece of art and how long they focus on different parts of the artwork. VisualEYEze provides this data to the artist by showing a heat map of the eye movements over the artwork. While creating VisualEYEze, we focused on the usability of the system and tried to accommodate the requirements of both artists and viewers. The artist first logs into the system to upload a set of artwork as images and submits those images to be viewed. Viewers can then see the submitted artwork on a web interface while the interface tracks their eye movements as they view the images. A central database stores both the uploaded images and the corresponding eye tracking data used to generate the visualization. In this way, the system has the potential to positively impact numerous artists and the way their art is evaluated. Artists benefit from this system by being able to see how their art is interpreted. This will in turn help them
create art in such a way that it is more likely to be perceived in the way that they intended. Furthermore, since this is a web-based system, it is cost-effective: artists can easily reach a large number of viewers, who do not need to be physically present to view the work. Since a web camera is used for eye tracking, there is no need for specialized hardware, and viewers can view the images and provide feedback at their own convenience.

In order to determine the usability of the system and how useful artists may find the eye tracking data the system provides them, we conducted an evaluation through a multi-part study. In the first part, we tested the system by having five artists upload their artwork and twenty users view the work while we gathered eye tracking data. In the second part, the artists viewed the eye tracking data superimposed on their artwork and provided feedback on the usefulness of such data. Overall, we found that our system could provide artists with a unique perspective on their work, which could be useful in improving existing artwork or in beginning new projects. The system was determined to be easy to use, although the viewer participation aspect could be improved with a reward system. Because of the very low barrier to entry, nearly any computer user could participate in a new community that focuses on facilitating a new type of feedback for artists. We believe that our cost-effective and convenient solution, allowing artists to receive meaningful, unbiased feedback, will enable artists to reach a larger number of viewers and improve the quality of their work.

PRIOR WORK
Eye tracking has been used in a variety of application contexts such as interaction and accessibility [5, 19, 20, 21, 22], analytics [2, 4, 10, 23], and diagnostics [1, 9]. In this work, we specifically focus on previous research that leverages eye tracking for the analysis of paintings and on the various visualization techniques for presenting eye movement data. Thus, the prior work can be categorized into three groups: 1) analysis of paintings with eye tracking, 2) visualization techniques for eye movement data, and 3) web-based eye tracking analytics.

Analysis of Paintings with Eye Tracking
With regard to artwork, eye tracking has previously been used to identify areas of interest as perceived by the viewer. This has involved studying how people view indeterminate art. In a study conducted by Wallraven et al. [28], viewers were asked to determine whether or not specific forms were present in the paintings. The resulting fixation data was used to identify the areas that seemed to most resemble those figures. The work demonstrated how eye tracking may be used to validate the aesthetic decisions of an artist. Santella et al. [26] used eye tracking to abstract photographs into painterly renderings. The eye tracking data collected was used to identify the most important focus areas in a piece; this information determined which areas of the work were more or less abstracted. Yanulevskaya et al. [30] successfully used a Bag-of-Visual-Words technique, computer vision, and eye tracking information to analyze areas of artwork with high emotional contribution and to confirm a positive visual bias when viewing artworks. This demonstrates how aesthetic techniques can be tested on a broader scale.

Eye tracking has also been used to identify which aspects of an artwork are most influential in viewing behavior. When human subjects are present in the work, the subject matter itself seems to have the larger impact on viewing behavior. However, when the subject matter is a landscape, technical aspects of the work seem to be the most important [15]. In this space, eye tracking studies by Villani et al. [27] have broadened our understanding of how viewers look at human figures. In this study, participants focused on the faces and arms of figures in social interactions: they focused on arms in social contexts and on faces in individual contexts. Empathic Concern had an effect on the examination of faces in social scenes, while Perspective Taking had more of an effect on the examination of arms in social scenes. Participants who felt emotional concern looked at the faces more immediately, while participants who showed that they could take another person's perspective went more immediately to the arms. This work demonstrated how eye tracking is useful in determining the interpretation of artwork. In our work, we analyze similar information in order to provide it directly to the artist, so that they can compare their aesthetic choices with the viewing impact.

Research utilizing eye tracking has also been done to determine what impact a change to a visual composition element has on the eye tracking scan path. It has been determined that a change in the size, shape, color, contrast, proportions, or orientation of visual elements in a painting causes a corresponding change in the viewer's scan path across the artwork. This is helpful information to provide to artists, and we allow artists to upload artwork for viewing so they can see firsthand how differences between versions of their pieces change the resulting scan paths [13, 18]. Clare Kirtley [11] analyzed composition as a feature in artwork using eye tracking data. The research looked at whether participants viewed art as suggested in Andrew Loomis' guide to composition. The participants were novices and were told to view pieces of Loomis' art that followed specific rules of composition laid out by Loomis. The collected data was then compared with the original frameworks to see if participants followed the expected scan paths and fixation points. The study noted that participants spent a lot of time looking at focal points as they were pointed out. It did not find evidence that the top of the image was the preferred exit point, nor a significant correlation between the paths taken and the "ideal path" laid out by Loomis. As such, it concluded that composition is important in determining focal points, but that the flow is variable and needs further research.

Research has also been conducted in the comic book sphere, leveraging eye tracking data to determine whether comic book artists are successful in their goal of leading viewers' gaze. A primary organizing principle used by artists to lay out the components of comics is to lead the viewer's attention along a deliberate route so that the viewer does not get lost or confused about the story. Research has found that there is increased consistency in viewers' eye movements when looking at comic books compared to looking at pictures that
were not created with the intent of directing viewer attention. This helps to show that consistency in viewer fixation on a certain spot in an image can imply success by the artist, if that spot was what they meant to be focused on. In our research, we analyze similar information and compare it to the artist's desired outcome to see how accurately they were able to achieve their desired focus area and scan path.

Visualization Techniques for Eye Movement Data
Most traditional visualization methods for data have limited capabilities that do not support the spatial and temporal structure of eye tracking data well. However, there have been studies that evaluate the different visualization methods and their suitability for eye tracking data. A study by Andrienko et al. [3] states that when working with eye tracking data, each visualization method has a specific context of applicability. Based on this information, we determined that heat maps and a map of the trajectory of the eyes would be the most useful for our application. Also, Kurzhals [12] found that aside from heat maps and fixation points, the general presentation of gaze data is limited. Most methods for analyzing eye tracking data involve first dividing the data points into saccades and fixations. Different methods for classifying fixation points have been compared on multiple factors, including their effectiveness; several of these methods have been explored and documented [25]. There have been some questions regarding the accuracy and reliability of eye tracking software in general. For example, an eye tracker typically only records where the user's cornea is directed and does not take into consideration the peripheral vision of the user. This discrepancy in eye tracking accuracy is particularly detrimental when analyzing task-based eye tracking data, such as browsing a website or navigating an interface. However, it has been found that users do not make much use of their peripheral vision when looking at artwork, so the direction the cornea is pointing is a reliable metric for that specific case [8].

Mayer et al. [16] evaluated graphical teaching techniques with eye tracking data and comprehension tests to determine how effective these various teaching methods are. The results showed that eye tracking fixations are correlated with better comprehension. This can be useful in understanding the importance of fixations and how they relate to visual information. Massaro et al. [14] collected eye tracking data from distinctly different types of pictures to determine how viewers react to different types of artwork. It has been found that if there are people represented in a work of art, the viewer will focus a disproportionate amount on their faces rather than on the rest of the picture. However, if the artwork is purely of nature, people tend to look all around the picture in no clear pattern. In this sense, our goal was for artists to find out more about the patterns in people's viewing habits.

Another active area of research where eye tracking data is commonly used is usability evaluation. Because gaze location can help researchers identify the focus of an individual on visual information, it is indispensable when it comes to understanding the cognitive processes of users interacting with graphical content. The processes for using eye tracking data for usability evaluation set a precedent for how to determine the connection between raw data and a user's thoughts. This can be adapted to artwork, as the idea of visual attention and cognitive load is relevant to that space as well [7, 24]. Calculating heat maps based on eye tracking data is an integral part of our project. However, the conventional algorithm to determine the heat maps is fairly slow, with a Θ(n²) running time. In the past, researchers have implemented a faster parallel algorithm to calculate heat maps; computing a heat map in parallel can drastically improve performance by taking full advantage of the powerful CPU of a modern computer [6].

Web-based Eye Tracking Analytics
EyesDecide is an online web application built for market researchers and design specialists (https://www.eyesdecide.com/, last accessed Dec 16th 2017). It allows people to upload various forms of media, such as an image or a URL. It then creates a user study in which a set of questions can be asked or some media can be displayed. The study can then be shared with users via a generated link. Users can then look at the media, and EyesDecide collects information about where they look based on eye tracking data from a web-based computer camera using the eye tracking API xLabs. The system also generates videos of the target audience visually exploring interactive content, generates aggregated heat maps, and analyzes how different subgroups look at and interact with different versions of the designs and content. Our application, while similar to EyesDecide, aims to serve the art community in ways similar to how EyesDecide serves market researchers.

DESIGN MOTIVATION
In order to make eye tracking technology available to artists, we wanted to design a system that allows artists to upload their work and receive instant feedback. To realize our goal, we created a web application that is accessible to any artist and allows them to easily and instantly share their artwork with anyone and receive feedback. At any time, an artist may check what feedback has been collected for the work they have uploaded. Users visit the same website to view artwork. While viewing artwork, eye tracking is performed on the viewers using a web camera attached to their computers.

In our system, artists and viewers represent two separate entities. An artist is a user who uploads artwork on which they would like to receive feedback. Each artist has an account to manage their uploaded works and view their feedback; this account allows artists to access all the data collected on their work and to analyze it with a number of provided techniques. A viewer is a user who looks at the artwork and provides the eye tracking feedback. Viewers are not required to make an account and can view work freely. While artists and viewers represent separate roles in our system, any user can perform both roles if they so choose. To provide more details about the functionality of our system, below we present the available features by user role.

Artists:
Figure 1. System Architecture: the image depicts the interactions among the three modules: 1) Artist's Interface, 2) Viewer's Interface, and 3) Database

• Create accounts to manage all of the work that they upload and the related feedback

• Upload artwork that they wish to receive feedback on to their accounts

• Set the defining information of their artwork, such as the desired focal point and the title of the work

• Edit or delete uploaded artwork

• Choose to share artwork for feedback by distributing a link to view that work directly

• View aggregate and individual heat maps generated from viewers' gaze data

• View video progressions of individual gaze data representing viewing order

• View fixation points of individual and aggregate gaze data

• View any written feedback provided by viewers who saw their artwork

• Compare gaze data to their desired focal point

• Compare their own work to common artwork compositions

Viewers:

• Agree to participate in eye tracking

• View a slide-show of a subset of uploaded artwork

• Provide gaze data while viewing artwork by having a web camera enabled

• View their fixation points on each artwork after the slide-show is completed

• Optionally provide written feedback on individual works of art based on their fixation points

Creating a system where the roles of artist and viewer are separate allows artists to have control over what work they would like feedback on, and gives them direct access to that feedback. In this way, a viewer is merely a subject of eye tracking, and as such they do not have to perform any complicated set-up or installation in order to contribute eye tracking data. It was a goal for our system to be intuitive in order to gather the most data. Because everything is accessed through a web page and the eye tracking is performed with a standard computer camera, most personal computer users are able to participate either as an artist or as a viewer. Any artist who wishes to get information about their work can do so by uploading their art and waiting until it is viewed by the public. Furthermore, if an artist desires to speed up the process, they can manually share a link to their artwork to get instant feedback.

Architecture
VisualEYEze comprises three main modules: 1) Artist's Interface, 2) Viewer's Interface, and 3) Central Database. The interaction between these modules is shown in Figure 1.

IMPLEMENTATION
We used a number of tools and frameworks to develop our system. We used Ruby and Ruby on Rails to build the web application and used Bootstrap and CSS for styling. To store the data from the artists and viewers, we used a PostgreSQL database with several tables, depicted in Figure 2, that store the data related to the artists, heatmaps, pictures from artwork uploads, and feedback.

Open Source Components
We integrated several open source components into our system to achieve the desired functionality. The two primary open source modules we used are Webgazer.js for eye tracking and Heatmap.js for visualization of eye tracking data.

Webgazer.js
Webgazer is an open source JavaScript application developed by Brown University researchers. Webgazer is the basis of our
Figure 2. Backend database wireframe diagrams

eye tracking system, gathering all of the gaze information from the viewers. Webgazer utilizes the web camera built into a viewer's laptop. It requires minimal calibration from viewers (we use 9 calibration points on the screen) and does not require any special lighting. Additionally, Webgazer is scalable, and since it is open source it allows our platform to be extremely accessible to all users [17].
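As an illustration of the gaze-collection step described above, the following sketch uses Webgazer's documented `setGazeListener`/`begin` pattern; everything else (the `samples` buffer, the coordinate normalization, and the `artworkId` bookkeeping) is our own illustrative scaffolding, not code from the paper:

```javascript
// Buffer of raw gaze samples for the artwork currently on screen.
const samples = [];

// Convert a viewport-space gaze prediction into coordinates relative to
// the artwork image, so heat maps survive scrolling and window resizing.
function toImageCoords(prediction, rect) {
  return {
    x: (prediction.x - rect.left) / rect.width,  // 0..1 across the image
    y: (prediction.y - rect.top) / rect.height,  // 0..1 down the image
  };
}

// Wire up Webgazer (browser only): every prediction is stored with its
// timestamp so it can later be sent to the server with the artwork id.
function startTracking(imageEl, artworkId) {
  webgazer.setGazeListener((data, timestamp) => {
    if (data === null) return; // no face/eyes detected for this frame
    const p = toImageCoords(data, imageEl.getBoundingClientRect());
    samples.push({ artworkId, t: timestamp, x: p.x, y: p.y });
  }).begin();
}
```

Normalizing to the image rectangle means the stored points can be rescaled to the original image dimensions when the heat map is rendered for the artist.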
Heatmap.js
Heatmap.js is an open source JavaScript application that creates heat map visuals from data points. This software allows us to generate heat map visualizations of the eye tracking data collected via Webgazer. Heatmap.js is extremely scalable, and each heat map can hold 40,000+ data points, making it ideal for the aggregated heat maps that are displayed to artists [29].

User Interface
The user interface comprises two main components: 1) the artist's interface and 2) the viewer's interface.

Artist's Interface
The artist logs into VisualEYEze and is directed to the dashboard shown in Figure 3.

Figure 3. Artist Dashboard - shows the artist all of their uploaded works. They can click on any work to get more detailed information regarding that piece.

This dashboard shows all of the images previously uploaded by the artist. When the artist clicks on an image, they are taken to a page that shows all of the relevant information for that artwork, as shown in Figure 4.

Figure 4. This is the detailed artwork page that is displayed after clicking an image on the dashboard. This is the main page to interact with the feedback for an artwork.

Furthermore, the artist is able to update the artwork's title, delete an artwork and all associated data, and also set a desired focal area, as shown in Figure 5.

Figure 5. Setting a focal area - to set a focal area, the artist clicks and drags an area on the image that they believe should be the focal area.

As shown in Figure 6, the artist can also view various composition frameworks on their art.

Once an artist receives feedback from viewers on a piece, they can compare the viewers' perceived focal point to the desired one that they set (see Figure 7). The heat map, fixation points, and focal area are overlaid onto the artist's image to assist in the comparison.

The composition frameworks page allows artists to see different general rules for choosing where the subjects of a composition are located. Each composition framework displays a description of the framework and where the focal areas and objects should be located, and also overlays the framework on the image so artists can easily see how it is represented in their artwork. We include three of the most
common frameworks: the rule of thirds, bisection, and the golden section.

Figure 6. Composition framework - artists can click on the different frameworks to see how their desired focal area lines up with traditional methods for choosing the location of the focal point.

The first heatmap that artists see on the detailed artwork page is the aggregated heatmap (see Figure 4). This heatmap combines the viewing data of all users who have seen the artwork, with the artist's set focal area overlaid on it. Artists also have the option to hide the heatmap overlay so they can see the image more clearly if they desire. Furthermore, to let the artist see how each individual viewed the artwork, the page lists the individual heatmaps generated from each viewer's feedback (see Figure 4). Each of these heatmaps offers a video progression that replays how the heatmap was created as the viewer looked at the art, with the points the user looked at first appearing first.

Also shown on these individual heatmaps are the user's extracted fixation points on the artwork (see Figure 7). These points are displayed as yellow dots overlaid on the artwork. In eye-tracking analysis it is common to classify raw gaze data into fixations and saccades: saccades are rapid eye movements, while fixations are periods during which the viewer focuses on one spot. To identify fixations in our own data, we implemented a dispersion threshold algorithm, which has been shown to be comparably robust and accurate to other classification methods while remaining efficient and simple to implement [25].

Figure 7. Individual heatmap example - fixation points, focal area, and heatmap are overlaid on the artist's image.

Viewer's Interface
The viewer's interface is simple and does not require a login. Using the URL shared by the artists, the user visits the website and clicks on "Participate as User" on our main page to begin the viewing process (see Figure 8).

Figure 8. About Page - this page gives a description of our project and related sources.

The user is then guided through a sequence of instructions asking for consent to participate in the experiment and explaining how the camera needs to be set up for calibration. Figure 9 shows the consent form, where the user is explicitly informed that the web camera will be used to track their eyes and that no video of their face will be recorded.

Figure 9. Consent Page - viewers have to consent to using their webcam before they are allowed to participate. This allows us to make sure viewers are aware that their webcam will be used.

After consenting, the user is shown a page with a set of instructions explaining the calibration process and how the experiment will progress, as shown in Figure 10.

To begin calibrating the web camera for the user's eyes, we first ask the user to make sure that the camera can see their face.
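Calibration pairs each on-screen target the user clicks with the gaze estimate recorded at that moment, which lets the tracker fit its prediction model. A minimal sketch of generating the nine targets of such a grid in normalized screen coordinates (the function name and margin value are illustrative assumptions, not our exact implementation):

```python
def calibration_targets(margin: float = 0.1) -> list[tuple[float, float]]:
    """Nine calibration points on the vertices of a 3x3 grid.

    Coordinates are normalized to [0, 1]; `margin` insets the outer
    points so no target sits on the very edge of the screen.
    """
    stops = [margin, 0.5, 1.0 - margin]  # left/top, center, right/bottom
    return [(x, y) for y in stops for x in stops]

targets = calibration_targets()
# 9 targets covering the whole screen; (0.1, 0.1) is the top-left dot
```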
Figure 10. Description Page - this page describes what the process for viewing the art will be like.

This face-check step is shown in Figure 11. We then have the user calibrate the camera so it can accurately detect where the user is looking on the screen. To proceed with calibration, the user clicks on the "start" button (see Figure 12). During calibration, the user is shown 9 dots placed on the vertices of a 3×3 grid that covers the entire screen (see Figure 13).

Figure 11. Check Face Location Page - viewers are asked to make sure the green lines are outlining their face accurately, which will ensure accurate eye-tracking data.

Figure 12. Calibrate Start Page - users can read about how long to look at the calibration points; when they are ready, they click begin.

Figure 13. Calibrate - once viewers click begin, they are given 9 dots around the screen to click on in succession in order to calibrate the webcam.

Once the calibration is complete, the user gets a countdown of 3 seconds and then the slide-show begins, in which the user is shown at most 10 images for 10 seconds each. After the user finishes viewing all the images, she is directed to a page that asks her to provide additional, optional written feedback on the images. This written feedback page shows the images the user just saw (see Figure 14). On clicking an image, the user is shown the points she fixated on so that she can tell the artist, through written feedback, why she fixated on those points (see Figure 15).

Figure 14. Main page for written feedback - all artworks that the viewer saw will be displayed on this screen. Viewers can choose none, one, or many artworks to give written feedback on.

Figure 15. Individual feedback page - viewers can provide feedback on this page and see their fixation points. This allows viewers to state why they were fixating on certain areas on the artwork.

In order to provide sufficient feedback for each work of art submitted, we needed to keep the distribution of feedback as uniform as possible. To achieve this, we designed an algorithm that selects artwork for the slide show based on how little feedback it has received. This is a greedy algorithm: each time a viewer uses the platform, we prioritize the artwork that has received the least feedback, and the amount of feedback associated with each artwork is the only heuristic the algorithm uses. The algorithm first selects the artworks that have no feedback associated with them and adds them to the slide show.
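This selection strategy amounts to sorting artworks by their current feedback count and taking a prefix; a sketch under assumed names (the platform's actual code may differ):

```python
def select_for_slideshow(feedback_counts: dict[str, int], k: int = 10) -> list[str]:
    """Greedily pick the k artworks that have the least feedback.

    `feedback_counts` maps an artwork id to the number of viewers who
    have already given feedback on it. Sorting dominates the cost, so
    selection takes Theta(n log n) time for n artworks; ties are broken
    arbitrarily (here, by the stable sort's input order).
    """
    ranked = sorted(feedback_counts, key=feedback_counts.get)  # least feedback first
    return ranked[:k]

picks = select_for_slideshow({"a": 2, "b": 0, "c": 1, "d": 0}, k=3)
# "b" and "d" (no feedback yet) are chosen before "c" (one review)
```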
If there are at least as many artworks with no feedback as slots in the slide show, the algorithm terminates at that point. Otherwise, we add the images with the least feedback until the slide show is full. The algorithm breaks ties arbitrarily. Its purpose is to pull artwork from the database in such a way that the distribution of artworks between artists is mostly uniform. Selection costs Θ(n log n) time, where n is the number of artworks in the database. The greedy strategy ensures that artwork with little to no feedback is chosen for the slide show.

VALIDATION STUDY
We began initial evaluation of our system by conducting viewing tests on selected areas of the artworks. This initial evaluation was conducted to test the accuracy of the gaze prediction software we were using. To do this, we selected a set of images and designated certain areas as the region of interest by drawing rectangles around those regions. These images were then shown to a set of participants as a slide-show, and the participants were instructed to only look inside the rectangular regions as we performed eye tracking. A total of 6 participants took part in the study, and these studies were conducted in a number of different settings with different lighting conditions in order to understand how robust the system was. Each participant was tested once in each setting. We viewed the results of this preliminary experiment by generating heat maps for each viewer and artwork and comparing the hot spots of each heatmap to the set region of interest on the artwork. The results show that the eye tracking software is fairly accurate under good conditions. However, there were some instances when the eye tracking was entirely off. We attribute this to the system having difficulty calibrating or locating the viewer's pupils under bad lighting conditions.

EXPERIMENT DESIGN
Following the preliminary validation study, we conducted a more comprehensive study. To avoid the inaccuracies and inconsistencies we observed in the preliminary study, the final evaluation was conducted in a lab setting. All the participants took part in the study on the same computer and under the same lighting conditions in order to maintain consistency. To fully test our system, we conducted evaluations for both the artist and viewer roles. To do this, we brought in artists to upload their work in the first phase, and the same artists were invited back to view the eye tracking feedback. Users were brought in to view the uploaded artworks, and we tracked their eye movements while they viewed the artworks.

Artist Study - Part 1
Our artist participants consisted of 5 individuals (4 female, 1 male). All participants had over 5 years of experience in art and focused on various mediums including painting, photography, drawing, and linoprint. Each artist uploaded 2 to 4 works of art to our system, resulting in a pool of 17 works of art in total. These works were used in the remainder of our studies. The artists were asked to create an account, upload their artworks on a live version of our website, and set the focal point for each work. They were then allowed to browse the dashboard leisurely. Once they were finished, they were asked to fill out a short survey regarding the usability of the system based on their interactions up to this point.

Artwork Viewing Study
Once all artists had uploaded their original artworks, users were brought in individually to view the work and provide eye tracking data. This was done using a live version of the website. In total, we had 20 viewers provide eye tracking data (5 females, 15 males). First, the users calibrated the camera for eye tracking, which preceded the artwork viewing session. Once calibration was complete, each user viewed a timed slide show of ten images, each shown for ten seconds. In order to ensure each artist got enough feedback, only images with the least number of views, selected using the greedy algorithm, were shown in each slide show. While viewing these images, gaze data was collected on the users' viewing behavior. Once the entire slide-show of artwork was viewed, users were asked to look at their fixation points for each piece to rate the accuracy of our system, and were given the option to provide written feedback to the artist. At the end of the study, viewers were asked to take a short survey about the usability and accuracy of our system.

Artist Study - Part 2
After the viewers contributed the eye tracking data, we presented the data to the artists through our artist dashboard. The same five artists from our first artist study were asked to come back for a second session. They were asked to log in and review the data on their individual artworks. For each artwork, the artists were asked to look through their aggregate and individual heat maps, as well as any written feedback, if present. They were also asked to compare the fixation points received to their set focal points. Finally, they were asked to look through the composition frameworks and compare those intersections with the perceived focal points and fixation data. After they completed these tasks, we conducted a final interview. They were asked a few questions focused mainly on whether they found the eye tracking information to be novel and useful. This included whether or not they received unexpected data, whether the platform seemed novel, what feedback features they found the most effective, and so on.

RESULTS
We conducted surveys at each part of the artist study, as well as after the artwork viewing study. Furthermore, semi-structured interviews were conducted with the artists after they completed the second part of the artist study. All the studies were focused on understanding whether the system provides the usability that artists and viewers expect, and whether both artists and viewers accept the accuracy of the gaze data visualization on the artworks.

Artist Studies
The surveys following the first part of the artist study showed that all participants found it easy to navigate and upload artwork to the system. To get a better understanding of the utility of our system, we asked artists more detailed questions about their experience after they were able to see the eye
tracking feedback in the second part of the artist study. Four of the five artists interviewed mentioned that the system was user friendly. One artist suggested the system may be difficult to navigate at first, while another thought that having more control over the feedback types might add to the usability. When asked what features they found unnecessary or distracting, only one artist responded that they did not understand the purpose of the composition overlays. To understand if our tool stood out among existing methods, we asked the artists to describe anything they felt was unique to the system. Two of the artists described the whole system as unique, while the others singled out the eye tracking in general or the heat map and video progression features. We also asked artists to describe what they found useful about this platform. Each artist mentioned that they found it useful to see where viewers had looked. One artist mentioned that the system gave them a different perspective on their art and let them see where viewers looked, and in what order, to determine where viewers kept coming back to on the image.

Furthermore, all the artists mentioned that they got surprising results in some way. Some artists found that while viewers generally looked where they expected, viewers also jumped to areas that weren't intended to draw attention as often. Artists also commented that people seemed to look at an object in their art that wasn't supposed to be the main focal area. We also wanted to get the artists' opinions on how this platform could contribute to the art community. Two artists thought that the platform would make artists more aware of their compositions and make them think more about what actually draws viewers in. They thought this would help artists learn to direct the eyes of the viewer, something that would be especially beneficial to inexperienced artists. Another artist thought this would give a better mindset when starting to develop artwork and would make one think more about how to set up a photograph or drawing. This artist also saw the potential to upload versions of a piece: see initially where people look, make a change, and then check it again. We also asked the artists for ideas that could be added to the platform to improve it further. One artist recommended an accuracy benchmark for the data so that they could know how accurate the eye tracking was for each viewer. Another artist thought it would be useful to implement a feature to predict focal points on the image. One other artist would like to let viewers flag the areas on the images they viewed that they found most interesting and provide comments on those flagged areas. We also received a request to make the images display for longer than 10 seconds each.

Viewing Studies
After viewers participated in the eye-tracking portion of the platform, we conducted a short survey to get feedback on the system. One aspect we were concerned with was accuracy, since we used a web camera instead of traditional infrared camera hardware. Users were shown the fixation points of their gaze data for each image that they viewed and were asked to rate the accuracy of the system on a five-point scale (see Figure 16). The feedback showed that 11 participants thought the system was accurate to some degree, 7 thought it was inaccurate, and 2 were unsure.

Figure 16. Question: How well do you think the software reflects where you were looking on the artwork?

Another concern was whether people would be comfortable having eye tracking performed on them with this type of platform. No participants claimed to be uncomfortable with it, and 18 reported being completely comfortable (see Figure 17).

Figure 17. Question: Were you comfortable providing eye tracking data via a web-camera?

We also wanted to know if viewers would use this type of platform on their own time to provide feedback to artists. These results were split, with half of the participants stating that they would use it and the other half stating that they wouldn't. However, fifty percent is still quite high for this type of platform and would be a huge help to artists and the art community.

Because this would ideally be an improvement on existing forms of feedback, we asked participants whether they thought eye tracking feedback was preferable to providing written or oral feedback. Thirteen users said eye tracking was preferable, six were indifferent, and one preferred the traditional methods (see Figure 18). When asked to rate the usability of the system on a five-point scale (one being easy and five being difficult to use), 35% rated it at 1, 30% at 2, 10% at 3, 15% at 4, and 10% at 5 (Figure 19).
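On the 20-viewer sample, these percentages correspond to concrete counts and a weighted mean rating; a quick derivation (an illustration of the reported distribution, not additional survey data):

```python
# Usability ratings on a five-point scale: 1 = easy ... 5 = difficult.
percentages = {1: 0.35, 2: 0.30, 3: 0.10, 4: 0.15, 5: 0.10}
n_viewers = 20

# Convert each percentage back into a participant count.
counts = {rating: round(p * n_viewers) for rating, p in percentages.items()}
# Weighted mean of the ratings.
mean_rating = round(sum(rating * p for rating, p in percentages.items()), 2)

print(counts)       # {1: 7, 2: 6, 3: 2, 4: 3, 5: 2}
print(mean_rating)  # 2.35 -- on average, closer to "easy" than "difficult"
```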
Figure 18. Percentage of the participants who preferred providing gaze-based feedback over written or oral feedback.

Figure 19. Usability of the system on a five point scale (one being easy and five being difficult to use)

DISCUSSION
From the user studies, a strong finding was that artists and viewers alike found the system easy to use. For artists, who were more involved with the system as they had to create accounts, upload work, and use a variety of visualizations and features, this was nearly unanimous. They found that all the system features were accessible and that, to their knowledge, much of the system was unique, especially where it pertained to eye tracking data. Given that all the artists reported having five or more years of experience and rated their ability level at or above average, this is a strong indicator that no other commonly used tools provide the same kind of insights as our eye tracking framework. Every artist interviewed mentioned the usefulness of seeing the viewer's gaze data, and many offered unique suggestions as to how that may be helpful. From verifying artistic choices, to choosing how to improve future work, to comparing two versions of the same piece, the participants could easily imagine how they might find this data useful in their work. This agrees with the result that each artist claimed they would use the system on their own time to get feedback if it were available. Overall, the artists who participated in our study found the platform a novel and useful way to gather feedback about their art.

Based on the viewer survey, most participants found the system easy to use and preferable to traditional forms of providing feedback to artists. In fact, 50% of viewer participants reported that they would be willing to use this type of system on their own time. This indicates that the system could feasibly have a sizable viewer user base, which is important because the system would not be helpful to artists without anyone willing to provide eye-tracking data. One concern that needs further discussion is the accuracy of the system. Seven of the twenty viewers thought that the system did not accurately record their gaze data, while two were uncertain. These judgments were based on the viewers' opinions after they saw the fixation points calculated for their gaze data on each artwork. This could mean that the gaze data was in fact inaccurate, that the fixation points were not an intuitive reflection of their gaze data, or that the users themselves were not certain of their own gaze patterns. More testing of the accuracy of our gaze prediction and fixation classification would be needed to improve these results. One artist also suggested that some form of accuracy benchmark for each viewer be given to the artist so that they can understand how to interpret the data.

CONCLUSION AND FUTURE WORK
VisualEYEze is a considerable step in the right direction for the art community. The eye tracking feedback in our platform provides artists with knowledge of what viewers are actually looking at and focusing on, which in turn can positively influence the future work an artist creates. The service is also accessible to almost anyone, because most modern computers have a built-in web camera, which is what our eye tracking software relies on. Overall, the system was positively received by both artists and viewers. The artists enjoyed using the platform and found it useful to have eye tracking data recorded from viewers looking at their artwork. However, based on the results, viewers don't have as much incentive to use the platform as artists do, which was an expected outcome.

As part of future work, we would include ways to incentivize viewers to view artwork. Prior applications of this nature offer ways for the viewer to follow certain artists or view genres of artwork they enjoy. The concern with the latter is that it would skew the data toward popular artists getting more traction than newer artists; we would have to balance these features to make sure even smaller artists get feedback through the random slide-show viewing that is currently in place. Additionally, users are often offered monetary compensation for their time, which services typically counteract by charging artists a fee to use them. This was not the intention of our application and as such we didn't offer this, but it would be interesting to see our application in this light.

Furthermore, we would like to support longer gaze sessions by offering longer viewing periods, so that the user could choose the amount of time that fits their schedule. We could also have users mark where they felt they looked the most, beyond the tracking data, and add an admin portal for managing accounts. We often noted viewers misinterpreting the fixation points; to alleviate this we could show them the heat maps as well. Accuracy was a big concern on both ends. The compromise of using web cameras is the accuracy hit the data takes compared to dedicated eye tracking hardware. This was a choice made explicitly to fit our goal of accessibility for all: we expected the accuracy hit to be outweighed by artworks actually being viewed multiple times thanks to the more accessible website.
REFERENCES
1. Folami Alamudun, Hong-Jun Yoon, Kathleen Hudson, Garnetta Morin-Ducote, Tracy Hammond, and Georgia Tourassi. 2017b. Fractal Analysis of Visual Search Activity for Mass Detection During Mammographic Screening. Medical Physics 44-3 (February 21, 2017), 832–846. ISSN: 2473-4209, http://dx.doi.org/10.1002/mp.12100.
2. Folami T Alamudun, Tracy Hammond, Hong-Jun Yoon, and Georgia D Tourassi. 2017a. Geometry and Gesture-Based Features from Saccadic Eye-Movement as a Biometric in Radiology. In International Conference on Augmented Cognition. Springer, 123–138.
3. G. Andrienko, N. Andrienko, M. Burch, and D. Weiskopf. 2012. Visual Analytics Methodology for Eye Movement Studies. IEEE Transactions on Visualization and Computer Graphics 18, 12 (Dec 2012), 2889–2898. DOI: http://dx.doi.org/10.1109/TVCG.2012.276
4. Wolfang Becker and Reinhart Jürgens. 1979. An analysis of the saccadic system by means of double step stimuli.
11. …of the Arts 0, 0 (0), 0276237417693564. DOI: http://dx.doi.org/10.1177/0276237417693564
12. Kuno Kurzhals, Brian Fisher, Michael Burch, and Daniel Weiskopf. 2014. Evaluating visual analytics with eye tracking. In Proceedings of the Fifth Workshop on Beyond Time and Errors: Novel Evaluation Methods for Visualization (BELIV '14). ACM, Paris, France, 61–69.
13. Paul Locher. 2006. The usefulness of eye movement recordings to subject an aesthetic episode with visual art to empirical scrutiny. Psychology Science 48, 2 (2006), 106.
14. Davide Massaro, Federica Savazzi, Cinzia Di Dio, David Freedberg, Vittorio Gallese, Gabriella Gilli, and Antonella Marchetti. 2012a. When Art Moves the Eyes: A Behavioral and Eye-Tracking Study. PLoS ONE. PLOS, California, US, Article 7, 16 pages.
15. Davide Massaro, Federica Savazzi, Cinzia Di Dio, David
    Vision research 19, 9 (1979), 967–983.                                 Freedberg, Vittorio Gallese, Gabriella Gilli, and
                                                                           Antonella Marchetti. 2012b. When Art Moves the Eyes:
 5. Andrew Duchowski. 2007. Eye tracking methodology:                      A Behavioral and Eye-Tracking Study. PLOS ONE 7, 5
    Theory and practice. Vol. 373. Springer Science &                      (05 2012), 1–16. DOI:
    Business Media, London, UK.                                            http://dx.doi.org/10.1371/journal.pone.0037285
 6. Andrew T. Duchowski, Margaux M. Price, Miriah Meyer,               16. Richard E. Mayer. 2010. Unique contributions of
    and Pilar Orero. 2012. Aggregate Gaze Visualization with               eye-tracking research to the study of learning with
    Real-time Heatmaps. In Proceedings of the Symposium                    graphics. Learning and Instruction 20, 2 (2010), 167 –
    on Eye Tracking Research and Applications (ETRA ’12).                  171. DOI:
    ACM, New York, NY, USA, Article 2, 8 pages. DOI:                       http://dx.doi.org/10.1016/j.learninstruc.2009.02.012
    http://dx.doi.org/10.1145/2168556.2168558
                                                                       17. Alexandra Papoutsaki, Patsorn Sangkloy, James Laskey,
 7. Joseph H Goldberg and Anna M Wichansky. 2002. Eye                      Nediyana Daskalova, Jeff Huang, and James Hays. 2016.
    tracking in usability evaluation: a practitioners guide. In            WebGazer: Scalable Webcam Eye Tracking Using User
    To appear in: Hyönä. North Holland, Amsterdam,                         Interactions. In Proceedings of the Twenty-Fifth
    Netherlands, 1–22.                                                     International Joint Conference on Artificial Intelligence,
 8. Laura A. Granka, Thorsten Joachims, and Geri Gay.                      IJCAI 2016, New York, NY, USA, 9-15 July 2016. Brown
    2004. Eye-tracking Analysis of User Behavior in WWW                    University, Providence, RI 02912, 3839–3845.
    Search. In Proceedings of the 27th Annual International                http://www.ijcai.org/Abstract/16/540
    ACM SIGIR Conference on Research and Development in
                                                                       18. Rodrigo Quian and Carlos Pedreira. 2011. How Do We
    Information Retrieval (SIGIR ’04). ACM, New York, NY,
                                                                           See Art: An Eye-Tracker Study. Frontiers in Human
    USA, Article 69, 2 pages. DOI:
                                                                           Neuroscience 5 (09 2011), 98. DOI:
    http://dx.doi.org/10.1145/1008992.1009079
                                                                           http://dx.doi.org/10.3389/fnhum.2011.00098
 9. Kenneth Holmqvist, Marcus Nyström, Richard
    Andersson, Richard Dewhurst, Halszka Jarodzka, and                 19. Vijay Rajanna. 2016. Gaze Typing Through
    Joost Van de Weijer. 2011. Eye tracking: A                             Foot-Operated Wearable Device. In Proceedings of the
    comprehensive guide to methods and measures. OUP,                      18th International ACM SIGACCESS Conference on
    Oxford, UK.                                                            Computers and Accessibility (ASSETS ’16). ACM, New
                                                                           York, NY, USA, 345–346. DOI:
10. Purnendu Kaul, Vijay Rajanna, and Tracy Hammond.                       http://dx.doi.org/10.1145/2982142.2982145
    2016. Exploring Users’ Perceived Activities in a
    Sketch-based Intelligent Tutoring System Through Eye               20. Vijay Rajanna and Tracy Hammond. 2016. GAWSCHI:
    Movement Data. In Proceedings of the ACM Symposium                     Gaze-augmented, Wearable-supplemented
    on Applied Perception (SAP ’16). ACM, New York, NY,                    Computer-human Interaction. In Proceedings of the Ninth
    USA, 134–134. DOI:                                                     Biennial ACM Symposium on Eye Tracking Research and
    http://dx.doi.org/10.1145/2931002.2948727
                                                                           Applications (ETRA ’16). ACM, New York, NY, USA,
                                                                           233–236. DOI:
11. Clare Kirtley. 0. How Images Draw the Eye: An                          http://dx.doi.org/10.1145/2857491.2857499
    Eye-Tracking Study of Composition. Empirical Studies


                                                                  11
21. V. Rajanna, A. Malla, R. Bhagat, and T. Hammond. 2018.           26. Anthony Santella and Doug DeCarlo. 2002. Abstracted
    DyGazePass: A Gaze Gesture-Based Dynamic                             Painterly Renderings Using Eye-tracking Data. In
    Authentication System to Counter Shoulder Surfing and                Proceedings of the 2Nd International Symposium on
    Video Analysis Attacks. In 2018 IEEE International                   Non-photorealistic Animation and Rendering (NPAR ’02).
    Conference on Identity, Security and Behavior Analysis               ACM, New York, NY, USA, 75–ff. DOI:
    (ISBA).                                                              http://dx.doi.org/10.1145/508530.508544

22. Vijay Rajanna, Seth Polsley, Paul Taele, and Tracy               27. Daniela Villani, Francesca Morganti, Pietro Cipresso,
    Hammond. 2017. A Gaze Gesture-Based User                             Simona Ruggi, Giuseppe Riva, and Gabriella Gilli. 2015.
    Authentication System to Counter Shoulder-Surfing                    Visual exploration patterns of human figures in action: an
    Attacks. In Proceedings of the 2017 CHI Conference                   eye tracker study with art paintings. Frontiers in
    Extended Abstracts on Human Factors in Computing                     Psychology 6 (2015), 1636. DOI:
    Systems (CHI EA ’17). ACM, New York, NY, USA,                        http://dx.doi.org/10.3389/fpsyg.2015.01636
    1978–1986. DOI:
                                                                     28. Christian Wallraven, Kathrin Kaulard, Cora KÃijrner,
    http://dx.doi.org/10.1145/3027063.3053070
                                                                         and Robert Pepperell. 2008. In The Eye of The Beholder:
23. Keith Rayner, Marcia Carlson, and Lyn Frazier. 1983.                 The Perception of Indeterminate Art. Leonardo 41, 2
    The interaction of syntax and semantics during sentence              (2008), 116–117. DOI:
    processing: Eye movements in the analysis of                         http://dx.doi.org/10.1162/leon.2008.41.2.116
    semantically biased sentences. Journal of verbal learning
                                                                     29. Patrick Wied. 2017. Dynamic Heatmaps for the Web.
    and verbal behavior 22, 3 (1983), 358–374.
                                                                         (2017). https://www.patrick-wied.at/static/heatmapjs/
24. Sawyer Lynn Ricard. 2015. A Usability and Eyetracking                Accessed December 1, 2017.
    Study Over the Effectiveness of the Layouts of Three
                                                                     30. Victoria Yanulevskaya, Jasper Uijlings, Elia Bruni,
    Video-Based Websites: YouTube, Hulu, and Vimeo. (May
                                                                         Andreza Sartori, Elisa Zamboni, Francesca Bacci, David
    2015).
                                                                         Melcher, and Nicu Sebe. 2012. In the Eye of the
    https://asu-ir.tdl.org/asu-ir/handle/2346.1/30383
                                                                         Beholder: Employing Statistical Analysis and Eye
25. Dario D Salvucci and Joseph H Goldberg. 2000.                        Tracking for Analyzing Abstract Paintings. In
    Identifying fixations and saccades in eye-tracking                   Proceedings of the 20th ACM International Conference
    protocols. In Proceedings of the 2000 symposium on Eye               on Multimedia (MM ’12). ACM, New York, NY, USA,
    tracking research & applications. ACM, ACM, New                      349–358. DOI:
    York, NY, 71–78.                                                     http://dx.doi.org/10.1145/2393347.2393399




                                                                12