=Paper= {{Paper |id=Vol-1618/FuturePD_paper2 |storemode=property |title= JogChalking: Capturing and Visualizing Affective Experience for Recreational Runners |pdfUrl=https://ceur-ws.org/Vol-1618/FuturePD_paper2.pdf |volume=Vol-1618 |authors=Nabil Bin Hannan,Felwah Alqahtani,Derek Reilly |dblpUrl=https://dblp.org/rec/conf/um/HannanAR16 }} == JogChalking: Capturing and Visualizing Affective Experience for Recreational Runners== https://ceur-ws.org/Vol-1618/FuturePD_paper2.pdf
JogChalking: Capturing and Visualizing Affective Experience for Recreational Runners

Nabil Bin Hannan, Felwah Alqahtani, Derek Reilly
Faculty of Computer Science, Dalhousie University
nabil@dal.ca, fl823899@dal.ca, reilly@cs.dal.ca




ABSTRACT
We present JogChalker, a system that allows recreational runners to capture their affective experience while running using touch gestures. Using a small set of simple gestures, a runner can record affect while running without looking at a screen or entering into a multi-step interaction. Gestures are recognized, but also recorded at high fidelity, as we believe how a gesture is made may itself be expressive and useful for runners to review. We present our initial prototype, describe the goals, structure, and outcomes of a four-week participatory design process, and discuss the consequent capture and visualization implications for JogChalker that we are currently exploring. JogChalker provides new opportunities for self-tracking affective experience during running and for helping runners recall and interpret their runs.

Keywords
Gesture; visualization; design; running; emotion.

1. INTRODUCTION
Running is a physical activity enjoyed by many. It has low barriers to participation, and for most, running is an active pastime rather than a competitive sport. However, not every run is as enjoyable as the next, and runners' personal preferences vary: weather conditions, location, terrain, music, time of day, and so on. Popular mobile applications such as Runkeeper, Runtastic, and Endomondo track running data and let runners visualize and share their runs. Aside from freeform annotation at the end of a run, such applications don't currently provide a means of capturing the affective experience of a run. Consequently, their visualization interfaces emphasize physical performance over the qualitative but critically important notion of enjoyment. Without a means of capturing emotion or affective experience during runs, runners have no way of tracking and identifying patterns that correlate with a positive running experience. Such a feature would enable runners to better choose the time, place, and circumstances of their leisure runs. Manual tracking tools like Moodmap [1] and Emotion Map [2] allow users to tag locations and times with emotions, and present these on a map. However, typical widget-based interfaces can be difficult or impossible to use when physically active [3]. In this paper we present JogChalker, a system that allows recreational runners to capture their affective experience while running using touch gestures.

2. PROTOTYPE
The current prototype consists of a capture application and a visualization tool. The capture application is written in Java and runs on Android devices (Figure 1(a)), providing a full-screen gesture capture interface. A yellow trace line shows a gesture as it is being made, and the gesture is recorded in real time. A standard Android gesture recognition library classifies the gesture once it is completed. In addition to the time and location at which a gesture is made, we record the traversal of the drawn gesture in terms of elapsed time and screen coordinates, as well as the width of the touch area and the device pressure (if supported by the hardware) throughout the gesture. Using this data we generate an SVG animation so that the gesture can be replayed in the visualization interface. From the low-level data a number of higher-level attributes of the gesture can be determined, including repetition, total area, average speed, and total time taken. After initial testing we identified five candidate running-related emotional states to support (bored, tired, mellow, euphoric, exhilarated), and developed simple candidate gestures for evaluation (Figure 1(b)). These were initial gesture sets: we chose the gestures ourselves because we wanted the design process to focus on the mobile interface and visualization dashboard rather than on gesture elicitation.

Figure 1: (a) mobile screen for gesture application (b) armband with gesture list (c) initial map-based visualization
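As a concrete illustration of the capture loop, the sketch below shows how such a trace can be logged and classified on Android using the standard android.gesture package. It is a simplified sketch rather than the JogChalker source: the layout and gesture-library resource identifiers (R.layout.capture, R.id.overlay, R.raw.jogchalker_gestures) and the recognition threshold are hypothetical placeholders. Each touch sample's elapsed time, screen coordinates, touch size, and pressure are logged while the gesture is drawn, and the completed gesture is classified against a prebuilt gesture library.

import android.app.Activity;
import android.gesture.Gesture;
import android.gesture.GestureLibraries;
import android.gesture.GestureLibrary;
import android.gesture.GestureOverlayView;
import android.gesture.Prediction;
import android.os.Bundle;
import android.view.MotionEvent;
import java.util.ArrayList;
import java.util.List;

/** Minimal sketch of full-screen gesture capture; resource IDs are hypothetical. */
public class CaptureActivity extends Activity {

    /** One raw sample of the drawn gesture. */
    static class TracePoint {
        final long elapsedMs; final float x, y, size, pressure;
        TracePoint(long elapsedMs, float x, float y, float size, float pressure) {
            this.elapsedMs = elapsedMs; this.x = x; this.y = y;
            this.size = size; this.pressure = pressure;
        }
    }

    private final List<TracePoint> trace = new ArrayList<>();
    private GestureLibrary library;
    private long gestureStartMs;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.capture);                                  // hypothetical layout
        library = GestureLibraries.fromRawResource(this, R.raw.jogchalker_gestures);
        library.load();                                                     // prebuilt emotion gestures

        GestureOverlayView overlay = (GestureOverlayView) findViewById(R.id.overlay);
        overlay.addOnGestureListener(new GestureOverlayView.OnGestureListener() {
            @Override public void onGestureStarted(GestureOverlayView v, MotionEvent e) {
                trace.clear();
                gestureStartMs = e.getEventTime();
                record(e);
            }
            @Override public void onGesture(GestureOverlayView v, MotionEvent e) { record(e); }
            @Override public void onGestureEnded(GestureOverlayView v, MotionEvent e) { record(e); }
            @Override public void onGestureCancelled(GestureOverlayView v, MotionEvent e) { trace.clear(); }
        });
        overlay.addOnGesturePerformedListener(new GestureOverlayView.OnGesturePerformedListener() {
            @Override public void onGesturePerformed(GestureOverlayView v, Gesture gesture) {
                // Predictions are sorted by score; 2.0 is an illustrative acceptance threshold.
                ArrayList<Prediction> predictions = library.recognize(gesture);
                if (!predictions.isEmpty() && predictions.get(0).score > 2.0) {
                    // Store the label (e.g. "tired") with the raw trace; the current time and
                    // GPS fix would also be attached here for later visualization.
                    saveAnnotation(predictions.get(0).name, new ArrayList<>(trace));
                }
            }
        });
    }

    /** Log elapsed time, screen coordinates, touch size, and pressure (if supported). */
    private void record(MotionEvent e) {
        trace.add(new TracePoint(e.getEventTime() - gestureStartMs,
                e.getX(), e.getY(), e.getSize(), e.getPressure()));
    }

    private void saveAnnotation(String emotion, List<TracePoint> points) {
        // Persistence and location tagging are omitted in this sketch.
    }
}

From such a trace the higher-level attributes mentioned above follow directly: total time taken is the final sample's elapsed time, total area can be estimated from the bounding box of the sampled coordinates, and average speed is the summed point-to-point distance divided by the total time; repetition requires segmenting the trace into strokes (for example via Gesture.getStrokes()).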




JogChalker's visualization tool is written in JavaScript, and was built using Mapbox Studio (Figure 1(c)). Currently the tool displays a single running route (obtained using a manual export from a running tracker on the mobile device). A list of the gestural annotations made during the run is provided, and these are also marked on the route itself using teardrop markers, above which are the SVG gesture images.
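One simple way to produce such a replay is sketched below. It is an illustration rather than the actual JogChalker exporter, and assumes each annotation stores the (x, y, elapsed-time) samples captured during the run: the trace is emitted as an SVG path whose stroke-dashoffset is animated over the gesture's original duration, so the stroke appears to be drawn at a constant speed (per-sample timing could be reproduced with keyTimes if needed).

import java.util.List;
import java.util.Locale;

/** Sketch: turn a recorded gesture trace into a self-animating SVG snippet. */
public final class GestureSvg {

    /** Minimal trace sample: screen coordinates and elapsed time in milliseconds. */
    public static final class Sample {
        public final float x, y; public final long elapsedMs;
        public Sample(float x, float y, long elapsedMs) { this.x = x; this.y = y; this.elapsedMs = elapsedMs; }
    }

    /** Builds an SVG path that redraws itself over the gesture's original duration. */
    public static String toAnimatedSvg(List<Sample> trace, int width, int height) {
        if (trace.size() < 2) return "";
        StringBuilder d = new StringBuilder();
        double length = 0;
        for (int i = 0; i < trace.size(); i++) {
            Sample s = trace.get(i);
            d.append(i == 0 ? "M" : " L").append(s.x).append(' ').append(s.y);
            if (i > 0) {
                Sample p = trace.get(i - 1);
                length += Math.hypot(s.x - p.x, s.y - p.y);   // total path length for the dash trick
            }
        }
        double seconds = trace.get(trace.size() - 1).elapsedMs / 1000.0;
        // Setting stroke-dasharray/dashoffset to the path length and animating the offset to 0
        // makes the stroke appear to be drawn over the gesture's original duration.
        return String.format(Locale.US,
            "<svg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 %d %d'>"
          + "<path d='%s' fill='none' stroke='gold' stroke-width='6' "
          + "stroke-dasharray='%.1f' stroke-dashoffset='%.1f'>"
          + "<animate attributeName='stroke-dashoffset' from='%.1f' to='0' dur='%.2fs' fill='freeze'/>"
          + "</path></svg>",
            width, height, d, length, length, length, seconds);
    }
}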
3. PARTICIPATORY DESIGN
Informal testing of the initial JogChalker prototype generated many questions, including: Is it comfortable to make gestures while running? Is making a gesture emotionally expressive? Will runners use the candidate states and gestures? How should affective experience be visualized and queried after a run, or after many runs? How could JogChalker be integrated with existing running data capture tools?
We wanted to further develop the prototype for a field study to explore some of these questions, and employed a participatory design (PD) approach with recreational runners. After pilot testing with two lab colleagues we arrived at the methodology summarized here. We recruited 4 recreational runners (one female and three male, aged 25-35), who each participated in 4 design sessions distributed over a 4-week period. Each session was divided into two parts: capture interface design and visualization interface design. After first receiving training on making the 5 gestures (for bored, mellow, tired, euphoric, exhilarated), participants ran for 30-60 minutes using the capture tool prior to each session. Participants were asked simply to run a familiar route. They used an Android smartphone with a pressure-sensitive screen worn on an armband, and a Mio heart rate wristband. Gestures were displayed on the side of the armband for quick reference (Figure 1(b)). Since participants were not used to recording emotions while running, the mobile device would vibrate if no gesture was recorded over a 10-minute interval; otherwise participants were not prompted to record gestures. The Runkeeper application was also launched on the phone, and we preloaded the phone with a personal playlist if a participant preferred to listen to music while running. Participants also wore a GoPro camera while running, both to generate a video stream that we provided as a potential element to include in the visualization interface, and to get a record of whether they slowed down or stopped, and whether they looked at the screen when making a gesture.
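This reminder behaviour is straightforward to realize; the sketch below is illustrative only (the hook for when an annotation is saved is hypothetical) and arms a 10-minute timeout that is reset whenever a gesture is recorded, vibrating the device when it fires (the VIBRATE permission is required).

import android.content.Context;
import android.os.Handler;
import android.os.Vibrator;

/** Sketch: vibrate if no gesture has been recorded for 10 minutes. */
public class InactivityReminder {
    private static final long REMINDER_INTERVAL_MS = 10 * 60 * 1000;

    private final Handler handler = new Handler();
    private final Vibrator vibrator;
    private final Runnable reminder = new Runnable() {
        @Override public void run() {
            vibrator.vibrate(500);                              // short buzz as a gentle prompt
            handler.postDelayed(this, REMINDER_INTERVAL_MS);    // keep reminding every 10 minutes
        }
    };

    public InactivityReminder(Context context) {
        vibrator = (Vibrator) context.getSystemService(Context.VIBRATOR_SERVICE);
    }

    /** Call when a run starts and again whenever a gesture annotation is saved. */
    public void reset() {
        handler.removeCallbacks(reminder);
        handler.postDelayed(reminder, REMINDER_INTERVAL_MS);
    }

    /** Call when the run ends. */
    public void stop() {
        handler.removeCallbacks(reminder);
    }
}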
The first two PD sessions were conducted individually. In session 1, participants sketched potential modifications to the capture application using pen, paper, and post-its (Figure 2). They were then shown the visualization prototype, Runkeeper's visualization interface, and the GoPro video feed. They were provided with pen, paper, and a set of paper widgets (including elements from the two visualizations, as well as others not present in either visualization, such as video, music, and weather data) and sketched a single visualization interface that would integrate the captured gestures with other data they deemed relevant for visualizing their experience (Figure 2). In session 2, participants used the same tools to envision how to integrate gesture capture into Runkeeper, and worked on their visualization design after viewing those made by the other runners (Figure 3).

Figure 2: Sample visualization after session 1

Figure 3: Sample visualization after session 2
The last two sessions were conducted as a group. In session 3, the group presented and discussed each member's designs, then worked to create a single integrated gesture capture design and a single affective experience visualization interface. We additionally gave them some scenarios to consider when refining their design (e.g., running on a rainy day, running in a crowded area). Session 4 was conducted in the style of a Future Technology Workshop [4]: the group brainstormed about alternative methods that recreational runners could use to capture and visualize affective experience.

4. OUTCOMES
Recorded emotions varied: one participant drew gestures for mellow, exhilarated, and bored, while the other three mainly used tired and mellow. We annotated designs and identified themes that emerged in the designs and participant comments across the 4 design sessions. Due to space constraints we briefly discuss some highlights here.

Participants liked that gesture capture was automatic. They found recording gestures tricky when in full run, but didn't mind slowing down to do so. All participants wanted to define their own gestures, and found it difficult to distinguish between euphoric and exhilarated. They all wanted to be able to record voice annotations, instead of or in addition to gestures. Integration with Runkeeper was refined toward a simple interface for entering recording modes, and a screen on the mobile for reviewing and deleting annotations (see Figure 4). The group also suggested that recording an emotion could immediately trigger a change in music playlist. Our participants did not mention discomfort with the armband, but did discuss using a smart watch as an alternative.

Figure 4: group designs, session 3

Visualization interface designs maintained a simple map-based run plot; most debate centered around whether data other than route and gesture location should always be visible or only appear after a selection interaction. When a gesture location is selected in the group design, a synchronized video stream would play the corresponding segment, and biometric data, music, weather, and the gesture itself would be displayed in a popup (see Figure 5). Despite prompting, the notion of visualizing long-term data patterns was not explored in detail by the group.




5. DISCUSSION
Despite some difficulties, our participants were satisfied with gesture as a means of recording affective experience while running, although they all felt that options for audio annotation and custom gestures should be available. It is important to note that participants did not actually use audio annotation during their runs, and audio annotation may have issues of its own (background noise, feeling awkward, breathlessness).
Our PD approach may have limited novelty and variety in the visualization interface designs: participants were primed by the initial prototype and Runkeeper's visualization, and the final result was a fairly straightforward "mashup" of the two interfaces. Showing the sketches of other participants did encourage them to think about their decisions; however, the designs were very similar to begin with.
Our PD methodology also did not allow us to explore more nuanced
aspects of gestural affect capture and visualization, including
whether animating gestures supports inference and recall of
affective experience, and whether and how long-term use of the
interface supports discovery of running patterns leading to
enjoyment. Our future work will explore both questions.
6. FUTURE WORK
We are further investigating how collecting and visualizing
affective experience alongside traditional running biometric and
geospatial data can be used to generate richer insights into what can
influence a runner’s performance. One focus of this work is to
determine effective visual representations of gesture data, including
an assessment of whether an individual can interpret emotional
intensity by viewing an animation showing how a gesture was
made. We are also exploring how JogChalking might encourage
richer, more subjective recollections of running experiences. Over the long term this may help runners to discover the running patterns that lead to enjoyment for them, and may allow supporting tools to provide recommendations based on this data.




7. REFERENCES
[1] Angela Fessl, Verónica Rivera-Pelayo, Viktoria Pammer, and Simone Braun. 2012. Mood tracking in virtual meetings. In Proceedings of the 7th European Conference on Technology Enhanced Learning (EC-TEL 2012). Springer-Verlag, Berlin, Heidelberg, 377-382.
[2] Yun Huang, Ying Tang, and Yang Wang. 2015. Emotion Map: A Location-based Mobile Social System for Improving Emotion Awareness and Regulation. In Proceedings of CSCW 2015. ACM, New York, NY, USA, 130-142.
[3] Florian Mueller, Joe Marshall, Rohit Ashok Khot, Stina Nylander, and Jakob Tholander. 2014. Jogging with technology: interaction design supporting sport activities. In CHI 2014 Extended Abstracts. ACM, New York, NY, USA, 1131-1134.
[4] Giasemi N. Vavoula, Mike Sharples, and Paul D. Rudman. 2002. Developing the 'Future Technology Workshop' method. In Proceedings of the International Workshop on Interaction Design and Children (IDC 2002). Eindhoven, The Netherlands, 65-72.