<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Chinmay Sudarshan Kulkarni</string-name>
          <email>kulkarni.chinmay.sudarshan@image.iit.tsukuba.ac.jp</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Chokiu Leung</string-name>
          <email>leung@css.risk.tsukuba.ac.jp</email>
          <xref ref-type="aff" rid="aff3">3</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Tiago Rodrigues</string-name>
          <email>tiago.rodrigues@capgemini.com</email>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Hitesh Pandya</string-name>
          <email>hitesh.pandya@capgemini.com</email>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Bruno Coelho</string-name>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Yoshinari Kameda</string-name>
          <email>kameda@ccs.tsukuba.ac.jp</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Bachelor's Program in Interdisciplinary Engineering, University of Tsukuba</institution>
          ,
          <addr-line>1-1-1 Tennodai Tsukuba, Ibaraki</addr-line>
          ,
          <country country="JP">Japan</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Center for Computational Sciences, University of Tsukuba</institution>
          ,
          <addr-line>1-1-1 Tennodai Tsukuba, Ibaraki</addr-line>
          ,
          <country country="JP">Japan</country>
        </aff>
        <aff id="aff2">
          <label>2</label>
          <institution>Engineering and R&amp;D Services, Capgemini Japan K.K</institution>
          ,
          <addr-line>1-23-1 Toranomon, Minato City, Tokyo</addr-line>
          ,
          <country country="JP">Japan</country>
        </aff>
        <aff id="aff3">
          <label>3</label>
          <institution>Institute of Systems and Information Engineering, University of Tsukuba</institution>
          ,
          <addr-line>1-1-1 Tennodai Tsukuba, Ibaraki</addr-line>
          ,
          <country country="JP">Japan</country>
        </aff>
      </contrib-group>
      <abstract>
        <p>This paper details an experimental approach to enhance Personal Mobility Device (PMD) safety and user experience through a Meta Quest 3-integrated Heads-Up Display (HUD) system. Addressing risks from distracted riding and inconsistent regulations, the study evaluates real-time data display for Segway Ninebot Mini Pro 2 users. Findings indicate a statistically significant reduction in overall cognitive workload with the HUD and improved adherence to target speed. While the HUD led to slower riding and longer completion times, the research highlights a nuanced trade-off between perceived effort and riding precision. This work contributes to micromobility safety research, informing future PMD design and urban policy.</p>
      </abstract>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>Personal Mobility Devices (PMDs), including electric scooters and self-balancing transporters, have
rapidly emerged as convenient and sustainable urban transportation solutions. However, their
increasing prevalence introduces significant safety concerns, primarily due to distracted riding,
varying user awareness, and a notable absence of standardized regulations across jurisdictions. This
creates a complex challenge for municipalities, which must balance the benefits of micromobility with
the imperative to ensure public safety for all.</p>
      <p>A common issue may involve PMD users diverting their visual attention to smartphone screens
for critical operational data, potentially increasing cognitive load and diminishing situational
awareness. This highlights the critical importance of providing real-time information directly to the
user to enhance their riding experience. The emerging landscape suggests that Head-Mounted
Displays (HMDs) combined with micromobility solutions may represent a promising future direction
for user interaction and information delivery in this domain.</p>
      <p>A key challenge, however, is that the precise influence of such HUDs on the actual riding
experience of PMD users remains largely unexplored and not fully understood.</p>
<p>This project addresses the challenges of integrating a Heads-Up Display (HUD) system using
the Meta Quest 3 augmented reality (AR) headset to provide real-time operational data directly
within the rider's field of vision (Figure 1). The core objective is to assess how HUD technology can
enhance PMD user safety by mitigating the cognitive load associated with glancing at mobile screens
for critical information. Concurrently, this study explores the technical and practical challenges of
integrating real-time data from the Segway application onto the Meta Quest 3. By investigating these
aspects, this paper aims to contribute to the discourse on micromobility safety, advocating for a dual
approach of technological innovation and regulatory harmonization.</p>
    </sec>
    <sec id="sec-2">
      <title>2. Related Works</title>
      <p>The emergence of micromobility devices has provided a valuable solution for urban transportation,
but their widespread adoption has highlighted significant safety challenges [1]. A key concern is the
cognitive load and distraction riders experience, which often stems from the need to interact with
external devices like smartphones for navigation or speed monitoring [2]. Research has shown that
these distractions can lead to a decrease in situational awareness and an increase in accident risk.
For instance, a study by Qawasmeh et al. found that improper lane use and other hazardous rider
behaviors are significant factors in micromobility crash severity [3]. Similarly, Pashkevich’s
eye-tracking study demonstrated that e-scooter riders can experience notable attention lapses, for
example, 2.3-second durations, when checking handlebar-mounted displays for vital information [4].
This underscores the critical need for an interface that can present information to the rider without
diverting their gaze and attention from the road [5].</p>
      <p>
        To address similar issues in other domains, Heads-Up Displays (HUDs) and Augmented Reality
Heads-Up Displays (AR HUDs) have been developed and studied extensively. In the automotive
industry, AR HUDs have proven effective at reducing driver cognitive load and improving safety by
projecting information directly onto the windshield, thus allowing drivers to maintain their focus on
the road [
        <xref ref-type="bibr" rid="ref1">6</xref>
        ]. Winkler et al. suggest that AR HUDs can enhance a driver's situational awareness and
improve reaction times to potential hazards. The technology has also been adapted for two-wheeled
vehicles, with companies developing HUD systems for motorcycle and bicycle helmets that display
information such as speed and navigation [8], [
        <xref ref-type="bibr" rid="ref3">9</xref>
        ]. These systems are designed to keep the user’s eyes
forward, as research shows this improves reaction times and aids in obstacle detection [
        <xref ref-type="bibr" rid="ref4">10</xref>
        ]. However,
a major technical challenge for these systems is ensuring low-latency projections to prevent motion
sickness and maintaining display visibility under varying light conditions [
        <xref ref-type="bibr" rid="ref5">11</xref>
        ], [
        <xref ref-type="bibr" rid="ref6">12</xref>
        ].
      </p>
      <p>
        Despite the substantial body of work on micromobility safety and the proven benefits of AR HUDs
in other transportation contexts, a critical research gap exists concerning their comprehensive
application to self-balancing personal mobility devices. A key limitation in existing AR HUD research
is a "stable platform bias," as much of the work has been conducted on vehicles where balance is not
a primary cognitive demand, such as cars or even conventional bicycles [
        <xref ref-type="bibr" rid="ref7">13</xref>
        ]. As explained by
Billinghurst et al., this is a crucial distinction, as operating a self-balancing PMD imposes a
continuous and dominant cognitive load for postural control, fundamentally differentiating it from
other modes of transport [
        <xref ref-type="bibr" rid="ref7">13</xref>
        ], [
        <xref ref-type="bibr" rid="ref8">14</xref>
        ]. Therefore, simply adapting an AR HUD from a car or bicycle is
insufficient, as the interface could inadvertently increase the already high cognitive demands of the
rider [
        <xref ref-type="bibr" rid="ref8">14</xref>
        ], [
        <xref ref-type="bibr" rid="ref9">15</xref>
        ]. While some related research has explored the use of AR for balance training and
rehabilitation [
        <xref ref-type="bibr" rid="ref10">16</xref>
        ], [
        <xref ref-type="bibr" rid="ref11">17</xref>
        ], these applications do not address the real-time perceptual and interactive
needs of an individual operating a device in a dynamic urban environment.
      </p>
      <p>This lack of dedicated research highlights a pressing need to develop and evaluate AR HUD
systems specifically designed for the unique human-factors challenges of self-balancing PMDs. This
research gap represents a significant opportunity to fundamentally improve the safety and user
experience of this unique form of personal mobility by creating an interface that is sensitive to the
rider’s cognitive demands for both navigation and balance.</p>
    </sec>
    <sec id="sec-3">
      <title>3. Benefits of Real-Time AR-HUD</title>
      <p>
        PMD adoption has surged, but safety measures lag behind. Nearly half of PMD accidents involve
self-inflicted injuries, and 40% result in head trauma, yet only 10-15% of riders wear helmets [
        <xref ref-type="bibr" rid="ref12">18</xref>
        ]. This
highlights the urgent need for better safety solutions.
      </p>
      <p>
        Augmented reality (AR) could improve road safety by providing real-time hazard alerts without
distracting users—a concept known as "Heads-Up computing." Studies show AR warnings enhance
reaction times, but their effectiveness depends on clear information design [
        <xref ref-type="bibr" rid="ref1">6</xref>
        ]. Poorly designed
displays increase cognitive load, worsening distraction. Optimizing UI elements, like size and
placement, is crucial to reducing mental and physical strain in safety-critical systems.
      </p>
      <p>
        The design of information displays significantly influences user experience and cognitive load.
In-vehicle information systems (IVIS) with touch screens can induce higher visual, manual, and
cognitive distraction compared to physical buttons [
        <xref ref-type="bibr" rid="ref1">6</xref>
        ]. Assessing and predicting cognitive
performance is paramount in safety-critical contexts, as mental workload directly links to
performance efficacy. Furthermore, VR interface design principles, such as element size and distance,
can impact physical effort and mental demand, emphasizing the need for meticulous UI optimization
to minimize cognitive and physical strain.
      </p>
    </sec>
    <sec id="sec-4">
      <title>4. Evaluation of AR-HUD</title>
      <p>The primary purpose of this research is to experimentally evaluate the tangible efficacy of an AR
HUD system, specifically utilizing the Meta Quest 3 headset, in directly enhancing both the safety
and overall user experience for individuals operating a Segway Ninebot Mini Pro 2. This evaluation
is achieved through a rigorous quantification of the system's impact on subjective cognitive workload
and objective riding performance. A secondary, yet equally important, purpose is to identify and
analyze any potential trade-offs in performance or significant individual differences in user response
that may arise from the introduction of this AR interface.</p>
      <sec id="sec-4-1">
        <title>Hypothesis for Solving the Issue(s)</title>
        <p>As previously articulated in the introduction, the central tenet of this research is grounded in the
primary hypothesis: that the Meta Quest 3-integrated AR HUD system will result in a statistically
significant reduction in the overall perceived cognitive workload, as measured by the Weighted
NASA-TLX Score, for PMD users. Complementing this, the secondary hypotheses anticipate
quantifiable improvements in objective performance metrics, specifically predicting a reduction in
deviation from requested speed (smaller margin), an increase in sampled speed, and a decrease in
overall task completion time, all indicative of a safer and more efficient riding experience.
overall task completion time, all indicative of a safer and more efficient riding experience.</p>
      </sec>
      <sec id="sec-4-2">
        <title>Verification</title>
        <p>This study employed a robust within-subjects experimental design to rigorously evaluate the
hypotheses, involving a cohort of 20 first-time Segway users. The within-subjects approach was
chosen to minimize inter-individual variability, as each participant served as their own control by
completing both experimental conditions.</p>
      </sec>
    </sec>
    <sec id="sec-5">
      <title>5. Experiment and Results</title>
      <sec id="sec-5-1">
        <title>Hardware and Setup</title>
        <p>The primary Personal Mobility Device utilized was a Segway Ninebot Mini Pro 2, selected for its
integrated sensors and robust connectivity capabilities. The Augmented Reality display was provided
by a Meta Quest 3 headset. The Segway's proprietary application was mirrored onto the Meta Quest
3 (Figure 2(c)), enabling the projection of real-time operational data (e.g., current speed, battery life,
tilt angles) directly into the rider's field of vision via the headset's Passthrough mode (Figure 1(b),
2(a)). This setup aimed to create an immersive experience wherein critical information was
seamlessly integrated into the rider's natural visual field without requiring overt attention shifts.
5.2.</p>
      </sec>
      <sec id="sec-5-2">
        <title>Experimental Procedure</title>
        <p>The study protocol was meticulously structured to ensure consistency and control:</p>
        <p>Comprehensive Briefing: Prior to any practical trials, all participants received a detailed
briefing outlining the study's objectives, the operational principles of the PMD, and the
functionalities of the AR HUD system.</p>
        <p>Training Session: A dedicated training session was conducted in a safe, enclosed environment.
This allowed participants to familiarize themselves with the Segway's controls and, for the
headset condition, to acclimate to the AR interface and the overlaid information.</p>
        <p>Test Ride: Following training, participants navigated a standardized rectangular obstacle
course (12.5m x 2m) designed to simulate typical urban riding challenges (Figure 3). The
course included various turns and a zig-zag segment, specifically crafted to challenge aspects
of rider control, motor skills, and cognitive functions. The order of conditions (with or
without HUD) was counterbalanced across participants to mitigate order effects.</p>
        <p>Data Collection: Both subjective and objective data were systematically collected to provide
a holistic evaluation:</p>
        <p>Subjective Workload: After completing each riding condition, participants completed
the NASA-TLX (National Aeronautics and Space Administration Task Load Index)
questionnaire. This tool measured perceived workload across six core subscales: Mental
Demand, Physical Demand, Temporal Demand, Performance, Effort, and Frustration.
These individual ratings were then combined to calculate a comprehensive Weighted
Score, representing overall cognitive burden.</p>
        <p>Statistical Analysis: Paired-samples t-tests compared the two conditions, with the
t-statistic computed as t = x̄_d / (s_d / √n), where x̄_d is the mean of the differences between
paired observations, s_d is the standard deviation of these differences, and n is the number
of paired observations (participants). The p-value derived from this t-statistic indicates the
probability of observing such a difference if no true effect existed.</p>
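        <p>The weighted NASA-TLX scoring and the paired t-test described above can be sketched as follows. This is a stdlib-only illustration; the sample scores are hypothetical, not the study's data:</p>

```python
import math

def weighted_tlx(ratings, weights):
    """Weighted NASA-TLX score: each of the six subscale ratings (0-100)
    is multiplied by its pairwise-comparison weight (weights sum to 15),
    and the total is divided by 15."""
    return sum(r * w for r, w in zip(ratings, weights)) / 15.0

def paired_t(before, after):
    """Paired t-statistic: t = d_bar / (s_d / sqrt(n)), where d_bar is the
    mean of the per-participant differences, s_d their sample standard
    deviation (n - 1 denominator), and n the number of pairs."""
    diffs = [b - a for b, a in zip(before, after)]
    n = len(diffs)
    d_bar = sum(diffs) / n
    s_d = math.sqrt(sum((d - d_bar) ** 2 for d in diffs) / (n - 1))
    return d_bar / (s_d / math.sqrt(n))

# Hypothetical weighted TLX scores for five riders in each condition:
no_hud = [60.0, 48.0, 55.0, 70.0, 40.0]
with_hud = [50.0, 45.0, 52.0, 58.0, 42.0]
t = paired_t(no_hud, with_hud)
```

A positive t here indicates lower workload with the headset; the corresponding p-value would normally be looked up from the t-distribution with n - 1 degrees of freedom.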
        <p>Correlational Analysis: To explore the intricate relationships between the changes (i.e., the
difference between the "no headset" and "with headset" conditions) in subjective and objective
metrics, Pearson correlation coefficients (r) were calculated. This approach allowed for a
direct assessment of how the impact of the headset on one variable related to its impact on
another for the same individual.
</p>
        <p>Pearson Correlation Coefficient:
r = Σ(x − x̄)(y − ȳ) / √( Σ(x − x̄)² · Σ(y − ȳ)² ),
where x̄ and ȳ are the means of the two sets of difference scores. P-values were concurrently
derived for all t-tests and Pearson correlations to ascertain statistical significance.</p>
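        <p>The correlational analysis above can be sketched as a direct implementation of the Pearson formula. The per-participant difference scores below are hypothetical, chosen only to mirror the direction of the reported Physical Demand / completion-time trade-off:</p>

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation: r = sum((x - x_bar)(y - y_bar)) /
    sqrt(sum((x - x_bar)^2) * sum((y - y_bar)^2))."""
    n = len(xs)
    x_bar = sum(xs) / n
    y_bar = sum(ys) / n
    num = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
    den = math.sqrt(sum((x - x_bar) ** 2 for x in xs) *
                    sum((y - y_bar) ** 2 for y in ys))
    return num / den

# Hypothetical difference scores (no-headset minus with-headset):
d_physical = [10.0, -5.0, 20.0, 0.0, 15.0]  # change in Physical Demand
d_time = [-3.0, 4.0, -8.0, 1.0, -6.0]       # change in completion time (s)
r = pearson_r(d_physical, d_time)
```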
      </sec>
      <sec id="sec-5-3">
        <title>Results</title>
        <p>The comprehensive analysis of the full 20-participant dataset provides robust empirical evidence
regarding the AR HUD's impact on PMD user experience and performance.
5.4.1.</p>
      </sec>
      <sec id="sec-5-4">
        <title>Cognitive Workload (NASA-TLX):</title>
        <p>The primary hypothesis, predicting a significant reduction in overall cognitive workload with the
AR-HUD, was strongly supported:
</p>
        <p>Overall Weighted Score: A statistically significant reduction in the overall perceived
cognitive workload was observed when participants utilized the AR HUD. The mean
Weighted Score decreased from x̄ = 51.45±24.37 (no headset) to x̄ = 43.03±19.82 (with headset),
with a t-statistic of t(19) = 2.121 and a p-value of p = 0.047. This finding suggests that the AR
HUD successfully alleviated the perceived mental burden on riders.</p>
        <p>Temporal Demand: A statistically significant reduction was specifically noted in the
Temporal Demand subscale. Participants reported feeling less time pressure or a less frantic
pace with the headset (x̄ = 35.25±23.65) compared to the no-headset condition
(x̄ = 53.25±24.78). This finding (t(19) = 2.902, p = 0.009) was even more statistically significant
with the expanded dataset, reinforcing that the immediate and integrated information
delivery of the HUD allowed riders to perceive the task as less hurried, suggesting a more
comfortable and potentially safer experience.</p>
        <p>Effort: While the mean perceived Effort showed a reduction (from x̄ = 53.75±29.46 to
x̄ = 45.00±29.60), this change was not statistically significant (t(19) = 1.322, p = 0.202) in the full
dataset.</p>
        <p>Other NASA-TLX subscales (Mental Demand, Physical Demand, Performance, and
Frustration) consistently showed non-significant trends towards reduction.
5.4.2.</p>
      </sec>
      <sec id="sec-5-5">
        <title>Objective performance metrics</title>
        <p>The analysis of objective performance metrics revealed
consistent adaptive changes in riding behavior, suggesting a nuanced impact beyond simple speed
or efficiency gains:</p>
      </sec>
      <sec id="sec-5-6">
        <title>Sampled Speed (km/h)</title>
        <p>A statistically significant decrease in sampled speed was observed
when participants used the AR HUD. The mean sampled speed was x̄ = 5.36±1.76 km/h without
the headset, which reduced to x̄ = 4.48±1.37 km/h with the headset (t(19) = 3.423, p = 0.003).</p>
      </sec>
      <sec id="sec-5-7">
        <title>Margin (km/h)</title>
        <p>The Margin from requested speed showed a statistically significant
reduction, decreasing from x̄ = 2.01±1.33 km/h (no headset) to x̄ = 1.13±0.82 km/h (with headset)
(t(19) = 3.423, p = 0.003). This indicates improved precision.</p>
      </sec>
      <sec id="sec-5-8">
        <title>Individual differences and correlations</title>
        <p>Analysis of individual participant data indicated variability in responses, complementing the
group-level findings. For instance, while a majority (14 out of 20) of participants experienced a reduction in
their Weighted NASA-TLX Score with the HUD, 6 participants reported an increase. This highlights
the importance of individual user characteristics in AR adoption. Similarly, 16 riders demonstrated
improved precision (smaller margin) with the headset, while 4 showed worsened precision. In terms
of speed, 16 riders were slower, and 4 were faster with the headset. These individual patterns suggest
diverse strategies or adaptation rates among users when interacting with the AR HUD.</p>
        <p>To further explore the relationships between these observed changes, Pearson correlation
coefficients were calculated between the differences (No Headset - With Headset) for various
subjective and objective metrics. A statistically significant negative correlation (r=−0.479,
p=0.033) was identified between a reduction in perceived Physical Demand (i.e., the headset made
the task feel physically easier) and an increase in Adjusted Completion Time (i.e., the participant
took longer to complete the course). This intriguing relationship may suggest a potential trade-off:
when the AR HUD alleviated physical exertion, riders might have adopted a more relaxed pace,
leading to a longer, less hurried completion. This could indicate that reduced physical strain,
facilitated by the HUD, may allow users to prioritize comfort and a deliberate pace over raw speed.
Other strong internal correlations within TLX subscales were also observed, such as between the
change in Effort and the change in Frustration (r=0.824, p&lt;0.001), and between the change in Effort
and the change in Weighted Score (r=0.819, p&lt;0.001). These robust internal correlations may suggest
that improvements in perceived effort and frustration are tightly linked and collectively contribute
to the overall reduction in cognitive workload experienced by the rider.</p>
        <p>The empirical evidence from this study suggests that the Meta Quest 3-integrated AR HUD system
may influence PMD user experience and performance in complex ways. The data appears to indicate
a statistically significant reduction in overall perceived cognitive workload, particularly concerning
temporal demand, which could imply a more comfortable and less stressful riding experience.</p>
        <p>However, the objective performance metrics suggest a nuanced trade-off. While the HUD was
associated with a statistically significant reduction in sampled speed and an increase in completion
time, there was also a statistically significant improvement in precision, as indicated by a smaller
margin from the requested speed. This observed relationship between subjective workload reduction
and altered objective performance may suggest that the HUD could encourage a more controlled and
deliberate riding style, potentially prioritizing precision over raw speed, especially for users new to
such interfaces. This interpretation is further supported by the correlation between reduced physical
demand and increased completion time, where a less strenuous task might lead to a more relaxed
pace. The identified individual variability in responses may also highlight the importance of
personalized AR experiences in future designs, as not all users responded identically to the
intervention.</p>
        <p>These findings take on particular significance when viewed through the lens of a self-balancing
PMD's unique cognitive demands. Unlike more stable vehicles, operating a Segway requires a
continuous and dominant cognitive load for postural control. The statistically significant reduction
in overall perceived cognitive workload, especially the reduction in temporal demand, suggests that
the AR HUD successfully offloaded some of this cognitive burden by making critical operational data
more accessible. This points to a key design insight: for devices where rider balance is a primary
concern, an AR HUD is not merely a convenience but a potential safety-critical tool that can free up
mental resources. Future interface designs for PMDs should, therefore, prioritize the heads-up
display of information essential for safety and control, such as speed and battery life, to mitigate the
cognitive friction associated with looking down at a mobile screen. This approach allows riders to
maintain a continuous, forward-facing visual field, directly addressing a core challenge of
micromobility safety.</p>
      </sec>
    </sec>
    <sec id="sec-6">
      <title>6. Conclusion</title>
      <p>Building upon these insights and addressing the inherent limitations of the current study (e.g.,
sample size, first-time users, simulated environment), several promising avenues for future research
may emerge. These include conducting longitudinal studies to investigate the effects of prolonged
use and extensive training with the AR HUD, and expanding participant cohorts to include
individuals with varying levels of PMD experience and diverse demographics for more
comprehensive insights.</p>
      <p>Future work could also involve real-world trials in authentic, uncontrolled urban environments,
alongside the integration of advanced data collection methods, such as eye-tracking technology, to
precisely measure visual attention shifts, gaze patterns, and cognitive tunneling with and without
the HUD. Furthermore, exploring the development of adaptive HUD designs that can intelligently
adjust information density, presentation modality, and visual saliency based on real-time factors like
rider proficiency or environmental complexity may be beneficial. Finally, engaging proactively with
urban planners, policymakers, and PMD manufacturers could advocate for the development of
harmonized regulatory frameworks that support the safe and effective integration of advanced
safety-enhancing technologies like AR HUDs into micromobility ecosystems.</p>
      <sec id="sec-6-1">
        <title>Declaration on Generative AI</title>
        <p>During the preparation of this work, the authors used Grammarly for grammar and spelling
checking, paraphrasing, and rewording, and Google Gemini 2.5 Pro to improve writing style and
to simulate peer review. After using these tools and services, the authors reviewed and edited the
content as needed and take full responsibility for the publication’s content.</p>
        <p>[1] National Transportation Safety Board, Micromobility: Data Challenges Associated with
Assessing the Prevalence and Risk of Electric Scooter and Electric Bicycle Fatalities and Injuries,
Safety Research Report SRR-22-01, National Transportation Safety Board, Washington, DC, 2022.</p>
        <p>[2] National Transport Commission, Driver Distraction: A Review of Scientific Literature,
National Transport Commission, Melbourne, VIC, 2022.</p>
        <p>[3] B. Qawasmeh, J.-S. Oh, V. Kwigizile, Micro-Mobility Safety Assessment: Analyzing Factors
Influencing the Micro-Mobility Injuries in Michigan by Mining Crash Reports, Future
Transportation 4 (2024) 1580-1601. doi:10.3390/futuretransp4040076.</p>
        <p>[4] A. Pashkevich, T. Burghardt, S. Pulawska-Obiedowska, M. Sucha, Visual Attention and Speeds
of Pedestrians, Cyclists, and Electric Scooter Riders when Using Shared Road: A Field Eye Tracker
Experiment, Case Studies on Transport Policy 10 (2022). doi:10.1016/j.cstp.2022.01.015.</p>
        <p>[5] D. L. Strayer, J. Turrill, J. M. Cooper, J. R. Coleman, N. Medeiros-Ward, F. Biondi, Assessing
Cognitive Distraction in the Automobile, Human Factors 57 (2015) 1300-1324.
doi:10.1177/0018720815575149.</p>
        <p>[8] Renfrew Group Design, Digilens Motorcycle Helmet HUD. (2024). URL:
https://renfrewgroup.com/portfolio/digilens-motorcycle-helmet-hud/.</p>
      </sec>
      <sec id="sec-6-2">
        <title>A. Supplementary Graphs</title>
        <p>Figure A.1: Mean NASA-TLX weighted scores by condition
Figure A.2: Mean Sampled Speed and Margin by condition</p>
      </sec>
    </sec>
    <sec id="sec-7">
      <title>B. Supplementary Tables and Charts</title>
      <sec id="sec-7-1">
        <title>Table B.1: Paired t-test results by metric</title>
        <p>Columns: T-statistic, P-value, Significant (p&lt;0.05). Rows: Mental Demand, Physical Demand,
Temporal Demand, Performance, Effort, Frustration, Weighted Score, Requested Speed,
Sampled Speed (km/h), Margin (km/h), Timestamp.</p>
      </sec>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>[6] M. Winkler, M. Soleimani, A Review of Augmented Reality Heads Up Display in Vehicles: Effectiveness, Application, and Safety, International Journal of Human-Computer Interaction (2025) 1-16. doi:10.1080/10447318.2024.2443252.</mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [7] CrossHelmet, CrossHelmet X1 Features: HUD, Noise Cancellation &amp; Much More. (2024). URL: https://www.crosshelmet.com/features/.
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [9]
          <string-name>
            <given-names>M.</given-names>
            <surname>Taha</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.M.S.</given-names>
            <surname>Ahmed</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Aziz</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.A.</given-names>
            <surname>Bari</surname>
          </string-name>
          , Smart HUD Helmet,
          <source>International Journal of Emerging Technologies and Innovative Research (JETIR) 8</source>
          (
          <year>2021</year>
          )
          <fpage>a947</fpage>
          -
          <lpage>a953</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>[10] Rinf.tech, AR-HUD Development Challenges, Approaches, and Benefits. (2024). URL: https://www.rinf.tech/ar-hud-development-challenges-approaches-and-benefits%E2%80%8B/.</mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [11]
          <string-name>
            <given-names>A.</given-names>
            <surname>Hazarika</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Rahmati</surname>
          </string-name>
          ,
          <article-title>Towards an Evolved Immersive Experience: Exploring 5G- and Beyond-Enabled Ultra-Low-Latency Communications for Augmented and Virtual Reality</article-title>
          ,
          <source>Sensors</source>
          <volume>23</volume>
          (
          <year>2023</year>
          )
          <fpage>3682</fpage>
          . doi: 10.3390/s23073682.
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
[12] DiSTI,
          <article-title>Augmented Reality HUD</article-title>
          (
          <year>2021</year>
          ), URL: content/uploads/2021/05/AugmentedRealityHUD-DiSTI-English.pdf.
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [13]
          <string-name>
            <given-names>M.</given-names>
            <surname>Billinghurst</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Clark</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G.</given-names>
            <surname>Lee</surname>
          </string-name>
          ,
          <article-title>A Survey of Augmented Reality</article-title>
          ,
          <source>Foundations and Trends® in Human-Computer Interaction</source>
          , Vol.
          <volume>8</volume>
          , No.
          <issue>2-3</issue>
          , pp.
          <fpage>73</fpage>
          -
          <lpage>272</lpage>
          . doi: 10.1561/1100000049.
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [14]
          <string-name>
            <given-names>Z. G.</given-names>
            <surname>Gursoy</surname>
          </string-name>
          ,
          <string-name>
            <given-names>U.</given-names>
            <surname>Yilmaz</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Celik</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Arpinar-Avsar</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Kirazci</surname>
          </string-name>
          ,
          <article-title>Effect of Individualized Cognitive and Postural Task Difficulty Levels on Postural Control during Dual Task Condition</article-title>
          ,
          <source>Gait &amp; Posture</source>
          <volume>96</volume>
          (
          <year>2022</year>
          )
          <fpage>1</fpage>
          -
          <lpage>8</lpage>
          . doi: 10.1016/j.gaitpost.2022.05.001.
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [15]
          <string-name>
            <given-names>G.</given-names>
            <surname>Andersson</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Hagman</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Talianzadeh</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Svedberg</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H. C.</given-names>
            <surname>Larsen</surname>
          </string-name>
          ,
          <article-title>Effect of Cognitive Load on Postural Control</article-title>
          ,
          <source>Brain Research Bulletin</source>
          <volume>58</volume>
          (
          <year>2002</year>
          )
          <fpage>135</fpage>
          -
          <lpage>139</lpage>
          . doi: 10.1016/s0361-9230(02)00770-0.
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [16]
          <string-name>
            <given-names>S.</given-names>
            <surname>Blomqvist</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Seipel</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Engström</surname>
          </string-name>
          ,
          <article-title>Using Augmented Reality Technology for Balance Training in the Older Adults: a Feasibility Pilot Study</article-title>
          ,
          <source>BMC Geriatrics</source>
          <volume>21</volume>
          (
          <year>2021</year>
          )
          <fpage>144</fpage>
          . doi: 10.1186/s12877-021-02061-9.
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [17]
          <string-name>
            <given-names>A.</given-names>
            <surname>Etaat</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Haghbin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Kersten-Oertel</surname>
          </string-name>
          ,
          <article-title>An Online Balance Training Application using Pose Estimation and Augmented Reality</article-title>
          ,
          in:
          <source>Proceedings of the 8th International Conference on Information and Communication Technologies for Ageing Well and e-Health</source>
          , SciTePress (
          <year>2022</year>
          ), pp.
          <fpage>168</fpage>
          -
          <lpage>176</lpage>
          . doi: 10.5220/0010973800003188.
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [18]
[18] European Commission,
          <source>Road safety thematic report - Personal Mobility Devices</source>
          (
          <year>2021</year>
          ). URL: https://road-safety.transport.ec.europa.eu/document/download/6f698f20-f43f-4066-816ad1db217b3bc4_en?filename=road_safety_thematic_report_personal_mobility_devices_tc_final.pdf.
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>