                                Designing a Comparative Study to Evaluate VR
                                Body-Mounted Menu Layouts: Challenges and
                                Methodology
                                Harsanjit S Bhullar1,*,† , Nanjia Wang1 and Frank Maurer1
                                1
                                    University of Calgary, ICT Building, 856 Campus Pl NW, Calgary, AB, T2N 4V8, Canada


                                              Abstract
                                              The rise of Virtual Reality and Augmented Reality technologies has prompted the development of
                                              user interfaces that are both efficient and intuitive to navigate. This study proposes a comprehensive
                                              comparative analysis of various VR menu layouts, with the aim of identifying the most effective design
                                              that balances efficiency, ease of use, and user satisfaction. Building upon foundational work on 3D menu
                                              taxonomy and ergonomic considerations, this research aims to evaluate conventional and innovative
                                              menu layouts through user performance metrics and cognitive load assessments. The anticipated outcome
                                              is a set of design recommendations for VR UI menus that enhance the user interface design landscape in
                                              immersive environments.

                                              Keywords
                                              Virtual Reality, Extended Reality, Body Mounted UI, MR Body Mounted Menus




                                1. Introduction
                                The arrival of Virtual Reality (VR) and Augmented Reality (AR), collectively referred to as
                                Extended Reality (XR) [1], marks a significant shift in the way we interact with digital content.
                                These immersive technologies have not only transformed entertainment but have also begun to
                                reshape various industries by offering novel ways to visualize data, train personnel, and facilitate
                                remote collaboration. As VR and AR become more widespread, designing user interfaces that
                                are both efficient and intuitive to interact with is critical. These interfaces are the tools through
                                which users interact with the virtual and augmented spaces, and their design can greatly influence
                                the overall user experience.
                                   While VR and AR technologies continue to expand and integrate into various sectors, there
                                is still a gap in the analysis of user interface menu structures within these environments.
                                Existing research focuses on individual aspects of UI design and does not consider all menu
                                structures in VR and AR environments. For example, Bowman and Wingrave [2] discuss the
                                importance of evaluating 3D menu systems within VR for usability and suggest that the field
                                lacks a comprehensive understanding of how different UI types affect user interaction.

                                RealXR: Prototyping and Developing Real-World Applications for Extended Reality, June 4, 2024, Arenzano (Genoa), Italy
                                *
                                  Corresponding author.
                                †
                                  These authors contributed equally.
                                $ harsanjit.bhullar@ucalgary.ca (H. S. Bhullar); nanjia.wang1@ucalgary.ca (N. Wang); frank.maurer@ucalgary.ca
                                (F. Maurer)
                                 0000-0003-2276-512X (N. Wang); 0000-0002-0240-715X (F. Maurer)
                                            © 2024 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).




CEUR Workshop Proceedings (ceur-ws.org), ISSN 1613-0073
   These studies have shown that while users may prefer certain interaction mechanisms, such
as gaze-based controls, these can introduce errors and demand additional investigation to refine
their usability. The design of body-mounted UIs in VR needs to be validated through user evaluations;
such validation is especially crucial in VR, where traditional input devices are not visible to the user.
The goal of this comparative study proposal on body-mounted UIs is to determine which configurations
offer the best functionality, user comfort, and satisfaction. It extends the scope of prior research
by comparing traditional systems against a spectrum of new and existing body-mounted menu
designs.


2. Background and Related Work




 (a) Baseline Oculus Layout [3]          (b) Radial Menu [4]          (c) Hierarchical Drop-down [4]

Figure 1: Common Body-mounted UI Menus from Existing Studies [3, 4]


   From the standard controller-based navigation seen in platforms like Oculus (Figure 1a) to
innovative approaches like the rotating 3D hand menu, the evolution of menu interfaces in
VR/AR seeks to maximize the spatial capabilities of these environments. Radial menus
(Figure 1b) arrange options in a circular format for efficiency, while floating menus aim to reduce
cognitive strain by leveraging users’ natural spatial awareness. Hierarchical drop-down menus
(Figure 1c) provide a familiar structure adapted for VR. Each of these interfaces, as discussed in
studies by researchers [5, 4, 3, 6, 7], introduces a distinct interaction scheme with its own pros
and cons.
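To make the radial layout concrete, the sketch below shows one way to place menu items evenly on a circle and to map a pointer offset back to an item index. It is an illustrative geometry sketch only, assuming a 2D menu plane centred at the origin; the function names, the default radius, and the top-anchored start angle are our assumptions, not code from any cited toolkit.

```python
import math

def radial_positions(n_items, radius=0.15, start_angle=90.0):
    """Place n_items evenly on a circle, item 0 at start_angle (degrees),
    proceeding clockwise. Returns (x, y) offsets from the menu centre."""
    positions = []
    for i in range(n_items):
        theta = math.radians(start_angle - i * 360.0 / n_items)
        positions.append((radius * math.cos(theta), radius * math.sin(theta)))
    return positions

def hit_item(x, y, n_items, start_angle=90.0):
    """Map a 2D pointer offset from the menu centre to the nearest item index."""
    theta = math.degrees(math.atan2(y, x))
    # Clockwise angular distance from item 0's centre, wrapped to [0, 360)
    offset = (start_angle - theta) % 360.0
    sector = 360.0 / n_items
    return int((offset + sector / 2) // sector) % n_items
```

With four items, pointing straight up selects item 0, right selects item 1, down item 2, and left item 3; the half-sector shift in `hit_item` means each item "owns" the wedge centred on it, which is the usual behaviour of radial selection.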
   The challenge in VR body-mounted UI development is balancing user-centric design and
functionality. Developing intuitive and efficient body-mounted UIs is difficult, especially
when considering diverse user preferences and capabilities. Innovations like the toolkit for
automatically generating VR hierarchy tile menus [3] mark progress in this direction, but
work on adaptable authoring tools and empirical validation of UI effectiveness is still ongoing. This
research focuses on serious applications of VR such as immersive training and simulation,
architecture, and urban planning. Using VR in these fields requires interfaces that prioritize
efficiency, enabling users to access or navigate their desired functions without unnecessary
distractions. This is particularly important in professional settings, where the precision and
speed of interaction can significantly impact the outcome.
   Existing studies [8] on VR’s impact on architectural design review meetings highlight its
value in serious applications. Liu et al.’s findings show VR’s potential not just for enhancing
the design review process but also for developing a deeper understanding of architectural
projects before construction. This showcases VR’s role in professional fields, where immersive
experiences can lead to better-informed decisions and collaborative outcomes. This
comparative research study proposal specifically targets body-mounted UIs within VR environ-
ments, exploring their application in serious environments. By examining various UI layouts
and their suitability for tasks in training, planning, and design, we hope to identify design
principles that ensure these interfaces are both intuitive and effective. The goal is to contribute
to the development of VR applications that are not only advanced but also practically useful,
enhancing efficiency and user experience in professional and serious environments.
   Additional related work on body-referenced graphical menus in VR environments [9] com-
pares menu placements (spatial, arm, hand, waist), shapes (linear, radial), and selection tech-
niques (ray-casting, head, eye gaze). The study found spatial, hand, and waist menus were
significantly faster than arm menus, and eye gaze was more error-prone with higher target
re-entries compared to other selection techniques. A toolkit for automatically generating
and modifying VR menus, VRMenuDesigner [10], organizes menus and functions with object-
oriented thinking to make the system understandable and extensible. It includes tools for
quickly generating and modifying elements and several built-in menus. Depth-based 3D gesture
multi-level radial menus [11] have also been explored for virtual object manipulation, using X, Y
translations of the finger with boundary crossing for navigation between menus. An evaluation
of pie menus for system control in VR [12] compared four implementations: Pick-Ray and
Pick-Hand with 6-DoF selection, and Hand-Rotation and Stick-Rotation with 1-DoF. The study
proposed examining their influence on selection time, error rate, user experience, usability
and presence. LaViola et al.’s book "3D User Interfaces: Theory and Practice" [13] provides an
in-depth view of 3D UIs, serving as a reference for both researchers and practitioners. It covers
input/output devices, interaction techniques, UI design, and future directions like augmented
reality.


3. Challenges
The research by Bao et al. primarily focused on hierarchical tile menus [3]. It did not explore
the adaptability of these menus to various user preferences or how different body-mounted
designs could enhance accessibility for users. Wang, Hu, and Chen compared fixed and handheld
menus in VR environments, providing critical insights into performance and user preferences
[14]. However, that work limits its focus to these two types of menu interfaces without examining
a broader spectrum of body-mounted UIs. Azai et al. explored an innovative approach to menu
interaction using natural hand gestures, but the study [7] does not compare its design and
functionality against other published menu designs.
   While these studies provide insight into different body-mounted UIs, there is a need for com-
prehensive research that compares effectiveness and user satisfaction across a wider range
of body-mounted UI designs. The complex input methods (controllers, gestures, and voice
recognition), along with the various menu options that must be developed to complete this
comparative study, create several technical challenges. Since the research approach aims to
deliver a seamless and intuitive experience to users testing the various body-mounted UIs,
achieving accurate and consistent gesture recognition is essential; discrepancies in gesture
recognition can weaken the user experience, leading to frustration and disengagement.
    • Limited Scope of Previous Studies: Existing research [3, 7, 4, 5] concentrates on specific
      menu design aspects without covering the full spectrum of body-mounted UI possibilities,
      highlighting a gap in understanding and the need for broader exploration.
    • Accessibility Challenges of Body-Mounted UI Designs: The exploration of body-
      mounted UIs as a means to improve accessibility remains limited. Designing intuitive,
      universally accessible interfaces requires innovative thinking and attention to diverse
      user needs.
    • Increasing Complexity of Multi-Modal Input Integration: The integration of various
      input methods into VR menus introduces complexity in design and requires testing to
      ensure seamless operation across different interaction styles.
    • Problem of Achieving Accurate and Consistent Gesture Recognition: Consistently
      interpreting user gestures correctly is vital for a positive VR experience. Technical
      challenges arise in ensuring reliable gesture recognition across various hardware and
      software configurations.
    • The Challenge of Creating a Seamless User Experience: Crafting a VR interface
      that offers seamless and intuitive interaction requires overcoming challenges related to
      system design, responsiveness, and user engagement, necessitating user-centric
      development strategies.
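One common way to address the gesture-recognition consistency problem listed above is hysteresis thresholding: a gesture is entered only below a tight threshold and exited only above a looser one, so sensor noise near a single cutoff cannot make the recognized state flicker. The Python sketch below illustrates the idea for a pinch gesture; the class name and the threshold values (in metres between thumb and index fingertips) are our assumptions, not values taken from MRTK3 or any cited system.

```python
class PinchDetector:
    """Hysteresis-based pinch detection to reduce flicker in gesture recognition."""

    def __init__(self, press_dist=0.02, release_dist=0.04):
        self.press_dist = press_dist      # enter pinch below this distance
        self.release_dist = release_dist  # leave pinch only above this one
        self.pinching = False

    def update(self, tip_distance):
        # Two separate thresholds: noise between them cannot toggle the state.
        if not self.pinching and tip_distance < self.press_dist:
            self.pinching = True
        elif self.pinching and tip_distance > self.release_dist:
            self.pinching = False
        return self.pinching
```

For example, a fingertip distance that jitters between 0.025 m and 0.035 m stays in whatever state it was last in, whereas a single-threshold detector at 0.03 m would oscillate on every frame.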


4. Methodology
For this study, we plan to use the MRTK3 library in Unity for the development of VR menus such
as the Rotating 3D Hand Menu and the Tulip Menu, each presenting challenges as they are not
readily available within the toolkit. While MRTK3 provides a solid foundation for developing
immersive experiences, extending its functionalities to accommodate additional menu layouts
requires an in-depth understanding of its architecture and the customization of its components.
This includes the development of new input modules tailored for advanced gesture and voice
recognition techniques, which are required for this study.
    • Implementing a Wide Array of Menu Layouts: To improve on the existing research
      [7, 3, 4, 8, 15], our study will assess a variety of menu layouts including baseline, rotating
      3D hand, radial, floating, tool rack, and tulip menus to ensure a comprehensive evaluation
      of their effectiveness, efficiency, and user satisfaction in VR settings.
    • Investigating the Impact of Different Input Modalities: The study will utilize multiple
      input modalities supported by commercially available VR HMDs, such as controllers,
      hand tracking, and voice commands. We will investigate the usability of these input
      modalities when users interact with different body-mounted UIs and examine the potential
      impact of integrating multiple input modalities.
    • Ergonomic Design Considerations: To enhance accessibility with body-mounted
      designs, our study prioritizes the development of ergonomic menu designs to mitigate
      issues like the ’Gorilla arm’ syndrome [4, 15], reducing physical strain and enhancing
      interaction comfort.
    • User-Centric Evaluation Framework: We will employ a user-centric evaluation frame-
      work that includes metrics such as task completion time, error rates, ease of navigation,
      and cognitive load, measured using the NASA Task Load Index. Additionally, user ex-
      perience will be assessed through questionnaires and interviews to gather qualitative
      feedback.
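The cognitive-load metric named above, the NASA Task Load Index, combines six subscale ratings (each 0-100). A minimal Python sketch of the two standard scoring variants follows; the subscale names and the 15-pairwise-comparison weighting follow the published NASA-TLX procedure, while the function names are ours.

```python
SUBSCALES = ["mental", "physical", "temporal", "performance", "effort", "frustration"]

def raw_tlx(ratings):
    """Raw (unweighted) NASA-TLX: the mean of the six subscale ratings (0-100)."""
    return sum(ratings[s] for s in SUBSCALES) / len(SUBSCALES)

def weighted_tlx(ratings, weights):
    """Weighted NASA-TLX: each subscale's weight is the number of times it was
    chosen in the 15 pairwise comparisons, so the weights must sum to 15."""
    assert sum(weights[s] for s in SUBSCALES) == 15
    return sum(ratings[s] * weights[s] for s in SUBSCALES) / 15.0
```

For instance, with ratings {mental: 70, physical: 20, temporal: 50, performance: 40, effort: 60, frustration: 30}, the raw score is 45, and a weighting that emphasizes mental demand and effort raises the weighted score accordingly.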


5. Conclusion
This paper proposes a comparative study to address the critical gap in empirical research
concerning the evaluation of various VR body-mounted menu layouts. By conducting the
comparative analysis, our study sets out to identify the most effective design that focuses on
efficiency, ease of use, and user satisfaction. Drawing upon foundational research on 3D menu
taxonomy and ergonomic considerations [5, 4], we aim to evaluate a spectrum of conventional
and innovative menu layouts, assessed through metrics such as task completion time, error rate,
cognitive load, and user experience.
   Our study will leverage the Mixed Reality Toolkit 3 for the development and testing of these
menu layouts, applying a standardized labeling system to ensure consistent interaction focus.
The intention behind this study is not just to improve the design landscape of VR interfaces but
to do so in a manner informed by a deep understanding of user preferences and the cognitive
demands involved. As we proceed with the detailed development and user testing phases, our
goal is to derive actionable design recommendations that enhance user interface design
in VR environments. We believe that the outcomes of this study will contribute meaningful
advancements to the field, promoting interfaces that are not only technologically advanced but
also focused on the needs and preferences of users across diverse VR applications.


References
 [1] R. Skarbez, M. Smith, M. Whitton, Revisiting milgram and kishino’s reality-virtuality
     continuum, Frontiers in Virtual Reality 2 (2021). doi:10.3389/frvir.2021.647997.
 [2] D. Bowman, C. Wingrave, Design and evaluation of menu systems for immersive virtual
     environments, in: Proceedings IEEE Virtual Reality 2001, 2001, pp. 149–156. doi:10.1109/
     VR.2001.913781.
 [3] X. Bao, Y. Bian, M. Qi, R. Liu, W. Gai, J. Liu, H. Luan, C. Yang, Y. Wang, A toolkit for
     automatically generating and modifying vr hierarchy tile menus, Computer Animation and
     Virtual Worlds 35 (2024) e2208. URL: https://onlinelibrary.wiley.com/doi/abs/10.1002/cav.2208.
     doi:10.1002/cav.2208.
 [4] M. Pourmemar, C. Poullis, Visualizing and interacting with hierarchical menus in immer-
     sive augmented reality, in: Proceedings of the 17th International Conference on Virtual-
     Reality Continuum and Its Applications in Industry, VRCAI ’19, Association for Comput-
     ing Machinery, New York, NY, USA, 2019. URL: https://doi.org/10.1145/3359997.3365693.
     doi:10.1145/3359997.3365693.
 [5] R. Dachselt, A. Hübner, Three-dimensional menus: A survey and taxonomy, Comput-
     ers & Graphics 31 (2007) 53–65. URL: https://www.sciencedirect.com/science/article/pii/
     S0097849306001853. doi:10.1016/j.cag.2006.09.006.
 [6] M. N. Lystbæk, P. Rosenberg, K. Pfeuffer, J. E. Grønbæk, H. Gellersen, Gaze-hand alignment:
     Combining eye gaze and mid-air pointing for interacting with menus in augmented
     reality, Proc. ACM Hum.-Comput. Interact. 6 (2022). URL: https://doi.org/10.1145/3530886.
     doi:10.1145/3530886.
 [7] T. Azai, M. Otsuki, F. Shibata, A. Kimura, Open palm menu: A virtual menu placed in
     front of the palm, in: Proceedings of the 9th Augmented Human International Conference,
     AH ’18, Association for Computing Machinery, New York, NY, USA, 2018. URL: https:
     //doi.org/10.1145/3174910.3174929. doi:10.1145/3174910.3174929.
 [8] Y. Liu, F. Castronovo, J. Messner, R. Leicht, Evaluating the impact of virtual reality on
     design review meetings, Journal of Computing in Civil Engineering 34 (2020) 04019045.
     doi:10.1061/(ASCE)CP.1943-5487.0000856.
 [9] I. Lediaeva, J. LaViola, Evaluation of body-referenced graphical menus in virtual environ-
     ments, in: Proceedings of Graphics Interface 2020, GI 2020, Canadian Human-Computer
     Communications Society / Société canadienne du dialogue humain-machine, 2020, pp. 308
     – 316. doi:10.20380/GI2020.31.
[10] S. Hou, B. H. Thomas, X. Lu, Vrmenudesigner: A toolkit for automatically generating and
     modifying vr menus, in: 2021 IEEE International Conference on Artificial Intelligence and
     Virtual Reality (AIVR), 2021, pp. 154–159. doi:10.1109/AIVR52153.2021.00036.
[11] M. M. Davis, J. L. Gabbard, D. A. Bowman, D. Gracanin, Depth-based 3d gesture multi-level
     radial menu for virtual object manipulation, in: 2016 IEEE Virtual Reality (VR), 2016, pp.
     169–170. doi:10.1109/VR.2016.7504707.
[12] M. Mundt, T. Mathew, An evaluation of pie menus for system control in virtual reality, in:
     Proceedings of the 11th Nordic Conference on Human-Computer Interaction (NordiCHI ’20),
     2020, pp. 1–8. doi:10.1145/3419249.3420146.
[13] J. LaViola, E. Kruijff, R. McMahan, D. Bowman, I. Poupyrev, 3D User Interfaces: Theory
     and Practice, Usability, Pearson Education, 2017. URL: https://books.google.ca/books?id=
     fxWSDgAAQBAJ.
[14] Y. Wang, Y. Hu, Y. Chen, An experimental investigation of menu selection for immersive
     virtual environments: Fixed versus handheld menus, Virtual Reality (2020).
     URL: https://doi.org/10.1007/s10055-020-00464-4. doi:10.1007/s10055-020-00464-4.
[15] H. Kharoub, M. Lataifeh, N. Ahmed, 3d user interface design and usability for immersive
     vr, Applied Sciences 9 (2019). URL: https://www.mdpi.com/2076-3417/9/22/4861. doi:10.
     3390/app9224861.