Towards an augmented reality approach to build use case diagrams

Ana Rita Rebelo1, João Araújo1, João Costa Seco1 and Rui Nóbrega1
1 NOVA LINCS, NOVA School of Science and Technology, Caparica, Portugal


Abstract
Augmented Reality (AR) and Virtual Reality (VR) environments offer users unique cognitive and behavioural benefits. AR and VR can improve the user experience by providing more immersive and engaging environments, enhancing communication, collaboration, and the visualisation of complex systems. Motivated by these opportunities, we propose an AR-based approach for constructing use case diagrams, a well-known diagram used to gather system requirements. The proposed solution employs tangible objects with AR markers to create a 3D representation of the diagram, enabling a more natural spatial organisation of its elements to facilitate interpretation. To evaluate this approach, we conducted a user study that demonstrated the intuitiveness and engagement of the AR application. Users also highlighted the potential of this solution in educational contexts such as classrooms or group work, as it allows for spatial organisation of the diagram, fosters more immersive meetings, and reduces communication gaps among stakeholders. Overall, this study contributes to a better understanding of the benefits of AR and VR technologies in requirements engineering, showcasing their potential to advance the field.

Keywords
Augmented reality, Virtual reality, Requirements engineering, Use case diagram




1. Introduction
Augmented Reality (AR) is a technology that overlays digital information, such as images,
videos, or data, onto the real world through devices like smartphones or AR glasses. On the
other hand, Virtual Reality (VR) is a more immersive technology that engrosses users in a
computer-generated environment, isolating them from the physical world and providing a
sensory-rich simulation of reality through VR headsets.
   AR and VR applications have become increasingly sought after in recent years. Different
industry sectors have adopted these applications as cutting-edge solutions for developing
more immersive and efficient training environments in clinical settings [1], collaborative systems
between workers [2], industrial digital twins [3], or motor rehabilitation scenarios for
patients [4]. Software Engineering, too, has benefited from the unique qualities of AR
and VR to educate students and young engineers [5, 6, 7, 8, 9], enhance the visualisation and

ER2023: Companion Proceedings of the 42nd International Conference on Conceptual Modeling: ER Forum, 7th SCME,
Project Exhibitions, Posters and Demos, and Doctoral Consortium, November 06-09, 2023, Lisbon, Portugal
ar.rebelo@campus.fct.unl.pt (A. R. Rebelo); joao.araujo@fct.unl.pt (J. Araújo); joao.seco@fct.unl.pt (J. Costa Seco); rui.nobrega@gmail.com (R. Nóbrega)
ORCID: 0000-0001-5049-4086 (A. R. Rebelo); 0000-0001-5914-1631 (J. Araújo); 0000-0002-2840-3966 (J. Costa Seco); 0000-0002-3620-7279 (R. Nóbrega)
© 2023 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).
CEUR Workshop Proceedings (CEUR-WS.org), ISSN 1613-0073.
understanding of software systems in three-dimensional (3D) virtual environments [10, 11, 12]
and improve communication and collaboration between workers [13, 14, 15, 16].
   Traditional representation of software systems has always been accomplished through 2D
approaches, such as text, charts or graphs. With the expansion of AR and VR technologies,
new approaches have emerged to represent systems in 3D environments. Although 2D model
representations are generally adequate, studies have highlighted cognitive and behavioural
advantages for users of AR and VR environments. Namely, some studies suggest that representing
information in 3D makes it easier to identify and recall structures compared to 2D representations
[17, 18, 12, 19, 20], and that virtual experiences leverage affordances of human perception, such as
spatial memory and motor manipulation, to better understand 3D visualisations [21].
   Just as in other domains, employing the 3D format within the context of Requirements
Engineering (RE) offers its own set of benefits. For example, it can be used for engaging
in RE learning and training [22, 23], easing the task of collecting requirements from special
needs populations [24], or increasing empathy during the requirement elicitation phase by
immersing stakeholders into a virtual world to simulate personas and environmental conditions
[25, 13, 14, 24]. The integration of AR and VR technologies holds the potential to reshape
fundamental facets of requirements elicitation, communication, and modeling. Exploring the
contributions of AR/VR applications within the RE domain underscores their potential not only
to address long-standing challenges but also to foster the evolution of the field.
   Aiming to capitalise on these contributions and further advance the effectiveness of require-
ments engineering processes, we developed an AR approach for creating use case diagrams,
a behavioural UML diagram commonly employed in the requirement elicitation phase of a
new system. We propose a 3D environment to visualise and decompose the diagram for better
interpretation and also to allow collaboration between multiple users working on the same
diagram. We do not expect this approach to replace the traditional 2D format, which already
allows a clear representation of actors, use cases and their relationships. With this proposal, we
are particularly interested in exploring the potential of 3D graph representations.
   The benefits of this solution include the possibility of spatially organising and decomposing
more complex diagrams along three dimensions, providing a novel means to comprehend the
relationships between actors and use cases. For instance, consider a complex software system
involving numerous actors and intricate interactions. In a 3D environment, the diagram can be
spatially organised, allowing stakeholders to explore these interactions from multiple angles,
thereby enhancing their understanding of the system’s dynamics. Furthermore, this possibility
can foster discussions in a collaborative environment where each participant could drag and
drop the diagram elements as if they were physical objects, rather than digital images on a 2D
plane.
   As a proof of concept prototype, we are primarily interested in whether users adapt and
can easily create the diagram in the suggested 3D environment. To this end, we conducted a
user study with 10 computer engineering students familiar with use case diagrams so that they
could compare the traditional 2D format and our 3D proposal. The results showed that the
participants could interact intuitively with the application and found value in using it. They
specifically mentioned its potential usefulness in classroom and group work settings.
   This paper is organised as follows. Section 2 reviews related work on the advantages of using
AR/VR technology in RE tasks. Section 3 presents the proposed AR approach, including the
system design and implementation details (Section 3.2). Section 4 describes the user study
conducted and the results obtained. In Section 5, we discuss the results, followed by the
conclusions and future work in Section 6.


2. Related work
Within the RE life-cycle, the elicitation process stands out as a pivotal step. Any errors or
incompleteness in requirements can significantly impact project success and cost [26]. Eliciting
requirements is particularly challenging, requiring collaboration among individuals from diverse
backgrounds towards a common goal. Here, the integration of cutting-edge technologies like AR
and VR holds substantial promise in revolutionising how requirements are gathered, understood,
and refined. These immersive technologies provide a dynamic environment that can potentially
address the longstanding challenges of requirement elicitation, communication, and modeling.
This section explores the fundamental contributions that AR/VR applications offer to the realm
of RE and examines the pertinent literature that has explored these contributions.
   One of the foremost advantages of incorporating AR/VR in RE is the automation of pro-
cesses. Stakeholders can immerse themselves in a virtual environment, replicating real-world
scenarios and role-playing actions just as they would in reality [14, 16]. Consequently, business
process models can be generated automatically based on these actions, eliminating the need
for stakeholders to familiarise themselves with modeling grammar. This approach not only re-
duces errors during requirements elicitation and specification but also accelerates the modeling
process, minimising the time stakeholders need to invest in constructing a process view.
   Still related to the possibility of immersing in a virtual world, AR/VR applications enable
the embodiment of stakeholder perspectives through 3D avatars based on user persona docu-
ments. This strategy immerses stakeholders in a first-person view, facilitating comprehensive
understanding and validation of requirements within a simulated real-world environment
[25, 13, 14, 24]. Moreover, simulating various environmental factors and user conditions, such
as accessibility requirements, proves invaluable in the holistic gathering, analysis, and validation
of requirements.
   Furthermore, AR and VR provide an interactive platform for enhancing communication
during RE meetings, mitigating challenges often linked with stakeholder collaboration [13,
14, 15, 16]. Integrating AR/VR technologies with Machine Learning (ML) and Deep Learning
(DL) mechanisms further enables the automated classification of requirements, assisting in
categorising requirements discussed in meetings and interviews with stakeholders and users.
   Additionally, the integration of VR offers a distinct advantage in creating, manipulating, and
visualising models and diagrams within a native 3D space [23]. This spatial experience allows
requirements engineers to engage with models at a deeper level, facilitating better abstractions
and accommodating layered complexity that is challenging to achieve in conventional 2D spaces.
   The educational potential of AR and VR applications extends to providing immersive platforms
for students and software engineers to learn and practice building models and diagrams within a
3D environment. These applications enhance content engagement and foster retention through
hands-on learning experiences [22, 23].


3. Create use case diagrams in Augmented Reality
Building upon the insights garnered from the literature review, we propose an AR-based
approach for developing use case diagrams. This solution has significant potential for enhancing
the creation of models and diagrams, facilitating communication, and supporting training and
learning in RE activities. AR allows for the creation and manipulation of 3D models, simplifying
the interpretation of intricate diagrams that pose challenges when reproduced on a 2D surface.
Additionally, the use of AR/VR applications provides a comprehensive learning experience,
improving content engagement and retention.
   We chose the use case diagram as a case study because it is a well-known diagram for
requirements gathering. We chose AR instead of VR to create a collaborative environment that
is easily accessible in classrooms or remotely; participants can join using just their smartphones.
Figure 1 shows an overview of a use case diagram created with our application.
   Our user study aims to investigate the usability and effectiveness of creating use case diagrams
with AR, a relatively new approach. We will gather participant feedback on satisfaction, ease of
use, engagement, and perceived usefulness of the AR technology. By examining these factors,
the study can provide valuable insights to improve the development outcomes of RE.

3.1. Feature selection
We conducted a brainstorming session with two Software Engineering experts to determine
the functionalities for the first prototype. The session focused on identifying limitations of
current 2D editors and finding ways to address them in our solution. The following features
were selected:




Figure 1: An AR mobile app to draw use case diagrams. The integration of tangible objects allows users
to decompose and arrange the diagram in the 3D space. (Figure annotation: tangible object tagged
with an AR marker.)
    • Create use case diagrams in 3D space: The user interface allows users to add actors,
      use cases, and relationships, and arrange them in a 3D space in a way that is considered
      most appropriate to facilitate diagram interpretation.
    • Highlight requirements: Highlight elements of the diagram that satisfy a particular
      requirement of the system (e.g. usability, security).
    • Decompose the diagram by actors: AR markers are used to represent each actor in
      the diagram, making it possible to decompose the diagram to analyse each actor or only
      a subset of actors.
    • Collaboration: Support collaboration among multiple users on the same diagram (in person
      or remotely) using their mobile devices. Face-to-face collaboration, in particular, fosters
      interactive communication as all participants can be around the diagram and manipulate
      it more naturally than on a computer screen.
    • Export the 2D diagram: Export the diagram into a traditional format so it can be
      archived along with the rest of the system documentation.

   The system workflow is shown in Figure 2. In-person users can manipulate AR markers to
arrange diagram elements. While remote users cannot interact with physical markers, they can
interact with virtual elements by adding, moving, or editing them. While creating the diagram,
the system simultaneously updates an XML file, which can be later exported in a 2D format.
For the proof of concept, we utilised three markers attached to objects with simple geometric
shapes, allowing one-handed manipulation. The user study focused on a bank ATM system
with three actors, as shown in Figure 1.




Figure 2: System workflow.
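   To make this workflow concrete, the sketch below illustrates, under our own assumptions rather
than the authors' actual Unity/C# implementation, how the diagram state described above (actors,
use cases, relationships, and requirement tags) could be kept in memory, filtered by requirement
category or decomposed by actor, and serialised to the XML file that backs the 2D export. All class,
attribute, and function names are hypothetical.

    # Minimal Python sketch of a use case diagram model (hypothetical names).
    from dataclasses import dataclass, field
    import xml.etree.ElementTree as ET

    @dataclass
    class Element:
        id: str
        kind: str                                        # "actor" or "usecase"
        label: str
        requirements: set = field(default_factory=set)   # e.g. {"authentication"}

    @dataclass
    class Relationship:
        source: str                                      # id of the source element
        target: str                                      # id of the target element
        kind: str = "association"                        # or "include" / "extend"

    @dataclass
    class Diagram:
        elements: dict = field(default_factory=dict)     # id -> Element
        relationships: list = field(default_factory=list)

        def highlight(self, category):
            # Elements satisfying a given requirement category (cf. Figures 3c and 3d).
            return [e for e in self.elements.values() if category in e.requirements]

        def for_actor(self, actor_id):
            # Decomposition by actor: the actor plus every element linked to it.
            linked = {r.target for r in self.relationships if r.source == actor_id}
            linked |= {r.source for r in self.relationships if r.target == actor_id}
            return [self.elements[actor_id]] + [self.elements[i] for i in linked]

        def to_xml(self):
            # The 2D export described in the text is assumed to read this XML.
            root = ET.Element("useCaseDiagram")
            for e in self.elements.values():
                ET.SubElement(root, e.kind, id=e.id, label=e.label,
                              requirements=",".join(sorted(e.requirements)))
            for r in self.relationships:
                ET.SubElement(root, "relationship",
                              source=r.source, target=r.target, kind=r.kind)
            return ET.tostring(root, encoding="unicode")

    # Example: the "Customer" actor linked to the "Login" use case of the ATM scenario.
    d = Diagram()
    d.elements["a1"] = Element("a1", "actor", "Customer")
    d.elements["u1"] = Element("u1", "usecase", "Login", {"authentication"})
    d.relationships.append(Relationship("a1", "u1"))
    print(d.to_xml())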
3.2. Implementation details
The application was developed using Unity1 , a cross-platform game engine that supports
intuitive tools for interactive 3D content and a straightforward integration with the Android
SDK. The AR tracking relies on the Vuforia SDK, which uses computer vision technology to
track markers. Briefly, the Vuforia library2 uses Natural Feature Tracking (NFT) algorithms to
detect feature key points and determine the scale of the marker in real-time.
   To enable collaboration between multiple devices, we used the Mirror Networking API3 that
allows all changes made to the diagram to be automatically updated on the server side and
reflected on the client side (i.e. on users’ devices).
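   Mirror Networking itself is a C# library for Unity, so the following is not its API; it is only a
language-agnostic sketch, with hypothetical names, of the server-authoritative pattern just described:
every edit is applied to a single shared copy of the diagram on the server, and the accepted change is
then pushed to all connected clients so that their local replicas stay consistent.

    # Conceptual sketch of server-authoritative synchronisation (hypothetical names).
    class DiagramServer:
        def __init__(self):
            self.state = {}              # element id -> properties (authoritative copy)
            self.clients = []            # connected client replicas

        def connect(self, client):
            self.clients.append(client)
            client.replica = dict(self.state)            # initial full synchronisation

        def apply(self, op, element_id, properties=None):
            # op is "add", "edit"/"move" or "delete"; applied centrally, then broadcast.
            if op == "delete":
                self.state.pop(element_id, None)
            else:
                self.state[element_id] = properties
            for c in self.clients:
                c.on_update(op, element_id, properties)

    class DiagramClient:
        def __init__(self):
            self.replica = {}

        def on_update(self, op, element_id, properties):
            # Mirror the change locally so every device shows the same diagram.
            if op == "delete":
                self.replica.pop(element_id, None)
            else:
                self.replica[element_id] = properties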

3.3. User interface
Figure 3 shows some screenshots of the developed prototype. The option to export the diagram
to 2D format appears in the top right corner, generating a PDF document that will be saved on
the device. The button below allows the users to choose the category of requirements they want
to highlight in the diagram. Figures 3c and 3d illustrate the example of highlighting diagram
elements related to authentication requirements. The buttons in the bottom left corner allow the
user to add actors and use cases to the diagram. If any element is selected, the options to edit,
delete or add a relationship appear in the bottom right corner. To add a relationship between
two elements, the users must select the elements they want to link. It is also possible to choose
the type of relationship, as shown in Figure 3b. Users can also manipulate tangible objects
tagged with markers. As shown in Figure 3a, each marker belongs to an actor. In this way, it
is possible to decompose the diagram, working separately with each actor. Furthermore, in a
collaborative scenario, each user can work on a single marker and join all the parts afterwards.


4. User study
We conducted a user study with two main objectives: evaluating usability and gathering
participant feedback. We aimed to understand participants’ perspectives on our AR approach, the
most valuable features, potential enhancements, advantageous scenarios, and benefits compared
to traditional 2D versions. Students and professors with computer science backgrounds were
recruited for this study. All participants were volunteers and had at least basic knowledge of
use case diagrams, so they could compare our approach with the conventional
one. This section presents the design and results of our user study.

4.1. Experimental design
The study was conducted in a controlled laboratory environment under the same conditions
for all participants. Each participant began the session by reading an informed consent form
explaining the experiment’s context, data anonymity, and the right to withdraw at any time.

1
  Unity: https://unity.com/ - Last accessed 15/10/2023
2
  Vuforia: https://library.vuforia.com/articles/Training/vuforia-fusion-article.html - Last accessed 15/10/2023
3
  Mirror Networking: https://mirror-networking.com/ - Last accessed 15/10/2023
(a) The AR markers were glued to tangible objects and each marker represents an actor. (b) Window
to edit the relationship type between two diagram elements. (c) Window to select the category of
requirements the user wants to highlight in the diagram. (d) Part of the diagram with the
authentication-related elements highlighted.

Figure 3: Four screenshots of the AR mobile application.


Participants were then asked to perform the following tasks to build part of the case study
scenario in Figure 1. The order of the tasks was the same for all participants, following the
logical order of drawing a use case diagram: 1) Add the ”Customer” actor; 2) Add the use case
”Login”; 3) Add the relationship between ”Customer” and ”Login”; 4) Add the use case ”Confirm
password”; 5) Add the ”include” relationship between ”Login” and ”Confirm password”; 6)
Add the use case ”Check balance”; 7) Edit the use case ”Check balance” to ”Deposit funds”; 8)
Delete the use case ”Deposit funds”; 9) Highlight the diagram elements related to authentication
requirements.
   After completing each task, participants were asked to rate their level of agreement with the
statement ”I found this task easy/intuitive to accomplish” on a scale of 1 (strongly disagree) to 7
(strongly agree). After completing the nine tasks, participants were allowed to freely explore
the app and interact with the markers. Once this phase was completed, participants filled in the
System Usability Scale (SUS)4 questionnaire and answered the following open-ended questions:
1) What was the feature you appreciated most about this product? 2) What was the feature you
liked least about this product? 3) What features would you add/change to improve this app and
add benefits that the traditional 2D version does not have? 4) What benefits/potential do you
see this AR approach having compared to the traditional desktop (2D) approach?.
   Finally, participants completed a characterisation questionnaire providing demographic

4
    System Usability Scale (SUS): https://www.usability.gov/how-to-and-tools/methods/system-usability-scale.html -
    Last accessed 15/10/2023
information including age, education level, experience with AR applications, and use case
diagrams.

4.2. Results and analysis
This section presents the results of the data analysis. Firstly, we show the demographic
information of the participants. Next, we present the quantitative results concerning the 7-
point Likert question and the SUS questionnaire. Finally, qualitative results are described,
including the answers to the open-ended questions and comments made during the experiment.
Descriptive statistics were used to analyse the task ratings and the SUS questionnaire, while
qualitative content analysis was used to analyse the open-ended questions.

Demographics A total of 10 participants (9 male, 1 female) aged 22-40 (M = 26.20, SD = 6.41)
participated in the experiment. All participants had a computer science background: 5 bachelor’s,
4 master’s, 1 PhD. Regarding their experience with use case diagrams, 1 participant was
familiar with this diagram and used it from time to time at work; the remaining 9 participants
indicated that although they do not often use this type of diagram, they were familiar enough
to successfully perform the proposed tasks. Regarding the experience with AR applications, 1
participant claimed to have never tried AR before, 4 participants have tried it a few times, 3
participants use it once in a while, and 2 use it regularly (monthly, weekly or daily).

Likert ranking Table 1 presents the users’ ratings on the statement ”I found this task intuitive”
on a 7-point Likert scale (1=”Strongly Disagree” and 7=”Strongly Agree”). The median value of
the rating and the respective quartiles are presented for each task. In general, we can observe
that users found the tasks intuitive since the ratings given were all higher than 4.


Table 1
Results on the ratings obtained to the statement ”I found this task easy”, on a 7-point Likert scale.
                                     Task                               Median     Q1     Q3
          1. Add the ”Customer” actor.                                    7.00     7.00   7.00
          2. Add the use case ”Login”.                                    7.00     6.25   7.00
          3. Add the relationship between ”Customer” and ”Login”.         7.00     6.00   7.00
          4. Add the use case ”Confirm password”.                         7.00     7.00   7.00
          5. Add the ”include” relationship between ”Login”                6.00     6.00   7.00
          and ”Confirm password”.
          6. Add the use case ”Check balance”.                            7.00     7.00   7.00
          7. Edit the use case ”Check balance” to ”Deposit funds”.        7.00     7.00   7.00
          8. Delete the use case ”Deposit funds”.                         7.00     7.00   7.00
          9. Highlight the diagram elements related to                    6.50     6.00   7.00
          authentication requirements.
SUS Table 2 presents the results obtained with the SUS questionnaire, filled out by the
participants at the end of the experiment. Each statement was rated on a scale from 1 to 7
(1=”Strongly Disagree” and 7=”Strongly Agree”). The final score was calculated following
the guidelines of the questionnaire author [27]. SUS scores range from 0 to 100 and provide
insights about usability performance in effectiveness, efficiency, and overall ease of use. The
result of 86.33 gives the system category A (”Excellent”) in usability. Looking at each question
individually, we can see that participants found the system intuitive and easy to use.
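   For reference, SUS was administered here on a 7-point scale rather than the usual 5-point one.
Assuming the standard scoring rule of [27] adapted accordingly (odd, positively worded items
contribute the response minus 1; even, negatively worded items contribute 7 minus the response;
the sum is rescaled to 0-100), the score of a single participant would be computed as in the sketch
below; this adaptation is our reading, not a formula stated by the authors.

    def sus_score(responses):
        # responses: the ten SUS ratings of one participant, each between 1 and 7.
        assert len(responses) == 10
        total = 0
        for i, r in enumerate(responses, start=1):
            total += (r - 1) if i % 2 == 1 else (7 - r)
        return total * 100 / 60          # 10 items x 6 points maximum = 60

    # Illustration with the median ratings from Table 2; this need not reproduce the
    # reported 86.33, which averages the scores of individual participants.
    print(round(sus_score([5, 1, 6, 1, 6.5, 1, 6, 1, 6, 1]), 2))   # 90.83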

4.3. Qualitative results
According to the open-ended responses to the first question, the participants appreciated several
features of the AR application. The ability to physically manipulate the diagram was highlighted
as a positive feature, providing a more engaging and dynamic workspace. Many participants also
mentioned the ease and simplicity of creating new elements and features, as well as the ability
to quickly move actors on and off screen to concentrate on one area of interest at a time. The
collaborative feature was also often mentioned as positive, with participants noting the ability
to work with multiple users using tactile pieces. The use of AR for spatial organisation was also
seen as beneficial to make the experience more dynamic than the traditional 2D version. Overall,
the participants appreciated the creativity and interactivity of the AR-based environment.
   Regarding the second question, one of the main concerns expressed by participants was the
interface design, which some found to be rudimentary and static. Some suggested that labelling
relationships could be more intuitive by, for example, allowing users to click directly on the line


Table 2
SUS results.
                      System Usability Scale (SUS)                      Median     Q1     Q3
       1. I think that I would like to use this system frequently.       5.00      4.00   5.75
       2. I found the system unnecessarily complex.                      1.00      1.00   2.00
       3. I thought the system was easy to use.                          6.00      6.00   7.00
       4. I think that I would need the support of a technical           1.00      1.00   1.00
       person to be able to use this system.
       5. I found the various functions in this system were              6.50      5.25   7.00
       well integrated.
       6. I thought there was too much inconsistency in this system.     1.00      1.00   1.75
       7. I would imagine that most people would learn to use this       6.00      6.00   7.00
       system very quickly.
       8. I found the system very cumbersome to use.                     1.00      1.00   2.75
       9. I felt very confident using the system.                        6.00      5.25   6.00
       10. I needed to learn a lot of things before I could get going    1.00      1.00   2.00
       with this system.
                                 SUS Score                                       86.33
connecting two elements to change the label automatically. Another minor issue mentioned was
overlapping elements. Some participants did not immediately realise how to move the objects
after inserting them. Although they were able to figure out how to do it within a few seconds,
they suggested adding a restriction that new elements should appear only in free spaces not
already occupied by other elements.
   About the third question, participants provided valuable insights and suggestions on how
to improve the app and make it stand out from the traditional 2D version. The most common
suggestions were to add more feedback, such as using vibration or sound, and add more
dynamic features like automatically assigning relationships or having use cases attract each
other based on proximity. Additionally, some suggested adding a menu of possible items to
add or prompting to name an element as it’s created to streamline the process. Almost all the
participants highlighted this approach’s potential in real-time collaboration and the ability to
edit collaboratively with the team anywhere, anytime.
   In response to the last question about the benefits and potential of the AR approach compared
to the traditional 2D approach, participants highlighted several advantages. One of the main
benefits is the ability to quickly isolate parts of the diagram being worked on, such as quickly
focusing on one actor’s relationships. Additionally, the AR approach allows for collaborative
work with multiple people in real-time, making communication, planning and brainstorming
easier and more engaging. Participants also mentioned that AR could be used to teach concepts
in classrooms or create more intuitive interactions in real-world settings. The AR approach
has the potential to make pre-planning with diagrams less tedious, especially in small teams
working in the same room. Finally, participants noted that AR has the potential for more
engaging experiences; when the diagram is grounded by a marker, AR can allow for intuitive
collaborative interactions, such as handing someone a diagram just by giving them the marker.


5. Discussion
The user study contributed to evaluating the usability of our AR-based method and gathering
insights from participants regarding the effectiveness, user perceptions, and inherent advantages
of our AR-driven approach in the realm of use case diagram creation.
   The user study results showed that the AR approach for creating use case diagrams was well
received by participants. The Likert ratings for the tasks were consistently above 4 on a 7-point scale,
indicating that the AR approach is easy to use and understand, even for those unfamiliar with AR
applications and/or use case diagram design. The SUS questionnaire results further supported
the positive feedback from the Likert ranking, with a score of 86.33 out of 100, indicating that
the AR approach is an effective method for creating use case diagrams.
   The qualitative results of the study provided valuable insights into the participants’ opinions
on the AR approach. The participants appreciated the physical manipulation of the diagram,
the simplicity of creating new elements, the collaborative aspect of the application, and the use
of AR for spatial organisation. Participants also provided suggestions for improving the app,
such as adding more feedback and dynamic features and enhancing the interface design, which
shows a genuine interest in making this model design process more interactive.
   Regarding the benefits and potential of the AR approach compared to the traditional 2D
approach, participants noted several advantages, such as the ability to quickly isolate parts of
the diagram being worked on, collaborative work with multiple people in real-time, and more
engaging experiences. These advantages could make pre-planning with diagrams more attractive
and less tedious, especially in small teams working in the same physical space.
  Overall, the user study results demonstrate that the AR approach is an effective and intuitive
method for creating use case diagrams. The positive feedback from the participants highlights
the potential of AR technology in enhancing the usability and engagement of diagramming tools,
and the study’s findings can inform future research on the use of AR in software engineering.


6. Conclusions and future work
In this paper, we begin by reviewing the contributions that AR/VR technologies can provide
to RE activities. Building upon the benefits identified during the literature review,
we introduce an AR-driven approach for creating use case diagrams. We validate our approach
through a user study, and our findings affirm the positive reception and effectiveness of our
method. As we conclude this investigation, our focus shifts towards the future, where we
identify potential avenues for additional research and development in the ever-evolving realm
of AR-enhanced software engineering.
   Inspired by the benefits of creating 3D models, we proposed our 3D approach for creating
use case diagrams to provide a more immersive and layered experience that is challenging to
replicate in the traditional 2D space. The user study results suggested that the AR approach
can be an effective method for creating use case diagrams. The participants appreciated the
physical manipulation of the diagram, the simplicity of creating new elements and features, the
collaborative aspect of the application, and the use of AR for spatial organisation. The findings
of this review and our proposed 3D approach demonstrate the potential of AR/VR to enhance
the usability of RE processes, as well as the high receptivity to this type of solution among people
with a background in computer engineering.
   Future research should focus on applying these techniques to assess their effectiveness in
practical settings, explore the challenges associated with their adoption, and evaluate their
impact on software development outcomes. Building on the results of this work, it would
be interesting to compare the completion time of software engineers using a traditional 2D
interface versus our AR approach for modeling the same diagram. Since the participants highly
praised the collaborative component of the application, it would also be interesting to further
study the usability of this solution in a collaborative environment.
   Overall, we believe that our study’s findings can contribute valuable insights to future research
on the use of AR in software engineering and can guide the development of more interactive
and engaging diagramming tools.


Acknowledgments
This work was supported by NOVA LINCS (UIDB/04516/2020) with the financial support of
FCT.IP.
References
 [1] C. Gießer, J. Knode, A. Gruenewald, T. J. Eiler, V. Schmuecker, R. Brueck, Skillslab+ -
     augmented reality enhanced medical training, in: 2021 IEEE International Conference
     on Artificial Intelligence and Virtual Reality (AIVR), 2021, pp. 194–197. doi:10.1109/
     AIVR52153.2021.00043 .
 [2] V. Pereira, T. Matos, R. Rodrigues, R. Nóbrega, J. Jacob, Extended reality framework
     for remote collaborative interactions in virtual environments, in: 2019 International
     Conference on Graphics and Interaction (ICGI), 2019, pp. 17–24. doi:10.1109/ICGI47575.
     2019.8955025 .
 [3] C. Jacob, F. Espinosa, A. Luxenburger, D. Merkel, J. Mohr, T. Schwartz, N. Gajjar, K. Rekik,
     Digital twins for distributed collaborative work in shared production, in: 2022 IEEE
     International Conference on Artificial Intelligence and Virtual Reality (AIVR), 2022, pp.
     210–212. doi:10.1109/AIVR56993.2022.00042 .
 [4] M. C. Howard, A meta-analysis and systematic literature review of virtual reality re-
     habilitation programs, Computers in Human Behavior 70 (2017) 317–327. URL: https:
     //www.sciencedirect.com/science/article/pii/S0747563217300134. doi:https://doi.org/
     10.1016/j.chb.2017.01.013 .
 [5] G. Rodríguez, A. Soria, M. Campo, Teaching scrum to software engineering students
     with virtual reality support, in: F. Cipolla-Ficarra, K. Veltman, D. Verber, M. Cipolla-
     Ficarra, F. Kammüller (Eds.), Advances in New Technologies, Interactive Interfaces and
     Communicability, Springer Berlin Heidelberg, Berlin, Heidelberg, 2012, pp. 140–150.
 [6] M. D. Nazligul, M. Yilmaz, U. Gulec, M. A. Gozcu, R. V. O’Connor, P. M. Clarke, Overcoming
     public speaking anxiety of software engineers using virtual reality exposure therapy, in:
     J. Stolfa, S. Stolfa, R. V. O’Connor, R. Messnarz (Eds.), Systems, Software and Services
     Process Improvement, Springer International Publishing, Cham, 2017, pp. 191–202.
 [7] A. Akbulut, C. Catal, B. Yıldız, On the effectiveness of virtual reality in the education of
     software engineering, Computer Applications in Engineering Education 26 (2018) 918–927.
     URL: https://onlinelibrary.wiley.com/doi/abs/10.1002/cae.21935. doi:https://doi.org/10.1002/cae.21935.
     arXiv:https://onlinelibrary.wiley.com/doi/pdf/10.1002/cae.21935.
 [8] C. Zirkelbach, A. Krause, W. Hasselbring, Hands-On: Experiencing Software Architecture
     in Virtual Reality, Research Report, Kiel University, 2019. URL: https://www.uni-kiel.de/
     journals/receive/jportal%5fjparticle%5f00000350.
 [9] D. Amaral, G. Domingues, J. P. Dias, H. S. Ferreira, A. Aguiar, R. Nóbrega, Live software
     development environment for java using virtual reality, SciTePress, 2019, pp. 37–46.
     doi:10.5220/0007699800370046 .
[10] U. Erra, G. Scanniello, Towards the visualization of software systems as 3d forests: The
     codetrees environment, in: Proceedings of the 27th Annual ACM Symposium on Applied
     Computing, SAC ’12, Association for Computing Machinery, New York, NY, USA, 2012, p.
     981–988. URL: https://doi.org/10.1145/2245276.2245467. doi:10.1145/2245276.2245467 .
[11] P. Khaloo, M. Maghoumi, E. Taranta, D. Bettner, J. Laviola, Code park: A new 3d code
     visualization tool, in: 2017 IEEE Working Conference on Software Visualization (VISSOFT),
     2017, pp. 43–53. doi:10.1109/VISSOFT.2017.10 .
[12] R. Mehra, V. S. Sharma, V. Kaulgud, S. Podder, A. P. Burden, Towards immersive com-
     prehension of software systems using augmented reality - an empirical evaluation, in:
     2020 35th IEEE/ACM International Conference on Automated Software Engineering (ASE),
     2020, pp. 1267–1269.
[13] O. Wang, B. Cheng, T. Hoang, C. Arora, X. Liu, Virtual reality enabled human-centric re-
     quirements engineering, in: 2021 36th IEEE/ACM International Conference on Automated
     Software Engineering Workshops (ASEW), 2021, pp. 159–164. doi:10.1109/ASEW52652.
     2021.00041 .
[14] J. Harman, R. Brown, D. Johnson, S. Rinderle-Ma, U. Kannengiesser, Virtual Business Role-
     Play: Leveraging Familiar Environments to Prime Stakeholder Memory During Process
     Elicitation, in: J. Zdravkovic, M. Kirikova, P. Johannesson (Eds.), Advanced Information
     Systems Engineering, volume 9097, Springer International Publishing, Cham, 2015, pp.
     166–180. doi:10.1007/978-3-319-19069-3_11, series Title: Lecture Notes in Computer
     Science.
[15] R. Hellmuth, J. Frohnmayer, Requirements Engineering for Stakeholders of Factory Conver-
     sion: LoD Visualization of a Research Factory via AR Application, Procedia Manufacturing
     45 (2020) 25–30. URL: https://linkinghub.elsevier.com/retrieve/pii/S235197892031074X.
     doi:10.1016/j.promfg.2020.04.036 .
[16] S. Panichella, M. Ruiz, Requirements-Collector: Automating Requirements Specification
     from Elicitation Sessions and User Feedback, in: 2020 IEEE 28th International Requirements
     Engineering Conference (RE), IEEE, Zurich, Switzerland, 2020, pp. 404–407. URL: https:
     //ieeexplore.ieee.org/document/9218156/. doi:10.1109/RE48521.2020.00057 .
[17] P. Irani, C. Ware, Diagrams based on structural object perception, in: Proceedings of the
     Working Conference on Advanced Visual Interfaces, AVI ’00, Association for Computing
     Machinery, New York, NY, USA, 2000, p. 61–67. URL: https://doi.org/10.1145/345513.345254.
     doi:10.1145/345513.345254 .
[18] E. Ai-Lim Lee, K. W. Wong, C. C. Fung, How does desktop virtual reality enhance learning
     outcomes? a structural equation modeling approach, Computers & Education 55 (2010)
     1424–1442. URL: https://www.sciencedirect.com/science/article/pii/S0360131510001661.
     doi:https://doi.org/10.1016/j.compedu.2010.06.006 .
[19] D. A. Vincenzi, B. Valimont, N. Macchiarella, C. Opalenik, S. N. Gangadharan, A. E.
     Majoros, The effectiveness of cognitive elaboration using augmented reality as a training
     and learning paradigm, Proceedings of the Human Factors and Ergonomics Society Annual
     Meeting 47 (2003) 2054–2058. URL: https://doi.org/10.1177/154193120304701909. doi:10.
     1177/154193120304701909 . arXiv:https://doi.org/10.1177/154193120304701909 .
[20] T. Zuo, M. V. Birk, E. D. van der Spek, J. Hu, The effect of fantasy on learning and
     recall of declarative knowledge in ar game-based learning, Entertainment Computing 46
     (2023) 100563. URL: https://www.sciencedirect.com/science/article/pii/S1875952123000186.
     doi:https://doi.org/10.1016/j.entcom.2023.100563 .
[21] A. Elliott, B. Peiris, C. Parnin, Virtual reality in software engineering: Affordances,
     applications, and challenges, in: 2015 IEEE/ACM 37th IEEE International Conference on
     Software Engineering, volume 2, 2015, pp. 547–550. doi:10.1109/ICSE.2015.191 .
[22] M. Al Zahrani, M. Fawzy, Engineering education gaming: Case study of engineering ethics
     game modeling, in: 2020 Industrial Systems Engineering Conference (ISEC), 2020, pp. 1–5.
     doi:10.1109/ISEC49495.2020.9230280 .
[23] O. Ochoa, A. Babbit, Incorporating a Virtual Reality Environment in the Teaching of
     Analysis of Software Requirements, in: 2019 IEEE Frontiers in Education Conference
     (FIE), IEEE, Covington, KY, USA, 2019, pp. 1–5. URL: https://ieeexplore.ieee.org/document/
     9028676/. doi:10.1109/FIE43999.2019.9028676 .
[24] A. Bhimani, P. Spoletini, Empowering requirements elicitation for populations with special
     needs by using virtual reality, in: Proceedings of the SouthEast Conference, ACM SE
     ’17, Association for Computing Machinery, New York, NY, USA, 2017, p. 268–270. URL:
     https://doi.org/10.1145/3077286.3078467. doi:10.1145/3077286.3078467 .
[25] A. Gregoriades, J. Hadjicosti, C. Florides, M. Pampaka, H. Michail, A driving simulator
     for discovering requirements in complex systems, in: Proceedings of the Conference
     on Summer Computer Simulation, SummerSim ’15, Society for Computer Simulation
     International, San Diego, CA, USA, 2015, p. 1–10.
[26] P. Rajagopal, R. Lee, T. Ahlswede, C.-C. Chiang, D. Karolak, A new approach for software
     requirements elicitation, in: Sixth International Conference on Software Engineering,
     Artificial Intelligence, Networking and Parallel/Distributed Computing and First ACIS
     International Workshop on Self-Assembling Wireless Network, 2005, pp. 32–42. doi:10.1109/SNPD-SAWN.2005.5.
[27] J. Brooke, et al., SUS: A quick and dirty usability scale, Usability Evaluation in Industry 189
     (1996) 4–7.