<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Digital Avatars to Immersive Cultural Narratives</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Liliana Cecere</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Simone Pierre Dembele</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Muhammad Khan</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Angelo Lorusso</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Domenico Santaniello</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Carmine Valentino</string-name>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>University of Tartu</institution>
          ,
          <addr-line>Ülikooli 18, 50090 Tartu</addr-line>
          ,
          <country country="EE">Estonia</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>University of Salerno</institution>
          ,
          <addr-line>Fisciano (SA) 84084</addr-line>
          ,
          <country country="IT">Italy</country>
        </aff>
        <aff id="aff2">
          <label>2</label>
          <institution>University of the West of England</institution>
          ,
          <addr-line>Bristol</addr-line>
          ,
          <country country="GB">UK</country>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2025</year>
      </pub-date>
      <abstract>
        <p>The metaverse is emerging as a novel domain for digital interaction, transforming accessibility and the cultural experience. Virtual museums transcend the physical constraints of conventional museums, providing immersive and dynamic experiences. This research seeks to analyse the benefits and problems presented by the metaverse by investigating spatial interactions and the influence of access devices on the user experience within museums. Virtual environments may emulate or transform actual settings, eliminating geographical constraints and leveraging digital capabilities to provide unique experiences, including animated artwork and interactive narratives. The devices employed substantially affect the experience: PCs, equipped with VR headsets and sophisticated peripherals, provide immersion and accuracy; smartphones, constrained by their small displays, excel in accessibility; tablets strike a balance between portability and performance, yet remain inferior to PCs in precision. The quality of internet connectivity is essential; mobile connections may introduce latency, undermining fluidity and collaborative activities. User engagement ranges from passive exploration to active cooperation, including artefact reconstruction. Advanced technologies facilitate intricate interactions, whereas portable devices promote simpler, more intuitive tasks. Moreover, social interactions inside the metaverse rely on non-verbal cues and interface ergonomics, underscoring the necessity of creating inclusive and flexible digital settings. In conclusion, the metaverse presents a distinctive opportunity to democratise cultural access, transcending traditional borders and fostering global interaction. Tailoring design and infrastructure may enhance accessibility and significance for everyone.</p>
      </abstract>
      <kwd-group>
        <kwd>Metaverse</kwd>
        <kwd>Cultural Heritage</kwd>
        <kwd>Human Computer Interaction</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>
        In recent years, the metaverse has emerged as a fascinating digital horizon and a transformative platform
for culture. New multimodal technologies are turning actual settings into virtual ones. Cultural heritage
(CH) uses XR technologies to attract visitors and encourage repeat visits, making heritage content more
engaging, immersive, and meaningful[
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]. Digital museum spaces overcome the physical limitations of
traditional museums and introduce dynamic, immersive, and interactive experiences. They also serve
as a powerful medium for the narration and preservation of CH, enabling a richer and more detailed
understanding of past civilisations and cultural narratives[
        <xref ref-type="bibr" rid="ref2">2</xref>
        ].
      </p>
      <p>
        User interaction in virtual museums is vital to optimise the cultural, educational, and social effects of
digital experiences. Indeed, it is the key to turning virtual museums into interactive, learning, and
innovative hubs. Museums can encourage passive exploration or active engagement in collaborative
activities like artefact reconstruction or instructional games. The type of museum, its geometry and
spatial organisation, and the device used affect the nature of these interactions: smartphones favour
intuitive and straightforward interactions, PCs allow for more complex activities due to superior
computational capabilities and systems, and tablets facilitate moderately complex interactions[
        <xref ref-type="bibr" rid="ref3 ref4">3, 4</xref>
        ].
The metaverse offers 3D avatars that match the user’s features, unlike previous digital worlds that
recognised users only by ID and username[
        <xref ref-type="bibr" rid="ref5">5</xref>
        ].
      </p>
      <p>
        Standard avatars are no longer enough in the metaverse; hence the metahuman was born. The metahuman,
a digital twin (DT), allows us to reside in virtual realms and see our motions and expressions reproduced
in real time. Thanks to machine learning algorithms, these synthetic people may mirror our moods,
sentiments, and personalities. Modern technology allows users to customise their metaverse interlocutors,
which has been demonstrated to increase satisfaction, engagement, self-esteem, and spontaneous assistance.
Social and communication dynamics are becoming increasingly complex and a subject of research and
attention. In virtual worlds, non-verbal signs, body movement, dialogue pauses, and facial expressions
affect communication with other users[
        <xref ref-type="bibr" rid="ref6 ref7">6, 7</xref>
        ].
      </p>
      <p>This study examines the multiple dimensions of spatial interactions in the metaverse, using digital museum
environments as an example. It analyses the user’s interaction with the museum space, the artworks
within it, and the connections that may form between users or groups of users. It also examines how
avatar experience (novice, expert, or museum guide) affects the activity. Finally, the effects of access
devices on user experience, virtual environment perception, interactivity, and engagement are examined.</p>
    </sec>
    <sec id="sec-2">
      <title>2. Related Work</title>
      <p>
        Recently, academic articles on the metaverse have increased due to rising interest. Academic and
non-academic scholars have studied the new virtual worlds, applying their findings to real-world
circumstances. In [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ], a Mars-themed virtual exhibition environment was constructed to enable social
interactions between participants, who navigate using avatars in a way that mimics video game
dynamics. Virtual exhibition spaces in the metaverse are being adopted due to the advancement of
cutting-edge technologies and museums’ desire to offer visitors more engaging, dynamic, and interactive
activities and experiences to enhance exhibitions. These environments inspire people to interact with
exhibitions and each other, building community and shared learning[
        <xref ref-type="bibr" rid="ref9">9</xref>
        ]. The approach the authors propose
for future museums allows visitors to virtually ’walk’ through historical sites and interact with detailed
reconstructions of significant events, exploring rare artefacts and fragile ecosystems without risk of
damage while enriching their knowledge. VR and holographic technology have also improved visitor
engagement and learning in museums and education. In one study, the authors proposed hand gesture-based
content control to increase the usability of holographic displays[
        <xref ref-type="bibr" rid="ref10">10</xref>
        ].
      </p>
      <p>
        Meanwhile, VR-based 3D modelling and museum interaction technologies have increased visitor
satisfaction and engagement. Digital technology has changed how museums engage their audiences:
online materials and digital access allow museums to transform visitor interaction and learning.
VR technology has been shown to improve users’ sense of spatial presence, which is crucial for
immersive VR applications. Avatars provide real-time social interactions between users, facilitating
involvement across geographical and physical boundaries[
        <xref ref-type="bibr" rid="ref11 ref12">11, 12</xref>
        ]. Thus, visitors can browse exhibitions,
take guided tours, and connect with users worldwide as if they were in the same spot[13].
Many have studied avatars and their metaverse interactions as a means of engaging users. Given their
prominent role, many platforms have included avatar functionality to enhance users’ virtual experiences.
Roblox employs more realistic 3D animated avatars that can vibrantly communicate emotions, and users
can choose an avatar from different degrees of realism [30]. Zepeto, a platform that promotes
self-expression and social interaction, lets users personalise their avatars with hairstyles, makeup,
accessories, clothing, and footwear. Due to this adaptability, Zepeto has enhanced user engagement and
virtual interactions[14]. These examples demonstrate how avatars are crucial to metaverse social
interactions. For convincing social interactions in a computer-mediated environment, avatars should
reflect the person’s appearance and behaviour; anthropomorphic avatars are more believable and
attractive to users.
      </p>
    </sec>
    <sec id="sec-3">
      <title>3. The interaction in metaverse</title>
      <p>As the metaverse rapidly evolves as a new immersive space, avatars are revolutionising virtual world
interaction, allowing people to relate, work, and have fun differently. Despite their conceptual similarity,
avatars and metahumans differ in their characteristics, use, and complexity. Metahumans, sophisticated
digital models created using photogrammetry, artificial intelligence, and motion capture, are a step
beyond avatars. These models combine realistic skin textures with accurately reproduced facial
expressions and body movements. Motion capture uses special suits with markers (active or passive),
sensors, and multiple cameras to record body and facial movements.</p>
      <p>After processing the movement data, specialised software creates a digital skeleton replicating the
movements. Then, the recorded movements are applied to an avatar or metahuman. Metahuman
creation and use are still rare due to their complexity and cost. Avatars’ simplicity, intuitiveness, and
accessibility have led to more metaverse use. Avatars, though simpler, allow users to choose aesthetic
aspects like body shape, clothing, accessories, and, in some cases, movements and behaviour, and
are used in community settings to improve social interaction and collaboration. Avatars can explore
environments and reinvent museum spaces in the virtual museum context. Avatars can be customised to
represent visitors and explore spaces, linger over works, participate in guided experiences, or be museum
guides. In this context, avatars can be analysed on three levels: avatar-environment, avatar-artworks,
and avatar-avatar.</p>
      <p>Avatar-museum interaction depends on the museum space, architecture, and exploration. Depending
on the desired experience, one can create paths that users must follow or leave free to move around.
Dynamic spaces like exhibition halls or interactive museum areas can be fully experienced by using
interactive screens, activating multimedia content like audio, video, or projections, or opening doors
or changing the layout. In addition, metaverse museums can highlight extraordinary architecture,
allowing avatars to ’fly’, teleport, or change perspective to see hidden details.</p>
      <p>The metaverse’s most engaging interaction is between avatars and art. Most traditional museums limit
interaction to preserve the CH on display and respect its historical and artistic value. Visual observation
is the only interaction allowed. Visitors cannot create any creative interaction with the work. Still,
they can analyse it from different angles to understand its details and meanings, whether sculptures,
paintings, or pictures. Thus, real museums prefer non-invasive viewing and prohibit touching and flash
photography to avoid damage from wear and tear, fingerprints, and external agents.
With virtual spaces, this perspective and visitors’ approach to the works on display change drastically.
Metaverse museums offer more creative possibilities for interacting with paintings and sculptures than
traditional museums due to the digital nature of the works and the lack of physical constraints. In
the metaverse, avatars allow users to explore works more closely. All metaverse three-dimensional
reproductions can be rotated, enlarged, and lifted up to see hidden details.</p>
      <p>Technology offers many possibilities for paintings: as the avatar passes by, paintings can be activated
to show animations, stories, or explanations about the work and its historical context; visitors can click
on specific points in the painting to learn more about symbolic or technical details; and even analysis
techniques can be simulated to show elements not visible to the human eye. Avatars can also change
colours or add temporary elements to show how artistic choices affected the work and its perception.
In the metaverse, ancient vases, tools, and utensils become living, interactive objects, helping us
understand their function, history, and cultural significance.</p>
      <p>Users can visualise vases or tools in historical simulations to understand their use and context.
Simulations allow avatars to hold a Neolithic axe and cut virtual wood, experiencing the ergonomics of
the object, or manipulate fragments of an object to reconstruct it, as with fragmented vases found
in archaeological excavations. The metaverse lets us see how objects looked new, before age and
damage. In these cases and in painting, the level of detail can be infinitely expanded, allowing us to
see engravings, cracks, wear, and microscopic decorations that are not visible in real life. Clicking on
a vase may reveal an animated scene of a potter or an ancient banquet. Visitors can learn about an
object’s creator, technology, and historical context. Prehistoric animal traps and pottery wheels can be
animated to show how they were used.</p>
      <p>Like in reality, avatars can dynamically interact in the metaverse, making museums places of cultural and
social exchange. These interactions can be enhanced by digital tools that foster community and active
participation. Avatars can talk about art, history, and socialising. This builds community and bonds
between visitors and enthusiasts beyond temporal and geographical barriers. Metaverse museums can
host conferences, artistic performances, and workshops where avatars interact with experts, artists, and
other visitors to build relationships. In interactive activities, avatars can collaborate and communicate
via text, audio, or digital gesture chat. To enhance the interaction, they can tilt their head to show
interest or raise a hand to ask questions.</p>
      <p>The role an avatar or group of avatars assumes in a museum space shapes the collective
experience. Guide avatars can lead visitors through exhibits and explain works and their historical contexts.
Humans or artificial intelligences can interact naturally with these avatars to tailor content to visitors.
Other avatars can play artists and use virtual spaces to exhibit their digital works, perform immersive
performances, or interact with the public to explain their creations. Directly communicating with the
public lets them explain their work, answer questions, and get real-time feedback. Guest avatars, or
museum visitors, can freely explore the spaces, interact with the art, participate in discussions, or be
guided by others.</p>
      <p>Turns of speech are one of avatar interaction’s most intriguing and complicated aspects. Turns of speech
are how conversation participants alternate speaking and listening without overlapping. It underpins
real-world and virtual interpersonal communication. This alternation follows implicit or explicit rules
that vary by context, communication method, and culture to avoid overlap. Real-world characteristics
and signs indicate a round passage. In a conversation, the speaker uses verbal and nonverbal cues like a
change in intonation, a pause, or a look to suggest that they are about to give up, while the listener
can use hand gestures or facial expressions to take the floor. These signals are easily recognisable in
real life, but their scarcity can cause significant issues in the virtual world. Technological factors and interface
characteristics change word turns in virtual spaces like the metaverse. These environments present
unique face-to-face communication challenges. One of the biggest challenges is the lack of complete
non-verbal cues. Virtual environments often lack eye contact, facial expressions, and gestures, making
it harder to manage natural speech turns, leading to awkward pauses or overlaps in a conversation.
Thus, effective speaking turn management in virtual museums promotes inclusivity, allows everyone to
express themselves, improves communication, and reduces social inertia.</p>
      <p>Artificial intelligence systems can assign speaking turns based on the order of requests, the length of
previous speeches, or the perceived importance of the contribution. In addition, shift request systems
can be added to online and video call apps so users can “raise a virtual hand” or request to speak
through the interface. Requests are displayed chronologically for fair distribution. An avatar could
also indicate its user’s desire to speak visually or audibly: raising an arm or a movement of the head
can signal to the other participants that someone wishes to intervene. In the case of large groups of
avatars, such as guided museum tours or
conferences, artificial intelligence can monitor and manage interactions by analysing who has spoken,
their speech length, and the order of requests. To make interactions more straightforward, visual
indicators like lights or icons can indicate who is speaking or queuing to intervene.</p>
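      <p>The queue-based allocation of speaking turns described above can be made concrete with a few lines of code. The following is an illustrative sketch only, not a system described in the literature: the class name, the penalty weighting, and the use of request time plus accumulated speech length as a priority are all our own assumptions. Requests are ranked chronologically, and avatars who have already spoken at length are pushed back in the queue.</p>
      <preformat>
```python
import heapq
import itertools


class TurnManager:
    """Assigns speaking turns from request order, penalising frequent past speakers.

    Hypothetical sketch: priority = request time + penalty * seconds already spoken.
    """

    def __init__(self, penalty_per_second=0.5):
        self.penalty = penalty_per_second  # weight applied to accumulated speech time
        self.spoken = {}                   # avatar -> total seconds already spoken
        self.queue = []                    # heap of (priority, tie_breaker, avatar)
        self.counter = itertools.count()   # preserves chronological order on ties

    def request_turn(self, avatar, request_time):
        # Earlier requests rank first; heavy past speakers are pushed back.
        priority = request_time + self.penalty * self.spoken.get(avatar, 0.0)
        heapq.heappush(self.queue, (priority, next(self.counter), avatar))

    def next_speaker(self, duration_estimate=0.0):
        # Pop the best-ranked avatar and record its (estimated) speech length.
        if not self.queue:
            return None
        _, _, avatar = heapq.heappop(self.queue)
        self.spoken[avatar] = self.spoken.get(avatar, 0.0) + duration_estimate
        return avatar


mgr = TurnManager()
mgr.spoken["guide"] = 120.0              # the guide has already spoken for 2 minutes
mgr.request_turn("guide", request_time=0.0)
mgr.request_turn("visitor", request_time=5.0)
print(mgr.next_speaker())                # the visitor goes first despite asking later
```
      </preformat>
      <p>Visual indicators (the lights or icons mentioned above) would then simply render the head of this queue; the same priority could equally drive an avatar’s raised-arm animation.</p>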
    </sec>
    <sec id="sec-4">
      <title>4. Conclusions</title>
      <p>In conclusion, metaverse museums have changed how art and culture are experienced, combining
technology, human interaction, and experiential design. These virtual environments can overcome
physical and geographical barriers and make CH accessible to global audiences. Users can inhabit
dynamic three-dimensional spaces, customise their avatars to express themselves, and interact with
artworks using interactive tools beyond traditional museums. Animations, narratives, and additional
content enhance the aesthetic experience and encourage active participation and collaboration in digital
works.</p>
      <p>These spaces have potential but also drawbacks. These spaces require high-speed internet, compatible
devices, and digital skills, which are not widely available. VR headsets and other immersive
technologies are expensive, which can limit access in educational or resource-poor settings. Additionally,
scalability and accessibility must be integrated throughout development. Despite this, the metaverse
offers possibilities beyond geography. Hybrid storytelling, gamification, and education can happen in
virtual museums. Users collaborate to solve historical puzzles, reconstruct artefacts, and participate in
thematic debates to create cultural meaning.</p>
      <p>Deeper learning is promoted by stimulating individual and collective creativity. The constant evolution
of virtual headsets, VR, and AR devices promises to make the experience more fluid, immersive, and
interactive. Finally, interaction in the metaverse redefines audience, art, and space, making the museum a
dynamic ecosystem. Virtual museums are more than just places to visit; they create global
communities where people and technology come together to enhance the cultural experience and create new
connections, learning, and inspiration.</p>
    </sec>
    <sec id="sec-5">
      <title>5. Declaration on Generative AI</title>
      <p>During the preparation of this work, the author(s) used GPT for formatting assistance. After
using this tool, the author(s) reviewed and edited the content as needed and take(s) full
responsibility for the publication’s content.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>Y.</given-names>
            <surname>Guo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Yuan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Li</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Guo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Fu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Z.</given-names>
            <surname>Jin</surname>
          </string-name>
          ,
          <article-title>Applications of metaverse-related technologies in the services of us urban libraries</article-title>
          ,
          <source>Library Hi Tech</source>
          (
          <year>2023</year>
          ). doi:10.1108/LHT-10-2022-0486.
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>B.</given-names>
            <surname>Kye</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Han</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E.</given-names>
            <surname>Kim</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Park</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Jo</surname>
          </string-name>
          ,
          <article-title>Educational applications of metaverse: possibilities and limitations</article-title>
          ,
          <source>Journal of Educational Evaluation for Health Professions</source>
          <volume>18</volume>
          (
          <year>2021</year>
          )
          <fpage>32</fpage>
          . doi:10.3352/jeehp.2021.18.32.
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>H.</given-names>
            <surname>Lin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Z.</given-names>
            <surname>Gan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>W.</given-names>
            <surname>Gan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Z.</given-names>
            <surname>Qi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Wang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P. S.</given-names>
            <surname>Yu</surname>
          </string-name>
          ,
          <article-title>Interaction in metaverse: A survey</article-title>
          ,
          <source>in: Proceedings of the 2023 IEEE International Conference on Big Data (BigData)</source>
          , IEEE,
          <year>2023</year>
          , pp.
          <fpage>2473</fpage>
          -
          <lpage>2482</lpage>
          . doi:10.1109/BigData59044.2023.10386876.
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>J.</given-names>
            <surname>Kim</surname>
          </string-name>
          ,
          <article-title>Advertising in the metaverse: Research agenda</article-title>
          ,
          <source>Journal of Interactive Advertising</source>
          <volume>21</volume>
          (
          <year>2021</year>
          )
          <fpage>141</fpage>
          -
          <lpage>144</lpage>
          . doi:10.1080/15252019.2021.2001273.
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>S.</given-names>
            <surname>Mystakidis</surname>
          </string-name>
          ,
          <article-title>Metaverse</article-title>
          ,
          <source>Encyclopedia</source>
          <volume>2</volume>
          (
          <year>2022</year>
          )
          <fpage>486</fpage>
          -
          <lpage>497</lpage>
          . doi:10.3390/encyclopedia2010031.
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>M.</given-names>
            <surname>Casillo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Cecere</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Colace</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Lorusso</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Santaniello</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Valentino</surname>
          </string-name>
          ,
          <article-title>Exhibition spaces in the metaverse: a novel design approach</article-title>
          ,
          <source>in: Proceedings of the 2023 8th IEEE History of Electrotechnology Conference (HISTELCON)</source>
          , IEEE,
          <year>2023</year>
          , pp.
          <fpage>116</fpage>
          -
          <lpage>119</lpage>
          . doi:10.1109/HISTELCON56357.2023.10365847.
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>J.</given-names>
            <surname>Lee</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I.</given-names>
            <surname>Yeo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Lee</surname>
          </string-name>
          ,
          <article-title>Metaverse current status and prospects: Focusing on metaverse field cases</article-title>
          ,
          <source>in: Proceedings of the 2022 IEEE/ACIS 7th International Conference on Big Data, Cloud Computing, and Data Science (BCD)</source>
          , IEEE,
          <year>2022</year>
          , pp.
          <fpage>332</fpage>
          -
          <lpage>336</lpage>
          . doi:10.1109/BCD54882.2022.9900579.
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>X.</given-names>
            <surname>Zhang</surname>
          </string-name>
          , et al.,
          <article-title>Metaverse for cultural heritages</article-title>
          ,
          <source>Electronics</source>
          <volume>11</volume>
          (
          <year>2022</year>
          )
          <fpage>3730</fpage>
          . doi:10.3390/electronics11223730.
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <given-names>C. I.</given-names>
            <surname>Nwakanma</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J. N.</given-names>
            <surname>Njoku</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.-Y.</given-names>
            <surname>Jo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.-H.</given-names>
            <surname>Lim</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.-S.</given-names>
            <surname>Kim</surname>
          </string-name>
          ,
          <article-title>’Creativia’ metaverse platform for exhibition experience</article-title>
          ,
          <source>in: Proceedings of the 2022 13th International Conference on Information and Communication Technology Convergence (ICTC)</source>
          , IEEE,
          <year>2022</year>
          . doi:10.1109/ICTC55196.2022.9952599.
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <given-names>L.</given-names>
            <surname>Cecere</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Colace</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B. B.</given-names>
            <surname>Gupta</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Lorusso</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            <surname>Messina</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Valentino</surname>
          </string-name>
          ,
          <article-title>Metaverse and museum: a case study</article-title>
          ,
          <source>Procedia Structural Integrity</source>
          <volume>64</volume>
          (
          <year>2024</year>
          )
          <fpage>2189</fpage>
          -
          <lpage>2196</lpage>
          . doi:10.1016/j.prostr.2024.09.336.
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <given-names>A.</given-names>
            <surname>Della Greca</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I.</given-names>
            <surname>Amaro</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Frugieri</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Barra</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G.</given-names>
            <surname>Tortora</surname>
          </string-name>
          ,
          <article-title>The impact of virtual scenarios on empathy: a user study on the role of empathic abilities and environmental context in emotional facial expression replication</article-title>
          ,
          <source>in: 2025 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)</source>
          , IEEE,
          <year>2025</year>
          , pp.
          <fpage>574</fpage>
          -
          <lpage>581</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>[12] M. Casillo, F. Colace, A. Lorusso, D. Santaniello, C. Valentino, Integrating physical and virtual experiences in cultural tourism: An adaptive multimodal recommender system, IEEE Access 13 (2025) 28353-28368. doi:10.1109/ACCESS.2025.3539205.</mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>[13] M. Qu, Y. Sun, Y. Feng, Digital media and VR art creation for metaverse, in: Proceedings of the 2022 2nd Asia Conference on Information Engineering (ACIE), IEEE, 2022. doi:10.1109/ACIE55485.2022.00018.</mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>[14] R. Cheng, N. Wu, S. Chen, B. Han, Will metaverse be NextG internet? Vision, hype, and reality, IEEE Network 36 (2022) 197-204. doi:10.1109/MNET.117.2200055.</mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>