<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Virtual interactive catalogue for viewing bibliographic content using Kinect 2</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Leudis Estrada González</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Yadira Ramírez Rodríguez</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Omar Correa Madrigal</string-name>
          <email>ocorrea@uci.cu</email>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>University of Informatics Sciences</institution>
          ,
          <addr-line>Road to San Antonio de los Baños, Km 2½, Torrens, La Lisa, Havana</addr-line>
          ,
          <country country="CU">Cuba</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>University of Informatics Sciences</institution>
          ,
          <addr-line>Road to San Antonio de los Baños, Km 2½, Torrens, La Lisa, Havana</addr-line>
          ,
          <country country="CU">Cuba</country>
        </aff>
        <aff id="aff2">
          <label>2</label>
          <institution>University of Informatics Sciences</institution>
          ,
          <addr-line>Road to San Antonio de los Baños, Km 2½, Torrens, La Lisa, Havana</addr-line>
          ,
          <country country="CU">Cuba</country>
        </aff>
      </contrib-group>
      <abstract>
        <p>The main objective of gesture recognition in computer science is to interpret human movements and body language through mathematical algorithms. The Human-Computer Interface Technologies developed at the University of Informatics Sciences are brought together in a virtual catalogue that allows dynamic visualization of and interaction with bibliographic contents, using the Kinect 2 gesture sensor and the Unity 3D game engine. This catalogue offers a positive experience to users who need to explore the university's bibliographic collection, integrating elements that enrich the ongoing teaching and learning process in the interest of better professional training.</p>
      </abstract>
      <kwd-group>
        <kwd>Virtual catalogue</kwd>
        <kwd>gesture recognition</kwd>
        <kwd>Kinect 2</kwd>
        <kwd>bibliographic content</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>In recent years, countless technological advances have contributed to the development of
gesture-based interfaces. The creation of devices such as MS-Kinect, Leap Motion and MyO has
marked a milestone in the proliferation of a new generation of applications with which users interact
without the need for physical contact, thanks to machine vision algorithms that interpret
gestures. This, together with the constant rise of information technologies driven by the emergence
and spread of the Internet, has made traditional information systems increasingly immersive and
interactive.</p>
      <p>The University of Informatics Sciences (UCI) constantly supports the teaching and learning
process, combining innovation with the knowledge of a wide range of professionals in order to train
increasingly well-prepared informatics engineers. To support this task, the Directorate of
Scientific and Technical Information proposes the use of virtual libraries, a necessary resource for
accessing and managing digitized information. With this resource, teachers can gain insight into the
texts and other materials they wish to research, consulting them according to their own search
criteria. At the same time, new ways of accessing content are needed to meet the new realities and
challenges facing the teaching and learning process.</p>
      <p>Hence, the virtual catalogue aims to address deficiencies in user-information interaction and in
user access to information. It should also balance the amount of content offered, so as to benefit the
user's experience without overloading their attention.
The aim of this research is to develop a digital catalogue for viewing and interacting with books,
journals, scientific papers and other bibliographic content using Kinect 2.</p>
    </sec>
    <sec id="sec-2">
      <title>2. Materials and Methods</title>
      <p>In order to formalize a proposed solution, it is essential to examine the environment in which
the research is framed and to understand the key concepts, the characteristics of the tools and the
processes that the application automates.</p>
    </sec>
    <sec id="sec-3">
      <title>2.1. Necessary Concepts</title>
    </sec>
    <sec id="sec-4">
      <title>2.2. Related Work</title>
      <p>
        For a better understanding of the solution, definitions were studied for the terms: Online Public
Access Catalogues (OPACs) [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ], Natural User Interfaces (NUI) [
        <xref ref-type="bibr" rid="ref2 ref3">2, 3</xref>
        ], Gesture-Based Interfaces [
        <xref ref-type="bibr" rid="ref4 ref5">4, 5</xref>
        ]
and Gestures [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ].
      </p>
      <p>
        The works studied [
        <xref ref-type="bibr" rid="ref5 ref6">5, 6</xref>
        ] provide a solid base from which the development of all kinds of gesture-controlled
applications, whether for educational or entertainment purposes, can begin. On this basis, and in
order to obtain the best possible results in terms of visualization, interaction with the contents and
user experience, it was decided to develop a gesture-based digital catalogue for educational use that
addresses the research problem in question.
      </p>
    </sec>
    <sec id="sec-5">
      <title>2.3. Data extraction and generation with the Kinect 2</title>
      <p>
        Extracting and controlling the data provided by the Kinect 2 required an understanding of the
device's architecture: how the software manages the hardware and how the Kinect responds to it.
Figure 1 shows a general schematic of how the application interacts with the Kinect 2. First, the
sensor is connected to a USB 3.0 port on the computer. At the user level of the operating system,
the SDK 2.0 establishes the connection to the Kinect 2 and makes it possible to obtain data from
the depth sensor, the color and infrared cameras, and the microphone array [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ].
      </p>
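      <p>The data flow described above can be illustrated with a minimal, hypothetical sketch. The real SDK 2.0 delivers body frames through its native C++/C# API; here, simulated right-hand positions stand in for sensor frames, and the window size and travel threshold are assumed values for illustration only, not taken from the project.</p>
      <preformat>
```python
from collections import deque

# Hypothetical sketch: simulate a stream of right-hand x positions (metres)
# and detect a horizontal "swipe left", the kind of gesture a catalogue
# could map to turning a page. Thresholds are illustrative assumptions.

class SwipeDetector:
    """Recognises a right-to-left hand movement over a short frame window."""

    def __init__(self, window=10, min_travel=0.35):
        self.window = deque(maxlen=window)   # recent hand x positions
        self.min_travel = min_travel         # metres the hand must travel

    def update(self, hand_x):
        """Feed one frame; return True when a swipe-left is recognised."""
        self.window.append(hand_x)
        if len(self.window) == self.window.maxlen:
            travel = self.window[0] - self.window[-1]  # positive: moved left
            if travel > self.min_travel:
                self.window.clear()          # avoid re-triggering
                return True
        return False

detector = SwipeDetector()
# Simulated frames: the hand moves steadily from x = 0.5 m to x = 0.0 m.
frames = [0.5 - 0.05 * i for i in range(11)]
events = [detector.update(x) for x in frames]
print(events.count(True))  # 1 swipe recognised
```
      </preformat>
      <p>In the real application, the per-frame joint coordinates would come from the SDK's body frames rather than a simulated list.</p>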
    </sec>
    <sec id="sec-6">
      <title>2.4. System Design</title>
      <p>
        Based on the study carried out, an architecture following the layered architectural pattern is
proposed, with components ordered within it so that the elements needed by an application of this
type are well structured. The components in each layer
communicate with those in other layers through defined interfaces or instances of classes [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ].
      </p>
      <p>The above analysis makes it possible to elaborate the details of each component so that they
provide sufficient information at implementation time. As part of the solution, all interfaces that
allow classes to communicate and collaborate with each other are determined, depending on the layer
where they are located. The layers are defined as follows:</p>
      <p>1. Presentation Layer: made up of the main elements of the system: the main application
controller (AppManager), the sound controller (SoundManager), the gesture controller
(KinectManager) and the scene controller (SceneManager).</p>
      <p>2. Contents Layer: contains the different scenarios, as well as the objects and contents
associated with them.</p>
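      <p>As an illustration of how these layers could collaborate, the following hypothetical Python sketch wires the managers together. The names AppManager, KinectManager and SceneManager come from the text above, but their responsibilities and interfaces here are assumed for illustration; the actual project is built on Unity 3D.</p>
      <preformat>
```python
# Hypothetical sketch of the layered design: presentation-layer managers
# route recognised gestures to actions on the contents layer.

class KinectManager:
    """Presentation layer: maps raw sensor events to named actions."""
    def recognise(self, raw_event):
        # Illustrative gesture-to-action table.
        return {"swipe_left": "next_page", "swipe_right": "prev_page"}.get(raw_event)

class SceneManager:
    """Presentation layer: tracks which catalogue scene is on screen."""
    def __init__(self, scenes):
        self.scenes, self.index = scenes, 0
    def next_page(self):
        self.index = min(self.index + 1, len(self.scenes) - 1)
    def prev_page(self):
        self.index = max(self.index - 1, 0)
    def current(self):
        return self.scenes[self.index]

class AppManager:
    """Main controller: receives events and dispatches them to the scenes."""
    def __init__(self, kinect, scene):
        self.kinect, self.scene = kinect, scene
    def handle(self, raw_event):
        action = self.kinect.recognise(raw_event)
        if action:
            getattr(self.scene, action)()

# Contents layer: the data the scenes display.
app = AppManager(KinectManager(), SceneManager(["cover", "books", "journals", "papers"]))
app.handle("swipe_left")
app.handle("swipe_left")
print(app.scene.current())  # journals
```
      </preformat>
      <p>Keeping the gesture table inside KinectManager means the contents layer never depends on sensor details, which is the point of communicating only through defined interfaces.</p>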
      <p>
        Following the research, an interactive virtual catalogue for the visualization of bibliographic
content with Kinect 2 was implemented. To guarantee the necessary services, the following
functionalities were implemented in the application: the Content Catalogue, the Content Management
Module and the Gesture Recognition Module. The development of these modules followed the steps
established by the agile XP methodology, respecting the scheme explained in [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ].
      </p>
    </sec>
    <sec id="sec-7">
      <title>Conclusions</title>
      <p>As a result of this work, a virtual catalogue was developed that offers a novel model for
visualizing and interacting with the contents stored in it. The application brings this content to
users not in the traditional way but in a completely new and engaging one. This is achieved
thanks to the interoperability of the technologies and tools used, which, together with the application
created, contribute to increasing the knowledge of the students and employees of our university.</p>
    </sec>
    <sec id="sec-8">
      <title>3. Acknowledgements</title>
      <p>We would like to thank all our colleagues in the Directorate of Scientific and Technical
Information of the university, as well as our closest friends and family.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <surname>Martin</surname>
            ,
            <given-names>Lynne M.</given-names>
          </string-name>
          ,
          <year>2019</year>
          .
          <article-title>Evaluating OPACs, or, OPACs are reference tools, too!</article-title>
          In:
          <source>Assessment and Accountability in Reference Work</source>
          . Routledge, pp.
          <fpage>201</fpage>
          -
          <lpage>220</lpage>
          . ISBN 9780429343926.
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <surname>Wigdor</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          and
          <string-name>
            <surname>Wixon</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <year>2011</year>
          .
          <article-title>Know your platform</article-title>
          .
          <source>Brave NUI World. S.l.: Elsevier</source>
          , pp.
          <fpage>167</fpage>
          -
          <lpage>176</lpage>
          . ISBN 9780123822314.
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <article-title>Natural User Interfaces - The University of Colima Experience</article-title>
          (n.d.).
          <source>SG Buzz</source>
          . Retrieved April 29,
          <year>2022</year>
          , from https://sg.com.mx/revista/43/interfaces-naturales-usuario-la-experiencia-la-universidad-colima
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <surname>Roccetti</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Marfia</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Semeraro</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          (
          <year>2012</year>
          ).
          <article-title>Playing into the wild: A gesture-based interface for gaming in public spaces</article-title>
          .
          <source>Journal of Visual Communication and Image Representation</source>
          ,
          <volume>23</volume>
          (
          <issue>3</issue>
          ),
          <fpage>426</fpage>
          -
          <lpage>440</lpage>
          . https://doi.org/10.1016/j.jvcir.2011.12.006.
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>O.</given-names>
            <surname>Erazo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Pico</surname>
          </string-name>
          ,
          <article-title>"Touch-free manual gesture-based user interfaces for the classroom: a literature review,"</article-title>
          <source>UTE Focus</source>
          , vol.
          <volume>5</volume>
          , no.
          <issue>4</issue>
          , pp.
          <fpage>34</fpage>
          -
          <lpage>53</lpage>
          , Dec.
          <year>2014</year>
          , doi: 10.29019/enfoqueute.v5n4.46.
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <surname>Chenyi</surname>
            ,
            <given-names>Qin</given-names>
          </string-name>
          and
          <string-name>
            <surname>Choi</surname>
            ,
            <given-names>Jongwon</given-names>
          </string-name>
          ,
          <year>2021</year>
          .
          <article-title>Human gestures and five elements with Kinect-based interactive installation in modern new media art</article-title>
          .
          <source>TECHART: Journal of Arts and Imaging Science</source>
          . Vol.
          <volume>8</volume>
          , no.
          <issue>3</issue>
          , pp.
          <fpage>15</fpage>
          -
          <lpage>19</lpage>
          . DOI 10.15323/techart.2021.8.3.15.
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <surname>Rahman</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          (
          <year>2017</year>
          ).
          <article-title>Beginning Microsoft Kinect for Windows SDK 2.0: Motion and depth sensing for natural user interfaces</article-title>
          (1st ed.). Apress.
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <surname>Pressman</surname>
            ,
            <given-names>Roger S.</given-names>
          </string-name>
          .
          <source>Software Engineering</source>
          . Connecticut,
          <year>2002</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <surname>ESCRIBANO</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          <year>2002</year>
          .
          <article-title>Introduction to Extreme Programming</article-title>
          .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>