<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Simulation of Cosmonaut Rescue Using Virtual Reality</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Mikhail Mikhaylyuk</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Andrey Maltsev</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Evgeny Strashnov</string-name>
          <email>strashnov_evg@mail.ru</email>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>36-1 Nakhimovskiy Ave.</institution>
          ,
          <addr-line>Moscow, 117218</addr-line>
          ,
          <country country="RU">Russia</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Federal State Institution «Scientific Research Institute for System Analysis of the Russian Academy of Sciences»</institution>
        </aff>
      </contrib-group>
      <abstract>
<p>This paper presents original solutions for creating a training complex that teaches cosmonauts to control a space jet pack for self-rescue in an emergency. An approach is proposed in which training is carried out in a virtual environment using virtual reality gloves and a headset. The idea is that the virtual space jet pack model is controlled through the interaction of virtual hands, which copy the movements of the cosmonaut's hands, with a three-dimensional model of the jet pack's control panel. To implement the training complex, methods and approaches were developed for synchronizing the movements of the virtual and real hands, as well as for simulating the jet pack's control panel and thrusters. The proposed methods and approaches were tested as part of our virtual environment system VirSim, developed at the SRISA RAS. The results obtained in this paper can be used to create a training complex that teaches cosmonauts to rescue themselves if they accidentally separate from the International Space Station.</p>
        <p>Keywords: virtual reality, simulation, stereo visualization, training complex, space jet pack, virtual control</p>
      </abstract>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
<p>Currently, cosmonauts periodically need to carry out spacewalks and perform operations on the
surface of the International Space Station (ISS). Special safety tethers are actively used for such
operations; with their help the cosmonaut attaches himself to the spacecraft. However, there is a risk
of unsuccessful attachment or a malfunction of the tether, as a result of which the cosmonaut detaches
from the spacecraft and cannot return. In such an emergency, a cosmonaut rescue system is needed
(for example, SAFER, the Simplified Aid For EVA Rescue [1, 2]), which is a space jet pack worn over
the spacesuit. The pack includes a set of thrusters that create thrust by releasing compressed air from
a tank, allowing the cosmonaut to translate and rotate in weightlessness. A special control panel is
used to operate the space jet pack in manual or automatic mode. The problem is to teach cosmonauts
such control skills while training in terrestrial conditions. Therefore, an important and topical area of
research is the creation of training systems that teach cosmonauts to use rescue equipment after
separation from the ISS.</p>
<p>In recent years, advanced virtual reality (VR) technologies have been actively introduced into various
spheres of human life. In particular, space simulators [3, 4, 5] have been created that train cosmonauts
by immersing them in a virtual environment. The advantage of applying VR technology is that it
improves the operator's visual perception of the computer-synthesized environment. Many scientific
groups are engaged in the development of such space simulators. Motion tracking systems for parts of
a person's body (head, torso, legs and hands) were considered in publication [3] using the example of
a cosmonaut performing various operations during a spacewalk. Paper [4] presents a simulator for
controlling a manned spacecraft with joysticks as part of the planned Moon exploration mission.
Hardware solutions [5] based on VR technologies and created in the NASA laboratory are also widely
used; they include training cosmonauts to control a space jet pack using a prototype of the real control
panel.</p>
      <p>2021 Copyright for this paper by its authors.</p>
<p>This paper presents new solutions for implementing a training complex that teaches cosmonauts to
control a space jet pack. They are based on VR technologies and our virtual environment system. The
main scientific idea is that the jet pack model is controlled through the interaction of a real cosmonaut
with the elements of a virtual three-dimensional control panel. For this, the Oculus Rift CV1 virtual
reality headset, Oculus Touch controllers and Manus Prime II VR gloves are used as hardware. These
devices provide visual feedback from the virtual environment and track the motions of the operator's
head and hands. The suggested solutions include methods and approaches for the control and dynamics
simulation of virtual hands, elements of the virtual control panel, and a virtual model of a cosmonaut
with a space jet pack. The proposed methods and approaches were tested as part of our virtual
environment system (VES) VirSim and proved effective for learning to control the space jet pack.</p>
    </sec>
    <sec id="sec-2">
      <title>2. Hardware and software of training complex</title>
<p>The developed training complex for cosmonaut rescue consists of three major groups of components:
a hardware block, a software complex and digital visual models (DVM). Figure 1 illustrates the
proposed structure of this complex. The hardware block solves the problem of tracking the operator's
head, hands and fingers, as well as transmitting images of the virtual environment to his eyes. The full
tracking system for hands and fingers is implemented by an original combination of two independent
devices: Manus Prime II VR gloves and Oculus Touch controllers (Figure 2). The latter are necessary
to determine hand positions in real space, since the Manus gloves lack this functionality. Data received
from the sensors of the Oculus Rift CV1 headset are used to track the current orientation of the head.
Furthermore, the headset is needed to display the rendered image of the virtual scene.</p>
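      <p>As an illustration of this combination of devices, the following sketch merges a wrist pose from the controller with finger joint angles from the glove into a single hand state. All names and the data format are hypothetical, not taken from the VirSim or vendor APIs:</p>

```python
def fuse_hand_tracking(controller_pose, glove_finger_angles):
    """Combine wrist tracking (controller) with finger tracking (glove).

    controller_pose: (position, orientation) of the wrist in real space,
                     as reported by the hand-held controller.
    glove_finger_angles: mapping finger name -> flexion angle in radians,
                         as reported by the VR glove.
    Returns a single hand-state dictionary (hypothetical format).
    """
    position, orientation = controller_pose
    return {
        "position": tuple(position),
        "orientation": tuple(orientation),
        "fingers": dict(glove_finger_angles),
    }
```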
      <p>(Figure 1 shows the structure of the complex: the hardware block (Manus Prime II VR gloves, Oculus Touch controllers, Oculus Rift CV1 VR headset), the software complex (control, dynamics and visualization subsystems) and the DVM (virtual scene (environment) and virtual model of cosmonaut with jet pack).)</p>
      <sec id="sec-2-12">
        <title>Software complex and digital visual models</title>
<p>The software complex consists of control, dynamics and visualization subsystems. They are original
products designed from the ground up without third-party software. The control subsystem receives
information from the hardware block, as well as from virtual control elements located in the DVM. The
acquired data are used to compute control schemes [6] for the dynamic elements of the DVM and to
generate control signals on this basis, which are transmitted to the dynamics subsystem. Based on these
signals, the dynamics subsystem computes new positions and orientations of the moving objects in the
virtual scene, after which it executes collision detection and response between objects in the scene.
The visualization subsystem performs distributed rendering of the virtual environment on a multi-core
graphics processor (GPU), computing realistic illumination of its objects. The result of the rendering
is a stereo pair, which is sent to the Rift VR headset. All components of the software complex operate
in real time.</p>
<p>The DVM of the training complex includes two parts. The first is a highly detailed virtual scene of
the cosmonaut's environment (models of the ISS and the Earth) containing about a million polygons.
The second is a virtual model of a cosmonaut with a jet pack created by us. In addition to geometry and
textures, this model contains a functional scheme that controls its head rotation and hand and finger
movements based on signals from the hardware block, as well as the operation of the rescue device's
thrusters based on signals from the toggle switches and joystick located on its virtual control panel.
Next, we consider in more detail the methods and approaches for simulating the dynamics and control
of the virtual model of a cosmonaut with a space jet pack.</p>
      </sec>
    </sec>
    <sec id="sec-3">
      <title>3. Simulation of virtual hand</title>
<p>Self-rescue training for a cosmonaut with a jet pack consists in practicing operations, most of which
are performed with his hands.</p>
<p>Therefore, to solve the assigned task, we created a virtual model of the hand, shown in Figure 3. It
was designed in the 3ds Max computer modeling system. The compressions and stretches of the upper
fabric layers in this model, resulting from the bending of the fingers, are set by applying a special skin
modifier to the polygonal geometry. For this, the developed model contains a set of bones; changing
their positions and orientations affects the vertices of the geometry with certain weights.</p>
<p>In addition to the bones and the visual model of the virtual hand, the developed VES also uses its
dynamic model. It is represented as a system of articulated bodies described by a set of generalized
coordinates q = (q1, …, qN)ᵀ, where N is the number of degrees of freedom. These coordinates define
the position and orientation of the hand, as well as the joint angles of the fingers. The task of hand
dynamics simulation is to ensure its movement in zero gravity and in the presence of collisions with
virtual environment objects. In general, collision response includes modeling contact, impact, and
friction between interacting bodies, which requires handling M constraints. The equations of motion
then take the form
A(q)·q̈ = Q + Jᵀλ, (1)
where A(q) ∈ ℝ^(N×N) is the joint-space inertia matrix, Q ∈ ℝ^N is the vector
of applied forces (torques applied at the hand's joints), J ∈ ℝ^(M×N) is the constraint Jacobian matrix,
and λ ∈ ℝ^M is the vector of Lagrange multipliers.</p>
      <p>Simulation of the hand dynamics according to formula (1) is implemented by means of an extended
version of the recursive Featherstone method, in which the constraint processing and the computation
of the vector λ are based on the impulse approach [8], and the equations of motion are integrated using
the semi-implicit Euler scheme.</p>
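      <p>A minimal sketch of the semi-implicit Euler update mentioned above, simplified (for illustration only) to a diagonal inertia; the actual method uses the recursive Featherstone algorithm with impulse-based constraint handling:</p>

```python
import numpy as np

def semi_implicit_euler_step(q, dq, Q, inv_inertia, h):
    """One semi-implicit Euler step for generalized coordinates.

    q, dq       : generalized positions and velocities, shape (N,)
    Q           : total generalized force, including constraint impulses
    inv_inertia : inverse inertia, simplified here to a diagonal, shape (N,)
    h           : time step
    Velocities are updated first; positions then use the NEW velocities,
    which is what distinguishes the scheme from explicit Euler.
    """
    dq_new = dq + h * inv_inertia * Q
    q_new = q + h * dq_new
    return q_new, dq_new
```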
      <p>Virtual hand control is realized in copying mode by means of the Manus VR glove and Oculus Touch
controller, which are put on the human hand. The hand position and orientation, as well as the joint
angles of the fingers, define the target vector q_d of generalized coordinates. The proposed solution for
virtual hand control is to form the torque Qe as
Qe = −Kp·(q − q_d) − Kd·(q̇ − q̇_d), (2)
where Kp and Kd are diagonal matrices with positive coefficients.</p>
      <p>Control law (2) realizes the movement of the virtual hand in copying mode.</p>
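      <p>The PD control law (2) can be sketched directly; this is a generic implementation of the formula, not code from the VirSim system:</p>

```python
import numpy as np

def copying_control_torque(q, dq, q_d, dq_d, Kp, Kd):
    """PD control law of the form Qe = -Kp (q - q_d) - Kd (dq - dq_d).

    q, dq     : current generalized coordinates and velocities of the virtual hand
    q_d, dq_d : target values taken from the glove/controller tracking
    Kp, Kd    : diagonal gain matrices with positive coefficients
    """
    return -Kp @ (q - q_d) - Kd @ (dq - dq_d)
```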
    </sec>
    <sec id="sec-4">
      <title>4. Simulation of virtual control panel for space jet pack</title>
<p>In this paper, an approach is proposed in which the virtual space jet pack model is controlled using
a virtual three-dimensional control panel, whose prototype is the hand controller module of the SAFER
rescue system [1, 2]. The created model of this panel is shown in Figure 4. It contains a display, two
status indicators, three toggle switches, one button and a four-axis joystick with a button. The display
shows the remaining fuel and battery level in percent. The control panel status indicators consist of
two LEDs. The red LED labeled “THR” lights when the thrusters are active, and the green LED labeled
“AAH” lights when automatic attitude hold is enabled. The two-position toggle switch labeled “PWR”
turns the control panel on and off. The two-position toggle switch labeled “MODE” selects the motion
mode, where “ROT” and “TRAN” correspond to rotational and translational motion commands,
respectively. The button labeled “RTRN” initiates the mode of automatic return of the cosmonaut to a
predetermined position. The joystick has three rotary axes and one transverse axis, which set the
movements and rotations of the cosmonaut model. For example, deflecting the joystick along the Z axis
when the “ROT” mode is on sets rotation around the roll axis. Automatic attitude hold for the cosmonaut
is switched on and off by means of the joystick's button.</p>
<p>The operator interacts with the developed control panel using the created hand model. The
controllable elements of the panel have a set of dynamic objects and sensors that compute the state of
these elements. For toggle switches, rotation angles are determined, and for buttons, their
displacements. The state of the joystick is determined by an offset along the X axis and three successive
rotations around the Z, Y, and X axes. The coordinate values of the virtual panel's elements are
processed in the control subsystem of our training complex and form commands to turn the thrusters
on and off.</p>
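      <p>A possible shape of such a mapping from panel state to a motion command is sketched below. The function name, the command format and the dead-zone threshold are all illustrative assumptions, not details from the paper:</p>

```python
def joystick_to_command(mode, axis_deflections, dead_zone=0.1):
    """Hypothetical mapping from joystick state to a motion command.

    mode             : "ROT" or "TRAN", the position of the MODE toggle switch
    axis_deflections : mapping axis name -> normalized deflection in [-1, 1]
    dead_zone        : deflections below this threshold are ignored
    Returns (kind, axis, direction) for the first deflected axis, or None.
    """
    for axis, value in axis_deflections.items():
        if abs(value) >= dead_zone:
            kind = "rotate" if mode == "ROT" else "translate"
            return (kind, axis, 1 if value > 0 else -1)
    return None
```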
    </sec>
    <sec id="sec-5">
      <title>5. Motion simulation of cosmonaut with jet pack</title>
<p>The motion of the cosmonaut in Earth orbit is described relative to the ISS. Consider the world
coordinate system (WCS) OSxyz and the local coordinate system (LCS) OCxyz, which is rigidly
attached to the cosmonaut, as shown in Figure 5. The cosmonaut's position is then defined by the
radius-vector r = (x, y, z)ᵀ, and his attitude by the transformation matrix R from the cosmonaut's LCS
to the WCS. Cosmonaut motion control is realized by 24 thrusters with equal thrust f, located on six
sides of the jet pack symmetrically relative to the cosmonaut's common center of mass.</p>
      <p>Each motion of the cosmonaut corresponds to its own set of thrusters, which create the total thrust
force F = (Fx, Fy, Fz)ᵀ and torque τ = (τx, τy, τz)ᵀ relative to the cosmonaut's LCS, where
Fx, Fy, Fz ∈ {−F, 0, F}, τx ∈ {−Tx, 0, Tx}, τy ∈ {−Ty, 0, Ty}, τz ∈ {−Tz, 0, Tz}. Herewith F = 4f,
Tx = 4f·Lx, Ty = 4f·Ly, Tz = 4f·Lz, where Lx, Ly and Lz are the arms of the thrusters relative to the
cosmonaut's center of mass.</p>
      <p>In the absence of gravity and neglecting the influence of the ISS on the cosmonaut's motion, the
dynamics of his linear motion is described by a differential equation in the form of Newton's second
law:
m·v̇ = R·F, (3)
where m is the total mass of the cosmonaut, spacesuit and jet pack, and v = (vx, vy, vz)ᵀ is the linear
velocity of the cosmonaut in the WCS.</p>
      <p>The dynamics of the cosmonaut's rotational motion is described by the Euler differential equations:
Ix·ω̇x − (Iy − Iz)·ωy·ωz = τx;
Iy·ω̇y − (Iz − Ix)·ωx·ωz = τy;
Iz·ω̇z − (Ix − Iy)·ωx·ωy = τz, (4)
where Ix, Iy and Iz are the principal moments of inertia, and ω = (ωx, ωy, ωz)ᵀ is the angular velocity
of the cosmonaut in his LCS.</p>
      <p>In this way, the dynamics of the linear and rotational motion of the cosmonaut is described by
equations (3) and (4), in which the components of the vectors F and τ are the control variables.</p>
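      <p>Equations (3) and (4) can be advanced numerically as sketched below. The semi-implicit Euler stepping and the first-order attitude update are illustrative choices; the paper does not specify the integrator used for the cosmonaut model:</p>

```python
import numpy as np

def step_free_flyer(r, v, R, w, F, tau, m, I, h):
    """One integration step for Eqs. (3) and (4) (illustrative sketch).

    r, v   : position and linear velocity in the WCS
    R      : rotation matrix from the cosmonaut LCS to the WCS
    w      : angular velocity in the LCS
    F, tau : thrust force and torque in the LCS
    m      : total mass; I = (Ix, Iy, Iz) principal moments of inertia
    """
    # Eq. (3): m * dv/dt = R F  (thrust rotated into the world frame)
    v = v + h * (R @ F) / m
    r = r + h * v
    # Eq. (4): Ix*dwx/dt - (Iy - Iz)*wy*wz = tau_x, and cyclic permutations
    Ix, Iy, Iz = I
    dw = np.array([
        (tau[0] + (Iy - Iz) * w[1] * w[2]) / Ix,
        (tau[1] + (Iz - Ix) * w[0] * w[2]) / Iy,
        (tau[2] + (Ix - Iy) * w[0] * w[1]) / Iz,
    ])
    w = w + h * dw
    # First-order attitude update: dR/dt = R * skew(w) for a body-frame w
    skew = np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])
    R = R + h * (R @ skew)
    return r, v, R, w
```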
<p>Cosmonaut motion control is realized by the control panel in manual and automatic modes.
Deflections of the joystick in manual mode set commands to activate thrusters, while automatic mode
provides attitude stabilization and movement of the cosmonaut to a predetermined position. For the
synthesis of control actions in the VES, an approach is proposed in which readings of virtual sensors
are used that measure the position, angular velocity and orientation of the cosmonaut model relative to
the ISS model. Attitude stabilization of the cosmonaut model about the x axis of its LCS is provided
by relay control with feedback from the angular velocity sensor:
τx = −Tx·sgn(ωx) if |ωx| ≥ ε, and τx = 0 otherwise. (5)</p>
      <p>Control law (5) includes a dead zone for |ωx| below ε, which reduces the number of switchings in
the neighborhood of ωx = 0. The attitude stabilization controls for the virtual cosmonaut model about
the y and z axes are obtained similarly.</p>
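      <p>The relay control with dead zone of Eq. (5) is simple to express in code; this is a generic sketch of the formula for one axis:</p>

```python
import numpy as np

def relay_attitude_torque(w_axis, T_max, eps):
    """Relay attitude-stabilization control with a dead zone (Eq. (5) sketch).

    w_axis : measured angular velocity about one LCS axis
    T_max  : torque magnitude the thruster set can produce about this axis
    eps    : dead-zone threshold that suppresses switching near w = 0
    """
    if abs(w_axis) >= eps:
        return -T_max * float(np.sign(w_axis))
    return 0.0
```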
<p>In turn, the proposed solution for the automatic return of the cosmonaut model is to maintain a
velocity toward the target. Let the target position of the model be specified by the vector
r_d = (x_d, y_d, z_d)ᵀ. The target motion velocity of the cosmonaut model is then given as
v_d = v_r·(r_d − r)/‖r_d − r‖ if ‖r_d − r‖ ≥ l, and v_d = 0 otherwise, (6)
where v_r is the specified value of the cosmonaut velocity modulus and l is the specified distance.</p>
      <p>Linear velocity (6) is provided by the relay control
F = F·sgn(e_v), (7)
where e_v = R⁻¹(v_d − v) is the residual vector in the LCS of the cosmonaut model and sgn is applied
componentwise.</p>
    </sec>
    <sec id="sec-6">
      <title>6. Simulation results</title>
<p>The methods and approaches proposed in this paper were implemented in our prototype of a training
complex for cosmonaut rescue. To control the virtual cosmonaut model, a block diagram scheme was
created. Its structure is shown in Figure 6.</p>
      <p>The training complex's hardware is connected through blocks that interface with the hand and head
tracking devices, receive data from them, and send the target coordinate values (position and
orientation of the operator's head and hands, as well as flexion angles of the fingers) to the scheme.
On the basis of the obtained coordinates and virtual sensor readings (the feedback in this scheme), the
joint torques of the virtual hand model are computed according to Eq. (2). The vector α of head rotation
angles is used to compute the voltages U by means of PD controllers, which are then sent to the electric
motors of the virtual camera. During the simulation, the virtual hand interacts with the virtual control
panel, changing the states of its elements (button presses, joystick rotations), which are combined into
a vector s. Further, this vector is involved in the computation of the commands t to turn the thrusters
on and off, where ti ∈ {0, 1}, i = 1, …, 24. In the automatic mode of cosmonaut model motion control,
virtual sensor readings of its position and angular velocity are applied in accordance with control laws
(5) and (7).</p>
<p>The emergency simulation in the developed training complex was carried out for the example in
which the cosmonaut has separated from the ISS with non-zero separation velocities. In this scenario,
the trained cosmonaut must power on the virtual control panel and select automatic attitude hold mode
by pressing the joystick's button. Once the rotation stops, the cosmonaut needs to change his attitude
so that the ISS is in the center of his field of view. To do this, it is necessary to switch the “MODE”
control to the “ROT” position and rotate in attitude control mode. As soon as the cosmonaut visually
finds the ISS, he must activate translational motion (switch the “MODE” control to “TRAN”) and, by
deflecting the joystick, move toward the station. Meanwhile, the training assumes that the fuel reserves
and battery level of the control panel are limited. During training, the cosmonaut can enable the
automatic return mode by pressing the “RTRN” button, in which he moves to the nearest predetermined
position on the ISS. Figure 7 shows operator training by immersion in the virtual environment using
the virtual reality headset and VR gloves. Using his hands, whose movements are copied by the virtual
ones, the operator controls the cosmonaut model by interacting with the virtual panel elements.</p>
    </sec>
    <sec id="sec-7">
      <title>7. Acknowledgements</title>
      <p>The publication is made within the state task of Federal State Institution “Scientific Research
Institute for System Analysis of the Russian Academy of Sciences” on “Carrying out basic scientific
researches (47 GP)” on topic No. FNEF-2021-0012 “Virtual environment systems: technologies,
methods and algorithms of mathematical modeling and visualization. 0580-2021-0012” (Reg. No.
121031300061-2).</p>
    </sec>
    <sec id="sec-8">
      <title>8. References</title>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>[1] <string-name><given-names>R. K.</given-names> <surname>Fullerton</surname></string-name>, <article-title>EVA Tools and Equipment Reference Book</article-title>, NASA Johnson Space Center, <source>JSC20466 Rev. B</source>, Nov. 20, <year>1993</year>.</mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>P. M.</given-names>
            <surname>Handley</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. K.</given-names>
            <surname>Robinson</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K. R.</given-names>
            <surname>Duda</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Z.</given-names>
            <surname>Prasov</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. P.</given-names>
            <surname>York</surname>
          </string-name>
          , and
          <string-name>
            <given-names>J. J.</given-names>
            <surname>West</surname>
          </string-name>
          ,
          <article-title>Real-time performance metrics for SAFER self-rescue</article-title>
          ,
          <source>in: 45th International Conference on Environmental Systems</source>
          ,
          <year>2015</year>
          , Bellevue, Washington, pp.
          <fpage>1</fpage>
          -
          <lpage>14</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>[3] <string-name><given-names>Yuqing</given-names> <surname>Liu</surname></string-name>, Shanguang Chen, Guohua Jiang, Xiuqing Zhu, <string-name><given-names>Ming</given-names> <surname>An</surname></string-name>, Xuewen Chen, Bohe Zhou, and Yubin Xu, <article-title>VR simulation system for EVA astronaut training</article-title>, in: <source>Proceedings of AIAA Space 2010 Conference &amp; Exposition</source>, Anaheim, California, <year>2010</year>. doi:10.2514/6.2010-8696.</mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>M. B.</given-names>
            <surname>Bruguera</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V.</given-names>
            <surname>Ilk</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Ruber</surname>
          </string-name>
          , and
          <string-name>
            <given-names>R.</given-names>
            <surname>Ewald</surname>
          </string-name>
          ,
          <article-title>Use of virtual reality for astronaut training in future space missions - spacecraft piloting for the Lunar Orbital Platform - Gateway (LOP-G)</article-title>
          ,
          <source>in: 70th International Astronautics Congress</source>
          , Washington D.C.,
          <year>2019</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>[5] <string-name><given-names>A. D.</given-names> <surname>Garcia</surname></string-name>, <string-name><given-names>J.</given-names> <surname>Schlueter</surname></string-name>, and <string-name><given-names>E.</given-names> <surname>Paddock</surname></string-name>, <article-title>Training astronauts using hardware-in-the-loop simulations and virtual reality</article-title>, in: <source>AIAA SciTech Forum</source>, Orlando, FL, <year>2020</year>. doi:10.2514/6.2020-0167.</mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>M. V.</given-names>
            <surname>Mikhaylyuk</surname>
          </string-name>
          , and
          <string-name>
            <given-names>M.A.</given-names>
            <surname>Torgashev</surname>
          </string-name>
          ,
          <article-title>The visual editor and calculation module of block diagrams for simulation and training complexes</article-title>
          ,
          <source>Software &amp; Systems</source>
          ,
          <volume>4</volume>
          (
          <year>2014</year>
          ):
          <fpage>10</fpage>
          -
          <lpage>15</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>A. A.</given-names>
            <surname>Shabana</surname>
          </string-name>
          , Computational dynamics,
          <source>3rd edition</source>
          , John Wiley &amp; Sons Ltd,
          <year>2010</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>[8] <string-name><given-names>M. V.</given-names> <surname>Mikhaylyuk</surname></string-name>, <string-name><given-names>E. V.</given-names> <surname>Strashnov</surname></string-name>, and <string-name><given-names>P. Yu.</given-names> <surname>Timokhin</surname></string-name>, <article-title>Algorithms of multibody dynamics simulation using articulated-body method</article-title>, <source>Mathematica Montisnigri</source>, <volume>39</volume> (<year>2017</year>): <fpage>133</fpage>-<lpage>145</lpage>.</mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>