<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Modeling Situations in Virtual Environment Systems</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Mikhail V. Mikhaylyuk</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Dmitry A. Kononov</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Dmitry M. Loginov</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Scientific Research Institute for System Analysis RAS</institution>
          ,
          <addr-line>Nakhimovsky avenue 36/1, Moscow, 117218</addr-line>
          ,
          <country country="RU">Russia</country>
        </aff>
      </contrib-group>
      <fpage>173</fpage>
      <lpage>181</lpage>
      <abstract>
<p>The paper considers a technology for modeling various situations in virtual environment systems, which are computer three-dimensional models of a real or artificial environment. The user can view these scenes directly on a computer screen or wall screen, or in stereo glasses or virtual reality glasses. He can also move inside a virtual scene and interact with its objects, while the environment itself can also change. This makes it possible to model various situations (situational modeling) in the virtual environment system. In such modeling, a static or dynamic situation is set in the virtual environment system, in which the operator must perform the tasks assigned to him. A mechanism for setting situations by changing a virtual three-dimensional scene using configuration files and virtual control panels is proposed. A special language and an editor have been developed for writing configuration files and for creating virtual control panels. The proposed methods are demonstrated on the example of two virtual scenes: a training ground for mobile robots and an astronaut's jet backpack in outer space.</p>
      </abstract>
      <kwd-group>
        <kwd>Virtual environment system</kwd>
        <kwd>situational modeling</kwd>
        <kwd>three-dimensional scene</kwd>
        <kwd>configuration file</kwd>
        <kwd>virtual control panel</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>Virtual environment systems are computer 3D models of real or virtual surroundings, which a user
can observe directly on a computer monitor, on a wall screen, or in stereo or virtual reality glasses. Besides
that, the user can move within these surroundings, interact with their objects and analyze the results of such
interaction. The environment can also change by itself or under the influence of external factors. All this leads
to the possibility of modeling in virtual environment systems.</p>
      <p>Several kinds of modeling can be used within virtual environment systems. In simulation modeling, a
computer model of a real or planned system or process is created, on which the behavior and parameters of
the system or process are investigated and future results are predicted. The visualization of all processes
and their visual observation are essential here. Scenario modeling lets you study the durability and
survivability of complex systems. Its main tasks are the prevention of emergency situations, risk
management, restoring system survivability, etc. The system structure and the interaction of its elements
during functioning are represented as a directed graph, the arcs and vertices of which are assigned
parameters and functions that adequately describe the processes of functioning of the elements of the
system under study. The visualization of such a system in a virtual environment system allows you to
visually observe the processes taking place. Simulation and training complexes are designed for the
professional training of operators by repeatedly performing the necessary actions. In these complexes,
virtual environment systems are used to simulate the dynamics and to visualize the environment. Expanding
the tasks of such complexes by creating different situations for operators leads to situational modeling.</p>
      <p>
        Situational modeling [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ] creates a special static or dynamic environment in the virtual environment
system, in which the operator must perform the assigned tasks. At the same time, he must take into account both the
objective properties of the created situation and his subjective ideas about how to act in such a situation.
What matters here is not so much training for a specific role as the ability to cope with difficult
situations. The instructor can offer both individual situations and several situations following one another. Such
modeling allows you to check not only the qualifications of the operator, but also his psychological
qualities: courage, risk-taking, perseverance, learning ability, emotionality, stress resistance, adequacy,
purposefulness, capacity for work, etc. The paper [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ] considers motivation in solving such a problem as a
state of situational cognitive and epistemic readiness to solve the problem in order to reduce the perceived
discrepancy between the expected and real states. In [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ], a classification of situations and the
corresponding behavior algorithms is proposed using the example of controlling a small turbojet engine. The
paper [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ] considers the work of an operator with a control panel in the virtual interior of the space module
“Pirs” during an astronaut’s spacewalk. A whole range of different tasks is presented
in [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ]: based on an analysis of astronauts’ preparation for extravehicular activity, classes of tasks are
identified that are physically difficult to model with technical means. These include tasks of emergency
situations, control of advanced facilities, information support for astronauts, etc. Some other tasks
are presented in [
        <xref ref-type="bibr" rid="ref6 ref7 ref8">6–8</xref>
        ].
      </p>
      <p>In this paper, the problem of situational modeling in virtual environment systems is considered on the
examples of two virtual scenes: a training ground for mobile robots and an astronaut rescue jetpack in outer
space. Methods for setting different situations in one 3D virtual scene using configuration files and virtual
control panels are proposed, as well as possible modeling tasks for each scene.</p>
    </sec>
    <sec id="sec-2">
      <title>2. Virtual Environment System</title>
      <p>The virtual environment system consists of a three-dimensional (3D) virtual scene, a control
subsystem, a dynamics calculation subsystem and a visualization subsystem (see Figure 1).</p>
      <sec id="sec-2-6">
        <title>Visualization subsystem</title>
        <p>A virtual 3D scene is created in a three-dimensional modeling system (for example, 3ds Max) and defines
the setting in which all actions will take place. Figure 2 shows the scene of the mobile robot training
ground, and Figure 4 shows the scene of an astronaut with a rescue jetpack in outer space. The selected
scene is initially loaded into the dynamics and visualization subsystems. The control subsystem includes
control panels for dynamic objects (for example, robots or jetpack engines), functional circuits and
software modules that calculate the control signals resulting from the operator’s actions on the control panel
elements. Both real and virtual control panels can be used.</p>
        <p>To create virtual control panels, a special interactive editor has been developed, containing a large set
of control element types (buttons, toggle switches, joysticks, regulators, etc.). Functional circuits are
created in another special editor, which includes a wide set of functional blocks of various types (arithmetic,
logical, trigonometric, dynamic, automatic control, signal generators, etc.). Each block has inputs and outputs that
can be connected in the editor with lines, thus producing a functional circuit. The control actions of the
operator, sensor readings from the virtual scene, the parameters of its setting, etc. are fed to the inputs of
the circuit.</p>
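        <p>The block-and-line mechanism described above can be sketched in a few lines. This is an illustration only: the class and method names are assumptions, not the actual editor's API.</p>

```python
# A minimal sketch of a functional-block circuit (illustrative names only).

class Block:
    """A functional block: computes named outputs from named inputs."""
    def __init__(self, fn):
        self.fn = fn            # e.g. an arithmetic or logical operation
        self.inputs = {}        # input name -> (source block, source output name)
        self.outputs = {}       # output name -> last computed value

    def connect(self, input_name, source, output_name):
        """Draw a 'line' from a source block's output to this block's input."""
        self.inputs[input_name] = (source, output_name)

    def evaluate(self):
        args = {name: src.outputs[out] for name, (src, out) in self.inputs.items()}
        self.outputs = self.fn(**args)

# Example circuit: joystick deflection -> gain -> clamped motor command.
joystick = Block(lambda: {"x": 0.4})                         # operator input
gain     = Block(lambda x: {"y": 2.5 * x})                   # arithmetic block
clamp    = Block(lambda y: {"cmd": max(-1.0, min(1.0, y))})  # limiter block

gain.connect("x", joystick, "x")
clamp.connect("y", gain, "y")

for b in (joystick, gain, clamp):   # evaluate in topological order
    b.evaluate()
print(clamp.outputs["cmd"])         # -> 1.0 (2.5 * 0.4, clamped to [-1, 1])
```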
        <p>The calculated signals are transmitted to the dynamics subsystem, which computes the new coordinates,
orientation angles and states of the controlled objects after a period Δt of simulation time. This takes into
account all dynamic parameters, collisions of objects, friction forces, gravity, etc. The results are
transmitted to the visualization subsystem, which synthesizes an image of the virtual scene with
the new parameters of the dynamic objects. Various types of lighting, special objects (fire, jets of water and
foam, relief, etc.), as well as environmental conditions (time of day, rain, snow, fog) are also
modeled. Since one cycle of operation of all subsystems takes no more than 40 milliseconds, the operator
gets the impression of a continuous controlled process in the virtual environment system.</p>
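        <p>One such control-dynamics cycle with a fixed time step Δt can be sketched as follows. The dynamics model here is an illustrative one-dimensional example with viscous friction, not VirSim's actual solver.</p>

```python
# Sketch of a fixed-timestep control/dynamics cycle (illustrative model only).

DT = 0.040  # one full cycle takes no more than 40 ms of simulation time

def step(state, control, dt=DT):
    """Advance a 1-D controlled object by one time step: the control force
    minus a simple viscous friction term is integrated into velocity, and
    velocity into the coordinate."""
    force = control - 0.5 * state["v"]       # control force minus friction
    state["v"] += force / state["m"] * dt    # new velocity
    state["x"] += state["v"] * dt            # new coordinate
    return state

state = {"x": 0.0, "v": 0.0, "m": 10.0}
for _ in range(25):              # 25 cycles = 1 second of simulated time
    state = step(state, control=20.0)
print(round(state["x"], 3))      # object has accelerated from rest, x ~ 1 m
```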
        <p>The Scientific Research Institute for System Analysis of the Russian Academy of Sciences has developed
an original virtual environment system, named “VirSim”, which includes all the subsystems described
above. Its main purpose is to serve as a simulator for control operators of complex dynamic systems.
Therefore, the work of all subsystems should be as realistic as possible, so as not to form so-called false
skills in operators. This system has been used to train operators of mobile and anthropomorphic robots, as
well as to train cosmonauts on the ISS. Figures 2 and 3 show examples of virtual scenes in this virtual
environment system.</p>
        <p>A scene in the virtual environment system can be considered a certain situation (situational
model), determined by the relative location of its objects and their properties.</p>
      </sec>
    </sec>
    <sec id="sec-3">
      <title>3. Situational modeling</title>
      <p>
        Situational modeling here means the creation of a certain situation (static or dynamic) in the virtual
environment system and the solution of a certain problem in it by an operator (or a group of operators).
The instructor sets the situation, assigns the task, and monitors and evaluates the operator’s
actions. The instructor can offer both individual situations and several consecutive situations. The purpose
of the simulation is not only to train the operator to perform known actions, but also to develop the ability to find the
right solution in difficult situations. Therefore, the operator must not only take into account the objective
properties of the given situation, but also use his abilities to act in a complex and unexpected situation;
examples are abnormal situations in the simulated system. At the end of the training, participants
usually analyze and reflect on their actions and their results. This allows them to gain
experience of such situations without endangering their life and health. The paper [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ] identifies classes of tasks
in the process of preparing astronauts for extravehicular activity that are physically difficult to simulate
using conventional technical means, but possible using virtual reality technologies. The first class
includes the tasks of information support for the astronaut, namely, the demonstration of the current
situation and the necessary actions of the astronaut in this situation. The second class includes modeling the
actions of an astronaut to perform a number of tasks and demonstrating the results of mistakes made (failure
to secure the safety tether, an incorrect trajectory for launching small satellites, missing an emergency
signal, etc.). The third class includes tasks for controlling promising means of moving around the spacecraft (rescue
jetpack, robotic system, mobile robots, etc.). A similar classification can be carried out for any field of
activity.
      </p>
    </sec>
    <sec id="sec-4">
      <title>4. Methods for setting the situation in the virtual environment system</title>
      <p>A static situation is one in which the objects and the surrounding environment do not change
their parameters during the simulation. The simplest way to model a static situation is to create a new
three-dimensional scene for the virtual environment system in a 3D modeling system such as 3ds Max.
However, in many tasks, the same scene can serve as the base for different situations. In this case, there
will be numerous specific modifications of it, which take up a large amount of memory and differ little from
each other; and if the base scene changes, the changes must be tracked in all its modifications.</p>
      <p>Another way to set a situation is with the help of a configuration file. The configuration file
is an XML file in which you can set the value of any parameter of any scene element. You can set
the position and orientation of an object, make it invisible or replace it with a damaged one,
turn a light source on or off, set the time of day, turn on rain or snow and set their intensity,
etc. Figure 4 shows an example of a base scene, and Figure 5 shows the same scene in which some
objects are made invisible using a configuration file.</p>
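      <p>A configuration file of this kind, and its application to a scene, might look as follows. The element and attribute names here are assumptions for illustration; the paper does not publish the actual schema.</p>

```python
# A hypothetical configuration file and a sketch of applying it to a scene.
import xml.etree.ElementTree as ET

CONFIG = """
<configuration>
  <object name="barrel_01" visible="false"/>
  <object name="robot_01" position="3.0 0.0 -2.5" orientation="0 90 0"/>
  <environment time_of_day="night" rain="on" rain_intensity="0.7"/>
</configuration>
"""

def apply_config(scene, xml_text):
    """Override the scene-element parameters listed in the configuration file."""
    root = ET.fromstring(xml_text)
    for obj in root.iter("object"):
        params = scene.setdefault(obj.get("name"), {})
        for key, value in obj.attrib.items():
            if key != "name":
                params[key] = value          # e.g. visibility, position
    env = root.find("environment")
    if env is not None:                      # weather, time of day, etc.
        scene.setdefault("environment", {}).update(env.attrib)
    return scene

scene = {"barrel_01": {"visible": "true"}}   # state from the base 3D scene
apply_config(scene, CONFIG)
print(scene["barrel_01"]["visible"])         # -> false
```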
      <p>Virtual control panels can be used to manage configuration files. Such panels are created using a
specially designed editor that allows you to place controls (buttons, toggle switches, selectors, etc.) on the
panel background. During simulation, the positions of the control elements are transmitted to a
functional scheme, which is built from functional blocks of various types (arithmetic, trigonometric, control
and others). In particular, it has time-tracking blocks and startup blocks for executing configuration files.</p>
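      <p>The combination of a time-tracking block and a startup block can be sketched as follows. This is a minimal illustration; the class name and the executor interface are assumptions, not the actual functional-scheme API.</p>

```python
# Sketch: a startup block fires a chosen configuration file once the
# simulation clock reaches the requested time (names are illustrative).

class StartupBlock:
    def __init__(self, config_file, at_time):
        self.config_file = config_file
        self.at_time = at_time      # simulation time to fire, in seconds
        self.fired = False

    def tick(self, sim_time, executor):
        """Called once per simulation cycle by the functional scheme."""
        if not self.fired and sim_time >= self.at_time:
            executor(self.config_file)   # launch the configuration file
            self.fired = True            # fire only once

executed = []
block = StartupBlock("situation_07.xml", at_time=5.0)
for t in [0.0, 2.5, 5.0, 7.5]:           # simulated clock ticks
    block.tick(t, executed.append)
print(executed)                          # -> ['situation_07.xml']
```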
      <p>Using this mechanism, you can change the situation in the virtual scene at a certain point in time.
Figure 6 shows an example of an instructor’s control panel, with which the instructor can set one of 12 situations and
choose weather conditions (rain or snow).</p>
    </sec>
    <sec id="sec-5">
      <title>5. Possible tasks in the situational modeling</title>
      <p>Let us consider possible tasks in situational modeling using the example of two scenes in the
virtual environment system: a mobile robot training ground and an astronaut’s rescue jetpack (SAFER). In
the first scene (see Figure 2), the operator uses a virtual or real panel to control the robot’s movement and
can grasp objects with a manipulator. In the second scene (see Figure 3), the operator uses a virtual or
real joystick to control the jetpack, moving the astronaut in space.</p>
      <p>At the initial stage, the operator must master the controls and the execution of simple basic operations
(studying the control panel, moving the controlled vehicle forward and backward, turning left and right,
grasping objects, etc.). More complex tasks consist in moving the robot (or the SAFER) to a given point.
The point can be set by coordinates (for this it is necessary to simulate a local positioning system and
display the current coordinates on the control panel) or by a description (for example, the entrance to the
hangar, the exit hatch of the International Space Station, etc.). This type of task also includes searching
for specified objects or inspecting the environment. When searching for objects, special virtual sensors
can be used. For example, a virtual gamma detector shows the level of gamma radiation of a radioactive
object falling into its cone of action. It can be used to search for contaminated radioactive objects for the
purpose of their further transportation to specialized containers. An object can also be specified by a
description (for example, a fire source, a fuel barrel, a solar battery, a certain space module, etc.). The tasks of
inspecting the surrounding environment may include shooting with a virtual camera (including 360-degree
panoramic video), searching for damage, checking the correctness of structural fasteners, etc.
The performance of all these operations can be limited by time and the presence of obstacles.</p>
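      <p>The cone-of-action test for such a virtual gamma detector can be sketched as follows. The geometry test and the inverse-square falloff are illustrative assumptions, not the paper's sensor model.</p>

```python
# Sketch of a virtual gamma detector with a cone of action.
import math

def reading(detector_pos, detector_dir, half_angle_deg, source_pos, activity):
    """Return a radiation level if the source lies inside the detector's
    cone of action, else 0. detector_dir must be a unit vector; the level
    falls off with the inverse square of the distance."""
    dx = [s - d for s, d in zip(source_pos, detector_pos)]
    dist = math.sqrt(sum(c * c for c in dx))
    if dist == 0.0:
        return float("inf")
    cos_angle = sum(a * b for a, b in zip(dx, detector_dir)) / dist
    if cos_angle < math.cos(math.radians(half_angle_deg)):
        return 0.0                       # source outside the cone of action
    return activity / (dist * dist)      # inverse-square falloff

# A source 2 m straight ahead of a detector looking along +x, 30 deg half-angle:
print(reading((0, 0, 0), (1, 0, 0), 30, (2, 0, 0), activity=100.0))  # -> 25.0
```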
      <p>Still more complex operations are extinguishing burning objects with a jet of water or
foam from a mobile robot’s water cannon, orientation when a sensor fails, “jamming” of the robot in a pit and
restoring its operability, etc. In the scene with the SAFER, you can work out an automatic return (in case of
fogging of the spacesuit visor or loss of visibility of the ISS), a return to the nearest handrail (in case of a lack of
fuel for the flight to the airlock), the failure of one engine, stabilization of the astronaut (in case of
tumbling), etc. With the underlying surface of the Earth in the scene, you can change the time of day and set
tasks for recognizing the place of flight and detecting fires, floods and other cataclysms. Figure 7 shows an
example of a robot extinguishing a fire, while the operator controls the process using a virtual control
panel in the lower left corner. Figure 8 shows a model of the underlying Earth’s surface from a height of 300
km, on which it is necessary to recognize the visible area of the terrain.</p>
    </sec>
    <sec id="sec-6">
      <title>6. Control in situational modeling</title>
      <p>An important task in situational modeling is control. The type of control is determined by the task
being solved and can be performed using a real or virtual panel (command mode), an exoskeleton, a tracking
system for the operator’s hands and head, by voice or gestures, in semi-automatic or fully automatic mode, etc.
The elements of a virtual panel are switched either with a computer mouse or with the fingers if the panel is
displayed on a touch screen. This type of control is not universal; in particular, it is inconvenient for controlling
manipulator models with many degrees of freedom (e.g. human or animal models). Control with
control panels was discussed above.</p>
      <p>To control the model of an anthropomorphic robot, you can use an exoskeleton, which is a frame worn
by the operator. The operator’s movements are digitized and transmitted to the dynamics subsystem to
simulate the operation of the motors in the robot’s joints. The disadvantages of this control method are its
limited accuracy and the inconvenience of wearing an exoskeleton.</p>
      <p>Tracking systems for the operator’s head and hands make it possible to obtain the coordinates and
orientations of parts of the operator’s body and use them so that a virtual model repeats his movements.
This control mode is called copying.</p>
      <p>Voice control (supervisory mode) involves the utterance of a command by the operator, followed by
its execution by the controlled object. To implement this technology, a command recognition module is
required. The action itself can be performed along a pre-recorded trajectory.</p>
      <p>Figure 9 shows, in the “VirSim” system, an android unscrewing a valve on a voice or
gesture command, and Figure 10 shows the automatic movement of a mobile robot to a specified point.</p>
      <p>In the supervisory control mode, the operator uses positions or movements (gestures) of his hands or
fingers to give commands; the robot recognizes the position (gesture) and executes a predefined command
corresponding to it. The disadvantages of this method are the small number of meaningful
gestures and recognition inaccuracies. The semi-automatic mode involves controlling the manipulator’s end
effector (the point at the end of the last link of the manipulator) using inverse kinematics. This means
that the operator controls the movement of the manipulator’s end effector (using one of the previous
methods), and all other links of the manipulator automatically adjust to this movement. An automatic mode
can also be specified, in which the operator indicates the end point and the robot moves to it
independently, bypassing obstacles. At the same time, the robot’s movement may be inefficient, take longer,
or the path may not be found at all. Further development may include intelligent control systems.</p>
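      <p>The inverse-kinematics idea behind the semi-automatic mode can be illustrated with a textbook two-link planar manipulator: given a target for the end effector, the joint angles are computed automatically. A real manipulator has more links; the function names here are illustrative.</p>

```python
# Two-link planar inverse kinematics (textbook "elbow-up/down" solution).
import math

def two_link_ik(x, y, l1, l2):
    """Joint angles (radians) that place the end effector at (x, y)."""
    d2 = x * x + y * y
    cos_q2 = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if abs(cos_q2) > 1:
        raise ValueError("target out of reach")
    q2 = math.acos(cos_q2)                               # elbow angle
    q1 = math.atan2(y, x) - math.atan2(l2 * math.sin(q2),
                                       l1 + l2 * math.cos(q2))
    return q1, q2

def forward(q1, q2, l1, l2):
    """Forward kinematics, used here only to check the solution."""
    return (l1 * math.cos(q1) + l2 * math.cos(q1 + q2),
            l1 * math.sin(q1) + l2 * math.sin(q1 + q2))

q1, q2 = two_link_ik(1.0, 1.0, l1=1.0, l2=1.0)
print([round(c, 6) for c in forward(q1, q2, 1.0, 1.0)])  # -> [1.0, 1.0]
```

<p>The operator only moves the target point; every intermediate joint configuration is produced by the solver, which is exactly the division of labor described above.</p>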
    </sec>
    <sec id="sec-7">
      <title>7. Conclusion</title>
      <p>The paper proposes methods for setting various situations, as well as possible tasks of situational
modeling in virtual environment systems, on the example of two virtual scenes: a training ground for
mobile robots and a jet backpack for rescuing an astronaut in outer space. Situational modeling was carried
out in the virtual environment system “VirSim”, developed at the Scientific Research Institute for System
Analysis of the Russian Academy of Sciences. The testing of the creation of various situations using
three-dimensional virtual scenes, configuration files and virtual control panels showed the adequacy of
the proposed methods to the tasks set.</p>
    </sec>
    <sec id="sec-8">
      <title>8. Acknowledgements</title>
      <p>The publication was prepared within the state task of the Federal State Institution “Scientific Research
Institute for System Analysis of the Russian Academy of Sciences” on “Carrying out basic scientific research
(47 GP)” on topic No. FNEF-2021-0012 “Virtual environment systems: technologies, methods and
algorithms of mathematical modeling and visualization. 0580-2021-0012” (Reg. No. 121031300061-2).</p>
    </sec>
    <sec id="sec-9">
      <title>9. References</title>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>D. A.</given-names>
            <surname>Pospelov</surname>
          </string-name>
          ,
          <article-title>Situatsionnoe upravlenie: teoriia i praktika</article-title>
          . M.: Nauka. Gl. red. fiz.-mat. lit.,
          <year>1986</year>
          . 288 s.
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>Jeong-Nam</given-names>
            <surname>Kim</surname>
          </string-name>
          ,
          <string-name>
            <given-names>James E.</given-names>
            <surname>Grunig</surname>
          </string-name>
          ,
          <article-title>Problem Solving and Communicative Action: A Situational Theory of Problem Solving</article-title>
          .
          <source>Journal of Communication</source>
          <volume>61</volume>
          (
          <year>2011</year>
          )
          <fpage>120</fpage>
          -
          <lpage>149</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>R.</given-names>
            <surname>Andoga</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Főző</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Madarász</surname>
          </string-name>
          ,
          <article-title>Digital Electronic Control of a Small Turbojet Engine MPM 20</article-title>
          .
          <source>Acta Polytechnica Hungarica</source>
          .
          <volume>4</volume>
          (
          <issue>4</issue>
          ) (
          <year>2007</year>
          )
          <fpage>83</fpage>
          -
          <lpage>95</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>A. V.</given-names>
            <surname>Maltsev</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. V.</given-names>
            <surname>Mikhaylyuk</surname>
          </string-name>
          ,
          <article-title>Virtual Environment System for Pirs Space Module Interior</article-title>
          .
          <source>CEUR Workshop Proceedings: Proc. of the 29th International Conference on Computer Graphics and Vision</source>
          <volume>2485</volume>
          (
          <year>2019</year>
          )
          <fpage>1</fpage>
          -
          <lpage>3</lpage>
          . URL: http://ceur-ws.org/Vol-2485
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>A. A.</given-names>
            <surname>Altunin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P. P.</given-names>
            <surname>Dolgov</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N. R.</given-names>
            <surname>Zhamaletdinov</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E. Yu.</given-names>
            <surname>Irodov</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V. S.</given-names>
            <surname>Korennoj</surname>
          </string-name>
          ,
          <article-title>Napravleniya primeneniya tekhnologij virtual'noj real'nosti pri podgotovke kosmonavtov k vnekorabel'noj deyatel'nosti</article-title>
          .
          <source>Pilotiruemye polety v kosmos</source>
          <volume>1</volume>
          (
          <issue>38</issue>
          ) (
          <year>2021</year>
          )
          <fpage>72</fpage>
          -
          <lpage>88</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>A. V.</given-names>
            <surname>Maltsev</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. V.</given-names>
            <surname>Mikhaylyuk</surname>
          </string-name>
          ,
          <article-title>Visualization and virtual environment technologies in the tasks of cosmonaut training</article-title>
          .
          <source>Scientific Visualization</source>
          .
          <volume>12</volume>
          (
          <issue>3</issue>
          ) (
          <year>2020</year>
          )
          <fpage>16</fpage>
          -
          <lpage>25</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>M. V.</given-names>
            <surname>Mikhailiuk</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A. V.</given-names>
            <surname>Maltsev</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P. Iu.</given-names>
            <surname>Timokhin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E. V.</given-names>
            <surname>Strashnov</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B. I.</given-names>
            <surname>Kriuchkov</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V. M.</given-names>
            <surname>Usov</surname>
          </string-name>
          ,
          <article-title>Sistemy virtualnogo okruzheniia dlia prototipirovaniia na modeliruiushchikh stendakh ispolzovaniia kosmicheskikh robotov v pilotiruemykh poletakh</article-title>
          .
          <source>Pilotiruemye polety v kosmos</source>
          .
          <volume>2</volume>
          (
          <issue>35</issue>
          ) (
          <year>2020</year>
          )
          <fpage>61</fpage>
          -
          <lpage>75</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>T.</given-names>
            <surname>Tomchinskaya</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Shaposhnikova</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Dudakov</surname>
          </string-name>
          ,
          <article-title>Training Beginners and Experienced Drivers using mobile-based Virtual and Augmented Reality</article-title>
          .
          <source>CEUR Workshop Proceedings: Proc. of the 30th International Conference on Computer Graphics and Vision</source>
          <volume>2744</volume>
          (
          <year>2020</year>
          ).
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>