=Paper= {{Paper |id=Vol-3027/paper122 |storemode=property |title=Research of Immersion in Virtual Reality Technology for the Stage of Information Systems Design |pdfUrl=https://ceur-ws.org/Vol-3027/paper122.pdf |volume=Vol-3027 |authors=Nicolay Dudakov }} ==Research of Immersion in Virtual Reality Technology for the Stage of Information Systems Design== https://ceur-ws.org/Vol-3027/paper122.pdf
Research of Immersion in Virtual Reality Technology for the Stage of Information Systems Design
Nicolay Dudakov 1
1 Nizhny Novgorod State Technical University n.a. R.E. Alekseev, Minin Street, 24, Nizhny Novgorod, 630000, Russia

                Abstract
                Virtual reality (VR) is a comprehensive technology that immerses a person in a virtual
                world by means of specialized devices (virtual reality helmets). Virtual reality provides
                complete immersion in a computer environment that surrounds the user and responds to his
                actions in a natural way, allowing the manipulation of objects and programmable events in
                the virtual environment. Virtual reality constructs a new artificial world transmitted to a
                person through his senses: vision, hearing, touch and others. A person can interact with a
                three-dimensional computerized environment and manipulate objects or perform specific
                tasks, gaining user experience in the process. Virtual reality is an evolving technology for
                the transfer of user experience (User Experience: UX) from person to person. Currently, at
                the stage of information systems design, the use of induced virtual environment technology
                in dynamic interaction systems creates the need to analyze the effectiveness of visual
                perception and its transfer to the human neocortex. Visual perception is a fundamental
                scientific problem studied in many areas of science: neurophysiology, psychophysics,
                psychology, computer graphics and virtual environments, computer vision, and computer
                science theory.

                Keywords
                Virtual reality, VR, User Experience, UX, Immersion, Graphics, Virtual World Geometry
                Detailing, Texturing, Low-Polygonal Models, High-Polygonal Models

1. Evaluation of virtual reality immersion
    Obtaining the maximum positive experience at the lowest cost is one of the priority tasks when
calculating the life cycle of any information system. One of the indicators of the quality of virtual
reality, and consequently of visual perception, is presence - the effect of actually being in the virtual
environment. For better immersion, three main factors are currently taken into account:
        Technical characteristics of the device;
        Level of dynamic interaction;
        Quality of the visual accompaniment.
    On the technical side, the presence effect is achieved when the system renders a separate image for
each eye, matching the user's position in the virtual space to his position in the real world, with minimal
frame delay and accurate alignment of the virtual avatar with the real user. This relies on peripheral
devices (glasses, helmets) and low-level software that convert the results of processing digital machine
codes into a form convenient for human perception or suitable for driving the actuators of the controlled
object. Key characteristics for 2019: VR/AR headset resolution of 615 pixels per inch; a monofocal
visualization system; a linear level of environment; oculograph (eye tracker) measurement accuracy of
10 angular minutes at a sampling frequency of 1000 Hz, a delay of 1 ms, and a power consumption of
50 mW. Currently, the optimal frame rate varies from 60 to 120 fps; the average value for virtual reality
systems is 90 fps [3].
    A set of means and methods for transmitting information is also required. Key characteristics for
2019: an echo test of data transmission quality over a 50 Mbit/s channel shows 50-100 ms latency.

GraphiCon 2021: 31st International Conference on Computer Graphics and Vision, September 27-30, 2021, Nizhny Novgorod, Russia
EMAIL: DudakovNU@gmail.com
ORCID: 0000-0002-7844-6593 (N. Dudakov)
             ©️ 2021 Copyright for this paper by its authors.
             Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).
             CEUR Workshop Proceedings (CEUR-WS.org)
    Dynamic interaction with space is achieved in software and demands large and complex computing
power from the device in order to reproduce the physical properties of the space and the interactions
within it. It is implemented by programming the physics and scripting the user's actions. Visual support
of the user's actions means that the user's real-world perception must be replaced with virtual objects.
The criteria include:
        Visual recognition of image geometry;
        Realistic texturing of virtual reality objects;
        Physical properties.
    Modern research shows that, within virtual systems, objects used in the simulated reality should be
similar to, or intuitively perceived by the user as similar to, objects in the real world. To optimize
system performance, low-poly models are usually used - low-polygon three-dimensional models with
just enough geometry for the type of object to be recognized [4]. The user experience within the
simulation must match real experience. For example, a button in virtual space and in reality must have
similar external and physical (logical) characteristics and properties: the button is pressed or the lever
is shifted, after which a logical action follows, such as a door opening or a property of an object
changing.
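The button example can be sketched as a scripted interaction of the kind described, where a physical action triggers a logical one (all names here are illustrative, not from any particular engine):

```python
class VirtualButton:
    """A virtual object whose physical action (pressing) triggers a logical one."""

    def __init__(self, on_activate):
        self.pressed = False
        self.on_activate = on_activate  # the logical action, e.g. opening a door

    def press(self):
        # Mirror the real-world behaviour: pressing activates the action once.
        if not self.pressed:
            self.pressed = True
            self.on_activate()

door = {"open": False}
button = VirtualButton(lambda: door.update(open=True))
button.press()  # the physical press is followed by the logical action
```

Matching the physical gesture to a familiar logical consequence is what keeps the virtual experience consistent with the real one.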
    The environment in the virtual space is one of the important factors, since it lets the user judge the
depth of what is happening. Virtual reality technology is based on stereoscopic vision and the rendering
of a separate picture for each eye [5]. Having determined the distance to an object, the user can walk
around virtual obstacles and interact with objects.
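Stereo rendering of this kind amounts to offsetting one camera per eye by half the interpupillary distance; a minimal sketch (the 64 mm IPD is a common average value, not a figure from this paper):

```python
def eye_positions(head_pos, ipd=0.064):
    """Return left- and right-eye camera positions, offsetting each eye by
    half the interpupillary distance (metres) along the x axis."""
    x, y, z = head_pos
    half = ipd / 2.0
    return (x - half, y, z), (x + half, y, z)

left_eye, right_eye = eye_positions((0.0, 1.70, 0.0))
# Each eye is rendered from its own viewpoint, which is what
# produces the perception of depth.
```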
    Reproducing the physical properties of objects, such as weight, tactile sensations and smell, remains
an open task in virtual reality. One of the leading companies in the field of recreating virtual interaction
with objects is the Belarusian company that created the Teslasuit. It is equipped with a motion capture
system, climate control, biometric sensors and electrical stimulation, all of which allow the user to feel
virtual reality [6]. So far, however, it is a prototype and is not available for mass use.
    Each system selects its own balance between the visual component and the dynamic interaction.
This factor is very important in the design of virtual systems, since the playback platform and the target
audience are chosen at the design stage. An error at this stage can affect the quality and labor intensity
of creating virtual reality systems. For example, in 360-degree video the visual component is rich, while
the dynamic part is reduced to moving between scenes or changing scenery [7]. It is important for the
user to see a realistic picture, to examine the space that the director shows, and to get aesthetic
enjoyment.

2. User Immersion Research
   In systems with dynamic interaction, the main task is to gain user experience: developing new skills,
fulfilling tasks, achieving a goal. In these cases, the visual component fades into the background. The
user's brain cannot assess the detail of the surrounding world while focused on the task, and the
geometry of objects is perceived at the associative level. To study this perception, various dynamic
simulators were analyzed with different levels of detail and texturing of the surrounding world under
the same external conditions:
   Experiment space: a 3x3 meter space with a flat floor covering. An instructor is present in the room
to monitor the user.
         Virtual reality helmet: HTC Vive.
         PC: GPU: Nvidia GeForce GTX 970; CPU: Intel Core i5-4590; 8 GB RAM.
         Task: before immersion in the virtual space, the user is given the task of reaching the end of
   the real room. The instructor helps the user put on the virtual reality headset. The user completes the
   task within 2 minutes.
         Simulation: the user sees a narrow virtual platform (a board), 50x300 cm, under his feet, within
   a virtual city environment with different levels of detail and texturing according to the conditions of
   the experiment. The instructor announces the beginning of the experiment and asks the user to step
   off the virtual platform.
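The height measurements analyzed below can be obtained by logging headset position over the course of the task; a sketch of the bookkeeping (the trace values are invented for illustration; real data would come from the headset's tracking API):

```python
def average_height_change(samples):
    """Average change in headset height relative to the first sample (metres)."""
    start = samples[0]
    return sum(h - start for h in samples) / len(samples)

# Illustrative trace of a user lowering their center of gravity during the task.
trace = [1.70, 1.68, 1.62, 1.55, 1.55]
avg_change = average_height_change(trace)  # negative: the user crouched
```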
    When immersed in a virtual environment, the part of the brain responsible for emotions and instincts
is active, while the consciousness remains fully aware of the real state of affairs. This is due to instincts
formed during human evolution and stored in the neocortex. All available experience of the outside
world is built from the input data the user receives from the eyes, the organs of hearing and the other
sense organs. The neocortex generates motor commands and causes groups of neurons to work
unconsciously [8]. This effect makes it possible to train a person's ability to perform specific actions in
critical situations and to develop user experience (UX), adapting their instincts in favor of rational
thinking, without exposing themselves to real danger.
    This effect is achieved by placing the virtual avatar on the edge of an abyss. Regardless of the detail
and quality of the graphics, users instinctively shift their center of gravity downward and seek to leave
the danger zone or assume a stable position. This shows that the user, while safe in the real world,
perceives virtual reality as a real threat. Simulators with dynamic height changes and different levels of
detail and texturing were analyzed, such as Fancy Skiing VR, Beat Saber, Richie's Plank Experience
and Summer Funland (figure 1). In all tests of the simulators, the user knew that in reality he was not
in danger: an instructor was in the room to protect the user against falling, and the user could remove
the headset at any time. Nevertheless, at the physiological level, users tried to leave the discomfort zone
and take a safer position.




 Figure 1: Examples of VR systems: Fancy Skiing VR, Beat Saber, Richie's Plank Experience,
 Summer Funland.


3. Simulations of different levels of scene detail
    Three virtual scenes were created for the completeness of the study:
        Maximum environment: a high-polygon model of 1,000,000 polygons, textured environment
    objects, one light source (figure 2).
        Medium environment: a medium-detail model of 10,000 polygons, without additional textures,
    objects painted gray, one light source.
        Minimum environment: a low-polygon platform model of 14 polygons, objects painted gray,
    one light source.
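The three conditions can be restated as a small configuration table (a sketch; the parameters are exactly those listed above):

```python
SCENES = {
    "maximum": {"polygons": 1_000_000, "textured": True,  "light_sources": 1},
    "medium":  {"polygons": 10_000,    "textured": False, "light_sources": 1},
    "minimum": {"polygons": 14,        "textured": False, "light_sources": 1},
}

# The geometry budget spans roughly five orders of magnitude between the
# extreme conditions, while lighting is held constant.
ratio = SCENES["maximum"]["polygons"] // SCENES["minimum"]["polygons"]
```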
    The user avatar was placed on a virtual platform. In reality, before the headset was put on and the
user immersed in the virtual space, a task was given: to take several steps into the abyss. The user saw
that he was in the room with a solid surface in front of him.
    For 60 seconds, the user could look around the virtual space and satisfy himself that he was in no
danger of falling from a height. After 60 seconds of simulation, the user had to begin the task. 84% of
the test group felt a physiological need to lower their center of gravity in order to adopt a more
comfortable position; 67% of users refused to take a step into the abyss (figure 3).



Maximum environment:

 Figure 2: Maximum environment in a virtual scene, A - first person view, B - general view




 Figure 3: Maximum environment in a virtual scene, red line - average height change over time

Medium environment: a medium-detail model of 10,000 polygons, without additional textures, objects
painted gray, one light source (figure 4).





 Figure 4: Medium Environment in a virtual scene, A - first person view, B - general view

 After 60 seconds of simulation, the user had to begin the task. 79% of the test group felt a
 physiological need to lower their center of gravity in order to adopt a more comfortable position;
 71% of users refused to take a step into the abyss (figure 5).
 Figure 5: Medium environment in a virtual scene, red line - average height change over time
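The percentages reported for the maximum and medium conditions can be compared directly (a sketch using only the figures stated in the text):

```python
# Shares of the test group per condition, as reported above.
results = {
    "maximum": {"lowered_center_of_gravity": 0.84, "refused_step": 0.67},
    "medium":  {"lowered_center_of_gravity": 0.79, "refused_step": 0.71},
}

# Reducing detail and removing textures barely changed the behaviour:
# the refusal rate even rose slightly in the medium condition.
refusal_gap = results["medium"]["refused_step"] - results["maximum"]["refused_step"]
```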

Minimum environment: a low-polygon platform model of 14 polygons, objects painted gray, one light
source (figures 6-7).





 Figure 6: Minimum environment in a virtual scene, A - first person view, B - general view




 Figure 7: Minimum environment in a virtual scene, red line - average height change over time


4. Conclusion
   From the experiment it can be concluded that, in order to gain user experience in virtual reality
systems, the influence of the visual accompaniment can be minimized, since the user perceives what is
happening in the headset as reality.
   The steps of immersion in the virtual environment are: login, introduction to the virtual environment,
and completing the task. Green line - maximum environment experiment; yellow line - medium
environment experiment; blue line - minimum environment experiment (figure 8).
 Figure 8: Red line - the average change in height over time across all experiments

   This experiment shows that users perceive virtual space much as they do a real environment. In
specific cases, the danger of falling in virtual space triggers instinctive mechanisms that return the user
to the comfort zone. Therefore, for gaining user experience within simulations of real space, graphical
accompaniment in the form of realistic object geometry and high-quality textures is a secondary factor
in immersion into virtual reality.

5. References
[1] Roadmap for the development of “end-to-end” digital technology “virtual and augmented reality
    technology”, 2021. URL: https://tadviser.com/index.php/Article:End-to-end_digital_economy_technologies
[2] V.P. Aleshin, Technology of Virtual 3D Environment in Inverse Problems for Analysis of Visual
    Perception and Image Interpretation, in: Proceedings of the 26th International Conference on
    Computer Graphics and Vision GraphiCon'2016, Autonomous Non-Profit Organization “IFTI”,
    2016, pp. 9-13. (in Russian)
[3] V.N. Voloshina, S.E. Putilova, I.A. Shcherbinina, T.A. Yunaeva, Virtual reality as the
    technological basis for laboratory work in marine engineering education: characteristics and
    principle of work, Transport business of Russia 2 (2020) 117-119. (in Russian)
[4] A.I. Malysheva, T.N. Tomchinskaya, Features of low-polygonal modeling and texturing in mobile
    applications, in: Proceedings of the 29th All-Russian Scientific and Practical Conference on
    Graphic Information Technologies and Systems KOGRAF-2019, 2019, pp. 51-54. (in Russian)
[5] A.I. Vinokur, N.V. Kondratyev, Yu.N. Ovechkis, Study of stereoscopic characteristics of virtual
    reality helmets, Scientific visualization 12(1) (2020) 61-69. doi: 10.26583/sv.12.1.05
[6] V.A. Donbass, V.V. Ponomarev, S.V. Gubkin, V.E. Mitskevich, A.N. Osipov, New possibilities
    of quantitative evaluation of the qualitative structure of adaptive kinematics, Medical journal 4
    (2020) 69-77. URL: http://rep.bsmu.by/handle/BSMU/29366 (in Russian)
[7] D.A. Dmitriev, A.D. Filinskikh, Creating Virtual Excursions, in: the collection: Proceedings of the
    28th All-Russian Scientific and Practical Conference on Graphic Information Technologies and
    Systems KOGRAF-2018, 2018, pp. 32-38. (in Russian)
[8] D.V. Popov, Consciousness as an interface in the communication structure, Manuscript 12(2)
    (2019) 54-60. doi: 10.30853/manuscript.2019.2.10 (in Russian)