Research on the Increase of Information Theory in the Era of
the Ending of Silicon Electronics and New Types of Risks
Volodymyr Hrytsyk1 and Mariia Nazarkevych1
1
    Lviv Polytechnic National University, 12, Stepan Bandera str, Lviv, 79013, Ukraine


              Abstract
              Today it is obvious that the era of silicon electronics is nearing completion, and nanotechnologies are
              being replaced by quantum technologies that operate on ions, atoms and elementary particles. This
              means that the rules for building computers will change: information theory will need to be adapted,
              because the laws that describe behavior in the quantum world will have to be taken into account.
              One of the most probable scenarios of the development of information theory is investigated. The
              paper focuses on understanding the "elementary" objects, operations, and simple systems of quantum
              information theory in order to study their properties. That is, for the coming generation of hardware it
              is necessary to develop conceptual models of operation for conditions in which the classical rules of
              physics and computer science no longer hold, because other laws of physics govern monoatomic
              processors. In particular, one of the conditions affecting a monoatomic carrier is the "Schrödinger's
              cat" condition: the storage cell is simultaneously in the states of zero and one. This condition requires
              the development of specific protocols for data processing, transmission and reproduction; that is, a
              significant correction of the laws of information theory is required. To this end, the authors review the
              field and formulate the problem to be solved. Any material object or process is a primary source of
              information; all its possible states constitute the source code of information.

              Keywords
              risks, quantum information, supersymmetry, muon




1. Introduction
The discoveries of the last decade have made it possible to implement fully automated adaptive
cybernetic systems that interact directly with humans. Computer vision works as an element of
perception of the external environment. People have begun to understand the basics of the
processes taking place in the brain and have already implemented implants (artificial arms, legs,
eyes, toys, etc.). The U.S. Department of State's automated identification system processes more than
seven hundred and five million photos a year. We live in the era of the next round of cybernetic
technologies - the era of the Industrial Revolution 4.0 [1]. The dynamics of the emergence of
new concepts, methods and corresponding opportunities (theoretical and applied developments) [2]
is rapidly carrying society toward a point where the laws that have served as a support for science
can no longer describe the behavior of cybernetic systems, and it is impossible today to reliably
predict the flow of information 15-20 years ahead. This will happen because the principles of
information dissemination may come to rest on fundamentally new models [3, 4] that we do not use
today. Moreover, although such an assumption may seem somewhat premature to people entering
the field of adaptive IT research, the worst-case scenario is that this is the only possible model of
development. In this paper, the authors explore one of the most likely scenarios for the development
of information theory.

CITRisk’2021: 2nd International Workshop on Computational & Information Technologies for Risk-Informed Systems, September
16–17, 2021, Kherson, Ukraine
EMAIL: volodymyr.v.hrytsyk@lpnu.ua (V.Hrytsyk); mariia.a.nazarkevych@lpnu.ua (M.Nazarkevych)
ORCID: 0000-0002-9696-5805 (V.Hrytsyk); 0000-0002-6528-9867 (M.Nazarkevych)
© 2021 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).
CEUR Workshop Proceedings (CEUR-WS.org)
    Today there is a clear structure of definitions of the concepts used in information technology:
information, message, signal, data and entropy are different concepts, each with a clearly defined
place in information theory. In information theory, information is treated as a quantitative measure
of the elimination of uncertainty, i.e. of the uncertainty that decreases as a result of receiving a
message.
    Let us introduce several definitions:
    Information is new knowledge that the consumer receives as a result of the perception of
certain messages.
    Knowledge is a set of data (from an individual, society or artificial intelligence system) about
the world, including data on the properties of objects, patterns of processes and phenomena, as
well as the rules of using this information for decision-making. In other words, knowledge is a
reflection of reality (objects, situations) in the form of concepts, relationships (subjective image
of objective reality).
    Subject knowledge is the ability to solve problems independently in the subject area, i.e. to
use the available data to achieve the goal.
    Data is a message recorded on a specific medium and presented in a form convenient for
reception, transmission and processing by an information-analytic system (a human or a device).
    A message is a sequence of signs, signals or physical processes that change over time, i.e.
those that have a material and energy basis.
    The theory of information is based on the method proposed by Claude Shannon for calculating
the amount of information that one random variable contains relative to another random variable.
The amount of information in a message is determined by how much uncertainty is reduced after
receiving the message. From this point of view, the amount of information contained in the
received message is the greater, the greater the uncertainty before the transmission of the
message.
    If the state of an object or system is known in advance, then the message about this state does
not carry any information for its recipient. If we receive new information about the
state/condition of the observed object, then this message carries new information that will
increase knowledge about the object. In this case, it is said that such a message contains some
information for its recipient.
    Thus, in the first approximation, information can be defined as new, previously unknown
knowledge about the state of an object or system, and the amount of information is the amount of
this knowledge. It is clear that if new knowledge increases the general level of knowledge about
the state of the observed object, the amount of information accumulates and has an additive
nature.
    As a result, before the source sends a message to its recipient, there is uncertainty about the
content of the message about the state of the object of observation. After selecting a message, the
source generates a certain amount of information, which reduces this uncertainty.
    The theory of information is based on the method proposed by Claude Shannon to measure
the amount of information contained in one random variable relative to another random variable
[5, 35]. This method makes it possible to express the amount of information numerically and
provides an opportunity to objectively assess the information contained in a message.
    Consider the basic calculations of Shannon's probabilistic approach to determining the
amount of information.
    Let a discrete source of information output a sequence of messages xi, each of which is
selected from the alphabet X={x1, x2, … xi …, xk}, where k is the capacity of the source alphabet.
Each elementary message for its recipient contains information as a set of information about the
state of an object or system.
    In order to abstract from the specific content of information, i.e. its semantic meaning, and to
obtain the most general definition of the amount of information, the quantitative measure of
information is determined without taking into account its semantic content, as well as value and
usefulness for the recipient.
    Before the connection takes place, there is uncertainty as to which of the possible messages
will be transmitted. The degree of uncertainty in the transmission of the message xi can be
characterized by its a priori probability pi. Therefore, the amount of information carried by one
elementary message, I(Xi), will be some function of pi: I(Xi) = f(pi). Let us find the form of this
function.
    Let the measure of the amount of information I(Xi) correspond to two properties:
   1) if the choice of the message xi by the source is known in advance (there is no uncertainty), i.e.
we have a certain event whose probability is pi = 1, then I(Xi) = f(1) = 0;
   2) if the source sequentially issues messages xi and xj, and the probability of such a choice pij
is the joint probability of the events xi and xj, then the amount of information in these two
elementary messages is equal to the sum of the amounts of information in each of them.
   The probability of the joint occurrence of two random events xi and xj is equal to the product of
the probability of one of these events and the probability of the other given that the first event
occurred; in other words, pij = pi ⋅ pj/i = P ⋅ Q.
   Then, from property 2 of the amount of information, it follows that
                            I(Xi, Xj)=I(Xi)+I(Xj)=f(P∙Q)=f(P)+f(Q).
    It follows that the function f(pi) must be logarithmic. Therefore, the amount of information is
related to the a priori probability by
                              I(Xi) = c ∙ log pi,
    here c – an arbitrary coefficient;
    pi – the probability of the i-th message;
    Xi – the random value of the i-th message;
    c and the base of the logarithm can be chosen arbitrarily.
    So that the amount of information is expressed by a non-negative number, we take c = −1, and
for convenience of calculation on a computer the base of the logarithm is chosen as 2; then we have:
                              I(Xi) = − log2 pi.
    That is, pi = 2^(−I(Xi)), and the average amount of information per message of the source is
                              I(X) = − Σ(i=1..k) pi log2 pi                                 (1)
   In this case, the unit of information is the bit. Thus, a bit is the amount of information in a
message of a discrete source whose alphabet consists of two alternative, a priori equally likely
events. If the number of a priori equally probable events is 2^8, then the byte is taken as the unit
of information.
   If the source of information gives a sequence of interdependent messages, then receiving each
of them changes the probability of occurrence of the following and, accordingly, the amount of
information in them. In this case, the amount of information is expressed through the conditional
probability of the source choosing the message xi, given that the messages xi−1, xi−2, … have
already been issued, i.e.
                              I(Xi / Xi−1, Xi−2, …) = − log2 p(xi / xi−1, xi−2, …)        (2)
   The amount of information I(X) is a random variable because the messages themselves are
random. The law of probability distribution I(X) is determined by the probability distribution
P(X) of the ensemble of source messages.
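The self-information I(Xi) = −log2 pi and its additivity for independent messages (property 2 above) can be checked with a short sketch in plain Python (standard library only; the probability values are arbitrary illustrative choices, not data from the paper):

```python
import math

def self_information(p: float) -> float:
    """Amount of information (in bits) of a message with a priori probability p."""
    return -math.log2(p)

# A certain event (p = 1) carries no information: I = f(1) = 0
assert self_information(1.0) == 0.0

# A fair binary alternative (p = 0.5) carries exactly 1 bit
assert self_information(0.5) == 1.0

# Additivity for independent messages: I(xi, xj) = I(xi) + I(xj)
p_i, p_j = 0.25, 0.5
joint = self_information(p_i * p_j)
assert abs(joint - (self_information(p_i) + self_information(p_j))) < 1e-12
print(joint)  # 3.0 bits
```

The logarithm is exactly what turns the product of probabilities into a sum of information amounts, which is the content of the derivation above.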


2. The concept of source entropy
The entropy of the source (specific amount of information) is the amount of information
contained in one elementary message xi. The entropy of the source is determined by the formula
introduced by Claude Shannon in 1948 [5, 35]:
                              H(X) = − Σ(i=1..k) pi log2 pi = Σ(i=1..k) pi log2 (1/pi),        (3)

   here H(X) – the entropy of the source;
   pi – the frequency with which the i-th letter occurs in the language in which the message is
written;
   k – the number of letters in the source alphabet.
   Here log2(1/pi) is interpreted as the amount of information Ii obtained when the i-th outcome
is realized. The entropy in Shannon's formula is the average over a set of values with different
information content, i.e. it is the mathematical expectation of the distribution of the quantities
I1, I2, …, Ik.
   The physical content of entropy is an average measure of the uncertainty of the knowledge of
the recipient of information about the state of the observed object.
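Formula (3) can be verified numerically with a minimal sketch (plain Python; the example distribution is an arbitrary illustration):

```python
import math

def shannon_entropy(probs):
    """H(X) = -sum(p_i * log2 p_i): average information per message, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Two equally likely messages: maximal uncertainty of exactly 1 bit
assert shannon_entropy([0.5, 0.5]) == 1.0

# A biased source is more predictable, so its entropy is lower
assert shannon_entropy([0.9, 0.1]) < 1.0

# Entropy is the expectation of the individual informations Ii = log2(1/pi)
probs = [0.5, 0.25, 0.25]
expectation = sum(p * math.log2(1 / p) for p in probs)
assert abs(shannon_entropy(probs) - expectation) < 1e-12
print(shannon_entropy(probs))  # 1.5 bits
```

The third check makes the point of the paragraph above explicit: H(X) is the mathematical expectation of the quantities Ii.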


        2.1. Formulation of the problem
Moore's Law is an empirical observation according to which the number of transistors on one
integrated-circuit die doubled every year for the first 15 years and, from then to this day, doubles
roughly every 1.5 years. Accordingly, while the first silicon chips were implemented with feature
sizes of tens of microns, today production processes run at 10 nm (Intel), 7 nm (Qualcomm
Snapdragon 865+, AMD) and 5 nm (Apple A14). IBM has demonstrated the "quantum mirage"
effect, in which a single atom can store one bit of information [19]. Moreover, one atom is the limit
beyond which quantum effects become a real reliability issue, since the behavior of such small
objects is described by the rules of quantum physics, for example electron tunneling. Nanoscale
conductors therefore no longer conduct electrons according to the laws of classical theory, i.e. as a
stream of charged particles in a solid. Recall that quantum mirages were discovered by Hari
Manoharan and Christopher Lutz.
    If Moore's Law continues to operate, in less than 20 years the size of the integrated circuit
will be at the level of atoms. That is, the laws of their operation will be determined by quantum
mechanics. Today, there are many inconsistencies in describing the behavior of nanoworld
objects in classical terms. For example, the very fact of observing an atom disturbs its motion,
while the absence of observation leads to a blurring of its velocity and trajectory (the absence of a
definite trajectory: the Heisenberg uncertainty relation), as if the atom were in several places at
once at a single moment in time.
Thus, quantum effects are perceived as an obstacle in the design of ultra-small computers.
Quantum computer science must figure out how to use fundamental quantum properties.
   Fundamental interactions appeared due to spontaneous symmetry breaking in the first
moments of the universe's existence. Modern cosmological theories consider the evolution of the
universe starting from the so-called Planck moment, 5.4 × 10^−44 s. After that, the single field that
possessed the greatest symmetry disintegrated, and gravity separated from it. To avoid a lengthy
description, the current state is summarized in Fig. 1.
   So, the formulation of the problem:
   Problem 1 - reducing the size of computational elements and integrated circuits has a natural
limit - the point when the laws of quantum information theory will replace the laws of classical
information theory, respectively, there is a need to explore the extension of information theory to
quantum information theory.
   Problem 2 - reducing the share of dissipated energy. Logically reversible operations are
operations that are not accompanied by energy dissipation (Landauer, 1961). Classical
computation, by contrast, requires a physical system with two stable states, such as flip-flops in
electronics, which represent information in binary form.


3. Research of quantum properties of information carriers
The idea of using quantum properties of information carriers to accelerate the process of
computation and development of the quantum computer is attributed to the American physicist
R. Feynman [7, 8], and its refinement to the English physicist D. Deutsch [9]. In 1993, E. Bernstein
and U. Vazirani proved the possibility of performing an arbitrary unitary operation in quantum
computation using a finite number of operations (the universality of the quantum Turing
machine) [10]. The American scientists P. Shor and L. Grover developed algorithms that make full
use of quantum interference and entanglement for significant (compared to classical)
computational acceleration [11, 12]. Methods of controlling nanoscopic particles by creating and
maintaining coherent quantum states have also been developed, leading to experimental
implementations of the qubit (the quantum information carrier). Additional studies of quantum
computing can be found in [13].
Figure 1: Basic forces of interaction and elementary particles

In practice, it can be said that commercial devices for creating quantum communication lines
(QPN Security Gateway by MagiQ Technologies (USA), cryptosystems Clavis and Cerberis by
ID Quantique (Switzerland)) have long been on the market. In the last decade, leading
laboratories have been fighting for quantum dominance at the level of commercialization of
quantum computers [34]. Thus, today, we see that the IT industry is preparing for the transition
to a new ideology of servicing information flows. Accordingly, Ukrainian participants must
prepare for the rules of the game of the future.
     Therefore, consider the known elements of quantum information transfer. By analogy with
the classical theory of information, we systematize the concept of quantum information theory.
     Definition. Quantum information is the information contained in the state of a quantum
system. It is the subject of study of quantum information theory.
     Definition. A quantum object is an object of the nanoworld (electrons, atoms, molecules, light
clusters) to which the rules of quantum theory apply, in particular a description in terms of states
that are statistical with respect to all measurement results.
     The information carrier is the state of the quantum system H, which is an information
resource because it is statistically uncertain.
     The state of a quantum object is a list (catalog) of possible measurement results performed on
it. In quantum mechanics, two states of objects are considered: pure and mixed.
     Pure states describe objects that are independent of all other quantum objects around them.
Mathematically, a pure state is given by a state vector (wave function), usually written in Dirac
notation as a ket vector |Ψ⟩. This vector belongs to a multidimensional complex Euclidean space
called the Hilbert space of the physical system. The dimension D of the Hilbert space specifies an
important parameter: the number of independent states (levels) of the quantum system. Using
mutually orthogonal basis vectors |i⟩ forming a complete set, Σ(i=1..D) |i⟩⟨i| = Î, we can
represent an arbitrary state vector as a linear superposition of the basis vectors:
                              |Ψ⟩ = Σ(i=1..D) ci |i⟩,   with Σ(i=1..D) |ci|² = 1                         (4)
    The choice of basis is determined by the procedure of the physical experiment and its integral
part, measurement. There are infinitely many orthogonal bases, and hence infinitely many
different representations of an arbitrary state vector.
    Mixed states, in contrast, describe objects that are statistically coupled to their surroundings;
they are described by a density matrix:
                              ρ = Σ(i) pi |Ψi⟩⟨Ψi|,   with Σ(i) pi = 1                         (5)
    The simplest example of a superposition is the state vector of a system that has two
orthogonal states |Ψ1⟩ and |Ψ2⟩ [16]. The state of such an object is described by the state vector
(wave function):
                              |Ψ⟩ = α|Ψ1⟩ + β|Ψ2⟩,                                 (6)
    here α and β – complex numbers, the amplitudes of the states, with
                              |α|² + |β|² = 1.                                 (7)
    During measurement, the coherent superposition (6) is destroyed and reduced to a new state
determined by the type of measurement. Thus, when one tries to find the system in the state |Ψ2⟩,
the perturbation introduced by the measuring instrument leads, at the moment of measurement,
to a reduction (projection):
                              |Ψ⟩ ⇒ |Ψ2⟩⟨Ψ2|Ψ⟩ ⇒ |Ψ2⟩,                                 (8)
    as a result of which the system passes after measurement into the state |Ψ2⟩, and the initial
state ceases to exist. If only the statistics of outcomes is retained, the post-measurement ensemble
is the mixed state
                              pmix = |α|² |Ψ1⟩⟨Ψ1| + |β|² |Ψ2⟩⟨Ψ2|                                 (9)
    This state is essentially classical, because in the mixed state (9) the system is either in the
state |Ψ1⟩ or in the state |Ψ2⟩, whereas in the superposition state (6) the system can be in two
states simultaneously.
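Equations (6)-(9) can be illustrated numerically with a minimal sketch in plain Python. The amplitude values are arbitrary illustrative choices, and the two orthogonal states are simply labeled 1 and 2:

```python
import math
import random

# Superposition |Psi> = a|Psi1> + b|Psi2> with |a|^2 + |b|^2 = 1 (eq. 6-7)
a, b = complex(math.sqrt(0.3)), complex(math.sqrt(0.7))
assert abs(abs(a) ** 2 + abs(b) ** 2 - 1.0) < 1e-12

def measure(alpha, beta, rng):
    """Projective measurement (eq. 8): the superposition collapses to one
    basis state; outcome probabilities are |alpha|^2 and |beta|^2."""
    return 1 if rng.random() < abs(alpha) ** 2 else 2

rng = random.Random(0)
outcomes = [measure(a, b, rng) for _ in range(100_000)]
freq1 = outcomes.count(1) / len(outcomes)

# The observed frequency approaches |a|^2 = 0.3, i.e. the weights of the
# classical mixture in eq. (9)
assert abs(freq1 - 0.3) < 0.01
print(round(freq1, 2))
```

The simulation only reproduces the measurement statistics; the interference effects that distinguish the superposition (6) from the mixture (9) are not visible in a single fixed measurement basis.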
    In his paper "The Present Situation in Quantum Mechanics" [14], Erwin Schrödinger analyzed
the features of quantum mechanical processes and formulated four main provisions that
characterize the state of a quantum object:
    The principle of superposition. The state of a quantum object is described by a linear
combination of basis states.
    The principle of interference. The measurement result depends on the relative phases of the
wave functions entering the superposition state.
    Entanglement of quantum states. Complete information about the state of the whole system
does not imply equally complete information about the states of its components.
    Non-cloning and uncertainty. An unknown quantum state cannot be copied or observed
without perturbing it; the state is described only statistically [15].
    Quantum entanglement is a quantum mechanical phenomenon in which the quantum states
of two or more objects are interdependent. This interdependence persists even if the entangled
particles are separated in space beyond the reach of any known interaction. Measuring a
parameter of one particle causes the instantaneous (faster than light) destruction of the entangled
state and determines the state of the second particle with one hundred percent probability. Such
behavior does not agree with the principle of locality, but it does not violate the theory of
relativity, because no information is transferred.
    A good example of entangled states are photon pairs in Bell states [17].
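The perfect correlation of measurement outcomes can be illustrated with a sketch of one Bell state, (|00⟩ + |11⟩)/√2, sampled in the computational basis (plain Python; the sampling is purely classical and only reproduces the outcome statistics, not the full quantum state):

```python
import random

# Bell state (|00> + |11>)/sqrt(2): amplitudes over the two-qubit basis
# |00>, |01>, |10>, |11>. Outcomes of measuring both qubits always agree.
amps = {"00": 2 ** -0.5, "01": 0.0, "10": 0.0, "11": 2 ** -0.5}
probs = {k: abs(v) ** 2 for k, v in amps.items()}
assert abs(sum(probs.values()) - 1.0) < 1e-12

rng = random.Random(1)
samples = rng.choices(list(probs), weights=list(probs.values()), k=10_000)

# Measuring qubit A fixes qubit B with certainty: the bits always agree,
# so only '00' and '11' ever appear
assert all(s[0] == s[1] for s in samples)
```

Note that this classical sampler cannot reproduce the violation of locality discussed above (Bell-inequality tests require measurements in several bases); it only shows the one-hundred-percent correlation in a single basis.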


        3.1. Quantum calculations
The changes that occur in a quantum state can be described in the language of quantum
computation. The fundamental model of quantum computing is the quantum circuit. By analogy
with a classical computer, which contains wires and logic elements, a quantum computer is built
of quantum circuits consisting of wires and elementary quantum gates that transmit and
manipulate quantum information.
    The evolution of an unmeasured quantum system translates the initial state of the quantum
system into another state while maintaining the norm. For Hilbert space, the transformation must
be reversible, i.e. unitary. Any transformation on a d-dimensional complex vector space can be
described by a matrix of size d × d. Let M* be the complex-conjugate transpose of the matrix M. A
matrix M is called unitary (one that describes a unitary transformation) if the condition M*M = I
is satisfied. An arbitrary unitary transformation of the quantum state space is an admissible
quantum transformation, and vice versa. Thus, the unitarity condition is the only restriction
imposed on quantum elements. A unitary transformation can be interpreted as a rotation in a
complex vector space.
    A quantum computer is a device that performs logical operations on quantum states by
unitary transformations (that is, ones that do not dissipate energy) without disturbing the
quantum superposition during the computational process. Schematically, the operation of a
quantum computer can be represented as a sequence of three operations:
   1) "record" - reading in the initial state;
   2) "calculation" - unitary transformations of the initial states;
   3) "output" of the result (measurement, projection of the final state).


        3.2.    Qubit
By analogy with a classical computer operating on a sequence of classical bits, a quantum
computer operates on a sequence of quantum bits. Thus, the fundamental unit of information in
quantum information systems is the quantum bit, abbreviated qubit. Any quantum system with
two stable states can be used to represent a qubit, and many different physical systems are used
to implement qubits. For example, in a model of an atom, an electron can exist either in the
ground state |0⟩ or in the excited state |1⟩. By irradiating the atom with light of a certain energy,
it is possible to induce transitions between these states.
   A significant difference between a qubit and a classical bit is that a qubit can be in an
indeterminate (non-statistical) state other than 0 or 1, in contrast to |0⟩ and |1⟩, which are
statistical states.
   A linear combination of states is a superposition:
                                      |Ψ⟩ = α|0⟩ + β|1⟩                                    (10)
   We now show the rule of classical bitwise addition (Table 1) and addition that takes quantum
entanglement into account (Table 2).
Table 1
Classic bitwise addition
               a                                         b                           a⊕b
               0                                         0                            0
               1                                         0                            1
               0                                         1                            1
               1                                         1                            0

Table 2
Addition taking into account the quantum effect
          a                        b                                        a’        b’ = (a⊕b)
          0                        0                                        0              0
          1                        0                                        1              1
          0                        1                                        0              1
          1                        1                                        1              0

In Table 2, the value of qubit a is preserved, while the value of qubit b changes according to the
XOR law. The target bit b changes its state only when the state of the control bit a is 1; the state of
the control bit itself does not change. This illustrates why, in the general case, classical data can be
cloned and quantum data cannot. In the general case, quantum data means a superposition of the
form
                                          |Ψ⟩ = α|0⟩ + β|1⟩,
   here α and β – complex numbers, the amplitudes of the states, with |α|² + |β|² = 1.
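Table 2 is the truth table of the controlled-NOT (CNOT) gate. As a sketch (plain Python, no external libraries), the same table can be recovered by applying the 4×4 CNOT matrix to the two-qubit computational basis states:

```python
# CNOT as a 4x4 unitary over the basis |00>, |01>, |10>, |11>
# (first bit = control a, second bit = target b, as in Table 2).
CNOT = [
    [1, 0, 0, 0],
    [0, 1, 0, 0],
    [0, 0, 0, 1],
    [0, 0, 1, 0],
]

def apply(gate, state):
    """Multiply a 4x4 gate by a 4-component state vector."""
    return [sum(gate[i][j] * state[j] for j in range(4)) for i in range(4)]

basis = {"00": [1, 0, 0, 0], "01": [0, 1, 0, 0],
         "10": [0, 0, 1, 0], "11": [0, 0, 0, 1]}

for label, vec in basis.items():
    out = apply(CNOT, vec)
    out_label = list(basis)[out.index(1)]
    a, b = int(label[0]), int(label[1])
    # Table 2: a is preserved, b becomes a XOR b
    assert out_label == f"{a}{a ^ b}"
print("CNOT reproduces Table 2")
```

On basis states CNOT acts exactly like the classical table; the quantum difference appears only when the control qubit is in a superposition, in which case the output is an entangled state rather than a single row of the table.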


          3.3.   Quantum entropy (von Neumann entropy)
Suppose that for a quantum mechanical system with density operator ρ̂ we have |pi⟩, the
eigenvectors of the density matrix corresponding to the eigenvalues pi (for simplicity taken to be
nondegenerate); then
                                             ρ̂ |pi⟩ = pi |pi⟩,
   here pi – the probability of finding the quantum system in the pure state ρ̂i = |pi⟩⟨pi|, with
0 ≤ pi ≤ 1.
   In the basis {|pi⟩} the density matrix takes a diagonal form. Therefore, in the basis of
eigenvectors, the quantum entropy (von Neumann entropy) can be defined, by analogy with
Shannon's classical information entropy H(X), through the eigenvalues:
                                           S = − Σ(i) pi ln pi,
   here S – the information-theoretic entropy.
   In quantum information theory, quantum relative entropy is a measure of the difference
between two quantum states (a quantum-mechanical analogue of relative entropy) [21].
   In contrast to classical information theory, quantum theory contains the concept of entropy of
entanglement.
   Entanglement entropy is a measure of the degree of quantum entanglement of two subsystems
that form two parts of a composite quantum system; it is the von Neumann entropy of the density
matrix of either subsystem. If it is not equal to 0, the subsystem is in a mixed state, which means
that the two subsystems are entangled.
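A minimal numerical sketch (plain Python; the eigenvalues are entered by hand rather than computed from a matrix, to stay library-free): the von Neumann entropy of a pure state is 0, while the reduced state of one qubit of a Bell pair has eigenvalues (1/2, 1/2) and hence maximal entropy, signalling entanglement:

```python
import math

def von_neumann_entropy(eigenvalues):
    """S = -sum(p_i * ln p_i) over the eigenvalues of the density matrix."""
    return -sum(p * math.log(p) for p in eigenvalues if p > 0)

# Pure state: the density matrix has eigenvalues (1, 0) -> zero entropy
assert von_neumann_entropy([1.0, 0.0]) == 0.0

# Reduced density matrix of one qubit of a Bell pair: eigenvalues (1/2, 1/2)
S = von_neumann_entropy([0.5, 0.5])
assert abs(S - math.log(2)) < 1e-12  # maximal entanglement entropy, ln 2
print(S)
```

A nonzero value of S for a subsystem of a globally pure state is exactly the signature of entanglement described in the paragraph above.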


        3.4.     Research results and their discussion
Quantum information theory. Its main task is to establish the conceptual capabilities and
limitations of receiving, transmitting and processing information in quantum media (systems that
operate according to the laws of quantum mechanics). It is believed that classical information
theory shows its shortcomings when applied to the real (quantum) world. Schematically, this can
be represented as:

        Quantum information ⊂ Classical information ⊂ Information technology
The concept of a communication channel and its maximum bandwidth is an important element of
quantum information theory, because the effect of quantum entanglement makes it possible,
through quantum teleportation, to transfer a state without physically sending the information
carrier.
   Obviously, the application of quantum information theory is necessary when information is
received, stored, transmitted not by bits, but in terms of quantum states.
    Determining the degree of entanglement for systems with more than two components, as well
as for noisy systems in so-called mixed states, is a separate problem of quantum information
theory. Fundamental results in this direction belong to the school of the Polish physicists
Horodecki [18]. It is believed that determining the principal restrictions on the entanglement that
can be extracted for information processing will make it possible to formulate the postulates of
quantum information theory by analogy with the postulates of thermodynamics. Currently, the
main problem of quantum information theory is to overcome the so-called quantum gap: the
transition from already implemented computers with 5-10 qubits to a machine that can create
superpositions and entangled states of 1000 qubits and perform (before the state decays) at least
10^9 operations [19].


4. Experiments and Results in quantum theory area
        4.1. Symmetry
The modern candidate "theory of everything" (superstring theory), which claims to unify all four
fundamental interactions, is based on the model of supersymmetry. Supersymmetry is a theory
that connects bosons and fermions in nature (one can say that a supersymmetry transformation
can turn matter into interaction (or radiation), and vice versa). Supersymmetry provides a union
with gravity (local supersymmetry is a theory of gravity); leads to the unification of the strong,
weak and electromagnetic interactions (Grand Unified Theory); solves the problem of hierarchies
(the simultaneous existence of large and small scales); and supplies candidates for the missing
dark matter in the universe. Supersymmetry gives superstring theory consistency and stability.
Supersymmetry "relieves us of the need to fine-tune the parameters of the standard model to
overcome a number of subtle problems in quantum theory ... at insignificantly short distances,
supersymmetry changes the intensity of three non-gravitational interactions so that they can
merge into one large combined interaction" [27].
   "Parts of the construction of string theory gradually fell into place" [28]. The new symmetry in
physics is called supersymmetry. It states that physical laws must remain unchanged under the
interchange of bosonic and fermionic particles. It is like a mirror image of nature, in which
fermions turn into bosons and bosons into fermions. The search for various manifestations of
supersymmetry in nature is one of the main tasks of numerous experiments on modern particle
accelerators. Symmetry is the basis of all the fundamental laws of physics: the law of
conservation of momentum is a consequence of the homogeneity of space; the law of
conservation of angular momentum is a consequence of the isotropy of space; the law of
conservation of energy is a consequence of the homogeneity of time; the law of uniform motion
of the center of mass is a consequence of the symmetry of space-time under boosts. This has
been actively explored in [29]. Systems were built using simulation based on a quantum
computer in [30]. Image protection was implemented in [31]. Software and hardware for a
mobile system were implemented in [32]. Neural networks have been used to encrypt and
decrypt data in [33].
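The link between symmetry and conservation laws can be checked numerically. The sketch below (an illustrative toy model, not from the cited works; the masses, spring constant, and step size are arbitrary choices) simulates two bodies coupled by an internal force; because the force law depends only on the separation (translation invariance) and obeys Newton's third law, the total momentum stays constant throughout the integration:

```python
import numpy as np

# Two bodies on a line, coupled by a linear internal force. All numbers
# are illustrative. Translation invariance of the force law implies
# conservation of total momentum (Noether's theorem in miniature).
m = np.array([1.0, 3.0])        # masses
x = np.array([0.0, 1.0])        # positions
v = np.array([0.5, -0.2])       # velocities
k = 2.0                         # coupling constant

p0 = float(np.sum(m * v))       # initial total momentum
dt = 1e-3
for _ in range(10_000):
    f = k * (x[1] - x[0])       # force on body 0; body 1 feels -f
    a = np.array([f, -f]) / m
    v = v + a * dt              # explicit Euler step
    x = x + v * dt

# Momentum drift stays at floating-point roundoff level even though
# positions and velocities change substantially.
drift = abs(float(np.sum(m * v)) - p0)
print(drift)
```

Energy, by contrast, is only approximately conserved here, because the explicit Euler step breaks the time-translation symmetry of the exact dynamics.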


        4.2. Study 2021
The Standard Model is the currently accepted theoretical construction that describes the
interactions of all elementary particles in the universe. It covers three of the four fundamental
interactions: the electromagnetic, strong, and weak; gravity remains outside the model.
    However, it is by no means a definitive theory. This is evidenced, for example, by the
presence of dark matter in the universe and the matter-antimatter asymmetry, which do not fit
into the Standard Model. Scientists are therefore approaching so-called New Physics.
    The Standard Model precisely predicts the so-called muon g-factor, which characterizes the
strength and speed of the particle's spin precession in a magnetic field. This factor is close to
the value 2, but the Brookhaven experiments revealed a deviation of several parts per million.
    Despite the minimal difference, the scientists claimed the existence of interactions between
the muon and the magnetic field previously unknown to science.
    On April 7, 2021, the Muon g-2 experiment [34] at the Fermi National Accelerator
Laboratory (Fermilab) near Chicago published the results of a high-precision measurement of
the anomalous magnetic moment of the muon.
    The current state of research [35] points to a possible revision of both unification theory and
the theory of everything. Models of the interaction forces can be tested in the near future in
accelerator-based experiments and possibly also at the precision frontier.




Figure 2: Correlation of the oscillation frequency with the Standard Model prediction (Fermilab)
The first result from the Muon g-2 experiment at Fermilab confirms the result from the
experiment performed at Brookhaven National Lab two decades ago. Together, the two results
show strong evidence that muons diverge from the Standard Model prediction. Image from [36].
   Thus, in the course of the Muon g-2 experiment in the laboratory at Batavia near Chicago,
evidence of a new, fifth force of nature was obtained, with a probability of a statistical
fluctuation of 1 in 40,000, i.e. a significance of 4.1 sigma [37-40]. However, for the result to be
formally recognized as a discovery, the probability of a fluctuation must not exceed 1 in
3.5 million (5 sigma).
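How such a "sigma" figure is obtained can be shown in a few lines: the gap between measurement and theory is divided by the combined uncertainty of both. The numbers below are illustrative placeholders of the right order of magnitude for the muon anomalous magnetic moment, not the official Fermilab/Brookhaven values:

```python
import math

# Illustrative values for the muon anomalous moment a_mu (dimensionless);
# these are placeholders, not the published experimental/theoretical numbers.
measured   = 116592061e-11   # experiment (illustrative)
predicted  = 116591810e-11   # Standard Model (illustrative)
err_exp    = 41e-11          # experimental uncertainty (illustrative)
err_theory = 43e-11          # theoretical uncertainty (illustrative)

# Uncertainties combine in quadrature; significance is the deviation
# expressed in units of the combined uncertainty.
combined_err = math.hypot(err_exp, err_theory)
sigma = (measured - predicted) / combined_err
print(round(sigma, 1))
```

With these inputs the deviation comes out at roughly four standard deviations, i.e. strong evidence but still short of the 5-sigma discovery threshold discussed above.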


5. Conclusions
Obviously, the era of silicon electronics is nearing completion: biotechnologies that operate
on DNA molecules and quantum technologies that operate on ions, atoms, and elementary
particles are replacing silicon technology.
    Quantum computing and quantum information are new tools that create a link between the
simple and the relatively complex: in the field of computation and algorithms there are
systematic tools for building and studying such systems. The application of ideas from these
areas has already led to new views on physics, so the authors contribute to the expansion of
general information theory by integrating into its foundation the rules of computation in
quantum systems. The authors systematize the concepts of different fields and introduce the
concept of a non-statistical state.
    To understand further development, it is necessary to formalize the current state of affairs.
Procedures for determining all the rules of interaction and constructing the corresponding
algebra are today called theories of everything.
    Definition. Fundamental forces are the different types of interaction that cannot be reduced
to one another. The fundamental forces described are the gravitational, electromagnetic, strong,
and weak interactions. It is conjectured (the theory of everything) that all four interactions are
particular cases of a single, still unknown, hypothetical interaction. The case for a theory of
everything is constantly supplemented by new phenomena that cannot be explained by the four
fundamental forces above. In particular, such quantum effects as quantum entanglement, and
indeterminacy in the absence of observation with a definite value taken only in the process of
observation, cannot be explained by the influence of any one of the four forces.


    References
[1] V.Hrytsyk, Future of Artificial Intelligence: treats and possibility, Proceedings of
    ITA’2017, INFOS section, Varna, June 26 – July 9, 2017, pp. 91-99
[2] V.Hrytsyk, A.Grondzal, A.Bilenkyj, Augmented reality for people with disabilities, In 2015
    Xth International Scientific and Technical Conference "Computer Sciences and Information
    Technologies"(CSIT), 2015, pp. 188-191
[3] V.Hrytsyk, M.Nazarkevych, Real-Time Sensing, Reasoning and Adaptation for Computer
    Vision Systems, In: Babichev S., Lytvynenko V. (eds) Lecture Notes in Computational
    Intelligence and Decision Making, ISDMCI 2021, Lecture Notes on Data Engineering and
    Communications Technologies, vol 77. Springer, Cham, 2022, pp.573-585,
    https://doi.org/10.1007/978-3-030-82014-5_39
[4] M.Kaku, Hyperspace: A scientific odyssey through parallel universes, time warps, and the
     tenth dimension; 2016, p. 384
[5] C.E.Shannon, Communication theory of secrecy systems, The Bell system technical journal,
     28(4), 1949, pp. 656-715
[6] V.V.Hrytsyk, Systematization of information technology tasks in the era of the fourth
     industrial revolution [in Ukrainian], Visnyk KhNTU, 3(66), 2018, pp. 265-270.
[7] R.P.Feynman, Quantum mechanical computers, In Conference on Lasers and Electro-Optics
     (p. TUAA2), Optical Society of America, 1984, June
[8] D.Deutsch, Quantum theory, the Church-Turing principle and the universal quantum
     computer, Proceedings of the Royal Society of London, Series A, Mathematical and
     Physical Sciences, 1985, Vol. 400 (1818)
[9] E.Bernstein, U.Vazirani, Quantum complexity theory, Proceedings of the 25’th Annual
     ACM Symposium on the Theory of Computing, New York, 1993
[10] P.W.Shor, Algorithms for quantum computation: Discrete logarithms and factoring,
     Proceedings of the 35’th Annual Symposium on Foundations of Computer Science, Los
     Alamitos, 1994
[11] C.H.Bennett, P.W.Shor, Quantum information theory. Transactions on Information Theory.
     1998, Vol. 44
[12] M.A.Nielsen, I.L.Chuang, Quantum computation and quantum information, Cambridge,
     2000
[13] E.Schrödinger, Collected Papers on Wave, Mechanics, 1982, pp.102-123
[14] C.H.Bennett, Quantum information and computation, Physics Today, 48(10), 1995, pp. 24-
     30
[15] E.Bernstein, U.Vazirani, Quantum complexity theory, Proceedings of the 25’th Annual
     ACM Symposium on the Theory of Computing, New York, 1993
[16] J.Kim, W.J.Jang, T.H.Bui, D.J.Choi, C.Wolf, F.Delgado, Y.Bae, Spin resonance amplitude
     and frequency of a single atom on a surface in a vector magnetic field, arXiv preprint arXiv,
     2103.09582, 2021
[17] O.A.Pastukh, V.V.Yatsyshyn, I.M.Maikiv, V.V.Hrytsyk, Criteria for evaluating distortions
     of quantum information transmitted over a noisy quantum communication channel [in
     Ukrainian], Visnyk Khmelnytskoho natsionalnoho universytetu, 6, 2013, pp. 127-129
[18] M.A.Nielsen, I.L.Chuang, Quantum computing and quantum information, 2000
[19] M.Keyl, Fundamentals of quantum information theory, Physics reports, 369(5), 2002,
     pp.431-548
[20] A.Wehrl, General properties of entropy, Reviews of Modern Physics, 50(2), 1978, p. 221
[21] M.Horodecki, P.Horodecki, R.Horodecki, Unified approach to quantum capacities: towards
     quantum noisy coding theorem, Physical review letters, 2000, 85(2), p. 433
[22] C.H.Bennett, P.W.Shor, Quantum information theory, IEEE transactions on information
     theory, 44(6), 1998, pp. 2724-2742
[23] T.Fukuyama, T.Kikuchi, Grand Unified Theories: Current Status and Future Prospects, An
     International Workshop, Grand Unified Theories: Current Status and Future Prospects,
     2008, p.1015
[24] B.Green, The elegant universe: Superstrings, hidden dimensions, and the quest for the
     ultimate theory, Vintage London, 1999
[25] D.J.Gross, The role of symmetry in fundamental physics, Proceedings of the National
     Academy of Sciences, 93(25), 1996, pp. 14256-14259
[26] R.Balkin, C.Delaunay, M.Geller, E.Kajomovitz, G.Perez, Y.Shpilman, Y.Soreq, A
     Custodial Symmetry for Muon g-2, arXiv preprint arXiv, 2104.08289, 2021
[27] B.Abi et al., Muon g-2 collaboration, Measurement of the Positive Muon Anomalous
     Magnetic Moment to 0.46 ppm, Phys. Rev. Lett., 126, 2021
[28] T.Albahri et al., Muon g-2 collaboration, Magnetic Field Measurement and Analysis for the
     Muon g-2 Experiment at Fermilab, Phys. Rev. A, 103, 2021
[29] M.Nazarkevych, M.Logoyda, O.Troyan, Y.Vozniy, Z.Shpak, The Ateb-Gabor Filter for
     Fingerprinting, In International Conference on Computer Science and Information
     Technology, Springer, Cham, 2019, pp. 247-255
[30] M.Nazarkevych, N.Lotoshynska, V.Brytkovskyi, S.Dmytruk, V.Dordiak, I.Pikh, Biometric
     Identification System with Ateb-Gabor Filtering, In 2019 XIth International Scientific and
     Practical Conference on Electronics and Information Technologies (ELIT), 2019, pp. 15-18
[31] M.Logoyda, M.Nazarkevych, Y.Voznyi, S.Dmytruk, O.Smotr, Identification of Biometric
     Images using Latent Elements, CEUR Workshop Proceedings, 2019
[32] I.Tsmots, V.Teslyuk, I.Vavruk, Hardware and software tools for motion control of mobile
     robotic system, In 2013 12th International Conference on the Experience of Designing and
     Application of CAD Systems in Microelectronics (CADSM), 2013, pp. 368-368
[33] I.Tsmots, Y.Tsymbal, V.Khavalko, O.Skorokhoda, T.Tesluyk, Neural-like means for data
     streams encryption and decryption in real time, In 2018 IEEE Second International
     Conference on Data Stream Mining & Processing (DSMP), 2018, pp. 438-443
[34] T.Albahri et al., Muon g-2 collaboration, Measurement of the anomalous precession
     frequency of the muon in the Fermilab Muon g-2 Experiment, Phys. Rev. D, 103, 2021
[35] B.Abi, T.Albahri, S.Al-Kilani, D.Allspach, L.P.Alonzi, A.Anastasi, A.Lusiani,
     Measurement of the positive muon anomalous magnetic moment to 0.46 ppm, Physical
     Review Letters, 126(14), 2021
[36] First results from Fermilab’s Muon g-2 experiment strengthen evidence of new physics,
     https://news.fnal.gov/2021/04/first-results-from-fermilabs-muon-g-2-experiment-
     strengthen-evidence-of-new-physics/
[37] M.McEwen, D.Kafri, Z.Chen, J.Atalaya, K.J.Satzinger, C.Quintana, R.Barends, Removing
     leakage-induced correlated errors in superconducting quantum error correction, Nature
     communications, 12(1), 2021, pp. 1-7
[38] A.I.Chanyshev, I.M.Abdulin, O.E.Belousova, Mining informational and analytical bulletin
     (scientific and technical journal), 2021
[39] K.Krippendorff, Mathematical theory of communication. Departmental Papers (ASC),
     2009, p.169
[40] E.C.Strinati, S.Barbarossa, 6G networks: Beyond Shannon towards semantic and goal-
     oriented communications, Computer Networks, 190, 2021