<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Using Entropy Function for Definition States of Information System</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Vladimir A. Smagin</string-name>
          <email>va_smagin@mail.ru</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Anatoly D. Khomonenko</string-name>
          <email>khomon@mail.ru</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Department of Information Systems and</institution>
          ,
          <addr-line>Technologies, Emperor Alexander I St.</addr-line>
          ,
          <institution>Petersburg State Transport University</institution>
          ,
          <addr-line>St. Petersburg</addr-line>
          ,
          <country country="RU">Russia</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Copyright © by the papers' authors. Copying permitted for private and academic purposes. In: B. V. Sokolov, A. D. Khomonenko, A. A. Bliudov (eds.): Selected Papers of the Workshop Computer Science and Engineering in the framework of the 5th International Scientific-Methodical Conference "Problems of Mathematical and Natural-Scientific Training in Engineering Education"</institution>
          ,
          <addr-line>St. Petersburg, Russia, 8-9 November 2018, published at http://ceur-ws.org</addr-line>
        </aff>
      </contrib-group>
      <fpage>53</fpage>
      <lpage>60</lpage>
      <abstract>
        <p>The concept of a state in L. Zadeh's theory of systems is studied in detail. This concept is developed with reference to information systems. It is proposed to define the state of an information system, and to estimate it quantitatively together with the system output, by means of the entropy distribution function. A transition from L. Zadeh's theory to the use of the Kolmogorov–Chapman equation is proposed. Construction of the distribution functions of entropy is recommended as the initial data.</p>
      </abstract>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1 Introduction</title>
      <p>The concept of state is often used in science and technology. It is defined most simply in the theory of the operation of systems: it is merely the set of values of the parameters of the elements of the system. In systems theory, however, the concept of state is treated more precisely and strictly, depending on the type of system. Researchers especially associate this concept with dynamic systems, more precisely with continuous and discrete systems.</p>
      <p>Currently, there is particular interest in the study of information systems. For example, in [Mar14] the value of entropy is considered in the study of the state of information systems, and the essence of informational entropy is analyzed. However, the concept of state in information systems is not formally defined. This article solves two problems: first, to connect the concept of state with the classical results of systems theory and, second, to connect this concept with the achievements of modern information theory. The verbal formulation of the concept of the state of the information system follows from the results of the article.
A dynamical system, according to [Nem49], is a group of transformations {Ri}, defined on a separable metric space R and having the following properties:
1. Ri is defined for all t on -∞ &lt; t &lt; ∞.</p>
      <p>2. The function q = f(p, t), where q is the image of a point p from R in accordance with Ri, has the group property:</p>
      <p>f(p, t0 + t) = f(f(p, t0), t). (1)</p>
      <p>3. The group Ri is continuous in the sense that for all t0 and p0, and all sequences {tn} and {pn} converging to t0 and p0, the relation holds:
lim (n→∞) f(pn, tn) = f(p0, t0). (2)</p>
      <p>The element p of R is the state of the dynamic system, and q = f(p, t) describes the state of the system at the moment t, provided that at the moment t = 0 the system was in the state p.</p>
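      <p>The group property in item 2 can be checked on a minimal concrete flow. The sketch below is our illustration, not part of the original paper: it takes the scalar system dx/dt = x, whose flow is f(p, t) = p·e^t, and verifies f(p, t0 + t) = f(f(p, t0), t) numerically.</p>

```python
import math

def f(p: float, t: float) -> float:
    """Flow of the scalar system dx/dt = x: the state p moved forward by time t."""
    return p * math.exp(t)

# Group property: advancing by t0 + t equals advancing by t0, then by t.
p, t0, t = 2.0, 0.3, 0.7
lhs = f(p, t0 + t)
rhs = f(f(p, t0), t)
print(math.isclose(lhs, rhs))  # True
```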
      <p>This definition was formulated on the basis of the analysis of problems of celestial mechanics and of the dynamics of a solid body. Therefore, the system inputs and outputs are not explicitly highlighted in it. The definition requires a slight change.
3 Formal Analogue of the State from Information Theory
An information system is a group of transformations {Hi} defined on the probabilistic space H and possessing the following properties:
1. The transformations Hi are defined for all t on [0, ∞).</p>
      <p>2. The function g = f(h, t), where g is the image of a point h from H according to Hi, has the group property
f(h, t0 + t) = f(f(h, t0), t). (3)
3. The group Hi is continuous in the sense that for all t0 and h0 and all sequences {tn} and {hn} converging to t0 and h0, the relation holds:
lim (n→∞) f(hn, tn) = f(h0, t0). (4)</p>
      <p>The element h of H is the state of the information system, and g = f(h, t) describes the state of the information system at the time t, provided that at the time t = 0 the system was in the state h. This definition needs to be specified and clarified.</p>
      <p>As the function f(t), in our opinion, we can take the density or the entropy distribution function supplied to the input of the system. For example, we consider the differential entropy function for the normal distribution with probability density f(t) = dnorm(t, m, σ), m = 100 units, σ = 20 units. It has the form:
h1(t) = -∫_0^t f(z) ln(f(z)) dz. (5)</p>
      <p>Expression (5) is the first initial moment of the random entropy. The second initial moment of entropy is:
h2(t) = ∫_0^t f(z)(ln(f(z)))^2 dz. (6)</p>
      <p>Similarly, we can find the higher initial moments of entropy. In practical applications, it is enough to restrict ourselves to the first two moments. Figure 1 shows a graph of the function hi(t).</p>
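      <p>The moments (5) and (6) can be reproduced numerically. The sketch below is our illustration under the paper's data m = 100, σ = 20, taking dnorm as the normal density from SciPy; it evaluates h1 and h2 at the upper limit t = 800 and recovers the values cited in the next paragraph.</p>

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

m, sigma = 100.0, 20.0
f = lambda z: norm.pdf(z, m, sigma)

# First and second initial moments of the random entropy -ln f(Z), eqs. (5)-(6).
h1 = -quad(lambda z: f(z) * np.log(f(z)), 0, 800, points=[m])[0]
h2 = quad(lambda z: f(z) * np.log(f(z)) ** 2, 0, 800, points=[m])[0]

sd = np.sqrt(h2 - h1 ** 2)   # standard deviation of the random entropy
eta = sd / h1                # coefficient of variation
print(round(h1, 3), round(h2, 3), round(sd, 3), round(eta, 2))
# ≈ 4.415 19.989 0.707 0.16
```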
      <p>For our example, the minimum entropy value is 0 nat and the maximum is 4.415 nat. The median value is h1(100) = 2.207 nat. For the first case, the initial moments are: v1(800) = 4.415 nat, v2(800) = 19.989 nat², the standard deviation δ(800) = 0.707 nat, and the coefficient of variation η(800) = 0.16. For the second case, the corresponding values are: v1(100) = 2.207 nat, v2(100) = 9.905 nat², δ(100) = 2.263 nat, η(100) = 1.0125. For the given data, the probability densities of the random entropy values are determined by the relations (7) for g1(x) and g2(x).
According to [Zad64, Zad63], an oriented abstract object (OAO) is understood as a certain system associated with some input signal (cause) u and output signal (consequence) y. Both signals are understood as vector functions of time. The relationship between them is not straightforward: a specific input function u can correspond to several output functions y and, conversely, a specific output signal can correspond to several input functions.</p>
      <p>To formalize the OAO, the segment of the function u defined on the observation interval [t0, t1] is denoted u[t0, t1] for the closed interval or u(t0, t1] for the semi-open interval; depending on the context, simply u. As a result of an experimental study, a set of input–output pairs (u(t0, t1], y(t0, t1]) is usually obtained.</p>
      <p>If the same signal is applied to the input of another sample of the test device, the output signal does not have to be the same as in the first case, since the initial conditions for the second sample may be different. Therefore, this definition [Nem49] reflects the fact that more than one y(t0, t1] can correspond to a given u(t0, t1].</p>
      <p>The set of ordered pairs of time functions on the specified interval is denoted as</p>
      <p>R(t0, t1] = {u(t0, t1], y(t0, t1]}. (8)</p>
      <p>Based on this concept, the following definition was proposed in [Zad64]. An OAO A is a family R(t0, t1] = {u(t0, t1], y(t0, t1]}, t0, t1 ∈ (0, ∞), of sets of ordered pairs (u, y) of time functions. Here the first element in (8) is called the segment of the input signal, or simply the input signal, and the second the segment of the output signal, or simply the output signal. Thus, the OAO is identified with a set of input–output pairs that belong to A. In addition, any segment of a pair for which t0 ≤ τ0 ≤ t1, τ0 ≤ τ1 ≤ t1 must belong to A.</p>
      <p>The set of all segments u on the interval (t0, t1], such that (u, y) ∈ A, is called the space of input signals of A and is denoted R[u]. Similarly, the set of all segments y such that (u, y) ∈ A is called the output signal space and is denoted R[y]. It follows that the set R(t0, t1] of all pairs (u(t0, t1], y(t0, t1]) ∈ A is some subset of the product R[u]×R[y]. In the "list" of ordered pairs (u, y), each fixed u corresponds, generally speaking, to a set of different y and, conversely, each fixed y to a set of different u.</p>
      <p>From a mathematical point of view, this essentially boils down to defining the system as a relation rather than, as usual, as some function or operator. The difference can be explained by the example of the integrator. The values of the input and output signals at the same time t are related to each other by the differential equation
dy(t)/dt = u(t). (9)</p>
      <p>The statement that the integrator is an OAO can be described by a set of ordered pairs of functions of time of the form
(u(t), α + ∫_t0^t u(ξ)dξ), t0 ≤ t ≤ t1 ∈ (0, ∞),
where the parameter α belongs to the space of real numbers, and the function u to the class of time functions integrable on any finite interval. In this case, each fixed value u(t0, t1] corresponds to a set y(t0, t1], each element of which corresponds to different values of the parameter α:
y(t) = α + ∫_t0^t u(ξ)dξ, t0 ≤ t ≤ t1. (10)</p>
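      <p>The role of α in (10) can be made concrete with a short sketch; it is our illustration, not part of the original paper. For one and the same input segment u, each choice of the parameter α picks out a different output of the integrator, so the pair (u, y) is a relation, not a function.</p>

```python
from scipy.integrate import quad

def integrator_output(alpha, u, t0, t):
    """y(t) = alpha + integral of u from t0 to t -- the pairs described by (10)."""
    return alpha + quad(u, t0, t)[0]

u = lambda x: 1.0          # constant input on the observation interval
t0, t = 0.0, 2.0
# The same input segment paired with different outputs, one per value of alpha:
outputs = [integrator_output(alpha, u, t0, t) for alpha in (0.0, 1.0, 5.0)]
print([round(y, 6) for y in outputs])  # [2.0, 3.0, 7.0]
```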
      <p>Any mathematical relation between u and y that defines the set of input–output pairs forming A is called the input–output characteristic for A. In this sense, (10) is an input–output characteristic for A. More generally, if the input and output signals of the system A satisfy a differential equation of the form
a_n(t) d^n y/dt^n + ··· + a_0(t)y = b_m(t) d^m u/dt^m + ··· + b_0(t)u, (11)
then this equation is the input–output characteristic for A, since it defines the set of all input–output pairs belonging to A.</p>
      <p>It is useful to parameterize the set of input–output pairs R(t0, t1] so that each segment of the input signal u(t0, t1] and each parameter value correspond to a single segment of the output signal y(t0, t1]. Such a parameterization would correspond, roughly speaking, to numbering the pages of the "list" of input–output pairs, on each page of which the pairs with the same output signal are written out. States are essentially the values of such a parameter. From this point of view, the main role of the concept of state is to provide the ability to associate a single output signal with each input signal, using the state of the system as a parameter.
We present the approach of L. Zadeh [Zad64] to the construction of the concept of state. Statement: based on the content of section 4, it can be assumed that a parameter α parameterizes A if there is some function A defined on the product Σ×R[u] and such that for all pairs (u, y) belonging to A and all t0 and t1 one can choose from Σ such an α that
y = A(α; u). (12)</p>
      <p>For each α from Σ and each u from R[u], the pair (u, A(α; u)) is then an input–output pair belonging to A. To call α the state of the system, the function A must have the property of conjugating reactions, which is formulated as follows. We agree that uv denotes a signal in which a segment v = v(t, t1] follows a segment u = u(t0, t]. This is one of the reasons for choosing half-open observation intervals: otherwise there would be a difficulty with the definition of uv at the point t if u(t) ≠ v(t). In particular, if by definition u = u(t0, t] and u′ = u(t, t1], then uu′ = u(t0, t1].</p>
      <p>Definition 1. A function A(α; u) has the property of conjugating reactions if for each α from Σ and each uu′ from R[uu′] there is an element α* from Σ, uniquely defined by α and u, such that</p>
      <p>A(α; uu′) = A(α; u)A(α*; u′). (13)</p>
      <p>Condition (13) means that the output signal (the response of the system corresponding to the value of the parameter α and the segment uu′ of the input signal) coincides with the response segment corresponding to the parameter α and the input signal u, followed by the response segment corresponding to the parameter α* and the input signal u′.</p>
      <p>Definition 2. If α is used to parameterize A, and the function A(α; u) has the property of conjugation of reactions, then the elements of Σ represent the states of A, the space Σ is called the state space of A, and A(α; u) is the input–output–state characteristic of the system A. If u = u(t0, t1], then α in A(α; u) is called the initial state of the system A at time t0 and is denoted by s(t0). In this notation, the input–output–state characteristic of the system A can be represented in the more explicit form
y(t0, t] = A(s(t0); u(t0, t]), (14)
where u(t0, t] is the segment of the input signal, s(t0) the initial state of the system, and y(t0, t] the corresponding output signal. Thus, equation (14) states that the initial state of the system A at the time t0 and the segment u(t0, t] of the input signal uniquely determine the segment of reactions y(t0, t].</p>
      <p>Definition 3. Let system A be in the state s(t0) = α, and let a signal u = u(t0, t] be applied to its input. Thanks to the conjugation of the reactions of A(α; u), there is an element α* ∈ Σ such that equation (13) holds for any u′ = u(t, t1].</p>
      <p>The element α*, which is uniquely determined by the values s(t0) and u = u(t0, t], is called the state of system A at time t and is denoted by s(t). Thus, the state of the system at time t is uniquely determined by the state of the system at time t0 and the value of the signal at its input in the interval between these points in time. Symbolically,
s(t) = s(s(t0), u(t0, t]), (15)
and the resulting equation is called the state equation of A. Therefore, the conjugation property of reactions (13) can be expressed as:</p>
      <p>A(s(t0); uu′) = A(s(t0); u)A(s(t); u′). (16)
The reaction of system A, which is in the state s(t0), to the input signal uu′ must be identical to the reaction of system A in the state s(t0) to the input signal u, followed by the reaction of the same system, now in the state s(t), to the input signal u′.</p>
      <p>In [Zad64] it is shown that if the function A(α; u) has the property of conjugation of reactions defined by equations (13) and (16), then the function in equation (15) has the property of conjugation of states:</p>
      <p>s(s(t0); uu′) = s(s(s(t0); u); u′). (17)
This property is equivalent to the group property 2 in the definition of a dynamic system. Consider a simple example with the input–output characteristic:
dy/dt + y = u. (18)</p>
      <p>In this case, the input–output pairs defined on (t0, t1] have the form
(u(t), αe^{-(t-t0)} + ∫_t0^t e^{-(t-ξ)}u(ξ)dξ), t0 &lt; t ≤ t1. (19)</p>
      <p>If we identify Σ with the axis of real numbers (0, ∞), then the parameter α from equation (19) can be used to parameterize A. Moreover, writing the equation
y(t) = αe^{-(t-t0)} + ∫_t0^t e^{-(t-ξ)}u(ξ)dξ, t0 &lt; t ≤ t1, (20)
it is easy to verify the validity of the identity
αe^{-(t-t0)} + ∫_t0^t e^{-(t-ξ)}u(ξ)dξ = α*e^{-(t-τ0)} + ∫_τ0^t e^{-(t-ξ)}u(ξ)dξ, (21)
where t0 ≤ τ0 ≤ t and
α* = αe^{-(τ0-t0)} + ∫_t0^τ0 e^{-(τ0-ξ)}u(ξ)dξ. (22)
Equation (20) is equivalent to a relation of the form (13), y = A(α; u), since it determines the values of y for t &gt; t0. Moreover, equations (21) and (22) indicate that the function on the right side of equation (20) has the property of conjugation of reactions. Therefore, equation (20) can be called the input–output–state characteristic for system A, where α is the state of the system at time t0 and Σ = (0, ∞). We also note that, putting t = t0 (which is valid if u does not contain delta functions with a singularity at the point t0), we obtain
s(t0) = α = y(t0). (23)</p>
      <p>It follows that the state of system A at time t0 can be identified with the output signal of this system at time t0. This concludes the definition of state and the example illustrating it.</p>
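      <p>The conjugation property claimed for the system dy/dt + y = u can be verified numerically. The sketch below is our illustration, with a constant input and concrete numbers for α, t0, τ0 chosen for the demonstration: it checks that restarting the system from the intermediate state α* = y(τ0) reproduces the same output value, and that s(t0) = α = y(t0).</p>

```python
import math
from scipy.integrate import quad

def y(alpha, u, t0, t):
    """Response of dy/dt + y = u from initial state alpha at time t0, eq. (20)."""
    integrand = lambda xi: math.exp(-(t - xi)) * u(xi)
    return alpha * math.exp(-(t - t0)) + quad(integrand, t0, t)[0]

u = lambda xi: 1.0
alpha, t0, tau0, t = 2.0, 0.0, 0.5, 1.0

direct = y(alpha, u, t0, t)            # run through the whole interval
alpha_star = y(alpha, u, t0, tau0)     # state reached at tau0, eq. (22)
restarted = y(alpha_star, u, tau0, t)  # restart from alpha* at tau0

print(math.isclose(direct, restarted))  # True: the identity (21) holds
print(y(alpha, u, t0, t0) == alpha)     # True: s(t0) = alpha = y(t0), eq. (23)
```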
      <p>As a result of the study of L. Zadeh's concept of "state", we note the following.</p>
      <p>1. The result is the introduction of the concept of an abstract object, defined as a family of ordered pairs of time functions. An abstract object is defined by itself, regardless of how the concept of state is introduced for it.</p>
      <p>2. The concept of state is introduced as a method of parameterization of a set of input–output pairs that provides a unique dependence of the output signal on the input signal and the state of the system. There are countless ways to parameterize input–output pairs. Hence we should conclude that any input–output characterization can correspond to many input–output–state characteristics, all of which are essentially equivalent. The input–output–state characteristic can be considered as a description of an oriented abstract object with a specific choice of a system of parameters for the set of its input–output pairs.</p>
      <p>3. Definition 3 extends to a broader class of systems than dynamic systems. In this regard, definitions 1 and 2 are more general definitions of the concept of state than the indirect definition contained implicitly in the definition of a dynamic system.
6 An Example of the Concept of State in the Information System
As the input u(t), we use the density functions of the distributions of the random variable of entropy (7), g1(t) and g2(t), shown in Figure 2.</p>
      <p>We apply these functions to construct the states and output functions of information systems, applying the results of the theory of L. Zadeh. To illustrate the calculations, we use the integrator element. Its input–output dependence is represented by the differential equation dy/dt = u(t).
In this equation, u1(t) = g1(t), u2(t) = g2(t). Since the integrator is represented by an OAO, the first dependence can be described by a set of ordered pairs of time functions of the following form (for example, for g1(t)). In this case, each fixed value g1(t0, t1] corresponds to a certain set y(t0, t1], each element of which corresponds to different values of the parameter α:
y(t) = α + ∫_t0^t g1(ξ)dξ, t0 ≤ t ≤ t1, t1 ∈ (0, ∞).</p>
      <p>This relation between g1(t) and y, which determines the set of input–output pairs that make up system A, is the input–output characteristic for A, and α is the state of the system. For this, however, it is necessary to require that the function of system A, on the basis of the parameterization, have the property of conjugating reactions, and to define a new function y = A(α; u) satisfying the property
A(α; uu′) = A(α; u)A(α*; u′), (27)
where α* = αe^{-(t-t0)} + ∫_t0^t e^{-(t-ξ)}g1(ξ)dξ, and uv is a signal in which the segment v = g1(t, t1] follows the segment u = g1(t0, t]. In this case we can assert that
s(t0) = α = y(t0), (28)
which is the state of the system at time t0; it can be identified with the output of this system at time t0. There follow
s(t) = s(s(t0); g1(t0, t]), (29)
s(s(t0); uu′) = s(s(s(t0); g1(t0, t]); g1(t, t1]). (30)</p>
      <p>Consider the numerical presentation of the example with the initial data for the maximum entropy point in Figure 1. The average entropy value and the standard deviation are v1 = 4.415 nat and σ = 0.707 nat. The integrator input function is u1(t) = dnorm(t, v1, σ). We take the initial values of time t0 = 1 and t0 = 3. For them, the state values are s(1) = 6.815 nat and s(3) = 0.023 nat, and the output variables are
y11(t) = s(1) + ∫_1^t u1(ξ)dξ;
y12(t) = s(3) + ∫_3^t u1(ξ)dξ,
and their integral components are
v11(t) = ∫_1^t u1(ξ)dξ; v12(t) = ∫_3^t u1(ξ)dξ.
Figures 3 and 4 show the graphs of these functions. It follows from the figures that there is practically no difference between the graphs.</p>
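      <p>The observation that the two graphs practically coincide can be checked directly: the difference v11(t) − v12(t) equals the fixed mass of u1 between t = 1 and t = 3, which is tiny for dnorm(t, 4.415, 0.707) because t = 1 and t = 3 both lie in the left tail. A sketch (our illustration, with dnorm taken as the SciPy normal density):</p>

```python
from scipy.stats import norm

v1, sigma = 4.415, 0.707   # mean and standard deviation of the entropy input
# v11(t) - v12(t) = integral of u1 over [1, 3], a constant for every t >= 3:
gap = norm.cdf(3, v1, sigma) - norm.cdf(1, v1, sigma)
print(round(gap, 4))  # ≈ 0.0227: the two integral components almost coincide
```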
      <p>Consider now the presentation of the example with the initial data for the point of the average value of entropy in Figure 1. The mean value of entropy and the standard deviation are v2 = 2.207 nat and σ = 2.263 nat. The integrator input function is u2(t) = dnorm(t, v2, σ). We take the initial values of time t0 = 1 and t0 = 3. For them, the state values are s(1) = 0.158 nat and s(3) = 0.565 nat, and the output variables are
y21(t) = s(1) + ∫_1^t u2(ξ)dξ;
y22(t) = s(3) + ∫_3^t u2(ξ)dξ,
and their integral components are
v21(t) = ∫_1^t u2(ξ)dξ; v22(t) = ∫_3^t u2(ξ)dξ.</p>
      <p>We have considered an example of calculation under the assumption that the second phase of the process does not depend on the duration of the first phase. This is not fully consistent with equation (32) below. If we take this dependence into account, we will have to build two-dimensional graphs of the calculations.
7 Analogy of the Theory of L. Zadeh and the Kolmogorov–Chapman Equations for Information Systems
Based on the study of the state model of L. Zadeh, a qualitative conclusion suggests itself: in an information system, the input state can be the value of the entropy distribution function at the initial moment of time, before the process of information transformation in the system begins. For the values of the variable at the output of the system, we take the values of the entropy distribution function obtained as a result of the transformation in the system.</p>
      <p>Heuristic statement. For a complex system, as a subject of future research into its informational properties, one can try to apply the Kolmogorov–Chapman equation [Fel57]. This equation can be described using the theory of L. Zadeh, but using the entropy distribution functions to determine the states and output variables of the system [Sma10].</p>
      <p>Consider an example that is simpler than the integrator, namely, a two-phase single-beam random process, from the standpoint of solving the simplest Kolmogorov–Chapman equation. This allows us to show the process of solving the Kolmogorov–Chapman equation and to compare the adequacy of the research with the theory of L. Zadeh. Let us present an example for the numerical illustration of the solution of the Kolmogorov–Chapman equation:
p02(t0, t + Δt) = p01(t0, t) × p12(t, t + Δt), t0 &lt; t &lt; t + Δt. (32)</p>
      <p>Equation (32) reflects the presence of three discrete states and two phases with continuous distributions following each other; moreover, the second phase depends on the first phase. It is required to calculate the output variable (state 02) if the initial state is determined by the delay in the first phase t0 and the continuous distributions are independent.</p>
      <p>The initial data: t0 = 10 nat; the first phase f01 = dnorm(t, v1, σ1), v1 = 50 nat, σ1 = 12 nat; the second phase f12 = dnorm(t, v2, σ2), v2 = 40 nat, σ2 = 7 nat. The probabilities that the phases will last at least t are represented as
p01(t) = ∫_t^∞ f01(z)dz, p12(t) = ∫_t^∞ f12(z)dz.
The variables t0 and t are measured in nat.</p>
      <p>Recall that we are investigating an information system defined by information states and outputs; the densities and probabilities introduced above are already measured in advance by entropy distributions. We perform the following numerical calculations.
A. The phases are independent:
p01(t) = ∫_{t0+t}^∞ f01(z)dz / ∫_t0^∞ f01(z)dz,
p12(t) = ∫_t^∞ f12(z)dz,
p02(t) = p01(t)p12(t). (33a)</p>
      <p>The results of the calculations are presented in Figure 7. For example, consider the values of the curves at the point t = 35 nat: p01(t) = 0.662, p12(t) = 0.762, p02(t) = 0.505.</p>
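      <p>The values read off Figure 7 can be recomputed from (33a) with the stated initial data. The sketch below is our illustration, taking dnorm as the SciPy normal density:</p>

```python
from scipy.stats import norm

t0 = 10.0
surv01 = lambda t: 1 - norm.cdf(t, 50, 12)   # integral of f01 from t to infinity
surv12 = lambda t: 1 - norm.cdf(t, 40, 7)    # integral of f12 from t to infinity

t = 35.0
p01 = surv01(t0 + t) / surv01(t0)  # conditional on surviving the delay t0
p12 = surv12(t)
p02 = p01 * p12                    # eq. (33a)
print(round(p01, 3), round(p12, 3), round(p02, 3))  # ≈ 0.662 0.762 0.505
```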
      <p>In Figure 8 (t0 = 30), for comparison, an analogue of Figure 7 (t0 = 10) is shown. To estimate the uncertainty function at the output of the system on the basis of the Kolmogorov–Chapman equation, the indicator y(t) = 1 − p02(t) should be used instead of the indicator p02(t).
The given example illustrates a method for determining the state and the magnitude of a function at the output of a system based on the solution of the Kolmogorov–Chapman equation. Real information systems are more complex: as a rule, they are "multipath" and can contain in each "information ray" more than two phases of random processes. The number of states (initial and intermediate) can be very large.</p>
      <p>B. The phases are dependent. In this case, the formulas for the probabilities of phase implementation take the form:
p01(t) = ∫_{t0+t}^∞ f01(z)dz / ∫_t0^∞ f01(z)dz,
p12(t, t + Δt) = ∫_{t+Δt}^∞ f12(z)dz / ∫_t^∞ f01(z)dz,
p02(t, t + Δt) = p01(t)p12(t, t + Δt). (33b)
Let t = 30, Δt = 15. Then p01(30) = 0.798, p12(30, 45) = 0.249, p02(30, 45) = 0.199. Since p02(t, Δt) is a function of two variables, we can construct a two-dimensional graphical dependence for it.</p>
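      <p>The dependent-phase numbers can be reproduced the same way. The sketch below is our illustration under the same initial data; note that, as in (33b), the second ratio is normalized by the tail of f01 from t:</p>

```python
from scipy.stats import norm

t0, t, dt = 10.0, 30.0, 15.0
surv01 = lambda x: 1 - norm.cdf(x, 50, 12)
surv12 = lambda x: 1 - norm.cdf(x, 40, 7)

p01 = surv01(t0 + t) / surv01(t0)   # first phase, conditioned on the delay t0
p12 = surv12(t + dt) / surv01(t)    # second phase, normalized as in (33b)
p02 = p01 * p12                     # eq. (33b)
print(round(p01, 3), round(p12, 3), round(p02, 3))  # ≈ 0.798 0.249 0.199
```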
      <p>Similarly, we can consider an example of the application of the Kolmogorov–Chapman equation covered by feedback. The use of systems of equations of the Kolmogorov–Chapman type for estimating and predicting the values of the indicators of entropic (informational) uncertainty can, in our opinion, be effective. It requires further study.</p>
    </sec>
    <sec id="sec-2">
      <title>8 Conclusion</title>
      <p>The essence of the proposed model is that the concept of state and the input and output of information systems should be measured by indicators of entropy and information. We therefore propose an informational modification of L. Zadeh's model. To work with such a model, it is necessary, on the basis of the method of moments of the random variable of entropy, to approximately construct the necessary distribution functions of all the components constituting the information system. Then, using these distribution functions, one can apply L. Zadeh's model. An example of calculation for the integrator [Zad64] is given.</p>
      <p>The essence of the method consists in modifying the method for solving the Kolmogorov–Chapman equation by applying in it the distribution functions of the random variable of entropy indicated in the preceding paragraph of the conclusion. The simplest examples, for an equation with three discrete states and two random phases with normal distributions, are considered. The general conclusion is that the state and output indicators of an information system should be measured by the entropy (information) associated with a certain probability.</p>
      <p>We present a number of modern works related to current applied research areas in the application of information system models, including those based on the use of entropy. In [Kud16], issues related to the concept of information and terminology in this area, as well as models of information, communication and infocommunication systems and their interconnection, are considered. In [Liv17], an analysis of information security systems is conducted from the position of determining the total entropy of an information system. In [San08], clustering algorithms based on multilevel entropy subgraphs are proposed. In [Kho16], a cloud computing model in information systems with a Web interface is proposed, based on a multi-channel queuing system with "cooling" and iterative solution of the Kolmogorov–Chapman equations. The concepts of informational entropy, rough entropy, knowledge granulation and measures of granularity in incomplete information systems are considered in [Lia06].</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [Fel57]
          <string-name>
            <given-names>W.</given-names>
            <surname>Feller</surname>
          </string-name>
          .
          <article-title>An Introduction to Probability Theory and its Applications</article-title>
          . Vol.
          <volume>1</volume>
          . John Wiley &amp; Sons Inc.
          <year>1957</year>
          . 528 p.
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [Kho16]
          <string-name>
            <given-names>A. D.</given-names>
            <surname>Khomonenko</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. I.</given-names>
            <surname>Gindin</surname>
          </string-name>
          , Khalil Maad Modher.
          <article-title>A cloud computing model using multi-channel queuing system with cooling</article-title>
          .
          <source>Proceedings of the 19th International Conference on Soft Computing and Measurements, SCM 2016</source>
          . Pp.
          <fpage>103</fpage>
          -
          <lpage>106</lpage>
          . DOI: 10.1109/SCM.2016.7519697.
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [Kud16]
          <string-name>
            <given-names>V. A.</given-names>
            <surname>Kudriashov</surname>
          </string-name>
          .
          <article-title>Logical-Graphic Analysis of Hierarchy of Information Processes [Logiko-graficheskiy analiz ierarkhii informatsionnykh protsessov]</article-title>
          .
          <source>Intellectual Technologies on Transport</source>
          .
          <year>2016</year>
          . No.
          <volume>2</volume>
          (
          <issue>6</issue>
          ). Pp.
          <volume>30</volume>
          -
          <fpage>35</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [Lia06]
          <string-name>
            <given-names>J.</given-names>
            <surname>Liang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Z.</given-names>
            <surname>Shi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Li</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. J.</given-names>
            <surname>Wierman</surname>
          </string-name>
          .
          <article-title>Information entropy, rough entropy and knowledge granulation in incomplete information systems</article-title>
          .
          <source>International Journal of General Systems</source>
          .
          <year>2006</year>
          . Vol.
          <volume>35</volume>
          , Iss.
          <issue>6</issue>
          . Pp.
          <fpage>641</fpage>
          -
          <lpage>654</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [Liv17]
          <string-name>
            <given-names>I. I.</given-names>
            <surname>Livshitz</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A. V.</given-names>
            <surname>Neklydov</surname>
          </string-name>
          .
          <article-title>Assessment of Entropy of Information Security Systems [K voprosu otsenivaniya entropii sistem obespecheniya informatsionnoy bezopasnosti]</article-title>
          .
          <source>Questions of cyber security [Voprosy kiberbezopasnosti]</source>
          .
          <year>2017</year>
          . No.
          <issue>5</issue>
          (24). Pp.
          <fpage>30</fpage>
          -
          <lpage>41</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [Mar14]
          <string-name>
            <given-names>V. M.</given-names>
            <surname>Markelov</surname>
          </string-name>
          .
          <article-title>About system information in additive information systems [O sistemnoy informatsii v additivnykh informatsionnykh sistemakh]</article-title>
          .
          <source>Perspectives of Science and Education [Perspektivy Nauki i Obrazovaniya]</source>
          ,
          <year>2014</year>
          , No.
          <issue>5</issue>
          (11). Pp.
          <fpage>31</fpage>
          -
          <lpage>36</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [Nem49]
          <string-name>
            <given-names>V. V.</given-names>
            <surname>Nemytskiy</surname>
          </string-name>
          .
          <article-title>Topological questions in the theory of dynamic systems [Topologicheskie voprosy v teorii dinamicheskikh system]</article-title>
          .
          <source>Achievements of mathematical sciences [Uspekhi matematicheskikh nauk]</source>
          .
          <year>1949</year>
          . Vol.
          <volume>4</volume>
          , Issue
          <issue>6</issue>
          (34). Pp.
          <fpage>91</fpage>
          -
          <lpage>153</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [San08]
          <string-name>
            <given-names>J. M.</given-names>
            <surname>Santos</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J. M.</given-names>
            <surname>Sa</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L. A.</given-names>
            <surname>Alexandre</surname>
          </string-name>
          .
          <article-title>LEGClust - a clustering algorithm based on layered entropic subgraphs</article-title>
          .
          <source>IEEE Transactions on Pattern Analysis and Machine Intelligence</source>
          ,
          <year>2008</year>
          , vol.
          <volume>30</volume>
          , no.
          <issue>1</issue>
          , pp.
          <fpage>1</fpage>
          -
          <lpage>13</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [Sma10]
          <string-name>
            <given-names>V. A.</given-names>
            <surname>Smagin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. Yu.</given-names>
            <surname>Smagin</surname>
          </string-name>
          .
          <article-title>Approximate Determination of the Distribution of Entropy</article-title>
          .
          <source>Automatic Control and Computer Sciences</source>
          .
          <year>2010</year>
          , No.
          <issue>2</issue>
          (44). Pp.
          <fpage>27</fpage>
          -
          <lpage>37</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [Zad63]
          <string-name>
            <given-names>L. A.</given-names>
            <surname>Zadeh</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C. A.</given-names>
            <surname>Desoer</surname>
          </string-name>
          .
          <source>Linear System Theory: The State Space Approach</source>
          . McGraw-Hill, N.Y.
          ,
          <year>1963</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [Zad64]
          <string-name>
            <given-names>L. A.</given-names>
            <surname>Zadeh</surname>
          </string-name>
          .
          <article-title>The concept of state in system theory</article-title>
          .
          <source>Views on General Systems Theory</source>
          . New York: John Wiley and Sons, Inc.,
          <year>1964</year>
          . 188 p. Pp.
          <fpage>49</fpage>
          -
          <lpage>60</lpage>
          .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>