=Paper=
{{Paper
|id=Vol-2341/paper-10
|storemode=property
|title=Using Entropy Function for Definition States of Information System
|pdfUrl=https://ceur-ws.org/Vol-2341/paper-09.pdf
|volume=Vol-2341
|authors=Vladimir A. Smagin,Anatoly D. Khomonenko
}}
==Using Entropy Function for Definition States of Information System==
Vladimir A. Smagin, Anatoly D. Khomonenko
Department of Information Systems and Technologies, Emperor Alexander I St. Petersburg State Transport University, St. Petersburg, Russia
va_smagin@mail.ru, khomon@mail.ru

Copyright © by the papers' authors. Copying permitted for private and academic purposes. In: B. V. Sokolov, A. D. Khomonenko, A. A. Bliudov (eds.): Selected Papers of the Workshop Computer Science and Engineering in the framework of the 5th International Scientific-Methodical Conference "Problems of Mathematical and Natural-Scientific Training in Engineering Education", St. Petersburg, Russia, 8–9 November, 2018, published at http://ceur-ws.org

Abstract

The concept of a state in L. Zadeh's theory of systems is studied in detail. The concept is then developed with reference to information systems. We propose to define the state, and to estimate it quantitatively together with the output of an information system, by means of the entropy distribution function. A transition from L. Zadeh's theory to the use of the Kolmogorov–Chapman equations is proposed. Distribution functions of entropy constructed from the initial data are recommended as the starting point.

1 Introduction

The concept of state is often used in science and technology. It is defined most simply in the theory of the operation of systems, where it is just the set of values of the parameters of the elements of the system. In systems theory, however, the concept of state is treated more precisely and strictly, depending on the type of system. Researchers associate this concept above all with dynamic systems, more precisely with continuous and discrete systems.

Currently there is particular interest in the study of information systems. For example, in [Mar14] the value of entropy is considered in the study of the state of information systems, and the essence of informational entropy is analyzed. However, the concept of state in information systems is not formally defined. This article solves two problems: first, to connect the concept of state with the classical results of systems theory, and second, to connect this concept with the achievements of modern information theory. A verbal formulation of the concept of the state of an information system follows from the results of the article.

2 Formal Analogue of the State from Dynamic Systems Theory

A dynamical system, according to [Nem49], is a group of transformations $\{R_t\}$ defined on a separable metric space $R$ and having the following properties:

1. $R_t$ is defined for all $t$ on $-\infty < t < \infty$.

2. The function $q = f(p, t)$, where $q$ is the image of a point $p$ from $R$ under $R_t$, has the group property

$$f(p, t_0 + t) = f(f(p, t_0), t). \qquad (1)$$

3. The group $R_t$ is continuous in the sense that for all $t_0$ and $p_0$ and all sequences $\{t_n\}$ and $\{p_n\}$ converging to $t_0$ and $p_0$, the relation

$$\lim_{n \to \infty} f(p_n, t_n) = f(p_0, t_0) \qquad (2)$$

holds.

The element $p$ of $R$ is the state of the dynamic system, and $q = f(p, t)$ describes the state of the system at the moment $t$, provided that at the moment $t = 0$ the system was in the state $p$. This definition was formulated on the basis of the analysis of problems of celestial mechanics and of the dynamics of a rigid body; therefore, the inputs and outputs of the system are not explicitly highlighted in it. The definition requires a slight change.
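The group property (1) is easy to see on a concrete flow. Below is a minimal sketch, not from the paper: it takes the scalar exponential-decay system $\dot p = -p$, whose flow is $f(p, t) = p e^{-t}$ (an illustrative choice of dynamical system), and checks the property numerically.

```python
import math

def flow(p: float, t: float) -> float:
    """Flow of the scalar system dp/dt = -p: f(p, t) = p * exp(-t)."""
    return p * math.exp(-t)

p, t0, t = 3.0, 0.7, 1.9

# Group property (1): evolving for t0 + t equals evolving for t0, then for t.
lhs = flow(p, t0 + t)
rhs = flow(flow(p, t0), t)
print(lhs, rhs)                    # both ~0.223
assert math.isclose(lhs, rhs)
```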
3 Formal Analogue of the State from Information Theory

An information system is a group of transformations $\{H_t\}$ defined on the probabilistic space $H$ and possessing the following properties:

1. The transformations $H_t$ are defined for all $t$ on $[0, \infty)$.

2. The function $g = f(h, t)$, where $g$ is the image of a point $h$ from $H$ under $H_t$, has the group property

$$f(h, t_0 + t) = f(f(h, t_0), t). \qquad (3)$$

3. The group $H_t$ is continuous in the sense that for all $t_0$ and $h_0$ and all sequences $\{t_n\}$ and $\{h_n\}$ converging to $t_0$ and $h_0$, the relation

$$\lim_{n \to \infty} f(h_n, t_n) = f(h_0, t_0) \qquad (4)$$

holds.

The element $h$ of $H$ is the state of the information system, and $g = f(h, t)$ describes the state of the information system at the time $t$, provided that at the time $t = 0$ the system was in the state $h$. This definition needs to be specified and clarified.

As the function $f(t)$, in our opinion, we can take the density or the entropy distribution function supplied to the input of the system. For example, we consider the differential entropy for a normal distribution with probability density $f(t) = \mathrm{dnorm}(t, m, \sigma)$, $m = 100$ units, $\sigma = 20$ units. It has the form

$$h_1(t) = -\int_0^t f(z) \ln(f(z))\, dz. \qquad (5)$$

Expression (5) is the first initial moment of the random entropy. The second initial moment of entropy is

$$h_2(t) = \int_0^t f(z) (\ln(f(z)))^2\, dz. \qquad (6)$$

Similarly, we can find the higher initial moments of entropy. In practical applications it is enough to restrict oneself to two moments. A graph of the function $h_1(t)$ is shown in Figure 1.

Figure 1: Graph of the h1(t) function

For our example, the minimum entropy value is 0 nat and the maximum is 4.415 nat; the median value is $h_1(100) = 2.207$ nat. For the first case, the initial moments are $v_1(800) = 4.415$ nat and $v_2(800) = 19.989$ nat², the standard deviation is $\delta(800) = 0.707$ nat, and the coefficient of variation is $\eta(800) = 0.16$. For the second case, the corresponding values are $v_1(100) = 2.207$ nat, $v_2(100) = 9.905$ nat², $\delta(100) = 2.263$ nat, $\eta(100) = 1.0125$. For the given data, the probability densities of the random entropy values are determined by the relations

$$g_1(x) = \frac{1}{\sqrt{2\pi} \cdot 0.707}\, e^{-\frac{(x - 4.415)^2}{2 \cdot 0.707^2}}, \qquad
g_2(x) = \frac{1}{\sqrt{2\pi} \cdot 2.263}\, e^{-\frac{(x - 2.207)^2}{2 \cdot 2.263^2}}. \qquad (7)$$

Graphs of the densities $g_1(x)$ and $g_2(x)$ are provided below in Figure 2.
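The quoted values can be reproduced numerically. The following is a minimal sketch, not the authors' own code: it evaluates the moments (5) and (6) with SciPy, where the paper's $\mathrm{dnorm}(t, m, \sigma)$ corresponds to `scipy.stats.norm(m, sigma).pdf`; the guard against $f(z) = 0$ in the far tail is an implementation detail added here.

```python
import numpy as np
from scipy import integrate
from scipy.stats import norm

m, sigma = 100.0, 20.0
f = norm(m, sigma).pdf             # the paper's dnorm(t, 100, 20)

def entropy_moment(t: float, k: int) -> float:
    """k-th initial moment of the random entropy: int_0^t f(z) (-ln f(z))^k dz."""
    def integrand(z: float) -> float:
        fz = f(z)
        return fz * (-np.log(fz)) ** k if fz > 0.0 else 0.0
    return integrate.quad(integrand, 0.0, t, limit=200)[0]

v1 = entropy_moment(800.0, 1)      # ~4.415 nat, eq. (5)
v2 = entropy_moment(800.0, 2)      # ~19.989 nat^2, eq. (6)
delta = np.sqrt(v2 - v1 ** 2)      # ~0.707 nat
print(v1, v2, delta, delta / v1)   # coefficient of variation ~0.16
print(entropy_moment(100.0, 1))    # median value h1(100) ~2.207 nat
```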
4 The Concept of the Oriented Abstract Object of L. Zadeh

According to [Zad64, Zad63], an oriented abstract object (OAO) is understood as a certain system associated with some input signal (cause) $u$ and output signal (consequence) $y$. Both signals are understood as vector functions of time. The relationship between them is not straightforward: a specific input function $u$ can correspond to several output functions $y$, and, conversely, a specific output signal can correspond to several input functions.

To formalize the OAO, the segment of the function $u$ defined on the observation interval $[t_0, t_1]$ is denoted $u[t_0, t_1]$ on the closed interval or $u(t_0, t_1]$ on the semi-open interval; depending on the context, simply $u$. As a result of an experimental study, a set of input–output pairs $(u(t_0, t_1], y(t_0, t_1])$ is usually obtained.

If the same signal is applied to the input of another sample of the test device, the output signal does not have to be the same as in the first case, since the initial conditions for the second sample may be different. Therefore, the definition reflects the fact that more than one $y(t_0, t_1]$ can correspond to a given $u(t_0, t_1]$. The set of ordered pairs of time functions on the specified interval is denoted as

$$R(t_0, t_1] = \{u(t_0, t_1],\; y(t_0, t_1]\}. \qquad (8)$$

Based on this concept, the following definition is proposed in [Zad64]. An OAO $A$ is a family $R(t_0, t_1] = \{u(t_0, t_1], y(t_0, t_1]\}$, $t_0, t_1 \in (0, \infty)$, of sets of ordered pairs $(u, y)$ of time functions. Here the first element in (8) is called the segment of the input signal, or simply the input signal, and the second is called the segment of the output signal, or simply the output signal. Thus, the OAO is identified with the set of input–output pairs that belong to $A$. In addition, any segment of a pair for which $t_0 \le \tau_0 \le t_1$, $\tau_0 \le \tau_1 \le t_1$ must belong to $A$.

The set of all segments $u$ on the interval $(t_0, t_1]$ such that $(u, y) \in A$ is called the space of input signals of $A$ and is denoted $R[u]$. Similarly, the set of all segments $y$ such that $(u, y) \in A$ is called the output signal space and is denoted $R[y]$. It follows that the set $R(t_0, t_1]$ of all pairs $(u(t_0, t_1], y(t_0, t_1]) \in A$ is some subset of the product $R[u] \times R[y]$. In the "list" of ordered pairs $(u, y)$, each fixed $u$ corresponds, generally speaking, to a set of different $y$ and, conversely, each fixed $y$ corresponds to a set of different pairs.

From a mathematical point of view, this essentially amounts to defining the system as a relation rather than, as usual, as some function or operator. The difference can be explained by the example of the integrator. The values of the input and output signals at the same time $t$ are related by the differential equation

$$\frac{dy(t)}{dt} = u(t). \qquad (9)$$

The statement that the integrator is an OAO can be described by the set of ordered pairs of time functions of the form

$$\left(u(t),\; \alpha + \int_{t_0}^{t} u(\xi)\, d\xi\right), \qquad t_0 \le t \le t_1 \in (0, \infty),$$

where the parameter $\alpha$ belongs to the space of real numbers, and the function $u$ belongs to the class of time functions integrable on any finite interval. In this case, each fixed value $u(t_0, t_1]$ corresponds to a set $y(t_0, t_1]$, each element of which corresponds to a different value of the parameter $\alpha$:

$$y(t) = \alpha + \int_{t_0}^{t} u(\xi)\, d\xi, \qquad t_0 \le t \le t_1. \qquad (10)$$

Any mathematical relation between $u$ and $y$ that defines the set of input–output pairs forming $A$ is called the input–output characteristic of $A$. In this sense, (10) is an input–output characteristic of $A$. More generally, if the input and output signals of the system $A$ satisfy a differential equation of the form

$$a_n(t) \frac{d^n y}{dt^n} + \cdots + a_0(t)\, y = b_m(t) \frac{d^m u}{dt^m} + \cdots + b_0(t)\, u, \qquad (11)$$

then this equation is the input–output characteristic of $A$, since it defines the set of all input–output pairs belonging to $A$.

It is useful to parameterize the set of input–output pairs $R(t_0, t_1]$ so that each segment of the input signal $u(t_0, t_1]$ and each parameter value correspond to a single segment of the output signal $y(t_0, t_1]$. Such a parameterization would correspond, roughly speaking, to numbering the pages of the "list" of input–output pairs, on each page of which pairs with the same output signals are written out. States are essentially the values of such a parameter. From this point of view, the main role of the concept of state is to provide the ability to associate a single output signal with each input signal, using the state of the system as a parameter.
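As a small illustration of this parameterization idea (not from the paper), the sketch below builds output segments (10) of the integrator for one fixed input and several values of $\alpha$; each $\alpha$ picks out one "page of the list" of input–output pairs. The input signal $\sin t$ is an arbitrary choice made here.

```python
import numpy as np
from scipy import integrate

t0, t1 = 0.0, 5.0
t = np.linspace(t0, t1, 501)
u = np.sin(t)                                   # an illustrative input signal

# cumulative integral of u from t0 to each t
integral = integrate.cumulative_trapezoid(u, t, initial=0.0)

for alpha in (0.0, 1.0, -2.5):                  # three "pages of the list"
    y = alpha + integral                        # eq. (10)
    print(f"alpha={alpha:5.1f}  y(t1)={y[-1]: .4f}")
```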
5 Concept of State

We present L. Zadeh's approach to the construction of the concept of state [Zad64]. Statement: based on the content of Section 4, it can be assumed that a parameter $\alpha$ parameterizes $A$ if there is some function $A$ defined on the product $\Sigma \times R[u]$ such that for every pair $(u, y)$ belonging to $A$ and all $t_0$ and $t_1$, an $\alpha$ can be chosen from $\Sigma$ such that

$$y = A(\alpha; u). \qquad (12)$$

For each $\alpha$ from $\Sigma$ and each $u$ from $R[u]$, the pair $(u, A(\alpha; u))$ is then an input–output pair belonging to $A$. To call $\alpha$ the state of the system, the function $A$ must have the property of conjugation of reactions, which is formulated as follows. We agree that $uv$ denotes a signal in which a segment $v = v(t, t_1]$ follows a segment $u = u(t_0, t]$. This is one of the reasons for choosing half-open observation intervals: otherwise there would be a difficulty with the definition of $uv$ at the point $t$ when $u(t) \ne v(t)$. In particular, if by definition $u = u(t_0, t]$ and $u' = u(t, t_1]$, then $uu' = u(t_0, t_1]$.

Definition 1. A function $A(\alpha; u)$ has the property of conjugation of reactions if for each $\alpha$ from $\Sigma$ and each $uu'$ of $R[uu']$ there is an element $\alpha^*$ from $\Sigma$, uniquely defined by $\alpha$ and $u$, such that

$$A(\alpha; uu') = A(\alpha; u)\, A(\alpha^*; u'). \qquad (13)$$

Condition (13) means that the output signal (the response of the system corresponding to the value of the parameter $\alpha$ and the combined input segment $uu'$) coincides with the response segment corresponding to the parameter $\alpha$ and the input signal $u$, followed by the response segment corresponding to the parameter $\alpha^*$ and the input signal $u'$.

Definition 2. If $\alpha$ is used to parameterize $A$ and the function $A(\alpha; u)$ has the property of conjugation of reactions, then the elements of $\Sigma$ represent the states of $A$, the space $\Sigma$ is called the state space of $A$, and (12) is the input–output–state characteristic of the system $A$. If $u = u(t_0, t_1]$, then $\alpha$ in $A(\alpha; u)$ is called the initial state of the system $A$ at time $t_0$ and is denoted by $s(t_0)$. In this regard, the input–output–state characteristic of the system $A$ can be represented in a more explicit form as

$$y(t_0, t] = A(s(t_0); u(t_0, t]), \qquad (14)$$

where $u(t_0, t]$ is the segment of the input signal, $s(t_0)$ is the initial state of the system, and $y(t_0, t]$ is the corresponding output signal. Thus, equation (14) states that the initial state of the system $A$ at the time $t_0$ and the segment $u(t_0, t]$ of the input signal uniquely determine the segment of reactions $y(t_0, t]$.

Definition 3. Let the system $A$ be in the state $s(t_0) = \alpha$ and let a signal $u = u(t_0, t]$ be given at its input. Thanks to the conjugation of the reactions $A(\alpha; u)$, there is an element $\alpha^* \in \Sigma$ such that equation (13) holds for any $u' = u(t, t_1]$. The element $\alpha^*$, which is uniquely determined by the values $s(t_0)$ and $u = u(t_0, t]$, is called the state of the system $A$ at time $t$ and is denoted by $s(t)$. Thus, the state of the system at time $t$ is uniquely determined by the state of the system at time $t_0$ and the value of the signal at its input in the interval between these points in time. Symbolically,

$$s(t) = s(s(t_0), u(t_0, t]), \qquad (15)$$

and the resulting equation is called the state equation of $A$. Therefore, the conjugation property of reactions (13) can be expressed as

$$A(s(t_0); uu') = A(s(t_0); u)\, A(s(t); u'). \qquad (16)$$

The reaction of the system $A$, which is in the state $s(t_0)$, to the input signal $uu'$ must be identical to the response of the system $A$, which is in the state $s(t_0)$, to the input signal $u$, followed by the reaction of the same system, which is in the state $s(t)$, to the input signal $u'$.

In [Zad64] it is shown that if the function $A(\alpha; u)$ has the property of conjugation of reactions defined by equations (13) and (16), then the function from equation (15) has the property of conjugation of states:

$$s(s(t_0); uu') = s(s(s(t_0); u); u'). \qquad (17)$$

This property is equivalent to the group property 2 in the definition of the dynamic system.
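For the pure integrator of Section 4, conjugation of reactions holds with $\alpha^* = \alpha + \int_{t_0}^{\tau} u(\xi)\, d\xi$. Below is a minimal numeric check of Definitions 1 and 3 (an illustration assumed here, not the paper's code); the input $\cos t$ and the constants are arbitrary choices.

```python
import numpy as np
from scipy import integrate

u = np.cos                                      # an illustrative input signal
t0, tau = 0.0, 2.0
alpha = 1.5

def A(alpha: float, start: float, t: float) -> float:
    """Integrator response at time t, starting from state alpha at `start`."""
    return alpha + integrate.quad(u, start, t)[0]

# Definition 3: the state at tau is fixed by alpha and the segment u(t0, tau]
alpha_star = A(alpha, t0, tau)

# Eq. (13): on (tau, t1], the response to the joined segment uu' coincides
# with the response to u' started from alpha*.
for t in (2.5, 3.5, 5.0):
    assert np.isclose(A(alpha, t0, t), A(alpha_star, tau, t))
print("alpha* =", alpha_star)
```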
Consider a simple example with the input–output characteristic

$$\frac{dy}{dt} + y = u. \qquad (18)$$

In this case, the input–output pairs defined on $A$ have the form

$$\left(u(t),\; \alpha e^{-(t - t_0)} + \int_{t_0}^{t} e^{-(t - \xi)} u(\xi)\, d\xi\right), \qquad t_0 < t \le t_1. \qquad (19)$$

If we identify $\Sigma$ with the axis of real numbers $(0, \infty)$, then the parameter $\alpha$ from equation (19) can be used to parameterize $A$. Moreover, writing the equation

$$y(t) = \alpha e^{-(t - t_0)} + \int_{t_0}^{t} e^{-(t - \xi)} u(\xi)\, d\xi, \qquad t_0 < t \le t_1, \qquad (20)$$

it is easy to verify the validity of the identity

$$\alpha e^{-(t - t_0)} + \int_{t_0}^{t} e^{-(t - \xi)} u(\xi)\, d\xi
= \alpha^* e^{-(t - \tau_0)} + \int_{\tau_0}^{t} e^{-(t - \xi)} u(\xi)\, d\xi, \qquad (21)$$

where $t_0 \le \tau_0 \le t$ and

$$\alpha^* = \alpha e^{-(\tau_0 - t_0)} + \int_{t_0}^{\tau_0} e^{-(\tau_0 - \xi)} u(\xi)\, d\xi. \qquad (22)$$

Equation (20) is equivalent to a relation of the form (12), $y = A(\alpha; u)$, since it determines the values of $y$ for $t > t_0$. Moreover, equations (20) and (22) indicate that the function on the right side of equation (20) has the property of conjugation of reactions. Therefore, equation (20) can be called the input–output–state characteristic of the system $A$, where $\alpha$ is the state of the system at time $t_0$ and $\Sigma = (0, \infty)$. We also note that putting $t = t_0$ in (20) (which is valid if $u$ does not contain delta functions with a singularity at the point $t_0$), we obtain

$$s(t_0) = \alpha = y(t_0). \qquad (23)$$

It follows that the state of the system $A$ at time $t_0$ can be identified with the output signal of this system at time $t_0$. This concludes the definition of state and the example illustrating the definition.

As a result of the study of L. Zadeh's concept of "state", we note the following.

1. The result is the introduction of the concept of an abstract object, defined as a family of ordered pairs of time functions. An abstract object is defined by itself, regardless of how the concept of state is introduced for it.

2. The concept of state is introduced as a method of parameterization of a set of input–output pairs that provides a unique dependence of the output signal on the input signal and the state of the system. There are countless ways to parameterize input–output pairs. Hence we should conclude that any input–output characteristic can match many input–output–state characteristics, all of which are essentially equivalent. The input–output–state characteristic can be considered as a description of an oriented abstract object with a specific choice of a system of parameters for the set of its input–output pairs.

3. Definition 3 extends to a broader class of systems than dynamic systems. In this regard, Definitions 1 and 2 are more general definitions of the concept of state than the indirect definition of the concept contained implicitly in the definition of a dynamic system.
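As a brief check of the example above, the following sketch (not from the paper) evaluates the response (20) and verifies identity (21) with $\alpha^*$ from (22); the input $\sin t$ and the numeric values of $t_0$, $\tau_0$, $\alpha$ are arbitrary choices made here.

```python
import numpy as np
from scipy import integrate

u = np.sin                                        # an illustrative input
t0, tau0, alpha = 0.0, 1.0, 2.0

def y(a: float, start: float, t: float) -> float:
    """Eq. (20): response at t from state a at time `start`."""
    kernel = lambda xi: np.exp(-(t - xi)) * u(xi)
    return a * np.exp(-(t - start)) + integrate.quad(kernel, start, t)[0]

# eq. (22): the state at tau0 determined by alpha and the segment u(t0, tau0]
alpha_star = y(alpha, t0, tau0)

for t in (1.5, 2.5, 4.0):
    assert np.isclose(y(alpha, t0, t), y(alpha_star, tau0, t))  # identity (21)
print("s(t0) = y(t0) =", y(alpha, t0, t0))        # eq. (23): equals alpha
```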
6 An Example of the Concept of State in the Information System

As the input $u(t)$ we use the density functions (7) of the distributions of the random variable of entropy, $g_1(t)$ and $g_2(t)$, shown in Figure 2.

Figure 2: Density functions

We apply these functions to construct the states and output functions of information systems, applying the results of the theory of L. Zadeh. To illustrate the calculations, we use the integrator element. Its input–output dependence is represented by the differential equation

$$\frac{dy}{dt} + y = u. \qquad (24)$$

In this equation, $u_1(t) = g_1(t)$, $u_2(t) = g_2(t)$. Since the integrator is represented by an OAO, the first dependence can be described by a set of ordered pairs of time functions of the following form (for example, for $g_1(t)$):

$$\left(g_1(t),\; \alpha + \int_{t_0}^{t} g_1(\xi)\, d\xi\right), \qquad t_0 \le t \le t_1 \in (0, \infty). \qquad (25)$$

In this case, each fixed value $g_1(t_0, t_1]$ corresponds to a certain set $y(t_0, t_1]$, each element of which corresponds to a different value of the parameter $\alpha$:

$$y(t) = \alpha + \int_{t_0}^{t} g_1(\xi)\, d\xi, \qquad t_0 \le t \le t_1, \; t \in (0, \infty). \qquad (26)$$

This relationship between $g_1(t)$ and $y$, which determines the set of input–output pairs that make up the system $A$, is the input–output characteristic for $A$, and $\alpha$ is the state of the system. But for this it is necessary to require that the function of the system $A$, on the basis of the parameterization, have the property of conjugation of reactions, and to define a new function $y = A(\alpha; u)$ satisfying the property

$$A(\alpha; uu') = A(\alpha; u)\, A(\alpha^*; u'), \qquad (27)$$

where $\alpha^* = \alpha e^{-(t - t_0)} + \int_{t_0}^{t} e^{-(t - \xi)} g_1(\xi)\, d\xi$, and $uv$ is a signal in which the segment $v = g_1(t, t_1]$ follows the segment $u = g_1(t_0, t]$. In this case we can assert

$$s(t_0) = \alpha = y(t_0), \qquad (28)$$

which means the state of the system at time $t_0$; it can be identified with the output of this system at time $t_0$. There follows

$$s(t) = s(s(t_0); g_1(t_0, t]), \qquad (29)$$

$$s(s(t_0); uu') = s(s(s(t_0); g_1(t_0, t]); g_1(t, t_1]). \qquad (30)$$

The value of the output variable is defined as

$$y(t_0, t] = A(s(t_0); g_1(t_0, t]). \qquad (31)$$

Consider the numerical presentation of the example with the initial data for the maximum entropy point in Figure 1. The average entropy value and standard deviation are $v_1 = 4.415$ nat and $\sigma = 0.707$ nat. The integrator input function is $u_1(t) = \mathrm{dnorm}(t, v_1, \sigma)$. We take the initial values of time $t_0 = 1$ and $t_0 = 3$ h. For them, the state values are $s(1) = 6.815 \cdot 10^{-7}$ nat and $s(3) = 0.023$ nat, and the output variables are

$$y_{11}(t) = s(1) + \int_{1}^{t} u_1(\xi)\, d\xi, \qquad y_{12}(t) = s(3) + \int_{3}^{t} u_1(\xi)\, d\xi,$$

and their integral components are

$$v_{11}(t) = \int_{1}^{t} u_1(\xi)\, d\xi, \qquad v_{12}(t) = \int_{3}^{t} u_1(\xi)\, d\xi.$$

Graphs of these functions are shown in Figures 3 and 4. It follows from the figures that there is practically no difference between the graphs.

Figure 3: Graphs of y11(t) and y12(t)

Figure 4: Graphs of v11(t) and v12(t)
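A sketch reproducing these numbers (not the authors' code). It assumes, consistently with the formula for $s(t_0)$ in Section 7, that the state is the value of the entropy distribution function, i.e. the CDF of $u_1$ at $t_0$; under this assumption $s(1) \approx 6.8 \cdot 10^{-7}$ and $s(3) \approx 0.023$, and the curves $y_{11}$ and $y_{12}$ coincide.

```python
import numpy as np
from scipy import integrate
from scipy.stats import norm

v1, sigma = 4.415, 0.707
u1 = norm(v1, sigma)             # the paper's dnorm(t, v1, sigma)

s1, s3 = u1.cdf(1.0), u1.cdf(3.0)
print(s1, s3)                    # ~6.8e-07 nat and ~0.023 nat

def y(s0: float, t0: float, t: float) -> float:
    """Integrator output: y(t) = s(t0) + int_{t0}^{t} u1(xi) dxi."""
    return s0 + integrate.quad(u1.pdf, t0, t)[0]

ts = np.linspace(3.0, 8.0, 11)
y11 = np.array([y(s1, 1.0, t) for t in ts])
y12 = np.array([y(s3, 3.0, t) for t in ts])
print(np.abs(y11 - y12).max())   # ~0: the two curves practically coincide
```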
Consider the presentation of the example with the initial data for the point of the average entropy value in Figure 1. The mean value of entropy and the standard deviation are $v_2 = 2.207$ nat and $\sigma = 2.263$ nat. The integrator input function is $u_2(t) = \mathrm{dnorm}(t, v_2, \sigma)$. We take the initial values of time $t_0 = 1$ and $t_0 = 3$ h. For them, the state values are $s(1) = 0.158$ nat and $s(3) = 0.565$ nat, and the output variables are

$$y_{21}(t) = s(1) + \int_{1}^{t} u_2(\xi)\, d\xi, \qquad y_{22}(t) = s(3) + \int_{3}^{t} u_2(\xi)\, d\xi,$$

and their integral components are

$$v_{21}(t) = \int_{1}^{t} u_2(\xi)\, d\xi, \qquad v_{22}(t) = \int_{3}^{t} u_2(\xi)\, d\xi.$$

Figures 3–6 show how the values of the variables at the integrator output, measured by the value of the entropy distribution function, depend on $t_0$ and $t$. They can also act as the values of future states in the case of continuation of the process under consideration in time.

Figure 5: Graphs of y21(t) and y22(t)

Figure 6: Graphs of v21(t) and v22(t)

We have considered an example of calculation under the assumption that the second phase of the process does not depend on the duration of the first phase. This is not fully consistent with equation (32) below. If we take this dependence into account, we will have to build two-dimensional graphs of the calculations.

7 Analogy of the Theory of L. Zadeh and the Kolmogorov–Chapman Equations for Information Systems

Based on the study of the state model of L. Zadeh, a qualitative conclusion suggests itself: in an information system, the input state can be the value of the entropy distribution function at the initial moment of time, before the process of information transformation in the system begins. For the values of the variable at the output of the system, take the values of the entropy distribution function obtained as a result of the transformation in the system.

Heuristic statement. For a complex system, as a subject of future research into its informational properties, try to apply the Kolmogorov–Chapman equation [Fel57]. This equation can be described using the theory of L. Zadeh, but using the entropy distribution functions to determine the states and output variables of the system [Sma10].

Consider an example that is simpler than the integrator, namely a two-phase single-beam random process, from the standpoint of solving the simplest Kolmogorov–Chapman equation. This allows us to show the process of solving the Kolmogorov–Chapman equation and to compare the adequacy of the research with the theory of L. Zadeh. Let us present an example for the numerical illustration of the solution of the Kolmogorov–Chapman equation:

$$p_{02}(t_0, t + \Delta t) = p_{01}(t_0, t) \cdot p_{12}(t, t + \Delta t), \qquad t_0 < t < t + \Delta t. \qquad (32)$$

Equation (32) reflects the presence of three discrete states and two phases with continuous distributions following each other; moreover, the second phase depends on the first phase. It is required to calculate the output variable (state 02), if the initial state is determined by the delay in the first phase $t_0$ and the continuous distributions are independent. The initial data: $t_0 = 10$ nat; the first phase $f_{01} = \mathrm{dnorm}(t, v_1, \sigma_1)$, $v_1 = 50$ nat, $\sigma_1 = 12$ nat; the second phase $f_{12} = \mathrm{dnorm}(t, v_2, \sigma_2)$, $v_2 = 40$ nat, $\sigma_2 = 7$ nat. The probabilities that the phases will last at least $t$ are represented as

$$p_{01}(t) = \int_{t}^{\infty} f_{01}(z)\, dz, \qquad p_{12}(t) = \int_{t}^{\infty} f_{12}(z)\, dz.$$

The variables $t_0$, $t$ are measured in nat. Recall that we are investigating an information system defined by informational states and outputs; the densities and probabilities introduced above are already measured in advance by entropy distributions. Perform the following numerical calculations.

A. Phases are independent:

$$p_{01}(t) = \int_{t_0 + t}^{\infty} f_{01}(z)\, dz \bigg/ \int_{t_0}^{\infty} f_{01}(z)\, dz, \qquad
p_{12}(t) = \int_{t}^{\infty} f_{12}(z)\, dz, \qquad
p_{02}(t) = p_{01}(t)\, p_{12}(t). \qquad (33a)$$

The results of the calculations are presented in Figure 7. For example, the values of the curves at the point $t = 35$ nat are $p_{01}(35) = 0.662$, $p_{12}(35) = 0.762$, $p_{02}(35) = 0.505$.

Figure 7: Plots of p01(t), p12(t), p02(t) for s(10)
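A minimal sketch (not the paper's code) reproducing case A with SciPy; `sf` is the survival function, i.e. the upper-tail integral used in (33a), and the quotient form of $p_{01}$ is the conditional survival beyond the delay $t_0$.

```python
from scipy.stats import norm

t0 = 10.0
f01 = norm(50.0, 12.0)          # first phase: dnorm(t, 50, 12)
f12 = norm(40.0, 7.0)           # second phase: dnorm(t, 40, 7)

def p01(t: float) -> float:
    """Survival of phase 1 beyond t0 + t, conditional on surviving t0."""
    return f01.sf(t0 + t) / f01.sf(t0)

def p12(t: float) -> float:
    """Survival of phase 2 beyond t."""
    return f12.sf(t)

def p02(t: float) -> float:
    """Eq. (33a): probability of reaching state 02."""
    return p01(t) * p12(t)

print(p01(35.0), p12(35.0), p02(35.0))   # ~0.662, ~0.762, ~0.505
```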
The initial state

$$s(t_0) = \int_{0}^{t_0} f_{01}(z)\, dz$$

is $s(10) = 4.136 \times 10^{-4}$ nat, and $s(30) = 0.048$ nat. In Figure 8 the case $s(30)$ is shown for comparison as an analogue of the case $s(10)$ of Figure 7. To estimate the uncertainty function at the output of the system based on the application of the Kolmogorov–Chapman equation, the indicator $y(t) = 1 - p_{02}(t)$ should be used instead of the indicator $p_{02}(t)$.

Figure 8: Plots of p01(t), p12(t), p02(t) for s(30)

The given example illustrates a method for determining the state and the magnitude of a function at the output of a system based on the solution of the Kolmogorov–Chapman equation. Real information systems are more complex: they are, as a rule, "multipath" and can contain more than two phases of random processes in each "information ray". The number of states (initial and intermediate) can be very large.

B. Phases are dependent. In this case, the formulas for the probability of phase implementation take the form

$$p_{01}(t) = \int_{t_0 + t}^{\infty} f_{01}(z)\, dz \bigg/ \int_{t_0}^{\infty} f_{01}(z)\, dz, \qquad
p_{12}(t, t + \Delta t) = \int_{t + \Delta t}^{\infty} f_{12}(z)\, dz \bigg/ \int_{t}^{\infty} f_{01}(z)\, dz,$$

$$p_{02}(t, t + \Delta t) = p_{01}(t)\, p_{12}(t, t + \Delta t). \qquad (33b)$$

Let $t = 30$ h and $\Delta t = 15$ h; then $p_{01}(30) = 0.798$, $p_{12}(30, 45) = 0.249$, and $p_{02}(30, 45) = 0.199$. Since $p_{02}(t, \Delta t)$ is a function of two variables, a two-dimensional graphical dependence can be constructed for it.

Similarly, we can consider an example of the application of the Kolmogorov–Chapman equation covered by feedback. The use of systems of equations of the Kolmogorov–Chapman type for estimating and predicting the values of the indicators of entropic (informational) uncertainty can, in our opinion, be effective. It requires further study.

8 Conclusion

The essence of the proposed model is that the concept of state and the input and output of information systems should be measured by indicators that measure entropy and information. Therefore, we propose an informational modification of the model of L. Zadeh. To work with such a model it is necessary, on the basis of the method of moments of a random variable of entropy, to approximately construct the distribution functions of all the components constituting the information system. Then, using these distribution functions, the model of L. Zadeh can be applied. An example of the calculation for the integrator [Zad64] is given.

The essence of the method consists in modifying the method for solving the Kolmogorov–Chapman equation by applying in it the distribution functions of the random variable of entropy indicated in the preceding paragraph of the conclusion. The simplest examples for an equation with three discrete states and two random phases with normal distributions are considered. The general conclusion: the state and output indicators in an information system should be measured by the entropy (information) associated with a certain probability.

We conclude with a number of modern works related to current applied research directions in information systems models, including those based on the use of entropy. In [Kud16], issues related to the concept of information and terminology in this area, as well as models of information, communication and infocommunication systems and their interconnection, are considered. In [Liv17], an analysis of information security systems is conducted from the position of determining the total entropy of an information system. In [San08], clustering algorithms based on multilevel entropic subgraphs are proposed. In [Kho16], a cloud computing model is presented for information systems with a Web interface, based on a multi-channel queuing system with "cooling" and iterative solution of the Kolmogorov–Chapman equations. The concepts of informational entropy, rough entropy, knowledge granulation and measures of granularity in incomplete information systems are considered in [Lia06].
In [Liv17], an analysis of Education [Perspektivy Nauki i information security systems is conducted from the Obrazovaniya], 2014, №5(11). Pp. 31–36. position of determining the total entropy of an [Nem49] V. V. Nemytskiy. Topological questions information system. In [San08], clustering in the theory of dynamic systems algorithms based on multilevel entropy sub graphs [Topologicheskie voprosy v teorii are proposed. In [Kho16], a cloud computing model dinamicheskikh system]. Achievements in information systems with a Web interface based of mathematical sciences [Uspekhi on a multi-channel queuing system with “cooling” matematicheskikh nauk]. 1949. T. 4. and iterative solution of the Kolmogorov-Chapman Issue 6(34). S. 91–153. equations. The concepts of informational entropy, [San08] J. M. Santos, J. M. Sa, L. A. Alexandre. coarse entropy, knowledge granulation and LEGClust – a clustering algorithm based measures of granularity in incomplete information on layered entropic subgraphs. IEEE systems are considered in [Lia06]. Transactions on pattern analysis and machine intelligence, 2008, vol. 30, no. 1, References pp. 1–13. [Sma10] V. A. Smagin, S. Yu. Smagin. [Fel57] W. Feller. An Introduction to Probability Approximate Determination of the Theory and its Applications. Vol. 1. John Distribution of Entropy. Automatic Wiley & Sons Inc. 1957. 528 p. Control and Computer Sciences. 2010, [Kho16] A. D. Khomonenko; S. I. Gindin; Khalil no. 2 (44). Pp. 27–37. Maad Modher. A cloud computing model [Zad63] L. Zaden, C. F. Desoer. Liner System using multi-channel queuing system with Theory. The State Space Approach, cooling. Proceedings of the 19th McGraw-Hill, N.Y., 1963. International Conference on Soft [Zad64] L. A. Zadeh. The concept of state in Computing and Measurements, SCM system theory. Views on general systems 2016. Pp. 103–106, DOI: theory. New York:John Wiley and Sons, 10.1109/SCM.2016.7519697. Inc., 1964. 188 p. Pp. 49-60. [Kud16] V. A. Kudriashov. Logical-Graphic Analysis of Hierarchy of Information 60