Probabilistic functions and statistical equivalence of binary shift registers with random Markov input

Sergey Yu. Melnikov, Konstantin E. Samouylov
Peoples' Friendship University of Russia (RUDN University), 6, Miklukho-Maklaya St., Moscow, 117198, Russia

Abstract
We consider a binary non-autonomous shift register whose input is a sequence of random variables connected into a simple homogeneous stationary Markov chain. An expression is obtained for the probability function of the output sequence in the form of a fractional rational function whose arguments are the transition probabilities of the Markov chain. The equivalence relation arising when the probability functions of two registers are identically equal is described. The results known earlier for the case when the input sequence is a sequence of independent random variables are generalized.

Keywords
shift register, automata with random input, probability function

1. Introduction

In a number of problems of recognition and identification of automata [1], the case is considered when the input of the automaton is a sequence of random variables. Recognition or identification of an automaton is carried out by analyzing statistics at the output of the automaton. It is known [2] that if the input of the automaton is a sequence of independent identically distributed random variables, then the distribution of symbols of the output sequence is described by a function on a Markov chain. Such a function can be specified by gluing states of the Markov chain [3]. In [4] the problem of synchronization of finite automata whose input is a Bernoulli sequence was studied. In [5] the problem of recognizing the output function for three classes of automata was considered under the condition that the input of the automaton is a sequence of independent identically distributed random variables.
For the case when the automaton is a shift register and the input sequence is Bernoulli [6], an expression is known for the probability of a symbol in the output sequence. This probability depends polynomially on the parameter of the Bernoulli scheme, and the degree of the polynomial does not exceed the size of the register. The coefficients of the polynomial are given by the sums of the values of the output function over certain subsets of register states. Here we study the case when the symbols of the input sequence of the binary shift register are connected into a simple homogeneous stationary Markov chain with two free parameters.

Workshop on information technology and scientific computing in the framework of the X International Conference Information and Telecommunication Technologies and Mathematical Modeling of High-Tech Systems (ITTMM-2020), Moscow, Russia, April 13-17, 2020
melnikov@linfotech.ru (S. Yu. Melnikov); samuylov-ke@rudn.ru (K. E. Samouylov)
ORCID 0000-0002-6368-9680 (K. E. Samouylov)
© 2020 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).
CEUR Workshop Proceedings (CEUR-WS.org), http://ceur-ws.org, ISSN 1613-0073
Sergey Yu. Melnikov et al., CEUR Workshop Proceedings, 93–99

Our goal is to obtain an expression for the probability of a symbol in the output sequence and to describe the equivalence relation on the set of output functions that provides the identity of these probabilities. We show that the desired probability is a fractional rational function of the parameters of the Markov chain, and that the equivalence relation is given by the vectors of sums of the values of the output functions over certain subsets of register states.

2. Definitions and statement of the problem

Let V_n be the space of n-dimensional binary vectors and F_n the set of Boolean functions of n arguments, n = 1, 2, ....
For 𝑓 (π‘₯1 , π‘₯2 , … , π‘₯𝑛 ) ∈ 𝐹𝑛 by 𝐴𝑓 = (𝑋 = {0, 1}, 𝑉𝑛 , π‘Œ = {0, 1}, β„Ž, 𝑓) we denote the Moore automaton with set of states 𝑉𝑛 , the transition function β„Ž, determined by the rule β„Ž ((π‘Ž1 , … , π‘Žπ‘› ), π‘₯) = (π‘Ž2 , … , π‘Žπ‘› , π‘₯), where π‘₯, π‘Žπ‘– ∈ {0, 1}, 𝑖 = 1, 2, … , 𝑛, and output function 𝑓 (π‘₯1 , π‘₯2 , … , π‘₯𝑛 ). The automaton 𝐴𝑓 is a shift register with a size of 𝑛. If the Bernoulli sequence of binary random variables π‘₯ (𝑖) , 𝑖 = 1, 2, … with distribution 𝑃 (π‘₯ (𝑖) = 1) = 𝑝, 0 < 𝑝 < 1, used as the input of the automaton 𝐴𝑓 , then the output se- quence of random variables 𝑓 (π‘₯ (𝑖) , π‘₯ (𝑖+1) , … , π‘₯ (𝑖+π‘›βˆ’1) ), 𝑖 = 𝑛 + 1, 𝑛 + 2, … is stationary and the probability 𝑃 {𝑓 (π‘₯ (𝑖) , π‘₯ (𝑖+1) , … , π‘₯ (𝑖+π‘›βˆ’1) ) = 1} of the symbol β€œ1” in the output sequence is given by the polynomial [6]: 𝑛 Φ𝑓 (𝑝) = βˆ‘ 𝑠𝑗 𝑝 𝑗 (1 βˆ’ 𝑝)π‘›βˆ’π‘— , (1) 𝑗=0 where π‘ π‘˜ = βˆ‘ 𝑓 (π‘₯1 , π‘₯2 , … , π‘₯𝑛 ), π‘˜ = 0, 1, … , 𝑛. (2) (π‘₯1 ,π‘₯2 ,…,π‘₯𝑛 )βˆΆβˆ‘ π‘₯𝑖 =π‘˜ Suppose that the automaton 𝐴𝑓 , 𝑓 ∈ 𝐹𝑛 , 𝑛 = 1, 2, …, receives a sequence of binary random variables π‘₯ (𝑖) , 𝑖 = 1, 2, …, connected in a simple homogeneous stationary Markov chain with the transition probability matrix 1βˆ’πœ† πœ† ( ), 0 < πœ†, πœ‰ < 1. (3) πœ‰ 1βˆ’πœ‰ The sequence of random variables 𝑓 (π‘₯ (𝑖) , π‘₯ (𝑖+1) , … , π‘₯ (𝑖+π‘›βˆ’1) ), 𝑖 = 𝑛 + 1, 𝑛 + 2, … is stationary and it makes sense to talk about the probability 𝑃 {𝑓 (π‘₯ (𝑖) , π‘₯ (𝑖+1) , … , π‘₯ (𝑖+π‘›βˆ’1) ) = 1} of the symbol β€œ1” in the output sequence. The function 𝑃𝑓 (πœ†, πœ‰ ) = 𝑃 {𝑓 (π‘₯ (𝑖) , π‘₯ (𝑖+1) , … , π‘₯ (𝑖+π‘›βˆ’1) ) = 1} will be called the probabilistic func- tion of the automaton 𝐴𝑓 with a Markov dependence at the input. Our task is to obtain an expression for 𝑃𝑓 (πœ†, πœ‰ ) and break 𝐹𝑛 into classes with the same probabilistic functions. 3. 
3. Calculation of the probability function

We divide the set V_n of all n-dimensional binary vectors, n ≥ 2, into four classes, depending on the values of the first and last coordinates. Let us denote

    V^{00} = {(0, α_2, α_3, ..., α_{n-1}, 0)},   V^{01} = {(0, α_2, α_3, ..., α_{n-1}, 1)},
    V^{10} = {(1, α_2, α_3, ..., α_{n-1}, 0)},   V^{11} = {(1, α_2, α_3, ..., α_{n-1}, 1)},      (4)

where α_i = 0, 1, i = 2, ..., n−1.

To each vector from V_n we associate its bigram marking (v_{00}, v_{01}, v_{10}, v_{11}), where v_{αβ} is the number of bigrams (α, β) occurring in it. Let us put

    v_{αβ} = v_{αβ}(α_1, α_2, α_3, ..., α_n) = \sum_{k=1}^{n-1} \delta\big((\alpha\beta), (\alpha_k \alpha_{k+1})\big),      (5)

where δ is the Kronecker symbol, α, β = 0, 1.

In each of the four classes we single out the vectors characterized by the same marking (v_{00}, v_{01}, v_{10}, v_{11}). To do this, we introduce the following notation.

Let S^{00}_{ij} be the set of vectors from V^{00} with marking (v_{00}, v_{01}, v_{10}, v_{11}) = (n−j−i−1, i, i, j−i). We denote by I_1 the set of possible pairs of indices (i, j) and describe it. For n = 2l+2, l = 0, 1, 2, ...,

    I_1 = {(i, j)} = { (0, 0);  j = 1, ..., l, i = 1, ..., j;  j = l+1, ..., 2l, i = 1, ..., 2l−j+1 };      (6)

for n = 2l+1, l = 0, 1, 2, ...,

    I_1 = {(i, j)} = { (0, 0);  j = 1, ..., l, i = 1, ..., j;  j = l+1, ..., 2l, i = 1, ..., 2l−j+1 }.      (7)

Let S^{01}_{ij} be the set of vectors from V^{01} with marking (v_{00}, v_{01}, v_{10}, v_{11}) = (n−j−i−1, i, i−1, j−i). We denote by I_2 the set of possible pairs of indices (i, j) and describe it. For n = 2l+2, l = 0, 1, 2, ...,

    I_2 = {(i, j)} = { j = 1, ..., l+1, i = 1, ..., j;  j = l+2, ..., 2l+1, i = 1, ..., 2l−j+2 };      (8)

for n = 2l+1, l = 0, 1, 2, ...,

    I_2 = {(i, j)} = { j = 1, ..., l, i = 1, ..., j;  j = l+1, ..., 2l, i = 1, ..., 2l−j+1 }.      (9)
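The bigram marking (5) and the induced partition of V_n can be illustrated with a short sketch (Python; the grouping key and helper names are ours). Grouping the vectors of V_n by first bit, last bit, and marking produces exactly the sets S^{αβ}_{ij}:

```python
from itertools import product
from collections import Counter

def marking(v):
    """Bigram marking (v00, v01, v10, v11) of a binary vector, as in (5)."""
    c = Counter(zip(v, v[1:]))
    return (c[(0, 0)], c[(0, 1)], c[(1, 0)], c[(1, 1)])

n = 5
# Group V_n by (first bit, last bit, marking); the groups are the sets S_ij.
groups = Counter()
for v in product((0, 1), repeat=n):
    groups[(v[0], v[-1], marking(v))] += 1

assert marking((0, 1, 1, 0, 1)) == (0, 2, 1, 1)     # bigrams 01, 11, 10, 01
assert all(sum(key[2]) == n - 1 for key in groups)  # a vector has n-1 bigrams
assert sum(groups.values()) == 2 ** n               # the S-sets partition V_n
```

The last two assertions mirror the partition property established by the lemma below: every vector falls into exactly one set, and every marking has n−1 bigrams in total.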
Let S^{10}_{ij} be the set of vectors from V^{10} with marking (v_{00}, v_{01}, v_{10}, v_{11}) = (n−j−i−1, i−1, i, j−i). The set of possible pairs of indices (i, j) is the same as I_2.

Let S^{11}_{ij} be the set of vectors from V^{11} with marking (v_{00}, v_{01}, v_{10}, v_{11}) = (n−j−i−1, i, i, j−i−1). The set of possible pairs of indices, which we denote by I_3, is obtained from I_1 by replacing j with n−j.

Note that in all the introduced notation the index j is equal to the weight (the number of ones) of the vectors from the sets under consideration. The following lemma shows that the space of all n-dimensional binary vectors is represented as a union of the introduced sets, and that these sets do not intersect.

Lemma. The following relations hold.

1. S^{αβ}_{ij} ∩ S^{γδ}_{kl} ≠ ∅ if and only if α = γ, β = δ, i = k, j = l.

2. V^{αβ} = \bigcup_{(i,j)} S^{αβ}_{ij}, α, β = 0, 1, where, depending on the value of (α, β), the union is taken over one of the index sets I_1, I_2, I_3.

3.  |S^{00}_{ij}| = \binom{n-j-1}{i} \binom{j-1}{i-1},  (i, j) ∈ I_1;

    |S^{01}_{ij}| = |S^{10}_{ij}| = \binom{n-j-1}{i-1} \binom{j-1}{i-1},  (i, j) ∈ I_2;

    |S^{11}_{ij}| = \binom{n-j-1}{i-1} \binom{j-1}{i},  (i, j) ∈ I_3.

Hereinafter we assume that

    \binom{n}{-1} = \begin{cases} 1, & n = -1, \\ 0, & n \ne -1. \end{cases}

Proof. The first two points of the lemma follow from the construction of the sets S^{αβ}_{ij}. The proof of the third is based on counting the number of binary vectors of fixed weight with a given number of runs of zeros and ones [7].

For D ⊂ V_n we denote

    \|f/D\| = \sum_{(x_1, x_2, \ldots, x_n) \in D} f(x_1, x_2, \ldots, x_n).      (10)

Theorem 1. Let x^(i), i = 1, 2, ..., be a stationary Markov chain with states {0, 1} and the transition probability matrix
(11) πœ‰ 1βˆ’πœ‰ The probabilistic function 𝑃𝑓 (πœ†, πœ‰ ) of the automaton 𝐴𝑓 with a Markov dependence at the input has the form 1 𝑃𝑓 (πœ†, πœ‰ ) = {βˆ‘(1 βˆ’ πœ†)π‘›βˆ’π‘—βˆ’π‘–βˆ’1 πœ†π‘– πœ‰ 𝑖+1 (1 βˆ’ πœ‰ )π‘—βˆ’π‘– ‖𝑓/𝑆 00 β€– + πœ†+πœ‰ 𝐼 𝑖𝑗 1 + βˆ‘(1 βˆ’ πœ†)π‘›βˆ’π‘—βˆ’π‘– πœ†π‘– πœ‰ 𝑖 (1 βˆ’ πœ‰ )π‘—βˆ’π‘– (‖𝑓/𝑆 01 β€– + ‖𝑓/𝑆 10 β€–) + 𝑖𝑗 𝑖𝑗 𝐼2 + βˆ‘(1 βˆ’ πœ†)π‘›βˆ’π‘—βˆ’π‘– πœ†π‘–+1 πœ‰ 𝑖 (1 βˆ’ πœ‰ )π‘—βˆ’π‘–βˆ’1 ‖𝑓/𝑆 11 β€–} . (12) 𝑖𝑗 𝐼3 96 Sergey Yu. Melnikov et al. CEUR Workshop Proceedings 93–99 To prove the Theorem 1, we use the lemma, the formulas for the total and conditional probability, and the well-known [8] form of the stationary distribution vector of the input sequence πœ‰ πœ† (𝑃 (π‘₯ (𝑖) = 0) , 𝑃 (π‘₯ (𝑖) = 1)) = ( , ). (13) πœ†+πœ‰ πœ†+πœ‰ 4. The relation of statistical equivalence with Markov dependence at the input and its properties By analogy with how this was done in [5], we introduce the equivalence relation on the set 𝐹𝑛 . We call the functions 𝑓 and 𝑔 from 𝐹𝑛 statistically equivalent for a Markov input dependence, Ξ” having adopted the notation 𝑓 = 𝑔 for this case if the identity 𝑃𝑓 (πœ†, πœ‰ ) = 𝑃𝑔 (πœ†, πœ‰ ) for 0 < πœ†, πœ‰ < 1 holds. Obviously, the introduced relation is an equivalence relation breaking 𝐹𝑛 into disjoint classes. (π‘˜) Vector π‘š(𝑓 ) = (π‘š(1) (𝑓 ), π‘š(2) (𝑓 ), π‘š(3) (𝑓 )), where π‘š(𝑖) (𝑓 ) = (π‘šπ‘–π‘— , (𝑖, 𝑗) ∈ πΌπ‘˜ ), π‘˜ = 1, 2, 3, (1) (2) (3) π‘šπ‘–π‘— = ‖𝑓/𝑆 00 β€–, (𝑖, 𝑗) ∈ 𝐼1 , π‘šπ‘–π‘— = ‖𝑓/𝑆 01 βˆͺ 𝑆 10 β€–, (𝑖, 𝑗) ∈ 𝐼2 , π‘šπ‘–π‘— = ‖𝑓/𝑆 11 β€–, (𝑖, 𝑗) ∈ 𝐼3 , and the order 𝑖𝑗 𝑖𝑗 𝑖𝑗 𝑖𝑗 of enumeration of the sets 𝐼1 , 𝐼2 , 𝐼3 is fixed, for example, lexicographical, let’s call the Markov weight structure of the Boolean function 𝑓. Ξ” Theorem 2. Two Boolean functions are =-equivalent if and only if their Markov weight structures coincide. Proof. 
Proof. Let us consider the following system of real functions defined on the square 0 < λ, ξ < 1:

    R^{(1)}_{ij}(\lambda, \xi) = \frac{1}{\lambda+\xi} (1-\lambda)^{n-j-i-1} \lambda^i \xi^{i+1} (1-\xi)^{j-i},  (i, j) ∈ I_1;
    R^{(2)}_{ij}(\lambda, \xi) = \frac{1}{\lambda+\xi} (1-\lambda)^{n-j-i} \lambda^i \xi^i (1-\xi)^{j-i},  (i, j) ∈ I_2;      (14)
    R^{(3)}_{ij}(\lambda, \xi) = \frac{1}{\lambda+\xi} (1-\lambda)^{n-j-i} \lambda^{i+1} \xi^i (1-\xi)^{j-i-1},  (i, j) ∈ I_3.

Denoting R(λ, ξ) = (R^{(1)}, R^{(2)}, R^{(3)}), where R^{(k)} = (R^{(k)}_{ij}(λ, ξ), (i, j) ∈ I_k), k = 1, 2, 3, we rewrite the expression (12) for the probability function in the form

    P_f(\lambda, \xi) = R(\lambda, \xi) \, (m(f))^T.      (15)

To complete the proof, it remains to note that (14) is a linearly independent system of functions on the square 0 < λ, ξ < 1.

The proved theorem allows us to identify the ≜-equivalence class [f]_≜ with the vector m(f) of the Markov weight structure. Since the coordinates of the vector m(f) are non-negative integers, it is not difficult to obtain the following description of the structure of the ≜-equivalence relation.

Theorem 3. The number of functions ≜-equivalent to a function f is given by

    |[f]_{≜}| = \prod_{(i,j) \in I_1} \binom{\binom{n-j-1}{i}\binom{j-1}{i-1}}{\|f/S^{00}_{ij}\|}
                \prod_{(i,j) \in I_2} \binom{2\binom{n-j-1}{i-1}\binom{j-1}{i-1}}{\|f/S^{01}_{ij} \cup S^{10}_{ij}\|}
                \prod_{(i,j) \in I_3} \binom{\binom{n-j-1}{i-1}\binom{j-1}{i}}{\|f/S^{11}_{ij}\|}.      (16)

The number of ≜-equivalence classes is

    |F_n/{≜}| = \prod_{(i,j) \in I_1} \Big( \binom{n-j-1}{i}\binom{j-1}{i-1} + 1 \Big)
                \prod_{(i,j) \in I_2} \Big( 2\binom{n-j-1}{i-1}\binom{j-1}{i-1} + 1 \Big)
                \prod_{(i,j) \in I_3} \Big( \binom{n-j-1}{i-1}\binom{j-1}{i} + 1 \Big).      (17)

For comparison, Table 1 presents the number of all Boolean functions, the number of ≜-equivalence classes, and the number of ≈-equivalence classes (equivalence for a Bernoulli input, see [5]) for n = 2, 3, 4, 5.

Table 1
The numbers of equivalence classes

    n           2      3       4         5
    |F_n|       16     256     65536     4294967296
    |F_n/≜|     12     144     11664     8622400
    |F_n/≈|     12     64      700       17424

In view of the complexity of the expression for the number of ≜-equivalence classes, it is of interest to estimate its growth for large n. Using the Stirling and Euler-Maclaurin formulas, we can obtain the following result.

Theorem 4. As n → ∞,

    |F_n/{≜}| = \exp\Big( \frac{5}{4} n^3 \ln n + O(n^3) \Big).      (18)

5. Conclusion

An expression is obtained for the probabilistic function that describes the probability of a symbol in the output sequence of a binary shift register whose input is a sequence of random binary variables connected into a simple homogeneous stationary Markov chain. An equivalence relation on the set of output functions of binary shift registers is described, which arises when the corresponding probability functions are identically equal. The results obtained generalize those previously known for the case when the input sequence is a sequence of independent random variables.

Acknowledgments

The publication has been prepared with the support of the "RUDN University Program 5-100". The reported study was funded by RFBR, project numbers 18-00-01555 (18-00-01685), 19-07-00933.

References

[1] S. Frenkel, Probabilistic model of control-flow altering based malicious attacks, in: O.
Strichman, R. Tzoref-Brill (Eds.), Hardware and Software: Verification and Testing, Springer International Publishing, Cham, 2017, pp. 249–252.
[2] A. S. Davis, Markov chains as random input automata, The American Mathematical Monthly 68 (1961) 264–267.
[3] V. M. Zakharov, B. F. Eminov, S. V. Shalagin, Representation of Markov's chains functions over finite field based on stochastic matrix lumpability, in: 2016 2nd International Conference on Industrial Engineering, Applications and Manufacturing (ICIEAM), 2016. doi:10.1109/ICIEAM.2016.7911662.
[4] V. V. Gusev, Synchronizing automata with random inputs, in: A. M. Shur, M. V. Volkov (Eds.), Developments in Language Theory, volume 8633, Springer International Publishing, Cham, 2014, pp. 68–75. doi:10.1007/978-3-319-09698-8_7.
[5] S. Y. Melnikov, K. E. Samouylov, The recognition of the output function of a finite automaton with random input, in: V. M. Vishnevskiy, D. V. Kozyrev (Eds.), Distributed Computer and Communication Networks, volume 919, Springer International Publishing, Cham, 2018, pp. 525–531. doi:10.1007/978-3-319-99447-5_45.
[6] B. A. Sevastyanov, The conditional distribution of the output of an automaton without memory for given characteristics of the input, Discrete Mathematics and Applications 4 (1994) 1–6. doi:10.1515/dma.1994.4.1.1.
[7] V. N. Sachkov, Combinatorial Methods in Discrete Mathematics, Encyclopedia of Mathematics and its Applications, Cambridge University Press, 1996.
[8] J. G. Kemeny, J. L. Snell, Finite Markov Chains, Springer-Verlag, 1976.