<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>The Specifics of Natural Language and Ways of Processing It in Computational Linguistics</article-title>
      </title-group>
      <contrib-group>
        <aff id="aff0">
          <label>0</label>
          <institution>Tomasz Panczyszyn, Faculty of Arts and Humanities, King's College London, London</institution>
          ,
          <addr-line>WC2R 2LS</addr-line>
          ,
          <country country="UK">UK</country>
        </aff>
      </contrib-group>
      <fpage>71</fpage>
      <lpage>78</lpage>
      <abstract>
        <p>Computational linguistics is now undoubtedly a well-developing and promising field of study. As an intersection between linguistics as such and computer science, it treats many problems of how to process natural language so as to make it applicable and easily transformable for machines. The standard question we ask ourselves when it comes to the interconnected areas of natural language processing and computer science is whether we can teach a computer to speak a natural language. The issue we will be considering in this paper is the treatment of standard computational linguistics problems that we encounter on a daily basis where they connect to the field of linguistics. In this paper, a novel approach to natural language processing using neural networks and an object-oriented approach is presented.</p>
      </abstract>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>I. INTRODUCTION</title>
      <p>When we try to define what the interconnections between
natural language and computer science really are, we
have to take into account the fields of study both
disciplines regard.</p>
      <p>Linguistics, being the study of human language, takes the
almost seven thousand languages there are in the world as
its subjects of examination. It studies all the linguistic
aspects that appear, that is to say: syntax, pragmatics and
semantics. So a language, with all these aspects included, is
learned by us from minimal input in our early childhood and
serves us to communicate with each other more or less easily.
The high variety of languages we have nowadays also causes
questions to emerge regarding translation
from one language into another. The problems to be sorted
out in the field are most frequently caused by the fact
that the semantics of a language [this applies also to semiotics,
syntax and pragmatics] is acquired by us, the users of
the language, quite intuitively.</p>
      <p>As computers and machines do not function that way,
we meet problems like how to automatically
process the construction of a possessive pair of words, e.g.
the photos of my friends, which a user of a natural language
would rather understand as pictures presenting the friends of
the person who speaks than (as it could appear in a computer
translation) as photos that are the property of one's friends.</p>
      <p>Copyright © 2016 held by the author.</p>
      <p>
        Natural language processing is strongly bound to the
subject of artificial intelligence. The idea of creating an
artificial consciousness expanded the stream of science-fiction
literature in the nineteenth century, and rapid technological
development aims to realize the dreams of the authors of such
books. Already in the forties of the twentieth century, an
artificial neural network model was designed, one
which has numerous applications in life today. In particular,
neural networks [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ] are used in problems of classification ([
        <xref ref-type="bibr" rid="ref2">2</xref>
        ],
[
        <xref ref-type="bibr" rid="ref3">3</xref>
        ]) and categorization of components ([
        <xref ref-type="bibr" rid="ref4">4</xref>
        ], [5]). In [
        <xref ref-type="bibr" rid="ref5">6</xref>
        ], [
        <xref ref-type="bibr" rid="ref6">7</xref>
        ],
the authors have shown an inference and classification system based
on social media.
      </p>
      <p>
        Another important group of methods is heuristic
algorithms, which were created in order to find the maximum and
minimum values of optimized functions. Heuristics have proved to
be a good option for finding solutions, not only for the problem
of searching for the extremes of functions, but also in graphics
processing [
        <xref ref-type="bibr" rid="ref7">8</xref>
        ], [
        <xref ref-type="bibr" rid="ref8">9</xref>
          ]. An example of such applications is the
search for important points (called key points) in 2D images
[
        <xref ref-type="bibr" rid="ref9">10</xref>
        ], [
        <xref ref-type="bibr" rid="ref6">7</xref>
        ]. Moreover, in [
        <xref ref-type="bibr" rid="ref10">11</xref>
        ], [
        <xref ref-type="bibr" rid="ref11">12</xref>
        ], [
        <xref ref-type="bibr" rid="ref12">13</xref>
          ], the authors have shown
that these algorithms can also be used in the construction of
unique mazes. A major use of heuristics is also the problem of
queuing ([
        <xref ref-type="bibr" rid="ref13">14</xref>
        ], [
        <xref ref-type="bibr" rid="ref14">15</xref>
        ], [
        <xref ref-type="bibr" rid="ref15">16</xref>
          ]) e.g. in online stores where overload
may occur.
      </p>
      <p>
        Natural language processing is a subject so important
that it can not only advance the field of artificial
intelligence [
        <xref ref-type="bibr" rid="ref16">17</xref>
          ], but also help in our everyday lives, e.g. the lives of blind
people. A core technique of natural language processing is parsing. One of
these methods is shown in [
        <xref ref-type="bibr" rid="ref17">18</xref>
        ]. In [
        <xref ref-type="bibr" rid="ref18">19</xref>
          ], the authors presented
semantic parsing using paraphrasing; in turn, [
        <xref ref-type="bibr" rid="ref19">20</xref>
          ] showed the
idea of using semantic parsing as a machine translation process.
In recent years, the idea of creating computer intelligence
using chatbots has been gaining more and more interest.
In such applications, an important element is the knowledge
base. In [
        <xref ref-type="bibr" rid="ref20">21</xref>
          ], the authors have shown an idea of a system
for developing a modular knowledge base. The authors of
[
        <xref ref-type="bibr" rid="ref21">22</xref>
          ] presented a comparison between human-human
and human-computer conversations.
      </p>
      <p>In this paper, I would like to present a novel approach
to identifying the author of a longer text with the use of methods of
artificial intelligence. The proposed model has been tested and
described with regard to all its advantages and disadvantages.</p>
      <p>II. MULTIDIMENSIONAL LOOK AT MODERN LINGUISTICS</p>
      <p>While studying the opportunities for solving the problems
computers encounter when processing languages, we
also have to take into consideration the relations between the
words in sentences. Even if a language is treated
intuitively, we ought to remember that the standard defining the
quality of an utterance is also its grammatical correctness.
As in computer languages, in natural ones as well
we have combinations that are either possible or not. Let us take
an example.</p>
      <p>We have a complex phrase:</p>
    </sec>
    <sec id="sec-2">
      <p>1) a) John puts on a hat every time he goes out [John = he: possible]; b) He puts on a hat every time John goes out [he = John: impossible].</p>
      <p>2) a) Every time John goes out, he puts on a hat [he = John: possible]; b) Every time he goes out, John puts on a hat [he = John: possible].</p>
      <p>The sentences above, therefore, show us very well that
actually every native speaker of a language naturally has, with
quite minimal input from childhood, an intuition that
tells him whether a phrase [a relation of words appearing in
a sentence] could be correct and whether a grammatically and
formally correct sentence could ever be constructed like this.</p>
      <p>What is interesting about the phrases we have been
reflecting on is that there was no need to have taken
syntax classes to know which of the sentences is apparently
ungrammatical. Thanks to the knowledge we have nowadays,
we are able to say that on the basis of the words of a natural
language and the relations between them we can construct an
infinite number of sentences that will be grammatically correct.</p>
      <p>Even if we hear a sentence for the first time in our lives,
we can suspect or simply verify whether it is correct or not. The
phrase The six-headed CS84 Tbs grilled the blind octopus
using a MAPA mug is surely correct when it comes to its
grammar, but we have not necessarily ever met it before. That proves
that language functions intuitively.</p>
    </sec>
    <sec id="sec-3">
      <title>III. MODELS OF THEORETICAL DESCRIPTION OF LINGUISTICS</title>
      <p>We can check if the order of and relation between the subject
and predicate is as it should be. From among the many problems
of both natural language processing and computer
science, we would like to focus on just a particular part
of them. To see how many chances modern
computer science and its processing of language give us, we will
try over the course of the article to find an answer to a
classification problem. The problem we would like to sort out
is: given two presidential speeches from the US election, can
we guess with high probability whose speech it is?</p>
      <p>Naturally, when it comes to the classification problem,
there is no simple answer to a question like the one above.
We can, however, assume for the purposes of our study that
both presidential candidates speak about recurrent themes,
probably also using recurrent words. Even if there is no exact
answer to this, we will make a few assumptions and create
a model to try to accomplish what we suppose.</p>
      <p>The assumption we will be making will be based on what
we know for now about the presidential speeches. We will take
into consideration two presidential speeches of Barack Obama
and of Mitt Romney.</p>
      <p>”And we salute the people of Paris for insisting this crucial
conference go on; an act of defiance that proves nothing will
deter us from building the future we want for our children (. . . )
I want to show her passionate, idealistic young generation
that we care about their future. (. . . ) This summer, I saw
the effects of climate change firsthand in our northernmost
state, Alaska, where the sea is already swallowing villages
and eroding shorelines; where permafrost thaws and the tundra
burns, where glaciers are melting at a pace unprecedented in
modern times. And it was a preview of one possible future.”
[Barack Obama, First Session of COP 21, Nov 30, 2015].</p>
      <p>As we have seen in Obama's speech, the word future
appears three times. Let us now have a look at Mitt Romney's
speech.</p>
      <p>”(. . . ) They came not just in pursuit of the riches of this
world but for the richness of this life. Freedom. Freedom of
religion. Freedom to speak their mind. Freedom to build a life.
And yes, freedom to build a business. With their own hands.”
[Mitt Romney, Republican Convention, Aug 08, 2012]
The assumption we can make for now is that the American
people have heard Romney speak very frequently
about freedom in various contexts, like: freedom of religion,
freedom to speak your mind, freedom to build a business,
freedom to build a life.
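      <p>Such recurrent-word observations can be automated by simple frequency counting. A minimal sketch follows; the fragments below are abridged from the excerpts quoted above, and the tokenization rule is an assumption:</p>
      <p>
```python
from collections import Counter
import re

def word_frequencies(text):
    # Lowercase and keep only alphabetic runs, so "Freedom." and "freedom" match.
    return Counter(re.findall(r"[a-z]+", text.lower()))

obama = ("We salute the people of Paris for insisting this crucial conference go on; "
         "nothing will deter us from building the future we want for our children. "
         "I want to show her passionate, idealistic young generation that we care about "
         "their future. And it was a preview of one possible future.")
romney = ("They came not just in pursuit of the riches of this world but for the richness "
          "of this life. Freedom. Freedom of religion. Freedom to speak their mind. "
          "Freedom to build a life. And yes, freedom to build a business.")

print(word_frequencies(obama)["future"])    # "future" recurs in Obama's fragment
print(word_frequencies(romney)["freedom"])  # "freedom" recurs in Romney's fragment
```
      </p>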
      <p>Consequently, what is specific to Obama's speeches is the
notion of future. He speaks about the future of America.</p>
      <p>IV. PREPROCESSING OF THE ELEMENTS OF DISCOURSE</p>
      <p>To prepare long utterances for the purposes of AI
analysis, we have to represent them in the simplest possible form,
assigning them numerical values. To show how that actually functions,
we will make use of Bayes' theorem, which directly defines how
conditional probability works and can be seen as a way
of understanding how the probability that a theory is true is
influenced by evidence that appears there for the first time.
Let us illustrate it with the formula

argmax Pr(c|w) = argmax Pr(w|c) Pr(c) / Pr(w) = argmax Pr(w|c) Pr(c),   (1)

where Pr(c|w) stands for the probability of a particular class c
given a particular word w.</p>
      <p>
        The Bayes theorem ([
        <xref ref-type="bibr" rid="ref22">23</xref>
        ], [
        <xref ref-type="bibr" rid="ref23">24</xref>
          ] has helped us in defining
both the prior and the posterior probability of concrete objects
appearing in a concrete class of objects. For the most frequently
appearing words we then calculate the values y_i using the formula

y_i = argmax Pr(c_i) ∏_{j=1}^{m} Pr(x_j | c_i),   (2)

where c_i stands for the class of objects and x_j for the
analyzed expression. In the next step we process all the
values by making use of the concept of blur. In the proposed
method of processing we applied the Gaussian blur, defined in equation (3).
      </p>
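      <p>The classification rule above, which picks the class maximizing the prior times the product of per-word conditional probabilities, can be sketched as follows. The add-one smoothing is an assumption added to avoid zero probabilities; the paper does not specify how zero counts are handled:</p>
      <p>
```python
import math
from collections import Counter

def train_nb(samples):
    """samples: list of (list_of_words, class_label) pairs."""
    priors, counts, totals = Counter(), {}, Counter()
    for words, c in samples:
        priors[c] += 1
        counts.setdefault(c, Counter()).update(words)
        totals[c] += len(words)
    n = sum(priors.values())
    return {c: priors[c] / n for c in priors}, counts, totals

def classify_nb(words, priors, counts, totals):
    vocab = {w for cnt in counts.values() for w in cnt}
    best, best_score = None, float("-inf")
    for c in priors:
        # log Pr(c) + sum over words of log Pr(x_j | c), with add-one smoothing
        score = math.log(priors[c])
        for w in words:
            score += math.log((counts[c][w] + 1) / (totals[c] + len(vocab)))
        if score > best_score:
            best, best_score = c, score
    return best

samples = [("we care about the future of america".split(), "obama"),
           ("freedom of religion and freedom to build a business".split(), "romney")]
model = train_nb(samples)
print(classify_nb("the future we want".split(), *model))
```
      </p>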
      <p>
        In order to increase the precision of ANN, many learning
algorithms of this type of network are created. One of these
algorithms is the backpropagation algorithm, which works on
minimizing the error function and modify weights from the
output layer to the first hidden layer [
        <xref ref-type="bibr" rid="ref30">31</xref>
        ], [
        <xref ref-type="bibr" rid="ref31">32</xref>
        ], [
        <xref ref-type="bibr" rid="ref32">33</xref>
          ]. Weights
are modified using the following formula

w_i = w_i + Δw_i,   (5)

where w_i means the weight on the i-th connection, and δ_i is
calculated as

δ_i = o_i(1 − o_i)(ω_i − o_i)   for the output layer,   (6)
δ_i = o_i(1 − o_i) Σ_j w_ji δ_j   for a hidden layer,

where o is the output value and ω is the expected value.
      </p>
      <p>The Gaussian blur is defined as

G(x) = a exp(−(x − b)² / (2c²)),   (3)

where a is the height of the function and b is the shift of its peak.</p>
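      <p>The backpropagation update described above, with the delta rule applied from the output layer back to the hidden layer, can be sketched for a single training example. The learning rate eta, the logistic units and the one-hidden-layer shape are illustrative assumptions, not taken from the paper:</p>
      <p>
```python
from math import exp

def sigmoid(x):
    return 1.0 / (1.0 + exp(-x))

def backprop_step(x, target, w_hidden, w_out, eta=0.5):
    """One gradient step for a one-hidden-layer net, using delta rules of the
    form o*(1-o)*(target-o) (output) and o*(1-o)*sum(w*delta) (hidden)."""
    # forward pass
    h = [sigmoid(sum(wi * xi for wi, xi in zip(row, x))) for row in w_hidden]
    o = sigmoid(sum(wi * hi for wi, hi in zip(w_out, h)))
    # deltas
    delta_o = o * (1 - o) * (target - o)
    delta_h = [hi * (1 - hi) * w_out[i] * delta_o for i, hi in enumerate(h)]
    # weight updates, applied from the output layer back to the hidden layer
    w_out = [wi + eta * delta_o * hi for wi, hi in zip(w_out, h)]
    w_hidden = [[wij + eta * delta_h[i] * xj for wij, xj in zip(row, x)]
                for i, row in enumerate(w_hidden)]
    return w_hidden, w_out, o

# repeated steps drive the output toward the target
w_h, w_o = [[0.1, -0.2], [0.3, 0.4]], [0.2, -0.1]
for _ in range(2000):
    w_h, w_o, out = backprop_step([1.0, 0.5], 0.9, w_h, w_o)
print(round(out, 3))  # approaches the target
```
      </p>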
      <p>The Gaussian blur permits us to approximate the values so as
to assess the probability of an expression appearing in the
demanded conditions. The calculated values allow us to create
a vector representing the desired utterance. That can be
written as

[G(y_0), G(y_1), ..., G(y_{n−1}), id],   (4)

where G(y_i) stands for the values calculated in accordance with
equation (3), and id means the author's numerical identifier.</p>
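      <p>The preprocessing described above, where values computed for the most frequent words are smoothed by the Gaussian blur and collected into a vector ending with the author's identifier, can be sketched as follows. The Gaussian parameters and the use of relative word frequencies as the y values are assumptions for illustration:</p>
      <p>
```python
import math
from collections import Counter

def gaussian(x, a=1.0, b=0.0, c=1.0):
    # G(x) = a * exp(-(x - b)^2 / (2 c^2))
    return a * math.exp(-((x - b) ** 2) / (2 * c ** 2))

def sample_vector(text, author_id, n_components=15):
    """Build [G(y_0), ..., G(y_{n-1}), id]: relative frequencies of the
    most common words, smoothed by a Gaussian, plus the author identifier."""
    words = text.lower().split()
    freqs = Counter(words).most_common(n_components)
    ys = [count / len(words) for _, count in freqs]
    ys += [0.0] * (n_components - len(ys))  # pad short texts to a fixed length
    return [gaussian(y) for y in ys] + [author_id]

vec = sample_vector("freedom of religion freedom to build a business", author_id=1)
print(len(vec))  # 15 components plus the author identifier
```
      </p>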
      <p>V. NEURAL NETWORK</p>
      <p>
        Already in the forties of the twentieth century, the author
of [
        <xref ref-type="bibr" rid="ref24">25</xref>
        ], [
        <xref ref-type="bibr" rid="ref25">26</xref>
        ], [
        <xref ref-type="bibr" rid="ref26">27</xref>
          ] described the first model of an artificial neural
network (ANN). An artificial neural network is a mathematical
model inspired by the action of neurons in the human brain.
      </p>
      <p>An ANN is composed of three types of layers: input,
hidden and output. The input layer is responsible for
accepting the teaching vector, and the output layer for returning the
result of the network. Hidden layers are located between the input and
output; they are responsible for creating a deeper network in
order to obtain better results.</p>
      <p>
        Each layer is constructed of neurons, wherein each neuron
of one layer is connected to each neuron in the next layer.
Neuron is the smallest object of neural network. The data
enters the neuron through mergers which have a certain weight.
In the neuron, all input values are rescaled by the activation
function. The value after scaling is sent to other neurons along
outbound connections to the next layer. As activation function
selected a bipolar sigmoid function [
        <xref ref-type="bibr" rid="ref27">28</xref>
        ], [
        <xref ref-type="bibr" rid="ref28">29</xref>
        ], [
        <xref ref-type="bibr" rid="ref29">30</xref>
          ], defined as

f(x) = 2 / (1 + exp(−βx)) − 1,   (7)
      </p>
      <p>where β is a parameter in (0, 1]. The activation function graph is
shown in Fig. 6.</p>
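      <p>A minimal sketch of the bipolar sigmoid activation described above, with the parameter beta set to 1 for illustration:</p>
      <p>
```python
import math

def bipolar_sigmoid(x, beta=1.0):
    # f(x) = 2 / (1 + exp(-beta * x)) - 1; output range is (-1, 1)
    return 2.0 / (1.0 + math.exp(-beta * x)) - 1.0

print(bipolar_sigmoid(0.0))   # zero at the origin
print(bipolar_sigmoid(10.0))  # approaches 1 for large inputs
```
      </p>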
      <p>Fig. 5: Knowledge representation using Sammon mapping.</p>
      <p>VI. PROPOSED MODEL OF AUTHOR IDENTIFICATION</p>
      <p>The problem of verifying a person on the basis of a
longer text requires not only the extraction of the
characteristics of his speech, but also correct classification. For this
purpose, the proposed model consists of several stages. At the
beginning, the statement is entered into a computer, where it
is processed according to Sec. IV. The next step of
action has two paths. In the first one, the sample is stored in
the database. When the database contains a sufficient number
of samples, the neural network is trained using the knowledge
contained in the database. The second path of action is to
classify the samples by the neural network.</p>
      <p>A model of such a system is shown in Fig. 3.</p>
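      <p>The two paths of the model (storing samples until the database is large enough to train, then classifying) can be sketched as follows. The class name, the sample threshold and the nearest-sample stand-in for the trained network are all hypothetical:</p>
      <p>
```python
class AuthorIdentifier:
    """Sketch of the two-path model: accumulate samples until there are
    enough, then train; afterwards, classify incoming samples."""
    def __init__(self, min_samples=4):
        self.database = []  # path 1: accumulated (vector, author) samples
        self.trained = False
        self.min_samples = min_samples

    def add_sample(self, vector, author):
        self.database.append((vector, author))
        if len(self.database) >= self.min_samples:
            self.train()

    def train(self):
        # stand-in for neural network training on the database contents
        self.trained = True

    def classify(self, vector):
        # path 2: nearest stored sample stands in for the trained network
        if not self.trained:
            raise RuntimeError("not enough samples in the database yet")
        dist = lambda a, b: sum((p - q) ** 2 for p, q in zip(a, b))
        return min(self.database, key=lambda s: dist(s[0], vector))[1]

model = AuthorIdentifier()
for vec, who in [([0.9, 0.1], "obama"), ([0.8, 0.2], "obama"),
                 ([0.1, 0.9], "romney"), ([0.2, 0.8], "romney")]:
    model.add_sample(vec, who)
print(model.classify([0.85, 0.15]))
```
      </p>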
      <p>VII. EXPERIMENTS</p>
      <p>The proposed solution has been tested with 200
samples: 100 samples per person. Each sample contained a
fragment of a statement about the future, taking into account up
to 60 words. For the purposes of minimizing the neural
network's learning time, each sample contained 15 components. The
neural network was composed of 4 layers:</p>
      <p>an input layer composed of 15 neurons; 4 hidden layers composed of 4 neurons; an output layer consisting of one neuron.</p>
    </sec>
    <sec id="sec-4">
      <p>To train the network, the samples were divided into two
groups (training and verifying, 80% : 20%). The classification
problem has been shown in Fig. 5 using a Sammon mapping;
based on this interpretation of the spread of knowledge, it
can be seen that there cannot be any easy way to separate the
samples into two groups, so the classification problem is
extremely difficult. The network was trained to obtain an error
of 0.24; the learning-error graph is shown in Fig. 4. In order to
verify the operation of the classifier, each sample was given to
the input of the network. As a result of this operation, the
system indicated the author correctly for 103 samples, which
indicates an efficiency of about 72%.</p>
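      <p>The evaluation protocol described above (an 80% : 20% training/verifying split followed by an accuracy count) can be sketched as follows; the shuffling and its seed are assumptions:</p>
      <p>
```python
import random

def split_samples(samples, train_ratio=0.8, seed=42):
    """Shuffle and split samples into training and verifying groups."""
    samples = list(samples)
    random.Random(seed).shuffle(samples)
    cut = int(len(samples) * train_ratio)
    return samples[:cut], samples[cut:]

def accuracy(predictions, labels):
    # fraction of samples whose predicted author matches the true author
    correct = sum(p == l for p, l in zip(predictions, labels))
    return correct / len(labels)

samples = [(i, "obama" if i < 100 else "romney") for i in range(200)]
train, verify = split_samples(samples)
print(len(train), len(verify))  # 160 and 40
print(round(accuracy(["a", "b", "b"], ["a", "b", "a"]), 2))
```
      </p>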
    </sec>
    <sec id="sec-5">
      <title>VIII. CONCLUSION</title>
      <p>The subject of the research was to prove the
possibilities of processing natural language by the methods of
computational linguistics. We have shown that, given two
different longer texts, we are able to identify their authors
exclusively on the basis of the words used, with a minimal
possible risk rate (error of 0.24). After entering the data into
the computer, the contents of the database needed to
be classified by the neural network. Having done so, we
processed two independent authors' speeches, Barack Obama's
and Mitt Romney's addresses, and therefore, helped by the
neural networks, we could evaluate whose a text is just on the
basis of the samples given, after comparing the data we
had with that contained in the database. The outcomes
of such an experiment are of significant help when it comes to
the examination of idiolects and distinctive, personal styles of
constructing a discourse. Consequently, it contributes to the
development of artificial intelligence, as it is the computer
that identifies the authorship of a text.</p>
      <p>[5] R. Johnson and T. Zhang, "Effective use of word order for text
categorization with convolutional neural networks," arXiv preprint
arXiv:1412.1058, 2014.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>C.</given-names>
            <surname>Napoli</surname>
          </string-name>
          and
          <string-name>
            <surname>E. Tramontana,</surname>
          </string-name>
          “
          <article-title>An object-oriented neural network toolbox based on design patterns,”</article-title>
          <source>in Information and Software Technologies</source>
          . Springer,
          <year>2015</year>
          , pp.
          <fpage>388</fpage>
          -
          <lpage>399</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>C.-T.</given-names>
            <surname>Chen</surname>
          </string-name>
          , K.-S. Chen, and
          <string-name>
            <given-names>J.-S.</given-names>
            <surname>Lee</surname>
          </string-name>
          , “
          <article-title>The use of fully polarimetric information for the fuzzy neural classification of sar images,” Geoscience and Remote Sensing, IEEE Transactions on</article-title>
          , vol.
          <volume>41</volume>
          , no.
          <issue>9</issue>
          , pp.
          <fpage>2089</fpage>
          -
          <lpage>2100</lpage>
          ,
          <year>2003</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>A.</given-names>
            <surname>Kandaswamy</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C. S.</given-names>
            <surname>Kumar</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R. P.</given-names>
            <surname>Ramanathan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Jayaraman</surname>
          </string-name>
          , and
          <string-name>
            <given-names>N.</given-names>
            <surname>Malmurugan</surname>
          </string-name>
          , “
          <article-title>Neural classification of lung sounds using wavelet coefficients,” Computers in Biology and Medicine</article-title>
          , vol.
          <volume>34</volume>
          , no.
          <issue>6</issue>
          , pp.
          <fpage>523</fpage>
          -
          <lpage>537</lpage>
          ,
          <year>2004</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>D.</given-names>
            <surname>Valentin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Abdi</surname>
          </string-name>
          ,
          <article-title>and</article-title>
          <string-name>
            <surname>A. J. O'Toole</surname>
          </string-name>
          , “
          <article-title>Categorization and identification of human face images by neural networks: A review of the linear autoassociative and principal component approaches</article-title>
          ,
          <source>” Journal of biological systems</source>
          , vol.
          <volume>2</volume>
          , no.
          <issue>03</issue>
          , pp.
          <fpage>413</fpage>
          -
          <lpage>429</lpage>
          ,
          <year>1994</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>A.</given-names>
            <surname>Fornaia</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Napoli</surname>
          </string-name>
          , G. Pappalardo, and E. Tramontana, “
          <article-title>An aoprbpnn approach to infer user interests and mine contents on social media,” Intelligenza Artificiale</article-title>
          , vol.
          <volume>9</volume>
          , no.
          <issue>2</issue>
          , pp.
          <fpage>209</fpage>
          -
          <lpage>219</lpage>
          ,
          <year>2015</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>C.</given-names>
            <surname>Napoli</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G.</given-names>
            <surname>Pappalardo</surname>
          </string-name>
          , E. Tramontana,
          <string-name>
            <given-names>R. K.</given-names>
            <surname>Nowicki</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J. T.</given-names>
            <surname>Starczewski</surname>
          </string-name>
          , and M. Woźniak, “
          <article-title>Toward work groups classification based on probabilistic neural network approach,”</article-title>
          <source>in Artificial Intelligence and Soft Computing</source>
          . Springer,
          <year>2015</year>
          , pp.
          <fpage>79</fpage>
          -
          <lpage>89</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>X.-S.</given-names>
            <surname>Yang</surname>
          </string-name>
          , “
          <article-title>Flower pollination algorithm for global optimization,” in Unconventional computation and natural computation</article-title>
          . Springer,
          <year>2012</year>
          , pp.
          <fpage>240</fpage>
          -
          <lpage>249</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [9]
          <string-name>
            <given-names>M.</given-names>
            <surname>Wozniak</surname>
          </string-name>
          , “
          <article-title>Fitness function for evolutionary computation applied in dynamic object simulation and positioning</article-title>
          ,” in
          <source>Computational Intelligence in Vehicles and Transportation Systems (CIVTS)</source>
          ,
          <source>2014 IEEE Symposium on. IEEE</source>
          ,
          <year>2014</year>
          , pp.
          <fpage>108</fpage>
          -
          <lpage>114</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [10]
          <string-name>
            <given-names>D.</given-names>
            <surname>Połap</surname>
          </string-name>
          , M. Woźniak,
          <string-name>
            <given-names>C.</given-names>
            <surname>Napoli</surname>
          </string-name>
          , E. Tramontana, and R. Damaševičius, “
          <article-title>Is the colony of ants able to recognize graphic objects?</article-title>
          ”
          <source>in Information and Software Technologies</source>
          . Springer,
          <year>2015</year>
          , pp.
          <fpage>376</fpage>
          -
          <lpage>387</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [11]
          <string-name>
            <given-names>D.</given-names>
            <surname>Polap</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Wozniak</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Napoli</surname>
          </string-name>
          , and E. Tramontana, “Is swarm intelligence able to create mazes?”
          <source>International Journal of Electronics and Telecommunications</source>
          , vol.
          <volume>61</volume>
          , no.
          <issue>4</issue>
          , pp.
          <fpage>305</fpage>
          -
          <lpage>310</lpage>
          ,
          <year>2015</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [12]
          <string-name>
            <given-names>D.</given-names>
            <surname>Połap</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Wozniak</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Napoli</surname>
          </string-name>
          , and E. Tramontana, “
          <article-title>Real-time cloud-based game management system via cuckoo search algorithm</article-title>
          ,”
          <source>International Journal of Electronics and Telecommunications</source>
          , vol.
          <volume>61</volume>
          , no.
          <issue>4</issue>
          , pp.
          <fpage>333</fpage>
          -
          <lpage>338</lpage>
          ,
          <year>2015</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [13]
          <string-name>
            <given-names>D.</given-names>
            <surname>Połap</surname>
          </string-name>
          , “
          <article-title>Designing mazes for 2d games by artificial ant colony algorithm,” Symposium for Young Scientists in Technology, Engineering and Mathematics (SYSTEM</article-title>
          <year>2015</year>
          ), pp.
          <fpage>63</fpage>
          -
          <lpage>70</lpage>
          ,
          <year>2016</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [14]
          <string-name>
            <given-names>M.</given-names>
            <surname>Woźniak</surname>
          </string-name>
          ,
          <string-name>
            <given-names>W. M.</given-names>
            <surname>Kempa</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Gabryel</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R. K.</given-names>
            <surname>Nowicki</surname>
          </string-name>
          , and
          <string-name>
            <given-names>Z.</given-names>
            <surname>Shao</surname>
          </string-name>
          , “
          <article-title>On applying evolutionary computation methods to optimization of vacation cycle costs in finite-buffer queue</article-title>
          ,
          <source>” in Artificial Intelligence and Soft Computing</source>
          . Springer,
          <year>2014</year>
          , pp.
          <fpage>480</fpage>
          -
          <lpage>491</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [15]
          <string-name>
            <given-names>M.</given-names>
            <surname>Woźniak</surname>
          </string-name>
          ,
          <string-name>
            <given-names>W. M.</given-names>
            <surname>Kempa</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Gabryel</surname>
          </string-name>
          , and
          <string-name>
            <given-names>R. K.</given-names>
            <surname>Nowicki</surname>
          </string-name>
          , “
          <article-title>A finite-buffer queue with a single vacation policy: An analytical study with evolutionary positioning</article-title>
          ,”
          <source>International Journal of Applied Mathematics and Computer Science</source>
          , vol.
          <volume>24</volume>
          , no.
          <issue>4</issue>
          , pp.
          <fpage>887</fpage>
          -
          <lpage>900</lpage>
          ,
          <year>2014</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          [16]
          <string-name>
            <given-names>P. V.</given-names>
            <surname>Laxmi</surname>
          </string-name>
          and
          <string-name>
            <given-names>K.</given-names>
            <surname>Jyothsna</surname>
          </string-name>
          , “
          <article-title>Optimization of service rate in a discrete-time impatient customer queue using particle swarm optimization</article-title>
          ,” in
          <source>Distributed Computing and Internet Technology</source>
          . Springer,
          <year>2016</year>
          , pp.
          <fpage>38</fpage>
          -
          <lpage>42</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          [17]
          <string-name>
            <given-names>X.</given-names>
            <surname>Gao</surname>
          </string-name>
          and
          <string-name>
            <given-names>N.</given-names>
            <surname>Zhu</surname>
          </string-name>
          , “
          <article-title>Natural language processing</article-title>
          ,”
          <source>Information Technology Journal</source>
          , vol.
          <volume>12</volume>
          , no.
          <issue>17</issue>
          , pp.
          <fpage>4256</fpage>
          -
          <lpage>4261</lpage>
          ,
          <year>2013</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          [18]
          <string-name>
            <given-names>K.</given-names>
            <surname>Xu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Zhang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Feng</surname>
          </string-name>
          , and
          <string-name>
            <given-names>D.</given-names>
            <surname>Zhao</surname>
          </string-name>
          , “
          <article-title>Answering natural language questions via phrasal semantic parsing</article-title>
          ,” in
          <source>Natural Language Processing and Chinese Computing</source>
          . Springer,
          <year>2014</year>
          , pp.
          <fpage>333</fpage>
          -
          <lpage>344</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          [19]
          <string-name>
            <given-names>J.</given-names>
            <surname>Berant</surname>
          </string-name>
          and
          <string-name>
            <given-names>P.</given-names>
            <surname>Liang</surname>
          </string-name>
          , “
          <article-title>Semantic parsing via paraphrasing</article-title>
          ,” in
          <source>ACL (1)</source>
          ,
          <year>2014</year>
          , pp.
          <fpage>1415</fpage>
          -
          <lpage>1425</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>
          [20]
          <string-name>
            <given-names>J.</given-names>
            <surname>Andreas</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Vlachos</surname>
          </string-name>
          , and
          <string-name>
            <given-names>S.</given-names>
            <surname>Clark</surname>
          </string-name>
          , “
          <article-title>Semantic parsing as machine translation</article-title>
          ,” in
          <source>ACL (2)</source>
          ,
          <year>2013</year>
          , pp.
          <fpage>47</fpage>
          -
          <lpage>52</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref20">
        <mixed-citation>
          [21]
          <string-name>
            <given-names>G.</given-names>
            <surname>Pilato</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Augello</surname>
          </string-name>
          , and
          <string-name>
            <given-names>S.</given-names>
            <surname>Gaglio</surname>
          </string-name>
          , “
          <article-title>A modular system oriented to the design of versatile knowledge bases for chatbots</article-title>
          ,”
          <source>ISRN Artificial Intelligence</source>
          , vol.
          <volume>2012</volume>
          ,
          <year>2012</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref21">
        <mixed-citation>
          [22]
          <string-name>
            <given-names>J.</given-names>
            <surname>Hill</surname>
          </string-name>
          ,
          <string-name>
            <given-names>W. R.</given-names>
            <surname>Ford</surname>
          </string-name>
          , and
          <string-name>
            <given-names>I. G.</given-names>
            <surname>Farreras</surname>
          </string-name>
          , “
          <article-title>Real conversations with artificial intelligence: A comparison between human-human online conversations and human-chatbot conversations</article-title>
          ,”
          <source>Computers in Human Behavior</source>
          , vol.
          <volume>49</volume>
          , pp.
          <fpage>245</fpage>
          -
          <lpage>250</lpage>
          ,
          <year>2015</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref22">
        <mixed-citation>
          [23]
          <string-name>
            <given-names>K.-R.</given-names>
            <surname>Koch</surname>
          </string-name>
          ,
          <source>Bayes Theorem</source>
          . Springer,
          <year>1990</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref23">
        <mixed-citation>
          [24]
          <string-name>
            <given-names>D. L.</given-names>
            <surname>Faigman</surname>
          </string-name>
          and
          <string-name>
            <given-names>A.</given-names>
            <surname>Baglioni</surname>
            <suffix>Jr.</suffix>
          </string-name>
          , “
          <article-title>Bayes' theorem in the trial process: Instructing jurors on the value of statistical evidence</article-title>
          ,”
          <source>Law and Human Behavior</source>
          , vol.
          <volume>12</volume>
          , no.
          <issue>1</issue>
          , p.
          <fpage>1</fpage>
          ,
          <year>1988</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref24">
        <mixed-citation>
          [25]
          <string-name>
            <given-names>K.</given-names>
            <surname>Hornik</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Stinchcombe</surname>
          </string-name>
          , and
          <string-name>
            <given-names>H.</given-names>
            <surname>White</surname>
          </string-name>
          , “
          <article-title>Multilayer feedforward networks are universal approximators</article-title>
          ,”
          <source>Neural Networks</source>
          , vol.
          <volume>2</volume>
          , no.
          <issue>5</issue>
          , pp.
          <fpage>359</fpage>
          -
          <lpage>366</lpage>
          ,
          <year>1989</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref25">
        <mixed-citation>
          [26]
          <string-name>
            <given-names>B.</given-names>
            <surname>Kosko</surname>
          </string-name>
          ,
          <source>Neural Networks and Fuzzy Systems: A Dynamical Systems Approach to Machine Intelligence/Book and Disk</source>
          , vol. 1. Prentice Hall,
          <year>1992</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref26">
        <mixed-citation>
          [27]
          <string-name>
            <given-names>D. F.</given-names>
            <surname>Specht</surname>
          </string-name>
          , “
          <article-title>Probabilistic neural networks</article-title>
          ,”
          <source>Neural Networks</source>
          , vol.
          <volume>3</volume>
          , no.
          <issue>1</issue>
          , pp.
          <fpage>109</fpage>
          -
          <lpage>118</lpage>
          ,
          <year>1990</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref27">
        <mixed-citation>
          [28]
          <string-name>
            <given-names>P. de B.</given-names>
            <surname>Harrington</surname>
          </string-name>
          , “
          <article-title>Sigmoid transfer functions in backpropagation neural networks</article-title>
          ,”
          <source>Analytical Chemistry</source>
          , vol.
          <volume>65</volume>
          , no.
          <issue>15</issue>
          , pp.
          <fpage>2167</fpage>
          -
          <lpage>2168</lpage>
          ,
          <year>1993</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref28">
        <mixed-citation>
          [29]
          <string-name>
            <given-names>H.</given-names>
            <surname>Yonaba</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Anctil</surname>
          </string-name>
          , and
          <string-name>
            <given-names>V.</given-names>
            <surname>Fortin</surname>
          </string-name>
          , “
          <article-title>Comparing sigmoid transfer functions for neural network multistep ahead streamflow forecasting</article-title>
          ,”
          <source>Journal of Hydrologic Engineering</source>
          , vol.
          <volume>15</volume>
          , no.
          <issue>4</issue>
          , pp.
          <fpage>275</fpage>
          -
          <lpage>283</lpage>
          ,
          <year>2010</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref29">
        <mixed-citation>
          [30]
          <string-name>
            <given-names>M.</given-names>
            <surname>Panicker</surname>
          </string-name>
          and
          <string-name>
            <given-names>C.</given-names>
            <surname>Babu</surname>
          </string-name>
          , “
          <article-title>Efficient FPGA implementation of sigmoid and bipolar sigmoid activation functions for multilayer perceptrons</article-title>
          ,”
          <source>IOSR Journal of Engineering (IOSRJEN)</source>
          , pp.
          <fpage>1352</fpage>
          -
          <lpage>1356</lpage>
          ,
          <year>2012</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref30">
        <mixed-citation>
          [31]
          <string-name>
            <given-names>R.</given-names>
            <surname>Hecht-Nielsen</surname>
          </string-name>
          , “
          <article-title>Theory of the backpropagation neural network</article-title>
          ,” in
          <source>Neural Networks, 1989 (IJCNN), International Joint Conference on</source>
          . IEEE,
          <year>1989</year>
          , pp.
          <fpage>593</fpage>
          -
          <lpage>605</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref31">
        <mixed-citation>
          [32]
          <string-name>
            <given-names>M.</given-names>
            <surname>Riedmiller</surname>
          </string-name>
          and
          <string-name>
            <given-names>H.</given-names>
            <surname>Braun</surname>
          </string-name>
          , “
          <article-title>A direct adaptive method for faster backpropagation learning: The RPROP algorithm</article-title>
          ,” in
          <source>Neural Networks, 1993, IEEE International Conference on</source>
          . IEEE,
          <year>1993</year>
          , pp.
          <fpage>586</fpage>
          -
          <lpage>591</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref32">
        <mixed-citation>
          [33]
          <string-name>
            <given-names>P. J.</given-names>
            <surname>Werbos</surname>
          </string-name>
          , “
          <article-title>Backpropagation through time: what it does and how to do it</article-title>
          ,”
          <source>Proceedings of the IEEE</source>
          , vol.
          <volume>78</volume>
          , no.
          <issue>10</issue>
          , pp.
          <fpage>1550</fpage>
          -
          <lpage>1560</lpage>
          ,
          <year>1990</year>
          .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>