<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta>
      <issn pub-type="ppub">2405-6561</issn>
    </journal-meta>
    <article-meta>
      <title-group>
        <article-title>Selected Heuristic Algorithms Analysis and Comparison in Solving Optimization Problems</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Bartłomiej Szlachta</string-name>
          <email>bartek01570@gmail.com</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Kamil Rusin</string-name>
          <email>kamirus323@student.polsl.pl</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Łukasz Płaneta</string-name>
          <email>luke.planeta@gmail.com</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Faculty of Applied Mathematics, Silesian University of Technology</institution>
          ,
          <addr-line>Kaszubska 23, 44-100 Gliwice</addr-line>
          ,
          <country country="PL">Poland</country>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2016</year>
      </pub-date>
      <volume>2</volume>
      <issue>1</issue>
      <fpage>79</fpage>
      <lpage>89</lpage>
      <abstract>
        <p>-Three examples of heuristic algorithms: The Firefly Algorithm, The Artificial Bee Colony Algorithm and The Genetic Algorithm, have been tested in regard to the ability of solving four testing cost functions. In this paper the algorithms are explained and the results are discussed and compared. TESTING FUNCTIONS To connect the FA algorithm with the optimization problem, the input of a cost function is interpreted as the coordinates of a firefly in a two-dimensional space. Then, for</p>
      </abstract>
      <kwd-group>
        <kwd>metaheuristics</kwd>
        <kwd>heuristic algorithms</kwd>
        <kwd>firefly algorithm</kwd>
        <kwd>artificial bee colony algorithm</kwd>
        <kwd>genetic algorithm</kwd>
        <kwd>optimization problem</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>-</title>
      <p>INTRODUCTION</p>
      <p>
        Heuristic algorithms are often used for optimization
purposes, when a space of solutions is complex and common
methods are imprecise these algorithms may help. We can find
various application of heuristics in real world problems like
image processing [9], [
        <xref ref-type="bibr" rid="ref8">10</xref>
        ]; games playability [
        <xref ref-type="bibr" rid="ref9">11</xref>
        ]; cognitive
aspects of political sciences [
        <xref ref-type="bibr" rid="ref10">12</xref>
        ]; terminal slider control [
        <xref ref-type="bibr" rid="ref11">13</xref>
        ];
metal annihilating processes [
        <xref ref-type="bibr" rid="ref12">14</xref>
        ]. When it comes to the cost
function, it is sometimes difficult or it is too time-consuming
to find its minimum using mathematical methods. By the
      </p>
      <p>To check the results, we use four binary testing functions:
minimum of cost we understand a point for which the value of
function is the lowest. Heuristic algorithms are very useful for
finding solutions to problems which are difficult to solve in a
usual way. Heuristic algorithms can be seen as help which
randomize a definite number of points and move them in a
specific way. As a result, the points group in the area of the
minimum of the function. We will never be able to determine
the minimal value of the function. However, the determined
value should be precise enough.</p>
      <p>Each algorithm works in a different way, so one defined
algorithm which is able to solve all the optimization problems
does not exist. For different functions some of them perform
better than others, so the testing results of vary from each other.</p>
      <p>It is our responsibility to choose an algorithm and its
coefficients to be the most suitable to find the best value of
objective function. The only way to choose them is to apply
trial and error method.</p>
      <p>II.
1)
2)</p>
      <p>Rosenbrock function: Considered at a range from -3 to 3, it has a global minimum at f (1, 1) = 0 and is calculated as:
Beale function: Considered at a range from -4.5 to 4.5, it has a global minimum at f (3, 0.5) = 0 and is calculated as:
3) Himmelblau function: Considered at a range from -10 to 10, it has four global minimums:</p>
      <p>f (3, 2) = f (-2,805118, 3,131312) = f (-3,779310, -3,283186) = f (3,584428, -1,848126) = 0 and is calculated as:</p>
      <p>Levy function n. 13: Considered at a range from -4.5 to 4.5, it has a global minimum at f (0, 0) = 0 and is calculated as:
III.</p>
      <p>FIREFLY ALGORITHM
A.</p>
    </sec>
    <sec id="sec-2">
      <title>Fireflies’ behaviour / Genesis</title>
      <p>The firefly algorithm (FA) is an example of nature-inspired
heuristic algorithm, proposed by Xin-She Yand in [1] with
some interesting applications [2], [3]. It is inspired by the</p>
      <sec id="sec-2-1">
        <title>Copyright held by the author(s). behaviour of fireflies, beetles from the family of Lampyridae. Each of them emits a bioluminescent light from abdomen and uses it to attract the less bright fireflies.</title>
      </sec>
    </sec>
    <sec id="sec-3">
      <title>Method of operations</title>
      <p>every point of the space the cost function value can be
calculated for its co-ordinates. Also, it can be said that a firefly
has a ‘f’ value, which means the value for the point the firefly
is located at. The main objective of the algorithm is to find the
best point – that with the smallest ‘f’.</p>
      <p>Firstly, the population of fireflies needs to be generated
randomly before the algorithm runs. Then, the algorithm should
be launched and repeated a certain number of times. During
every iteration, each firefly is moved, one after the other, in the
direction of better ones in order to find the best ‘f’ value. There
is always a random factor added to the movement. If a firefly is
the best of all, the random step is the only factor responsible for
moving a firefly.</p>
      <p>C.</p>
    </sec>
    <sec id="sec-4">
      <title>Pseudocode</title>
      <p>In our implementation, the FA algorithm
accordance with the pseudocode showed below:
works in</p>
      <sec id="sec-4-1">
        <title>Input:</title>
        <sec id="sec-4-1-1">
          <title>a) Co-ordinates (xi, yi),</title>
        </sec>
        <sec id="sec-4-1-2">
          <title>Attributes of the Ai fireflies colony:</title>
          <p>b) Value ‘fi’ of the cost function, calculated for (xi, yi),
coordinates.</p>
        </sec>
        <sec id="sec-4-1-3">
          <title>Population size ‘n’, a natural number,</title>
          <p>Maximum attractiveness ’β0’, a real number in the
range (0, 1],
Absorption coefficient ‘γ’, a real number in the range
(0, 1],
Random step size ‘a’, a real positive number.
The co-ordinates (xi, yi) and the ‘fi’ value of the best
firefly,</p>
          <p>Changes in every firefly’s attributes.






</p>
        </sec>
      </sec>
      <sec id="sec-4-2">
        <title>Output:</title>
      </sec>
      <sec id="sec-4-3">
        <title>Calculations:</title>
        <p>i = 0
while i &lt; n do
j = 0
while j &lt; n do
Counting the distance between Ai and Aj
Assigning an attractiveness to Aj
if fi &lt; fj then</p>
        <p>Moving Ai in the direction of Aj
end if
Moving Ai randomly.</p>
        <p>Calculating fi value for the new Ai co-ordinates
end while
end while</p>
        <p>Comparing ‘fi’ values for each fireflies
end</p>
      </sec>
    </sec>
    <sec id="sec-5">
      <title>D. Details</title>
      <p></p>
      <p>The distance between two fireflies Ai and Aj is
obtained by using formula:
The Aj attractiveness is determined by equation:
The distance that Ai is moved in Aj direction is
calculated as:
The random move of Ai is defined by a vector which
co-ordinates are random real numbers from the range
[a, a].</p>
      <p>The consecutive algorithm iterations results can be split into
two phases:
1)
2)</p>
    </sec>
    <sec id="sec-6">
      <title>Grouping fireflies together: In this part each</title>
      <p>iteration of the algorithm results in the fireflies
getting closer and closer to each other and the
distance between the most distant pair of fireflies
declines. The middle of the group moves randomly.
There is also a possibility of creating more than one
group, each at local minimums.</p>
    </sec>
    <sec id="sec-7">
      <title>Movement of the group: Now the fireflies are</title>
      <p>moving randomly with the width of the group
staying approximately constant. The whole group
can move consistently in the direction of the best
firefly when a better ‘f’ value is found.</p>
      <p>The phases are different for various algorithm coefficients
values. Their adequate choice is essential in order to find the
global minimum as precisely as possible.</p>
      <p>For different coefficients values the algorithm returns
different results. That can be analysed in order to find the
coefficients values adequate to a function and a need.</p>
      <p>The fireflies’ movement consists of two factors: attraction
by the others and the random step. The former is very essential
for the initial iterations when the distance between fireflies is
long. During the latter, fireflies are located so close to each
other that the attraction impact is responsible only for keeping
the group together and does not help with searching cost
function global minima. Those statements can be justified by
mathematical calculations.</p>
      <p>By connecting the formula (6) with the formula (7) we get a
function s(r) which describes the distance the fireflies cover
depending on the initial distance between them:</p>
      <p>Value of s(r) function (8) depends, for example, on ‘β0’ and
‘γ’. Consequently, s(r) can be increased by increasing ‘β0’
value and decreasing ‘γ’ value and vice versa.</p>
      <p>Having calculated the first derivative of (8), we can analyse
monotonicity of s(r). With the initial distance declining,
starting from the zero value in the infinity, it appreciates on
value, reaches a peak at the distance r0 shown below and
declines to 0 at the 0 distance.
(9)</p>
      <p>So, the distance Ai firefly covers is the longest when Aj
firefly is located within r0 initial distance of Ai. That means that
by reducing absorption coefficient ‘γ’ value, we increase r0
value. The size of population ‘n’ has also impact on r0 because
the more fireflies exist, the higher probability of one staying at
r0 is. On the other hand, a big amount of fireflies can slow
down a computer because it would need to make many more
calculations.</p>
      <p>It needs to be mentioned that the s(r) value needs to be
much more significant than the random step ‘a’ value.
Otherwise, the fireflies will not attract each other enough and
they will split into two or more groups facing the local, not
global minimums.</p>
      <p>During the second phase fireflies are located so close to
each other that s(r) values are insignificant compared to ‘a’ and
are responsible only for keeping the group together – when a
firefly moves too far, s(r) soars and brings the firefly back to
the group.</p>
      <p>When a randomly moved firefly hits the ‘f’ value better
than the found so far, it attracts all other fireflies. As a result,
the group is moving in its direction. The global minimum of the
cost function is located approximately in the middle of the
group and the determination of the minimum ‘f’ value with an
acceptable accuracy depends mostly on luck.</p>
      <p>G. Conclusions
1) To minimise the time required for the first section,
increase ‘n’ and ‘β0’ values, reduce ‘a’ value and
set an appropriate value of ‘γ’.
2) To reduce a risk of fireflies splitting into many
groups, try increasing ‘γ’ and ‘β0’ values and
reducing ‘a’ value.
3) To reduce the time required for the second section,
increase ‘a’ value.
4) To determine the minimum value with the better
accuracy, increase the size of population ‘n’ and
reduce ‘a’ value, but not exactly to 0.</p>
      <sec id="sec-7-1">
        <title>ARTIFICIAL BEE COLONY ALGORITHM</title>
      </sec>
    </sec>
    <sec id="sec-8">
      <title>H. Bees’ behaviour / Genesis</title>
      <p>Artificial Bee Colony algorithm (ABC) was proposed and
implemented by Dervis Karaboga in [4]. He was inspired by a
bee swarm behaviour and their way to find the best food source
in the neighbourhood. ABC is based on population of a swarm
and defined food sources. It is usually used for optimization
problems. There are many articles reporting this method in
various optimization aspects [5] – [8]. Bees are very
cooperative and thanks to it, the swarm is able to perform tasks
successfully. The whole population is divided into three
groups: employed bees, onlooker bees and scout bees. First of
them are looking for food sources. Meanwhile, they are also
sharing information with other bees about the best ones they
have found so far. The second group of bees observes them and
decides to follow them to the food sources with the best fitness.
The better the source, the higher possibility of choosing it.
Employed bees can also change their food sources. They may
be encouraged to follow other bees to their food sources.
However, if an employed bee decides to leave its current food
source and go to other randomly selected one, then it is named
a scout bee. Generally, a population of the swarm in ABC
algorithm is divided into half. Employed bees are the first half,
whereas onlooking bees are in the other one. Over time,
artificial bees discover better and better food sources to finally
find the ones with the largest amount of nectar. Then the
coordinates of its position represent the best solution for our
optimization problem.</p>
      <p>I.</p>
    </sec>
    <sec id="sec-9">
      <title>Pseudocode</title>
      <p>In our implementation, the ABC algorithm works in
accordance with the pseudocode showed below:</p>
      <sec id="sec-9-1">
        <title>Input:</title>
        <p>



</p>
      </sec>
      <sec id="sec-9-2">
        <title>Output:</title>
        <p>Population of the colony ‘n’, a natural number,</p>
        <sec id="sec-9-2-1">
          <title>Number of iterations, a natural number Limit of chances for food source ‘t’, a natural number, Number of food sources ‘o’, a natural number.</title>
          <p>The co-ordinates (xi, yi) and value of the best food
source.</p>
        </sec>
      </sec>
      <sec id="sec-9-3">
        <title>After</title>
        <p>solution</p>
      </sec>
      <sec id="sec-9-4">
        <title>Phases of the algorithm:</title>
      </sec>
      <sec id="sec-9-5">
        <title>Initialization Phase</title>
        <p>Setting number of food sources, population, functions
parameters.</p>
      </sec>
      <sec id="sec-9-6">
        <title>Repeat for every iteration</title>
        <p>Employed bees phase
Onlooker bees phase
Scouts bees phase
Memorizing the best solution achieved so far
Writing the co-ordinates and value of the best</p>
      </sec>
    </sec>
    <sec id="sec-10">
      <title>J. Initialization phase</title>
      <p>Firstly, variables for algorithm must be set, e.g. population
of a swarm, limit of chances for each food source. We can also
change the number of food sources, which optimal value equals
where ‘lb’ is lower and ‘ub’ is upper bound of our parameter
‘xi’.</p>
    </sec>
    <sec id="sec-11">
      <title>K. Employed bees phase</title>
      <p>Employed bees look for a new food source that might have
a greater amount of nectar in a close neighbourhood of their
last food source. The value of food source is set using the
following equation:</p>
      <p>where ‘i’ is a randomly chosen parameter index, ‘xi’ is a
randomly selected food source and ‘σ’ represents a random
number from range [-1,1].</p>
      <p>Simultaneously, fitness of every food source is being
computed. After these processes a bee makes a choice. The
fitness value may be calculated using the following formula:
the half of the population. Then, we must adjust parameters for
a function we want to optimize, e.g. upper and lower bound.
During this phase all food sources are initialized by scout bees.
Every food source represents possible solution for our
optimization problem. The following equation might be used
for initialization:</p>
    </sec>
    <sec id="sec-12">
      <title>N. Observations</title>
      <p>The proper number of food sources equals a half of the
swarm population. If we keep reducing the number of food
sources, we will have worse and worse solutions. When we
increase bee population, it does not always go hand in hand
with getting a more accurate solution. To obtain optimal
chances we limit ‘o’ to 100. There is a slight difference, when
we change this parameter.
(10)
(11)
(12)
(13)
where f(xi) is a value of solution ‘xi’.</p>
    </sec>
    <sec id="sec-13">
      <title>L. Onlooker bees phase</title>
      <p>Onlooker bees wait in the swarm for information about
food sources provided by employed bees. Then a probability is
calculated. It is based on the fitness value, given them by
employed bees. This probability means that onlooker bees are
more willing to exploit food sources with higher fitness value.
It can be calculated using a following formula:</p>
      <p>where ‘fit(i)’ means fitness of food source with index ‘i’
and ‘s’ means a number of food sources. Afterwards a
neighbouring source is calculated from (10) and fitness value
from (12). The onlooker bee must choose one of two. The more
bees are recruited to better food sources, the more positive
feedback they have.</p>
    </sec>
    <sec id="sec-14">
      <title>M. Scout bees phase</title>
      <p>In this phase unemployed bees choose a new food source
randomly. Scouts are bees, which abandoned their solutions
due to chance limit set at the beginning of the algorithm. This
means that their solutions could not be upgraded. What is more,
they choose next food sources as well as share negative
opinions on abandoned food sources, so that it will balance
positive ones.</p>
      <sec id="sec-14-1">
        <title>Input:</title>
      </sec>
      <sec id="sec-14-2">
        <title>Output:</title>
        <p>




GENETIC ALGORITHM</p>
      </sec>
    </sec>
    <sec id="sec-15">
      <title>A. Genesis</title>
      <p>
        Genetic algorithm (GA) was proposed by Alan Turing in
1950. He was inspired by Charles Darwin’s theory of natural
evolution. It was developed by John Holland and colleagues at
the University of Michigan in [
        <xref ref-type="bibr" rid="ref16">18</xref>
        ]. Nowadays we can find its
various improved versions reported in many articles [
        <xref ref-type="bibr" rid="ref13">15</xref>
        ]-[
        <xref ref-type="bibr" rid="ref15">17</xref>
        ].
GA is based on population which uses natural selection. It is
used for optimization problems. Algorithm relies on biological
operations: selection, mutation and crossover, as in natural
genetics. Firstly, whole population is selected by their fitness
score (degree of adaptation). The higher the score, the higher
chance of surviving to the next generation. The next generation
is composed of parents, who survived and their children.
Children are created by crossover or mutation.
      </p>
    </sec>
    <sec id="sec-16">
      <title>B. Pseudocode</title>
      <p>I our implementation, the GA works in accordance with the
pseudocode shown below:</p>
      <p>Population of the generation, a natural number,</p>
      <sec id="sec-16-1">
        <title>Number of iterations, a natural number,</title>
      </sec>
      <sec id="sec-16-2">
        <title>Two chromosomes for each individual. Two chromosomes of the best individual and fitness value.</title>
        <sec id="sec-16-2-1">
          <title>Phases of the algorithm:</title>
          <p>Initialization phase
Setting number of populations, their chromosomes and
function parameters
Repeating for every generation:</p>
          <p>Selection</p>
          <p>Setting the best individual and elite individuals
Crossover
Mutation</p>
        </sec>
      </sec>
      <sec id="sec-16-3">
        <title>After Write the best fitness score and chromosomes</title>
      </sec>
    </sec>
    <sec id="sec-17">
      <title>C. Details</title>
      <p>The leader of the population has the highest chance of
reproduction. The elite of the population have lower
chance which is still high. A typical individual has the
lowest chance of reproduction. This algorithm is based
on herd selection, where Alpha male, some group of
elite individuals and the rest of the population appear.
The reproduction probability for the leader can be
calculated using a followed formula:
(14)
where:


elite - number of individuals in elite group
population - number of population
All chromosomes before crossover or mutation are
translated into genes. In our case decimal numbers are
translated into binary numbers.</p>
      <p>We used 3 crossover variants: 1-Point Crossover,
Destructive Crossover and Constructive Crossover. In
1-Point Crossover offspring is created by exchanging
the genes of parents.</p>
      <p>In Destructive Crossover only positive genes in both
chromosomes are reproduced.</p>
      <p>In Constructive Crossover if one of genes is positive, then
offspring’s gene is also positive.
Right side: up - destructive, down - constructive crossover.</p>
      <p>In Crossover, it is also interesting how positive and
negative value is inherited. Here , we applied the methodology
of inheritance Rh factor from blood group system. When two
positive chromosomes cross, the offspring is positive in 15 of
16 cases. But, if one is positive and the other is negative, the
positive individual appears 3 times per 4. If both parents are
negative, the offspring is always negative.</p>
      <p>Operation of mutation inverts random genes. It is used for
mutating parents or children that have been created by
crossover.</p>
    </sec>
    <sec id="sec-18">
      <title>D. Observations</title>
      <sec id="sec-18-1">
        <title>We can find two kinds of behaviour:</title>
        <p> whole population does not evolve for the most of the
time

rare but really efficient ‘jumps’, where the best
individual evolves</p>
        <p>That is really interesting and the solution of this may be in
combination of two algorithms: destructive crossover and
mutation. GA works the best with Rosenbrock function, but it
did not look well until 70th generation. The ‘jump’ occurred
where chromosome evolved. In Baele function GA jumps are
only in the beginning and are not spectacular. Himmemblau
function is definitely the hardest to our GA. After the 10th
generation the algorithm slows down and starts again before the
end. In Levy’s function we can observe quick start and huge
jump in the middle generation.</p>
      </sec>
    </sec>
    <sec id="sec-19">
      <title>Conclusions 1) 2) 3)</title>
      <p>GA is very useful to find the surroundings of
solution, but after some upgrades, it could be more
accurate
Our version of GA is looking for solution in whole
range, not in closer environment
A lot of results after mutations and crossover are
useless due to thoughtless randomizing algorithm</p>
      <sec id="sec-19-1">
        <title>BENCHMARK TESTS Plots present the smallest ‘f’ value found so far depending on the number of iterations done for each testing functions. FA algorithm’s results are indicated by green lines, ABC Algorithm’s by the blue ones and GA Algorithm’s by the yellow ones. [B’s sent.]</title>
        <p>Green line indicates the values of Firefly algorithm, while blue line represents Artificial Bee Colony. [K’s sent.]
FA Algorithm data was taken for following coefficients’ values: ‘n’ = 100; ‘β0’ = 0,1; ‘γ’ = 0,1; ‘a’ = 0,01.
ABC algorithm data was taken for following coefficients’ values: ‘n’ = 100, ‘t’ = 50, ‘o’ = 100.</p>
        <p>GA algorithm data was taken for following coefficients’ values: ‘n’ = 100, ‘o’ = 100, crossover probability = 0,5, mutation
probability for not-crossover= 0,5, mutation probability for crossover = 0,1, probability of mutation single gene = 0,1, in crossover
positive sign is for: two positive = 15/16, two different = ¾, two negative 0.</p>
        <p>The graphs show the smallest values found for each of
the test functions depending on the number of iterations of
each algorithm. Population for our algorithms was set to 100
individuals. Other parameters are incomparable and
determined differently.</p>
        <p>We tested Rosenbrock function first. As we can see FA
generated higher value at the beginning and a value at 100s
iteration exceeded a final value of ABC algorithm. It was
less accurate by almost two decimal places. ABC algorithm
line plummeted and at second iteration reached its lowest
value. Genetic algorithm performed the best in this function.</p>
        <p>It was more accurate by three decimal places than ABC.</p>
        <p>The second graph shows Beal’s function, in which FA
dominated over ABC algorithm and Genetic algorithm. Even
though it started with a slightly higher function value than
the others, it started to decline very quickly and reached
highly precise value of seven decimal places. This is an
impressive result.</p>
        <p>On the third graph we can see comparable results of FA
and ABC algorithms. The values declined at different
moments. ABC algorithm reached constant value at 60th
iteration, whereas FA fell finally at 95th iteration and had a
slightly more accurate function value. Genetic algorithm was
not precise and its calculation was inaccurate, its precision
was by two decimal places, while the ABC and FA
algorithms reached values with 7 and 8 decimal places,
respectively.</p>
        <p>FA did not perform well at Levy’s function. At 10
iteration it reached a static value, which was highly
inaccurate comparing to ABC algorithm. It had precision of
two decimal places, while the second algorithm reached five
times better result. Genetic algorithm reached an interesting
precision, however, it was still less accurate than ABC
algorithm.</p>
        <p>POSSIBLE APPLICATION</p>
        <p>Heuristic algorithms can be applied in many practical
situations when an optimization is needed. To use an
heuristic algorithm, first we need to find a cost function
formula. Also we must keep in mind that there is a risk that
the algorithm will not find a local minimum. Then we
should try to apply another algorithm.</p>
        <p>Today heuristic algorithms may be used for example in
market analysis. It uses the fact that when a bee in ABC
algorithm moves to a better solution than the previous one.</p>
        <p>That behaviour may be easily applied thanks to trend
functions of specific stock chart. At the defined range,
algorithm knows when it is hitting the lowest and the
highest value. Specially designed bot could sell or buy
shares of a company we are following. Algorithms may be
also used in neuron networks that would learn to solve
issues faster than usually. Neurons would correspond with
each other and work with better quality so the data would be
interpreted more precisely.</p>
        <p>Genetic algorithm is used in many processes which use a
biological algorithm because of its natural solutions usage.</p>
        <p>For example, robots are able to learn human behaviour. As a
result, now they can perform actions which were reserved
for humans only. It advances the moment when robots will
replace physical works, which is called evolutionary
robotics.</p>
        <p>GA is also commonly used in training neural networks.</p>
        <p>The main benefit of this method is the fact that
neuroevolution can find quicker ways of finding solutions.</p>
        <p>Typical supervised learning algorithms require data of
correct input-output pairs. In contrast, genetic algorithm
requires only a measure of a network's performance at a
task. For example, in games we do not provide strategies
(how to play), but only how to increase score. This
mechanism makes GA easy to transfer from one program to
the other one.</p>
        <p>Genetic algorithm is also used in music record
production. Algorithm uses known songs to create new
ones. It is based on schemes that are most popular in the
songs, recombining them to the fresh chanson.</p>
        <p>NOZOHOUR-LEILABADY, Behzad; FAZELABDOLABADI,
Babak. On the application of artificial bee colony (ABC) algorithm
for optimization of well placements in fractured reservoirs; efficiency
comparison with the particle swarm optimization (PSO) methodology.</p>
        <p>Petroleum, 2016, 2.1: 79-89.</p>
        <p>WOŹNIAK, Marcin; POŁAP, Dawid. Adaptive neuro-heuristic
hybrid model for fruit peel defects detection. Neural Networks, 2018,
98: 16-33.</p>
      </sec>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          <string-name>
            <surname>YANG</surname>
          </string-name>
          ,
          <string-name>
            <surname>Xin-She</surname>
          </string-name>
          .
          <article-title>Firefly algorithms for multimodal optimization</article-title>
          .
          <source>In: International symposium on stochastic algorithms</source>
          . Springer, Berlin, Heidelberg,
          <year>2009</year>
          . p.
          <fpage>169</fpage>
          -
          <lpage>178</lpage>
          .J.
          <string-name>
            <surname>Clerk Maxwell</surname>
          </string-name>
          ,
          <source>A Treatise on Electricity and Magnetism</source>
          , 3rd ed.,
          <source>vol. 2</source>
          . Oxford: Clarendon,
          <year>1892</year>
          , pp.
          <fpage>68</fpage>
          -
          <lpage>73</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          <string-name>
            <surname>WOŹNIAK</surname>
            , Marcin; MARSZAŁEK,
            <given-names>Zbigniew.</given-names>
          </string-name>
          <article-title>An idea to apply firefly algorithm in 2d image key-points search</article-title>
          .
          <source>In: International Conference on Information and Software Technologies</source>
          . Springer, Cham,
          <year>2014</year>
          . p.
          <fpage>312</fpage>
          -
          <lpage>323</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          <string-name>
            <surname>NAPOLI</surname>
          </string-name>
          ,
          <string-name>
            <surname>Christian</surname>
          </string-name>
          , et al.
          <article-title>Simplified firefly algorithm for 2d image key-points search</article-title>
          .
          <source>In: Computational Intelligence for Human-like Intelligence (CIHLI)</source>
          ,
          <source>2014 IEEE Symposium on. IEEE</source>
          ,
          <year>2014</year>
          . p.
          <fpage>1</fpage>
          -
          <lpage>8</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          <string-name>
            <surname>Springer</surname>
          </string-name>
          , Berlin, Heidelberg,
          <year>2007</year>
          . p.
          <fpage>789</fpage>
          -
          <lpage>798</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          <string-name>
            <surname>KARABOGA</surname>
          </string-name>
          , Dervis.
          <article-title>Artificial bee colony algorithm</article-title>
          .
          <source>Scholarpedia</source>
          ,
          <year>2010</year>
          ,
          <volume>5</volume>
          .3:
          <fpage>6915</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          <string-name>
            <surname>Witanowski</surname>
            , Łukasz &amp; Lampart,
            <given-names>Piotr.</given-names>
          </string-name>
          (
          <year>2015</year>
          ).
          <article-title>Using of a bee algorithms for searching process of minimum of objective function</article-title>
          .
          <source>Mechanik</source>
          ,
          <volume>570</volume>
          /
          <fpage>971</fpage>
          -570/978. 10.17814/mechanik.2015.7.318.
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <label>10</label>
        <mixed-citation>
          <string-name>
            <surname>Capizzi</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Sciuto</surname>
            ,
            <given-names>G. L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Napoli</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Shikler</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Woźniak</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          (
          <year>2018</year>
          ).
          <article-title>Optimizing the Organic Solar Cell Manufacturing Process by Means of AFM Measurements and Neural Networks</article-title>
          . Energies,
          <volume>11</volume>
          (
          <issue>5</issue>
          ),
          <fpage>1</fpage>
          -
          <lpage>13</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <label>11</label>
        <mixed-citation>
          <string-name>
            <surname>Napoli</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Pappalardo</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Tina</surname>
            ,
            <given-names>G. M.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Tramontana</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          (
          <year>2016</year>
          ).
          <article-title>Cooperative strategy for optimal management of smart grids by wavelet rnns and cloud computing</article-title>
          .
          <source>IEEE transactions on neural networks and learning systems</source>
          ,
          <volume>27</volume>
          (
          <issue>8</issue>
          ),
          <fpage>1672</fpage>
          -
          <lpage>1685</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <label>12</label>
        <mixed-citation>
          <string-name>
            <surname>Bonanno</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Capizzi</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Coco</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Napoli</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Laudani</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Sciuto</surname>
            ,
            <given-names>G. L.</given-names>
          </string-name>
          (
          <year>2014</year>
          , June).
          <article-title>Optimal thicknesses determination in a multilayer structure to improve the SPP efficiency for photovoltaic devices by an hybrid FEM-cascade neural network based approach</article-title>
          .
          <source>In International Symposium on Power Electronics, Electrical Drives, Automation and Motion (SPEEDAM)</source>
          ,
          <year>2014</year>
          (pp.
          <fpage>355</fpage>
          -
          <lpage>362</lpage>
          ). IEEE.
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <label>13</label>
        <mixed-citation>
          <string-name>
            <surname>WOŹNIAK</surname>
            , Marcin; POŁAP,
            <given-names>Dawid.</given-names>
          </string-name>
          <article-title>Bio-inspired methods modeled for respiratory disease detection from medical images</article-title>
          .
          <source>Swarm and Evolutionary Computation</source>
          ,
          <year>2018</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <label>14</label>
        <mixed-citation>
          <string-name>
            <surname>DESURVIRE</surname>
            , Heather; CAPLAN, Martin; TOTH,
            <given-names>Jozsef A.</given-names>
          </string-name>
          .
          <article-title>Using heuristics to evaluate the playability of games</article-title>
          .
          <source>In: CHI'04 extended abstracts on Human factors in computing systems</source>
          . ACM,
          <year>2004</year>
          . p.
          <fpage>1509</fpage>
          -
          <lpage>1512</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <label>15</label>
        <mixed-citation>
          <string-name>
            <surname>LAU</surname>
            , Richard R.; REDLAWSK,
            <given-names>David P.</given-names>
          </string-name>
          .
          <article-title>Advantages and disadvantages of cognitive heuristics in political decision making</article-title>
          .
          <source>American Journal of Political Science</source>
          ,
          <year>2001</year>
          ,
          <fpage>951</fpage>
          -
          <lpage>971</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <label>16</label>
        <mixed-citation>
          <string-name>
            <surname>REZAIE</surname>
            , Behrooz; GHASEMI, Hossein; RAHMANI,
            <given-names>Zahra.</given-names>
          </string-name>
          <article-title>Terminal Sliding Mode Controller Tuned Using Evolutionary Algorithms for Finite-Time Robust Tracking Control in a Class of Nonholonomic Systems</article-title>
          .
          <source>Information Technology And Control</source>
          ,
          <volume>47</volume>
          .1:
          <fpage>26</fpage>
          -
          <lpage>44</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <label>17</label>
        <mixed-citation>
          <string-name>
            <surname>BROCIEK</surname>
            , Rafal; SLOTA,
            <given-names>Damian.</given-names>
          </string-name>
          <article-title>Application of real ant colony optimization algorithm to solve space and time fractional heat conduction inverse problem</article-title>
          .
          <source>Information Technology And Control</source>
          ,
          <year>2017</year>
          ,
          <volume>46</volume>
          .2:
          <fpage>171</fpage>
          -
          <lpage>182</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <label>18</label>
        <mixed-citation>
          <string-name>
            <surname>DEB</surname>
            ,
            <given-names>Kalyanmoy</given-names>
          </string-name>
          , et al.
          <article-title>A fast elitist non-dominated sorting genetic algorithm for multi-objective optimization: NSGA-II</article-title>
          .
          <source>In: International Conference on Parallel Problem Solving From Nature</source>
          . Springer, Berlin, Heidelberg,
          <year>2000</year>
          . p.
          <fpage>849</fpage>
          -
          <lpage>858</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <label>19</label>
        <mixed-citation>
          <string-name>
            <surname>HORN</surname>
            , Jeffrey; NAFPLIOTIS, Nicholas; GOLDBERG,
            <given-names>David E.</given-names>
          </string-name>
          .
          <article-title>A niched Pareto genetic algorithm for multiobjective optimization</article-title>
          .
          <source>In: Proceedings of the First IEEE Conference on Evolutionary Computation, IEEE World Congress on Computational Intelligence</source>
          . IEEE,
          <year>1994</year>
          . p.
          <fpage>82</fpage>
          -
          <lpage>87</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <label>20</label>
        <mixed-citation>
          <string-name>
            <surname>MORRIS</surname>
          </string-name>
          ,
          <string-name>
            <surname>Garrett</surname>
            <given-names>M.</given-names>
          </string-name>
          , et al.
          <article-title>Automated docking using a Lamarckian genetic algorithm and an empirical binding free energy function</article-title>
          .
          <source>Journal of computational chemistry</source>
          ,
          <year>1998</year>
          ,
          <volume>19</volume>
          .14:
          <fpage>1639</fpage>
          -
          <lpage>1662</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref19">
        <label>21</label>
        <mixed-citation>
          <string-name>
            <surname>GOLDBERG</surname>
            , David E.; HOLLAND,
            <given-names>John H.</given-names>
          </string-name>
          <article-title>Genetic algorithms and machine learning</article-title>
          .
          <source>Machine learning</source>
          ,
          <year>1988</year>
          ,
          <volume>3</volume>
          .2:
          <fpage>95</fpage>
          -
          <lpage>99</lpage>
          .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>