=Paper=
{{Paper
|id=Vol-1712/p01
|storemode=property
|title=Local Selection for Heuristic Algorithms as a Factor in Accelerating Optimum Search
|pdfUrl=https://ceur-ws.org/Vol-1712/p01.pdf
|volume=Vol-1712
|authors=Danuta Jama
}}
==Local Selection for Heuristic Algorithms as a Factor in Accelerating Optimum Search==
Danuta Jama
Institute of Mathematics
Silesian University of Technology
Kaszubska 23, 44-100 Gliwice, Poland
Email: Danuta.Jama@polsl.pl
Copyright © 2016 held by the authors.

Abstract—The increasing use of heuristic algorithms in practical applications makes it necessary to perform numerous modifications in order to minimize their possible disadvantages. Acceleration and increased precision have become pressing problems of applied computer science and optimization theory. In this work, the idea of a factor which minimizes the time needed by heuristic algorithms to search the solution space is presented. The proposed modification is described and tested on various test functions, and the results are discussed in terms of the effectiveness of the proposed modification.

I. INTRODUCTION
Nowadays, IT solutions can be found everywhere, from hospitals and factories to our own homes. Computer systems not only replace people at various levels of activity, but also support our lives. In hospitals, the use of such solutions allows for faster diagnosis of diseases or even prevents their development. In large factories, goods are produced automatically, without the participation of people, owing to higher precision and high-speed operation. Moreover, such systems have also found applications in our daily lives, e.g. washing machines or blood pressure monitors.

Each computer activity rests on numerous algorithms, through which it operates in a certain way. Besides the classic algorithms, the application of computational intelligence is becoming increasingly popular because of the ability to obtain accurate results in a short time. Computational intelligence is represented primarily by three branches – fuzzy logic, neural networks and algorithms inspired by natural phenomena. The last group is also classified under optimization theory, which plays an important role in modern information technology. In [1], Durgun and Yildiz showed that the use of these algorithms allows for solving optimization problems in structural design, and in [2], an optimization technique used to solve capacitated vehicle routing problems was shown. Additionally, in [3], the idea of using one of the optimization algorithms to minimize the drug-protein interaction energy was discussed. Again in [4], Mohandes discussed the problem of the quality of solar radiation received by the Earth's surface and proposed the idea of modeling global solar radiation using neural networks and an optimization algorithm.

Due to the numerous applications of optimization algorithms, in this paper we show that the introduction of local selection as a factor decreasing the running time of the algorithm is an effective modification. For testing purposes, some well-known functions were taken and used to verify the proposed technique.

II. TEST FUNCTIONS FOR OPTIMIZATION PURPOSES

Test functions for optimization problems are multimodal functions, for which the determination of the global minimum or maximum value is a hard problem for well-known extremum-finding algorithms. Such functions are often called artificial landscapes, because their charts are complicated surfaces likened to landscapes found in nature.

The first function is the Egg holder's function, which has many local extremes that can cause an algorithm to get stuck in one of them. The function is defined as

    f_{Eggholder}(x) = -(x_2 + 47) \sin\sqrt{\left| \frac{x_1}{2} + (x_2 + 47) \right|} - x_1 \sin\sqrt{\left| x_1 - (x_2 + 47) \right|},    (1)

wherein x is a point of the two-dimensional solution space with spatial coordinates marked as x_1 and x_2. This function has a number of local minima and maxima; for instance, the global minimum f_{Eggholder}(x) = -959.6418 is obtained at the point x = (512, 404.2319). The function is shown in Fig. 1.

The Easom function is the second function selected for the minimization problem. The function is described by the following equation

    f_{Easom}(x) = \frac{-\cos(x_1)\cos(x_2)}{\exp((x_1 - \pi)^2 + (x_2 - \pi)^2)},    (2)

which reaches the global minimum f_{Easom}(x) = -1 at the point x = (\pi, \pi). The function is shown in Fig. 2.

The last function is Himmelblau's function, representing a flattened landscape and described as

    f_{Himmelblau}(x) = (x_1^2 + x_2 - 11)^2 + (x_1 + x_2^2 - 7)^2.    (3)

This function has only one local maximum, f_{Himmelblau}(x) = 181.617, achieved at the point x = (-0.270845, -0.923039). The function is presented in Fig. 3.
Fig. 1: Egg holder's function.
Fig. 2: Easom’s function.
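To make the benchmarks concrete, the three functions (1)–(3) can be implemented directly; below is a minimal Python sketch (the function names and the NumPy dependency are our additions, not part of the original paper), with the extrema quoted above used as a sanity check.

```python
import numpy as np

def eggholder(x1, x2):
    # Egg holder's function, eq. (1); global minimum near (512, 404.2319).
    return (-(x2 + 47.0) * np.sin(np.sqrt(abs(x1 / 2.0 + (x2 + 47.0))))
            - x1 * np.sin(np.sqrt(abs(x1 - (x2 + 47.0)))))

def easom(x1, x2):
    # Easom's function, eq. (2); global minimum f = -1 at (pi, pi).
    return -np.cos(x1) * np.cos(x2) * np.exp(-((x1 - np.pi) ** 2 + (x2 - np.pi) ** 2))

def himmelblau(x1, x2):
    # Himmelblau's function, eq. (3); local maximum f = 181.617 at (-0.270845, -0.923039).
    return (x1 ** 2 + x2 - 11.0) ** 2 + (x1 + x2 ** 2 - 7.0) ** 2

print(eggholder(512.0, 404.2319))        # close to -959.64
print(easom(np.pi, np.pi))               # -1.0
print(himmelblau(-0.270845, -0.923039))  # close to 181.617
```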
III. HEURISTIC ALGORITHMS

Heuristic algorithms are among the most important techniques for finding the global optimum of a multimodal function in large solution spaces. Until today, several methods inspired by natural phenomena have been modeled. One of the drawbacks of this type of algorithm is that there is no guarantee of a correct solution within a predetermined number of iterations; an analysis of the convergence of one such method was presented by S. Arora and S. Singh in [5]. On the other hand, their use allows one to obtain an approximate solution in a short time using a computer with low computational efficiency.
Fig. 3: Himmelblau’s function.
A. Wolf Search Algorithm

In 2012, Tang et al. [6] created a model of the behavior of wolves looking for food and avoiding their enemies, which allowed for the creation of a bio-inspired heuristic algorithm. The algorithm assumes that a wolf is represented as a point x in the solution space. Each wolf has a fixed visual area with radius r – the wolf can sense company only in the visible range, and only within this area can it move (in one iteration). The position is evaluated in terms of the fitness function Θ(x), understood as the quality of food in this area. If an enemy of the wolf appears in its sight, the wolf escapes to another position outside the range of its vision.

Wolves stick together, and so the greater the distance between a pair of wolves, the less attractive the place. A wolf moves to find prey, and therefore a better place (according to the fitness function). The movement is carried out by the following equation

    x_{new} = x_{actual} + \beta_0 \exp(-r^2)(x_{neighbor} - x_{actual}) + \gamma,    (4)

where \beta_0 is the ultimate incentive (a parameter set at the beginning of the algorithm), x_{neighbor} is the closest neighbor with better adaptation, \gamma is a random number from [0, 1], and r is the distance between x_{actual} and x_{neighbor} defined by the Euclidean metric

    r = d(x_{actual}, x_{neighbor}) = \sqrt{\sum_{i=1}^{2} (x_{neighbor,i} - x_{actual,i})^2}.    (5)

Wolves hunt using a stalking process, which can be divided into three stages of preying behavior – initiation, passive action and escape. The first stage involves movement within the field of view of a wolf, which means finding a better place according to

    x_{new} = x_{actual} + \alpha v \gamma,    (6)

where v is the velocity of a wolf. Passive hunting means staying in the current location until the moment when an enemy appears. The third stage occurs when the enemy is close to the wolf; escape is modeled by

    x_{new} = x_{actual} + \alpha s \gamma,    (7)

where s is the step size. The Wolf Search Algorithm implementation is presented in Algorithm 1; a sketch of the movement rules in code is given below.
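A minimal Python sketch of the movement rules (4)–(7); the helper names, the use of NumPy, and the treatment of γ as a scalar drawn once per move are our assumptions.

```python
import numpy as np

rng = np.random.default_rng()

def wolf_move(x_actual, x_neighbor, beta0):
    # Move toward the closest better neighbor, eq. (4); r is the distance from eq. (5).
    r = np.linalg.norm(x_neighbor - x_actual)
    gamma = rng.random()                      # random number from [0, 1]
    return x_actual + beta0 * np.exp(-r ** 2) * (x_neighbor - x_actual) + gamma

def wolf_prey(x_actual, alpha, v):
    # Initiative preying inside the visual range, eq. (6).
    return x_actual + alpha * v * rng.random()

def wolf_escape(x_actual, alpha, s):
    # Escape when an enemy appears, eq. (7).
    return x_actual + alpha * s * rng.random()
```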
Algorithm 1 Wolf Search Algorithm
1: Start
2: Define the number of iterations T, the number of wolves n, the radius of the visual range r, the step size s, the velocity factor α and the coefficient of appearance of the enemy p_alpha
3: Define the fitness condition Θ(·)
4: Create an initial population of wolves at random
5: t := 0
6: while t ≤ T do
7:   if t > 0 then
8:     Generate 80% of new wolves
9:   end if
10:  for each wolf x_actual in the population do
11:    Prey initiatively using (6)
12:    Generate a new position x_new according to (4)
13:    if d(x_actual, x_new) < r ∧ Θ(x_new) > Θ(x_actual) then
14:      Prey on new food passively
15:    end if
16:    Generate a new location
17:    Generate a random number β ∈ [0, 1]
18:    if β > p_alpha then
19:      Escape to a new position using (7)
20:    end if
21:  end for
22:  if t ≠ T then
23:    Take the 20% best adapted wolves to the next iteration
24:  end if
25:  t := t + 1
26: end while
27: Return the best wolf x_global as the solution
28: Stop
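Putting the pieces together, a compact Python sketch of Algorithm 1 might look as follows; the parameter defaults, the clipping to the search box, the implicit β₀ = 1 in eq. (4), and the maximization convention for Θ are our assumptions, not values from the paper (`easom` refers to the earlier sketch).

```python
import numpy as np

rng = np.random.default_rng(0)

def wolf_search(theta, lo, hi, n=25, T=200, r=50.0, s=10.0, alpha=0.2, p_alpha=0.25):
    # theta is the fitness to maximize; pass theta = lambda x: -f(x) to minimize f.
    wolves = rng.uniform(lo, hi, size=(n, 2))           # step 4: random initial pack
    best = max(wolves, key=theta).copy()
    for _ in range(T):
        for i in range(n):
            x = wolves[i].copy()
            # initiative preying inside the visual range, eq. (6)
            x = np.clip(x + alpha * s * rng.uniform(-1, 1, 2), lo, hi)
            # move toward the closest better neighbor if it is in range, eqs. (4)-(5)
            better = [w for w in wolves if theta(w) > theta(x)]
            if better:
                nb = min(better, key=lambda w: np.linalg.norm(w - x))
                d = np.linalg.norm(nb - x)
                if d < r:
                    cand = x + np.exp(-d ** 2) * (nb - x) + rng.random()
                    if theta(cand) > theta(x):
                        x = np.clip(cand, lo, hi)
            # enemy check: escape with step size s, eq. (7)
            if rng.random() > p_alpha:
                x = np.clip(x + alpha * s * rng.uniform(-1, 1, 2), lo, hi)
            wolves[i] = x
            if theta(x) > theta(best):
                best = x.copy()
    return best

# Example: best = wolf_search(lambda x: -easom(x[0], x[1]), -100.0, 100.0)
```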
B. Water Waves Algorithm

One of the latest algorithms inspired by a natural phenomenon is the model of water wave movement presented by Zheng [7]. A point x = (x_1, x_2) in the solution space is called a wave. The algorithm simulates the movement of water waves, and so the following three operations are modeled – propagation, refraction and breaking.

Propagation is understood as the movement of waves from deep water to shallow water. During this movement, the wave height increases but the wavelength decreases. This step is described as

    x_{new} = x_{actual} - 2\lambda_{new}(x_{actual})\, v,    (8)

where v is a random factor in the range [-1, 1] and λ(y) is a function that returns the wavelength of a wave y, formulated as

    \lambda_{new}(y) = \lambda_{actual}\, \alpha^{\frac{\Theta(y) - \Theta_{min} + \epsilon}{\Theta_{max} - \Theta_{min} + \epsilon}},    (9)

where Θ is the fitness function, Θ_min and Θ_max are respectively the minimum and maximum fitness values in a particular iteration of the algorithm, ε is a small number preventing division by zero, and α is the wavelength reduction coefficient.
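A minimal sketch of the propagation step (8) and the wavelength update (9), assuming NumPy; the ε default and the clipping to the search box are our choices.

```python
import numpy as np

rng = np.random.default_rng()

def propagate(x, lam, lo, hi):
    # Eq. (8): shift the wave by a random factor v in [-1, 1] scaled by its wavelength.
    v = rng.uniform(-1.0, 1.0, size=x.shape)
    return np.clip(x - 2.0 * lam * v, lo, hi)

def update_wavelength(lam, theta_x, theta_min, theta_max, alpha, eps=1e-12):
    # Eq. (9): rescale the wavelength by the wave's relative fitness;
    # eps prevents division by zero when all waves have equal fitness.
    return lam * alpha ** ((theta_x - theta_min + eps) / (theta_max - theta_min + eps))
```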
After propagation, the next stage is refraction – the phenomenon of a change in wave direction, applied where the wave height reaches a value of 0. The new position of the wave after refraction is modeled as

    x_{actual} = N(\mu, \sigma),    (10)

where N(µ, σ) is a random number from the normal distribution with mean µ and standard deviation σ defined as

    \mu = \frac{x_{global} + x_{actual}}{2}, \qquad \sigma = \frac{|x_{global} - x_{actual}|}{2},    (11)

where x_global is the best wave (solution) in the current iteration. During refraction, the wavelength of the wave is also modified by

    \lambda_{new} = \lambda_{old} \frac{\Theta(x_{actual})}{\Theta(x_{new})}.    (12)

The final stage of wave movement is breaking, which describes the movement of the wave at a depth below a certain threshold level – the velocity exceeds the wave celerity. The breaking stage is understood as a local search calculated as

    x_{new} = x_{actual} + 2\beta N(0, 1),    (13)

where β is the breaking parameter. The Water Waves Algorithm implementation is presented in Algorithm 2.

Algorithm 2 Water Waves Algorithm
1: Start
2: Define the number of iterations T and the number of waves n
3: Define the fitness condition Θ(·)
4: Create an initial population of waves at random
5: t := 0
6: while t ≤ T do
7:   if t > 0 then
8:     Generate 80% of new waves
9:   end if
10:  for each wave x_actual do
11:    Create x_new using (8)
12:    if Θ(x_new) > Θ(x_actual) then
13:      Replace x_actual with x_new
14:      if Θ(x_new) > Θ(x_global) then
15:        Break x_new according to (13)
16:        Replace x_global with x_new
17:      end if
18:    else
19:      Reduce the height of x_actual by one
20:      if the height of the wave is 0 then
21:        Refract x_actual to x_new by (10)
22:      end if
23:    end if
24:    Update the wavelengths using (9)
25:  end for
26:  if t ≠ T then
27:    Take the 20% best adapted waves to the next iteration
28:  end if
29:  t := t + 1
30: end while
31: Return the best wave x_global as the solution
32: Stop
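The refraction and breaking steps (10)–(13) can be sketched in the same style; the per-coordinate application of N(µ, σ) and the function names are our assumptions.

```python
import numpy as np

rng = np.random.default_rng()

def refract(x, x_global):
    # Eqs. (10)-(11): redraw each coordinate from N(mu, sigma) between
    # the stalled wave and the current best wave.
    mu = (x_global + x) / 2.0
    sigma = np.abs(x_global - x) / 2.0
    return rng.normal(mu, sigma)

def refract_wavelength(lam, theta_actual, theta_new):
    # Eq. (12): rescale the wavelength after refraction.
    return lam * theta_actual / theta_new

def breaking(x, beta):
    # Eq. (13): local search around the best wave with breaking parameter beta.
    return x + 2.0 * beta * rng.normal(0.0, 1.0, size=x.shape)
```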
IV. A FACTOR ACCELERATING THE SEARCH FOR THE OPTIMUM

In heuristic algorithms, the initial population is selected at random. For complex functions, such a solution increases the risk of getting stuck in a local optimum (see Fig. 1). To remedy this situation, the initial population can be placed at specific locations in the solution space, i.e. all individuals are positioned equally spaced. As a result, the population forms a grid that covers the entire space. In the next step, each individual is multiplied to cover a larger area. For every individual, a local search is made. Each of the individuals is then evaluated by the fitness function, and the 25% fittest are selected as the initial population.

The local search is done by gradient descent. The algorithm starts at a point x, for which the gradient (the negative gradient for a minimization problem) is calculated as

    \nabla\Theta_{x_i} = \frac{\partial \Theta(x_1, x_2)}{\partial x_i} \quad \text{for } i = 1, 2.    (14)

The gradient indicates the direction of the fastest growth of the function value at the measured point. In the next step, the next point is found along the direction defined by the gradient as

    x_{new} = x_{actual} - \lambda \nabla\Theta_x,    (15)

where λ is the step size. The factor implementation is shown in Algorithm 3; a sketch in code follows the listing.

Algorithm 3 A factor algorithm
1: Start
2: Define the fitness function Θ(·), the solution space A × B, the population size n, the step size λ and the number of iterations T
3: Arrange the n individuals equally spaced over the entire space A × B
4: for each individual x do
5:   Create 3 additional individuals and distribute them near x
6: end for
7: t := 1
8: for each individual x do
9:   while t ≤ T do
10:    Calculate x_new according to (15)
11:    if Θ(x_actual) ≤ Θ(x_new) then
12:      x_actual := x_new
13:    end if
14:    t := t + 1
15:  end while
16: end for
17: Rate all the individuals using the fitness function
18: Return the 25% fittest individuals
19: Stop
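A minimal Python sketch of Algorithm 3 under stated assumptions: the gradient in (14) is approximated by central differences, the grid is square, the three clones are scattered uniformly within a fixed spread, and only improving steps of (15) are accepted for a minimized Θ; all names and defaults are ours.

```python
import numpy as np

def numeric_grad(theta, x, h=1e-6):
    # Central-difference approximation of the partial derivatives in eq. (14).
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (theta(x + e) - theta(x - e)) / (2.0 * h)
    return g

def local_descent(theta, x, lam=0.01, T=50):
    # Repeated steps of eq. (15), keeping only improving moves (minimization).
    for _ in range(T):
        x_new = x - lam * numeric_grad(theta, x)
        if theta(x_new) < theta(x):
            x = x_new
    return x

def factor_init(theta, lo, hi, n=100, keep=0.25, spread=1.0, seed=0):
    # Algorithm 3: grid the space, clone each point 3 times nearby,
    # locally refine every individual, return the fittest 25%.
    rng = np.random.default_rng(seed)
    side = int(np.sqrt(n))
    xs = np.linspace(lo, hi, side)
    grid = np.array([(a, b) for a in xs for b in xs])
    clones = grid[:, None, :] + rng.uniform(-spread, spread, (len(grid), 3, 2))
    pop = np.vstack([grid, clones.reshape(-1, 2)])
    pop = np.array([local_descent(theta, x) for x in pop])
    scores = np.array([theta(x) for x in pop])
    k = max(1, int(keep * len(pop)))
    return pop[np.argsort(scores)[:k]]   # lowest theta = fittest for minimization
```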
V. EXPERIMENTAL RESULTS

Numerical tests were performed to search for the global optimum of all functions described in Section II. For the selected population size (n = 100 individuals) and numbers of iterations (100, 1000 and 10000), results were obtained for the original and modified methods. The results in terms of accuracy are presented in Tables I, II and III. The average values over 10 experiments, relating the accuracy of the solution to the running time, are presented in Fig. 4.

Fig. 4: The dependence of the accuracy on the average time.

Based on the obtained data, the application of the modification is cost-effective in terms of accuracy only for a large number of iterations – then the results are more accurate than those of the original methods. For fewer than 1000 iterations, better accuracy is achieved by the original algorithms. In terms of acceleration, the additional calculations slow down the operation for a small number of iterations. As with the accuracy, the modification is valuable when a much larger number of iterations is needed.
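Purely as an illustration, the sketches above could be combined into an experiment of this kind; every name below refers to our earlier hypothetical code, not to an implementation provided by the paper.

```python
# Minimize the Egg holder's function, seeding the search with the proposed factor.
theta = lambda x: eggholder(x[0], x[1])

seeds = factor_init(theta, lo=-512.0, hi=512.0, n=100)     # Algorithm 3 (sketch)
# A full integration would start the heuristic from `seeds` instead of a random
# population; here Algorithm 1 (sketch) is simply run on the same problem.
best = wolf_search(lambda x: -theta(x), -512.0, 512.0)
print(seeds[0], theta(seeds[0]))
print(best, theta(best))
```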
VI. FINAL REMARKS

The proposed modification reduces the operational time of heuristic algorithms and increases the accuracy of their operation, assuming a large number of iterations or a demand for very accurate results. On the basis of the tests performed for the selected functions, the modification showed that it is possible to reduce the search time.

REFERENCES

[1] I. Durgun and A. R. Yildiz, "Structural design optimization of vehicle components using cuckoo search algorithm," Materials Testing, vol. 54, no. 3, pp. 185–188, 2012.
[2] M. Reed, A. Yiannakou, and R. Evering, "An ant colony algorithm for the multi-compartment vehicle routing problem," Applied Soft Computing, vol. 15, pp. 169–176, 2014.
[3] A. Ghosh, M. Talukdar, and U. K. Roy, "Stable drug designing by minimizing drug protein interaction energy using PSO," arXiv preprint arXiv:1507.08408, 2015.
[4] M. A. Mohandes, "Modeling global solar radiation using particle swarm optimization (PSO)," Solar Energy, vol. 86, no. 11, pp. 3137–3145, 2012.
[5] S. Arora and S. Singh, "The firefly optimization algorithm: convergence analysis and parameter selection," International Journal of Computer Applications, vol. 69, no. 3, 2013.
[6] R. Tang, S. Fong, X.-S. Yang, and S. Deb, "Wolf search algorithm with ephemeral memory," in Digital Information Management (ICDIM), 2012 Seventh International Conference on, IEEE, 2012, pp. 165–172.
[7] Y.-J. Zheng, "Water wave optimization: a new nature-inspired metaheuristic," Computers & Operations Research, vol. 55, pp. 1–11, 2015.
TABLE I: Research results for the Egg holder's function (1)

Wolf Search Algorithm
iterations | obtained optimum x                    | Θ(x)
100        | (497.350286104414, 413.316038741412) | -445.109840056073
1000       | (470.030297278441, 485.562031085399) | -747.988554493845
10000      | (495.606099411662, 401.486352105386) | -787.798958711978

Wolf Search Algorithm with modification
iterations | obtained optimum x                    | Θ(x)
100        | (482.879289436564, 394.047534639969) | -532.10668463655
1000       | (362.447760432236, 495.061778684641) | -670.883977492652
10000      | (459.050439605047, 407.631429847158) | -827.9315354643

Water Wave Algorithm
iterations | obtained optimum x                    | Θ(x)
100        | (416.564008862043, 457.568839219198) | -481.75325119652
1000       | (482.862816510192, 394.327293482761) | -522.678854547558
10000      | (429.694043635248, 441.956854035126) | -893.969574033705

Water Wave Algorithm with modification
iterations | obtained optimum x                    | Θ(x)
100        | (386.448251729109, 445.815733413126) | -78.0632300803166
1000       | (463.005703484177, 494.428448618589) | -481.70585333327
10000      | (514.777470228624, 396.847095860563) | -981.843214134724
TABLE II: Research results for Easom's function (2)

Wolf Search Algorithm
iterations | obtained optimum x                    | Θ(x)
100        | (2.34615738054093, 3.25216948206172) | -0.365027440014811
1000       | (3.9124181950057, 3.62785225670219)  | -0.276365363830805
10000      | (2.96131405698197, 2.17447623478923) | -0.212171936285033

Wolf Search Algorithm with modification
iterations | obtained optimum x                    | Θ(x)
100        | (2.51148351398831, 3.47038642199262) | -0.461424318331189
1000       | (3.2396818237564, 1.77784279537287)  | -0.0315487721933061
10000      | (2.19667368391374, 1.37504338630244) | 0.00205881016133928

Water Wave Algorithm
iterations | obtained optimum x                    | Θ(x)
100        | (2.1929323068787, 2.67858712499895)  | -0.171094307906584
1000       | (3.5790521290987, 3.95715529143678)  | -0.263663206287755
10000      | (3.89253193693819, 2.23055503807522) | -0.111170990040294

Water Wave Algorithm with modification
iterations | obtained optimum x                    | Θ(x)
100        | (3.21630300125866, 3.14909067244692) | -0.991576295055776
1000       | (2.63754989143347, 2.82413371457911) | -0.583388074053213
10000      | (1.17472107344061, 1.67250233454281) | 9.45257018956421E-05
TABLE III: Research results for Himmelblau's function (3)

Wolf Search Algorithm
iterations | obtained optimum x                       | Θ(x)
100        | (0.682362732329575, -0.67352732442018)  | 160.003686302222
1000       | (0.633187983945565, -1.88113112742134)  | 163.753970449623
10000      | (-1.0879799313, -0.9926871380)          | 175.685

Wolf Search Algorithm with modification
iterations | obtained optimum x                         | Θ(x)
100        | (-0.24526995711274, -0.385508137934612)   | 178.626060371431
1000       | (-0.0459746942138182, -0.795370650848081) | 180.212099494084
10000      | (-0.234918390044439, -0.621006036932117)  | 180.680676146925

Water Wave Algorithm
iterations | obtained optimum x                        | Θ(x)
100        | (-0.526868914965013, -0.343995140559969) | 177.35171333301
1000       | (-0.086203324182985, -0.733375833711296) | 180.378842675413
10000      | (-0.594777831153375, -0.345780614924515) | 176.703320832286

Water Wave Algorithm with modification
iterations | obtained optimum x                        | Θ(x)
100        | (-0.877628832998513, -0.409932400290823) | 172.640906231244
1000       | (-0.89729838347868, -0.823287643875595)  | 173.520600953295
10000      | (-0.282641141341366, -0.948922069253829) | 181.606341140222