<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>A Hybrid Solver for Large Neighborhood Search: Mixing Gecode and EasyLocal++</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Raffaele Cipriano</string-name>
          <email>cipriano@dimi.uniud.it</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Luca Di Gaspero</string-name>
          <email>l.digaspero@uniud.it</email>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Agostino Dovier</string-name>
          <email>dovier@dimi.uniud.it</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>DIMI, University of Udine</institution>
        </aff>
      </contrib-group>
      <abstract>
        <p>We present a hybrid solver (called GELATO) that exploits the potential of a Constraint Programming (CP) environment (Gecode) and of a Local Search (LS) framework (EasyLocal++). GELATO allows the user to easily develop and use hybrid meta-heuristics combining CP and LS phases (in particular Large Neighborhood Search). We tested some hybrid algorithms on different instances of the Asymmetric Traveling Salesman Problem: even though only naive LS strategies have been used, our meta-heuristics improve on the standard CP and LS methods, in terms of both the quality of the solution reached and the execution time. GELATO will be integrated into a more general tool to solve Constraint Satisfaction/Optimization Problems. Moreover, it can be seen as a new library for approximate and efficient search in Gecode.</p>
      </abstract>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>Introduction</title>
      <p>Combinatorial problems like planning, scheduling, timetabling, and, in general,
resource management problems are handled daily by industries, companies,
hospitals, and universities. Their intractability, however, poses challenging problems
to the programmer: ad-hoc heuristics that are adequate for one problem are
often useless for others [16]; classical techniques like integer linear programming
(ILP) need considerable tuning in order to be effective; and any change in the specification
of the problem requires restarting almost from scratch. In recent years much
effort has been devoted to the development of general techniques that provide
high-level primitives to encode search heuristics. Notable examples of these
techniques are constraint programming (CP, with its various labeling
heuristics) [12] and local search (LS, with its various techniques to choose and visit
the neighborhood). We are not dealing with the problem of finding the optimum
of a problem, but with a reasonable solution to be computed in reasonable time.
The future seems to lie in combinations of these techniques, exploiting
the best aspects of each technique for the problem at hand (after tuning
the various parameters on small instances of the problem considered). This
is also witnessed by the success of the CP-AI-OR meetings [15]. In particular,
Large Neighborhood Search (LNS) can be viewed as a particular heuristic for
local search that strongly relies on a constraint solver [3], and it is a reasonable
way to blend the inference capabilities of LS and CP techniques.</p>
      <p>In this work we develop a general framework called GELATO that integrates
CP and LS techniques, defining a general LNS meta-heuristic that can be
modified, tuned, and adapted to any optimization problem with a limited
programming effort. Instead of building a new solver from scratch, we based our
framework on two existing, state-of-the-art systems: the Gecode CP environment [13]
and the LS framework EasyLocal++ [5]. We chose these two systems (among
others, such as [11,10,7]) because both of them are free and open, robust
and complete, written in C++, and have a growing community of users.</p>
      <p>We show how to model a LNS algorithm combining CP and LS in GELATO:
this modeling is easy to implement and it allows efficient computations. We test
the framework on hard Asymmetric Traveling Salesman Problem instances and
show its effectiveness w.r.t. the traditional use of a pure CP approach and a
pure LS approach.</p>
      <p>The results of this paper will be combined with those of [2] so as to obtain
a multi-language system able to model and solve combinatorial problems. This
tool will comprise three main parts: a modeling component, a translator, and
a solver. In the modeling component the user will be able to define, in a
high-level style, the problem and the instance he/she wants to solve and the algorithm
to use (CP search, possibly interleaved with LS, integer linear programming,
heuristics, or meta-heuristic phases). The translation component will handle the
compilation of the model and the meta-algorithm defined by the user into the
solver frameworks, such as Gecode or others. In the third phase, the overall compiled
program will be run on the problem instance specified by the user, and the various
solvers will interact as set by the user in the model. A side effect of our work
is a new library of search strategies for the Gecode system. GELATO provides
some primitives that allow a Gecode user to easily develop approximate search
algorithms based on LS (in particular, on LNS). On the other hand, GELATO
can also be considered a plug-in for the EasyLocal++ system, a framework
that allows the user to easily develop, test, and combine several Local Search
algorithms. With GELATO, an EasyLocal++ user can easily exploit the benefits
of CP in his/her LS algorithms.</p>
    </sec>
    <sec id="sec-2">
      <title>Preliminary concepts</title>
      <p>Discrete or continuous problems can be formalized using the concept of
Constraint Satisfaction Problem (CSP). A CSP is defined as the problem of
associating values (taken from a set of domains) to variables subject to a set of
constraints. A solution of a CSP is an assignment of values to all the variables
such that all the constraints are satisfied. In some cases not all solutions are equally
preferable, and we can associate a cost function to the variable assignments. In
these cases we talk about Constraint Optimization Problems (COPs), and we
are looking for a solution that minimizes the cost value. The solution methods
for CSPs and COPs can be split into two categories:</p>
      <p>Complete methods, which systematically explore the whole solution space in
search of a feasible (for CSPs) or an optimal (for COPs) solution.</p>
      <p>Incomplete methods, which rely on heuristics that focus only on some areas
of the search space to find a feasible solution (CSPs) or a "good" one (COPs).</p>
      <sec id="sec-2-1">
        <title>Constraint Programming basics</title>
        <p>Constraint Programming (CP) [12] is a declarative programming methodology
parametric on the constraint domain. Combinatorial problems are usually
encoded using constraints over finite domains, currently supported by all CP
systems (e.g., [13,10,11]).</p>
        <p>A CSP P is modelled as follows: a set X = {x1, …, xk} of variables; a set
D = {D1, …, Dk} of domains associated to the variables (i.e., if xi = di then
di ∈ Di); a set C of constraints (i.e., relations) over dom = D1 × ⋯ × Dk.
A tuple ⟨d1, …, dk⟩ ∈ dom satisfies a constraint C ∈ C iff ⟨d1, …, dk⟩ ∈ C. A tuple
d = ⟨d1, …, dk⟩ ∈ dom is a solution of a CSP P if d satisfies every constraint
C ∈ C. The set of solutions of P is denoted by sol(P). If sol(P) ≠ ∅, then P is
consistent. Often a CSP is associated with a function f : sol(P) → E, where ⟨E, ≤⟩
is a well-ordered set (e.g., E = ℕ or E = ℝ). A COP is a CSP with an associated
function f. A solution of this COP is a solution d ∈ sol(P) that minimizes the
function f: ∀e ∈ sol(P) (f(d) ≤ f(e)).</p>
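As a toy illustration of these definitions, the following C++ sketch enumerates dom for a two-variable COP and returns a feasible tuple minimizing f. The problem (domains {1, 2, 3}, constraint x1 ≠ x2, cost x1 + 2·x2) and all names are hypothetical and not taken from GELATO.

```cpp
#include <cassert>
#include <climits>
#include <vector>

// A toy COP: variables x1, x2 with domains {1, 2, 3}, constraint x1 != x2,
// and cost function f(d) = x1 + 2*x2.
using Tuple = std::vector<int>;

bool satisfies(const Tuple& d) { return d[0] != d[1]; }    // the constraint set C
int  cost(const Tuple& d)      { return d[0] + 2 * d[1]; } // the function f

// Enumerate dom = D1 x D2 and keep the feasible tuple minimizing f.
Tuple solveCOP() {
    Tuple best;
    int bestCost = INT_MAX;
    for (int x1 = 1; x1 <= 3; ++x1)
        for (int x2 = 1; x2 <= 3; ++x2) {
            Tuple d = {x1, x2};
            if (satisfies(d) && cost(d) < bestCost) {
                best = d;
                bestCost = cost(d);
            }
        }
    return best;  // a solution d with f(d) <= f(e) for every e in sol(P)
}
```

Complete methods for COPs refine this brute-force scheme; the point here is only the shape of the definitions: feasibility (satisfies) is checked tuple by tuple, and optimality is minimization of f over sol(P).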
        <p>This paradigm is usually based on complete methods that analyze the search
space by alternating deterministic phases (constraint propagation: values that
cannot belong to any solution are removed from the domains) and non-deterministic
phases (variable assignment: a variable is selected and a value from its domain
is assigned to it). This process is iterated until a solution is found or unsatisfiability
is reached; in the latter case the process backtracks to the last choice point
(i.e., the last variable assignment) and tries other assignments.</p>
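The alternation of propagation and assignment described above can be sketched as follows. This is a toy backtracking solver for a pairwise-difference constraint, not Gecode's actual engine, and all names are ours.

```cpp
#include <cassert>
#include <optional>
#include <set>
#include <vector>

using Domains = std::vector<std::set<int>>;

// Deterministic phase: once a variable is fixed, remove its value from the
// other domains (all variables must be pairwise different). Returns false on
// an empty domain, i.e., unsatisfiability.
bool propagate(Domains& dom) {
    bool changed = true;
    while (changed) {
        changed = false;
        for (size_t i = 0; i < dom.size(); ++i)
            if (dom[i].size() == 1)
                for (size_t j = 0; j < dom.size(); ++j)
                    if (j != i && dom[j].erase(*dom[i].begin())) changed = true;
        for (auto& d : dom)
            if (d.empty()) return false;
    }
    return true;
}

// Non-deterministic phase: pick an unfixed variable, try each value, and
// backtrack to the last choice point on failure.
std::optional<std::vector<int>> search(Domains dom) {
    if (!propagate(dom)) return std::nullopt;
    for (size_t i = 0; i < dom.size(); ++i)
        if (dom[i].size() > 1) {
            for (int v : dom[i]) {
                Domains child = dom;
                child[i] = {v};
                if (auto sol = search(child)) return sol;
            }
            return std::nullopt;  // every assignment failed
        }
    std::vector<int> sol;  // all domains are singletons: a solution
    for (auto& d : dom) sol.push_back(*d.begin());
    return sol;
}
```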
      </sec>
      <sec id="sec-2-2">
        <title>Local Search basics</title>
        <p>Local Search (LS) methods (see [1,4]) are a family of meta-heuristics to solve
CSPs and COPs, based on the notion of proximity (or neighborhood): a LS
algorithm typically moves from a solution to a nearby one, trying to improve an
objective function, and iterates this process. LS algorithms generally focus the search
only on specific areas of the search space, so they are incomplete methods, in the
sense that they do not guarantee finding a feasible (or optimal) solution;
they search non-systematically until a specific stop criterion is satisfied.</p>
        <p>To define a LS algorithm for a given COP, three entities must be defined:
the search space, the neighborhood relation, and the cost function. Given a COP
P, we associate a search space S with it, so that each element s ∈ S represents a
solution of P. An element s is a feasible solution iff it fulfills the constraints of
P. S must contain at least one feasible solution.</p>
        <p>For each element s ∈ S, a set N(s) ⊆ S is defined. The set N(s) is called
the neighborhood of s and each member s′ ∈ N(s) is called a neighbor of s. In
general N(s) is implicitly defined by referring to a set of possible moves, which
define transitions between solutions. Moves are usually defined in an intensional
fashion, as local modifications of some part of s. A cost function f, which
associates with each element s ∈ S a value f(s) ∈ E, assesses the quality of the
solution. f is used to drive the search toward good solutions and to select the
move to perform at each step of the search. For CSPs, the cost function
f is generally based on the so-called distance to feasibility, which accounts for
the number of constraints that are violated. A LS algorithm starts from an initial
solution s0 ∈ S and uses the moves associated with the neighborhood definition
to navigate the search space: at each step it makes a transition from
one solution s to one of its neighbors s′, and this process is iterated. When the
algorithm makes the transition from s to s′, we say that the corresponding move
m has been accepted. The selection of moves is based on the values of the cost
function and depends on the specific LS technique.</p>
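The generic step of such an algorithm can be sketched as below, assuming an integer search space with neighborhood N(s) = {s−1, s+1} and cost f(s) = s². The example is illustrative and not tied to any specific COP.

```cpp
#include <cassert>
#include <initializer_list>

// Cost function driving the search (illustrative).
int f(int s) { return s * s; }

// A minimal LS loop: at each step the best improving neighbor is accepted;
// the search stops at a local optimum or after maxSteps iterations.
int localSearch(int s0, int maxSteps) {
    int s = s0;
    for (int step = 0; step < maxSteps; ++step) {
        int best = s;
        for (int m : {s - 1, s + 1})       // the move set defining N(s)
            if (f(m) < f(best)) best = m;  // selection driven by the cost f
        if (best == s) break;              // local optimum: no accepted move
        s = best;                          // the move s -> s' is accepted
    }
    return s;
}
```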
        <p>Large Neighborhood Search (LNS) is a LS method that relies on a particular
definition of the neighborhood relation and of the strategy to explore the
neighborhood. Differently from traditional LS methods, an existing solution is not
modified just by making small changes to a limited number of variables (as is
typical with LS move operators); instead, a subset of the problem is selected and
searched for improving solutions. The subset of the problem can be represented
by a set FV of variables, which we call free variables, a subset of the
variables X of the problem. Defining FV corresponds to defining a neighborhood
relation.</p>
        <p>For example, a LS move could be to swap the values of two variables or, more
generally, to permute the values of a set of variables. Another possibility
for a neighborhood definition is to keep the values of some variables and to leave
the other variables totally free, constrained only by their domains.</p>
        <p>Three aspects are crucial in the definition of a LNS, w.r.t. the performance of this
technique: (1) which and (2) how many variables have to be selected (i.e., the
definition of FV), and (3) how to perform the exploration on these variables.
Let us briefly analyze these three key points, starting from the third one.</p>
        <p>(3) Given FV, the exploration can be performed with any search
technique: CP, Operations Research algorithms, and so on. We may be interested in
searching for: the best neighbor; the best neighbor within a certain
exploration timeout; the first improving neighbor; the first neighbor
improving the objective function by at least a given value; and so on. (2)
Deciding how many variables will be free (|FV|) affects the time spent on every
large-neighborhood exploration and the improvement of the objective function
at each exploration. A small FV will lead to a very efficient and fast search, but
with very little improvement of the objective function. Conversely, a big FV can
lead to a big improvement at each step, but every single exploration can take a
lot of time. This trade-off should be investigated experimentally, looking for a
size of FV that leads to sufficiently fast explorations and to good
improvements. Obviously, the choice of |FV| is strictly related to the search technique
chosen (e.g., a strong technique can manage more variables than a naive one)
and to whether a timeout is used. (1) The choice of which variables will be
included in FV is strictly related to the problem being solved: for simple and
not too structured problems we can select the variables in a naive way
(randomly, or iterating among given sets of them); for complex and well-structured
problems, we should define FV cleverly, selecting the variables that are most
likely to yield an improvement of the solution.</p>
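A minimal sketch of the naive selection mentioned in point (1): |FV| variables chosen uniformly at random out of the k variables of the problem. The function name is hypothetical.

```cpp
#include <algorithm>
#include <cassert>
#include <random>
#include <vector>

// Choose the free-variable set FV for an LNS move: n variable indices picked
// uniformly at random out of k; all other variables stay fixed.
std::vector<int> chooseFreeVars(int k, int n, std::mt19937& rng) {
    std::vector<int> idx(k);
    for (int i = 0; i < k; ++i) idx[i] = i;
    std::shuffle(idx.begin(), idx.end(), rng);
    idx.resize(n);  // the first n shuffled indices form FV
    return idx;
}
```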
      </sec>
      <sec id="sec-2-3">
        <title>Hybridization of CP and LS</title>
        <p>Two major types of approaches to combine the abilities of CP and LS are
presented in the literature [6,8]:
1. a systematic-search algorithm based on constraint programming can be
improved by inserting a LS algorithm at some point of the search procedure, e.g.:
(a) at a leaf (i.e., on complete assignments) or at an internal node (i.e., on a
partial assignment) of the search tree explored by the constraint programming
procedure, in order to improve the solution found; (b) at a node of the search
tree, to restrict the list of child nodes to explore; (c) to generate in a greedy
way a path in the search tree;
2. a LS algorithm can benefit from the support of constraint programming, e.g.:
(a) to analyze the neighborhood, discarding the neighboring solutions that
do not satisfy the constraints; (b) to explore a fragment of the neighborhood of
the current solution; (c) to define the search for the best neighboring solution
as a constrained optimization problem (COP).</p>
        <p>In these hybrid methods one of the two paradigms is the master and the
other acts as a slave, supporting the master at some point of the search
algorithm. Paradigms not based on the master-slave philosophy have also been
proposed. In [9] LS and constraint propagation are split into their components,
allowing the user to manage the different basic operators (neighborhood
exploration, constraint propagation, and variable assignment) at the same level. In [7]
CP and LS are combined in a programming language (COMET) that supports
both modeling and search abstractions, and where constraint programming is
used to describe and control LS.</p>
        <p>LNS is a LS strategy that can be naturally implemented using CP, leading
to hybrid algorithms that involve the approaches 2(a), 2(b), and 2(c) of the
above enumeration. CP can manage and exhaustively explore many variables
subject to constraints, so it is a perfect method for the exploration of large
neighborhoods.</p>
      </sec>
      <sec id="sec-2-4">
        <title>The solvers used</title>
        <p>There are several tools commonly used by the community to solve CSPs and
COPs with the CP and LS paradigms. We focused on Gecode, for the CP aspects,
and EasyLocal++, for the LS algorithms, for three main reasons: quality of
the solvers (they are both robust, complete, and efficient); guarantee of
maintenance over time (they have a growing user community and they are
actively maintained by their developers); and ease of integration (they are free, open,
and written in C++, so they can be employed as C++ libraries). Moreover,
all these characteristics ensure the possibility of easily integrating
these solvers with other C++ libraries in the future. Here we can give only a short description
of these two solvers; we refer the reader to [13] for Gecode and
[5] for EasyLocal++.</p>
        <p>Gecode It is an open, free, portable, accessible, and efficient environment for
developing constraint-based systems and applications. It is implemented in C++
and offers competitive performance w.r.t. both runtime and memory usage. It
implements many data structures, constraint definitions, and search strategies,
and also allows the user to define new ones.</p>
        <p>In the Gecode philosophy, a model is implemented using spaces. A space is the
repository for variables, constraints, the objective function, and search options. Since
C++ is an object-oriented language, the modeling approach of Gecode exploits
inheritance: a model must subclass the class Space, and the subclass
constructor implements the actual model. In addition to the constructor, a model must
implement some other functions (e.g., performing a copy of the space, returning
the objective function, . . . ).</p>
        <p>A Gecode space can be asked to perform the propagation of the constraints,
to find the first solution (or the next one) by exploring the search tree, or to find the
best solution in the whole search space.</p>
        <p>EasyLocal++ It is an object-oriented framework that allows the user to design,
implement, and test LS algorithms in an easy, fast, and flexible way. The basic idea of
EasyLocal++ (briefly, EL) is to capture the essential features of most LS
meta-heuristics and their possible compositions. This allows the user to address the
design and implementation issues of new LS heuristics in a more principled way.</p>
        <p>The framework is characterized by the inverse control mechanism: the
functions of the framework call the user-defined ones, and not the other way
around. The framework thus provides the full control structures for the invariant
part of the algorithms, and the user only supplies the problem-specific details.</p>
        <p>Modeling a problem using EL means defining the C++ classes representing
the basic concepts of a LS algorithm: e.g., the structure of a solution (or State, in
the EL language), the neighborhood (or Move), the cost function, and the strategy to
navigate the neighborhoods (NeighborhoodExplorer). Once these basic concepts
are defined (in C++ classes that inherit from EL base classes and implement
their characteristics), the user selects the desired LS algorithm and runs it. EL
provides a wide range of LS heuristics and meta-heuristics (Hill Climbing, Steepest
Descent, Tabu Search, Multi Neighborhood, . . . ), and also allows the development
of new ones.</p>
        <p>A Hybrid Solver for Large Neighborhood Search
In this section we describe the hybrid solver we have developed to implement
LNS algorithms that exploit a CP solver in the exploration of the large
neighborhoods; we called it GELATO (Gecode+EasyLocal = A Tool for Optimization).
We first define the basic elements of this tool, in a general way, without any
implementation detail. Then we explain the choices we made to integrate these
tools into a unique framework.
GELATO is made up of five main elements (see Fig. 1): a constraint model, a
move enumerator, a move performer, a CP solver, and a LS framework. The
constraint model specifies the problem we want to solve, according to the definition
given in Section 2.1. The move enumerator and the move performer define the
LNS characteristics (the definition of the neighborhood and how to explore it, as in
Section 2.2). The CP solver takes care of exploring the neighborhood. The LS framework
brings all these elements together into a LS algorithm. We briefly analyze each
element.</p>
        <p>Constraint model Here we define the variables, the domains, the constraints,
and the objective function of the problem we want to solve. At this level we specify
neither the instance input characteristics, which will be passed to the model as
a parameter at run time, nor the search options (variable and value selection,
timeout, . . . ), which will be specified by the move performer when the model
is actually solved. In GELATO, the constraint model is specified using
the Gecode language, but in general it can be expressed using any high-level
modeling language (SICStus Prolog, MiniZinc, OPL) or low-level language,
according to the CP solver that will be used. This model will actually be
instantiated and solved by the CP solver during the exploration of a large
neighborhood. It can also be used to find a good starting solution for the main
LS algorithm (hybrid approach 1(a) in the enumeration of Section 2.3).
Move Enumerator (ME) It deals with the definition of the set FV for the
LNS, specifying which variables of the constraint model will be free in the
exploration of the neighborhood. Depending on the LS framework, we can specify
several kinds of move enumerators: e.g., a random ME, which randomly selects a
specified number of variables of the problem; an iterative ME, which iterates
over all the N-combinations of the variables of the problem; a sliding ME,
which considers sliding windows of K variables (i.e., FV = {X1, …, X1+K}, then
FV = {X2, …, X2+K}, and so on).</p>
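The sliding ME just described can be sketched as follows. The enumeration below produces one free-variable window per move; the details (e.g., no wrap-around at the end of the variable vector) are our assumptions.

```cpp
#include <cassert>
#include <vector>

// Enumerate the moves of a sliding ME: the i-th move frees the window
// {X_i, ..., X_{i+k-1}} of a problem with nvars variables. Enumeration stops
// when the window would overrun the variable vector.
std::vector<std::vector<int>> slidingMoves(int nvars, int k) {
    std::vector<std::vector<int>> moves;
    for (int start = 0; start + k <= nvars; ++start) {
        std::vector<int> fv;
        for (int i = 0; i < k; ++i) fv.push_back(start + i);
        moves.push_back(fv);  // one LNS move = one free-variable window
    }
    return moves;
}
```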
        <p>Move Performer (MP) A move performer collects all the information about
how to search a large neighborhood, such as the search timeout, variable and value
selection, constraint propagation policy, branching strategies, and so on. It
instantiates the constraint model, according to the definition of the problem, the
instance, the move (given by the ME), and the CP solver. It invokes the CP
solver, passing all this information, and obtains the result of the exploration.
CP solver It is attached to a MP and must be able to recognize the constraint
model specified and to perform an exploration of the neighborhoods, with
the search parameters given by the MP. It is expected to return the best
neighbor found (according to the MP policy) and the value of the objective
function for it. In GELATO the CP solver used is Gecode but, in principle, there
is no restriction on what kind of solver to use: any solver that can be easily
interfaced with the other components could be a good choice.</p>
        <p>LS framework It defines the high-level interaction between the above
components, building up an actual LS algorithm: it navigates the search space through
the neighborhoods defined by the ME, exploring them using the MP. Any
traditional LS algorithm can be used at this step (e.g., Hill Climbing,
Steepest Descent, Tabu Search). Meta-heuristic LS algorithms are also allowed
(e.g., Multi-Neighborhood, Token Ring Search), if different kinds of ME and
MP are defined for the same problem. The LS framework used in GELATO is
EasyLocal++.</p>
        <p>Mixing Gecode and EasyLocal++
Gecode and EasyLocal++ are two frameworks that differ in many ways: they have
different aims, use different data structures, and of course implement different
algorithms. However, they have some architectural similarities: they are both
written in C++; they both have base classes for the basic concepts and ask the
user to define his/her problem by implementing these base classes; and they provide general
algorithms (for CP and LS) that during execution call and use the derived
classes developed by the user.</p>
        <p>To define a hybrid tool for LNS, the natural choice is to use the EL framework
for the main algorithm and Gecode as a module of EL that can be invoked to perform
large-neighborhood explorations.</p>
        <p>According to the architecture of Gecode and EL and to the main components
of the tool described in Section 3.1, we have defined a set of classes (described
below) that makes the integration between the two frameworks possible. Only
the first two (ELState and GEModel) must be implemented by the user (with
derived classes), because they contain the problem-specific information; the other
ones are provided by our tool, and the user only has to invoke them (but he/she
is also allowed to define new ones).</p>
        <p>ELState It is an EL state, a subclass of the base class State of the EL framework.</p>
        <p>It typically contains a vector of variables that represents a solution, and an
integer that contains the value of the objective function of the state.
GEModel It is the Gecode model, and it implements the methods required to
work with the Gecode language and the EL framework. It inherits methods and
properties from the Gecode Space class.</p>
        <p>LNSMove Base abstract class for the LNS moves of our framework. An actual
move must implement the method bool containsVar(Var), which says whether a given
Var is free w.r.t. the move. We have defined two concrete derived classes:
DistrictMove and NDifferentMove. A DistrictMove is made up of several sets of variables
and an active set: at a given time t only the variables of the active set are
free; the active set can be changed by the move enumerator. An NDifferentMove is
made up of a single set of variables and a fixed integer N: at a given time t
only N variables of the set are free (they can be chosen randomly or by iterating
over the variables in the set).</p>
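A minimal sketch of this interface: the class and method names (LNSMove, containsVar) follow the text, while the internals of the NDifferentMove-like class shown here (an explicit set of free variable indices) are an assumption.

```cpp
#include <cassert>
#include <set>

// Abstract LNS move: containsVar(v) says whether variable v is free w.r.t.
// the move (i.e., whether it belongs to FV).
struct LNSMove {
    virtual bool containsVar(int v) const = 0;
    virtual ~LNSMove() = default;
};

// An NDifferentMove-like move: the current selection of N free variables,
// stored explicitly; everything outside this set stays fixed.
struct NDifferentMove : LNSMove {
    std::set<int> freeVars;
    bool containsVar(int v) const override { return freeVars.count(v) > 0; }
};
```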
        <p>MoveEnumerator Base abstract class for MEs, which cycles over a particular
kind of LNSMove. Concrete MoveEnumerator classes must implement the
functions RandomMove(), FirstMove(), and NextMove(). RandomMove(), given a
LNSMove, randomly selects the set of free variables, according to the specific
LNSMove definition (e.g., randomly selecting an active set of a DistrictMove).
FirstMove() and NextMove() are used to cycle over all the possible moves (e.g.,
cycling over all the active neighborhoods of a DistrictMove, or over all the
N-combinations of the variables contained in an NDifferentMove). Our tool
provides a concrete MoveEnumerator for each LNSMove: a DistrictMoveME and an
NDifferentME.</p>
        <p>MovePerformer This module applies a given move to a given state, returning the
new state reached. We define the GecodeMovePerformer, which takes an ELState
and a LNSMove and, according to the defined GEModel class, builds up a Gecode
Space and solves it.</p>
        <p>LNSNeighborhoodExplorer It is the main class that EL uses for neighborhood
explorations. It wraps together a MoveEnumerator and a MovePerformer.
LNSStateManager It provides some basic functionalities of an ELState, such as
calculating the objective function of a state and finding an initial state for the
search. This last task is performed using CP: an instance of the GEModel of the
problem is instantiated and explored until a first feasible solution is reached.</p>
      </sec>
    </sec>
    <sec id="sec-3">
      <title>A simple experiment</title>
      <p>We tested GELATO on instances of growing size of the Asymmetric Traveling
Salesman Problem, taken from the TSPLib [14]. In this section we describe the
problem, the solving algorithms we used, the experiments we have performed,
and the results obtained.</p>
      <p>The Asymmetric Traveling Salesman Problem (ATSP) is defined as follows:
given a complete directed graph G = (V, E) and a function c that assigns a cost
to each directed edge (i, j), find a roundtrip of minimal total cost visiting each
node exactly once. We speak of asymmetric TSP if there exist (i, j) such that
c(i, j) ≠ c(j, i) (imagine a climbing road).</p>
      <sec id="sec-3-1">
        <title>CP Model, LNS Definition, and LS Algorithm for ATSP</title>
        <p>The CP model we used is taken from the set of examples in the Gecode package.
We used an existing model in order to highlight the possibility of using GELATO
starting from existing CP models, applying only minor modifications to them
(e.g., changing the class headings and a couple of statements in the constructor).</p>
        <p>The first state is obtained by a CP search over the same model, without any
pre-assigned values for the variables and without a timeout (because we need an
initial feasible solution to start the LS algorithm).</p>
        <p>The large neighborhood definition we have chosen is the following: given a
number N &lt; |V| and given a solution (a tour on the nodes, i.e., a permutation
of the |V| nodes), N variables are randomly selected and left free, while the other ones
remain fixed. The exploration of the neighborhood thus consists in exploring
the N free variables and is regulated by the following four parameters: (1)
the variables with the smallest domain are selected, (2) the values are chosen
randomly, (3) a maximum number of failures is set, and (4) the best solution found
before reaching that number of failures is returned.</p>
        <p>The LS algorithm used is a traditional Hill Climbing, with a parameter K.
At each step it randomly selects a set of N variables, frees them, and searches
over them for the best neighbor, until the maximum number of failures is
reached. If the best neighbor found is better than or equal to the old one, it
becomes the new current solution; otherwise (i.e., the value is worse), another
random neighborhood is selected and explored (this is called an idle iteration). The
algorithm stops when K consecutive idle iterations have been performed (i.e.,
stagnation of the algorithm has been detected). Every single large neighborhood
is explored using CP.</p>
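The scheme just described can be sketched as follows. The hypothetical exploreNeighborhood parameter stands in for the CP-based exploration of a large neighborhood; here it is a random tweak of an integer state with cost s², purely for illustration.

```cpp
#include <cassert>
#include <random>

// Stand-in for the CP exploration of a large neighborhood (illustrative).
int randomTweak(int s, std::mt19937& rng) {
    std::uniform_int_distribution<int> d(-1, 1);
    return s + d(rng);
}

// Hill Climbing with K consecutive idle iterations as the stop criterion:
// a better-or-equal candidate is accepted; a worse one counts as idle.
template <typename Explore>
int hillClimbLNS(int s, int K, Explore exploreNeighborhood, std::mt19937& rng) {
    int idle = 0;
    int cost = s * s;
    while (idle < K) {
        int candidate = exploreNeighborhood(s, rng);
        int cCost = candidate * candidate;
        if (cCost <= cost) {             // better or equal: becomes current
            if (cCost < cost) idle = 0;  // strict improvement resets stagnation
            s = candidate;
            cost = cCost;
        } else {
            ++idle;                      // worse: an idle iteration
        }
    }
    return s;                            // K consecutive idle iterations reached
}
```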
      </sec>
      <sec id="sec-3-2">
        <title>Experiments</title>
        <p>The experiments have been performed on the following instances, taken from
the TSPLib [14]: br17 (which we call instance 0, with |V| = 17), ftv33 (instance
1, |V| = 34), ftv55 (instance 2, |V| = 56), ftv70 (instance 3, |V| = 71), kro124p
(instance 4, |V| = 100), and ftv170 (instance 5, |V| = 171).</p>
        <p>The instances have been solved using: 1) a pure constraint programming
approach in Gecode, 2) a pure local search approach in EasyLocal++, and 3)
different LNS approaches encoded in GELATO.</p>
        <p>The LNS approaches differ in the number (|FV|) of variables of the
neighborhood, which is computed in proportion to the number of variables of the
instance (|V|). In particular, we tested |FV| equal to 20%, 25%, 30%, 35%,
40%, and 45% of |V|.</p>
        <p>We also tested different stop criteria for the exploration of a single large
neighborhood, setting different values for the maximum number of admitted failures:
the formula to calculate the number of failures is 2^√|FV| · Mult. Let us observe
that the search space and, consequently, the number of failures grow
exponentially w.r.t. the number of variables. The parameter Mult allows us to tune the
exponential growth of the failures. We tested all the LNS approaches with three
values of Mult: 0.5, 1, and 1.5.</p>
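Under our reading of the formula (the original typesetting is partially lost), the failure limit can be computed as below; rounding to the nearest integer is an assumption of this sketch.

```cpp
#include <cassert>
#include <cmath>

// Maximum admitted failures for one large-neighborhood exploration,
// following the formula 2^sqrt(|FV|) * Mult discussed in the text.
long maxFailures(int fvSize, double mult) {
    return std::lround(std::pow(2.0, std::sqrt(static_cast<double>(fvSize))) * mult);
}
```

For example, with |FV| = 16 and Mult = 1 this yields a limit of 16 failures, while Mult = 0.5 halves it to 8, which matches the tuning role the text assigns to Mult.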
        <p>The number K of consecutive idle iterations admitted before forcing the
termination of the global GELATO algorithm has been fixed to 50.</p>
        <p>The pure CP approach in Gecode uses the following parameters: (1) the
leftmost variable is selected, and (2) the values are chosen increasingly. We ran
it only once, since it is deterministic, with a timeout of one hour. We also tried
a first-fail strategy (the variable with the smallest domain is selected), but its
performance in this case is slightly worse than the leftmost one. This is probably
due to the regular structure of the problem.</p>
        <p>The pure LS approach in EasyLocal++ is based on a classical swap-move
definition: given a tour, two nodes are randomly selected and swapped. The
local search algorithm used is Hill Climbing, with the number of consecutive idle
iterations fixed to 500.</p>
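The swap move can be sketched as below; the names are illustrative, not EasyLocal++'s API. A tour is represented as a permutation of the node indices, and a move exchanges the nodes at two randomly chosen positions.

```cpp
#include <algorithm>
#include <random>
#include <vector>

// Sketch of the classical swap move described above: two positions of the
// tour (a permutation of the nodes) are chosen at random and exchanged.
std::vector<int> randomSwap(std::vector<int> tour, std::mt19937& rng) {
    std::uniform_int_distribution<std::size_t> pick(0, tour.size() - 1);
    std::size_t i = pick(rng);
    std::size_t j = pick(rng);
    std::swap(tour[i], tour[j]);  // i == j yields the identity move
    return tour;
}
```

Any such move preserves the permutation property of the tour, so feasibility never needs re-checking, which is what makes this neighborhood cheap to sample.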
        <p>Since LNS and pure LS computations use randomness, we repeated each of
them 20 times. During each run, we stored the values of the objective
function that correspond to improvements of the current solution, together with the
running time spent. These data have been aggregated in order to analyze the
average behavior of the different LNS and pure LS strategies on each group of
20 runs. To this aim, we performed a discretization of the data on regular time
intervals; subsequently, for each discrete interval we computed the average value
of the objective function over the 20 runs.</p>
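The aggregation step can be sketched as follows (names and representation are our own). Each run is recorded as a sorted list of (time, objective) improvement events, with an event at time 0 for the initial solution; every run is sampled on a common time grid, taking at each grid point the value in force (the last improvement seen before it), and the sampled values are averaged across runs.

```cpp
#include <utility>
#include <vector>

// A run is a time-sorted list of (time, objective) improvement events,
// assumed to start with an event at time 0 (the initial solution).
using Run = std::vector<std::pair<double, double>>;

std::vector<double> averageOnGrid(const std::vector<Run>& runs,
                                  double step, int points) {
    std::vector<double> avg(points, 0.0);
    for (const Run& r : runs) {
        std::size_t k = 0;  // index of the last event not after the grid point
        for (int t = 0; t < points; ++t) {
            double now = t * step;
            while (k + 1 < r.size() && r[k + 1].first <= now) ++k;
            avg[t] += r[k].second;  // objective value in force at time `now`
        }
    }
    for (double& v : avg) v /= runs.size();
    return avg;
}
```

This step-function sampling is what lets runs with improvements at different times be compared on the same axis.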
      </sec>
      <sec id="sec-3-3">
        <title>Results</title>
        <p>In Figure 1 we show the values of the best solutions obtained by the three
methods for each instance, as a percentage of the value of the best known solutions,
taken from the TSPLib [14]. For CP we report the solution reached in 1 hour of
computation; for GELATO we chose the best solutions obtained with |FV| =
45% of |V| and Mult = 1.5, which turned out experimentally to be the parameter
values that behave best on average. For LS and GELATO we
report the best solution obtained over the 20 runs.</p>
        <p>Figure 2 shows the behavior of the different methods on instance 5, the most
difficult one (the behavior on the other instances is similar). The GELATO
settings differ in |FV| (20, 25, 30, 35, 40, and 45 per cent of |V|), while the Mult
parameter is set to 1.5.</p>
        <p>Figure 3 shows the behavior of GELATO (on instance 5, with |FV| = 35% of
|V|) for different values of the parameter Mult.</p>
        <p>[Figure 1: Global Comparison. Best solutions found by CP, EL (EasyLocal++),
and GELATO on instances 0-5, as a percentage of the best known solution.]</p>
        <p>Let us add some considerations about the results obtained, which can be
summarized as follows: when the instance size grows, GELATO definitely outperforms
Gecode and EasyLocal++ (if we do not necessarily need to find the optimum
solution).</p>
        <p>For the two small instances (instances 0 and 1), Gecode is able to find the
optimum, because the search space is rather small. From instance 2 to 5, the
difficulty of CP in optimizing over large search spaces in reasonable time emerges,
and LNS gives better results. The pure LS approach is the weakest one, always
outperformed by the others. We point out that the solutions found with GELATO
are always between 95% and 100% of the best known solutions.</p>
        <p>In Figure 2 we show the behavior of the different algorithms on instance 5,
which is representative of all the big instances: in the first seconds of the
execution, CP finds some improving solutions, but from that moment on it starts an
exhaustive exploration of the search space without finding any other significant
improving solution (the line of the objective function becomes almost horizontal).
On the other hand, the LNS algorithms have a more regular trend: the objective
function is steadily improved over time, until a local minimum is found
and the algorithm is stopped (after 50 idle iterations). LS ends the search
in a few seconds, without any significant improvement.</p>
        <p>Some comparison between the different LNS parameters (number of variables
involved and number of failures until stop) can also be made: small LNS ("small"
stands for LNS with few free variables) is very fast, and can make very big
improvements in a few seconds. Larger LNS is slower (it has bigger spaces
to analyze), but it can reach better solutions: in fact it optimizes a larger
number of variables, so its exploration of the search space is more "global";
it can also happen that a neighborhood is too big to allow CP to perform
an effective exploration. This trade-off must be investigated experimentally: e.g.,
from our tests it turns out that for instance 5, neighborhoods with |FV| = 35%
of |V| are the best ones, while those of 40% and 45% are too big, and thus
ineffective (see Figure 2).</p>
        <p>A possible meta-heuristic that emerges from these considerations could be
the following:
1. start with a small LNS (in this way we try to get the best improvement in the
shortest time);
2. when no better solutions can be found, increase the neighborhood
dimension, then launch Large Neighborhood Search again;
3. iterate step 2 until the neighborhood dimension increases to intractable/ineffective
sizes.</p>
        <p>This Multi-Neighborhood meta-heuristic should find large improving
solutions in a very short time (which can be quickly output to the user), and then run
progressively deeper searches (even if more time-expensive).</p>
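The three-step schedule above can be sketched as follows. `runLNS` is a placeholder for a full GELATO LNS run with a given neighborhood size |FV|, returning the best objective value reached before stagnating; all names are illustrative. For simplicity each phase restarts from scratch, whereas a real implementation would warm-start each phase from the incumbent solution.

```cpp
#include <vector>

// Sketch of the Multi-Neighborhood meta-heuristic outlined above:
// run LNS with a growing neighborhood size until the size becomes
// intractable/ineffective.
template <typename RunLNS>
double multiNeighborhood(RunLNS runLNS, int nVars,
                         int startPct, int stepPct, int maxPct) {
    double best = runLNS(nVars * startPct / 100);  // 1. small LNS first
    for (int pct = startPct + stepPct; pct <= maxPct; pct += stepPct) {
        double cost = runLNS(nVars * pct / 100);   // 2. enlarge and relaunch
        if (cost < best) best = cost;              // keep the best so far
    }
    return best;  // 3. stop once the size reaches maxPct
}
```

The early, small phases deliver quick improvements that can be reported immediately, while the later, larger phases spend more time per neighborhood in exchange for a more global exploration.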
        <p>In Figure 3 we show the best GELATO method on instance 5 (with |FV| =
35% of |V|) tuned with different stop criteria, i.e., different maximum numbers
of failures, obtained by changing the parameter Mult. It is clear that for that
instance (and we got the same result for the other ones) a bigger number of
failures gives better results.</p>
        <p>Source code and other technical details are available at
http://tabu.diegm.uniud.it/EasyLocal++/.</p>
      </sec>
    </sec>
    <sec id="sec-4">
      <title>Conclusions and Future Work</title>
      <p>In this paper we showed that, using our GELATO hybrid framework, it is
possible to combine a given CP model with a LS framework in a straightforward
way; in particular, we can use a Gecode model as a base to define every kind
of neighborhood, using the functionality provided by EasyLocal++ to execute
competitive Large Neighborhood Search algorithms. We tested GELATO on
several instances of the ATSP, and showed that the hybrid LNS
algorithms are much faster than the pure CP approach on all the non-trivial
ATSP instances, even if the LS strategy is naive (Hill Climbing with random
neighborhood selection). We also proposed a general Multi-Neighborhood
hybrid meta-heuristic that should improve the results we have obtained so far.</p>
      <p>We wish to extend the research pursued in this paper along two lines:
developing and testing new hybrid algorithms, and extending GELATO into a more
general modeling framework. First of all, we want to develop and test the
Multi-Neighborhood meta-heuristic proposed above and other hybrid algorithms (e.g., based
on Tabu Search, Steepest Descent, . . . ). We will implement these algorithms
using GELATO and test them on a set of benchmark problems more structured
than the ATSP, also trying new problem-specific neighborhood definitions.</p>
      <p>Concerning the second research line, we want to simplify the class hierarchy
of GELATO (some classes and functionalities must be added, others need
to be cleaned up and refactored). Once GELATO has a simpler and clearer
interface, it will be integrated into a more general modeling tool to easily model
and solve CSPs and COPs, which we already presented in [2]. In this tool, the user
will be able to model a problem in a high-level language (e.g., Prolog, MiniZinc,
OPL) and specify the meta-heuristic to be used to solve the problem; the
model and the meta-algorithm defined will be automatically compiled into the
solver languages, e.g., Gecode and EasyLocal++, since we use GELATO; in the
end, the overall compiled program will be run and the various tools will interact
in the way specified by the user in the modeling phase, to solve the instance of
the problem modeled. We have already implemented two compilers from
declarative languages to low-level solvers (the Prolog-Gecode and MiniZinc-Gecode
compilers presented in [2]). We want to extend the functionalities of the
compilers already realized and develop the modeling-translating-solving framework
described above. Once this high-level modeling framework is well tested and
reliable, developing hybrid algorithms will be flexible and straightforward, and their
executions will benefit from the use of efficient low-level solvers.</p>
      <p>9. E. Monfroy, F. Saubion, and T. Lambert. On hybridization of local search and
constraint propagation. ICLP 2004, LNCS 3132:299–313, Springer, 2004.</p>
      <p>10. N. Nethercote, P. J. Stuckey, R. Becket, S. Brand, G. J. Duck, and G. Tack.
MiniZinc: Towards a standard CP modelling language. CP 2007, LNCS 4741:529–543,
2007.</p>
      <p>11. Swedish Institute of Computer Science. SICStus Prolog.
http://www.sics.se/isl/sicstuswww/site/index.html.</p>
      <p>12. F. Rossi, P. van Beek, and T. Walsh. Handbook of Constraint Programming
(Foundations of Artificial Intelligence). Elsevier Science Inc., New York, NY, USA, 2006.</p>
      <p>13. Gecode Team. Gecode: Generic constraint development environment.
http://www.gecode.org.</p>
      <p>14. Institut für Informatik, Universität Heidelberg. TSPLIB.
http://www.iwr.uni-heidelberg.de/groups/comopt/software/TSPLIB95/.</p>
      <p>15. Various Authors. CP-AI-OR conference series. http://www.cpaior.org/.</p>
      <p>16. D. H. Wolpert and W. G. Macready. No free lunch theorems for optimization.
IEEE Transactions on Evolutionary Computation, 1(1):67–82, 1997.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          1.
          <string-name>
            <given-names>E.</given-names>
            <surname>Aarts</surname>
          </string-name>
          and
          <string-name>
            <given-names>J. K.</given-names>
            <surname>Lenstra</surname>
          </string-name>
          , editors.
          <source>Local Search in Combinatorial Optimization</source>
          . John Wiley and Sons, Chichester, UK,
          <year>1997</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          2.
          <string-name>
            <given-names>R.</given-names>
            <surname>Cipriano</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Dovier</surname>
          </string-name>
          , and
          <string-name>
            <given-names>J.</given-names>
            <surname>Mauro</surname>
          </string-name>
          .
          <article-title>Compiling and executing declarative modeling languages to Gecode</article-title>
          . ICLP2008, LNCS
          <volume>5366</volume>
          :744–
          <fpage>748</fpage>
          , Springer,
          <year>2008</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          3.
          <string-name>
            <given-names>E.</given-names>
            <surname>Danna</surname>
          </string-name>
          and
          <string-name>
            <given-names>L.</given-names>
            <surname>Perron</surname>
          </string-name>
          .
          <article-title>Structured vs. unstructured large neighborhood search</article-title>
          .
          <source>In Principles and Practice of Constraint Programming (CP 2003)</source>
          ,
          <year>2003</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          4.
          <string-name>
            <given-names>L. Di</given-names>
            <surname>Gaspero</surname>
          </string-name>
          .
          <article-title>Local Search Techniques for Scheduling Problems: Algorithms and Software Tools</article-title>
          .
          <source>PhD thesis</source>
          , Univ. di Udine, DIMI,
          <year>2003</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          5.
          <string-name>
            <given-names>L. Di</given-names>
            <surname>Gaspero</surname>
          </string-name>
          and
          <string-name>
            <given-names>A.</given-names>
            <surname>Schaerf</surname>
          </string-name>
          . EasyLocal++:
          <article-title>An object-oriented framework for exible design of local search algorithms</article-title>
          .
          <source>Software: Practice &amp; Experience</source>
          ,
          <volume>33</volume>
          (
          <issue>8</issue>
          ):
          <volume>733</volume>
          –
          <fpage>765</fpage>
          ,
          <year>July 2003</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          6.
          <string-name>
            <given-names>F.</given-names>
            <surname>Focacci</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Laburthe</surname>
          </string-name>
          ,
          and
          <string-name>
            <given-names>A.</given-names>
            <surname>Lodi</surname>
          </string-name>
          .
          <article-title>Local search and constraint programming</article-title>
          . In F. Glover and G. Kochenberger, eds, Handbook of Metaheuristics,
          <source>chapter Local Search and Constraint Programming</source>
          , pages
          <volume>369</volume>
          –
          <fpage>403</fpage>
          . Kluwer,
          <year>2003</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          7.
          <string-name>
            <given-names>P.</given-names>
            <surname>Van Hentenryck</surname>
          </string-name>
          and
          <string-name>
            <given-names>L.</given-names>
            <surname>Michel</surname>
          </string-name>
          .
          <article-title>Constraint-Based Local Search</article-title>
          . MIT Press,
          <year>2005</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          8.
          <string-name>
            <given-names>N.</given-names>
            <surname>Jussien</surname>
          </string-name>
          and
          <string-name>
            <given-names>O.</given-names>
            <surname>Lhomme</surname>
          </string-name>
          .
          <article-title>Local search with constraint propagation and conflict-based heuristics</article-title>
          .
          <source>Artificial Intelligence</source>
          ,
          <volume>139</volume>
          (
          <issue>1</issue>
          ):
          <volume>21</volume>
          –
          <fpage>45</fpage>
          ,
          <year>2002</year>
          .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>