=Paper=
{{Paper
|id=Vol-3812/paper11
|storemode=property
|title=Semantics-Preserving Merging of Feature Models
|pdfUrl=https://ceur-ws.org/Vol-3812/paper11.pdf
|volume=Vol-3812
|authors=Mathias Uta,Viet-Man Le,Alexander Felfernig,Damian Garber,Gottfried Schenner,Thi Ngoc Trang Tran
|dblpUrl=https://dblp.org/rec/conf/confws/UtaLFGST24
}}
==Semantics-Preserving Merging of Feature Models==
Mathias Uta¹, Viet-Man Le², Alexander Felfernig², Damian Garber², Gottfried Schenner³ and Thi Ngoc Trang Tran²

¹ Siemens Energy AG, Erlangen, Germany
² Graz University of Technology, Graz, Austria
³ Siemens AG, Vienna, Austria
Abstract
Large and globally operating enterprises can be confronted with situations where local variability models representing the constraints of individual countries and markets have to be integrated to support centralized variability management. For example, a car producer operating in the U.S. as well as the European market could be interested in having a centralized variability (feature) model representing the variability spaces of all supported markets. To achieve this goal, existing local feature models and the corresponding knowledge bases have to be integrated in such a way that the configuration spaces remain the same; for example, for the European market, we would request to support exactly the same set of car configurations that are supported by the corresponding local feature model. In this paper, we introduce an algorithmic approach that supports the merging of feature models in such a way that the semantics of the original feature models is preserved. We present our algorithm and the results of a solver performance analysis which has been conducted on the basis of real-world feature models.
Keywords
Variability Modeling, Feature Models, Model Merging, Redundancy Elimination, Configuration
ConfWS'24: 26th International Workshop on Configuration, Sep 2–3, 2024, Girona, Spain
Emails: mathias.uta@siemens-energy.com (M. Uta); vietman.le@ist.tugraz.at (V. Le); alexander.felfernig@ist.tugraz.at (A. Felfernig); dgarber@ist.tugraz.at (D. Garber); gottfried.schenner@siemens.com (G. Schenner); ttrang@ist.tugraz.at (T. N. T. Tran)
ORCID: 0000-0002-1670-7508 (M. Uta); 0000-0001-5778-975X (V. Le); 0000-0003-0108-3146 (A. Felfernig); 0000-0002-3550-8352 (T. N. T. Tran)
© 2024 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0). CEUR Workshop Proceedings (ceur-ws.org), ISSN 1613-0073.

1. Introduction

Feature models (FMs) are an intuitive way of representing commonality and variability properties of complex systems [1, 2, 3]. Specifically, in scenarios where companies are operating on a global basis, integration scenarios can arise where country- or region-specific feature models have to be integrated to support a more globalized variability management. Think about a scenario where a car producer operating in the European and the US market decides to centralize variability management activities. On the technical (feature model) level, formerly region- or country-specific models have to be integrated in a systematic fashion into one centralized variability model. In this paper, we present an algorithmic approach to integrate (merge) two different ("old") feature models (e.g., the feature model FM_USorig could denote a local feature model of a US car provider) in a semantics-preserving way where the solution (configuration) spaces of the local feature models are "transferred" to an integrated feature model which reflects exactly the same set of solutions: solutions(FM_USorig) ∪ solutions(FM_EUorig) equals solutions(FM_new). In this context, we assume that FM_USorig and FM_EUorig represent the local feature models of a globally operating car manufacturer and FM_new is the result of merging the local feature models (and related knowledge bases).

Knowledge base merging has been approached in various ways. For example, the alignment of knowledge bases is based on the idea of knowledge base integration by identifying concepts in different knowledge bases that represent the same underlying concept but are represented by different names. Knowledge base alignment is specifically performed in situations where numerous knowledge bases have to be integrated [4]. Knowledge base merging is based on a set of predefined merging operations [5, 6]; for example, consistency-based merging follows the goal of deriving a maximally consistent set of logical formulae that represents the union of the formulae of the original knowledge bases. Such integrations basically follow the idea of generating maximally satisfiable subsets (of rules) [7], i.e., sets that cannot be further extended (with original rules) without making the resulting knowledge base inconsistent.

Feature model merging [8, 9, 10, 11, 12, 13] is also in the line of the ideas of the previously mentioned approaches. Feature models can become quite large and complex [14], which makes the development and maintenance of single models a challenging task. Following the idea of separation of concerns [15], Aydin et al. [16] propose an approach to construct stakeholder-individual feature models which are then merged for the purpose of providing a unified view on the feature space. In the context
Figure 1: Example basic feature model from the automotive domain where type refers to the car type which can be (lim)ousine,
(com)bi, (cit)y, and suv. Furthermore, the car color can be (b)lack or (w)hite, the engine can be 1l, 1.5l, and 2l. Fuel can be
(d)iesel, (e)lectric, (g)asoline, and (h)ybrid, representing the supported types of fuel. Finally, a coupling unit is regarded as an
optional feature.
of such a merging process, different "issues" have to be resolved; for example, some stakeholders regard a feature as optional while others think it should be mandatory. Furthermore, depending on the given scenario, feature naming can also become an issue if no "maximum feature set" has been specified ahead of the merging process. For such scenarios, Aydin et al. [16] propose a standard merging procedure that is able to generate a reference feature model, which then serves as a basis for further discussions and decision-making.

With a similar motivation, i.e., making large feature model development easier, Acher et al. [8] propose a set of integration operations for "local" feature models which basically support the goal of integrating local models into a global one. In this context, the authors also specify feature model relationships on a logical basis; for example, one feature model FM1 is the specialization of a feature model FM2 if the configuration space of FM1 is a subset of the configuration space of FM2 (see also Thüm et al. [17, 3]). The authors also introduce a merge operation whose semantics does not support semantics preservation but requires that the result of the merging operation is equivalent to or a superset of the solution (configuration) spaces of the two original feature models, i.e., solutions(FM1) ∪ solutions(FM2) ⊆ solutions(merge(FM1, FM2)). Such a semantics of a merge operation is also considered in the contributions of Broek et al. [10], Carbonnel et al. [11], and She et al. [18].

Following the union merge semantics introduced in Schobbens et al. [12], the feature model merging approach presented in this paper focuses on the preservation of the semantics of the source feature models used as input for the merging procedure. In other words, it supports a semantics-preserving merging where the configuration space of the feature model resulting from a merging operation is exactly the union of the configuration spaces of the original feature models: solutions(FM1) ∪ solutions(FM2) = solutions(merge(FM1, FM2)), which is more restrictive compared to the union semantics introduced by Acher et al. [8].

Compared to related work on feature model semantics preservation [10, 13], our approach provides a generalization in terms of (1) supporting arbitrary constraint types (in contrast to specific feature model related constraints such as requires and incompatible) and (2) taking into account redundancy-freeness in terms of assuring that redundant constraints resulting from a merging procedure can be detected and eliminated from the feature model. In our approach, the original feature models and the resulting feature model (the result of the merging operation) are represented as constraint satisfaction problems (CSPs) [19]. To demonstrate the applicability of our approach, we present the results of a corresponding performance analysis.

The remainder of this paper is structured as follows. In Section 2, we introduce a working example consisting of simplified feature models from the automotive domain. Using this example, we discuss our algorithmic approach to semantics-preserving feature model merging in Section 3. To show the performance of our approach, we report the results of a corresponding performance evaluation (see Section 4). Finally, we conclude the paper with a discussion of existing threats to validity (Section 5) and a corresponding summary of the contributions of this paper (Section 6).

2. Example Scenario

We now introduce a simplified example of a feature model merging scenario. Our basic underlying assumption is that the original feature models are consistent, i.e., at least one solution can be identified, and also that the feature sets of the original models are the same, i.e., the differences are primarily observable in terms of the constraints defined in the individual models. In our example from the automotive domain, the original feature models are denoted as FM_USorig (the original US feature model) and FM_EUorig which denotes the
original European Union feature model. In this context, we assume that these feature models are consistent, i.e., non-void [20], meaning that at least one configuration can be identified for each of these models. Finally, we denote the resulting model (the merging result) as FM_new.

Figure 1 represents the basic feature model (i.e., configuration model [21]) that will be used as a working example in the following. This feature model represents all relevant features that can be used to define variability knowledge, i.e., we assume that the same set of features is used to represent variability knowledge in FM_EUorig and FM_USorig. Differences between the two variability models can exist in terms of constraints representing individual configuration spaces. In the following, we specify constraints that define the properties of the two original feature models FM_EUorig and FM_USorig, represented in terms of individual constraint satisfaction problems (CSPs) representing the European and the US feature model [19].¹ These CSPs are defined in terms of variables with corresponding domain definitions (e.g., type(lim,com,cit,suv) denotes the variable type with its allowed values) and a corresponding set of constraints [22].

¹ The feature name abbreviations of FM_EUorig and FM_USorig are defined in Figure 1.

Note that region is an additional variable representing contextual information, i.e., the region a generated configuration belongs to. Contexts follow the idea of separation of concerns [15] which supports a kind of decentralized modeling [23]. For example, using the context variable region, the constraint c1_us: fuel ≠ h would be expressed as c1_us: region = US → fuel ≠ h, explicitly indicating that this constraint has to hold for configurations generated on the basis of the FM_USorig CSP.

• FM_USorig: {region(US), type(lim,com,cit,suv), color(b,w), engine(1l, 1.5l, 2l), fuel(d, e, g, h), coupling(yes,no), c1_us: fuel ≠ h, c2_us: fuel = d → coupling = no, c3_us: fuel = e → color = b}
• FM_EUorig: {region(EU), type(lim,com,cit,suv), color(b,w), engine(1l, 1.5l, 2l), fuel(d, e, g, h), coupling(yes,no), c1_eu: fuel ≠ d, c2_eu: fuel = d → coupling = no, c3_eu: fuel = e → type ≠ cit}

To show the differences between the feature models FM_EUorig and FM_USorig, Table 1 provides an overview of the number of solutions supported by the original (region-specific) feature models.

Table 1
Number of consistent solutions (configurations) related to the original and contextualized feature models.

Feature model                      #configurations
FM_EUorig                          108
FM_USorig                          96
FM' = FM'_EUorig ∪ FM'_USorig      204
FM'_EUorig ∩ FM'_USorig            84

3. Merging Feature Models

In order to be able to merge the two original feature models (FM_EUorig and FM_USorig) in a semantics-preserving fashion, each constraint of the two original feature models (represented as CSPs) has to be contextualized using the context variable region.² Assuming the two regions European Union and US, our context variable could be defined as region(EU,US), denoting the variable region with the allowed values {EU, US}. More precisely, each constraint c[i]_eu (c[i]_us) of the "EU" ("US") CSP (derived from the FM_EUorig (FM_USorig) feature model) has to be translated into a contextualized representation; see the following example: c1_eu: fuel ≠ d would be translated into the corresponding contextualized form c1'_eu: region = EU → (fuel ≠ d). The resulting contextualized variants of the original knowledge bases FM_EUorig and FM_USorig are denoted as FM'_EUorig and FM'_USorig.

² In general, contexts can be represented by a set of variables (i.e., not necessarily one).

• FM'_USorig: {region(US), type(lim,com,cit,suv), color(b,w), engine(1l, 1.5l, 2l), fuel(d, e, g, h), coupling(yes,no), c1'_us: region = US → (fuel ≠ h), c2'_us: region = US → (fuel = d → coupling = no), c3'_us: region = US → (fuel = e → color = b)}
• FM'_EUorig: {region(EU), type(lim,com,cit,suv), color(b,w), engine(1l, 1.5l, 2l), fuel(d, e, g, h), coupling(yes,no), c1'_eu: region = EU → (fuel ≠ d), c2'_eu: region = EU → (fuel = d → coupling = no), c3'_eu: region = EU → (fuel = e → type ≠ cit)}

Note that the solution (configuration) spaces of the contextualized feature models FM'_EUorig and FM'_USorig are the same as those of the original ones (assuming a corresponding context setting, e.g., region = EU). Following this argumentation, solutions(FM_EUorig) ∪ solutions(FM_USorig) = solutions(FM'_EUorig ∪ FM'_USorig), which supports our goal of achieving a semantics-preserving merging of the original knowledge bases (see Table 1).

The algorithmic approach to support such a semantics-preserving merging is shown in Algorithm 1 (MergeFM), which is implemented as a Flama [24] prototype. In a first step (starting with line 6 of MergeFM), those constraints in the contextualized original knowledge bases
(in Algorithm 1 denoted as FM'_1 and FM'_2) can be decontextualized where such a contextualization is not needed (c is the decontextualized version of c'): if ¬c is consistent with FM'_1 ∪ FM'_2, there (obviously) exist solutions supporting ¬c. In such a case, the constraint must be added in contextualized fashion to the resulting knowledge base FM, since some feature model configuration (in the other knowledge base) supports ¬c. If ¬c is inconsistent with FM'_1 ∪ FM'_2, c can be added in decontextualized fashion to the resulting knowledge base FM. In a second step (starting with line 14 of Algorithm 1), each constraint of the resulting knowledge base has to be checked for redundancy: in a logical sense, a constraint c can be regarded as redundant if FM − {c} is inconsistent with ¬c, which means that the constraint does not reduce the solution space of FM and thus logically follows from the remaining constraints in FM (and can be deleted from FM).

Algorithm 1 MergeFM(FM'_1, FM'_2): FM
1: {FM'_1, FM'_2: two contextualized and consistent feature models}
2: {c': constraint c in contextualized form}
3: {FM: feature model as a result of MergeFM}
4: FM ← {};
5: FM' ← FM'_1 ∪ FM'_2;
6: for all c' ∈ FM' do
7:   if inconsistent({¬c} ∪ FM' ∪ FM) then
8:     FM ← FM ∪ {c};
9:   else
10:    FM ← FM ∪ {c'};
11:  end if
12:  FM' ← FM' − {c'};
13: end for
14: for all c ∈ FM do
15:   if inconsistent((FM − {c}) ∪ {¬c}) then
16:     FM ← FM − {c};
17:   end if
18: end for
19: return FM;

When applying Algorithm 1 to FM_USorig and FM_EUorig, the resulting knowledge base FM_new looks as follows. In the resulting knowledge base, the constraint c2'_us has been decontextualized. Also, as a result of applying Algorithm 1, constraint c2'_eu can be regarded as redundant and thus can be deleted from FM_new.³

• FM_new: {region(US,EU), type(lim,com,cit,suv), color(b,w), engine(1l, 1.5l, 2l), fuel(d, e, g, h), coupling(yes,no), c1'_us: region = US → (fuel ≠ h), c2_us: fuel = d → coupling = no, c3'_us: region = US → (fuel = e → color = b), c1'_eu: region = EU → (fuel ≠ d), c3'_eu: region = EU → (fuel = e → type ≠ cit)}

³ Alternatively, c2'_us could be deleted as a redundant constraint (instead of c2'_eu).

On the algorithmic level, the resulting knowledge base FM_new is represented in terms of a constraint satisfaction problem. One possibility of representing the integrated knowledge base as the resulting integrated feature model is depicted in Figure 2.

4. Performance Evaluation

In this section, we discuss the results of an initial performance analysis we have conducted to evaluate MergeFM (Algorithm 1).⁴ For this analysis, we applied 8 real-world variability models with varying sizes collected from the S.P.L.O.T. feature model repository [25] and the Diverso Lab's benchmark⁵ [26]. Table 4 shows the characteristics of these models (denoted as m). In order to generate "to-be-merged" feature models (FM'_1 and FM'_2) with different shares of contextualized constraints from the individual models m, we determined the needed number of relationships or cross-tree constraints. We then modified these selected relationships/cross-tree constraints by changing their type, for example, changing mandatory to optional, changing alternative to or, or changing requires to excludes. The resulting models (FM'_1 ∪ FM'_2 = FM') are represented as constraint satisfaction problems [19] that differ individually in terms of the number of constraints (#constraints) and the degree of contextualization (expressed as percentages in Tables 2 and 3). In order to take into account deviations in time measurements, we repeated each experimental setting 10 times where in each repetition cycle the constraints in the individual (contextualized) knowledge bases FM' were ordered randomly. All analyses have been conducted on an Apple M1 Pro (8 cores) computer with 16 GB RAM. For evaluation purposes, we used the Choco solver⁶ to perform the needed consistency checks.

The number of consistency checks needed for decontextualization is linear in the number of constraints in FM'. A performance evaluation of MergeFM with different knowledge base sizes and degrees of contextualized constraints in FM is depicted in Table 2. In MergeFM, the runtime (measured in terms of milliseconds needed by the constraint solver⁷ to find a solution) increases with the number of constraints in FM' and decreases with the number of contextualized constraints in

⁴ The dataset, the implementation of algorithms, and evaluation programs can be found at https://github.com/AIG-ist-tugraz/FMMerging.
⁵ https://github.com/flamapy/benchmarking
⁶ choco-solver.org
⁷ For the purposes of our evaluation we generated variability models represented as constraint satisfaction problems formulated using the Choco constraint solver (www.choco-solver.org).
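The two phases of MergeFM (decontextualization, then redundancy elimination) can be sketched in executable form. The following Python sketch is illustrative only: it replaces the Choco-based consistency checks used in the paper with brute-force enumeration, and it uses a reduced region/fuel/coupling slice of the running example (all function and variable names are our own):

```python
from itertools import product

# Reduced, illustrative slice of the running example; 'region' is the
# context variable used for contextualization.
DOMAINS = {
    "region": ["EU", "US"],
    "fuel": ["d", "e", "g", "h"],
    "coupling": ["yes", "no"],
}

def consistent(constraints):
    """Brute-force stand-in for the solver's consistency check: is there
    at least one full assignment satisfying all constraints?"""
    names = list(DOMAINS)
    for values in product(*DOMAINS.values()):
        a = dict(zip(names, values))
        if all(c(a) for c in constraints):
            return True
    return False

def merge_fm(contextualized):
    """Sketch of MergeFM: 'contextualized' maps each contextualized
    constraint c' to its decontextualized version c."""
    fm = []
    pool = list(contextualized)  # FM' = FM'_1 UNION FM'_2
    for c_ctx in list(pool):
        c = contextualized[c_ctx]
        neg_c = lambda a, c=c: not c(a)
        # Lines 6-13: decontextualize c' if no solution supports NOT c.
        if not consistent([neg_c] + pool + fm):
            fm.append(c)
        else:
            fm.append(c_ctx)
        pool.remove(c_ctx)
    # Lines 14-18: remove constraints that are redundant w.r.t. the rest.
    i = 0
    while i < len(fm):
        c = fm[i]
        neg_c = lambda a, c=c: not c(a)
        if not consistent(fm[:i] + fm[i + 1:] + [neg_c]):
            fm.pop(i)  # redundant: it follows from the remaining constraints
        else:
            i += 1
    return fm

# Example constraints (c2 is shared by both regions, as in the paper).
def c1_us(a): return a["fuel"] != "h"
def c2(a): return a["fuel"] != "d" or a["coupling"] == "no"
def c1_us_ctx(a): return a["region"] != "US" or c1_us(a)
def c2_us_ctx(a): return a["region"] != "US" or c2(a)
def c2_eu_ctx(a): return a["region"] != "EU" or c2(a)

merged = merge_fm({c1_us_ctx: c1_us, c2_us_ctx: c2, c2_eu_ctx: c2})
# c1'_us stays contextualized; the shared c2 is decontextualized once,
# and the second (now redundant) copy is eliminated.
```

As footnote 3 notes for c2'_eu versus c2'_us, which of two equivalent redundant constraints is eliminated depends on the processing order.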
Figure 2: Example integrated feature model derived from FM_new. This model includes contextual information (the region) represented as feature(s). Simple contextualized constraints such as c1'_us: region = US → (fuel ≠ h) are translated directly into a corresponding feature model constraint (as an excludes relationship); for the representation of more complex constraints such as c3'_eu: region = EU → (fuel = e → type ≠ cit), the corresponding feature model constraint is textually annotated with the context information (e.g., region=EU). This graphical representation of contexts in feature models follows the idea of contextual diagrams as introduced by Felfernig et al. [23].
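The intended merge semantics, solutions(FM1) ∪ solutions(FM2) = solutions(merge(FM1, FM2)), can be checked on small models by exhaustive enumeration. The sketch below is a hypothetical illustration (reduced domains, one constraint per region, names of our own choosing): it tags each local solution with its region and compares the union against the solution space of the contextualized union:

```python
from itertools import product

BASE = {"fuel": ["d", "e", "g", "h"], "coupling": ["yes", "no"]}

def solutions(constraints, domains):
    """All assignments (as sorted item tuples) satisfying every constraint."""
    names = list(domains)
    sols = set()
    for values in product(*domains.values()):
        a = dict(zip(names, values))
        if all(c(a) for c in constraints):
            sols.add(tuple(sorted(a.items())))
    return sols

# One illustrative constraint per local model.
def c1_us(a): return a["fuel"] != "h"
def c1_eu(a): return a["fuel"] != "d"

# Contextualized variants used in the merged model.
def c1_us_ctx(a): return a["region"] != "US" or c1_us(a)
def c1_eu_ctx(a): return a["region"] != "EU" or c1_eu(a)

ctx_domains = {**BASE, "region": ["EU", "US"]}

# Tag each local solution with its region, then compare with the merged model.
us = {s + (("region", "US"),) for s in solutions([c1_us], BASE)}
eu = {s + (("region", "EU"),) for s in solutions([c1_eu], BASE)}
merged = solutions([c1_us_ctx, c1_eu_ctx], ctx_domains)
assert merged == us | eu  # the merged space is exactly the union
```

Because each local solution carries a distinct region value, the two tagged solution sets are disjoint and their union matches the contextualized model exactly.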
Table 2
Avg. runtime (in seconds) of MergeFM measured with different configuration knowledge base sizes (of FM'_1 and FM'_2) and shares of related contextualized constraints (10-50% contextualization).
feature model (m)  #constraints(m)  10%  20%  30%  40%  50%
IDE 13 0.008 0.007 0.007 0.006 0.006
Arcade 66 0.060 0.056 0.054 0.052 0.054
FQA 101 2.560 2.341 2.794 2.812 3.684
Invest 233 3.018 3.860 4.879 5.781 5.915
Win8 405 154.825 171.516 165.988 158.998 149.323
EMB 1,029 1,621 1,361 1,138 1,043 972
EA 2,670 3,810 3,870 3,899 4,023 4,032
Linux 13,972 45,641 52,711 47,516 56,536 57,034
Table 3
Avg. runtime (msec) of the merged configuration knowledge bases (FM) to calculate a configuration measured with different knowledge base sizes (of FM) and shares of contextualized constraints in FM (10-50% contextualization).
feature model (m)  #constraints(m)  10%  20%  30%  40%  50%
IDE 13 0.050 0.042 0.039 0.037 0.037
Arcade 66 0.069 0.057 0.060 0.053 0.055
FQA 101 0.072 0.069 0.071 0.072 0.079
Invest 233 4.755 2.992 2.742 2.346 2.293
Win8 405 3.832 4.058 5.385 4.695 4.413
EMB 1,029 22.034 24.190 25.029 25.603 26.980
EA 2,670 40.501 41.227 43.741 45.311 51.483
Linux 13,972 143.698 199.822 143.756 159.515 112.986
Table 4
Feature models used for MergeFM evaluation (IDE=IDE product line, Arcade=Arcade Game PL, FQA=Feature model for
Functional Quality Attributes, Invest=Feature model for Decision-making for Investments on Enterprise Information Systems,
Win8=Accessibility options provided by Windows 8 OS, EMB=EMB Toolkit, EA=EA 2468, Linux=Linux kernel version 2.6.33.3).
feature model (m)  IDE  Arcade  FQA  Invest  Win8  EMB  EA  Linux
#features 14 61 178 366 451 1,179 1,408 6,467
#hierarchical constraints 11 32 92 41 267 862 1,389 6,322
#cross-tree constraints 2 34 9 192 138 167 1,281 7,650
#CSP constraints 13 66 101 233 405 1,029 2,670 13,972
FM. The increase in efficiency can be explained by the fact that a higher degree of contextualization includes more situations where the inconsistency check in Line 7 (Algorithm 1) terminates earlier (a solution has been found) compared to situations where no solution could be found. In addition, Table 3 indicates that the performance of solution search does not differ depending on the degree of contextualization in the resulting knowledge base FM.

Consequently, integrating individual variability models can trigger the following improvements. (1) Decontextualization in FM can lead to less cognitive effort when adapting or extending knowledge bases (due to a potentially lower number of constraints [27] and a lower degree of contextualization). (2) Reducing the overall number of constraints in FM can also improve the runtime performance of the resulting integrated knowledge base.

5. Threats to Validity

We are aware that our evaluation is based on real-world feature models; however, synthesized variants thereof have been used for MergeFM evaluation purposes. Furthermore, our approach is based on the assumption that the "to-be-merged" feature models have the same set of features, i.e., we assume feature equivalence. In this context, we assume that in real-world scenarios further streamlining tasks (e.g., feature name alignment) have to be completed before MergeFM can be activated. Our basic assumption behind redundancy elimination and de-contextualization in MergeFM is that the understandability and maintainability of feature models can be improved; although this is already confirmed by related work [27], further research is needed to better understand the major impact factors that make feature models (and underlying knowledge bases) easier to understand and maintain.

6. Conclusions and Future Work

In this paper, we have introduced an approach to the consistency-based merging of variability models represented as constraint satisfaction problems. The approach helps to build semantics-preserving feature models in the sense that the solution space of the resulting knowledge base (the result of the merging process) corresponds to the union of the solution spaces of the original knowledge bases. Such an approach can be useful in the mentioned integration scenario but also in situations where different parts (representing different contexts) of a knowledge base are developed in a de-centralized fashion and integrated thereafter. Besides the preservation of the original semantics, our approach also helps to make the resulting knowledge base compact in the sense of deleting redundant constraints and not needed contextual information. The runtime performance of our approach is shown on the basis of a first performance analysis with real-world feature models. Future work will include the evaluation of our concepts with further knowledge bases and the development of alternative merging algorithms with the goal to further improve runtime performance. Furthermore, we will evaluate different alternative feature model representations that help to represent contextualized constraints; one such representation has been discussed in this paper.

References

[1] K. Kang, S. Cohen, J. Hess, W. Novak, S. Peterson, Feature-oriented domain analysis feasibility study (FODA), Technical Report, CMU/SEI-90-TR-021 (1990).
[2] K. Czarnecki, S. Helsen, U. Eisenecker, Formalizing cardinality-based feature models and their specialization, Software Process: Improvement and Practice 10 (2005) 7–29.
[3] A. Felfernig, A. Falkner, D. Benavides, Feature Models: AI-Driven Design, Analysis and Applications, SpringerBriefs in Computer Science, Springer, Cham, 2024. doi:10.1007/978-3-031-61874-1.
[4] L. Galarraga, N. Preda, F. Suchanek, Mining rules to align knowledge bases, in: Proceedings of the 2013 Workshop on Automated Knowledge Base Construction, San Francisco, CA, 2013, pp. 43–48.
[5] J. Delgrande, T. Schaub, A consistency-based framework for merging knowledge bases, Journal of Applied Logic 5 (2007) 459–477.
[6] P. Liberatore, M. Schaerf, Arbitration (or how to merge knowledge bases), IEEE Transactions on Knowledge and Data Engineering 10 (1998) 76–90.
[7] R. Reiter, A theory of diagnosis from first principles, AI Journal 23 (1987) 57–95.
[8] M. Acher, P. Collet, P. Lahire, R. France, Composing feature models, in: M. van den Brand, D. Gašević, J. Gray (Eds.), Software Language Engineering, Springer, Berlin, Heidelberg, 2010, pp. 62–81.
[9] V. Bischoff, K. Farias, L. Gonçales, J. Victória Barbosa, Integration of feature models: A systematic mapping study, Information and Software Technology 105 (2019) 209–225. doi:10.1016/j.infsof.2018.08.016.
[10] P. van den Broek, I. Galvao, J. Noppen, Merging feature models, in: 15th International Software Product Line Conference, Jeju Island, South Korea, 2010, pp. 83–90.
[11] J. Carbonnel, M. Huchard, A. Miralles, C. Nebut, Feature model composition assisted by formal concept analysis, in: 12th International Conference on Evaluation of Novel Approaches to Software Engineering, 2017, pp. 27–37. doi:10.5220/0006276600270037.
[12] P. Y. Schobbens, P. Heymans, J. C. Trigaux, Feature Diagrams: A Survey and a Formal Semantics, in: 14th IEEE International Requirements Engineering Conference (RE'06), Minneapolis/St. Paul, MN, USA, 2006, pp. 139–148. doi:10.1109/RE.2006.23.
[13] S. Segura, D. Benavides, A. Ruiz-Cortés, P. Trinidad, Automated Merging of Feature Models Using Graph Transformations, in: Generative and Transformational Techniques in Software Engineering II, volume 5235 of Lecture Notes in Computer Science, Springer, 2006, pp. 139–148. doi:10.1007/978-3-540-88643-3_15.
[14] M. Acher, H. Martin, L. Lesoil, A. Blouin, J. Jézéquel, D. E. Khelladi, O. Barais, J. Pereira, Feature Subset Selection for Learning Huge Configuration Spaces: The Case of Linux Kernel Size, in: 26th ACM International Systems and Software Product Line Conference - Volume A, ACM, 2022, pp. 85–96. doi:10.1145/3546932.3546997.
[15] B. Nuseibeh, J. Kramer, A. Finkelstein, ViewPoints: meaningful relationships are difficult!, in: 25th International Conference on Software Engineering, 2003, pp. 676–681. doi:10.1109/ICSE.2003.1201254.
[16] E. A. Aydin, H. Oguztuzun, A. H. Dogru, A. S. Karatas, Merging Multi-view Feature Models by Local Rules, in: 9th International Conference on Software Engineering Research, Management and Applications, Baltimore, MD, USA, 2011, pp. 140–147. doi:10.1109/SERA.2011.34.
[17] T. Thüm, D. Batory, C. Kästner, Reasoning about edits to feature models, in: 31st International Conference on Software Engineering, ICSE '09, IEEE Computer Society, USA, 2009, pp. 254–264. doi:10.1109/ICSE.2009.5070526.
[18] S. She, U. Ryssel, N. Andersen, A. Wąsowski, K. Czarnecki, Efficient synthesis of feature models, Information and Software Technology 56 (2014) 1122–1143. doi:10.1016/j.infsof.2014.01.012.
[19] E. Tsang, Foundations of Constraint Satisfaction, Academic Press, London, 1993.
[20] D. Benavides, S. Segura, A. Ruiz-Cortes, Automated analysis of feature models 20 years later: A literature review, Information Systems 35 (2010) 615–636.
[21] A. Felfernig, L. Hotz, C. Bagley, J. Tiihonen, Knowledge-based Configuration: From Research to Business Cases, 1st ed., Morgan Kaufmann Publishers, 2014.
[22] D. Benavides, P. Trinidad, A. Ruiz-Cortes, Using constraint programming to reason on feature models, in: 17th International Conference on Software Engineering and Knowledge Engineering (SEKE'2005), Taipei, Taiwan, 2005, pp. 677–682.
[23] A. Felfernig, D. Jannach, M. Zanker, Contextual diagrams as structuring mechanisms for designing configuration knowledge bases in UML, in: 3rd International Conference on the Unified Modeling Language (UML2000), volume 1939 of Lecture Notes in Computer Science, Springer, York, UK, 2000, pp. 240–254.
[24] J. Galindo, J. Horcas, A. Felfernig, D. Fernandez-Amoros, D. Benavides, Flama: A collaborative effort to build a new framework for the automated analysis of feature models, in: 27th ACM International Systems and Software Product Line Conference - Volume B, SPLC '23, Association for Computing Machinery, New York, NY, USA, 2023, pp. 16–19. doi:10.1145/3579028.3609008.
[25] M. Mendonca, M. Branco, D. Cowan, S.P.L.O.T.: Software Product Lines Online Tools, in: Proceedings of the 24th ACM SIGPLAN Conference Companion on Object Oriented Programming Systems Languages and Applications, OOPSLA '09, ACM, New York, NY, USA, 2009, pp. 761–762. doi:10.1145/1639950.1640002.
[26] R. Heradio, D. Fernandez-Amoros, J. A. Galindo, D. Benavides, D. Batory, Uniform and scalable sampling of highly configurable systems, Empirical Software Engineering 27 (2022) 44.
[27] A. Felfernig, M. Mandl, A. Pum, M. Schubert, Empirical knowledge engineering: cognitive aspects in the development of constraint-based recommenders, in: Proceedings of the 23rd International Conference on Industrial Engineering and Other Applications of Applied Intelligent Systems, IEA/AIE'10, Springer-Verlag, Berlin, Heidelberg, 2010, pp. 631–640.