Revisiting Algorithms for Fuzzy Concept Lattices
Domingo López-Rodríguez1,∗ , Ángel Mora1 and Manuel Ojeda-Hernández1
1 Departamento de Matemática Aplicada, Universidad de Málaga, 29071 Málaga, Spain


Abstract
A central notion in Formal Concept Analysis is the concept lattice. This lattice describes a hierarchical biclustering between the objects and attributes of a formal context, whose hierarchy is given by an order expressing the specialisation-generalisation relationship between concepts. It is a fundamental way of representing the knowledge implicit in the context. Therefore, in practice, and due to its theoretical complexity, it is necessary to define computationally efficient algorithms for its computation. In the literature, several algorithms, using different approaches, have been proposed to compute the lattice in the classical framework, where the presence of an attribute in an object is modelled as a binary value, indicating that the attribute is either present or absent. However, this framework can be extended to take into account the different degrees to which an attribute may be present in an object. Through this extension, it is possible to model fuzzy situations where an attribute is not 100% present in an object, giving flexibility to the model. In this paper, we review the best-known algorithms for the computation of the concept lattice in the binary setting and extend them to the computation of the fuzzy concept lattice, presenting the most significant differences with respect to the original binary versions. In addition, we present examples of the execution of these new versions of the algorithms.

Keywords
Formal concept analysis, Concept lattice, Graded attributes, Algorithms




1. Introduction
In recent years, Formal Concept Analysis (FCA) has reached a high degree of maturity. Only a
few decades have passed since the first works of the pioneers, R. Wille and B. Ganter [1, 2]. The
number of publications in the area keeps increasing, as well as the direct applications of FCA to
emerging and hot topics such as Social Network Analysis [3], Recommender Systems [4, 5], Medical
Diagnosis [6, 7], E-learning Systems [8, 9] and others. For a reader unfamiliar with FCA, there are
many possible references, but we highlight [10] because of the pedagogical vision of the authors.
   Within the classical FCA framework, the knowledge extracted from a binary table of data
(formal context) is essentially represented as two complementary entities: the concept lattice
and a basis of context-valid implications. An interesting research line is the one that studies
algorithms to compute both knowledge entities. The intrinsic exponential nature of the process


Published in Pablo Cordero, Ondrej Kridlo (Eds.): The 16th International Conference on Concept Lattices and Their
Applications, CLA 2022, Tallinn, Estonia, June 20–22, 2022, Proceedings, pp. 105–116.
∗ Corresponding author.
dominlopez@uma.es (D. López-Rodríguez); amora@uma.es (Á. Mora); manuojeda@uma.es (M. Ojeda-Hernández)
ORCID: 0000-0002-0172-1585 (D. López-Rodríguez); 0000-0003-4548-8030 (Á. Mora); 0000-0003-4785-6802 (M. Ojeda-Hernández)
                                       © 2022 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).
of building the lattice and the basis of implications has kept this line very active from the very
beginning. This paper focuses on this line.
   In this work, we emphasise the importance of the concept lattice: it represents an exhaustive
analysis of the closed sets according to the well-known derivation operators forming a Galois
connection. That allows us to establish a hierarchical biclustering of the relationships between
objects and attributes.
   Let us recall that the number of concepts can be exponential in the size of the input context
and the problem of determining this number is #P-complete [11]. It is therefore necessary to
develop algorithms that intelligently exploit the structure of the lattice itself and the properties
of the closure system for the efficient computation of the concept set. To solve this problem, a
wide variety of proposals have been made. The most naive one consists of exhaustively enumer-
ating all possible subsets and then checking which ones are closed. More advanced methods
use pruning strategies to avoid such complete enumeration. For example, the NextClosure
algorithm [2] uses the lectic order of intents. Others are based on determining the set of all
extents by performing a recursive search accompanied by canonicity tests that allow a more
efficient exploration. Those based on the CbO strategy [12], such as FastCbO (FCbO) [13, 14]
or InClose [15], stand out.
   The classical FCA framework is extended by considering non-binary relationships between
objects and attributes, i.e. 𝕃-fuzzy relations. The first approaches were due to Burusco
and Fuentes [16] and Bělohlávek [17]. Fuzzy FCA (FFCA) considers 𝕃-contexts ⟨𝐺, 𝑀, 𝐼⟩ where
𝐺 is a set of objects, 𝑀 is a set of attributes and 𝐼 is a fuzzy relation between 𝐺 and 𝑀. The
interpretation of 𝐼(𝑥, 𝑦) is the truth degree to which object 𝑥 has attribute 𝑦.
   For FFCA, few algorithms have been developed by adapting the classical ones to the gener-
alised framework. We highlight [18, 19] as the first approaches to the problem in this fuzzy
setting. As in the classical case, more efficient and faster approaches should be developed.
The purpose of this work is to generalise other well-known and efficient algorithms to FFCA.
We analyse their efficiency and identify the most relevant differences with respect to the
binary versions.
   The rest of the work is organised as follows. In Section 2, we present the background and
the most prominent algorithms in classical FCA and in Section 3, we show how to extend the
InClose family of algorithms to the fuzzy version. We provide an example of the execution of
these new versions in Section 4 and present our conclusions and the future lines of work in
Section 5.


2. Preliminaries and related works
In this section we give a brief summary of fuzzy set theory. Throughout the paper, the reader is
expected to be familiar with Formal Concept Analysis notions.
   Let 𝕃 = (𝐿, ∧, ∨, ⊗, →, 0, 1) be a complete residuated lattice, that is, (𝐿, ∧, ∨) is a complete
lattice with 0 and 1 being the least and the greatest elements of 𝐿, respectively; (𝐿, ⊗, 1) is a
commutative monoid, that is, ⊗ is commutative and associative and 1 is neutral with respect to ⊗;
and ⊗ and → satisfy the so-called adjointness property: for all 𝑎, 𝑏, 𝑐 ∈ 𝐿, we have that 𝑎 ⊗ 𝑏 ≤ 𝑐 iff 𝑎 ≤ 𝑏 → 𝑐.
   An 𝕃-set is a mapping 𝑋 ∶ 𝑈 → 𝐿 from the universe set 𝑈 to the truth values set 𝐿, where
𝑋 (𝑢) means the degree to which 𝑢 belongs to 𝑋. The set of 𝕃-sets on the universe 𝑈 is denoted
by 𝐿𝑈 . If 𝑈 is finite, it is common to denote fuzzy sets as 𝑋 = {𝑙1/𝑢1, 𝑙2/𝑢2, … , 𝑙𝑛/𝑢𝑛}, where each
element 𝑢𝑖 belongs to 𝑋 in degree 𝑙𝑖 . If 𝑙𝑖 = 0, the element 𝑢𝑖 might be omitted and if 𝑙𝑖 = 1 we
denote it by {𝑢𝑖 }. Operations with 𝕃-sets are defined element-wise. For instance, 𝐴 ⊗ 𝐵 ∈ 𝐿𝑈 is
defined as (𝐴 ⊗ 𝐵)(𝑢) = 𝐴(𝑢) ⊗ 𝐵(𝑢) for all 𝑢 ∈ 𝑈.
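   To fix ideas, the following minimal sketch (in R, the language of our implementation) shows one possible choice of residuated lattice, the Łukasiewicz structure, together with element-wise operations on 𝕃-sets represented as named numeric vectors. The helper names tnorm and residuum are ours and merely illustrative; any other complete residuated lattice could be plugged in instead.

    # Łukasiewicz t-norm and residuum (restricted here to the degrees actually used)
    tnorm    <- function(a, b) pmax(0, a + b - 1)   # a ⊗ b
    residuum <- function(a, b) pmin(1, 1 - a + b)   # a → b

    # Adjointness: a ⊗ b <= c  if and only if  a <= b → c
    a <- 0.5; b <- 1; c <- 0.5
    (tnorm(a, b) <= c) == (a <= residuum(b, c))     # TRUE

    # L-sets as named numeric vectors; operations are defined element-wise
    A <- c(u1 = 0.5, u2 = 1, u3 = 0)
    B <- c(u1 = 1, u2 = 0.5, u3 = 0.5)
    tnorm(A, B)                                     # (A ⊗ B)(u) = A(u) ⊗ B(u)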
   Now we turn our focus to the main features of the classical algorithms that are considered
the main background of this work. All of them start from a binary formal context. For the
pseudocode and more details, we refer the reader to the sources cited in the text.
   Although former attempts to compute the maximal rectangles of a binary relation exist [20],
the first proper approach to compute all the formal concepts in the FCA framework was the
NextClosure algorithm [2]. The goal is to compute all the intents of the formal concepts. Thus,
the search space is formed by all possible attribute subsets. The NextClosure approach differs
from brute force in that it introduces a pruning strategy when exploring the search space,
based on an order (called the lectic order) between the sets of attributes.
   NextClosure begins with the computation of the closure of the empty set. After this closure
computation, the algorithm recursively enumerates the next closed sets in the lectic order. It is
well known that the complexity is exponential in the worst case, but it is worth noting
that the algorithm has polynomial delay, 𝒪(|𝐺||𝑀|²) when searching all the intents or 𝒪(|𝐺|²|𝑀|)
when searching all the extents. This establishes an upper bound on the time needed to compute each concept.
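   As an illustration of the lectic-order mechanism, the following is a compact sketch of the classical (binary) next-closure step. It assumes the context is given as a 0/1 matrix I and represents attribute sets as logical vectors; the helper names closure and next_closure are ours.

    # Closure B'' of an attribute set B (logical vector) in a binary context I (|G| x |M|)
    closure <- function(B, I) {
      ext <- rowSums(I[, B, drop = FALSE]) == sum(B)   # objects having all attributes of B
      colSums(I[ext, , drop = FALSE]) == sum(ext)      # attributes common to those objects
    }

    # Smallest closed set lectically greater than A, or NULL if A is the lectically largest one
    next_closure <- function(A, I) {
      n <- ncol(I)
      for (i in n:1) {
        if (!A[i]) {
          B <- A
          if (i < n) B[(i + 1):n] <- FALSE             # drop all attributes after i ...
          B[i] <- TRUE                                 # ... add attribute i ...
          B <- closure(B, I)                           # ... and close the result
          if (!any(B[seq_len(i - 1)] & !A[seq_len(i - 1)]))   # canonicity: nothing new before i
            return(B)
        }
      }
      NULL
    }

    # All intents: start from closure(rep(FALSE, ncol(I)), I) and iterate next_closure()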
   Krajča et al. [13] presented a survey on algorithms for computing formal concepts. The
authors state: “The major issue of widely-used algorithms for computing formal concepts is that
some concepts are computed multiple times which brings significant overhead”. Close-by-One
(CbO) [12] proposed an interesting way to reduce this overhead and, at present, the fastest
algorithms follow the framework posed by CbO. Kuznetsov’s algorithm builds a tree whose
nodes represent computed closures, and where an edge connecting two closures 𝐴 and 𝐵 is
labelled by the attribute 𝑦 such that 𝐵 is the closure of 𝐴 ∪ {𝑦}.
   With respect to CbO, Fast Close-by-One (FCbO) [13, 14] reduces the number of concepts
that are computed multiple times by including an additional canonicity test. Currently,
FCbO is one of the main methods to compute formal concepts, with parallel versions
and several further improvements.
   In this paper, due to its simplicity of implementation, we focus on another highly efficient
algorithm named InClose [15]. It uses incremental closure and matrix searching as key features
to achieve a faster method. InClose avoids repeating the computation of closures and computes
only one closure per concept, in an incremental way. We propose fuzzy extensions of the
different versions of the InClose algorithm in the literature, and we implement and compare
them in the following sections.


3. Extension of the algorithms to the fuzzy setting
The topic of discussion in this section will be the extension of the classical FCA algorithms to
a fuzzy setting. Let 𝕃 = (𝐿, ∧, ∨, ⊗, →, 0, 1) be a complete residuated lattice. Since all our sets
of objects and attributes are finite, without loss of generality we will consider 𝐿 to be finite as
well. This is because the set {𝐼(𝑥, 𝑦)}𝑥∈𝐺,𝑦∈𝑀 is finite and we can consider 𝕃′ to be the smallest
complete residuated lattice such that {𝐼(𝑥, 𝑦)}𝑥∈𝐺,𝑦∈𝑀 ⊆ 𝕃′ ⊆ 𝕃. The complete residuated lattice
𝕃′ exists and is finite; the proof is not the goal of this paper, so it is omitted.
   In classical algorithms such as InClose2, introduced by Andrews [15], the code runs over each
attribute 𝑚 ∈ 𝑀. The difference in the fuzzy case is that we have to consider the degrees of
each attribute; hence the algorithm takes the first attribute 𝑚1 and runs through
all the different truth values in 𝐿. This traversal of the possible degrees could go either forwards
or backwards through the elements of 𝐿. The choice taken so far is to run backwards on 𝐿 in
order to avoid redundant cycles. For instance, there is no point in adding to the set 𝐵 all the
degrees of an attribute one by one, {0.1/𝑚1, 0.2/𝑚1, … , 1/𝑚1}, since, running backwards, the code would
add {1/𝑚1} = {𝑚1} directly, thus saving computation time.
   The FuzzyInClose2 algorithm is the fuzzy extension of Andrews’ InClose2. In Algorithm 1,
the procedure is initialised. Only the formal context is needed; then the auxiliary
function InClose2_ChildConcepts is called with the initial extent 𝐺, intent ∅, attribute index
0 and concept list ℂ = ∅.

 Algorithm 1: FuzzyInClose2(𝕂)
  Input: 𝕂 = (𝐺, 𝑀, 𝐼 ): A formal context with grades in 𝐿 = {0 < 𝑙1 < … < 𝑙𝑛 = 1}
  Output: 𝔹(𝕂): The concept lattice of 𝕂.
1 ℂ ∶= ∅
2 InClose2_ChildConcepts(𝐺, ∅, 0, ℂ)
3 return 𝔹(𝕂) = ℂ



   The auxiliary function used above is the one represented in Algorithm 2 below. This is the
one that actually extends the InClose2 algorithm to the fuzzy framework and performs the
recursive search. Notice how, in line 3, we make 𝑘 range from 𝑛 to 1, that is, we run through the
elements of 𝐿 in decreasing order.
Remark 1. The notation of the pseudocode may look cumbersome. It is written in set-theoretic style in
order to maximise the amount of information given to the reader. However, from the coding
point of view, the computation of the algorithms is much simpler due to the use of vector
notation. For instance, in Algorithm 4, line 5, where it reads 𝐵 ∩ {𝑚𝑗} ⊊ {𝑔/𝑚𝑗}, in vector notation
this is just 𝐵(𝑚𝑗) < 𝑔.
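   For instance, in such a vector representation (a hypothetical snippet consistent with this convention), the test reduces to a single comparison:

    # A fuzzy intent B stored as a named numeric vector of membership degrees
    B <- c(a = 1, b = 0.5, c = 0, d = 0, e = 0)
    g <- 0.5
    B["c"] < g        # the set-theoretic test  B ∩ {m_j} ⊊ {g/m_j}  for m_j = c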
   Notice that, prior to line 1, it is required that 𝐴 is an extent. Starting the algorithm with
𝐴 = 𝐺 ensures this, since 𝐺 is always an extent and all branches of the algorithm are built
as intersections of 𝐺 with attribute extents, which always yield other extents. It is also required
that 𝐵 is the part of the intent corresponding to 𝐴 accumulated over the course of the execution branch
that has led to this node; note that 𝐵 will be completed in the current level of recursion. Hence,
𝐵 ⊆ 𝐴↑ is needed. This is ensured in the first call of the algorithm, since 𝐵 = ∅, and it is preserved
through all the branches, although this is less trivial to see.
   Line 1: Initialise an empty list 𝑄.
   Line 2: Iterate over the attributes of the context, from the starting attribute 𝑦 + 1 up to
attribute |𝑀|, the number of attributes in the context.
   Line 3: Iterate over the truth values in 𝐿 backwards and...
   Line 4: ... call the current truth value 𝑔.
  Algorithm 2: InClose2_ChildConcepts(𝐴, 𝐵, 𝑦, ℂ)
   Input: 𝐴: An extent; 𝐵: The intent corresponding to 𝐴, that will be completed in this
           execution; 𝑦: index of the attribute where to start the exploration of this branch;
           ℂ: the global variable where to accumulate the computed concepts.
 1 𝑄 ∶= ∅
 2 for 𝑗 ∈ {𝑦 + 1, … , |𝑀|} do
 3     for 𝑘 ∈ {𝑛, … , 1} do
 4         𝑔 ∶= 𝑙𝑘
 5         if 𝐵 ∩ {𝑚𝑗 } ⊊ {𝑔/𝑚𝑗} then
 6             𝐶 ∶= 𝐴 ∩ {𝑔/𝑚𝑗}↓
 7             if 𝐶 ↑ ∩ {𝑚𝑗 } = {𝑔/𝑚𝑗} then
 8                 if 𝐶 = 𝐴 then
 9                      𝐵 ∶= 𝐵 ∪ {𝑔/𝑚𝑗}
10                 else
11                      if 𝐵 ∩ 𝑀𝑗 = 𝐶 ↑𝑗 then
12                           𝑄 ∶= 𝑄 ∪ {(𝐶, 𝑗, 𝑘)}


13 ℂ ∶= ℂ ∪ {(𝐴, 𝐵)}
14 for (𝐶, 𝑗, 𝑘) ∈ 𝑄 do
15     𝐷 ∶= 𝐵 ∪ {𝑙𝑘/𝑚𝑗}
16     InClose2_ChildConcepts(𝐶, 𝐷, 𝑗, ℂ)



   Line 5: Skip attribute-degree pairs already covered by 𝐵: since the truth values are traversed in
decreasing order, 𝐵(𝑚𝑗) ≥ 𝑔 means that 𝑚𝑗 already belongs to 𝐵 in degree 𝑔 or higher, so this case can be skipped.
   Line 6: Form an extent 𝐶 by intersecting the current extent 𝐴 with the extent {𝑔/𝑚𝑗}↓ of
attribute 𝑚𝑗 in degree 𝑔.
   Line 7: Skip all 𝑔 such that 𝐶↑ ∩ {𝑚𝑗} ≠ {𝑔/𝑚𝑗}. This is a partial canonicity test, because in this
iteration we are focused only on 𝑔.
   Line 8: ... if 𝐶 = 𝐴, then...
   Line 9: ... add {𝑔/𝑚𝑗} to the set 𝐵...
   Lines 10 and 11: ... else, for the canonicity test, check whether the partial intent of 𝐶 up to 𝑗 is exactly 𝐵
restricted to the first 𝑗 − 1 attributes...
   Line 12: ... and, if so, store (𝐶, 𝑗, 𝑘) in 𝑄.
   Line 13: Store the concept (𝐴, 𝐵) in the list ℂ.
   Line 14: For each extent 𝐶, attribute 𝑚𝑗 and truth value 𝑙𝑘 stored in 𝑄...
   Line 15: ... create the intent 𝐷 = 𝐵 ∪ {𝑙𝑘/𝑚𝑗}...
   Line 16: ... and call InClose2_ChildConcepts to compute the child concepts of 𝐶, starting from
attribute 𝑚𝑗 and completing the intent 𝐷.
   It is interesting to remark that, in line 7, the calculation of 𝐶↑ ∩ {𝑚𝑗} does not require the
computation of the whole intent, since it is simply

                            𝐶↑ ∩ {𝑚𝑗} = 𝐶↑(𝑚𝑗) = ⋀_{𝑜∈𝐺} (𝐶(𝑜) → 𝐼(𝑜, 𝑚𝑗)).

Similarly, the computation of the intent 𝐶↑𝑗 is not needed, since it follows from the previous
formula: the implementation only needs a loop over the attributes 𝑖 < 𝑗, computing
𝐶↑ ∩ {𝑚𝑖} using the formula above and stopping early when 𝐶↑ ∩ {𝑚𝑖} = 𝐶↑(𝑚𝑖) ≠ 𝐵(𝑚𝑖) = 𝐵 ∩ {𝑚𝑖}.
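   Putting the previous remarks together, the following is a minimal, non-optimised R sketch of Algorithms 1 and 2 in the vector notation of Remark 1. It assumes the residuum() helper sketched in Section 2, a degree matrix I (objects as rows, attributes as columns) and the increasing vector of grades L; every function name (up_attr, down_attr, canonical, fuzzy_inclose2, ...) is ours and does not correspond to the fcaR implementation used later in the paper.

    up_attr   <- function(C, j, I) min(residuum(C, I[, j]))   # C↑(m_j) = ⋀_{o∈G} C(o) → I(o, m_j)
    down_attr <- function(g, j, I) residuum(g, I[, j])        # {g/m_j}↓(o) = g → I(o, m_j)

    # Canonicity test  B ∩ M_j = C↑_j, with early stopping (see the formula above)
    canonical <- function(B, C, j, I) {
      for (i in seq_len(j - 1))
        if (up_attr(C, i, I) != B[i]) return(FALSE)
      TRUE
    }

    inclose2_child_concepts <- function(A, B, y, I, L, store) {
      Q <- list()
      for (j in seq_len(ncol(I))) {
        if (j <= y) next
        for (k in length(L):1) {
          g <- L[k]
          if (B[j] < g) {                                 # line 5:  B ∩ {m_j} ⊊ {g/m_j}
            C <- pmin(A, down_attr(g, j, I))              # line 6:  C := A ∩ {g/m_j}↓
            if (up_attr(C, j, I) == g) {                  # line 7:  partial canonicity test
              if (all(C == A)) {
                B[j] <- g                                 # line 9:  extend the current intent
              } else if (canonical(B, C, j, I)) {
                Q[[length(Q) + 1]] <- list(C = C, j = j, k = k)   # line 12: queue the child
              }
            }
          }
        }
      }
      store$concepts[[length(store$concepts) + 1]] <- list(extent = A, intent = B)  # line 13
      for (q in Q) {                                      # lines 14-16: recurse on the children
        D <- B; D[q$j] <- L[q$k]
        inclose2_child_concepts(q$C, D, q$j, I, L, store)
      }
    }

    fuzzy_inclose2 <- function(I, L) {
      store <- new.env(); store$concepts <- list()
      inclose2_child_concepts(rep(1, nrow(I)), rep(0, ncol(I)), 0, I, L, store)
      store$concepts
    }

Note that grades such as 0, 1/2 and 1 are exactly representable, so the floating-point equality tests in this sketch are safe; for arbitrary grades one would compare indices in L instead.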
   InClose2 has been proved to be correct and fairly fast, even though it performs several
redundant computations. For instance, assume that 𝐴 ∩ {𝑙𝑘/𝑚𝑗}↓ is empty. Then, for every child
extent 𝐶 ⊆ 𝐴, it follows trivially that 𝐶 ∩ {𝑙𝑘/𝑚𝑗}↓ = ∅, hence this iteration may be
skipped with no loss of information. This is taken into account in the InClose4 algorithms,
also introduced by Andrews [15]. There are two variations, called InClose4a and InClose4b.
In these algorithms, a list 𝑃 is added to the parameters in order to keep track of the empty
intersections in all the child extent iterations. This way, several calculations are skipped and,
thus, runtime is shortened.
   Next, we give some details of the differences between FuzzyInClose2 and FuzzyInClose4a.
For space reasons, only the pseudocode of FuzzyInClose4b is displayed, since it is the
fastest and the least computationally demanding.
   The main method FuzzyInClose4a only differs from the InClose2 version in that the auxiliary
function, now called InClose4a_ChildConcepts, requires an extra argument, 𝑃, which keeps
track of the empty intersections occurring in a branch of the execution and is initially
empty.
   There are subtle changes in lines 5, 11 and 12 of InClose2_ChildConcepts that turn it
into InClose4a_ChildConcepts.
   Line 5: skip all attributes that already belong to 𝐵 in degree 𝑔 or higher, or to 𝑃 in degree 𝑔 or
lower, by performing the check 𝐵 ∩ {𝑚𝑗} ⊊ {𝑔/𝑚𝑗} and (𝑃 ∩ {𝑚𝑗} = ∅ or {𝑔/𝑚𝑗} ⊊ 𝑃 ∩ {𝑚𝑗}).
   After line 10: before performing the canonicity test 𝐵 ∩ 𝑀𝑗 = 𝐶↑𝑗 of line 11 (Algorithm 2), the
fuzzy version of InClose4a checks whether the new extent is empty and updates 𝑃, the record
of empty intersections, accordingly: if 𝐶 = ∅ then 𝑃 ∶= {𝑔/𝑚𝑗} ∪ (𝑃 ∖ {𝑚𝑗}). The update of 𝑃
keeps track of the minimal degree 𝑔 for which the computation of the extent 𝐶
yields the empty set. Thus, the condition in line 5 rejects all the cases that
inherit the empty intersection.
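   Expressed over the same vector representation as the sketch above, these two changes could look as follows (a hypothetical fragment; the helper names are ours):

    # Extended line-5 test: the degree must not be covered by B, nor known (via P) to give
    # an empty extent; P[j] = 0 means that no empty intersection has been recorded for m_j
    try_degree <- function(B, P, j, g) B[j] < g && (P[j] == 0 || g < P[j])

    # Update of P when the new extent C turns out to be empty:  P := {g/m_j} ∪ (P \ {m_j})
    record_empty <- function(P, j, g) { P[j] <- g; P }

    # Example: once A ∩ {1/m_2}↓ = ∅ has been observed ...
    P <- record_empty(rep(0, 5), j = 2, g = 1)
    try_degree(B = rep(0, 5), P = P, j = 2, g = 1)     # FALSE: inherited empty intersection
    try_degree(B = rep(0, 5), P = P, j = 2, g = 0.5)   # TRUE: a lower degree must still be tried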
   Notice that the execution of FuzzyInClose4a(𝕂) computes all the extents of the given context
from 𝐺 to ∅, in case ∅ is indeed an extent.
   Another algorithm, which blends the spirit of FuzzyInClose2 with the avoidance of empty
intersections of FuzzyInClose4a, is the so-called FuzzyInClose4b.
   Introduced in classical FCA by Andrews [15], InClose4b keeps the idea of skipping empty
intersections to speed up the computation, but checks this condition earlier in the process. As can be
seen in Algorithm 4, lines 8 and 9, the filter on the extent 𝐶 being empty is applied earlier than in
FuzzyInClose4a, where this is done in lines 11 and 12. This makes it faster to discard empty
intersections, but it comes at a price: originally, some extents were not computed by this algorithm.
 Algorithm 3: FuzzyInClose4b(𝕂)
  Input: 𝕂 = (𝐺, 𝑀, 𝐼 ): A formal context with grades in 𝐿 = {0 < 𝑙1 < … < 𝑙𝑛 = 1}
  Output: 𝔹(𝕂): The concept lattice of 𝕂.
1 ℂ ∶= ∅
2 InClose4b_ChildConcepts(𝐺, ∅, 0, ∅, ℂ)
3 if 𝑀↓ = ∅ then
4     ℂ ∶= ℂ ∪ {(∅, 𝑀)}
5 return 𝔹(𝕂) = ℂ



  Algorithm 4: InClose4b_ChildConcepts(𝐴, 𝐵, 𝑦, 𝑃, ℂ)
   Input: 𝐴: An extent; 𝐵: The intent corresponding to 𝐴, that will be completed in this
           execution; 𝑦: index of the attribute where to start the exploration of this branch;
           𝑃: record for empty intersections; ℂ: the global variable where to accumulate the
           computed concepts.
 1 𝑄 ∶= ∅
 2 for 𝑗 ∈ {𝑦 + 1, … , |𝑀|} do
 3     for 𝑘 ∈ {𝑛, … , 1} do
 4         𝑔 ∶= 𝑙𝑘
 5         if 𝐵 ∩ {𝑚𝑗 } ⊊ {𝑔/𝑚𝑗} and (𝑃 ∩ {𝑚𝑗 } = ∅ or {𝑔/𝑚𝑗} ⊊ 𝑃 ∩ {𝑚𝑗 }) then
 6             𝐶 ∶= 𝐴 ∩ {𝑔/𝑚𝑗}↓
 7             if 𝐶 ↑ ∩ {𝑚𝑗 } = {𝑔/𝑚𝑗} then
 8                 if 𝐶 = ∅ then
 9                      𝑃 ∶= (𝑃∖{𝑚𝑗 }) ∪ {𝑔/𝑚𝑗}
10                 else
11                      if 𝐶 = 𝐴 then
12                           𝐵 ∶= 𝐵 ∪ {𝑔/𝑚𝑗}
13                      else
14                           if 𝐵 ∩ 𝑀𝑗 = 𝐶 ↑𝑗 then
15                                𝑄 ∶= 𝑄 ∪ {(𝐶, 𝑗, 𝑘)}


16 ℂ ∶= ℂ ∪ {(𝐴, 𝐵)}
17 for (𝐶, 𝑗, 𝑘) ∈ 𝑄 do
18     𝐷 ∶= 𝐵 ∪ {𝑙𝑘/𝑚𝑗}
19     InClose4b_ChildConcepts(𝐶, 𝐷, 𝑗, 𝑃, ℂ)



Fortunately, this occurs only once, and the missing extent is ∅, in the case that it is indeed an
extent. This is a small price to pay, and it is solved in Algorithm 3, lines 3 and 4, where a direct
check on (∅, 𝑀) being a formal concept is made and, if it is, it is stored in ℂ.
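   A sketch of this final check, under the same assumptions as before and with an illustrative function name of our own, could be:

    # Add the bottom concept (∅, M) only when ∅ is an extent, i.e. when M↓ = ∅.
    # Since M↓(o) = ⋀_{m∈M} (1 → I(o, m)) = min_m I(o, m), this simply means that
    # every object has at least one attribute in degree 0.
    add_bottom_if_needed <- function(concepts, I) {
      if (all(apply(I, 1, min) == 0))
        concepts[[length(concepts) + 1]] <- list(extent = rep(0, nrow(I)),
                                                 intent = rep(1, ncol(I)))
      concepts
    }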
4. An example
In this section, we present an example of the execution of the fuzzy versions
FuzzyInClose2, FuzzyInClose4a and FuzzyInClose4b. Due to space restrictions, we cannot
present the complete trace of the execution of these algorithms; instead, we present this trace in
graphical form, so that the different steps followed by each of the algorithms can be checked.
For this work, we have implemented the algorithms using the
functionalities provided by the fcaR package [21] in the R programming language and used
those implementations for the example in this section.
   In this example, we consider the formal context of Table 1, where the set of grades for
the attributes 𝑀 = {𝑎, 𝑏, 𝑐, 𝑑, 𝑒} is 𝐿 = {0, 1/2, 1}. The context is kept small so that the
execution graphs remain legible.

Table 1
Simple formal context for the example.
                                              𝑎      𝑏      𝑐      𝑑      𝑒
                                         O1   0      0      1/2    0      1
                                         O2   1      1/2    1/2    0      1/2
                                         O3   1/2    1/2    0      1/2    1/2
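
   For completeness, the context of Table 1 can be encoded and fed to the sketch of Section 3 as follows (here, fuzzy_inclose2 refers to our illustrative sketch, not to the fcaR implementation actually used to produce the figures):

    # The formal context of Table 1 as a degree matrix, with L = {0, 1/2, 1}
    I <- matrix(c(0,   0,   1/2, 0,   1,
                  1,   1/2, 1/2, 0,   1/2,
                  1/2, 1/2, 0,   1/2, 1/2),
                nrow = 3, byrow = TRUE,
                dimnames = list(c("O1", "O2", "O3"), c("a", "b", "c", "d", "e")))
    L <- c(0, 1/2, 1)

    concepts <- fuzzy_inclose2(I, L)   # list of (extent, intent) pairs
    length(concepts)                   # number of formal concepts of the context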




   The first of the tested versions is FuzzyInClose2. In Figure 1, we can check the order in
which the extent intersections are computed by this algorithm. Starting with ∅, the first level
of recursion (intersection of 𝐺 with all attribute extents) is carried out from left to right, in
the attribute order defined in the previous section. We use a colour code to indicate the
possible situations that can occur when checking a given node: a red box indicates that the
canonicity test has failed at that point; a light orange box indicates that the partial canonicity
test is not passed; and a grey box appears when parent and child nodes have the same extent, so
the intent of the parent is updated. Once the whole level has been checked, we start to develop
the branches in the next phase of the recursion, going through the nodes from left to right. Thus,
under the node labelled {𝑎}, we develop its branch, following exactly the same procedure
as before: we go through all the attributes/degrees starting from attribute {𝑏}, and
then develop the leftmost branch that has not been developed at that moment. Repeating the process,
we arrive at 5 levels of recursion for the computation of all concepts.
   The execution graphs for FuzzyInClose4a and FuzzyInClose4b, which take into account
the occurrence of empty intersections at each level of recursion of the algorithm, are presented
in Figure 2 below. These empty intersections are inherited downwards by the branches arising
from that level of recursion, and allow us to reduce the number of operations to be performed, as
well as the depth of the recursion. The same colour code as in Figure 1 is used, adding blue
boxes to represent iterations where an empty extent was found.
   As can be seen, there is no great difference, at least in this example, between the two proposals,
although there is a clear reduction in the computational load with respect to our baseline of
comparison, the FuzzyInClose2 method. We can see that, instead of 5 levels of recursion, in
the FuzzyInClose4 family only 4 appear.

Figure 1: Order of computations performed by the fuzzy version of InClose2.

   Due to the different characteristics of both methods, it is to be expected that in larger contexts
differences will appear and that, probably, they will be in favour of the FuzzyInClose4b variant,
since it performs a greater pruning of the graph of intersections of extents.
    Finally, we present in Table 2 a quantitative comparison of the number of operations carried
out by each algorithm. We only count computationally expensive operations: partial
canonicity tests, i.e., checks of 𝐶↑ ∩ {𝑚𝑗} = {𝑔/𝑚𝑗}; complete canonicity tests, as in the binary case,
on the equality 𝐵 ∩ 𝑀𝑗 = 𝐶↑𝑗; the number of attribute intents computed; and the number of
intersections of extents. The table shows the number of such operations for each algorithm.

Table 2
Number of computations performed by each of the algorithms for the dataset in Table 1.
                      Algorithm    Partial tests   Full tests   #Intents   #Extents
                      InClose2               49           29         97         50
                      InClose4a              41           29         89         42
                      InClose4b              33           25         74         42

   Different facts can be observed in the table. On the one hand, looking at the #Extents column,
we can see that versions 4a and 4b of the algorithm build smaller execution graphs, since the
number of intersections coincides with the number of nodes explored in the execution graph.
On the other hand, although the execution graph is the same for versions 4a and 4b, the
different strategy for exploiting empty intersections produces a clear reduction in the number
of operations to be carried out, since fewer tests, both partial and complete, are performed. Thus,
the exploration carried out by FuzzyInClose4b is more efficient and, for larger contexts, it could
be the fastest of the three algorithms.

Figure 2: Order of computations performed by the fuzzy version of InClose4a (up) and InClose4b
(down).


5. Conclusions and future work
The concept lattice is a fundamental way of representing the knowledge implicit in a formal context.
In the literature, several algorithms, using different approaches, have been proposed for the
computation of the lattice in the classical framework. In this paper, extensions of some of
these algorithms, belonging to the InClose family, to the fuzzy framework have been introduced.
After a brief explanation of the pseudocode, an example was shown to illustrate
the trace of the different approaches. Lastly, the algorithms were compared in terms of the
number of canonicity tests and the number of intents and extents computed.
   As future work, we plan to study several optimisations of the InClose4 family, taking
advantage of the structure of the degrees in 𝐿. The aim of these optimisations will
be to reduce the number of computations of both intents and extents, hence reducing the
computational cost of the algorithm.
   Furthermore, we aim to explore generalisations of other algorithms, such as the FastCbO
family, NextNeighbour or NextPriorityConcept, to the fuzzy setting, along with different
optimisations that could alleviate the greater computational cost when compared to the binary
case. We want to perform a thorough comparison between the different versions of the algo-
rithms in terms of execution time and the number of computations required to compute the
fuzzy concept lattice. We expect to incorporate the implementation of all the algorithms for lattice
computation in the fuzzy framework into the above-mentioned fcaR package.
  Another research line for future work is the extension of the provided algorithms to
compute the canonical basis of implications in this fuzzy setting. It has already been proved
that, in the binary case, CbO-like algorithms can be modified to provide such a basis. We aim to
extend the fuzzy versions of the algorithms to compute the basis.


Acknowledgments
This work has been partially supported by the Ministerio de Ciencia, Innovación y Universi-
dades [FPU19/01467, TIN2017-89023-P, PGC2018-095869-B-I00] and the Junta de Andalucía
[UMA2018‐FEDERJA‐001].


References
 [1] R. Wille, Restructuring lattice theory: An approach based on hierarchies of concepts, in:
     I. Rival (Ed.), Ordered Sets, Springer Netherlands, Dordrecht, 1982, pp. 445–470.
 [2] B. Ganter, Two basic algorithms in concept analysis, 1984. FB4-Preprint 831. Darmstadt,
     Germany: Technische Hochschule Darmstadt.
 [3] H. E. Salman, Feature-based insight for forks in social coding platforms, Information and
     Software Technology 140 (2021) 106679.
 [4] P. Cordero, M. Enciso, Á. Mora, M. Ojeda-Aciego, C. Rossi, A formal concept analysis
     approach to cooperative conversational recommendation, International Journal of Computational
     Intelligence Systems 13 (2020).
 [5] G. Chemmalar Selvi, G. G. Lakshmi Priya, Rating prediction method for item-based
     collaborative filtering recommender systems using formal concept analysis, EAI Endorsed
     Transactions on Energy Web 8 (2021).
 [6] F. Zheng, L. Cui, A lexical-based formal concept analysis method to identify missing
     concepts in the NCI thesaurus, in: Proceedings - 2020 IEEE International Conference on
     Bioinformatics and Biomedicine, BIBM 2020, 2020.
 [7] P. Cordero, M. Enciso, D. López, A. Mora, A conversational recommender system for
     diagnosis using fuzzy rules, Expert Systems with Applications 154 (2020).
 [8] U. Priss, A preliminary semiotic-conceptual analysis of a learning management system,
     in: Procedia Computer Science, volume 176, 2020.
 [9] M. Ojeda-Hernández, F. Pérez-Gámez, Á. Mora Bonilla, D. López-Rodríguez, Using logic
     to determine key items in math education, in: 15th International Conference e-Learning,
     EL 2021, 2021, pp. 62–69.
[10] B. Ganter, S. Obiedkov, Conceptual Exploration, Springer Berlin Heidelberg, 2016.
[11] S. O. Kuznetsov, Interpretation on graphs and complexity characteristics of a search for
     specific patterns, Automatic Documentation and Mathematical Linguistics 24 (1989) 37–45.
[12] S. Kuznetsov, A fast algorithm for computing all intersections of objects in a finite semi-
     lattice, Automatic Documentation and Mathematical Linguistics 27 (1993) 11–21.
[13] P. Krajca, J. Outrata, V. Vychodil, Advances in algorithms based on CbO, in: CLA, volume
     672, Citeseer, 2010, pp. 325–337.
[14] J. Outrata, A lattice-free concept lattice update algorithm based on *CbO, in: CLA, volume
     2013, Citeseer, 2013, pp. 261–274.
[15] S. Andrews, Making use of empty intersections to improve the performance of CbO-type
     algorithms, in: International Conference on Formal Concept Analysis, Springer, 2017, pp. 56–71.
[16] A. Burusco-Juandeaburre, R. Fuentes-González, The study of the L-fuzzy concept lattice,
     Mathware and Soft Computing 1 (1994) 209–218.
[17] R. Belohlavek, Fuzzy relational systems: foundations and principles, volume 20, Springer
     Science & Business Media, 2002.
[18] R. Belohlavek, Algorithms for fuzzy concept lattices, in: Int. Conf. on Recent Advances in
     Soft Computing, 2002, pp. 200–205.
[19] R. Belohlavek, B. De Baets, J. Outrata, V. Vychodil, Lindig’s algorithm for concept lattices
     over graded attributes, LNCS 4617 (2007) 156–167.
[20] E. M. Norris, An algorithm for computing the maximal rectangles in a binary relation,
     Revue Roumaine de Mathématiques Pures et Appliquées 23 (1978) 243–250.
[21] D. López-Rodríguez, A. Mora, J. Domínguez, A. Villalón, I. Johnson, fcaR: Formal Concept
     Analysis, 2020. URL: https://cran.r-project.org/package=fcaR.