            Ontologies for Learning Agents: Problems, Solutions and Directions

                   Bogdan Stanescu, Cristina Boicu, Gabriel Balan, Marcel Barbulescu,
                                     Mihai Boicu, Gheorghe Tecuci
                       4A5, Learning Agents Laboratory, George Mason University
                                4400 University Dr., Fairfax, VA-22030, US
                     {bstanesc, ccascava, gbalan, mbarbule, mboicu, tecuci}@gmu.edu


                         Abstract

We are developing a general end-to-end approach, called Disciple, for building and using personal problem solving and learning agents. This approach raises complex challenges related to ontology specification, import, elicitation, learning, and merging, which we have explored to various degrees as we develop successive versions of Disciple. This paper presents some of these challenges, our current solutions, and future directions that are relevant for building agents in general.

1  Introduction

The long term objective of our research is to develop the science and technology that will allow typical computer users to train and use their personal intelligent assistants. Our approach to this problem is to develop a series of increasingly more capable agents from the Disciple family of learning agent shells [Tecuci, 1998; Tecuci et al., 2002]. A Disciple agent can be initially trained by a subject matter expert and a knowledge engineer, in a way that is similar to how an expert would teach an apprentice, through problem solving examples and explanations. Once trained to a significant level of competence, copies of the agent are handed over to typical computer users. These agents then assist their users through mixed-initiative reasoning, increasing their recall, speed and accuracy, without impeding their creativity and flexibility. At the same time, the assistants continue to learn from this joint problem solving experience, adapting to their users to become better collaborators that are aware of the users' preferences, biases and assumptions.

The process of building and using such problem solving and learning agents raises complex challenges related to ontology specification, import, elicitation, learning, and merging, which we have explored to various degrees as we develop successive versions of Disciple. The goal of this paper is to present some of these challenges, our current solutions, and future directions that are relevant for building agents in general.

In the last three years, the development of the Disciple approach was driven by the attempt to find an automatic solution to the complex Center of Gravity (COG) analysis problem, in collaboration with the US Army War College. The center of gravity of a force (state, alliance, coalition or group) represents the foundation of capability, power and movement, upon which everything depends [Clausewitz, 1976]. In any conflict, a force should concentrate its effort on its enemy's center of gravity, while adequately protecting its own. As a consequence, the examples used in this paper are taken from the COG domain, but they do not require an understanding of this domain.

The rest of this paper is organized as follows. The next section discusses the use of the ontology for representation, communication, problem solving, and learning, both in general and in the context of the Disciple family. Section 3 gives an overview of the Disciple agent building methodology, stressing the ontology-related activities. Sections 4 to 7 then discuss in more detail some of our main results on ontology specification, exception-based ontology learning, example-based ontology learning, and ontology import and merging. These sections also include experimental results and plans for future research.

2  Knowledge representation for problem solving and learning

A Disciple learning agent shell includes general problem-solving and learning engines for building a knowledge base consisting of an object ontology that specifies the terms from a particular domain, and a set of problem solving rules expressed with these terms [Tecuci et al., 2002]. The problem-solving engine is based on the general task reduction paradigm. In this paradigm, a task to be performed is successively reduced to simpler tasks, by applying task reduction rules. Then the solutions of the simplest tasks are successively combined, by applying solution composition rules, until they produce the solution of the initial task.

The object ontology is a hierarchical representation of the objects and types of objects from the application domain. It represents the different kinds of objects, the properties of each object, and the relationships existing between objects. A fragment of the object ontology for the COG domain is shown in the bottom part of Figure 1.
[Figure 1: Ontology based rule learning. The left side shows a specific task reduction example (analyzing the will_of_the_people_of_Caribbean_States_Union as a potential strategic_COG_candidate of the OECS_Coalition with respect to the people_of_Caribbean_States_Union). The right side shows the informal structure of the learned rule and its formal structure, with Plausible Upper Bound and Plausible Lower Bound conditions over the variables ?O1-?O4. The bottom left shows a fragment of the object ontology for the COG domain (agent, force, multi_member_force, single_member_force, multi_state_alliance, dominant_partner_multi_state_alliance, single_state_force, people, will_of_agent, will_of_people, and related instances).]
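To make the task reduction paradigm described above more concrete, here is a minimal Python sketch. It is an illustration only, with hypothetical names; for brevity each reduction rule carries its own composition step, whereas Disciple pairs reduction rules with separate solution composition rules.

    # Minimal sketch of the task reduction paradigm (hypothetical names, not Disciple code).
    from dataclasses import dataclass
    from typing import Callable, List

    @dataclass
    class ReductionRule:
        applies_to: Callable[[str], bool]     # does the rule's IF task match this task?
        reduce: Callable[[str], List[str]]    # reduce the task to simpler subtasks
        compose: Callable[[List[str]], str]   # compose the subtask solutions

    def solve(task: str, rules: List[ReductionRule],
              solve_elementary: Callable[[str], str]) -> str:
        """Successively reduce a task to simpler subtasks and compose their solutions."""
        for rule in rules:
            if rule.applies_to(task):
                subtasks = rule.reduce(task)
                sub_solutions = [solve(t, rules, solve_elementary) for t in subtasks]
                return rule.compose(sub_solutions)
        return solve_elementary(task)         # no applicable reduction: the task is elementary

    # Hypothetical usage: an "analyze" task is reduced to a single legitimacy check.
    rules = [ReductionRule(
        applies_to=lambda t: t.startswith("analyze "),
        reduce=lambda t: ["assess legitimacy of " + t[len("analyze "):]],
        compose=lambda solutions: solutions[0])]
    print(solve("analyze a strategic COG candidate", rules, lambda t: t + ": assessed"))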
The reduction rules are IF-THEN structures that express how and under what conditions a certain type of task may be reduced to simpler subtasks. The reduction rules are paired with IF-THEN composition rules that express how and under what conditions the solutions of the subtasks may be composed into the solution of the task. An example of a simple task reduction rule is shown in the right hand side of Figure 1. In this case the IF task is reduced directly to its solution.

The learning engines use several strategies to learn the rules and to refine the object ontology. At the basis of the learning methods are the notion of plausible version space [Tecuci, 1998; Boicu, 2002] and the use of the object ontology as an incomplete and partially incorrect generalization hierarchy for learning.

A plausible version space is an approximate representation of a partially learned concept, as illustrated in Figure 2. The partially learned concept is represented by a plausible upper bound concept which, as an approximation, is more general than the concept Eh to be learned, and by a plausible lower bound concept which, again as an approximation, is less general than Eh. During learning, the two bounds (which are first order logical expressions) converge toward one another through successive generalizations and specializations, approximating Eh better and better.

[Figure 2: A representation of a plausible version space. In the universe of instances, the concept Eh to be learned lies between the plausible lower bound and the plausible upper bound.]

The partially learned knowledge pieces from the knowledge base of Disciple are represented with plausible version spaces. Notice, for example, that the IF-THEN rule from the bottom right part of Figure 1 does not have a single applicability condition but two conditions (the Plausible Lower Bound Condition and the Plausible Upper Bound Condition) that define the plausible version space of the exact condition of the rule. Similarly, each partially learned feature F from the object ontology has its domain and range represented as plausible version spaces. The domain to be learned of the feature F is a concept that represents the set of objects that could have the feature F. Similarly, the range to be learned is a concept that represents the set of possible values of F.
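The following single-variable sketch conveys the idea of the two bounds. It assumes a tree-shaped hierarchy of concept names; the ontology fragment below merely echoes Figure 1 and is not Disciple's actual representation, whose bounds are first order logical expressions.

    # Single-variable sketch of a plausible version space over an is-a hierarchy
    # (simplified illustration; concept names echo Figure 1).
    PARENT = {
        "single_state_force": "single_member_force",
        "single_member_force": "force",
        "dominant_partner_multi_state_alliance": "multi_state_alliance",
        "multi_state_alliance": "multi_member_force",
        "multi_member_force": "force",
        "force": "agent",
    }

    def generalizations(concept):
        """The concept followed by its parent, grandparent, ... (most specific first)."""
        chain = [concept]
        while concept in PARENT:
            concept = PARENT[concept]
            chain.append(concept)
        return chain

    class PlausibleVersionSpace:
        def __init__(self, first_positive):
            self.lower = first_positive                       # plausible lower bound
            self.upper = generalizations(first_positive)[-1]  # plausible upper bound

        def add_positive(self, concept):
            # minimally generalize the lower bound so that it also covers the new positive
            common = [g for g in generalizations(self.lower) if g in generalizations(concept)]
            self.lower = common[0]

        def add_negative(self, concept):
            # specialize the upper bound to the most general concept that still covers
            # the positives (it generalizes the lower bound) but does not cover the negative
            for g in generalizations(self.lower):
                if g not in generalizations(concept):
                    self.upper = g
            # if no such concept exists, the negative example remains a negative exception

    vs = PlausibleVersionSpace("single_state_force")
    vs.add_negative("dominant_partner_multi_state_alliance")
    print(vs.lower, "<=", vs.upper)   # single_state_force <= single_member_force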
The object ontology plays a crucial role in Disciple, being at the basis of user-agent communication, problem solving, knowledge acquisition and learning. First of all, the object ontology provides the basic representational constituents for all the elements of the knowledge base. When an expert teaches a Disciple agent, the expert expresses his or her reasoning process in natural language, as illustrated by the task reduction example in the upper left side of Figure 1. The top task is the task to be reduced. In order to reduce this task the expert asks a relevant question. The answer to this question leads to the reduction of this task to a solution. As the expert types these expressions in natural language, the agent interacts with him or her to replace certain phrases with the ontology terms they designate (e.g. "will of the people of Caribbean States Union" or "strategic COG candidate"). The recognition of these terms facilitates the understanding of the expert's phrases and the learning of a general rule from this specific example. The learned rule has an informal structure (shown in the top right part of Figure 1) and a formal structure (shown in the bottom right part of Figure 1). The informal structure preserves the natural language of the expert and is used in agent-user communication. The formal structure is used in the actual reasoning of the agent. Notice that the two plausible version space conditions from the formal structure are expressed with the terms from the object ontology. The formal tasks and their features are also part of the task ontology and the feature ontology, respectively.

As mentioned above, the object ontology has a fundamental role in learning, being used as a generalization hierarchy. Indeed, notice that the specific instances from the example ("will of the people of Caribbean States Union", "OECS Coalition", "people of Caribbean States Union") are replaced in the learned rule with more general concepts from the object ontology ("will of agent", "multi member force", "people"), together with their relationships.
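The replacement of instances by variables constrained with ontology concepts can be pictured as in the sketch below. It is a simplification in which the instance-to-concept links are written down directly (in Disciple they come from the object ontology), and the concepts shown echo the plausible upper bound condition of the rule in Figure 1.

    # Sketch of variablizing a specific example using concepts from the object ontology
    # (hypothetical, simplified instance-of links).
    INSTANCE_OF = {
        "OECS_Coalition": "multi_member_force",
        "will_of_the_people_of_Caribbean_States_Union": "will_of_agent",
        "people_of_Caribbean_States_Union": "people",
    }

    def variablize(instances):
        """Return a variable for each instance and the concept constraining that variable."""
        bindings, constraints = {}, {}
        for i, instance in enumerate(instances, start=1):
            variable = "?O{}".format(i)
            bindings[instance] = variable
            constraints[variable] = INSTANCE_OF[instance]
        return bindings, constraints

    bindings, constraints = variablize(list(INSTANCE_OF))
    print(bindings)      # e.g. {'OECS_Coalition': '?O1', ...}
    print(constraints)   # e.g. {'?O1': 'multi_member_force', ...}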
While the corresponding learning algorithm is presented in [Boicu et al., 2000; Boicu, 2002], it is important to stress here that the agent's generalization hierarchy (the object ontology) is itself evolving during learning (as discussed in sections 4, 5, and 6). Therefore Disciple addresses the complex and more realistic problem of learning in the context of an evolving representation language. The next section gives an overview of the agent building methodology, stressing the ontology-related activities.
3  Agent building methodology

The Disciple learning agent shell can be used to rapidly develop a Disciple agent for a specific application domain, by following the steps from Figure 3. There are two main phases in this process: the development of an initial object ontology and the teaching of the agent. The first phase has to be performed jointly by a knowledge engineer and a subject matter expert. The second phase may be performed primarily by the subject matter expert, with limited assistance from a knowledge engineer.

[Figure 3: Main agent development processes: domain analysis and ontology specification; ontology import and development; scenario specification; modeling the problem solving process; ontology learning, mixed-initiative problem solving, and rules learning; ontology refinement and rules refinement; exception based KB refinement.]

During domain analysis and ontology specification, the knowledge engineer works with the subject matter expert to develop an initial model of how the expert solves problems, based on the task reduction paradigm. The model also identifies the object concepts that need to be represented in Disciple's ontology so that it can perform this type of reasoning. These object concepts represent a specification of the ontology needed for reasoning.

During ontology import and development, this specification guides the process of importing ontological knowledge from existing knowledge repositories, such as CYC [Lenat, 1995], as discussed in section 7. However, not all the necessary terms will be found in external repositories, and therefore the knowledge engineer and the subject matter expert will also have to extend the imported ontology using the ontology development tools of Disciple. For instance, Figure 4 shows the interfaces of three different ontology browsers of Disciple: the association browser (which displays an object and its relationships with other objects), the tree browser (which displays the hierarchical relationships between the objects in a tree structure), and the graphical browser (which displays the hierarchical relationships between the objects in a graph structure).

[Figure 4: Association, tree, and hierarchical browsers.]

Once the object ontology is developed, the knowledge engineer has to define elicitation scripts using the Script Editor of Disciple. The elicitation scripts will be executed by the Scenario Elicitation tool, guiding the user of Disciple to define a specific scenario or problem solving situation (e.g. the current war on terror, including the characteristics of the participating forces, such as the US and Al Qaeda). This process will be described in more detail in section 4. The result of this initial KB development phase is an object ontology with instances characterizing a specific scenario.

In the next major phase, the subject matter expert will use the current scenario to teach Disciple how to solve problems (e.g. how to determine the centers of gravity of the opposing forces in the current war on terror).

First, the expert will interact with the Modeling advisor tool of Disciple. This tool assists the expert in expressing his or her reasoning process in English, using the task reduction paradigm. The result of this process will be task reduction steps like the one from the upper left part of Figure 1. These steps may also include new terms that are not yet present in the object ontology of Disciple. Each such term is an example for learning a general concept or a general feature using the ontology learning method discussed in section 6. Also, each specific reasoning step formulated with the Modeling advisor is an example from which a general rule is learned using the Rule Learning tool. An example of such a rule is presented in the right hand side of Figure 1.

As Disciple learns more rules, the interaction with the subject matter expert evolves from a teacher-student type of interaction to an interaction where both collaborate in solving a problem. This interaction is governed by the mixed-initiative problem solving tool. In this case, Disciple uses the partially learned rules to propose solutions to the current problems, and the expert's feedback is used by the Rule Refinement tool and the Ontology Refinement tool to improve both the rules and the ontology elements involved in the rules' applications.

There is no fixed sequence of tool invocations. Instead, they are used opportunistically, based on the current problem solving situation. For example, while the expert and Disciple are performing mixed-initiative problem solving, the expert may need to define a new reduction that requires modeling, rule learning and rule refinement.

Because the rule learning and refinement processes take place in the context of an incomplete and partially incorrect object ontology, some of the learned rules may accumulate exceptions. In such a case, the exception-based KB refinement tool may be invoked to extend or correct the object ontology and to correspondingly refine the rules. This process will be presented in section 5.

Because one of the goals of this research is the rapid development of knowledge bases, the Disciple shell also includes tools to merge the ontologies and the rules developed in parallel by several subject matter experts. Section 7 discusses this issue in more detail.

In the last three years we have performed extensive experiments with Disciple at the US Army War College, where it is used in two courses, Case Studies in Center of Gravity Analysis (the COG course) and Military Applications of Artificial Intelligence (the MAAI course). In the COG course, Disciple is used as an assistant, trained by the instructor, that helps the students to perform a COG analysis of a scenario and to generate an analysis report. Over 95% of the students from the 2002 Terms II and III sessions of this course agreed with the following statement: "Disciple helped me to learn how to perform a strategic center of gravity analysis of a scenario." In the follow-on MAAI course, the students taught personal Disciple agents their own expertise in COG analysis. After the experiments conducted in Spring 2001 and Spring 2002, 19 of the 25 students agreed (and 6 were neutral) with the statement: "I think that a subject matter expert can use Disciple to build an agent, with limited assistance from a knowledge engineer."

The following sections provide more details on some of the most important ontology-related processes of the Disciple agent development methodology, as well as results from the above experiments.
4  Scenario specification

As part of the initial ontology development, the knowledge engineer uses the Script Editor to define elicitation scripts that specify how to elicit the description of a scenario from the user. These scripts are associated with the concepts and features from the ontology. Each script has a name and a list of arguments, and it specifies how to display the dialog with the user, the questions to ask the user, how to store the answers in the ontology, and what other scripts to call. Table 1 shows the script "elicit government type" associated with the concept "state government".

Table 1: Sample elicitation script.

  Script: state_government.elicit government type
  Arguments: <...>, <...>
  Control: single-selection-list
      Question: What type of government does <...> have?
      Answer variable: <...>
      Possible values: the elementary subconcepts of state_government
      Allow adding new subconcepts: Yes
  Ontology actions: <...> instance-of <...>
  Script call: <...>.elicit properties
      Arguments: <...>

The elicitation scripts are executed by the Scenario Elicitation tool. As illustrated in Figure 5, the left hand side of the Scenario Elicitation interface displays a table of contents. When the expert clicks on one of these titles, questions that elicit the corresponding description are displayed in the right hand side of the screen. The use of the elicitation scripts allows a knowledge engineer to rapidly build a customized interface for a Disciple agent, thus effectively transforming this software development task into a knowledge engineering one.
[Figure 5: Execution of the elicitation script from Table 1.]
The Protégé system [Noy et al., 2000] has a similar capability of using elicitation scripts to acquire instances of concepts. However, Disciple extends Protégé in several directions. In Disciple the expert does not need to see or understand the object ontology in order to answer the questions and describe a scenario. Instead, the expert-agent interaction is directed by the execution of the scripts. Once the expert answers some questions or updates his answers, new titles may be inserted into the table of contents, as directed by the script calls. For instance, after the expert specifies the opposing forces in a scenario, their names appear as titles in the table of contents, together with the characteristics that need to be elicited for them. Experimental results show that the experts can easily use the Scenario Elicitation module [Tecuci et al., 2002].

In Protégé, each concept has exactly one script that specifies how to elicit the properties of its instances. In Disciple, a concept can have any number of scripts that can be used for any purpose. In particular, the knowledge engineer can define several scripts that specify how to elicit instances of the same concept. For instance, to elicit the military factors for a single-state force, different questions have to be asked depending on whether the force is part of an alliance or is a standalone opposing force.

The most recent development of the Scenario Elicitation tool is to allow the user to extend the ontology with new concepts in a controlled manner. For instance, when the script from Table 1 is executed, the user can specify a new type of state government (e.g. "feudal god-king government"), as illustrated in Figure 5. As a result, a new concept is created under "state government". As future developments, we plan to extend the capability of the Script Editor to facilitate the script definition task for the knowledge engineer, by taking into account the structure of the ontology and by using customization of generic scripts. We also plan to add natural language processing capabilities to the Scenario Elicitation module.
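A minimal sketch of how such a script could be represented and executed is shown below. All names, the candidate values, and the triple-based storage are hypothetical simplifications and do not reflect the actual Script Editor format.

    # Hypothetical sketch of an elicitation script and a tiny interpreter for it.
    from dataclasses import dataclass, field
    from typing import Callable, Dict, List

    @dataclass
    class ElicitationScript:
        name: str
        question: str
        possible_values: List[str]                  # e.g. the elementary subconcepts of a concept
        allow_new_subconcepts: bool = True
        follow_on_scripts: List[str] = field(default_factory=list)

    ontology_facts = []                             # answers stored as (subject, relation, value) triples

    def run_script(script: ElicitationScript, scripts: Dict[str, ElicitationScript],
                   subject: str, ask: Callable[[str], str] = input):
        answer = ask(script.question.format(subject=subject)).strip()
        if answer not in script.possible_values and script.allow_new_subconcepts:
            script.possible_values.append(answer)   # controlled ontology extension (new subconcept)
        ontology_facts.append((subject, "has_government_type", answer))
        for name in script.follow_on_scripts:       # e.g. elicit the properties of the new instance
            run_script(scripts[name], scripts, subject, ask)

    # Hypothetical usage, with a canned answer instead of an interactive dialog:
    scripts = {"elicit_government_type": ElicitationScript(
        name="elicit_government_type",
        question="What type of government does {subject} have?",
        possible_values=["representative_democracy", "monarchy"])}
    run_script(scripts["elicit_government_type"], scripts, "some_state_force",
               ask=lambda q: "feudal_god_king_government")
    print(ontology_facts)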
5  Exception-based ontology learning

As we have mentioned in section 2, the object ontology plays a crucial role in the learning process of the agent, as it is used as the generalization hierarchy for learning. However, this ontology is itself incomplete and partially incorrect and has to be improved during the teaching of the agent. In this section we briefly present an exception-based approach to ontology learning.

Because the ontology is incomplete, it may not contain the knowledge needed to distinguish between all the positive examples and the negative examples of a learned rule, such as the one presented in Figure 1. As a result, a rule may accumulate negative and positive exceptions.

A negative exception is a negative example that is covered by the rule because the current object ontology does not contain any knowledge that distinguishes the negative example from the positive examples of the rule [Tecuci, 1998; Boicu et al., 2003]. Therefore, the rule cannot be further specialized to uncover the negative example while still covering all the positive examples of the rule. A positive exception is defined in a similar way.

A comparative analysis of the examples and the exceptions facilitates identifying what distinguishes them and how the object ontology needs to be extended to incorporate the identified distinction. This is precisely the main idea behind our exception-based learning method, in which a subject matter expert collaborates closely with the agent to discover possible ontology extensions (such as new concepts, new features or new feature values) that will eliminate the exceptions.

The exception-based learning method consists of four main phases: 1) a candidate discovery phase, in which the agent analyzes a rule, its examples and exceptions, and the ontology, and finds the most plausible types of ontology extensions that may reduce or eliminate the rule's exceptions; 2) a candidate selection phase, in which the expert interacts with the agent to select one of the proposed candidates; 3) an ontology refinement phase, in which the agent elicits the ontology extension knowledge from the expert; and 4) a rule refinement phase, in which the agent updates the rule and eliminates the rule's exceptions based on the performed ontology extension.

As an illustration, consider the example and the corresponding partially learned rule from Figure 1. This rule is used in problem solving and generates the reasoning step from Figure 6, which is rejected by the expert because both the answer to the question and the resulting solution are wrong. However, there is no knowledge in the current ontology that can distinguish between the objects from the positive example in Figure 1 and the corresponding objects from the negative example in Figure 6. Therefore, the negative example from Figure 6 is kept as a negative exception of the rule in Figure 1.
[Figure 6: Incorrect reasoning step generated by the agent. The agent analyzes the will_of_the_people_of_USA as a potential strategic_COG_candidate of the OECS_Coalition with respect to the people_of_USA, answers "No" to the question of whether it is a legitimate candidate, and incorrectly concludes that the will_of_the_people_of_USA is not a strategic_COG_candidate with respect to the people_of_USA.]
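The four phases can be walked through on this example as in the sketch below. It is heavily simplified (facts as triples, a flat rule condition, callbacks standing in for the expert dialog), and any names not taken from the figures are hypothetical.

    # Simplified sketch of the four phases of exception-based learning.
    def discover_candidates(positive_obj, negative_obj, facts):
        """Phase 1: propose distinctions between the positive example and the negative exception."""
        pos = {f for f in facts if f[0] == positive_obj}
        neg = {f for f in facts if f[0] == negative_obj}
        return [("use existing fact", f) for f in pos ^ neg] + [("define new feature", None)]

    def refine(condition, facts, positive_obj, negative_obj, choose, elicit):
        candidates = discover_candidates(positive_obj, negative_obj, facts)   # phase 1
        kind, candidate = choose(candidates)                                  # phase 2: expert selects
        if kind == "define new feature":
            feature, value = elicit()                                         # phase 3: expert elicitation
            facts.append((positive_obj, feature, value))
        else:
            feature, value = candidate[1], candidate[2]
        condition.append(("?O4", feature, value))                             # phase 4: specialize the rule
        return condition

    # Hypothetical usage, loosely following the example of Figures 1 and 6:
    facts = [("Caribbean_States_Union", "has_as_people", "people_of_Caribbean_States_Union"),
             ("USA", "has_as_people", "people_of_USA")]
    condition = [("?O4", "instance_of", "force")]
    refine(condition, facts, "Caribbean_States_Union", "USA",
           choose=lambda candidates: ("define new feature", None),
           elicit=lambda: ("is_minor_member_of", "OECS_Coalition"))
    print(condition)   # the condition now also requires ?O4 is_minor_member_of OECS_Coalition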
Figure 7 shows the interface of the exception-based learning tool in the ontology refinement phase. The upper left panel of this tool shows the negative exception which needs to be eliminated. Below it are the objects that are currently being differentiated: "Caribbean States Union" (from the positive example) and "USA" (from the negative exception). The right panel shows the elicitation dialog, in which the expert is guided by the agent to indicate the name and value of a new feature that expresses the difference between "Caribbean States Union" and "USA." The expert defines the new feature "is minor member of" and specifies that "Caribbean States Union" is a minor member of "OECS Coalition," while "USA" is not. Based on this elicitation, Disciple learns a general definition of the feature "is minor member of" and refines the ontology to incorporate this knowledge. A fragment of the refined ontology is shown in the right part of Figure 7. Notice that both the domain and the range of the new feature are represented as plausible version spaces. The plausible upper bound domain of this feature is "single member force" and the plausible lower bound domain is "single state force."

The exception-based learning tool was evaluated during the Spring 2002 agent teaching experiment performed with Disciple at the US Army War College, as part of the "Military Applications of Artificial Intelligence" course. The tool was used by seven subject matter experts, with the assistance of a knowledge engineer, to eliminate the negative exceptions of the rules. We did not expect a significant number of exceptions, because before the experiment we had attempted to develop a complete ontology, which contained 191 concepts and 206 features. However, during the experiment, 8 of the learned problem solving rules collected 11 negative exceptions, indicating that the ontology was not complete. In order to eliminate these exceptions, the experts extended the ontology with 4 new features and 6 new facts. Some of the newly created features eliminated the exceptions from several rules. As a result of these ontology extensions, the rules were correspondingly refined.

This experiment showed that the exception-based learning tool can be used to extend the object ontology with new elements that better represent the subtle distinctions that the experts make in their domains of expertise. The tool allows the elimination of the rules' exceptions and improves the accuracy of the learned rules by refining their plausible version space conditions. It also enhances the agent's problem solving efficiency by eliminating the need to explicitly check the exceptions. We plan several extensions to the presented method: propose suggestions and help the user during the exception-based learning process; use analogical reasoning and hints from the user in the discovery of plausible ontology extensions; extend the method to discover new object concepts in order to eliminate the rules' exceptions; and extend the method to also remove the positive exceptions of the rules.

6  Example-based ontology learning

There are many situations during the agent teaching process where the subject matter expert has to specify a fact involving a new instance or a new feature. In such a case, the example-based ontology learning tool is invoked to learn a new concept or a new feature definition from the provided fact. One such situation was encountered in the previous section, where the expert indicated that "Caribbean States Union is minor member of OECS Coalition."




[Figure 7: The interface of the Exception-Based Learning Module and a fragment of the refined ontology. The refined fragment shows the new feature is_minor_member_of, with plausible upper bound domain single_member_force, plausible lower bound domain single_state_force, plausible upper bound range multi_member_force, and plausible lower bound range dominant_partner_multi_state_alliance, together with the fact that Caribbean_States_Union is_minor_member_of OECS_Coalition.]
From this specific fact Disciple attempts to learn a general definition of the feature "is minor member of." The most important characteristics of the feature that need to be learned are its position in the feature hierarchy, its domain of applicability, and its range of possible values. First, Disciple identifies the features that are most likely to be more general than "is minor member of." This set initially includes all the features whose domain and range cover "Caribbean States Union" and "OECS Coalition," respectively, as shown in Figure 8. The set is further pruned by applying various heuristics (for instance, by eliminating the other features of "Caribbean States Union") and by directly asking the expert:

   Consider the statement "Caribbean States Union is minor member of OECS Coalition." Is this a more specific way of saying: "Caribbean States Union is member of OECS Coalition"?

As a result of this process, "is minor member of" is defined as a subfeature of "is member of." The domain and the range of the "is member of" feature become the upper bounds of the domain and range of "is minor member of." The corresponding lower bounds are the minimal generalizations of "Caribbean States Union" and "OECS Coalition," respectively (see the bottom part of Figure 7).

The next step is to further refine the plausible version spaces of the domain and range. The lower bounds are generalized based on new positive examples of this feature, encountered during further teaching. However, the agent will not encounter negative examples. Therefore the specialization of the upper bounds is based on a dialog with the expert, who is asked to identify objects that cannot have this feature, or cannot be a value of this feature. There are other difficult problems related to learning and refining features: how to elicit a feature's special characteristics (e.g. whether the feature is transitive or not), how to elicit its cardinality, and how to differentiate between required and optional features for an object.
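A minimal sketch of deriving the initial bounds from the single fact is given below. The class and feature definitions are written down directly here (in Disciple they come from the object ontology and the feature hierarchy), and the parent feature is passed in because, as described above, it is identified in dialog with the expert.

    # Sketch of initializing a new feature's domain and range from one fact
    # (simplified; the links below echo Figures 7 and 8).
    PARENT = {
        "Caribbean_States_Union": "single_state_force",
        "single_state_force": "single_member_force",
        "OECS_Coalition": "dominant_partner_multi_state_alliance",
        "dominant_partner_multi_state_alliance": "multi_member_force",
    }

    FEATURES = {"is_member_of": {"domain": "single_member_force", "range": "multi_member_force"}}

    def generalizations(term):
        """The classes of a term, from the most specific to the most general."""
        chain = []
        while term in PARENT:
            term = PARENT[term]
            chain.append(term)
        return chain

    def learn_feature(new_feature, subject, value, parent_feature):
        """The parent feature's domain and range become the plausible upper bounds; the minimal
        generalizations of the fact's two arguments become the plausible lower bounds."""
        return {new_feature: {
            "subfeature_of": parent_feature,
            "domain": {"upper": FEATURES[parent_feature]["domain"],
                       "lower": generalizations(subject)[0]},
            "range":  {"upper": FEATURES[parent_feature]["range"],
                       "lower": generalizations(value)[0]},
        }}

    print(learn_feature("is_minor_member_of", "Caribbean_States_Union",
                        "OECS_Coalition", "is_member_of"))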

7  Ontology import and merging

Figure 9 shows another view of the Disciple agent building methodology that emphasizes ontology reuse and parallel knowledge base development. The ontology specification that results from the domain analysis phase (see Figure 3) guides the process of importing ontological knowledge, currently from CYC [Lenat, 1995] and, in the future, also from other knowledge repositories.

Our import method consists of identifying key terms in the CYC KB that correspond to the terms from the ontology specification, extracting the knowledge related to those terms, and importing it into the Disciple knowledge base. The extraction of knowledge is an automated process in which all the terms related to the start-up terms are elicited, then all the terms related to those terms, and so on, until a transitive closure or a user-specified stopping criterion is met. This method extends the one of Chaudhri et al. [2000] by adding stopping criteria, by allowing taxonomy relations to be followed down the hierarchy, and by considering the feature hierarchy. The translation of the extracted knowledge into the Disciple formalism consists of a syntactic phase and a semantic phase, similar to the method used in OntoMorph [Chalupsky, 2000]. During the automatic transformation of the extracted knowledge into Disciple's knowledge representation, the system records logs of the decisions that require the user's approval or refinement.

The imported ontology is further extended using the ontology development tools of Disciple, as discussed in section 3, leading to an initial knowledge base, denoted KB0 in Figure 9.
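The extraction step can be pictured as a breadth-first expansion from the start-up terms, bounded by a stopping criterion, as in the sketch below (a simplification with hypothetical names; it does not use the actual CYC or Disciple interfaces).

    # Sketch of extracting the terms related to a set of start-up terms, with stopping criteria.
    from collections import deque

    def extract(start_terms, related_terms, max_depth=3, stop=lambda term, depth: False):
        """Breadth-first expansion of the terms related to the start-up terms."""
        selected = set(start_terms)
        frontier = deque((t, 0) for t in start_terms)
        while frontier:
            term, depth = frontier.popleft()
            if depth >= max_depth or stop(term, depth):   # user-specified stopping criteria
                continue
            for neighbor in related_terms(term):          # e.g. super/subconcepts and features
                if neighbor not in selected:
                    selected.add(neighbor)
                    frontier.append((neighbor, depth + 1))
        return selected

    # Hypothetical usage over a toy repository:
    repository = {"force": ["agent", "single_member_force"], "agent": [],
                  "single_member_force": ["force"]}
    print(extract(["force"], lambda t: repository.get(t, [])))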



[Figure 8: Fragment of the feature hierarchy. The most general feature has domain object and range object; among its subfeatures are is_part_of (domain object, range object), is_opposed_to (domain force, range force), and is_member_of (domain single_member_force, range multi_member_force).]

[Figure 9: Rapid knowledge base development: 1. Domain analysis (generic problems, ontology specification, external repository); 2. Ontology development, producing the initial KB0; 3. Parallel development by domain experts of KB1, KB2, ..., KBn for the expertise subdomains; 4. Knowledge bases merging into the Final KB.]
Another result of the domain analysis phase is a partitioning of the application domain into several subdomains. A team of experts can then develop separate knowledge bases for each independent subdomain. Each expert teaches a personal Disciple agent, starting from the common knowledge base KB0 and building a refined one, as indicated in Figure 9. Then the developed knowledge bases are merged into the Final KB. This KB contains a merged ontology, but separate partitions of rules, one for each subdomain. The ontology merging algorithm exploits the fact that the KBs to be merged share KB0 as a common ontology. It starts with one of the KBs and successively merges it with the other KBs, one at a time.

Similarly to PROMPT [Noy and Musen, 2000] and Chimaera [McGuinness et al., 2000], our approach to merging is based on providing an interactive way of copying one frame from an ontology into the other. While it is acknowledged that the role of the human cannot be eliminated from this process [Klein, 2001; Noy and Musen, 2000], the goal is to provide as much assistance as possible to the knowledge engineer. Therefore, our tool handles the low level operations, allowing the user to issue only the most general commands, and assures that the ontology is kept consistent at all times. In addition, the agent makes suggestions and keeps the user focused on the part of the ontology being merged.
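The sketch below conveys only the overall strategy of merging knowledge bases that share KB0; it is a deliberately simplified, non-interactive illustration (frames as dictionaries, a callback standing in for the knowledge engineer's conflict resolution), omitting the consistency checks and suggestions that the actual tool performs.

    # Sketch of merging knowledge bases that extend a common initial ontology KB0.
    def merge_kbs(kb0, kbs, resolve):
        """Start from one KB and fold in the others one at a time."""
        merged = dict(kbs[0])
        for kb in kbs[1:]:
            for name, frame in kb.items():
                if name in kb0:
                    continue                         # the shared KB0 part is already present
                if name not in merged:
                    merged[name] = frame             # low-level copy handled automatically
                elif merged[name] != frame:
                    merged[name] = resolve(name, merged[name], frame)   # ask the knowledge engineer
        return merged

    # Hypothetical usage (toy frames inspired by the PhD advisor domain of the course experiment):
    kb0 = {"person": {"isa": "object"}}
    kb1 = {**kb0, "PhD_advisor": {"isa": "person"}}
    kb2 = {**kb0, "research_area": {"isa": "object"}}
    print(sorted(merge_kbs(kb0, [kb1, kb2], resolve=lambda name, a, b: a)))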
The parallel KB development and merging capabilities of Disciple were first evaluated in Spring 2002, as part of the "IT 803 Intelligent Agents" course at George Mason University. The students had to develop an agent for helping someone to choose a PhD advisor. The domain was split into six parts that were developed separately by the students in the class. They started the knowledge base development from a general 23-fact knowledge base provided by the instructor, and each of them had to extend it with the knowledge needed to express their own part of the domain. Each student extended his or her knowledge base with an average of 97 facts. Using the merging tools provided by Disciple, the students succeeded in merging all their work into a single agent with an ontology containing 473 facts. We plan to validate the entire methodology in a new experiment at the US Army War College, as part of the Spring 2003 MAAI course.

Future work includes the capability to import from OKBC knowledge servers [Chaudhri et al., 1998] and from ontologies expressed in DAML+OIL [Connolly et al., 2001], as well as an improvement of the proactivity of the mixed-initiative ontology merging tool.

Acknowledgements. This research was sponsored by DARPA, AFRL, AFMC, USAF, under agreement number F30602-00-2-0546, by the AFOSR under grant no. F49620-00-1-0072, and by the US Army War College.
References

[Boicu, 2002] Mihai Boicu. Modeling and Learning with Incomplete Knowledge. Doctoral Dissertation, George Mason University, Fairfax, Virginia, 2002.

[Boicu et al., 2000] Mihai Boicu, Gheorghe Tecuci, Dorin Marcu, Michael Bowman, Ping Shyr, Florin Ciucu, and Cristian Levcovici. Disciple-COA: From Agent Programming to Agent Teaching. In Proceedings of the Seventeenth International Conference on Machine Learning, Stanford, California, 2000. Morgan Kaufmann.

[Boicu et al., 2003] Cristina Boicu, Gheorghe Tecuci, Mihai Boicu, and Dorin Marcu. Improving the Representation Space through Exception-Based Learning. To appear in Proceedings of the Sixteenth International FLAIRS Conference, 2003.

[Chalupsky, 2000] Hans Chalupsky. OntoMorph: A Translation System for Symbolic Knowledge. In Proceedings of the Seventh International Conference on Knowledge Representation and Reasoning, pages 471-482, San Francisco, California, April 2000. Morgan Kaufmann.

[Chaudhri et al., 1998] Vinay K. Chaudhri, Adam Farquhar, Richard Fikes, Peter D. Karp, and James P. Rice. OKBC: A Programmatic Foundation for Knowledge Base Interoperability. In Proceedings of the Fifteenth National Conference on Artificial Intelligence, pages 600-607, Madison, Wisconsin, July 1998. AAAI Press/The MIT Press.

[Chaudhri et al., 2000] Vinay K. Chaudhri, Mark E. Stickel, Jerome F. Thomere, and Richard J. Waldinger. Using Prior Knowledge: Problems and Solutions. In Proceedings of the Seventeenth National Conference on Artificial Intelligence and Twelfth Conference on Innovative Applications of Artificial Intelligence, pages 436-442, Austin, Texas, July-August 2000. AAAI Press/The MIT Press.

[Clausewitz, 1976] Carl von Clausewitz. On War. Translated and edited by Michael Howard and Peter Paret. Princeton University Press, Princeton, NJ, 1976.

[Connolly et al., 2001] Dan Connolly, Frank van Harmelen, Ian Horrocks, Deborah L. McGuinness, Peter F. Patel-Schneider, and Lynn Andrea Stein. DAML+OIL (March 2001) Reference Description. W3C Note, 18 December 2001.

[Klein, 2001] Michel Klein. Combining and Relating Ontologies: An Analysis of Problems and Solutions. In Proceedings of the IJCAI-2001 Workshop on Ontologies and Information Sharing, Seattle, Washington, August 2001.

[Lenat, 1995] Douglas B. Lenat. CYC: A Large-Scale Investment in Knowledge Infrastructure. Communications of the ACM, 38(11): 33-38, 1995.

[McGuinness et al., 2000] Deborah L. McGuinness, Richard E. Fikes, James Rice, and Steve Wilder. An Environment for Merging and Testing Large Ontologies. In Proceedings of the Seventh International Conference on Knowledge Representation and Reasoning, San Francisco, California, April 2000. Morgan Kaufmann.

[Noy and Musen, 2000] Natalya F. Noy and Mark A. Musen. PROMPT: Algorithm and Tool for Automated Ontology Merging and Alignment. In Proceedings of the Seventeenth National Conference on Artificial Intelligence, pages 450-455, Austin, Texas, July-August 2000. AAAI Press/The MIT Press.

[Noy et al., 2000] Natalya Fridman Noy, Ray W. Fergerson, and Mark A. Musen. The Knowledge Model of Protégé-2000: Combining Interoperability and Flexibility. In Proceedings of the European Knowledge Acquisition Workshop, pages 17-32, 2000.

[Tecuci, 1998] Gheorghe Tecuci. Building Intelligent Agents: An Apprenticeship Multistrategy Learning Theory, Methodology, Tool and Case Studies. Academic Press, London, 1998.

[Tecuci et al., 2002] Gheorghe Tecuci, Mihai Boicu, Dorin Marcu, Bogdan Stanescu, Cristina Boicu, and Jerome Comello. Training and Using Disciple Agents: A Case Study in the Military Center of Gravity Analysis Domain. AI Magazine, 23(4): 51-68, 2002.