Ontological Considerations for Uncertainty Propagation in High Level Information Fusion

Paulo C. G. Costa, George Mason University, Fairfax, VA, USA, pcosta@gmu.edu
Mark Locher, George Mason University and SRA International, Fairfax, VA, USA, mlocher@gmu.edu

Abstract— Uncertainty propagation in a level 2 high level information fusion (HLIF) process is affected by a number of considerations. These include the varying complexities of the various types of level 2 HLIF. Five different types are identified, ranging from simple entity attribute refinement using situation status data to the development of a complete situation assessment assembled from applicable situational fragment data. Additional considerations include uncertainty handling in the input data, uncertainty representation, the effects of the reasoning technique used in the fusion process, and output considerations. Input data considerations include the data's relevance to the situation, its credibility, and its force or weight. Uncertainty representation concerns follow the uncertainty ontology developed by the W3C Incubator Group on Uncertainty Reasoning. For the uncertainty effects of the fusion process, a basic fusion process model is presented, showing the impacts of uncertainty in four areas. Finally, for output uncertainty, the significance of a closed-world versus open-world assumption is discussed.

Keywords - high level fusion, input uncertainty, process uncertainty, output uncertainty, uncertainty representation

I. INTRODUCTION

The past 20 years have seen an explosion of systems and techniques for collecting, storing and managing large and diverse sets of data of interest to a number of communities. These data are collected by a wide variety of mechanisms, each of which has varying considerations that influence the uncertainty in the data. In order to provide useful information for a particular question or problem, the relevant data ("evidence") must be identified, extracted, and then fused to provide insight or answers to the question or problem.

The information fusion community has developed a widely accepted functional layered model of information fusion. These layers can be divided into low-level and high-level fusion. At all levels, the data going into a fusion process is recognized as having uncertainty, which affects in various ways the degree of certainty in the output of the process. Low-level fusion has been widely explored, primarily by the radar tracking community, and issues of uncertainty determination and propagation are well understood [1].

High-level fusion, on the other hand, requires reasoning about complex situations, with a diversity of entities and various relationships within and between those entities. This reasoning is often expressed symbolically, using logic-based approaches [2]. There has been significant work in using ontological approaches in developing fusion techniques, and some of these approaches have taken uncertainty considerations into account (e.g. [3][4][5][6]). Various techniques exist to model and propagate uncertainty in a fusion process, with varying strengths and difficulties. This suggests that their relative performance in a fusion system should vary significantly depending on the types and nature of the uncertainties within both the input data and the context of the problem set modeled with the fusion system. Unfortunately, there is no consensus within the fusion community on how to evaluate the relative effectiveness of each technique. Work in this area will be hampered until the evaluation question is at least better defined, if not resolved.

The International Society of Information Fusion (ISIF) chartered the Evaluation of Technologies for Uncertainty Reasoning Working Group (ETURWG) to provide a forum to collectively address this common need in the ISIF community, coordinate with researchers in the area, and evaluate techniques for assessing, managing, and reducing uncertainty [7]. In its first year, ETURWG defined its scope and developed the uncertainty representation and reasoning evaluation framework (URREF) ontology. The URREF ontology aims to provide guidance for defining the actual concepts and criteria that together comprise the comprehensive uncertainty evaluation framework [8]. It is evident that part of the issue in evaluating different uncertainty representation systems is to properly understand how a high-level fusion process works and how uncertainty is propagated through the process.

This paper aims to help establish the various considerations about how uncertainty affects a HLIF process. It begins by defining what is meant by a HLIF process, and then focuses on one class of HLIF, the level 2 HLIF. From there, it defines a taxonomy of level 2 HLIF, in which increasingly complex types of level 2 HLIF carry additional uncertainty considerations. It then explores uncertainty propagation issues associated with uncertainty in the input data, the uncertainty effects of both the fusion reasoning process and the representation scheme, and the output uncertainty. It concludes with a top-level discussion of an overall mathematical approach applicable to these considerations.

II. DEFINITION OF HIGH-LEVEL FUSION

A widely accepted definition of High-Level Information Fusion (HLIF) is that it refers to the fusion processes classified as level 2 and above within the revised Joint Directors of Laboratories (JDL) data fusion model. This model establishes five functional levels, as defined in [9] and repeated in Table 1 below.
Table 1: JDL Fusion Levels [9]

Level 0. Signal / Feature Assessment: Estimate signal or feature state. May be patterns that are inferred from observations or measurements, may be static or dynamic, and may have locatable or causal origins.
Level 1. Entity Assessment: Estimation of entity parametric and attributive states (i.e. of individual entities).
Level 2. Situation Assessment: Estimation of structures of parts of reality (i.e. of sets of relationships among entities and implications for states of related entities).
Level 3. Impact Assessment: Estimation of utility / cost of signal, entity or situation states, including predicted utility / cost given a system's alternative courses of action.
Level 4. Process Assessment: A system's self-estimate of its performance as compared to desired states and measures of effectiveness.

A key item is that these assessments are not just a combination of information; they are also analytic judgments. For example, a level 2 fusion process is more than a unified display of information (e.g. a common operational picture); rather, it requires explicit statements about how certain specific elements of reality are structured, in order to address specific questions that a user of that process wants answered. Level 2 fusion essentially answers the question "what is going on?" Level 3 fusion addresses "what happens if ...?", where "if" is followed by a possible action or activity (level 3 is often predictive). Level 4 involves steering the fusion system, including adjusting data collection based on an assessment of already-collected data.

There has been some discussion regarding the boundary between level 1 and level 2. Das, for instance, considers relationship identification and object classification as beyond level 1, suggesting that this type of fusion should be a level 1+ [10]. Steinberg, on the other hand, considers this to be clearly level 1 [9]. Sowa's ontological categories provide insight into this question, and can be used to illuminate some factors on uncertainty propagation considerations. In the present work, these ontological categories were used as a basis for defining a taxonomy of level 2 fusion.

III. TAXONOMY OF LEVEL 2 HLIF

Sowa defined twelve ontological categories that together comprise a very attractive framework for analyzing fusion processes at level 2. He suggests that one way of categorizing entities in the world is to consider them from three orthogonal aspects [11]. The first is whether they are physically existing or abstract. Abstract entities are those that have information content only, without a physical structure. This includes the idea of geometric forms or canonical structures (e.g. the idea of a circle), or entities like computer program source code. The second aspect is whether the entity is a continuant (i.e., having time-stable recognizable characteristics) or an occurrent (i.e., significantly changing over time). This means that an entity can either be an object (a continuant) or a process (an occurrent, also called an event). The third and final aspect is the degree of interrelatedness with other objects and processes. At the independent level, an entity is considered by itself, without reference to other entities. At the relative level, an entity is considered in single relation to another entity. Finally, the idea of mediating takes into account two items: the number and complexity of the various interrelationships among the entities, and the unifying idea (its purpose or reason) that allows one to define a situation or a structure that encompasses the relevant entities [11].

The combination of these three aspects results in the 12 ontological categories shown in Table 2. Table 3 provides a more detailed definition of each ontological category and provides some examples.

Table 2: Sowa's Categories [11]

              Physical                    Abstract
              Continuant  Occurrent      Continuant   Occurrent
Independent   Object      Process        Schema       Script
Relative      Juncture    Participation  Description  History
Mediating     Structure   Situation      Reason       Purpose

A key point in looking at this ontological categorization is that one must understand the context and viewpoint from which a given entity is categorized, and that changes to either of these two might result in different categorizations for the same entity. To illustrate this point, an airplane can be considered as either an independent object flying in the air, or a complex mediating structure with thousands of component objects and processes that work together for the purpose of achieving aerial flight. The viewpoint one takes depends on the context one is interested in. In the airplane example, it depends on whether one is tracking a particular aircraft using a variety of sensors, or attempting to determine the various capabilities of a new aircraft type.

It is tempting to suggest that Sowa's three relationship levels (Independent, Relative, and Mediating) correspond to JDL levels 1 / 2 / 3, respectively. However, this has at least three major problems. First, Sowa's relative level is focused on a single relationship between two entities, while JDL level 2 can (but does not have to) consider multiple relationships in and between multiple entities. Second, JDL level 2 situation assessment includes making assessments about the purpose or reason for the situation. This reason or purpose is the key characteristic that distinguishes one situation from another. A raucous sports team victory celebration, a protest, and a riot share many entities and relationships, but understanding the reason / purpose behind them can make a significant difference to a chief of police. Third, there are level 1 inferences that depend on the existence of fixed relationships between entities.

To illustrate the latter point, consider the case of an intercepted radar signal that has been classified as having come from a specific type of radar system. Now let us suppose that the radar type is tightly associated with a larger system, such as the AN/APG-63 radar on older versions of the US F-15 aircraft [12]. If one has detected the APG-63 radar, one also has very high confidence that one has detected an F-15 aircraft. This F-15 object identification occurs because there is a fixed relationship between the two objects (it is not a 100% relationship, as the APG-63 is also installed on fourteen United States Customs and Border Protection aircraft [13]). This situation is a clear example of a fixed relationship between entities (i.e., the AN/APG-63 used in F-15 fighters) that supports a level 1 object identification, which makes it problematic to associate JDL level 1 exclusively with Sowa's independent level.

Now consider the case where the radar is associated with a Surface-to-Air Missile (SAM) system, such as the Tin Shield acquisition radar and the SA-10 Grumble SAM system. The SA-10 system consists of multiple separate vehicles, not a single vehicle. The radar vehicle is physically separate from the other vehicles, and it is possible for the Tin Shield radar to be used as a stand-alone search radar [14]. In this case, detection of the Tin Shield radar signal may indicate the presence of the SA-10, but it may not.

A key differentiator between JDL levels 1 and 2 is the focus on an object versus on multiple objects in relationship to each other. Yet, as illustrated by the two examples above, a JDL level 1 assessment can use techniques that are grounded in Sowa's relative level. In general, determining an object's level 1 attributes and states often depends on fusing different sensor outputs of processes that an object has undergone, thus making use of participation-level information.

Table 3: Definitions [11]

Object: Any physical continuant considered in isolation. Examples: any specific existing item (e.g. car serial number 123).
Process: The changes that occur to an object over time, with a focus on the changes. Examples: an explosion, most action verbs.
Schema: The form of a continuant. Examples: a circle, language concepts for classes of objects (e.g. cat, airplane).
Script: The time or time-like sequence of an occurrent. Examples: process instructions, software source code, a radar track file.
Juncture: A time-stable relationship between two objects. Examples: a joint between two bones, a connection between parts of a car.
Participation: A time-varying relationship between two objects, or a process related to an object. Examples: artillery firing a shell, a radio communication between two people.
Description: An abstraction about the types of relationships that can exist between continuants. Examples: the ideas behind concepts like "join", "separate", "works for", "mother of", etc.
History: The recorded information about an occurrence as it relates to one or more continuants. Examples: a video file of a traffic intersection.
Structure: A complex continuant with multiple sub-continuants and many relationships; the focus is on the stability of the continuant. Examples: the composition of an army, the layout of a chemical plant.
Situation: A complex occurrent with multiple continuants and many relationships; the focus is on the time sequence of changes among the objects and processes. Examples: a birthday party, road traffic in a metropolitan area.
Reason: The intention behind a structure. Example: what differentiates a chemical weapon factory from a fertilizer factory.
Purpose: The intention driving a situation. Example: the intention that differentiates going to war from conducting a military exercise.

Using Sowa's categories, one can create the taxonomy of level 2 situations shown in Figure 1. This taxonomy ranges widely in complexity and analytic inferences required. There are five cases presented in the figure, each created by first determining whether one is dealing with a known situation, or whether the situation itself must be inferred. In general, the least complex case is for known situations where one is determining / refining the attributes of an entity. This case straddles the level 1 / 2 line. It is object / process identification where the relationship between elements within the object of interest may vary. An example is the radar / vehicle case above. The defined situation is that a Tin Shield radar has been detected at a particular location. The question is whether an SA-10 battery (a higher-level object) is at that location, or whether the radar is operating in a stand-alone mode (whether operationally, for system testing, or for system maintenance). The inferences generally are based on schema-based evidential reasoning (e.g. "there is a 95% chance that this radar will be associated with an SA-10 battery in its immediate vicinity").

Figure 1: Types of Situation Assessments

The second case is a step up in complexity: the situation is well defined, but the objective is to identify a specific object of interest within the situation. For example, one might have very credible evidence that a terrorist group will attempt to smuggle a radiological bomb into the United States via a freighter. In this case, the situation itself is known (one knows the purpose / intention), but the actors may be hidden. Inferring which freighter (an object identification) is a likely carrier of the bomb is the question of interest. Another example would be to determine who committed a robbery of a bank, when one has a video of the act itself (the situation is a robbery). In this case, the evidence is extracted from a variety of sources, which can be classified as being junctures, participations, histories or descriptions.
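The schema-based evidential reasoning of the first case can be sketched as a two-hypothesis Bayesian update. The following is a minimal sketch, in which the prior and the detection likelihoods are hypothetical numbers invented for illustration (they are not from the paper):

```python
# Hypothetical sketch of schema-based evidential reasoning for the
# Tin Shield / SA-10 example. All numbers are illustrative only.

def posterior_sa10(prior, p_detect_given_sa10, p_detect_given_standalone):
    """Bayes update for P(SA-10 present | Tin Shield signal detected)."""
    p_detect = (p_detect_given_sa10 * prior
                + p_detect_given_standalone * (1.0 - prior))
    return p_detect_given_sa10 * prior / p_detect

# Assume an even prior that a detected Tin Shield is deployed with an
# SA-10, and that detection is far more likely under that hypothesis.
p = posterior_sa10(0.5, 0.95, 0.05)  # -> 0.95
```

With an even prior, the posterior is driven entirely by the likelihood ratio of the two hypotheses, which is one way to read the "95% chance" schema statement above.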
The inferential process generally becomes more complex when the specific situation itself is not known, but must be inferred. The taxonomy outlines three such cases, each with an increasing level of complexity. The first is when the specific situation is not known, but there is a set of well-defined situation choices to select from. This case is a situation version of a state transition. A classic example is the military indications and warning question, which can be raised when an increase in activity at military locations in a country is detected. The question then becomes "what is the purpose of the activity?" Four major choices exist: a major military exercise, suppression of domestic unrest, a coup d'état, or preparation to go to war. Each is a relatively well-defined situation with known entities, attributes and relationships. The selection among them becomes a pattern-matching exercise.

The next level of complexity occurs when not only is the situation unknown, but the situation itself must be developed. Unlike the case above, the issue now is not choosing among a set of possible situations but building the situation from the data. This case can be divided into two subcases. In the first subcase, one has a series of templates that can be used in developing aspects of the situation. For example, in developing an enemy order of battle for a standing nation-state's military, one has a basic understanding of the objects and relationships that constitute a modern military force. A country may not have all of the elements, and the organizational structure will vary. Yet, it is very likely that the structure and deployment will follow patterns similar to those used by other countries.

The second subcase is the most complex situation. Here, one must develop a situation where the basic purpose itself must be determined. For example, consider the case when a government agency is notified that something is significantly amiss, with enough information to spark interest, but not enough to understand what is happening. In that case, the evidence must be assembled without a common template to guide the fusion. Rather, the evidence must be fused using fragmentary templates, which themselves must be integrated to provide the overall situation. Integrating the data to "connect the dots" that could have predicted the September 11, 2001 commercial airliner strikes on the World Trade Center and the Pentagon falls into this category. Note also that this case straddles the level 2 / level 3 fusion line, since determining the purpose in this case has a predictive element with possible courses of action and outcomes.

IV. UNCERTAINTY PROPAGATION IN HLIF

In any fusion process, one follows a fundamental reasoning process, which logically uses a series of reasoning steps, often of an "if, then" form. Beginning with a set of events, we form a chain of reasoning to come to one or more conclusions. Figure 2a models a simple case, while Figure 2b gives an example of that case. More complex structures can easily be created [15].

Figure 2: Fundamental Reasoning Process. (a) A generic chain from an Event E through Reasoning Steps 1 to N to a Conclusion. (b) An example: "SAM radar active at Location X" leads, through "SAM systems are located with the radar" and "if SAM radar is active, then SAM is preparing to engage", to the conclusion "SAM system will engage a target".

The ETURWG found that within this fundamental process there were at least four areas for uncertainty considerations: the uncertainty in the input data, the uncertainty associated with representation within the fusion system, the uncertainty effects of the reasoning process, and the resultant uncertainty in the outputs of the process [7, 8]. The subsections below address some of the ontological considerations associated with the first three factors. Issues associated with output uncertainty are treated in Section V.

A. Uncertainty in the Input Data

All conclusions are ultimately grounded on evidence, drawn from a variety of data sources. But often evidence is "inconclusive, ambiguous, incomplete, unreliable, and dissonant." Any conclusion drawn from such a body of evidence is necessarily uncertain. Schum [15] found that one must establish the credentials of any evidence used in a reasoning process. These credentials are its relevance to the question / issue at hand, its credibility, and its weight or force [16]. This suggests that one should elaborate the fundamental reasoning process from Figure 2 with the additional items shown in Figure 3.

Figure 3: Evidential Factors

Data becomes evidence only when it is relevant. Relevance assesses whether the evidence at hand is germane to the question(s) being considered. Irrelevant information makes no contribution to the conclusion drawn, and potentially confuses the fusion process by introducing extra noise. Evidence can be either positively (supportive) or negatively (disconfirmatory) relevant to a particular hypothesis. Any analytic effort is obliged to seek and evaluate all relevant data.

Once data is shown to be relevant to a particular problem (i.e., it becomes evidence), Schum points out that there is an important but often overlooked distinction between an event (an object, process, juncture or participation in Sowa's ontological categories) and the evidence about that event or state. That is, Joe's statement "I saw Bob hit Bill with a club" does not mean that such an event actually happened, and should be seen only as evidence about it. Credibility establishes how believable a piece of evidence is about the event it reports on. Schum identified three elements of credibility [17]; the ETURWG added self-report as a distinct element (see Table 4 for elements and definitions) [7].
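The chained if-then reasoning of Figure 2, with the initiating event discounted by source credibility as the evidential credentials above suggest, can be sketched as follows. This is a minimal sketch, not the paper's method: each step is treated as a simple noisy implication, the probability that the conclusion holds when a premise fails is assumed to be zero, and all numbers are hypothetical:

```python
# Minimal sketch of propagating uncertainty down an if-then reasoning
# chain, discounting the initial event report by source credibility.
# Assumes each step is a noisy implication (no support for the
# conclusion when a premise is false); all numbers are hypothetical.

def chain_certainty(p_event, step_probs):
    """Multiply a credibility-discounted event probability through a
    chain of conditional reasoning steps P(step i+1 | step i)."""
    p = p_event
    for p_step in step_probs:
        p *= p_step
    return p

# Event "SAM radar active at Location X" from a source of credibility 0.9;
# steps: radar is co-located with a SAM system (0.95), and an active
# radar implies an engagement is being prepared (0.8).
p_engage = chain_certainty(0.9, [0.95, 0.8])  # -> 0.684
```

A long chain of even mildly uncertain steps attenuates quickly, which is one reason Schum's finding about the force of credibility evidence matters: the discount applied at the first link bounds everything downstream.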
Table 4: Elements of Evidential Credibility

Veracity: The source is telling what it believes to be true (note that the source may be deceived).
Objectivity: The source has received the evidence on which it based its reporting. This includes consideration of system biases and false alarms.
Observational Sensitivity: The source has the ability to actually observe what it reports (e.g. the observer actually has the visual acuity needed to see what was going on, or an electronic intercept was of such low quality that the operator guessed part of the conversation).
Self-Report: The source provides a measure of its certainty in its report (e.g. a human source hedges her report with "it's possible that…", or a sensor reports that detection was made at a signal-to-noise ratio of 4).

The force (or weight) of the event establishes how important the existence of that event is to the conclusion one is trying to establish. By itself, the event "Bob hit Bill with a club" would have a significant force in establishing a conclusion that Bill was seriously injured. It would have less force in establishing that Bill was committing a violent act and needed to be stopped, and even less force in concluding that Bob was angry at Bill. Figure 3 shows that credibility can have an effect on the force of an event on the conclusion. For example, if the credibility of Joe's testimony about Bob hitting Bill with a club is low, the certainty of a conclusion that Bob's hitting was the cause of Bill's injuries would be less than if Joe's testimony's credibility were high. Schum investigated a number of different ways in which considerations about data credibility could affect the overall conclusions. One of his most interesting findings is that, under certain circumstances, having credible data on the credibility of a data source can have a more significant force on the conclusion than the force of the event reported in the data [15].

B. Uncertainty in the Representation

Uncertainty varies in its forms and manifestations. Therefore, the uncertainty representation scheme used has an effect on what can or cannot be expressed. To see this, one first needs an understanding of the different types of uncertainty. The W3C Incubator Group exploring uncertainty reasoning issues for the World Wide Web developed an initial ontology of uncertainty concepts, shown in Figure 4 [18].

Figure 4: Uncertainty Ontology

A Sentence is a logical expression in some language that evaluates to a truth-value (formula, axiom, assertion). For our purposes, information will be presented in the form of sentences. The World is the context / situation about which the Sentence is said. The Agent represents the entity making the Sentence (human, computer, etc.). Uncertainty is associated with each sentence, and has four categories. Three of those are described in Table 5, along with their significance for uncertainty propagation in a HLIF process.

Table 5: Definition of Uncertainty Categories

Uncertainty Derivation
- Objective: Derived in a formal way, with a repeatable derivation process. Significance: the level of uncertainty can be reliably estimated.
- Subjective: A judgment, possibly a guess. Significance: the level of uncertainty may be unpredictable.

Uncertainty Nature
- Aleatory: Uncertainty inherent in the world. Significance: additional data will not resolve the uncertainty.
- Epistemic: Uncertainty in an agent due to lack of knowledge. Significance: the uncertainty could be resolved by additional evidence gathering, which eliminates the lack of knowledge.

Uncertainty Type
- Ambiguity: Referents of terms are not clearly specified. Significance: the same evidence may not distinguish between two or more possibilities.
- Empirical: A sentence about a world is either satisfied or not satisfied in each world, but it is not known in which worlds it is satisfied; this can be resolved by obtaining additional information (e.g., an experiment). Significance: the uncertainty can be resolved with additional information.
- Randomness (a type of empirical uncertainty): The sentence is an instance of a class for which there is a statistical law governing whether instances are satisfied. Significance: the empirical uncertainty has a predictable basis for making an estimate, using the appropriate statistical law.
- Vagueness: No precise correspondence between terms in the sentence and referents in the world. Significance: uncertainty due to a lack of precision.
- Incompleteness: Information about the world is incomplete / missing. Significance: uncertainty increases because assumptions / estimates of information must be used rather than the actual information; there may be no basis for making an estimate.
- Inconsistency: No world can satisfy the statement. Significance: the data is contradictory, and the source of the contradiction must be resolved (this can occur when deception is used).
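The aleatory / epistemic distinction in Table 5 can be illustrated with a small estimation sketch. The scenario and all numbers below are hypothetical: additional observations shrink an agent's epistemic uncertainty about a sensor's false-alarm rate, while the aleatory uncertainty of any single detection outcome remains:

```python
# Hypothetical illustration of the aleatory vs. epistemic distinction:
# more trials shrink the epistemic uncertainty about a sensor's
# false-alarm rate, but not the aleatory uncertainty of one outcome.

def beta_posterior_std(alpha, beta):
    """Standard deviation of a Beta(alpha, beta) posterior."""
    n = alpha + beta
    var = alpha * beta / (n * n * (n + 1.0))
    return var ** 0.5

# Uniform Beta(1, 1) prior over the false-alarm rate, then a 20%
# false-alarm fraction observed over 10 trials versus 1000 trials.
std_10 = beta_posterior_std(1 + 2, 1 + 8)
std_1000 = beta_posterior_std(1 + 200, 1 + 800)

# Epistemic uncertainty about the rate shrinks with evidence, while
# the aleatory variance of a single Bernoulli trial at p = 0.2 stays
# p * (1 - p) = 0.16 no matter how much data is gathered.
```

This is exactly the significance noted in Table 5: only the epistemic component responds to additional evidence gathering.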
These include (but are not limited to): Level  3  Fusion   Level  3  Data Level  2   • Bayesian Probability Theory Level  2 Fusion • Dempster-Shaffer Evidence Theory Conclusions Evidence   Data • Possibility Theory Extraction Store • Imprecise Probability approaches Data  Alignment Level  2  Data • Random Set Theory Object  A   Object  N   • Fuzzy Theory / Rough Sets Level  1   …… Level  1   Level  1  Data Fusion Fusion • Interval Theory • Uncertainty Factors Level  0  Data Level  0  Data Figure 5: Level 2 Fusion Process Model A critical item in uncertainty propagation is the proper fit between the types of uncertainty in the input data and in the Another example may be that one is interested in whether model(s) used in the fusion reasoning process. Failure to two ships met and transferred cargo in the open ocean. account for all of the uncertainty types in the input data can Suppose that you have a track file on each ship which has long result in an erroneous process output. A classic survey of revisit rates between collections. This does not provide an uncertainty models, with a discussion on applicable uncertainty obvious indication that the ships met and stopped for a while. types, is given in [19], with a recent review of the state-of-the- But the track files show that both ships were on tracks that did art in [20] put them at a common location at a given period, and that the C. Uncertainty in the HLIF Fusion Process average speed dropped significantly during the time a meeting To explore the ontological considerations of the uncertainty could have occurred (implying that the ships may have stopped propagation in a HLIF fusion process, we need to have a basic for a while). Given this data, one could conclude with some fusion process model. We will concentrate on the level 2 fusion level of certainty that they did meet and stopped to transfer process only, and leave out significant detail on the processes something. 
This level of certainty is driven by at least two at the other levels. Figure 5 shows this model. The first thing to factors: the quality of the track file data (establishing how observe is that the raw data can come in at any level, as certain one is in concluding that the tracks allowed them to meet), and how likely is it that two ships showing these track evidenced by the incoming arrows at the right side of the figure. The model does not require that all data be signal or characteristics actually would have met and stopped. feature (Level 0) data, which is then aggregated into higher- A significant part of the evidence extraction process could level conclusions. For instance, object identification data (level be comparison to historical or reference data. For example, a 1) could come from an on-scene observer or from an image vehicle may be moving outside of a normal shipping lane / analyst reporting on an image. Communications intercepts or airway or off-road. This requires a reference to a map base. human reporting could provide evidence on relationships (level For this reason, the process model includes a data store, for 2) or future intentions (level 3). Note that if a level 3 fusion both reference information and for previous data. process is active, its outputs could affect the level 2 process in two places. It can either be a controlling variable in the fusion The last part of the model is a data alignment process. process itself, or it can affect the interpretation and extraction Data may come in with different reference bases, and need to of evidence. However, a level 3 process will have an effect be aligned to a common baseline in order to be used in the only if it has separate evidence that is not being used in the extraction and fusion processes. level 2 fusion process (otherwise one has circular reporting). Finally, note that the level 2 process includes the possibility There are four basic processes in this model. 
The first is the of a direct use of level 0 data. An area of active research is the fusion process itself, which is usually some form of a model- multi-source integration of level 0 data that is not of sufficient based process. These models most often take the form of quality, or that does not have enough quantity to allow a high Bayesian networks [10, 21, 22], although alternative quality single-source conclusion. approaches have been proposed using graphical belief models V. MATHEMATICAL CONSTRUCT [23] and general purpose graphical modeling using a variety of uncertainty techniques [14]. A. Model Another important aspect of this model that must be Several authors have developed mathematical constructs for emphasized is that not all of the evidence that goes into the use in assessing the uncertainty of a situation assessment [2, model-based process is (or is assumed to be) in an immediately 25]. Our model is a version of the one put forth by Karlsson usable form. Some data must have the appropriate evidence [26], modified using the terminology put forth by Franconi extracted from it. This is where the uncertainty considerations [27]. Karlsson’s version focuses only on relationships, and associated with representation within the fusion system come does not explicitly include predicates and attributes. While one into play. For example, the raw level 2 data may be a series of can model predicates and attributes using relationships, it is cleaner to separate the entity space from the attribute space. In no candles on the cake, does this mean it is not a birthday addition, the construct formed in this paper acknowledges level celebration? 2 HLIF as explicitly including entity attributes as well as relationships between entities. Including attributes as separate B. 
Per [27], the language consists of:

E_n, the 1-ary predicates
A_k, the attributes (stated as 2-ary predicates)
R_p, the n-ary predicates for all relationships

There is an interpretation function I = ⟨D, ·^I⟩, where the domain D is a non-empty set, D = Ω ∪ B; Ω is the set of all entities, B is the set of all attribute values, and Ω ∩ B = ∅. Then

E_i^I ⊆ Ω
A_i^I ⊆ Ω × B
R_i^I ⊆ Ω × Ω × … × Ω = Ω^n

where the x_i are the specific instances and x_i ∈ Ω.

We can make at least three uncertainty assessments. For any specific entity tuple (x_1, …, x_n), we have a level of uncertainty as to whether that tuple is a member of a specific relationship. For a generic uncertainty measure u_T, the basic equation for whether a tuple is correctly associated with a defined relationship is

u_Tj((x_1, …, x_n)_j ∈ R_j | EB, S, I)    (1)

where EB is the body of evidence used in making the assignment, and S, I are any already known situation or impact states. A similar equation holds for attribute uncertainty.

We can also have uncertainty as to whether a relationship that we see in the data is the relationship of interest. Given a set of k possible relationships and a body of evidence EB for a particular relationship R_current, we can assess the following uncertainty:

u_Rk(R_current = R_k | EB, S, I)    (2)

Again, a similar uncertainty equation holds for attribute uncertainty.

Situation assessment depends on the relationships in the situation. A situation then can be defined as

S ≝ (R_1, …, R_k, A_1, …, A_n)    (3)

Finally, we have an uncertainty measure u_S. Given a set of m possible situations and a body of evidence EB for a particular situation S_current, we can assess the following uncertainty:

u_S(S_current = S_m | EB, I)    (4)

In addition to uncertainties in the evidence and in the reasoning process, equation (4) also allows us to account for uncertainties in the situation definition. Equation (3) implies that every situation can be precisely defined as a set of specific relationships and attributes. But what if a relationship or attribute is missing in a particular situation instance? For example, a canonical birthday celebration in the United States includes a cake with a number of lit candles on it. If there are no candles on the cake, does this mean it is not a birthday celebration?

B. Application to Situation Assessment Taxonomy

We can use this model to better understand the varying complexities of the different situation assessment cases given in section 3. For the simplest case, entity attribute refinement, we have a very simple situation ("emitter operational in the environment"). From the existence of one object (the Tin Shield radar), we are inferring the existence of a second object (the SA-10 SAM system). This is a binary relation, based on a Sowa juncture (x_1, x_2). With this binary relation, we are operating with a single instance of equation (1): we only have the uncertainty measure for "Tin Shield" and "SA-10" to be in juncture.

For the second case, entity selection, we again have a defined situation, but now are seeking a specific object among multiple choices of objects. We are operating at the level of equation (2): we are seeking the specific relation that ship i is the ship of interest. Based on the evidence, we will create multiple tuples for the different relationships that could lead us to the ship (using equation (1)), and then combine the results to get to equation (2).

For the third case, structure / situation selection, we invoke equation (4) as the basic equation. We are choosing among multiple alternatives as to what the situation is. We use equation (1) to determine whether various relationships exist and, based on those findings, determine which situation model is the correct one for this body of evidence.

For the fourth case, structure / situation refinement, we again use equations (1) and (4), but we also use equation (2) to determine what the exact set of relationships is. Case 4 differs from case 3 in that we are trying to determine which relationships are appropriate for this situation (or structure).

For the fifth case, structure / situation creation, we have all of the uncertainties addressed above, and we add an uncertainty not immediately obvious in the generic equations. Relook at equation (4): one of its stated requirements is that we are selecting among a set of defined situations. This is essentially a closed-world assumption. However, in case 5 we are building the situation, rather than determining which situation among a choice of situations is the applicable one. We still have a number of models to choose from, but they are more fragmentary than in the previous cases. The previous cases represent more of a "pieces of the puzzle" approach, where one is assembling the puzzle according to one or more available pictures as a guide. Case 5 represents the case where one is assembling the puzzle without a picture or set of pictures; rather, one is assembling the puzzle guided only by basic puzzle rules about matching shapes and picture colors. So, in case 5, we are also determining what the applicable S_k's are.
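As a toy illustration of measures (1), (2), and (4), the sketch below uses simple probability-like scores; the relationship names, situation names, and numeric values are invented for illustration, and selecting the maximum-scoring hypothesis is only one possible decision policy:

```python
# Toy illustration of the three uncertainty assessments, using invented scores.

# Equation (1): uncertainty that a specific tuple belongs to relationship R_j, given EB
u_tuple = {("ship_1", "ship_2"): 0.8}              # e.g., u_T((x1, x2) ∈ R_met | EB)

# Equation (2): uncertainty over k candidate relationships for the observed one
u_rel = {"met_at_sea": 0.7, "passed_nearby": 0.3}

# Equation (4): uncertainty over m candidate situations. Scoring a fixed,
# exhaustive candidate set is the closed-world assumption that breaks down
# in case 5 (situation creation).
u_situation = {"cargo_transfer": 0.6, "innocent_passage": 0.4}

def most_likely(scores):
    """Select the hypothesis with the highest uncertainty score."""
    return max(scores, key=scores.get)

# Closed-world check: the candidate situations exhaust the possibilities
assert abs(sum(u_situation.values()) - 1.0) < 1e-9

print(most_likely(u_situation))                    # prints cargo_transfer
```

In case 5 no such exhaustive candidate set exists, so the normalization checked by the assert above would not hold: probability mass must be reserved for situations not yet enumerated.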
VI. DISCUSSION

Up to this point we have identified a number of uncertainty propagation considerations that arise when analyzing a level 2 HLIF process. Most of these are not necessarily obvious at first glance, which suggests the importance of a framework that supports the analytical process. The framework proposed in this paper is meant to support the analysis of processes occurring at JDL fusion level 2, and an important aspect of it is the ability to correlate such processes with the uncertainty considerations raised so far. Figure 6 summarizes these considerations as they relate to the heart of the basic process model shown in Figure 5.

Figure 6: Level 2 HLIF Uncertainty Considerations

The taxonomy of level 2 HLIF types discussed in section 2 defines the complexity of the uncertainty considerations that must be accounted for. Five different types are identified, ranging from simple entity attribute refinement using situation status data to the development of a complete situation assessment assembled from applicable situational fragment data. The uncertainty in the input data / evidence must be assessed for relevance, credibility, and force / weight, per the ontology of evidence presented in Laskey et al. [17]. The representation uncertainties that drive the modeling methodologies can be classified per the uncertainty ontology developed by the W3C Incubator Group for Uncertainty Reasoning [18]. A variety of different models can be used to properly capture the aspects of uncertainty in the data [19, 20]. Finally, the output uncertainty strongly depends on the a priori identification of possible situation choices, or upon having a fusion process that allows for an effective open-world assumption. These uncertainty considerations are the beginning of understanding how to evaluate the effectiveness of various uncertainty management methods in high-level fusion.

REFERENCES

[1] D. L. Hall, J. Llinas, "Multi-Sensor Data Fusion," in Handbook of Multisensor Data Fusion: Theory and Practice (2nd ed.), CRC Press, pp. 1-14, 2009.
[2] D. A. Lambert, "A Blueprint for Higher Level Fusion Systems," Information Fusion, Elsevier, vol. 10, pp. 6-24, 2009.
[3] M. M. Kokar, C. J. Matheus, K. Baclawski, J. A. Letkowski, M. Hinman, J. Salerno, "Use Cases for Ontologies in Information Fusion," Proceedings of the 7th International Conference on Information Fusion (2004), retrieved from http://vistology.com/papers/Fusion04-UseCases.pdf on 1 Jun 2012.
[4] E. G. Little, G. L. Rogova, "Designing Ontologies for Higher Level Fusion," Information Fusion, Elsevier, vol. 10, pp. 70-82, 2009.
[5] P. C. G. Costa, Bayesian Semantics for the Semantic Web. Doctoral Thesis, School of Information Technology and Engineering, George Mason University, Fairfax, VA, USA, 2005.
[6] R. N. Carvalho, Probabilistic Ontology: Representation and Modeling Methodology. Doctoral Thesis, School of Information Technology and Engineering, George Mason University, Fairfax, VA, USA, 2011.
[7] Evaluation of Technologies for Uncertainty Reasoning Working Group (ETURWG) website, http://eturwg.c4i.gmu.edu/?q=aboutUs, retrieved May 19, 2012.
[8] P. C. G. Costa, K. B. Laskey, E. Blasch, A. Jousselme, "Towards Unbiased Evaluation of Uncertainty Reasoning: The URREF Ontology," Proceedings of the 15th International Conference on Information Fusion (to be published).
[9] A. N. Steinberg, C. L. Bowman, "Revisions to the JDL Data Fusion Model," in Handbook of Multisensor Data Fusion: Theory and Practice (2nd ed.), CRC Press, pp. 45-68, 2009.
[10] S. Das, High-Level Data Fusion, Boston, MA (USA): Artech House, 2008.
[11] J. Sowa, Knowledge Representation: Logical, Philosophical and Computational Foundations, Pacific Grove, CA (USA): Brooks/Cole, 2000.
[12] http://www.raytheon.com/capabilities/products/apg63_v3/
[13] http://www.p3orion.nl/variants.html
[14] Air Power Australia website, http://www.ausairpower.net/APA-Acquisition-GCI.html#mozTocId55304, as retrieved on June 2, 2012.
[15] D. A. Schum, The Evidential Foundations of Probabilistic Reasoning, New York: John Wiley and Sons, Inc., 1994.
[16] D. Schum, "Thoughts About a Science of Evidence," University College London Studies of Evidence Science, retrieved from 128.40.111.250/evidence/content/Science.doc on June 2, 2012.
[17] K. B. Laskey, D. A. Schum, P. C. G. Costa, T. Janssen, "Ontology of Evidence," Proceedings of the Third International Ontology for the Intelligence Community Conference (OIC 2008), December 3-4, 2008.
[18] K. J. Laskey, K. B. Laskey, P. C. G. Costa, M. M. Kokar, T. Martin, T. Lukasiewicz, Uncertainty Reasoning for the World Wide Web, W3C Incubator Group Report, 31 March 2008, retrieved from http://www.w3.org/2005/Incubator/urw3/XGR-urw3-20080331/
[19] P. Walley, "Measures of uncertainty in expert systems," Artificial Intelligence, 83(1), May 1996, pp. 1-58.
[20] B. Khaleghi, A. Khamis, F. O. Karray, "Multi-sensor Data Fusion: A Review of the State-of-the-Art," Information Fusion (2011), doi:10.1016/j.inffus.2011.08.001.
[21] K. B. Laskey, P. C. G. Costa, T. Janssen, "Probabilistic Ontologies for Multi-INT Fusion," in Proceedings of the 2010 Conference on Ontologies and Semantic Technologies for Intelligence, 2010.
[22] A. N. Steinberg, "Foundations of Situation and Threat Assessment," in Handbook of Multisensor Data Fusion: Theory and Practice (2nd ed.), CRC Press, pp. 437-502, 2009.
[23] R. G. Almond, Graphical Belief Modeling, New York, NY (USA): Chapman and Hall, 1995.
[24] P. P. Shenoy, "Valuation-Based Systems for Bayesian Decision Analysis," Operations Research, vol. 40, no. 3, pp. 463-484, May-June 1992.
[25] P. Svensson, "On Reliability and Trustworthiness of High-Level Fusion-Based Decision Support Systems: Basic Concepts and Possible Formal Methodologies," 9th International Conference on Information Fusion, Florence (Italy), 10-13 July 2006, retrieved from http://www.isif.org/fusion/proceedings/fusion06CD/Papers/51.pdf on May 6, 2012.
[26] A. Karlsson, Dependable and Generic High-Level Algorithms for Information Fusion - Methods and Algorithms for Uncertainty Management, Technical Report HS-IKI-TR-07-003, University of Skövde, retrieved 15 Sep 2011 from his.diva-portal.org/smash/get/diva2:2404/FULLTEXT01.
[27] E. Franconi, Description Logic Tutorial Course, downloaded from http://www.inf.unibz.it/~franconi/dl/course/ on 1 May 2012.