=Paper=
{{Paper
|id=Vol-430/paper-8
|storemode=property
|title=Re-Consider: The Integration of Online Dispute Resolution and Decision Support Systems
|pdfUrl=https://ceur-ws.org/Vol-430/Paper8.pdf
|volume=Vol-430
|dblpUrl=https://dblp.org/rec/conf/odr/MueckeSM08
}}
==Re-Consider: The Integration of Online Dispute Resolution and Decision Support Systems==
Nial MUECKE 1, Andrew STRANIERI, Charlynn MILLER
Center for Informatics and Applied Optimisation, University of Ballarat, Australia
Abstract. Current approaches to the design of Online Dispute Resolution (ODR)
systems involve replicating Alternative Dispute Resolution practices such as
mediation and negotiation. Though such systems have proved popular, there are
concerns that they fail to take judicial practices into account. In this paper a
system that supports disputants' decision making during an online dispute is
advanced. The system, Re-Consider, is an Australian Family Law ODR system that
is based on judicial reasoning modelled with Bayesian belief networks and
provides disputants with decision support during the dispute. It is believed
that this approach offers disputants an online resolution process that will help
them reach outcomes that take judicial practices into account, and presents a
step toward a more deliberative form of online dispute resolution.
Keywords. Online Dispute Resolution, Bayesian Network, Decision Support,
Deliberation, Family Law, Knowledge Modelling
Introduction
Alternative Dispute Resolution (ADR) and Artificial Intelligence form the basis for
most protocols used by Online Dispute Resolution (ODR) systems to resolve
disputes [20]. These approaches range from the use of e-mail to facilitate
text-based negotiation or mediation, to more complex systems such as Family Winner,
which uses artificial intelligence to let users negotiate the allocation of marital
assets based on their ranked preferences [3].
Gramberg [7] identifies three concerns with regard to the use of ADR: outcomes
for both parties may not be similar to those achieved at court; a third-party
mediator or arbiter, if used, may not be as neutral or unbiased as a judge; and one
party may feel pressured into accepting an agreement. In particular, the neutrality
of the third party is perhaps the most important concern that critics of ADR express [7,15].
1 Corresponding author: Nial Muecke, Center for Informatics and Applied Optimisation,
University of Ballarat, Australia. E-mail: n.muecke@ballarat.edu.au
In an analysis of three case studies on the use of ADR in the workplace, Gramberg [7]
found that one of the most prevalent complaints was that the third party did not act
neutrally.
Others, such as Alexander [2], feel that dispute resolution techniques
such as negotiation and mediation can lead to unfair outcomes when a power imbalance
exists. For instance, in family law disputes it is not uncommon for the wife to concede
to the demands of the husband, particularly in cases where spousal abuse has taken
place. The practice of conceding to demands against one's own interests
has also been observed when one or both disputants are angered [6,11]. However,
Friedman et al [6] note that anger is not always a negative factor, because displaying
anger can help show that an issue is of particular importance.
These concerns about ADR/ODR that Gramberg [7] raised are not necessarily the primary
concerns of disputants, for whom litigation may not be practical or possible for
geographical, jurisdictional, financial or other reasons [1,16]. For many disputants,
ADR/ODR represents the only practical dispute resolution mechanism, and the notion
that the outcome might have been different had the matter been litigated is not pressing.
However, the existence of different dispute resolution mechanisms that generate
different outcomes challenges a modern state's expectation of procedural and
distributive justice. Numerous authors, including Ingleby [8] and Mack [10], have
concerns about the impact that ADR and ODR have on the judicial system. This leads
to the question: should disputants accept outcomes that are markedly different from
judicial decisions just because litigation is not a feasible option?
Arguably, the threat to principles of justice posed by small-scale eCommerce
disputes resolved through ODR is minimal. However, family law disputes are increasingly
resolved by ADR/ODR and often involve custodial arrangements for children and
valuable property. The way a dispute is resolved and the decisions reached affect
the disputants and their children for many years and, given the prevalence of divorce in
many countries, have the potential to undermine justice.
One way to mitigate the possibility of injustice due to divergent outcomes
between ODR and litigation is to design ODR systems that are guided by
judicial reasoning. Zeleznikow et al [19] and Zondag & Lodder [20] have proposed
different approaches to ODR systems that go beyond the typical
implementation of ADR ideas. However, few approaches are directly aimed at
influencing disputants to arrive at outcomes in line with those a judge would reach.
The Re-Consider system is a hybrid ODR / Decision Support System (DSS)
that uses a model of judicial reasoning to structure how a percentage split of
assets in a divorce dispute should be calculated. Re-Consider aims to
realise the ADR practice of bargaining in the shadow of the law, whereby the most
likely litigated outcome is used as a tool to guide and motivate the
litigants during the dispute resolution process [19].
Re-Consider differs from traditional ODR systems in that it aims to focus the dispute
on the issues that judges consider when hearing a case. The approach
involves guiding disputants to discuss only those factors that a judge would consider in
making a decision. This differs from other ODR approaches that are based on creating
environments that facilitate negotiation or mediation. As a result, Re-Consider
is not an online court, nor is it a negotiation or mediation system, as it does
not directly provide users with the ability to barter and make trade-offs in order to
resolve a dispute. Nor is Re-Consider an arbitration system: although the system does
make predictions of judicial findings, these are used to advise and persuade users in
their decision making; it does not mandate that users abide by the
recommendations. Instead, the system focuses on deliberation as described by
Walton & Krabbe [17], where users are encouraged to adopt deliberative stances
toward mutually beneficial outcomes.
The ODR protocol described here has been designed with several themes in
mind: to restrict the scope of the dispute to only those factors that judges would
consider; to make no distinctions based on gender or moral differences; and to create an
environment where users are encouraged to think about their beliefs. The motivation is
that users restricted in this way are prevented from introducing factors into
the dispute that are unlikely to be relevant, or conducive to a resolution of the
dispute consistent with a judicial decision. It is believed that restricting the
scope of the argument to judicial factors alone will reduce the potential for the argument
to turn into what Walton & Krabbe [17] describe as an eristic argument, motivated
mainly by emotional needs. The potential for an argument to become eristic
increases when one or both disputants are provoked and become angered [6,11].
Friedman et al [6] found that in such situations there was a high risk that the dispute
could stagnate, which in turn increased the likelihood that the dispute would not be
resolved.
It is believed that providing users with predictions of what would likely occur,
were the case to be heard in court, will encourage the disputants
to consider their positions more carefully. The hypothesis is that the disputants will
change their position to one more closely aligned with the judicial advice.
In the next section the model of judicial reasoning embedded in Re-Consider is
described. Following that, the protocol the system deploys to guide the discussion
is presented, before concluding remarks.
1 Model of Judicial Reasoning
Judicial reasoning aimed at determining a percentage split of assets between husband
and wife was modelled with the assistance of an Australian Family Law expert,
Andrew Combes. A hierarchy of factors called an argument tree, following [13,14,18],
was elicited. The argument tree extracts the claim and data items from a series of
interconnected arguments, where each argument takes the form described in Figure 1.
Each level of the tree represents a sub-argument, or factor, of the level above. The top
of the tree is the root node, which represents the division of marital assets.
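The argument tree lends itself to a simple recursive data structure. The following Python fragment is an illustrative sketch only; the names (ArgumentNode, is_leaf) are invented here and are not taken from Re-Consider's implementation.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ArgumentNode:
    """One factor in the hierarchy; each level is a sub-argument of its parent."""
    name: str
    children: List["ArgumentNode"] = field(default_factory=list)

    def is_leaf(self) -> bool:
        # Leaf arguments are the points on which disputants assert raw claims.
        return not self.children

# The root represents the division of marital assets; its children are the
# three factors the expert considered most important.
root = ArgumentNode("division of marital assets", [
    ArgumentNode("contributions to the marriage"),
    ArgumentNode("future needs of both parties"),
    ArgumentNode("level of wealth of the marriage"),
])
```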
Figure 1. GAAM Family Law example
The three factors the expert considered most important in determining the
division of marital assets are children of the root node, as illustrated in Figure 2:
the contributions made to the marriage, the future needs of both parties, and the level
of wealth of the marriage. Below each of these three factors are their important
sub-factors, and so on. For instance, a judge determining the length of a marriage will
consider the number of years of marriage, time apart during cohabitation, and the number
of years of cohabitation prior to marriage. A judge combines these factors to reach
what he or she believes to be a judgment about the length of the marriage. The way in
which the factors are combined was modelled using a Bayesian belief network at each
level of the tree, as described in Section 3.
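The combination of sub-factors at one level can be sketched with a hand-written conditional probability table standing in for the Bayesian belief network. The states and probabilities below are invented for illustration; they are not the elicited model.

```python
# Invented conditional probability table for the marriage-length example:
# (years married, time apart during cohabitation, prior cohabitation)
# -> distribution over the parent judgment. Values are illustrative only.
cpt_marriage_length = {
    ("many", "none", "long"):  {"Short": 0.05, "Average": 0.25, "Long": 0.70},
    ("many", "some", "none"):  {"Short": 0.10, "Average": 0.55, "Long": 0.35},
    ("few",  "some", "none"):  {"Short": 0.75, "Average": 0.20, "Long": 0.05},
}

def infer_length(years, apart, prior):
    """Return the most probable parent claim and the full distribution."""
    dist = cpt_marriage_length[(years, apart, prior)]
    return max(dist, key=dist.get), dist

claim, dist = infer_length("many", "some", "none")  # -> "Average"
```

Keeping the full distribution, rather than only the most probable claim, is what later lets the system report how likely a judge is to accept each possible claim.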
On commencement, the protocol presents users with the child factors of the root
argument (see Figure 2). They are not presented with the root argument itself, to avoid
focusing the user's attention on obtaining a particular result (e.g. a 60/40 split)
rather than working towards a mutually beneficial solution. Disputants are also not
presented with any sub-factors of the displayed factors, because sub-arguments are
revealed only as required, in order to restrict the dispute to only those branches
relevant to the current dispute.
Figure 2. Illustration of active arguments
Users are required to assert their beliefs about each of the root argument's
sub-arguments. Once both have made their assertions, the system infers the parent
argument from each disputant's assertions and then compares the outcomes. There are
four possible outcomes of the comparison:
Agreed: the assertions made by both users are the same. For example, both
users agree that the marriage was not wealthy, their combined income was
average, their assets were valued at $9,000,000 and their debt was
$7,500,000.
Potentially Agreed: the inferred argument is the same, but the assertions
made on the sub-factors used to make that inference are not all the same. For
example, your spouse asserts that he/she contributed more to the marriage and
has similar needs, whereas you think he/she contributed equally but your needs
are greater. In this case, the Bayesian network associated with the parent argument,
the percentage split of assets, infers the same percentage split regardless of
whose claims are accepted as true.
Disputed: the parties disagree on a factor and on its sub-factors and,
unlike the Potentially Agreed case, the Bayesian network infers a different conclusion
at the parent factor. For example, your spouse asserts that he/she contributed
more and you assert that you contributed much more. This yields a different
outcome for the parent argument when applied to the Bayesian network, so both
parties are provided with the sub-arguments of the contributions argument.
Disagreed: neither user can find agreement on a given branch of the
argument tree. This occurs when all sub-factors, including leaf nodes, have been
explored and the users still do not reach agreement on a claim.
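The four-way comparison can be summarised in a short sketch. Here infer stands in for the Bayesian network mapping a disputant's sub-assertions to a parent claim; all names and example values are illustrative, not drawn from the system.

```python
def compare(assertions_a, assertions_b, infer, at_leaves=False):
    """Classify the comparison of two disputants' sub-assertions."""
    if assertions_a == assertions_b:
        return "Agreed"
    if infer(assertions_a) == infer(assertions_b):
        return "Potentially Agreed"   # same parent claim either way
    # Parent inferences differ: expand sub-arguments, unless the leaf
    # nodes have already been reached and nothing is left to explore.
    return "Disagreed" if at_leaves else "Disputed"

# Toy stand-in for the network: any "more"/"greater" assertion shifts the split.
infer = lambda a: "60/40" if ("more" in a or "greater" in a) else "50/50"

# Spouse: contributed more, similar needs; you: equal contribution, greater
# needs -> different assertions, same inferred split: Potentially Agreed.
outcome = compare(("more", "similar"), ("equal", "greater"), infer)
```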
Figure 3. Illustration of active arguments with agreed and disagreed arguments highlighted
If the inference from the initial arguments results in agreement or potential
agreement, the protocol calls for the users to be informed that they have potentially
resolved their dispute, and they are shown the outcome inferred by the Bayesian
network. If both users accept the recommended outcome, the dispute is considered
resolved. However, if either user rejects the outcome, or the inference resulted in a
dispute, the users' assertions are compared to each other. Where an argument is in
agreement, that branch of the argument tree is not displayed and remains hidden from
the users unless one or both of them decides to change their initial assertion. Where
there is disagreement, the sub-arguments of the disputed argument are revealed to the
users, who are then required to make assertions on those sub-arguments (see Figure 3).
Once the users have made assertions on the newly revealed arguments, the
process is repeated. The disputed argument's sub-arguments are used to infer that
argument. If the argument is agreed or potentially agreed upon, there is no longer
any need to explore that branch. If there is a dispute, the assertions are compared
and, for those sub-arguments that are disputed, their branches are revealed for the
users to make assertions on.
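A minimal sketch of this compare-and-reveal cycle, using a nested dict as a stand-in for the argument tree; all argument names and claims here are invented for illustration.

```python
# Stand-in argument tree: argument name -> its sub-arguments.
tree = {
    "contributions": {"homemaker role": {}, "financial contribution": {}},
    "future needs": {},
}

def branches_to_reveal(tree, claims_a, claims_b):
    """Walk the tree top-down, collecting the sub-arguments the disputants
    must assert on next: the children of any argument claimed differently."""
    to_reveal = []
    for name, children in tree.items():
        if claims_a.get(name) != claims_b.get(name):
            to_reveal.extend(children)  # reveal this argument's sub-arguments
            to_reveal += branches_to_reveal(children, claims_a, claims_b)
    return to_reveal

revealed = branches_to_reveal(
    tree,
    {"contributions": "more", "future needs": "similar"},
    {"contributions": "equal", "future needs": "similar"},
)  # -> ["homemaker role", "financial contribution"]
```

Agreed branches ("future needs" above) stay hidden, mirroring the protocol's restriction of the dispute to contested branches only.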
This process of comparing arguments continues (see Figure 4) until one of two
things occurs: either a leaf argument of the dispute is reached, or all the arguments for
that branch of the dispute are agreed upon. If all the arguments are agreed upon, then
both users are able to infer forward to discover what a judge is likely to find given the
agreed sub-arguments. For each of the possible claims that could be asserted for the
inferred argument, a probability is assigned. These probabilities represent the
likelihood that a judge will find that claim to be correct, given the sub-arguments
accepted as true. In the event that the arguments used to make the inference were
all agreed upon, the probability for both users would be the same.
Figure 4. Illustration of active arguments, with the sub-arguments explored and the agreed, disagreed and
potentially agreed arguments highlighted
In the event that the leaf arguments are reached for a given part of the argument
tree and the users disagree on the leaf assertions, the system accepts both of
their assertions. To do otherwise would be to have the system determine a finding of
fact; findings of fact are discussed below. Instead, the protocol infers from both sets
of assertions and compares the results to see whether they make a difference to the
parent argument. If the inferred parent outcome is potentially agreed, the difference
at the leaf nodes is ignored. If a potential agreement is not found, the users are
required to reconsider their position, based on the probability that a judge will agree
with their assertion, and the argument is flagged as disagreed. From this point the
users are able to infer their way back down the tree to see if they can resolve their
difference at a lower level. At each level, users are prompted to compare their
position to that of the other user and to the judicial advice.
2 Findings of fact
A judge performs two main tasks: deciding what the facts of a case are, and
applying the law to those facts. Although findings of fact are beyond automated
systems, measures can be taken to reduce the likelihood that one or both disputants
will misrepresent the facts.
The simplest method considered for the Re-Consider system was to require the
disputants, when making a claim on a leaf argument, to state how strongly they
believed a judge would agree with them. In this way, doubt could be introduced into
the argument, which is likely to have an impact when inferences are then made on the
parent argument and may cause the disputants to reconsider their position. However,
it was decided that such an approach was unlikely to have any significant impact on
the outcomes of the dispute, since any disputant who asserted a claim they did not
mostly or wholly believe was also unlikely to admit it.
An alternative that avoids relying on the disputants to assert their own degree of
belief, but still incorporates doubt into the inference of the parent argument, is to
ask the disputants to list what evidence they have to support their leaf assertions.
The disputants would be provided with a list of options covering a range of evidence
types. Each type of evidence, alone or combined, would be used to calculate the level
of belief that the claim was correct, which would then be fed into the inference of
the parent argument. Depending on the argument, the different types of evidence would
provide different levels of support for the claim, with some potentially providing no
support if they were not generally seen to support it. This approach is, however,
susceptible to well-informed disputants who could make misleading claims and support
them with false claims of credible evidence. To prevent this, the system would need
to provide a method for the disputants to evaluate each other's evidence. This in
turn is open to the disputants unintentionally or deliberately not accepting the
evidence presented to them. To counter this, the system would require a process
whereby a neutral third party evaluates the evidence, which raises the question:
why not just go to court?
The end result is that any attempt to make a finding of fact is likely to require
either that the disputants be honest or that a third party intervene on their behalf.
The Re-Consider system deals with divergent leaf assertions by testing whether any
difference in the claims has an impact on the outcome of the parent argument. When
there is a significant difference between the disputants' claims, the system uses both
claims to generate the probabilities that a judge would agree with their own claim,
with the claim of the other party, and with the claim that the system believes a judge
would most likely find. The users are then provided with these probabilities to assist
them in deciding how best to proceed.
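The probabilities offered to the disputants at a divergent leaf can be sketched as follows, using invented figures of the kind shown in Table 1 (Section 3); none of the names or values below come from the actual system.

```python
# Invented distribution over claims for one argument, of the kind a
# Bayesian network would produce (compare Table 1 in Section 3).
judge_dist = {"Very Long": 0.01, "Long": 0.28, "Average": 0.55,
              "Short": 0.159, "Very Short": 0.001}

def advise(your_claim, their_claim, dist):
    """Return the probabilities that a judge agrees with each side's claim,
    and the claim the system believes a judge would most likely find."""
    judge_claim = max(dist, key=dist.get)
    return {"your claim": dist[your_claim],
            "their claim": dist[their_claim],
            "judge's likely claim": (judge_claim, dist[judge_claim])}

advice = advise("Long", "Short", judge_dist)
```

Presenting all three probabilities, rather than a single verdict, leaves the decision with the disputants while signalling which claims a judge is unlikely to accept.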
3 Representing uncertainty in predictions
Bayesian belief networks (BBNs) are used to model a judge's reasoning in order to
predict an outcome [4]. BBNs were chosen over other prediction methods because of
their ability to provide a level of certainty for every possible outcome, rather than
solely predicting a single outcome. For example, a decision tree or neural network that
predicts an outcome of a 50:50 split at the root argument cannot present the likelihood
of a 40:60 split.
An example of how uncertainty in the prediction is represented is illustrated in
Table 1. The table shows the assertions made by the two disputants (Yours and Theirs)
as well as that made by the judge (Judge's), together with the possible claims and the
probability of a judge accepting each claim, as determined by the Bayesian network.
From Table 1 it is apparent that a claim of Very Long or Very Short would not likely
be believed by a judge and would thus not be a wise claim to assert. Their claim of
Short has a low likelihood of being accepted; this should indicate to the disputants
that it would be unwise to press this claim, as both Your claim and the Judge's claim
are more likely to be believed. Your claim of Long is not a particularly strong claim,
but it is stronger than Their claim. Lastly, the Judge's claim of Average is much
stronger than any of the others; it is believed that this will help motivate the
disputants towards the system's findings.
Table 1. An example of predictions for the Marriage Length argument

Assertion   Claim        Estimated Outcome
            Very Long     1.00%
Yours       Long         28.00%
Judge's     Average      55.00%
Theirs      Short        15.90%
            Very Short    0.10%
Fisher & Ury's Best Alternative To a Negotiated Agreement (BATNA) [5] is used
by disputants to determine whether to accept or reject an offer. A prediction of a
judge's determination on the assertions that lead to an ultimate outcome indirectly
provides a BATNA. The likelihood that a judge will agree with a disputant's assertions
helps the user to advance claims about specific aspects of the dispute and, ultimately,
to engage in ODR using reasoning similar to that a judge would deploy.
A key element in the users' acceptance of the system is the extent to which the
disputants accept the advice it gives, as it is vital that the participants trust
the ODR system. Trust has been identified as one of the key factors influencing the
success of ADR and ODR [12]. Korobkin [9] reports that most disputants do not trust
ADR and fear unfair outcomes; it is likely that such misgivings are also shared about
ODR. Tyler [16] also notes the need for trust in ODR systems for them to be
successful, but reports that about 70% of people are prepared to use ODR systems.
4 Conclusion
This paper has described an approach to ODR that is unlike conventional approaches:
the aim is to structure the dispute around the factors that judges consider and to
provide decision support to the disputants in the form of predictions of judicial
decision making. It is hypothesised that this will promote an environment in which
the disputants are more receptive to changing their position to one more in line with
legal practice.
The Re-Consider system has been described as an ODR system that implements an
integrated DSS and ODR protocol. It achieves this by integrating Bayesian belief
networks into an argument tree model of legal reasoning. The resulting system allows
disputants to navigate legal reasoning and provides estimates of the likely outcome at
each level of the argument. Knowledge of the likelihood that disputants' assertions
will be accepted at court may provide a real incentive to withdraw extreme claims and,
ultimately, to deliberate on the issues to achieve outcomes more consistent with
judicial decisions. However, Re-Consider is a prototype ODR system and cannot make
the findings of fact that are associated with judicial decision making. Instead, the
ODR protocol attempts to catch disagreement on leaf claims by examining the impact
that the differences have on the ultimate outcome.
Acknowledgements
Andrew B.J. Combes, Barrister, Owen Dixon Chambers, Melbourne, Australia.
References
[1] Steve Abernethy. Building large-scale online dispute resolution & trustmark systems. In UNECE
Forum on ODR, 2003.
[2] R Alexander. Mediation, violence and the family. Alternative Law Journal, 17(6):276-99, 1992.
[3] Emilia Bellucci and John Zeleznikow. Developing negotiation decision support systems that support
mediators: A case study of the family winner system. Artificial Intelligence and Law, 13:233 - 271,
2005.
[4] Eugene Charniak. Bayesian networks without tears: making Bayesian networks more accessible to the
probabilistically unsophisticated. AI Magazine, 12(4):50-63, 1991.
[5] Roger Fisher and William Ury. Getting to Yes: Negotiating an Agreement Without Giving In. Business
Books Limited, 1991.
[6] Ray Friedman, Cameron Anderson, Jeanne Brett, Mara Olekalns, Nathan Goates, and Cara Cherry
Lisco. The positive and negative effects of anger on dispute resolution: Evidence from electronically
mediated disputes, Journal of Applied Psychology, 89(2):369-376, 2004.
[7] Bernadine Van Gramberg. ADR and workplace justice: Just settlement? Working Paper, 2007.
[8] Richard Ingleby. Court sponsored mediation: The case against mandatory participation. The Modern
Law Review, 56(3):441-451, 1993.
[9] Russell B Korobkin. Psychological impediments to mediation success: Theory and practice. Law-Econ
Research Paper 05-9, UCLA School of Law, March 2005.
[10] Kathy Mack. Court referral to ADR: Criteria and research. Technical report, Law School, Flinders
University, for the National ADR Advisory Council and Australian Institute of Judicial
Administration, 2003. ISBN 1-875527-45-1.
[11] Marta Poblet and Pompeu Casanovas. Emotions in ODR. International Review of Law, Computers &
Technology, 21(2):145 -156, 2007.
[12] Colin Rule and Larry Friedberg. The appropriate role of dispute resolution in building trust online.
Artificial Intelligence and Law, 13(2):193-205, June 2005.
[13] A. Stranieri, J. Yearwood, and J. Zeleznikow. Argumentation structures that integrate dialectical and
non-dialectical reasoning. The Knowledge Engineering Review, 16(4):331-348, Aug 2002.
[14] Andrew Stranieri and John Zeleznikow. Knowledge Discovery from Legal Databases. Law and
Philosophy Series. Kluwer Academic Publishers, 2005.
[15] Ernest Thiessen and John Zeleznikow. Technical aspects of online dispute resolution challenges and
opportunities. In Melissa Conley Tyler, Ethan Katsh, and Daewon Choi, editors, Proceedings of the
Third Annual Forum on Online Dispute Resolution, 2004.
[16] Melissa Conley Tyler. 115 and counting: The state of ODR 2004. In Melissa Conley Tyler, Ethan
Katsh, and Daewon Choi, editors, Proceedings of the Third Annual Forum on Online Dispute
Resolution, 2004.
[17] Douglas N. Walton and Eric C. W. Krabbe. Commitment in Dialogue: Basic Concepts of
Interpersonal Reasoning. State University of New York Press, Albany, New York, USA, 1995.
[18] John Yearwood and Andrew Stranieri. The generic/actual argument model of practical reasoning.
Decision Support Systems, 41(2):358-379, Jan 2006.
[19] John Zeleznikow, Emilia Bellucci, Uri J. Schild, and Geraldine Mackenzie. Bargaining in the shadow
of the law - using utility functions to support legal negotiation. In ICAIL '07: Proceedings of the 11th
International Conference on Artificial Intelligence and Law, pages 237-246, New York, NY, USA, 2007.
ACM.
[20] Berry Zondag and Arno R. Lodder. Constructing computer assisted dispute resolution systems by
developing a generic language to analyse information exchange in conflict discourse. International
Review of Law, Computers & Technology, 21(2):191-205, July 2007.