=Paper=
{{Paper
|id=Vol-1520/paper35
|storemode=property
|title=Opinionated Explanations of Recommendations from Product Reviews
|pdfUrl=https://ceur-ws.org/Vol-1520/paper35.pdf
|volume=Vol-1520
|dblpUrl=https://dblp.org/rec/conf/iccbr/Muhammad15
}}
==Opinionated Explanations of Recommendations from Product Reviews==
Khalil Muhammad
Insight Centre for Data Analytics,
University College Dublin, Belfield, Dublin 4, Ireland.
{firstname.lastname}@insight-centre.org
1 Introduction
Recommender systems are now mainstream and people are increasingly relying
on them to make decisions in situations where there are too many options to
choose from. Yet many recommender systems act like “black boxes”, providing
little or no transparency into the rationale of their recommendation process [1].
Related research in the field of recommender systems has focused on developing
and evaluating new algorithms that provide more accurate recommendations.
However, the most accurate recommender systems may not necessarily be those
that provide the most useful recommendations — due to the influence of how
recommendations are presented and justified to users [2–4]. Therefore, recom-
mender systems must be able to explain what they do and justify their actions
in terms that are understandable to the user. An explanation, in this context,
is any added information presented with recommendations to help users better
understand why and how a recommendation is made [5]. Studies show that ex-
planations help users make better decisions and are therefore provided for many
reasons [6, 7], which normally align with the objective of the recommender system. Interestingly, explanations may sometimes be provided by users themselves (rather than by the recommender system) to justify their choices [8].
The availability of user-generated reviews that contain real experiences pro-
vides a new opportunity for recommender systems; yet, existing methods for
explaining recommendations hardly take into account the implicit opinions that
people express in such reviews, even though studies show that users are increasingly relying on reviews to make better choices [9]. Also, explanations usually provide a post-hoc rationalisation for recommendations; this work, by contrast, is motivated by a more intimate connection between recommendations and explanations, which poses the question: can the recommendation process itself be guided
by structures generated to explain recommendations to users?
This work builds on existing research in the areas of case-based reasoning,
recommender systems and opinion mining to propose a novel approach for build-
ing explanations in recommender systems. We will also explore the potential of
opinionated explanations in driving the recommendation process.
Copyright © 2015 for this paper by its authors. Copying permitted for private and
academic purposes. In Proceedings of the ICCBR 2015 Workshops. Frankfurt, Germany.
2 Research Plan
The core focus of this work is to explore the role of opinions in explaining rec-
ommendations. Accordingly, we have identified the following areas of interest:
Ranking, filtering and evaluating feature quality. Feature-level opin-
ion mining algorithms that are capable of extracting very granular opinions,
such as [10], yield noisy features because they rely on shallow natural language
processing (NLP) techniques. Ultimately, these features lack context and are too fine-grained to be intuitive to users. For instance, it would be nonsensical to explain a hotel recommendation as “because visitors liked the wire...”, where ‘wire’ is a feature mined from reviews. Hence the research question: “how to rank, filter and evaluate features mined from reviews?”. We will use off-the-shelf opinion
mining techniques but focus on developing methods for ranking features so that
they can be filtered and evaluated for quality (i.e. the extent to which a feature
is relevant and presentable to users in explanations). This involves creating new
methods for summarising features so that only high-quality, comprehensible features are presented in explanations.
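As a concrete sketch of the kind of filtering involved, the toy function below keeps only features that are mentioned often enough and that carry an opinion (rather than a neutral mention) often enough. The thresholds and the toy data are illustrative assumptions, not the method used in this work.

```python
from collections import Counter

def filter_features(mentions, min_count=3, min_opinionated=0.5):
    """Keep features that are mentioned at least `min_count` times and
    whose mentions are opinionated (non-neutral) often enough."""
    counts = Counter(f for f, _ in mentions)
    opinionated = Counter(f for f, s in mentions if s != "neutral")
    return sorted(
        f for f, c in counts.items()
        if c >= min_count and opinionated[f] / c >= min_opinionated
    )

# Toy (feature, sentiment) pairs, as a shallow opinion miner might emit them.
mentions = [
    ("breakfast", "positive"), ("breakfast", "positive"),
    ("breakfast", "negative"), ("wire", "neutral"),
    ("staff", "positive"), ("staff", "positive"), ("staff", "neutral"),
]
print(filter_features(mentions))  # → ['breakfast', 'staff']
```

Here ‘wire’ is dropped as rare, while ‘breakfast’ and ‘staff’ survive as frequent and opinionated.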
Generating opinionated explanations. Explanations normally demon-
strate how one or more recommended items relate to a user’s preferences, nor-
mally through an intermediary entity such as a user, item, or feature. For in-
stance, Netflix may use the movies that a user has rated highly in the past to
explain a movie recommendation. Since user ratings are often unable to fully represent user preferences, there is a place for the fine-grained opinions that are
explicitly provided by users in textual reviews. We expect that explanations that
are based on opinionated reviews will be more natural and convincing. Hence the
research question: how can such opinions be used to generate explanations of product recommendations? We will use opinions from reviews to generate explanations that justify a particular recommendation or set of recommendations, and we will conduct live-user trials to test their usefulness in decision-making.
Driving recommendations using explanations. To date, most recom-
mender systems have treated explanations as an afterthought, presenting them
alongside recommendations, but with little connection to the recommendation
process itself. This work will explore the potential of using explanations to drive
the recommendation process itself so that, for example, an item will be recom-
mended because it can be explained in a compelling way. Hence the research
question: how to use explanations to support similarity metrics and ranking
strategies in a recommendation process?
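To make this idea concrete, one purely illustrative way to let explanations drive ranking is to score each candidate item by how compelling its explanation would be for the target user, e.g. by weighting the item’s per-feature sentiment by the user’s interest in each feature. The scoring function and data below are assumptions for the sketch, not a method proposed in this work.

```python
def explanation_strength(item_sentiment, user_interest):
    """Score an item by how compelling its explanation would be:
    per-feature sentiment (in [-1, 1]) weighted by the target user's
    interest in that feature; unmentioned features contribute nothing."""
    return sum(user_interest.get(f, 0.0) * s
               for f, s in item_sentiment.items())

def rank_by_explanation(candidates, user_interest):
    """Order candidate items so the most 'explainable' one comes first."""
    return sorted(candidates,
                  key=lambda it: explanation_strength(candidates[it], user_interest),
                  reverse=True)

user = {"breakfast": 0.9, "location": 0.6}
candidates = {
    "hotel_a": {"breakfast": 0.8, "location": -0.2, "pool": 0.9},
    "hotel_b": {"breakfast": 0.3, "location": 0.7},
}
print(rank_by_explanation(candidates, user))  # → ['hotel_b', 'hotel_a']
```

Note that hotel_a’s strong ‘pool’ sentiment is ignored because this user has shown no interest in pools, so hotel_b wins on the features that matter to them.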
3 Progress
To address the problem of feature quality, we used the approach in [10] to mine
opinions from a dataset of TripAdvisor hotel reviews. Then, using various lexical
and frequency-based filtering techniques, we removed noisy, less opinionated and
unpopular features. The remaining features were summarised into higher-level
representations by clustering them based on the words they co-occur with in
sentences of reviews. This feature representation allows us to replace a low-
level feature (e.g. ‘orange juice’) with a more meaningful higher-level one (e.g.
‘breakfast’) that is suitable for use in explanations.
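A minimal sketch of this co-occurrence idea, assuming single-token features and a simple Jaccard similarity over co-occurring words (our actual clustering may differ; the sentences are invented):

```python
from collections import Counter

def cooccurrence_profile(feature, sentences):
    """Bag of words that appear alongside `feature` in review sentences."""
    profile = Counter()
    for s in sentences:
        words = s.lower().split()
        if feature in words:
            profile.update(w for w in words if w != feature)
    return profile

def jaccard(p, q):
    a, b = set(p), set(q)
    return len(a & b) / len(a | b) if a | b else 0.0

def map_to_higher_level(low, high_levels, sentences, threshold=0.2):
    """Replace a low-level feature with the higher-level feature whose
    co-occurrence context is most similar, if it is similar enough."""
    lp = cooccurrence_profile(low, sentences)
    best = max(high_levels,
               key=lambda h: jaccard(lp, cooccurrence_profile(h, sentences)))
    if jaccard(lp, cooccurrence_profile(best, sentences)) >= threshold:
        return best
    return low

sentences = [
    "the juice at breakfast was fresh",
    "breakfast had fresh juice and warm bread",
    "the pool was clean and warm",
]
print(map_to_higher_level("juice", ["breakfast", "pool"], sentences))  # → breakfast
```

Because ‘juice’ shares most of its sentence context with ‘breakfast’ rather than ‘pool’, it is absorbed into the higher-level ‘breakfast’ feature.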
We developed a new method for generating personalized explanations which
highlight the pros and cons of a recommended item to a user. Our approach
focuses on the features that the user has mentioned in their reviews, and those
mentioned about the recommended item by other users. In the explanation, we
prioritize the features that are likely to be of interest to the user. Each feature is
classified as a pro or con based on its sentiment, and it is ranked by its popularity
with the user and the recommended item.
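The strategy above can be sketched as follows; the weighting scheme (a simple popularity product) and the input format are illustrative assumptions rather than our exact ranking function:

```python
def build_explanation(user_features, item_features, top_n=3):
    """Split an item's mined features into pros and cons by sentiment,
    ranking each list by a simple popularity product of how often the
    target user and the item's reviewers mention the feature.

    user_features: {feature: mention_count} from the target user's reviews
    item_features: {feature: (sentiment, mention_count)} for the item
    """
    pros, cons = [], []
    for f, (sentiment, item_count) in item_features.items():
        weight = (1 + user_features.get(f, 0)) * item_count
        (pros if sentiment > 0 else cons).append((weight, f))
    pros.sort(reverse=True)
    cons.sort(reverse=True)
    return ([f for _, f in pros[:top_n]], [f for _, f in cons[:top_n]])

user = {"breakfast": 4, "wifi": 2}
item = {"breakfast": (0.8, 10), "wifi": (-0.5, 6), "pool": (0.6, 3)}
print(build_explanation(user, item))  # → (['breakfast', 'pool'], ['wifi'])
```

Features the user has written about often (here, ‘breakfast’) float to the top of the pros or cons, while features only the item’s reviewers mention still appear with lower priority.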
We also developed another explanation strategy that explains a recommended
item in comparison with other recommendations. That is, the explanation presents features of the recommended item that are better or worse than those of its alternatives.
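A minimal sketch of this comparative strategy, assuming each item is summarised by per-feature sentiment scores in [-1, 1] (the data are invented):

```python
def compare_to_alternatives(item, alternatives):
    """For each feature of the recommended item, report whether its
    sentiment is better or worse than the average over the alternatives
    that also mention that feature."""
    better, worse = [], []
    for f, s in item.items():
        others = [alt[f] for alt in alternatives if f in alt]
        if not others:
            continue  # no basis for comparison
        avg = sum(others) / len(others)
        (better if s > avg else worse).append(f)
    return better, worse

rec = {"breakfast": 0.9, "location": 0.2, "wifi": -0.4}
alts = [{"breakfast": 0.5, "location": 0.6},
        {"breakfast": 0.4, "wifi": 0.1}]
print(compare_to_alternatives(rec, alts))  # → (['breakfast'], ['location', 'wifi'])
```

An explanation built this way can say, for example, “better breakfast than the alternatives, but a worse location and wifi”.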
Acknowledgments. This work is supported by the Insight Centre for Data
Analytics under grant number SFI/12/RC/2289.
References
1. Herlocker, J.L., Konstan, J.A., Riedl, J.: Explaining Collaborative Filtering Rec-
ommendations. In: Proceedings of the 2000 ACM Conference on Computer sup-
ported cooperative work, ACM (2000) 241–250
2. McNee, S.M., Riedl, J., Konstan, J.A.: Being Accurate is not Enough: How Accu-
racy Metrics Have Hurt Recommender Systems. In: CHI’06 extended abstracts on
Human factors in computing systems, ACM (2006) 1097–1101
3. Pu, P., Chen, L., Hu, R.: Evaluating recommender systems from the user's perspective: survey of the state of the art. User Modeling and User-Adapted Interaction
22(4-5) (2012) 317–355
4. Knijnenburg, B.P., Schmidt-Thieme, L., Bollen, D.G.: Workshop on User-centric
Evaluation of Recommender Systems and their Interfaces. In: Proceedings of the
fourth ACM conference on Recommender systems, ACM (2010) 383–384
5. Tintarev, N., Masthoff, J.: Effective Explanations of Recommendations: User-
Centered Design. In: Proceedings of the 2007 ACM conference on Recommender
systems, ACM (2007) 153–156
6. Tintarev, N., Masthoff, J.: Designing and Evaluating Explanations for Recom-
mender Systems. In: Recommender Systems Handbook. Springer (2011) 479–510
7. Zanker, M.: The Influence of Knowledgeable Explanations on Users’ Perception of
a Recommender System. In: Proceedings of the sixth ACM conference on Recom-
mender systems, ACM (2012) 269–272
8. Nunes, I., Miles, S., Luck, M., De Lucena, C.J.: Investigating Explanations to
Justify Choice. In: User Modeling, Adaptation, and Personalization. Springer
(2012) 212–224
9. Lee, J., Park, D.H., Han, I.: The Different Effects of Online Consumer Reviews on
Consumers’ Purchase Intentions Depending on Trust in Online Shopping Malls:
An Advertising Perspective. Internet research 21(2) (2011) 187–206
10. Dong, R., Schaal, M., O’Mahony, M.P., McCarthy, K., Smyth, B.: Opinionated
Product Recommendation. In: Case-Based Reasoning Research and Development.
Volume 7969. (2013) 44–58