Simplifying Privacy Decisions: Towards Interactive and Adaptive Solutions

Bart P. Knijnenburg
Donald Bren School of Information and Computer Sciences
University of California, Irvine
bart.k@uci.edu

1. INTRODUCTION
Privacy concerns are an important barrier to the growth of social networks, e-commerce, ubiquitous computing, and location-sharing services. The large majority of Internet users take a pragmatic stance on information disclosure: they trade off the anticipated benefits against the risks of disclosure, a decision process that has been dubbed the privacy calculus [10,23]. Privacy decisions are inherently difficult, though, because their repercussions are delayed and uncertain, and therefore hard to trade off against the possible immediate gratification of disclosure [3,5].

How can we help users balance the benefits and risks of information disclosure in a user-friendly manner, so that they can make good privacy decisions? Existing research has explored two approaches to this problem, but neither provides a satisfying solution. Below I discuss these two approaches and introduce a new user-tailored approach that provides more user-friendly privacy decision support.

2. TRANSPARENCY AND CONTROL
To help users with their privacy calculus, experts recommend giving them comprehensive control over what data they wish to share, and more transparency about the implications of their decisions [1,22]. However, while users claim to want full control over their data, they avoid the hassle of actually exploiting this control [8]. Moreover, the privacy controls of systems like Facebook are so complex that users do not even seem to know the implications of their own settings [25]. Similarly, informing users about the rationale behind information requests does not make them more discerning about their privacy decisions, but merely makes them worry about privacy in general. For example, displaying a privacy label on an e-commerce website—a supposed vote of confidence—may decrease rather than increase purchases [7].

Evidently, transparency and control do not work well in practice. Due to the complexity of privacy decisions and users' bounded rationality [2,3], an increase in transparency and control often just aggravates the problem by introducing choice overload [12,27] and information overload [11].

3. PRIVACY NUDGES
An alternative approach to supporting privacy decisions is to introduce subtle yet persuasive nudges. Carefully designed nudges make it easier for people to make the right choice, without limiting their ability to choose freely [29]. A justification, for example, makes it easier to rationalize decisions and to minimize the regret associated with choosing the wrong option [9]. The effect of justifications in privacy research seems to vary: in my own research I have found that justifications are regarded as helpful, yet they decrease rather than increase users' disclosure and satisfaction [18,19]. Sensible defaults are another type of nudge that strongly impacts disclosure [4,14,19]; examples are framing a disclosure decision as either opt-in or opt-out, or changing the order of information requests.

The problem with nudges is that they take a one-size-fits-all approach to privacy: they assume that the "true cost" [13] of disclosure is roughly the same for every user, piece of information, and situation. But privacy decisions are highly user- and context-dependent: the fact that one person has no problem disclosing a certain item in a particular context does not mean that disclosure is equally likely for a different person, a different item, or a different context [16,24]. Likewise, what is a convincing justification to disclose a certain item in a particular context for a certain person may be a completely irrelevant reason for a different person, a different item, or a different context [6,21]. What we need, then, is personalized privacy decision support.

4. EXPLICATING PRIVACY
The first step towards personalized privacy decision support is to explicate the privacy calculus: to move beyond a mere description towards a deeper understanding of people's cognitive decision-making process. What kinds of benefits and threats do users consider when making disclosure decisions? What is the relative weight of each of these aspects? Can the weights be influenced by a justification or a default, and if so, in what context(s)? More research is needed to answer these questions.

For example, I showed in [19] that the effect of justifications on information disclosure decisions is mediated by users' perceptions of help, trust, and self-anticipated satisfaction with the system. In [17], I demonstrated that the effect of decision context (i.e., the available options) in a location-sharing service depends on users' perception of the privacy and benefits of the available options. Finally, in [20] we show that perceived risk and perceived relevance mediate users' evaluation of the purpose-specificity of information disclosure requests.
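To make this mediation logic concrete, the sketch below estimates a single indirect effect via the product-of-coefficients approach with a bootstrapped confidence interval, run on simulated data. The variable names (justification, perceived_help, disclosure) and effect sizes are hypothetical; the studies above employ structural equation models with multiple mediators, so this illustrates only the underlying idea, not the actual analyses.

```python
# Minimal single-mediator sketch (product-of-coefficients with a
# bootstrapped CI) on simulated data. All names and effect sizes are
# hypothetical; they do not come from the cited studies.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
justification = rng.integers(0, 2, n)                      # 0 = none, 1 = shown
perceived_help = 0.5 * justification + rng.normal(size=n)  # hypothesized mediator
disclosure = 0.6 * perceived_help + 0.2 * justification + rng.normal(size=n)

def indirect_effect(x, m, y):
    """a*b: effect of x on m, times effect of m on y controlling for x."""
    a = sm.OLS(m, sm.add_constant(x)).fit().params[1]
    b = sm.OLS(y, sm.add_constant(np.column_stack([x, m]))).fit().params[2]
    return a * b

# Bootstrap the indirect effect to obtain a 95% confidence interval.
boot = []
for _ in range(2000):
    i = rng.integers(0, n, n)  # resample participants with replacement
    boot.append(indirect_effect(justification[i], perceived_help[i], disclosure[i]))

print("indirect effect:", indirect_effect(justification, perceived_help, disclosure))
print("95% CI:", np.percentile(boot, [2.5, 97.5]))
```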
5. CONTEXTUALIZING PRIVACY
The second step towards personalized privacy decision support is to contextualize the privacy calculus: to determine how stable information disclosure is across people, items, and situations, and, importantly, where it is context-dependent.

For example, my research shows that although justifications generally do not increase disclosure or satisfaction, tailoring justifications to the user can reduce this negative effect [21]. Such tailored justifications are personalized privacy nudges: they intelligently choose the right justification for the respective user, or decide not to show any justification at all.

Similarly, personalized defaults can be set up in a way that anticipates people's disclosure behavior, thereby making disclosure decisions easier and more convenient. My work and that of others shows that even though privacy preferences vary considerably across users, distinct subgroups of users with similar privacy preferences can be identified in many domains [16,26]. Moreover, these subgroups can be mapped to demographics (e.g., age) and other behaviors (e.g., mobile Internet usage). My recent work shows that these personalized defaults may also be tailored to the website requesting the information: in [20] we show that people are more likely to disclose information that matches the purpose of the website requesting it.
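To illustrate how such subgroups might be identified, the sketch below clusters a simulated 0/1 disclosure matrix with k-means. The item list, the archetype profiles, and the choice of k-means (rather than the mixture-model style analyses used in the work above) are assumptions made for the sake of a compact example.

```python
# Sketch: finding subgroups of users with similar disclosure behavior.
# k-means over a simulated 0/1 disclosure matrix is a stand-in for the
# latent-class analyses used in practice; all data here is synthetic.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
items = ["name", "email", "location", "income", "health"]

# Simulate three archetypes: open sharers, basics-only, and minimal sharers.
profiles = np.array([[0.9, 0.9, 0.8, 0.7, 0.6],
                     [0.9, 0.8, 0.2, 0.1, 0.1],
                     [0.5, 0.2, 0.1, 0.0, 0.0]])
membership = rng.integers(0, 3, 300)
disclosure = (rng.random((300, 5)) < profiles[membership]).astype(int)

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(disclosure)
for k, center in enumerate(kmeans.cluster_centers_):
    rates = ", ".join(f"{i}={r:.2f}" for i, r in zip(items, center))
    print(f"subgroup {k}: {rates}")  # per-item disclosure rate per subgroup
```

Once such subgroups are recovered, a new user's demographics or other behaviors can be used to guess their subgroup membership, and hence their likely disclosure preferences.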
Finally, in [17] I show that privacy decisions are influenced by the available options to choose from ("context effects", cf. [15,28,30]). In that study, users of a location-sharing service decided whether to share their location with friends, colleagues, and third-party applications, with the following options: no sharing, city, city block, or exact location. We manipulated the availability of the "city" and "exact location" options, and showed that their absence or presence had a strong impact on how many users would choose each of the other available options.

6. PRIVACY ADAPTATION PROCEDURE
The ultimate purpose of this contextualized and explicated understanding of users' privacy calculus is to develop a Privacy Adaptation Procedure to support people's privacy decisions. Using recommender system algorithms, the procedure predicts users' privacy preferences based on their known characteristics. It then provides automatic "smart default" settings in line with users' disclosure profiles. Smart defaults reduce the burden of control, but at the same time respect users' inherent privacy preferences. Similarly, the procedure provides tailored disclosure justifications, but only to users who can be expected to react rationally to them, so that they will not cause privacy scares in other users.
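A minimal sketch of this prediction step follows. It uses a k-nearest-neighbor vote over the disclosure decisions of users with similar known characteristics as a placeholder for a full recommender system algorithm; the feature set, neighborhood size, and decision threshold are all illustrative assumptions rather than parameters from my studies.

```python
# Sketch of the "smart default" step: predict a new user's likely
# disclosure decisions from users with similar characteristics, then
# pre-set the defaults accordingly. A k-NN vote stands in for a full
# recommender algorithm; all data and parameters are illustrative.
import numpy as np

def smart_defaults(known_users, known_decisions, new_user, k=5, threshold=0.5):
    """known_users: (n, d) characteristics (e.g. age, mobile usage);
    known_decisions: (n, m) 0/1 disclosure decisions per item;
    new_user: (d,) characteristics of the user to predict for."""
    dists = np.linalg.norm(known_users - new_user, axis=1)
    nearest = np.argsort(dists)[:k]                     # k most similar users
    predicted = known_decisions[nearest].mean(axis=0)   # disclosure likelihood
    return predicted >= threshold                       # default: share or not

rng = np.random.default_rng(2)
users = rng.random((200, 3))                  # normalized user characteristics
decisions = (rng.random((200, 4)) < 0.5).astype(int)
print(smart_defaults(users, decisions, rng.random(3)))
```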
This Privacy Adaptation Procedure relieves some of the burden of the privacy decision from the user by providing an amount of information and control that is useful but not overwhelming or misleading. It thus enables users to make privacy-related decisions within the limits of their bounded rationality.

7. REFERENCES
1. Acquisti, A. and Gross, R. Imagined Communities: Awareness, Information Sharing, and Privacy on the Facebook. In Privacy Enhancing Technologies. 2006, 36–58.
2. Acquisti, A. and Grossklags, J. Privacy and Rationality in Individual Decision Making. IEEE Security & Privacy 3, 1 (2005), 26–33.
3. Acquisti, A. and Grossklags, J. What Can Behavioral Economics Teach Us About Privacy? In A. Acquisti et al., eds., Digital Privacy. Taylor & Francis, 2008, 363–377.
4. Acquisti, A., John, L.K., and Loewenstein, G. The Impact of Relative Standards on the Propensity to Disclose. Journal of Marketing Research 49, 2 (2012), 160–174.
5. Acquisti, A. Privacy in Electronic Commerce and the Economics of Immediate Gratification. Proceedings of the ACM Conference on Electronic Commerce, (2004), 21–29.
6. Besmer, A., Watson, J., and Lipford, H.R. The impact of social navigation on privacy policy configuration. Proceedings of SOUPS, (2010).
7. Bustos, L. Best Practice Gone Bad: 4 Shocking A/B Tests. GetElastic, 2012. http://www.getelastic.com/best-practice-gone-bad-4-shocking-ab-tests/.
8. Compañó, R. and Lusoli, W. The Policy Maker's Anguish: Regulating Personal Data Behavior Between Paradoxes and Dilemmas. In T. Moore, D. Pym and C. Ioannidis, eds., Economics of Information Security and Privacy. Springer US, New York, NY, 2010, 169–185.
9. Connolly, T. and Zeelenberg, M. Regret in decision making. Current Directions in Psychological Science 11, 6 (2002), 212–216.
10. Culnan, M.J. "How Did They Get My Name?": An Exploratory Investigation of Consumer Attitudes toward Secondary Information Use. MIS Quarterly 17, 3 (1993), 341–363.
11. Eppler, M.J. and Mengis, J. The Concept of Information Overload: A Review of Literature from Organization Science, Accounting, Marketing, MIS, and Related Disciplines. The Information Society 20, 5 (2004), 325–344.
12. Iyengar, S.S. and Lepper, M.R. When choice is demotivating: Can one desire too much of a good thing? Journal of Personality and Social Psychology 79, 6 (2000), 995–1006.
13. John, L.K., Acquisti, A., and Loewenstein, G. Strangers on a Plane: Context-Dependent Willingness to Divulge Sensitive Information. Journal of Consumer Research 37, 5 (2011), 858–873.
14. Johnson, E.J., Bellman, S., and Lohse, G.L. Defaults, Framing and Privacy: Why Opting In ≠ Opting Out. Marketing Letters 13, 1 (2002), 5–15.
15. Kahneman, D. and Tversky, A. Prospect Theory: An Analysis of Decision under Risk. Econometrica 47, 2 (1979), 263–292.
16. Knijnenburg, B.P., Kobsa, A., and Jin, H. Dimensionality of information disclosure behavior. International Journal of Human-Computer Studies, (2013).
17. Knijnenburg, B.P., Kobsa, A., and Jin, H. Preference-based location sharing: are more privacy options really better? Proceedings of CHI, (2013), 2667–2676.
18. Knijnenburg, B.P., Kobsa, A., and Saldamli, G. Privacy in Mobile Personalized Systems: The Effect of Disclosure Justifications. Proceedings of U-PriSM, (2012).
19. Knijnenburg, B.P. and Kobsa, A. Making Decisions about Privacy: Information Disclosure in Context-Aware Recommender Systems. ACM Transactions on Interactive Intelligent Systems 3, 3 (2013).
20. Knijnenburg, B.P. and Kobsa, A. Counteracting the negative effect of form auto-completion on the privacy calculus. Proceedings of ICIS, (2013).
21. Knijnenburg, B.P. and Kobsa, A. Helping users with information disclosure decisions: potential for adaptation. Proceedings of IUI, (2013), 407–416.
22. Kolter, J. and Pernul, G. Generating User-Understandable Privacy Preferences. 2009 International Conference on Availability, Reliability and Security, IEEE Computer Society (2009), 299–306.
23. Laufer, R.S. and Wolfe, M. Privacy as a Concept and a Social Issue: A Multidimensional Developmental Theory. Journal of Social Issues 33, 3 (1977), 22–42.
24. Li, H., Sarathy, R., and Xu, H. Understanding situational online information disclosure as a privacy calculus. Journal of Computer Information Systems 51, 1 (2010), 62–71.
25. Liu, Y., Gummadi, K.P., Krishnamurthy, B., and Mislove, A. Analyzing facebook privacy settings: user expectations vs. reality. Proceedings of SIGCOMM, ACM (2011), 61–70.
26. Phelps, J., Nowak, G., and Ferrell, E. Privacy Concerns and Consumer Willingness to Provide Personal Information. Journal of Public Policy & Marketing 19, 1 (2000), 27–41.
27. Scheibehenne, B., Greifeneder, R., and Todd, P.M. Can There Ever Be Too Many Options? A Meta-Analytic Review of Choice Overload. Journal of Consumer Research 37, 3 (2010), 409–425.
28. Simonson, I. Choice Based on Reasons: The Case of Attraction and Compromise Effects. Journal of Consumer Research 16, 2 (1989), 158–174.
29. Thaler, R.H. and Sunstein, C. Nudge: Improving Decisions about Health, Wealth, and Happiness. Yale University Press, New Haven, CT & London, U.K., 2008.
30. Tversky, A. Elimination by aspects: A theory of choice. Psychological Review 79, 4 (1972), 281–299.