Nudging Users Towards Privacy on Mobile Devices. CEUR-WS Vol-722, paper 6: https://ceur-ws.org/Vol-722/paper6.pdf
          Nudging Users Towards Privacy on Mobile Devices
         Rebecca Balebako, Pedro G. Leon, Hazim Almuhimedi, Patrick Gage Kelley, Jonathan
                Mugan, Alessandro Acquisti, Lorrie Faith Cranor and Norman Sadeh
                                     Carnegie Mellon University
                                          5000 Forbes Ave.
                                     Pittsburgh, PA 15213 USA


ABSTRACT
By allowing individuals to be permanently connected to the Internet, mobile devices ease the way information can be accessed and shared online, but also raise novel privacy challenges for end users. Recent behavioral research on "soft" or "asymmetric" paternalism has begun exploring ways of helping people make better decisions in different aspects of their lives. We apply that research to privacy decision making, investigating how soft paternalistic solutions (also known as nudges) may be used to counter cognitive biases and ameliorate privacy-sensitive behavior. We present the theoretical background of our research, and highlight current industry solutions and research endeavors that could be classified as nudging interventions. We then describe our ongoing work on embedding soft paternalistic mechanisms in location sharing technologies and Twitter privacy agents.

Author Keywords
Nudge, Privacy, Security, Location Sharing, Mobile Devices, Soft Paternalism

ACM Classification Keywords
H.1.2 User/machine systems: Human information processing; J.4 Social and behavioral sciences: Psychology

Copyright © 2011 for the individual papers by the papers' authors. Copying permitted only for private and academic purposes. This volume is published and copyrighted by the editors of PINC2011.

INTRODUCTION
As mobile devices and applications become pervasive, privacy risks to their users also grow. The accessibility and ease of use of these devices make it easy to casually broadcast personal information at any time, from anywhere, to friends and strangers. Without a doubt, users benefit from and enjoy such streams of information sharing. However, they also expose themselves to tangible and intangible risks: from tracking by commercial entities interested in exploiting personal information for profit, to surveillance or even stalking by malicious parties. Moreover, it is difficult for individuals to determine the optimal balance between revealing and hiding personal data. Sometimes we are not even aware that information about us is being broadcast, shared, or monitored; other times, while aware of ongoing information flows, we do not understand their consequences, or properly assess their risks. Such challenges are magnified in mobile scenarios. As a result, a mobile device user may end up sharing information in a manner that goes against her own long-term self-interest.

In recent years, there has been growing interest in using lessons from behavioral economics to influence and ameliorate decision making in situations where cognitive and behavioral biases may adversely affect the individual [11, 16]. This approach is often referred to as soft or asymmetric paternalism, or by the more popular term "nudges." Soft paternalism aims at countering and overcoming those biases, so as to assist individual decision making. Our research aims at applying and extending lessons from the nascent field of soft paternalism to privacy decision making. This paper presents an overview of our research agenda in this area. First, we introduce the research exploring cognitive and behavioral biases in privacy decision making. Then, we examine current academic studies and industry products that focus on influencing privacy (and security) decision making, and that therefore may be compared to nudging interventions. Finally, we discuss how we are integrating soft paternalistic mechanisms into our research on privacy in location sharing applications and social networks.

FROM HURDLES IN PRIVACY DECISION MAKING TO SOFT PATERNALISM
Findings from behavioral economics and behavioral decision research have highlighted hurdles in human decision making that sometimes lead to undesirable outcomes. These hurdles are often due to lack of information or insight, cognitive limitations and biases, or lack of self-control [16]. Because of those hurdles, individuals may end up making decisions that they later regret. Those decisions may include (not) saving for retirement, (not) eating well, or smoking cigarettes [11]. They may also include decisions about protecting too much, or not enough, personal information [3]. Privacy decisions are complex and often taken under conditions of information asymmetry (that is, individuals may not have full knowledge of how much of their personal information is being gathered, and how it is being used). Furthermore, privacy decision making may be overwhelming: the cognitive costs associated with considering all the ramifications of a disclosure may hamper decision making [3]. Finally, cognitive biases may affect one's propensity to reveal personal information: for instance, heightened control over one's personal information may, paradoxically, make the user overconfident about sharing information [5].

Paternalistic policies try to solve decision-making hurdles by mandating decisions for individuals. Such policies are often heavy-handed and generate externalities [11]. Soft paternalism, on the other hand, avoids coercion; it seeks to steer users in a direction (believed to be more desirable based on the user's own prior judgement, or on external empirical validation) without impinging on their autonomy. A soft paternalistic solution, for instance, would consist of making an individual aware of the biases, lack of information, or cognitive overload that may affect her decision.

Nudges are tools of soft paternalism, and may be used to ameliorate privacy (as well as security) decision making [2]. Their application to scenarios involving mobile devices is particularly appealing. In the case of insecure communication channels, or covert data collection through a mobile device, a nudge may take the form of an alert that informs the user of the risk. In the case of mobile devices that store sensitive information (which could be accessed by strangers if the phone were misplaced), a nudge might discourage users from storing private data on mobile phones. When information is being disclosed through a smartphone, nudges may provide alerts about the recipients, contexts, or type of data being shared.

Many different types of nudging interventions are possible. Some simply consist of informing the user, in which case they relate to privacy research on informed consent. Some focus on making systems simpler to use, in which case privacy nudges fall into the realm of research on privacy usability. Other nudges, however, aim at countering specific cognitive and behavioral biases, such as neutralizing the detrimental effects of immediate-gratification biases in privacy decision making [1] by altering the individual's perception of the sequence of costs and benefits associated with revealing sensitive information.

The literature on soft paternalism applied to privacy decision making is in its infancy, and therefore extremely scarce. However, a number of recent studies and products focus on mechanisms that may be categorized as nudges. We present a brief overview of them in the following sections.

PRIVACY NUDGES IN THE LITERATURE
Previous research on the drivers of privacy concerns has demonstrated that users' attitudes towards security and privacy are influenced by numerous factors, including available information, personal beliefs, economic valuations, moral reasoning, social values, cognitive biases, and so on. Therefore, providing adequate information, making privacy tools more evident, or rewarding and punishing users as they make safer or riskier decisions are all ways of nudging or influencing privacy behavior. The privacy literature offers some examples of these approaches.

For example, recent experimental research has shown that users are interested in protecting their privacy, and may even pay for it, if appropriate tools and salient, simple, and compact privacy information are offered. Specifically, one series of studies explored the impact of making information about web sites' privacy practices more accessible to buyers. The results showed that online customers are more likely to shop from websites that exhibit more protective privacy policies, and that those customers are willing to pay a premium for privacy. Furthermore, privacy indicators displayed at the moment an individual is shopping online may affect consumer decisions: in particular, they increase the willingness to pay for privacy. However, if the indicator is provided only after the shopper has already chosen the website from which to buy, the user will not change her already-made decision. The authors find that timing is essential when trying to help people protect their privacy [17, 6]. Similarly, another study found that merely priming Facebook users with questions about their online disclosure behavior and the visibility of their Facebook profiles was sufficient to trigger changes in their disclosure behavior [13]. Application interface design is also important, and should help users notice when changes in context generate changes in information flows, and then help them maintain their privacy [7].

In the context of location sharing applications, providing feedback to users whose location has been requested by others has been shown to have both positive and negative implications [8]. It can prevent excessive requests and hence protect people's privacy. However, unless appropriate notifications are used, feedback recipients may also be annoyed. In addition, notifications may inhibit users from requesting others' locations and hence affect system usage.

PRIVACY NUDGES IN INDUSTRY
Examples of industry products or solutions that influence decision making in regards to privacy (either to better protect the user, or instead to influence her to reveal more information) take various forms, and some have been applied to mobile devices. Some of these solutions may be interpreted as soft paternalistic for privacy protection, in the sense that they nudge towards privacy. They include privacy/security usability solutions, simplifications of privacy settings, or tests and delays before one can post information. More frequent, however, are products and solutions that nudge individuals to give up even more of their privacy, surrendering sensitive information. These include open privacy defaults, lack of usability in privacy settings interfaces, poorly designed warnings, and other rewards for sharing data or for encouraging friends to share data.

Connections in social applications
Some applications provide information about who can see your data, who has seen your data, or how many people can see your data. For instance, Flickr.com, a video and image sharing website, labels each user-owned picture with who can see it, followed by a link to edit the privacy settings for that picture. This may be a nudge towards privacy, as users may decide to share certain photos with friends, and share other photos with everyone.
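A per-item visibility indicator of this kind can be sketched in a few lines. The following is purely illustrative code of our own; the audience levels, counts, and message wording are hypothetical and do not reflect Flickr's actual behavior or API. The point is simply that turning an abstract setting ("friends") into a concrete audience size at the moment of sharing is what may make the nudge salient.

```python
# Hypothetical sketch of a "who can see this" nudge for a shared item.
# Audience levels and wording are our own illustration, not any real site's.

AUDIENCE_SIZES = {
    "private": lambda n_friends, n_site_users: 1,              # owner only
    "friends": lambda n_friends, n_site_users: 1 + n_friends,  # owner + friends
    "public":  lambda n_friends, n_site_users: n_site_users,   # everyone on the site
}

def visibility_nudge(audience: str, n_friends: int, n_site_users: int) -> str:
    """Return a short message telling the user how many people can see an item."""
    if audience not in AUDIENCE_SIZES:
        raise ValueError(f"unknown audience level: {audience!r}")
    n = AUDIENCE_SIZES[audience](n_friends, n_site_users)
    return f"This photo is visible to {n} people ({audience}). Edit settings?"
```

Shown next to the item itself, such a message makes the audience concrete before the user shares, which is consistent with the finding above that the timing of privacy indicators matters [17, 6].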
Social networking sites often show the number of connections a user has. These connections may be called followers, friends, or ties. In some cases, connections can have access to all the user's information that is on the application. Twitter and Google Buzz are examples of sites that prominently show the number of connections. In the case of LinkedIn.com, a job-searching social network, the user may prefer to add additional connections, even with people they don't know well, in order to grow their job-searching network. However, by opening their information to more connections, they may be compromising their privacy. These applications may nudge users towards increasing their connections and revealing more information. Indeed, several online social networks such as Facebook.com and LinkedIn.com periodically encourage users to add new connections by searching the user's email accounts for contacts.

Connections such as friends on Facebook and followers on Twitter do not set the boundaries for information flow. One's connections may be able to share information with other, unintended recipients, or even make it available to the public. On Twitter, for example, re-tweets allow connections to pass on information without the original sender's control. On Facebook, default privacy settings usually allow sharing of an individual's information with friends of friends. Therefore, the information provided about the number of connections may mislead the user about the privacy of their data and decrease the likelihood that the user will take an information-protective stance.

Privacy Settings
The privacy settings allowed in an application impact the user's ability to control how their information is shared. Both the default settings and the usability of the settings user interface create nudges towards and away from privacy [10, 12, 13].

Some websites make privacy options very simple. For example, Pandora.com, an online music station, explicitly gives users two options regarding their profile page: make private or keep public. These clear options allow a user to choose without understanding complex details or settings. Conversely, the lack of granularity may encourage users to make everything public.

Several tools provide simple ratings of privacy settings. PrivacyCheck1 and ProfileWatch2 give Facebook settings a privacy score. Other services provide a user-friendly layer on top of the Facebook privacy settings, allowing the user to change the settings. For example, Privacy Defender3 provides a sliding color scale that allows the user to set their Facebook options as more or less private. These software services actively encourage stricter privacy settings.

1 http://rabidgremlin.com/fbprivacy
2 http://atherionsecurity.com/idpro.html
3 http://privacydefender.net

Reduction of Information Disclosure
If an individual expects she may be likely to post information she may later regret, software exists to discourage her from doing so. Sophisticated users may choose to employ software tools to prevent excess disclosure. For example, the Social Media Sobriety Test (socialmediasobrietytest.com) and Mail Goggles for Gmail (googlelabs.com) both allow the user to set certain hours of the week when they may typically embarrass themselves, such as weekend evenings after trips to the bar. During these hours, social network sites or Gmail may be blocked until the user completes a dexterity or cognitive test. The user has the option to bypass the test. Alternatively, a user may set up a warning system for messages that are likely to be poorly interpreted. ToneCheck (tonecheck.com) scans emails written in Outlook to discover whether the tone is off-putting, and will ask the user to confirm before sending. This may help discourage users from sending or posting regrettable information.

Other tools may discourage users from posting information by reminding the user who can see it. NetNanny (netnanny.com) is a tool that parents can use to protect their children online. It shows a message every time a child posts on a social network, reminding the child that her parents will see the post as well.

ONGOING WORK WITH MOBILE APPLICATIONS
By studying and understanding the specific biases and user actions in regards to mobile applications, we hope to suggest and test nudges that will help users make decisions that improve their satisfaction and well-being. We are moving towards that goal by first understanding users' needs, preferences, biases, and limitations regarding privacy, and second by using that information to evaluate the efficacy of techniques that exploit biases to improve decision making. As an example, we are currently pursuing foundational studies with two applications developed at Carnegie Mellon: a location sharing application called Locaccino [15] and a privacy agent for Twitter.

Locaccino is a unique location sharing application that allows users to control the conditions under which they make their location visible to others. This includes controlling the times and days of the week when different groups of people can see the user's location, as well as the specific locations where the user is willing to be visible. For instance, a user can specify rules such as "I'm willing to let my colleagues see my location, but only when I am on company premises and only 9am-5pm on weekdays." Research conducted by our group has shown that this level of expressiveness is critical to capturing the location sharing preferences many people have when it comes to disclosing their locations to others across a broad range of scenarios [4], in contrast to the much narrower set of scenarios supported by location sharing applications such as Foursquare.

As part of our ongoing research, we are interested in better understanding how different elements of Locaccino functionality effectively nudge people in different directions. This includes experimenting with new interface designs as well as new ways of leveraging some of the machine learning techniques we have been developing, from exposing different sets of default privacy personas to users [14] to helping them refine their privacy preferences [9]. We are looking at the preferences of like-minded users who have been using the system for a while, and trying to use their preferences to guide new users. This has the potential to reduce regret by giving new users the benefit of the experience acquired over time by others. We plan to explore to what extent such an approach can be made to work, and to what extent it proves beneficial.

The Twitter privacy agent is an application we are building to help Twitter users behave in a more privacy-protective way. We plan to build tools that provide nudges that guide users to restrict their tweets to smaller groups of followers, or discourage them from sending tweets from mobile devices that they may later regret. We plan to empirically test the impact of these nudges on user behavior. We will also examine whether fine-grained privacy controls result in more or less data sharing.

We expect our work on nudges in behavioral advertising, social networks, and location sharing to be effective for improving privacy decisions on mobile devices. We further hope our soft-paternalistic approach will have a broader impact, guiding the development of tools and methods that assist users in privacy and security decision making.

ACKNOWLEDGMENTS
This material is based upon work supported by the National Science Foundation under Grant CNS-1012763 (Nudging Users Towards Privacy), and by Google under a Focused Research Award on Privacy Nudges. This work has also been supported by NSF grants CNS-0627513 and CNS-0905562, and by CyLab at Carnegie Mellon under grants DAAD19-02-1-0389 and W911NF-09-1-0273 from the Army Research Office. Additional support has been provided by the IWT SBO project on Security and Privacy in Online Social Networks (SPION), Nokia, France Telecom, and the CMU/Portugal Information and Communication Technologies Institute.

REFERENCES
 1. A. Acquisti. Privacy in electronic commerce and the economics of immediate gratification. In Proceedings of the ACM Conference on Electronic Commerce (EC '04), pages 21–29, 2004.
 2. A. Acquisti. Nudging privacy: The behavioral economics of personal information. IEEE Security & Privacy, 7(6):82–85, 2009.
 3. A. Acquisti and J. Grossklags. Privacy and rationality in individual decision making. IEEE Security & Privacy, 3(1):26–33, 2005.
 4. M. Benisch, P. Kelley, N. Sadeh, and L. Cranor. Capturing location-privacy preferences: Quantifying accuracy and user-burden tradeoffs. Personal and Ubiquitous Computing, 2011.
 5. L. Brandimarte, A. Acquisti, and G. Loewenstein. Misplaced confidences: Privacy and the control paradox. Technical report, Carnegie Mellon University, 2010.
 6. S. Egelman, J. Tsai, L. F. Cranor, and A. Acquisti. Timing is everything?: The effects of timing and placement of online privacy indicators. In Proceedings of the 27th International Conference on Human Factors in Computing Systems, CHI '09, pages 319–328, New York, NY, USA, 2009. ACM.
 7. G. Hull, H. R. Lipford, and C. Latulipe. Contextual gaps: Privacy issues on Facebook. Ethics and Information Technology, pages 1–14, 2010.
 8. L. Jedrzejczyk, B. A. Price, A. K. Bandara, and B. Nuseibeh. On the impact of real-time feedback on users' behaviour in mobile location-sharing applications. In Proceedings of the Sixth Symposium on Usable Privacy and Security, SOUPS '10, pages 14:1–12, New York, NY, USA, 2010. ACM.
 9. P. Kelley, P. Hankes Drielsma, N. Sadeh, and L. Cranor. User-controllable learning of security and privacy policies. In Proceedings of the 1st ACM Workshop on AISec, pages 11–18. ACM, 2008.
10. Y.-L. Lai and K. L. Hui. Internet opt-in and opt-out: Investigating the roles of frames, defaults and privacy concerns. In Proceedings of the 2006 ACM SIGMIS CPR Conference on Computer Personnel Research, pages 253–263, 2006.
11. G. F. Loewenstein and E. C. Haisley. The economist as therapist: Methodological ramifications of 'light' paternalism. In The Foundations of Positive and Normative Economics, chapter 9. Oxford University Press, 2008.
12. W. Mackay. Triggers and barriers to customizing software. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pages 153–160. ACM, 1991.
13. R. Gross and A. Acquisti. Information revelation and privacy in online social networks. In Workshop on Privacy in the Electronic Society (WPES), pages 71–80, 2005.
14. R. Ravichandran, M. Benisch, P. Kelley, and N. Sadeh. Capturing social networking privacy preferences: Can default policies help alleviate tradeoffs between expressiveness and user burden? In Proceedings of the 5th Symposium on Usable Privacy and Security. ACM, 2009.
15. N. Sadeh, J. Hong, L. Cranor, I. Fette, P. Kelley, M. Prabaker, and J. Rao. Understanding and capturing people's privacy policies in a mobile social networking application. Personal and Ubiquitous Computing, 13(6):401–412, 2009.
16. R. Thaler and C. Sunstein. Nudge: Improving Decisions About Health, Wealth, and Happiness. Yale University Press, 2008.
17. J. Y. Tsai, S. Egelman, L. Cranor, and A. Acquisti. The effect of online privacy information on purchasing behavior: An experimental study. Information Systems Research, in press, 2010.