=Paper=
{{Paper
|id=Vol-2684/2-paginated
|storemode=property
|title=On the Importance of Context: Privacy Perceptions of Personal vs. Health Data in Health Recommender Systems
|pdfUrl=https://ceur-ws.org/Vol-2684/1-paginated.pdf
|volume=Vol-2684
|authors=Laura Burbach,Poornima Belavadi,Patrick Halbach,Nils Plettenberg,Johannes Nakayama,Lilian Kojan,André Calero Valdez
|dblpUrl=https://dblp.org/rec/conf/recsys/BurbachBHPNKV20
}}
==On the Importance of Context: Privacy Perceptions of Personal vs. Health Data in Health Recommender Systems==
On the Importance of Context: Privacy Perceptions of General vs. Health-specific Data in Health Recommender Systems

Laura Burbach, Poornima Belavadi, Patrick Halbach, Nils Plettenberg, Johannes Nakayama, Lilian Kojan, André Calero Valdez
burbach@comm.rwth-aachen.de, belavadi@comm.rwth-aachen.de, halbach@comm.rwth-aachen.de, plettenberg@comm.rwth-aachen.de, nakayama@comm.rwth-aachen.de, kojan@comm.rwth-aachen.de, calero-valdez@comm.rwth-aachen.de
RWTH Aachen University, Aachen, Germany
ABSTRACT
Recommender systems are essential to reduce complexity on the web due to the plethora of available content. However, depending on design choices they require a lot of (potentially personal) data to work, raising the issue of privacy and acceptance of such systems. This is particularly true when they are used in sensitive matters such as health. We addressed these issues in a survey of 163 participants in which we presented three different health-related contexts where recommender systems can be used: 1) desire for better nutrition and more exercise, 2) information about causes and treatment of headaches and nausea, and 3) information about side effects of a medication prescribed by a doctor. We found that participants are generally more willing to disclose their general data than their specifically health-related data. The more health-critical the context of use was, the more willing they were to disclose health-related data.

CCS CONCEPTS
• Human-centered computing → Human computer interaction (HCI); Collaborative and social computing; Empirical studies in collaborative and social computing; • Security and privacy → Social aspects of security and privacy.

KEYWORDS
Health Recommender Systems; Privacy; User Perceptions; Trust; Acceptance; Application contexts

ACM Reference Format:
Laura Burbach, Poornima Belavadi, Patrick Halbach, Nils Plettenberg, Johannes Nakayama, Lilian Kojan, and André Calero Valdez. 2020. On the Importance of Context: Privacy Perceptions of General vs. Health-specific Data in Health Recommender Systems. In Proceedings of the 5th International Workshop on Health Recommender Systems co-located with 14th ACM Conference on Recommender Systems (HealthRecSys '20), September 26, 2020, Online, Worldwide, 6 pages.

HealthRecSys '20, September 26, 2020, Online, Worldwide
© 2020 Copyright for the individual papers remains with the authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0). This volume is published and copyrighted by its editors.

1 INTRODUCTION
Many people use the Internet to seek health-related information before or after a doctor's appointment [1]. However, such information is often complex and contradictory, which makes it difficult for users to assess it along with its relevance to their personal situation [26]. Recommender systems tackle this issue by filtering information and offering personalized recommendations to users [6]. Such systems can also be used in the healthcare sector to recommend information, therapies, or side-effect-free medicine [9]. While recommender systems are already more established and accepted in many other areas of application, (potential) users of health recommender systems are even more concerned about privacy and security. User acceptance is hampered by technical aspects such as data ownership or privacy and security, as well as by user diversity aspects such as data and health literacy [8, 29].

2 RELATED WORK
Health recommender systems can improve the quality of preventive health care [24]. Nonetheless, when asked about their inclination to disclose data to these systems, users are often concerned about their privacy, and these concerns must be taken into account when considering acceptability [18].

Li et al. investigated the acceptance of wearables in the health sector and found that users conduct a risk-benefit analysis to decide whether to use wearables: If the perceived benefit outweighs the perceived risk, they are more likely to use them [15]. The phenomenon of users performing a risk-benefit analysis to decide which of their personal information they want to disclose is called Privacy Calculus [3, 14]. For this risk-benefit analysis, it has been shown that patients who use computers more frequently [21], use the Internet more often, or have a higher level of education are more willing to disclose data to obtain a benefit.

Caine and Hanania have investigated which types of health data users voluntarily disclose [7]. They found that users are less willing to disclose more sensitive health data such as information on their mental and sexual health. In contrast, Frost et al.'s analysis of online cancer communities found that patients affected by poorer health were more willing to disclose their private data [12].
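The privacy calculus described above is, at its core, a weighing of expected benefit against expected risk. As a minimal illustration of this decision rule (a toy sketch; the class name, attributes, and values are hypothetical and not taken from the cited studies), it could be expressed as:

```python
from dataclasses import dataclass


@dataclass
class PrivacyCalculus:
    """Toy model of the privacy-calculus risk-benefit weighing.

    Both attributes are hypothetical survey-style ratings on a 1-5 scale.
    """
    perceived_benefit: float  # e.g., usefulness of the recommendations
    perceived_risk: float     # e.g., expected harm from disclosure

    def discloses(self) -> bool:
        # Disclosure becomes likely when the perceived benefit
        # outweighs the perceived risk.
        return self.perceived_benefit > self.perceived_risk


print(PrivacyCalculus(perceived_benefit=4.2, perceived_risk=3.1).discloses())  # True
print(PrivacyCalculus(perceived_benefit=2.0, perceived_risk=4.5).discloses())  # False
```

In practice, of course, benefit and risk are latent constructs measured with multi-item scales (as in Section 3) rather than single numbers.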
In addition to the sensitivity of the data itself, it has been shown that other experiences on the Internet affect the willingness to disclose data as well. Awad and Krishnan found that a previous invasion of privacy decreased the respondents' willingness to be profiled for personalized advertising [3]. Similarly, Frost et al. found that patients who previously had bad experiences on the Internet were less willing to disclose their data [12].

When considering user preferences, technology acceptance models are also relevant. Research on technology acceptance has shown an influence of user factors such as gender, age, and technology self-efficacy on the willingness to use a technology [27]. Further, when asked to provide personal data to an Internet service provider [23], users differ in their perceptions of trust [17] and privacy concerns [16, 20, 32].

Some studies have shown that a majority of respondents (patients and doctors) gave positive ratings to the use of computers for patient health. For them, the advantages outweigh the disadvantages in terms of confidentiality [15, 21].

Nevertheless, the decision to use a health recommendation system remains a balance between benefit and concern. Different usage contexts may provide different benefits and result in different concerns. Much of the previous research has looked at specific illness-related contexts (e.g., smoking cessation, weight loss, sports) or specific privacy concerns in isolation.

Our Contribution. The objective of this study is to consider the privacy concerns (potential) users have when using recommendation systems in different health application contexts, and the extent to which they are willing to disclose different general and health data. In this study, we identify what general and health data the participants consider to be sensitive and whether there are differences in the willingness of participants to disclose more sensitive data. We also consider whether different user factors influence the willingness to disclose the aforementioned data.

3 METHODS
To find out whether the application context of health recommendation systems influences the users' willingness to disclose their data, we conducted an online survey in German. Participants were acquired using convenience sampling between July and August 2018 and March and April 2019. The survey was distributed via the social network Facebook using snowball sampling.

The survey consisted of three parts: First, we asked the participants for demographic factors (age and gender), perceived health, and smoking habits. Next, we measured technology self-efficacy, health concerns, privacy concerns, institution-based (dis)trust, and disposition to trust. Lastly, we assessed the participants' willingness to disclose personal and health data for three different application contexts.

Technology Self-Efficacy (TSE). We used eight items of Beier's scale for measuring technology self-efficacy (TSE) [4], extended by two additional items to account for the shift in answering tendency. Internal reliability was good according to DeVellis [11] (Cronbach's α = .82).

Health concerns. To assess participants' general health concerns, we asked them four questions about whether they were worried about their general health status, that they might develop a chronic disease, that they might fall ill with a serious illness, or that they might get infected when sick people are in their environment (Cronbach's α = .807).

Privacy concerns. Perceived privacy while using Internet services was assessed with seven items from Xu et al., Li et al., and Morton et al. [16, 20, 32]. The items measure generalized fear that general data stored online could be "insecure" and concerns about misuse of personal data (Cronbach's α = .777).

Trust. To assess institution-based trust, we used six items from McKnight et al. [17]. Through principal component analysis we discovered that the scale breaks down into two dimensions. The first dimension depicts users' trust in online services concerning the handling of their data (personal data) (Cronbach's α = .617). The second dimension assesses how much users trust the technical infrastructure to ensure privacy on the Internet (technical) (Cronbach's α = .862). In addition, we measured the general disposition to trust using six items by McKnight [17] (Cronbach's α = .732).

Application contexts. In the last part of the survey we presented three different application contexts of recommendation algorithms in health settings to the participants. For twelve different types of data, such as date of birth or medication currently being taken, we asked whether the participants would disclose these in each application context.

First, the participants should imagine that they committed to a healthier lifestyle (context healthy life). We explained that the health recommendation system is a mobile app that provides nutritional recommendations and encourages users to be more active.

For the second application context (complaints), the participants should imagine that they suffer from headaches and nausea and therefore use an app to find out about the causes and treatment options. They were told that the more data they entered, the more reliable the diagnosis would be.

In the last application context (drugs), the participants should imagine that a doctor prescribes a medication for them and they would like to check with an app which side effects can occur. They were told at this point that the more data the app receives, the more reliably it can assess the risks.

For all three contexts, we performed a factor analysis with the 12 different data items, resulting in two scales: general data (date of birth, gender, height, weight) and health data (preexisting conditions, chronic illnesses, illnesses of family members, allergies, current medication, information about diet, alcohol consumption, smoking behavior). We then tested the reliability of the two scales for each context individually, as shown in Table 1.

Table 1: Scales, items, and reliability as Cronbach's α.

Context        Scale          Items   α
healthy life   general data   4       .89
healthy life   health data    8       .95
complaints     general data   4       .91
complaints     health data    8       .96
drugs          general data   4       .94
drugs          health data    8       .96
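The Cronbach's α values reported for these scales (and in Table 1) can be computed directly from an item-score matrix. A minimal sketch in Python/NumPy with simulated Likert-style data (the study's real item data are in the OSF repository; the simulated variables here are purely illustrative):

```python
import numpy as np


def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)      # per-item sample variance
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of the sum score
    return k / (k - 1) * (1 - item_variances.sum() / total_variance)


# Simulated 5-point items driven by one latent trait -> internally consistent scale.
rng = np.random.default_rng(42)
trait = rng.normal(size=(200, 1))
items = np.clip(np.round(3 + trait + 0.5 * rng.normal(size=(200, 4))), 1, 5)
print(f"alpha = {cronbach_alpha(items):.2f}")
```

Items that all track the same latent trait, as in this toy example, yield a high α; unrelated items push α toward zero, which is why the two-dimensional trust scale above was split before computing reliability.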
3.1 Hypotheses
Following the results of the study of Caine and Hanania (see Section 2), we assume for all contexts that the participants are less willing to disclose health data, which should be more sensitive to them than general data (H1). We also assume that, in line with the risk-benefit analysis, participants distinguish between the three application contexts and are most willing to disclose their data for the context drugs, as this is where they could see the strongest benefit: preventing potentially dangerous side effects (H2).

We further assume that negative experiences with the Internet and thus higher privacy concerns (H3) and lower institution-based trust (H4) inhibit the willingness to disclose data, while the disposition to trust boosts it (H5). Lastly, we assume that higher age (H6), lower technology self-efficacy (H7), and being female (H8) correlate with a lower willingness to disclose data.

3.2 Statistical Procedures
To analyze our descriptive results we used means, standard deviations, and 95% within-subject confidence intervals [19]. We ensured sampling adequacy by using the Kaiser-Meyer-Olkin criterion. With Bartlett's χ² test we tested the sphericity of our data. We further looked at associations between variables using Pearson correlations. We report the correlation coefficient r and an asymmetric 95% confidence interval that is generated by population bootstrapping [10]. Finally, we used repeated-measures MANOVA to analyze differences between the contexts.

All study materials, data, and analysis code are available online at the Open Science Foundation (https://osf.io/5f6jy/).

4 RESULTS
We analyzed the data using R version 3.6.2 and several packages [2, 22, 25, 28, 30, 31]. Analyses were run on macOS Mojave 10.14.6 (system x86_64, darwin 15.6.0). Our data showed good sampling adequacy according to the Kaiser-Meyer-Olkin criterion (MSA > 0.8 for all items) and sufficient sphericity according to Bartlett's χ² test (χ²(630) = 7008.197, p < .001). Next, we will describe our sample and then present the findings of our analyses.

4.1 Description of the sample
Of the 163 participants, 108 (66%) were female and 55 (34%) were male. The participants were on average M = 28.8 years old (SD = 11.1). Most participants in our sample did not smoke (145, 89%). Men and women were about the same age on average (t(161) = −0.695, p = .488). The participants showed a rather low technology self-efficacy (M = 3.20, SD = 0.80) and rather low health concerns (M = 3.14, SD = 1.18). They showed an even lower institution-based trust technical (M = 2.74, SD = 1.06) and, matching this, rather high privacy concerns (M = 4.21, SD = 0.06) and a rather high institution-based distrust personal data (M = 4.41, SD = 1.08). Interestingly, they showed a rather high disposition to trust (M = 3.89, SD = 0.70).

Correlations of independent variables. To get a more accurate impression of our sample, we can look at the Pearson correlations of our independent variables (see Table 2). Older people have a lower general disposition to trust as well as a higher institution-based distrust personal data. Participants with a higher computer self-efficacy also have higher privacy concerns. Higher computer self-efficacy and higher privacy concerns also correlate positively with a higher institution-based distrust personal data. Interestingly, participants with higher privacy concerns also have more institution-based trust technical. Participants with a higher institution-based trust technical tend to have a lower disposition to trust.

Application contexts. As described in Section 3, we presented three application contexts of health recommendation systems to the participants and asked whether the participants would disclose their personal and health data. Figure 1 shows that the participants indicated for each context a higher willingness to share their general data than their health data. The largest difference occurs for a healthy life.

[Figure 1: Relative comparison of the willingness to disclose different types of data in our three contexts (means for general vs. health data in the contexts healthy life, complaints, and drugs). Error bars denote 95% within-subject confidence intervals. The dotted line is the threshold of neutrality.]

Comparing the three contexts, we found that participants are least willing to disclose their general data for complaints and most willing to disclose their general data for a healthy life. In contrast, they are least willing to disclose their health data for a healthy life and most willing to disclose their health data to find side effects of drugs. The more sensitive the use context (most to least sensitive: drugs, complaints, healthy life), the more willing they are to disclose health data.

A repeated-measures MANOVA with the three contexts and the general data showed a significant overall effect of the contexts (Wilks' Λ = .754, F(2, 143) = 23.33, p < .001) with a large effect (partial η² = .246). Gender is not related to the willingness to disclose general data (χ̃² = 31.68–34.66, p > .05). We also found a significant overall effect of the contexts for health data (Wilks' Λ = .807, F(2, 143) = 17.08, p < .001) with a large effect (partial η² = .193). The χ̃² test showed a small effect of gender on the drugs context (χ̃²(16) = 26.40–18.00, p = .049): females are more willing to disclose their health data (M = 3.92, SD = 1.01) than males (M = 3.46, SD = .20). Gender did not relate to the other contexts (χ̃² = 11.27–18.00, p > .05).
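The asymmetric bootstrap confidence intervals described in Section 3.2 resample respondents with replacement and take percentiles of the resulting distribution of r. The paper's actual analysis was done in R (see the OSF repository); the following Python sketch with simulated data (the variable names and the simulated effect are illustrative, not the study's data) shows the idea:

```python
import numpy as np


def pearson_bootstrap_ci(x, y, n_boot=2000, level=0.95, seed=0):
    """Pearson r with an asymmetric percentile-bootstrap confidence
    interval, obtained by resampling respondents with replacement."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = len(x)
    rng = np.random.default_rng(seed)
    r = float(np.corrcoef(x, y)[0, 1])
    boot = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)  # one resampled set of respondents
        boot[b] = np.corrcoef(x[idx], y[idx])[0, 1]
    lo, hi = np.quantile(boot, [(1 - level) / 2, (1 + level) / 2])
    return r, (float(lo), float(hi))


# Simulated survey-like data for 163 respondents.
rng = np.random.default_rng(1)
health_concern = rng.normal(size=163)
willingness = 0.2 * health_concern + rng.normal(size=163)
r, (lo, hi) = pearson_bootstrap_ci(health_concern, willingness)
print(f"r = {r:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```

Because r is bounded at ±1, the percentile interval is generally not symmetric around the point estimate, which is why the paper reports asymmetric intervals.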
Table 2: Correlation table of our independent variables

Variable                                       1   2   3   4       5       6       7
1. Age                                                             .212**          -.207**
2. Computer self-efficacy                                  .242**  .203*
3. Health concerns
4. Privacy concerns                                                .437**  .224**
5. Institution-based distrust personal data
6. Institution-based trust technical                                               -.231**
7. Disposition to trust

Note. * indicates p < .05. ** indicates p < .01.

So far, we have looked at the overall willingness to disclose both general and health data in the three contexts. In the following, we look at the willingness to disclose the twelve individual data types. As mentioned before, and as can be seen in Figure 2, the participants' willingness to disclose data is higher for general data than for health data. Among the general data, the participants are least willing to disclose their date of birth. This applies to all contexts, but for a healthy life the contrast between the participants' willingness to disclose their personal and health data is clearer. In particular, the participants are less willing to disclose hereditary diseases and medicine intake for a healthy life, whereas they are willing to disclose their medicine intake for the drugs context. Looking at the health data, the participants are for all contexts more willing to disclose eating habits, sleeping data, and activity data. In contrast, they distinguish between the contexts for pre-existing conditions, chronic diseases, hereditary diseases, allergies, and medicine intake. For the context drugs, they are most inclined to disclose most types of health data, followed by the context complaints, and they are least willing to disclose the data for a healthy life.

[Figure 2: Individual comparison of the willingness to disclose different types of data in our three contexts. Data types shown: date of birth, gender, height, weight (general data) and pre-existing conditions, chronic diseases, hereditary diseases, allergies, medicine intake, eating habits, sleeping data, activity data (health data). Error bars denote standard errors. The dotted line is the threshold of neutrality.]

Beyond the effect of the different application contexts and different types of data, we found that higher health concerns are associated with a higher willingness to disclose general data for all three contexts (healthy life: r = −.19, p = .018; complaints: r = .20, p = .018; drugs: r = .19, p = .019) and health data for the context healthy life (r = .18, p = .028). Further, disposition to trust causes the participants to be more willing to disclose their health data for a healthy life (r = .22, p = .007). We did not find an influence of age, computer self-efficacy, privacy concerns, and institution-based trust on the willingness to disclose any data (all p > .05). Looking at the different data types in the three contexts, participants that are more willing to disclose any data for any context are also more willing to disclose other data or data for other contexts (all rs > .47, p < .001).

5 DISCUSSION
In this study, we investigated the effects of three different application contexts for health recommendation systems and the effect of user diversity factors on the willingness to disclose personal and health data. We first state that participants differentiate between personal and health data and are more willing to disclose their general data (H1 ✓). Furthermore, the different contexts had a significant influence on the willingness to disclose. For health data, our results show that the more sensitive the application context is, the more willing the participants are to disclose their health data (H2 ✓). For general data, the participants prefer to disclose their data for a healthy life, whereas they are least willing to disclose data for complaints (H2 ✗).

From the investigated user factors, only health concerns and disposition to trust (H5 ✓) seem to influence the willingness to disclose data. At this point, the increased concern about health seems to increase the participants' willingness to disclose their data. People with better health status may expect fewer personal benefits from disclosing their data [13]. We did not see a strong effect of previous experience (H3 ✗ and H4 ✗), age (H6 ✗), technology self-efficacy (H7 ✗), or gender (H8 ✗).

Participants had to think of a fictitious situation, which can lead to reports revealing less or more data than they would actually reveal. Besides, it is conceivable that users of health recommendation systems would change their initial willingness after experiencing the benefits of the recommendation systems. Nevertheless, studies in technology acceptance showed that preferences are at least to some degree stable over time [27].

In reality, users often do not consciously decide whether they want to disclose their data but disclose their data unconsciously or inadvertently. Nevertheless, our study shows that different application contexts of health recommendation systems have an impact on what data users want to disclose.
In future research, we would like to take up this point and use experiments to examine how users actually perceive the recommendation systems in the respective context.

5.1 Does only the application context influence the acceptance of recommendation systems?
Of course, more aspects than the application context may influence the acceptance of recommendation systems. Burbach et al. [5], for example, investigated whether individuals accept five different recommendation algorithms (content-based recommendation, collaborative filtering, hybrid recommendation, social-, and trust-based recommendation) for three different product categories (books, mobiles, and contraceptives). Critically, not only the purpose of the recommendation but also the use of data inside different algorithms seems to play a role in the acceptance of recommendation systems. Here, algorithms that are able to create a more accurate picture of the users were less likely to be accepted [5].

6 CONCLUSION
In conclusion, many aspects determine how much individuals accept recommendation systems. The acceptance of different recommendation systems depends, among other things, on the application context of the recommendations, but also on the product type that is recommended. Our research has shown that users have a very distinct idea of what type of data should be used in what type of context, and they show decreased willingness if the data seems unnecessary for a health-related decision. Accordingly, there is no one-size-fits-all recommendation system; rather, the acceptance of a recommendation system is always determined by a combination of different contexts and users.

7 OUTLOOK
In the future, we will conduct additional studies on the user acceptance of recommendation systems. We will consider different aspects in one study. For example, it would be interesting to consider whether the application context or the recommended item has a greater influence on the acceptance of different recommendation systems and how these two aspects influence each other. A particularly suitable method for this would be a conjoint analysis, in which different aspects of a recommendation system could be weighed against each other.

REFERENCES
[1] H.K. Andreassen, M.M. Bujnowska-Fedak, C.E. Chronaki, R.C. Dumitru, I. Pudule, H. Santana, S. Vosands, and R. Wynn. 2007. European citizens' use of e-health services: a study of seven countries. BMC Public Health 7, 1 (2007), 53.
[2] F. Aust and M. Barth. 2020. papaja: Prepare reproducible APA journal articles with R Markdown. https://github.com/crsh/papaja. R package version 0.1.0.9997.
[3] Naveen Awad and M. Krishnan. 2006. The Personalization Privacy Paradox: An Empirical Evaluation of Information Transparency and the Willingness to be Profiled Online for Personalization. MIS Quarterly 30 (2006), 13–28. https://doi.org/10.2307/25148715
[4] Guido Beier. 1999. Kontrollüberzeugungen im Umgang mit Technik [Self Efficacy in the Use of Technology]. Report Psychologie 9 (1999), 684–693.
[5] Laura Burbach, Johannes Nakayama, Nils Plettenberg, Martina Ziefle, and André Calero Valdez. 2018. User preferences in recommendation algorithms: the influence of user diversity, trust, and product category on privacy perceptions in recommender algorithms. In Proceedings of the 2018 ACM Conference on Recommender Systems. ACM, Vancouver, Canada, 306–310. https://doi.org/10.1145/3240323.3240393
[6] Robin Burke. 2002. Hybrid recommender systems: Survey and experiments. User Modeling and User-Adapted Interaction 12, 4 (2002), 331–370.
[7] Kelly Caine and Rima Hanania. 2012. Patients want granular privacy control over health information in electronic medical records. Journal of the American Medical Informatics Association 20, 1 (2012), 7–15.
[8] André Calero Valdez and Martina Ziefle. 2018. The Users' Perspective on the Privacy-Utility Trade-offs in Health Recommender Systems. International Journal of Human-Computer Studies 121 (2018), 108–121. https://doi.org/10.1016/j.ijhcs.2018.04.003
[9] André Calero Valdez, Martina Ziefle, and Katrien Verbert. 2016. HCI for Recommender Systems: The Past, the Present and the Future. In Proceedings of the 10th ACM Conference on Recommender Systems (Boston, Massachusetts, USA) (RecSys '16). ACM, New York, NY, USA, 123–126. https://doi.org/10.1145/2959100.2959158
[10] Geoff Cumming. 2014. The new statistics: Why and how. Psychological Science 25, 1 (2014), 7–29.
[11] R.F. DeVellis. 2016. Scale Development: Theory and Applications. SAGE Publications, Singapore. https://books.google.de/books?id=48ACCwAAQBAJ
[12] Jeana Frost, Ivar Vermeulen, and Nienke Beekers. 2014. Anonymity Versus Privacy: Selective Information Sharing in Online Cancer Communities. Journal of Medical Internet Research 16, 5 (2014), e126. https://doi.org/10.2196/jmir.2684
[13] Nima Kordzadeh, John Warren, and Ali Seifi. 2016. Antecedents of privacy calculus components in virtual health communities. International Journal of Information Management 36, 5 (2016), 724–734.
[14] Robert S. Laufer and Maxine Wolfe. 1977. Privacy as a concept and a social issue: A multidimensional developmental theory. Journal of Social Issues 33, 3 (1977), 22–42.
[15] He Li, Jing Wu, Yiwen Gao, and Yao Shi. 2016. Examining individuals' adoption of healthcare wearable devices: An empirical study from privacy calculus perspective. International Journal of Medical Informatics 88 (2016), 8–17.
[16] Yuan Li. 2014. The impact of disposition to privacy, website reputation and website familiarity on information privacy concerns. Decision Support Systems 57, 1 (2014), 343–354. https://doi.org/10.1016/j.dss.2013.09.018
[17] D. Harrison McKnight, Vivek Choudhury, and Charles Kacmar. 2002. Developing and validating trust measures for e-commerce: An integrative typology. Information Systems Research 13, 3 (2002), 334–359.
[18] Marci Meingast, Tanya Roosta, and Shankar Sastry. 2006. Security and Privacy Issues with Health Care Information Technology. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society. 5453–5458. https://doi.org/10.1109/IEMBS.2006.260060
[19] Richard D. Morey. 2008. Confidence intervals from normalized data: A correction to Cousineau (2005). Tutorials in Quantitative Methods for Psychology 4, 2 (2008), 61–64.
[20] Anthony Morton. 2013. Measuring Inherent Privacy Concern and Desire for Privacy - A Pilot Survey Study of an Instrument to Measure Dispositional Privacy Concern. In 2013 International Conference on Social Computing (SocialCom), Vol. 1. IEEE Computer Society, Los Alamitos, CA, USA, 468–477. https://doi.org/10.1109/SocialCom.2013.73
[21] Gihan Perera, Anne Holbrook, Lehana Thabane, Gary Foster, and Donald J. Willison. 2011. Views on health information sharing and privacy from primary care practices using electronic medical records. International Journal of Medical Informatics 80, 2 (2011), 94–101.
[22] William Revelle. 2020. psych: Procedures for Psychological, Psychometric, and Personality Research. https://CRAN.R-project.org/package=psych. R package version 2.0.7.
[23] Anne Kathrin Schaar, André Calero Valdez, and Martina Ziefle. 2013. The impact of user diversity on the willingness to disclose personal information in social network services. A comparison of the private and business context. LNCS, Vol. 7946. Springer, Berlin, 174–193. https://doi.org/10.1007/978-3-642-39062-3_11
[24] Hanna Schäfer, Santiago Hors-Fraile, Raghav Pavan Karumur, André Calero Valdez, Alan Said, Helma Torkamaan, Tom Ulmer, and Christoph Trattner. 2017. Towards health (aware) recommender systems. In Proceedings of the 2017 International Conference on Digital Health. 157–161.
[25] Ravi Selker, Jonathon Love, and Damian Dropmann. 2020. jmv: The 'jamovi' Analyses. https://CRAN.R-project.org/package=jmv. R package version 1.2.23.
[26] K. Sommerhalder, A. Abraham, M.C. Zufferey, J. Barth, and T. Abel. 2009. Internet information and medical consultations: experiences from patients' and physicians' perspectives. Patient Education and Counseling 77, 2 (2009), 266–271.
[27] Viswanath Venkatesh, James Thong, and Xin Xu. 2012. Consumer Acceptance and Use of Information Technology: Extending the Unified Theory of Acceptance and Use of Technology. MIS Quarterly 36 (2012), 157–178. https://doi.org/10.2307/41410412
[28] Hadley Wickham, Mara Averick, Jennifer Bryan, Winston Chang, Lucy D'Agostino McGowan, Romain François, Garrett Grolemund, Alex Hayes, Lionel Henry, Jim Hester, Max Kuhn, Thomas Lin Pedersen, Evan Miller, Stephan Milton Bache, Kirill Müller, Jeroen Ooms, David Robinson, Dana Paige Seidel, Vitalie Spinu, Kohske Takahashi, Davis Vaughan, Claus Wilke, Kara Woo, and Hiroaki Yutani. 2019. Welcome to the tidyverse. Journal of Open Source Software 4, 43 (2019), 1686. https://doi.org/10.21105/joss.01686
[29] Martin Wiesner and Daniel Pfeifer. 2014. Health recommender systems: concepts, requirements, technical basics and challenges. International Journal of Environmental Research and Public Health 11, 3 (2014), 2580–2607.
[30] Claus O. Wilke. 2019. cowplot: Streamlined Plot Theme and Plot Annotations for 'ggplot2'. https://CRAN.R-project.org/package=cowplot. R package version 1.0.0.
[31] Aaron R. Wolen, Chris H.J. Hartgerink, Ryan Hafen, Brian G. Richards, Courtney K. Soderberg, and Timothy P. York. 2020. osfr: An R Interface to the Open Science Framework. Journal of Open Source Software 5, 46 (2020), 2071. https://doi.org/10.21105/joss.02071
[32] Heng Xu, Tamara Dinev, H. Smith, and Paul Hart. 2008. Examining the Formation of Individual's Privacy Concerns: Toward an Integrative View. In ICIS 2008 Proceedings - Twenty Ninth International Conference on Information Systems. Paper 6.