<?xml version="1.0" encoding="UTF-8"?>
<TEI xml:space="preserve" xmlns="http://www.tei-c.org/ns/1.0" 
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
xsi:schemaLocation="http://www.tei-c.org/ns/1.0 https://raw.githubusercontent.com/kermitt2/grobid/master/grobid-home/schemas/xsd/Grobid.xsd"
 xmlns:xlink="http://www.w3.org/1999/xlink">
	<teiHeader xml:lang="en">
		<fileDesc>
			<titleStmt>
				<title level="a" type="main">Federated Learning as an Analytical Framework for Personal Data Management - a proposition paper</title>
			</titleStmt>
			<publicationStmt>
				<publisher/>
				<availability status="unknown"><licence/></availability>
			</publicationStmt>
			<sourceDesc>
				<biblStruct>
					<analytic>
						<author>
							<persName><forename type="first">Maciej</forename><surname>Zuziak</surname></persName>
							<email>maciejkrzysztof.zuziak@isti.cnr.it</email>
							<affiliation key="aff0">
								<orgName type="institution">Consiglio Nazionale delle Ricerche</orgName>
								<address>
									<addrLine>Via Giuseppe Moruzzi, 1</addrLine>
									<settlement>Pisa</settlement>
									<country key="IT">Italy</country>
								</address>
							</affiliation>
						</author>
						<author>
							<persName><forename type="first">Salvatore</forename><surname>Rinzivillo</surname></persName>
							<email>rinzivillo@isti.cnr.it</email>
							<affiliation key="aff1">
								<orgName type="institution">Consiglio Nazionale delle Ricerche</orgName>
								<address>
									<addrLine>Via Giuseppe Moruzzi, 1</addrLine>
									<settlement>Pisa</settlement>
									<country key="IT">Italy</country>
								</address>
							</affiliation>
						</author>
						<title level="a" type="main">Federated Learning as an Analytical Framework for Personal Data Management - a proposition paper</title>
					</analytic>
					<monogr>
						<title level="m">International Workshop on Imagining the AI Landscape After the AI Act</title>
						<meeting><address><settlement>Amsterdam</settlement><country key="NL">Netherlands</country></address></meeting>
						<imprint>
							<date type="published" when="2022-05">May 2022</date>
						</imprint>
					</monogr>
					<idno type="MD5">5AD23A5DA36898C3C512E204D77FFA61</idno>
				</biblStruct>
			</sourceDesc>
		</fileDesc>
		<encodingDesc>
			<appInfo>
				<application version="0.7.2" ident="GROBID" when="2023-03-23T23:34+0000">
					<desc>GROBID - A machine learning software for extracting information from scholarly documents</desc>
					<ref target="https://github.com/kermitt2/grobid"/>
				</application>
			</appInfo>
		</encodingDesc>
		<profileDesc>
			<textClass>
				<keywords>
					<term>Federated Learning</term>
					<term>General Data Protection Regulation</term>
					<term>Data Management</term>
				</keywords>
			</textClass>
			<abstract>
<div xmlns="http://www.tei-c.org/ns/1.0"><p>Data minimisation and storage limitation are two principles incorporated in the GDPR, aimed at increasing data subjects' control over their own data and at restricting the amount of information that may be extracted from them in the data mining process. Implementing those two principles has always been a challenging task, as their interpretation is discretionary and current legislative measures may not necessarily protect data subjects adequately. In this paper, we introduce the concept of distributed learning as a viable tool for implementing the data minimisation and storage limitation principles and argue that it could be appropriate to consider a branch of distributed learning, namely federated learning, as an analytical measure for guaranteeing data limitation and minimisation. To further support this thesis, we discuss how Federated Learning may be used in geospatial data analysis; the final outcomes of the experiments are yet to be published.</p></div>
			</abstract>
		</profileDesc>
	</teiHeader>
	<text xml:lang="en">
		<body>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="1.">Introduction</head><p>This article lays down a proposal for the extended study of Federated Learning as a viable tool for implementing the data minimisation and storage limitation principles incorporated in Article 5 of the General Data Protection Regulation of 2016. In this paper, we argue several theses:</p><p>1. The data minimisation and storage limitation principles are two discretionary measures that rely heavily on the technological framework applied for data collection &amp; data processing; 2. Distributed learning in general, and Federated Learning especially, are viable tools for the implementation of the data minimisation &amp; storage limitation principles, thus giving data subjects increased control over their personal data; 3. Distributed learning, especially federated learning (which we consider a sub-branch of the former), should be further assessed in the context of minimisation-aware systems for digital-services-oriented data mining; 4. It could be beneficial to user awareness to implement an architecture based on the idea of distributed learning and to provide users with a backchannel about the training process and the use of their data. That way, we can not only raise awareness about the value of personal data but also possibly mitigate the adverse effects of using the users' devices to handle part of the model training.</p><p>In connection to thesis no. 2, multiple authors have published on the topic of privacy-preserving federated learning systems <ref type="bibr" target="#b0">[1]</ref><ref type="bibr" target="#b1">[2]</ref><ref type="bibr" target="#b2">[3]</ref><ref type="bibr" target="#b3">[4]</ref><ref type="bibr" target="#b4">[5]</ref><ref type="bibr" target="#b5">[6]</ref><ref type="bibr" target="#b6">[7]</ref><ref type="bibr" target="#b7">[8]</ref>. 
We propose putting federated learning in the context of the data minimisation and storage limitation principles of the General Data Protection Regulation and the upcoming legal framework for data protection and AI regulation. Data protection by design, as presented by S. Rosello et al. <ref type="bibr" target="#b8">[9]</ref> in 2021, may be a viable solution to many complicated issues arising from the need to comply with an increasing number of regulations, and federated learning as a tool for achieving it is gaining increased attention <ref type="bibr" target="#b9">[10]</ref>. We want to contribute to the ongoing discussion by putting distributed learning systems in the context of two specific principles, namely the principles of data minimisation &amp; storage limitation. In connection to theses no. 3 and no. 4, we present here an outline of a possible experiment that may be carried out to assess the performance of the proposed measures. We also briefly argue why low-user-engagement methods such as federated learning may be the best choice for the latter.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="2.">Data Minimisation and Storage Limitation Principles in the GDPR</head><p>Data minimisation and storage limitation are two terms that belong to the broader set of principles that refer to data quality. Together with 1) lawfulness, fairness and transparency, 2) purpose limitation, 3) accuracy, and 4) integrity and confidentiality, they shape the way personal data should be controlled, processed and discarded throughout the whole knowledge discovery cycle <ref type="bibr" target="#b10">[11]</ref>. According to Article 5 of the Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) <ref type="bibr" target="#b11">[12]</ref>, personal data shall be: (c) adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed ('data minimisation'); (e) kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the personal data are processed; personal data may be stored for longer periods insofar as the personal data will be processed solely for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes in accordance with Article 89(1) subject to implementation of the appropriate technical and organisational measures required by this Regulation in order to safeguard the rights and freedoms of the data subject ('storage limitation');</p><p>2. The controller shall be responsible for, and be able to demonstrate compliance with, paragraph 1 ('accountability'). 
<ref type="foot" target="#foot_0">2</ref>The data minimisation principle is not a new measure, as it was already incorporated in Article 3(1)(c) of Regulation (EC) No 45/2001 of the European Parliament and of the Council of 18 December 2000 on the protection of individuals with regard to the processing of personal data by the Community institutions and bodies and on the free movement of such data <ref type="bibr" target="#b12">[13]</ref>, and the wording of that principle was almost the same as the one incorporated in the GDPR. Data minimisation primarily concerns which types (and what amount) of data are targeted for extraction, while storage limitation generally specifies how long and under what conditions personal data may be stored.</p><p>In line with data minimisation and storage limitation is the principle of purpose limitation, according to which personal data shall be collected for specified, explicit and legitimate purposes and not further processed in a manner that is incompatible with those purposes <ref type="bibr" target="#b13">[14]</ref>-<ref type="bibr" target="#b15">[16]</ref>. Although essential in nature, it must be highlighted that it is generally described as different and independent from the data minimisation and storage limitation principles discussed in this article.</p><p>According to the Information Commissioner's Office (ICO) 3, the principle of data minimisation requires that the processed personal data should be sufficient to fulfil the stated purpose properly [adequate], have a rational link to that purpose [relevant], and not be held in an amount that exceeds what is strictly necessary for that purpose [limited to what is necessary] <ref type="bibr">[17]</ref>.</p><p>This matter could be analysed from a solely legal or a technological side, depending on the chosen aspect. To provide an example, adequateness and relevance could be seen as primarily legal issues, connected to the stated purpose of the personal data processing and to establishing a rational link between the processing and that purpose, while storage limitation is a more technical measure that relies heavily on how we store, preprocess and analyse the collected data. It can be argued that it is almost impossible to implement regulatory measures that could guarantee that the amount of collected data is adequate, as data subjects have minimal insight into how much of their data is collected and what types of data are extracted. Once the raw data is transferred beyond the user's device into Data Warehouses and Analytical Databases <ref type="bibr" target="#b17">[18]</ref>, it is almost impossible to guarantee any level of compliance with the storage limitation principle, as the users themselves would have to control and oversee the whole data lifecycle, with multiple inquiries and requests issued towards the data controller and the data processor to ensure that they comply with the binding regulations.</p><p>The necessity to enforce better privacy standards while seeking beyond purely legal remedies has inspired some researchers to reconsider their approach to data protection compliance. 
In 2019, the European Union Agency for Cybersecurity (ENISA) published Recommendations on shaping technology according to GDPR provisions - Exploring the notion of data protection by default <ref type="bibr" target="#b18">[19]</ref>, while in the following years the concept of data protection by design has caught the attention of several authors <ref type="bibr" target="#b8">[9]</ref>. It is worth noting that federated learning is one of the technologies most commonly distinguished in the context of GDPR-compliant technology <ref type="bibr" target="#b9">[10]</ref>, <ref type="bibr" target="#b19">[20]</ref>, while some publications also pointed out the risks associated with that method of distributed learning <ref type="bibr" target="#b20">[21]</ref>  4 .</p><p>One more thing must be highlighted in the context of the data minimisation and storage limitation principles. While we assume that protecting their personal data is in the best interest of the data subjects, it is the sole responsibility of the data processors and controllers to comply with any obligations arising from those principles. At this point, we would also like to briefly elaborate on the choice of relevant technology, as this is particularly connected with the above. Over recent years, many proposals regarding decentralised learning and analytics have been put forward, and some of them, such as Personal Data Stores, have attracted widespread attention <ref type="bibr" target="#b21">[22]</ref>, <ref type="bibr" target="#b22">[23]</ref>. In our opinion, in the current data economy paradigm, the architecture for decentralised learning should be unintrusive, secure, and effortless from the user's point of view. 
It is essential to note that forcing users to opt for more time- and knowledge-consuming solutions could shift the burden of the data minimisation and storage limitation principles towards the data subject, which would be unacceptable from the axiological point of view. While the shift towards a more decentralised ecosystem may result in the adoption of more user-centric methods (such as PDS/PBS), we present here a "one-step approach" where the data subjects gain more control over their data without directly shifting the paradigm of current data processing rights and obligations under the European legislation. In the context of the unintrusive-secure-effortless paradigm, we firmly believe that distributed learning is a viable choice. In this article, we refer both to the guidelines and explanations of the European Data Protection Supervisor (EDPS) and to those provided by the Information Commissioner's Office (ICO). If there are any discrepancies between the GDPR and the UK GDPR, we raise and explain them in advance. We also use those references to put the overviewed principles in a slightly broader context. <ref type="bibr" target="#b3">4</ref> Due to the conceptual nature of this article, we will not go into much detail regarding the privacy issues that may be found in FL. However, it is worth highlighting that FL-based systems may be more prone to some types of attacks that can infringe the participants' privacy. For a short synopsis of those issues see: Inpher: The Privacy Risk Right Under Our Nose in Federated Learning (Part 1), <ref type="bibr" target="#b22">23</ref>  </p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="3.">Distributed Learning and Federated Learning</head><p>Federated learning originated from the idea of training a model on a dataset distributed over a wide area. Generally, federated optimisation was proposed to handle data that is:</p><p>• Massively distributed (data points are stored across a vast number of nodes); • Not independent and identically distributed (data on each node may be drawn from a different distribution); • Unbalanced (different nodes may vary by many orders of magnitude in the number of training examples they hold) <ref type="bibr" target="#b23">[24]</ref>.</p><p>The experiment performed by J. Konecny et al. in 2015 was conducted under the following assumptions:</p><p>• The data stored on multiple nodes may be privacy-sensitive, so the key objective should be to train the model on local nodes and not to transfer the data to one central node;</p><p>• Some of the nodes connected to the network (or all of them) may not necessarily have stable access to the network, so in real-life circumstances it will be crucial to minimise the rounds of communication;</p><p>• The data is not independent and identically distributed <ref type="bibr" target="#b23">[24]</ref>.</p><p>In the years following the proposed experiment, Federated Learning gained popularity amongst the Data Science community, with much work centered on privacy-related issues. According to the current state of the art, Federated Learning can be defined as a machine learning setting where many clients (e.g. mobile devices or whole organisations) collaboratively train a model under the orchestration of a central server (e.g. a service provider) while keeping the training data decentralised.</p><p>Formally, the problem was defined as a minimisation of the objective function: </p><formula xml:id="formula_1">F(x) = E_{i~P}[F_i(x)], where F_i(x) = E_{ε~D_i}[f_i(x, ε)]</formula></div>
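The setting described above (clients minimising local objectives, a server orchestrating without seeing raw data) can be illustrated with a minimal NumPy sketch of FedAvg-style training. The data generation, least-squares local objectives, and all hyperparameters here are illustrative assumptions, not taken from the cited experiments; the sketch only shows the mechanic of local updates followed by a size-weighted average.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated federation: unbalanced, non-IID local datasets (clients differ
# in size and in the distribution their examples are drawn from).
def make_client(n, shift):
    X = rng.normal(shift, 1.0, size=(n, 3))
    w_true = np.array([1.0, -2.0, 0.5])
    y = X @ w_true + 0.1 * rng.normal(size=n)
    return X, y

clients = [make_client(n, s) for n, s in [(200, 0.0), (40, 2.0), (800, -1.0)]]

def local_update(w, X, y, lr=0.01, epochs=5):
    # A few epochs of gradient descent on the local objective F_i(x)
    # (here: mean squared error), without ever sharing the raw data.
    w = w.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

w = np.zeros(3)
for _round in range(50):  # communication rounds
    updates = [local_update(w, X, y) for X, y in clients]
    sizes = np.array([len(y) for _, y in clients])
    # Server step: size-weighted average of the client models (FedAvg).
    w = np.average(updates, axis=0, weights=sizes)

print(np.round(w, 2))
```

Only the model parameters cross the network in each round; the raw examples stay on the simulated clients, which is precisely the property the data minimisation argument relies on.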
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="4.">Distributed Learning and Federated Learning</head><p>In the previous sections, we placed federated learning in the context of the data minimisation and storage limitation principles of the GDPR. In this chapter, we want to propose a specific application scenario that could be carried out regarding the assessed framework. Before overviewing the proposed experiment, we would like to formulate a few key remarks on the characteristics of a system that would favour the implementation of the data minimisation &amp; storage limitation principles. Namely:</p><p>• The system should minimise the amount of raw data transferred beyond users' devices (beyond the realm of the clients). The system should also explicitly allow users to choose whether they want to participate in the training loop, while clearly indicating that it may be beneficial but is not necessary to participate in the model's training.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>•</head><p>If the users consent to participation in the training, the system should explicitly declare that they can withdraw from it at any time they deem appropriate.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>•</head><p>Users should have a range of information on how the system is trained, what type of data is processed on their local devices, and which hyperparameters of the general model are being updated in the current (or upcoming) training iterations.</p><p>In accordance with that, we want to realise an analytical framework where collaborative data computation is possible on spatio-temporal data. In particular, we focus on the analysis of individual-based contributed GPS data collected during the movement of personal vehicles.</p><p>The analytical framework will have several capabilities: computation of aggregation-based indicators (i.e. the radius of gyration, CO2 emission estimation); collective patterns (i.e. aggregated traffic flows and models for description and prediction; profiling of user behaviour; estimation of sustainability-compatible behaviour); global models (i.e. the temporal footprint of traffic evolution, learning of predictive models for traffic forecasting, etc.). These three dimensions need to be instantiated in a distributed/federated setting, where several computational challenges need to be addressed:</p><p>• the individual-based choice of participation, managing a range of levels of collaboration, from full data disclosure to avoiding any type of participation, passing through different levels of data perturbation and obfuscation;</p><p>• implementing a one-against-all framework, where the client may share only a locally learned model that can be compared with the global one, to give the user feedback and raise self-awareness; • designing mechanisms that allow a client to opt out, eventually refreshing the models already learned.</p><p>Apart from performing the experiment, many mixed technological and legal issues arise in connection with the distributed learning environment (and federated learning especially). 
Those problems have not yet been thoroughly researched, and they may pose a tremendous challenge when discussing the distributed data processing environment. A few exemplary questions in that regard:</p><p>• What are the legal consequences of opting out by a user who participated in the original training of the model?</p></div>
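The aggregation-based indicators mentioned above can be computed entirely on the client, so that only the aggregate (possibly perturbed) leaves the device. A minimal sketch for the radius of gyration, with an optional Laplace-noise step standing in for one of the perturbation/obfuscation levels; the function names and the noise scale are illustrative, not part of the proposed framework.

```python
import numpy as np

def radius_of_gyration(points):
    """Root mean square distance of a trajectory's points from their
    centre of mass (planar coordinates; Euclidean distance is used here
    for simplicity, not great-circle distance)."""
    points = np.asarray(points, dtype=float)
    center = points.mean(axis=0)
    return float(np.sqrt(((points - center) ** 2).sum(axis=1).mean()))

def obfuscated(value, scale, rng=None):
    """One possible perturbation level: report the indicator with
    additive Laplace noise instead of its exact value."""
    rng = rng or np.random.default_rng()
    return value + rng.laplace(0.0, scale)

# A user moving along a 100 m east-west segment: each endpoint lies
# 50 m from the centre, so the radius of gyration is 50.0.
traj = np.array([[0.0, 0.0], [100.0, 0.0]])
exact = radius_of_gyration(traj)
print(exact)                       # 50.0
print(obfuscated(exact, scale=5))  # 50.0 plus Laplace(0, 5) noise
```

Only `exact` (or its noisy version) would ever need to leave the device, while the raw GPS points stay local.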
<div xmlns="http://www.tei-c.org/ns/1.0"><head>•</head><p>If the user has opted out of the model, what are the measures that can be taken to rectify the model and possibly delete any traces of personal data from it? • How can we communicate and explain the training process to the users of edge devices? How can we keep them from disconnecting from the network or opting out of the training?</p><p>The successful implementation of the said experiment will allow us to contribute to the discussion on federated learning as well as to further explore the concept of using distributed learning as a tool for the implementation of the data minimisation and storage limitation principles.</p><p>Thus far, many experiments have been conducted, and federated learning has been tested in different settings and circumstances. Federated learning was used to deliver experiments on, among other things: recommendation systems <ref type="bibr" target="#b0">[1]</ref>, <ref type="bibr" target="#b1">[2]</ref>, <ref type="bibr" target="#b25">[26]</ref>, meta-learning systems for fraudulent credit card detection <ref type="bibr" target="#b26">[27]</ref> or learning systems for mobile keyboard prediction <ref type="bibr" target="#b28">[28]</ref>.</p><p>One advantage of working on that particular technology is the wide range of tools that may be used to simulate a decentralised environment, allowing researchers to focus on a particular problem rather than on implementing the technological framework from scratch: TensorFlow Federated (TFF) [29], FedML <ref type="bibr" target="#b29">[30]</ref>, PySyft <ref type="bibr" target="#b30">[31]</ref>, PyVertical <ref type="bibr" target="#b31">[32]</ref> and Leaf <ref type="bibr" target="#b32">[33]</ref> are just a few examples of tools that can be used to work with the concept of distributed learning while conducting experiments.</p></div>
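For the opt-out question, one partial technical measure is to keep per-round client updates separable at the server, so that a withdrawing client's contribution can at least be excluded from the current aggregate; fully removing its influence from models learned in past rounds is the harder, open "machine unlearning" problem and is not solved by this sketch. The function and variable names below are illustrative assumptions.

```python
import numpy as np

def aggregate(updates, sizes, exclude=()):
    """FedAvg-style weighted average of client model updates,
    skipping clients (by index) that have opted out of this round."""
    keep = [i for i in range(len(updates)) if i not in set(exclude)]
    if not keep:
        raise ValueError("no participating clients left")
    weights = np.array([sizes[i] for i in keep], dtype=float)
    kept = np.array([updates[i] for i in keep], dtype=float)
    return np.average(kept, axis=0, weights=weights)

# Three clients' updates for a 2-parameter model, with dataset sizes.
updates = [np.array([1.0, 1.0]), np.array([3.0, 3.0]), np.array([5.0, 5.0])]
sizes = [100, 100, 200]

full = aggregate(updates, sizes)          # all three clients participate
partial = aggregate(updates, sizes, {1})  # client 1 has opted out
print(full, partial)
```

Keeping updates separable per round makes exclusion trivial for the current aggregate; whether and how this obligation extends to already-trained global models is exactly the legal question raised above.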
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="5.">Closing Remarks</head><p>Unquestionably, the decentralisation of data collection and processing is a promising concept that may shift the paradigm toward a more equitable and engaging future of collaborative data science. The idea of distributed learning was born from strict necessity: with the growing amount of data that must be processed, it is increasingly hard to rely on centralised methods that would require constant expansion of storage (and computational) resources. However, the major challenge may not necessarily arise from the optimisation problems but from reaching a specific level of compliance with the current legislation and sustaining a high level of collaboration with the users of edge devices.</p><p>We have presented our view on the development of that technology, with strong emphasis placed on the principles of data minimisation and storage limitation. It is crucial to present users with a clear and well-defined trade-off; without that, the cost of sacrificing some of their devices' computational power may deter them from such decentralised frameworks. It must be stated that we have taken into account primarily one approach to distributed learning, namely federated learning. In our opinion, it could satisfy the unintrusive-secure-effortless paradigm that we shared earlier. Notwithstanding any other benefits coming from such an approach, much research remains to be conducted to make that technology fully compliant and user-friendly.</p></div><figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_0"><head>Article 5 Principles</head><label>5</label><figDesc>relating to processing of personal data Personal data shall be: (c) adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed ('data minimisation');</figDesc></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_1"><head>3</head><label>3</label><figDesc></figDesc></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_2"><head></head><label></label><figDesc>where: • 𝑥 ∈ ℝ^𝑑 represents the parameters of the global model; • 𝐹_𝑖 : ℝ^𝑑 → ℝ denotes the local objective function at client i; • P denotes the distribution of the population of clients [25]. 5</figDesc></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" type="table" xml:id="tab_0"><head></head><label></label><figDesc>February 2021; and for more detailed analysis see especially: Nguyen Truong et al.: Privacy Preservation in Federated Learning: An insightful survey from the GDPR Perspective, 18 March 2021; or L. Melis et al.: Exploiting Unintended Feature Leakage in Collaborative Learning, 1 November 2018.</figDesc><table /></figure>
			<note xmlns="http://www.tei-c.org/ns/1.0" place="foot" n="2" xml:id="foot_0">The following principles are also elaborated on in recital no.</note>
			<note xmlns="http://www.tei-c.org/ns/1.0" place="foot" n="39" xml:id="foot_1">of the Regulation (EU) 2016/679.</note>
			<note xmlns="http://www.tei-c.org/ns/1.0" place="foot" n="5" xml:id="foot_2">This definition, presented in Jianyu Wang et al. - A Field Guide to Federated Optimization, is based on an overview of the existing work on the problem; it may be worth noting that different authors approach the same problem quite differently when putting it in a formal manner.</note>
		</body>
		<back>

			<div type="acknowledgement">
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="6.">Acknowledgements</head><p>The research is part of the Legality Attentive Data Scientist project. The project has received funding from the European Union's Horizon 2020 Marie Skłodowska-Curie Actions (MSCA) Innovative Training Networks (ITN), Grant Agreement ID: 956562.</p><p>This Word template was created by Aleksandr Ometov, TAU, Finland. The template is made available under a Creative Commons License Attribution-ShareAlike 4.0 International (CC BY-SA 4.0).</p></div>
			</div>

			<div type="references">

				<listBibl>

<biblStruct xml:id="b0">
	<analytic>
		<title level="a" type="main">Privacy-Preserving News Recommendation Model Learning</title>
		<author>
			<persName><forename type="first">Tao</forename><surname>Qi</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Fangzhao</forename><surname>Wu</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Chuhan</forename><surname>Wu</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Yongfeng</forename><surname>Huang</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Xing</forename><surname>Xie</surname></persName>
		</author>
		<idno type="DOI">10.18653/v1/2020.findings-emnlp.128</idno>
	</analytic>
	<monogr>
		<title level="m">Findings of the Association for Computational Linguistics: EMNLP 2020</title>
				<imprint>
			<date type="published" when="2020-11">Nov. 2020</date>
			<biblScope unit="page" from="1423" to="1432" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b1">
	<analytic>
		<title level="a" type="main">Privacy-preserving federated learning framework in multimedia courses recommendation</title>
		<author>
			<persName><forename type="first">Yangjie</forename><surname>Qin</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Ming</forename><surname>Li</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Jia</forename><surname>Zhu</surname></persName>
		</author>
		<idno type="DOI">10.1007/s11276-021-02854-1</idno>
	</analytic>
	<monogr>
		<title level="j">Wirel. Netw</title>
		<imprint>
			<date type="published" when="2022-01">Jan. 2022</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b2">
	<analytic>
		<title level="a" type="main">Federated Learning With Blockchain for Autonomous Vehicles: Analysis and Design Challenges</title>
		<author>
			<persName><forename type="first">S</forename><forename type="middle">R</forename><surname>Pokhrel</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Choi</surname></persName>
		</author>
		<idno type="DOI">10.1109/TCOMM.2020.2990686</idno>
	</analytic>
	<monogr>
		<title level="j">IEEE Trans. Commun</title>
		<imprint>
			<biblScope unit="volume">68</biblScope>
			<biblScope unit="issue">8</biblScope>
			<biblScope unit="page" from="4734" to="4746" />
			<date type="published" when="2020-08">Aug. 2020</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b3">
	<analytic>
		<title level="a" type="main">Federated Collaborative Filtering for Privacy-Preserving Personalized Recommendation System</title>
		<author>
			<persName><forename type="first">M</forename><surname>Ammad-Ud-Din</surname></persName>
		</author>
		<ptr target="http://arxiv.org/abs/1901.09888" />
	</analytic>
	<monogr>
		<title level="j">arXiv preprint arXiv:1901.09888</title>
		<imprint>
			<date type="published" when="2019-01">Jan. 2019</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b4">
	<analytic>
		<title level="a" type="main">A Federated Learning Approach for Privacy Protection in Context-Aware Recommender Systems</title>
		<author>
			<persName><forename type="first">Waqar</forename><surname>Ali</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Rajesh</forename><surname>Kumar</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Zhiyi</forename><surname>Deng</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Yanshong</forename><surname>Wang</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Jie</forename><surname>Shao</surname></persName>
		</author>
		<idno type="DOI">10.1093/comjnl/bxab025</idno>
	</analytic>
	<monogr>
		<title level="j">Comput. J</title>
		<imprint>
			<biblScope unit="volume">64</biblScope>
			<biblScope unit="issue">7</biblScope>
			<biblScope unit="page" from="1016" to="1027" />
			<date type="published" when="2021-07">Jul. 2021</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b5">
	<analytic>
		<title level="a" type="main">A Blockchain-Based Decentralised Federated Learning Framework with Committee Consensus</title>
		<author>
			<persName><forename type="first">Yuzheng</forename><surname>Li</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Chuan</forename><surname>Chen</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Nan</forename><surname>Liu</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Huawei</forename><surname>Huang</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Zibin</forename><surname>Zheng</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Qiang</forename><surname>Yan</surname></persName>
		</author>
		<idno type="DOI">10.1109/MNET.011.2000263</idno>
	</analytic>
	<monogr>
		<title level="j">IEEE Netw</title>
		<imprint>
			<biblScope unit="volume">35</biblScope>
			<biblScope unit="issue">1</biblScope>
			<biblScope unit="page" from="234" to="241" />
			<date type="published" when="2021-01">Jan. 2021</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b6">
	<analytic>
		<title level="a" type="main">The Distributed Discrete Gaussian Mechanism for Federated Learning with Secure Aggregation</title>
		<author>
			<persName><forename type="first">P</forename><surname>Kairouz</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Ziyu</forename><surname>Liu</surname></persName>
		</author>
		<author>
			<persName><forename type="first">T</forename><surname>Steinke</surname></persName>
		</author>
		<ptr target="https://proceedings.mlr.press/v139/kairouz21a.html" />
	</analytic>
	<monogr>
		<title level="m">Proceedings of the 38th International Conference on Machine Learning</title>
				<meeting>the 38th International Conference on Machine Learning</meeting>
		<imprint>
			<date type="published" when="2021-07">Jul. 2021</date>
			<biblScope unit="page" from="5201" to="5212" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b7">
	<analytic>
		<title level="a" type="main">Federated Learning and Privacy - Building privacy-preserving systems for machine learning and data science on decentralised data</title>
		<author>
			<persName><forename type="first">K</forename><surname>Bonawitz</surname></persName>
		</author>
		<author>
			<persName><forename type="first">P</forename><surname>Kairouz</surname></persName>
		</author>
		<author>
			<persName><forename type="first">B</forename><surname>McMahan</surname></persName>
		</author>
		<author>
			<persName><forename type="first">D</forename><surname>Ramage</surname></persName>
		</author>
		<ptr target="https://queue.acm.org/detail.cfm?id=3501293" />
	</analytic>
	<monogr>
		<title level="j">ACM Queue</title>
		<imprint>
			<biblScope unit="volume">19</biblScope>
			<biblScope unit="issue">5</biblScope>
			<date type="published" when="2021-11">Nov. 2021</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b8">
	<monogr>
		<title level="m" type="main">Data protection by design in AI? The case of federated learning</title>
		<author>
			<persName><forename type="first">R</forename><surname>Stephanie</surname></persName>
		</author>
		<idno>3879613</idno>
		<ptr target="https://papers.ssrn.com/abstract=3879613" />
		<imprint>
			<date type="published" when="2021-05">May 2021</date>
			<pubPlace>Rochester, NY</pubPlace>
		</imprint>
		<respStmt>
			<orgName>Social Science Research Network</orgName>
		</respStmt>
	</monogr>
	<note>SSRN Scholarly Paper</note>
</biblStruct>

<biblStruct xml:id="b9">
	<analytic>
		<title level="a" type="main">Privacy preservation in federated learning: An insightful survey from the GDPR perspective</title>
		<author>
			<persName><forename type="first">N</forename><surname>Truong</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Kai</forename><surname>Sun</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Siyao</forename><surname>Wang</surname></persName>
		</author>
		<author>
			<persName><forename type="first">F</forename><surname>Guitton</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Yike</forename><surname>Guo</surname></persName>
		</author>
		<idno type="DOI">10.1016/j.cose.2021.102402</idno>
	</analytic>
	<monogr>
		<title level="j">Comput. Secur</title>
		<imprint>
			<biblScope unit="volume">110</biblScope>
			<biblScope unit="page">102402</biblScope>
			<date type="published" when="2021-11">Nov. 2021</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b10">
	<monogr>
		<ptr target="https://edps.europa.eu/data-protection/data-protection/glossary/d_en" />
		<title level="m">Data Protection Glossary, An official website of the European Union</title>
				<imprint>
2022. 07">
			<date type="published" when="2022">2022</date>
		</imprint>
		<respStmt>
			<orgName>European Data Protection Supervisor</orgName>
		</respStmt>
	</monogr>
</biblStruct>

<biblStruct xml:id="b11">
	<monogr>
		<title level="m">Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation)</title>
		<ptr target="https://eur-lex.europa.eu/eli/reg/2016/679/oj" />
		<imprint>
			<date type="published" when="2016-04-27">27 April, 2016</date>
		</imprint>
	</monogr>
	<note>Text with EEA relevance</note>
</biblStruct>

<biblStruct xml:id="b12">
	<monogr>
		<title level="m">Regulation (EC) No 45/2001 of the European Parliament and of the Council of 18 December 2000 on the protection of individuals with regard to the processing of personal data by the Community institutions and bodies and on the free movement of such data</title>
		<ptr target="http://data.europa.eu/eli/reg/2001/45/oj/eng" />
		<imprint>
			<date type="published" when="2001">2001</date>
			<biblScope unit="volume">008</biblScope>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b13">
	<analytic>
		<title level="a" type="main">The function of the principle of purpose limitation in light of Article 8 ECFR and further fundamental rights</title>
		<author>
			<persName><forename type="first">M</forename><surname>Von Grafenstein</surname></persName>
		</author>
		<ptr target="https://www.jstor.org/stable/j.ctv941v5w.5" />
	</analytic>
	<monogr>
		<title level="m">The Principle of Purpose Limitation in Data Protection Laws</title>
				<imprint>
			<publisher>Nomos Verlagsgesellschaft mbH</publisher>
			<date type="published" when="2018">2018</date>
			<biblScope unit="page" from="109" to="596" />
		</imprint>
	</monogr>
	<note>1st ed</note>
</biblStruct>

<biblStruct xml:id="b14">
	<monogr>
		<ptr target="https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/principles/purpose-limitation/" />
		<title level="m">GDPR: Principle (b): Purpose limitation, Information Commissioner&apos;s Office Website</title>
				<imprint>
			<date type="published" when="2022-01-17">17 January, 2022</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b15">
	<monogr>
		<author>
			<persName><surname>Dataguise</surname></persName>
		</author>
		<ptr target="https://www.dataguise.com/gdpr-knowledge-center/purpose-limitation-principle/" />
		<title level="m">GDPR Purpose Limitation Principle -GDPR Knowledge Center</title>
				<imprint>
			<publisher>Dataguise Website</publisher>
2022. 07">
			<date type="published" when="2022">2022</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b16">
	<monogr>
		<ptr target="https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/principles/data-minimisation/" />
		<title level="m">GDPR: Principle (c): Data minimisation, Information Commissioner&apos;s Office Website</title>
				<imprint>
			<date type="published" when="2021-02-11">11 February, 2021</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b17">
	<monogr>
		<title level="m" type="main">Practical Applications of Data Mining</title>
		<author>
			<persName><forename type="first">S</forename><surname>Suh</surname></persName>
		</author>
		<imprint>
			<date type="published" when="2012">2012</date>
			<publisher>Jones &amp; Bartlett Publishers</publisher>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b18">
	<monogr>
		<author>
			<persName><surname>ENISA</surname></persName>
		</author>
		<ptr target="https://www.enisa.europa.eu/publications/recommendations-on-shaping-technology-according-to-gdpr-provisions-part-2" />
		<title level="m">Recommendations on shaping technology according to GDPR provisions - Exploring the notion of data protection by default, ENISA website</title>
				<imprint>
			<date type="published" when="2022-04-07">07 April, 2022</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b19">
	<analytic>
		<title level="a" type="main">Benefits and challenges of federated learning under the GDPR</title>
		<author>
			<persName><surname>MUSKETEER</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Medium</title>
		<imprint/>
	</monogr>
</biblStruct>

<biblStruct xml:id="b20">
	<monogr>
		<author>
			<persName><surname>Inpher</surname></persName>
		</author>
		<ptr target="https://inpher.io/journal-blog/the-privacy-risk-right-under-our-nose-in-federated-learning-part-1/" />
		<title level="m">The Privacy Risk Right Under Our Nose in Federated Learning (Part 1), Inpher Website</title>
				<imprint>
			<date type="published" when="2021-02-23">23 February, 2021</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b21">
	<analytic>
		<title level="a" type="main">openPDS: Protecting the Privacy of Metadata through SafeAnswers</title>
		<author>
			<persName><forename type="first">Y.-A</forename><surname>De Montjoye</surname></persName>
		</author>
		<author>
			<persName><forename type="first">E</forename><surname>Shmueli</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Samuel</forename><forename type="middle">S</forename><surname>Wang</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><forename type="middle">S</forename><surname>Pentland</surname></persName>
		</author>
		<idno type="DOI">10.1371/journal.pone.0098790</idno>
	</analytic>
	<monogr>
		<title level="j">PLOS ONE</title>
		<imprint>
			<biblScope unit="volume">9</biblScope>
			<biblScope unit="issue">7</biblScope>
			<biblScope unit="page">e98790</biblScope>
			<date type="published" when="2014-07">Jul. 2014</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b22">
	<analytic>
		<title level="a" type="main">My data store: toward user awareness and control on personal data</title>
		<author>
			<persName><forename type="first">M</forename><surname>Vescovi</surname></persName>
		</author>
		<author>
			<persName><forename type="first">C</forename><surname>Perentis</surname></persName>
		</author>
		<author>
			<persName><forename type="first">C</forename><surname>Leonardi</surname></persName>
		</author>
		<author>
			<persName><forename type="first">B</forename><surname>Lepri</surname></persName>
		</author>
		<author>
			<persName><forename type="first">C</forename><surname>Moiso</surname></persName>
		</author>
		<idno type="DOI">10.1145/2638728.2638745</idno>
	</analytic>
	<monogr>
		<title level="m">Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing</title>
				<meeting>the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing<address><addrLine>New York, NY, USA</addrLine></address></meeting>
		<imprint>
			<publisher>Adjunct Publication</publisher>
			<date type="published" when="2014-09">Sep. 2014</date>
			<biblScope unit="page" from="179" to="182" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b23">
	<analytic>
		<title level="a" type="main">Federated Optimization: Distributed Optimization Beyond the Datacenter</title>
		<author>
			<persName><forename type="first">J</forename><surname>Konečný</surname></persName>
		</author>
		<author>
			<persName><forename type="first">B</forename><surname>McMahan</surname></persName>
		</author>
		<author>
			<persName><forename type="first">D</forename><surname>Ramage</surname></persName>
		</author>
		<ptr target="http://arxiv.org/abs/1511.03575" />
	</analytic>
	<monogr>
		<title level="m">arXiv preprint arXiv:1511.03575</title>
				<imprint>
			<date type="published" when="2015-11">Nov. 2015</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b24">
	<monogr>
		<title level="m" type="main">A Field Guide to Federated Optimization</title>
		<author>
			<persName><forename type="first">Jianyu</forename><surname>Wang</surname></persName>
		</author>
		<idno>ArXiv210706917</idno>
		<ptr target="http://arxiv.org/abs/2107.06917" />
		<imprint>
			<date type="published" when="2021-07">Jul. 2021</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b25">
	<analytic>
		<title level="a" type="main">FedRec++: Lossless Federated Recommendation with Explicit Feedback</title>
		<author>
			<persName><forename type="first">Feng</forename><surname>Liang</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Weike</forename><surname>Pan</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Zhong</forename><surname>Ming</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">Proc. AAAI Conf. Artif. Intell</title>
				<meeting>AAAI Conf. Artif. Intell</meeting>
		<imprint>
			<date type="published" when="2021-05">May 2021</date>
			<biblScope unit="volume">35</biblScope>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b26">
	<analytic>
		<title level="a" type="main">Federated Meta-Learning for Fraudulent Credit Card Detection</title>
		<author>
			<persName><forename type="first">Wenbo</forename><surname>Zheng</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Lan</forename><surname>Yan</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Chao</forename><surname>Gou</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Fei-Yue</forename><surname>Wang</surname></persName>
		</author>
		<idno type="DOI">10.24963/ijcai.2020/642</idno>
	</analytic>
	<monogr>
		<title level="m">Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence</title>
				<meeting>the Twenty-Ninth International Joint Conference on Artificial Intelligence<address><addrLine>Yokohama, Japan</addrLine></address></meeting>
		<imprint>
			<date type="published" when="2020-07">Jul. 2020</date>
			<biblScope unit="page" from="4654" to="4660" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b28">
	<monogr>
		<title level="m" type="main">Federated Learning for Mobile Keyboard Prediction</title>
		<author>
			<persName><forename type="first">A</forename><surname>Hard</surname></persName>
		</author>
		<idno>ArXiv181103604</idno>
		<ptr target="http://arxiv.org/abs/1811.03604" />
		<imprint>
			<date type="published" when="2019-02">Feb. 2019</date>
		</imprint>
	</monogr>
	<note>See also TensorFlow Federated, https://www.tensorflow.org/federated</note>
</biblStruct>

<biblStruct xml:id="b29">
	<monogr>
		<ptr target="https://fedml.ai/" />
		<title level="m">FedML -The Federated Learning/Analytics and Edge AI Platform</title>
				<imprint>
			<publisher>FedML Official Website</publisher>
			<date type="published" when="2022">2022</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b30">
	<analytic>
		<title level="a" type="main">PySyft: A Library for Easy Federated Learning</title>
		<author>
			<persName><forename type="first">A</forename><surname>Ziller</surname></persName>
		</author>
		<idno type="DOI">10.1007/978-3-030-70604-3_5</idno>
	</analytic>
	<monogr>
		<title level="m">Federated Learning Systems: Towards Next-Generation AI</title>
				<editor>
			<persName><forename type="first">M</forename><forename type="middle">H</forename><surname>Ur Rehman</surname></persName>
		</editor>
		<editor>
			<persName><forename type="first">M</forename><forename type="middle">M</forename><surname>Gaber</surname></persName>
		</editor>
		<meeting><address><addrLine>Cham</addrLine></address></meeting>
		<imprint>
			<publisher>Springer International Publishing</publisher>
			<date type="published" when="2021">2021</date>
			<biblScope unit="page" from="111" to="139" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b31">
	<monogr>
		<title level="m" type="main">PyVertical: A Vertical Federated Learning Framework for Multi-headed SplitNN</title>
		<author>
			<persName><forename type="first">D</forename><surname>Romanini</surname></persName>
		</author>
		<idno>ArXiv210400489</idno>
		<ptr target="http://arxiv.org/abs/2104.00489" />
		<imprint>
			<date type="published" when="2021-04">Apr. 2021</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b32">
	<analytic>
		<title level="a" type="main">LEAF: A Benchmark for Federated Settings</title>
		<author>
			<persName><forename type="first">S</forename><surname>Caldas</surname></persName>
		</author>
		<ptr target="http://arxiv.org/abs/1812.01097" />
	</analytic>
	<monogr>
		<title level="m">arXiv preprint arXiv:1812.01097</title>
		<imprint>
			<date type="published" when="2019-12">Dec. 2019</date>
		</imprint>
	</monogr>
</biblStruct>

				</listBibl>
			</div>
		</back>
	</text>
</TEI>
