Designing Connected and Automated Vehicles around Legal and Ethical Concerns: Data Protection as a Corporate Social Responsibility
Paolo Balboni∗
paolo.balboni@maastrichtuniversity.nl
Maastricht University – Faculty of Law – Private Law
Bouillonstraat 3, 6211 LH Maastricht, The Netherlands
paolo.balboni@ictlegalconsulting.com
ICT Legal Consulting
Via Borgonuovo 12, 20122 Milan, Italy

Anastasia Botsi
anastasia.botsi@ictlegalconsulting.com
ICT Legal Consulting International
Piet Heinkade 55, 1019 GM Amsterdam, The Netherlands

Kate Francis
kate.francis@ictlegalconsulting.com
ICT Legal Consulting
Via Borgonuovo 12, 20122 Milan, Italy

Martim Taborda Barata
martim.tabordabarata@ictlegalconsulting.com
ICT Legal Consulting International
Piet Heinkade 55, 1019 GM Amsterdam, The Netherlands

ABSTRACT

Emerging technologies and tools based on Artificial Intelligence (AI), such as Connected and Automated Vehicles (CAVs), present novel regulatory and legal compliance challenges while at the same time raising important questions with respect to ethics and transparency.

On the one hand, CAVs bring to light theoretical and practical challenges to the implementation of the multi-dimensional obligations of the current European personal data protection legal framework, including the General Data Protection Regulation (GDPR), the ePrivacy Directive,1 and, where applicable, the Directive for a high common level of security and information systems (NIS Directive or NISD).2 As mere examples, CAV developers currently face multiple legal hurdles to overcome, including the necessity to fulfil controller and/or processor obligations in complex data processing scenarios3 and tensions with the GDPR’s principle of purpose limitation4 (which comes at odds with the autonomous processing of personal data through AI in the CAV, which may be based on a (re)interpretation of goals, or, possibly, a shift in focus from the original goal for which personal data was collected). Additionally, the overall need for relatively large datasets to properly train and leverage AI functionalities leads to conflicts with the principle of data minimization.5 When applied to AI systems, the requirement of data protection by design and by default also presents difficulties, as data protection by default is possible only when the necessary personal data is processed for a specific purpose.6 Moreover, the ePrivacy Directive has been interpreted by European Supervisory Authorities – notably, the European Data Protection Board (EDPB)7 – as requiring a company wishing to store or access information stored within a CAV to obtain specific consent from CAV users for these specific activities. Furthermore, an additional legal basis must be determined (possibly necessitating those companies to make a double request for consent) for any subsequent use of the information stored or accessed, such as the analysis of telematics data collected from a CAV. This interpretation creates challenges at the technical and legal levels, in particular where the legal basis defined for subsequent use of CAV information is not consent, such as in the case of pay-as-you-drive insurance, where the contract entered into between the CAV user and an insurance company serves as a legal basis for the processing of their personal data. A conflict therefore emerges in this context between the legal basis used for information storage/access – consent, which must be freely withdrawable under the GDPR8 – and the legal basis used for information use – e.g., performance of a contract, which will typically not be compatible with the possibility for the CAV user to freely prevent the insurance company from continuing to process their personal data. Concerns from the data security9 perspective are also highly relevant, notably due to the lack of shared security standards in the CAV domain and the increase of the potential attack surface caused by the interconnection of different CAV components.10

On the other hand, while European data protection legislation such as the GDPR, the ePrivacy Directive and the NISD provides a minimum level of legal safeguards for citizens, it may not suffice to maximize CAV benefits for users while minimizing their potential negative impact on society.11 In order to properly and comprehensively address the risks brought about by CAVs, ethics12 and human rights concerns must therefore take a central role in every stage of the CAV development lifecycle, embedding the notions of fairness, transparency, and security into design processes. Transparency13 is situated between the legal and ethical dimensions and is challenged by the complexity of AI systems, as well as the inherent autonomy and flexibility of automated decision-making, and is key in the development of the framework as a prerequisite for trustworthy, ethical, and fair data processing.

This paper explores the closely linked legal principles and ethical aspects that should be taken into consideration by stakeholders in the CAV landscape and provides a roadmap to be used by CAV researchers, developers, and all those who seek to create and implement technologies to carry out data processing activities within such domain in a compliant, fair and trustworthy manner. As a result of the inherent link between the legal and ethical concerns, the authors will present a holistic approach to design and development which is intended to overcome the challenges posed to European personal data protection legal principles and obligations, by involving ethics and fairness. This approach, which goes beyond minimum legal requirements and proposes the application of a multidisciplinary framework, can be defined as Data Protection as a Corporate Social Responsibility in accordance with the Maastricht methodology in this domain.14

∗ Prof. Dr. Paolo Balboni is Professor of Privacy, Cybersecurity, and IT Contract Law at the European Centre on Privacy and Cybersecurity (ECPC) within the Maastricht University Faculty of Law and Founding Partner of ICT Legal Consulting.
1 Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector.
2 The NISD, applicable to operators of essential services and digital service providers, ensures the security of network and information systems vital to economic and societal activities and to the functioning of the internal EU market. Also see Recital (1) NISD.
3 Under the GDPR there are two main roles that an organization can take on regarding an activity which involves the processing of personal data: that of controller, or that of processor. Article 4(7) GDPR defines controller as “the natural or legal person, public authority, agency or other body which, alone or jointly with others, determines the purposes and means of the processing of personal data”; where two or more controllers jointly determine the purposes and means of a given processing activity, they will be considered as “joint controllers” under Article 26 GDPR. Article 4(8) GDPR defines processor as “a natural or legal person, public authority, agency or other body which processes personal data on behalf of the controller”. Depending on the data protection role which is applicable to an organization, its obligations will change, as can be better seen in Articles 25 to 28 GDPR.
4 According to Article 5(1)(b) GDPR, personal data must be “collected for specified, explicit and legitimate purposes and not further processed in a manner that is incompatible with those purposes”.
5 The principle of data minimization, according to Article 5(1)(c) GDPR, requires that personal data be processed only to the extent to which they are “adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed”.
6 Commission Nationale Informatique & Libertes, Compliance Package: Connected vehicles and personal data. October 2017. Available at: https://www.cnil.fr/sites/default/files/atoms/files/cnil_pack_vehicules_connectes_gb.pdf.
7 European Data Protection Board, Guidelines 1/2020 on processing personal data in the context of connected vehicles and mobility related applications. 28 January 2020. Available at: https://edpb.europa.eu/sites/edpb/files/consultation/edpb_guidelines_202001_connectedvehicles.pdf.
8 See Article 7(3) GDPR.
9 The risk-based approach is promoted by the GDPR, which encourages organizations to evaluate the risks inherent in the processing activities and to then implement a framework to mitigate such risks.
10 See European Data Protection Supervisor, Connected Cars, TechDispatch, Issue 3, 20 December 2019, p. 2. Available at: https://edps.europa.eu/data-protection/our-work/publications/techdispatch/techdispatch-3-connected-cars_en; and the European Union Agency for Cybersecurity, Good Practices for Security of Smart Cars, 25 November 2019, pp. 6-7. Available at: https://www.enisa.europa.eu/publications/smart-cars.
11 European Data Protection Board, Guidelines 1/2020 on processing personal data in the context of connected vehicles and mobility related applications, Version 1.0, 28 January 2020, p. 10. Available at: https://edpb.europa.eu/sites/edpb/files/consultation/edpb_guidelines_202001_connectedvehicles.pdf.
12 Horizon 2020 Commission Expert Group to advise on specific ethical issues raised by driverless mobility (E03659). Ethics of Connected and Automated Vehicles: recommendations on road safety, privacy, fairness, explainability and responsibility. 2020. Publication Office of the European Union: Luxembourg. Available at: https://ec.europa.eu/info/sites/info/files/research_and_innovation/ethics_of_connected_and_automated_vehicles_report.pdf.
13 Articles 13 and 14 GDPR require controllers to provide clear information to data subjects when their personal data is being obtained from them, including, e.g., information on the identity and contact details of the controller, the purposes of the processing, the categories, recipients and storage of personal data.
14 The concept of Data Protection as a Corporate Social Responsibility (DPCSR) has been developed and promoted by Prof. Dr. Paolo Balboni (Maastricht University), after having launched the idea on his blog in 2017. The Maastricht University DPCSR project (Maastricht DPCSR or MU DPCSR) of the European Centre on Privacy and Cybersecurity (ECPC) at Maastricht University is a two-year multi-stakeholder research project that commenced in January 2020 and involves both Data Protection and Business Stakeholders. During the first year of the project the researchers have concretized three rules for each of the Five Principles of Sustainable Data Protection previously identified by Dr. Paolo Balboni and explored during his inaugural lecture. The second year of the project will consist of expanding to five rules per principle, for a total of 25 rules, which will form the basis of the Maastricht DPCSR Framework. The first manifesto of the project detailing the aforementioned principles and rules, “Data Protection as a Corporate Social Responsibility: From Compliance to Sustainability to Generate Both Social and Financial Value”, is available here: https://www.maastrichtuniversity.nl/ecpc/csr-project/csr-publications. The research project is being developed according to the highest academic and ethical standards in full independence. It is intended to benefit the rights and freedoms of individuals by way of the establishment of data protection practices that are socially responsible and feasible, and which shall be agreed upon and adhered to by the Stakeholders. The Maastricht DPCSR Framework aims to “trigger virtuous data protection competition between companies by creating an environment that identifies and promotes data protection as an asset which can be used to help companies to responsibly further their economic targets.” To learn more about the project, see the University’s dedicated webpage, available here: https://www.maastrichtuniversity.nl/ecpc/csr-project.

WAIEL2020, September 3, 2020, Athens, Greece
Copyright © 2020 for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).
KEYWORDS

Artificial Intelligence, Data Protection, Corporate Social Responsibility, Connected and Automated Vehicles

1 INTRODUCTION

Emerging technologies and tools based on Artificial Intelligence (AI), such as Connected and automated vehicles (CAV or CAVs), present novel regulatory and legal compliance challenges while at the same time raising important questions with respect to ethics and transparency. CAVs bring to light theoretical and practical challenges to the implementation of the multi-dimensional obligations of the General Data Protection Regulation (GDPR), the ePrivacy Directive15 (2002/58/EC, revised by 2009/136/EC) and, where applicable, the Directive for a high common level of security and information systems (NIS Directive or NISD).16 Though briefly touching on other legislation, this paper will primarily deal with requirements enshrined in the GDPR.

15 The ePrivacy Directive “sets a specific standard for all actors that wish to store or access information stored in the terminal equipment of a subscriber or user in the European Economic Area (EEA).” The majority of the provisions in the ePrivacy Directive (e.g. Articles 6 and 9) only apply “to providers of publicly available electronic communication services and providers of public communication networks, art. 5(3) ePrivacy Directive is a general provision. It does not only apply to electronic communication services but also to every entity that places on or reads information from a terminal equipment without regard to the nature of the data being stored or accessed.” See p. 5 of the European Data Protection Board Guidelines 1/2020 on processing personal data in the context of connected vehicles and mobility related applications, Version 1.0, 28 January 2020. Available at: https://edpb.europa.eu/sites/edpb/files/consultation/edpb_guidelines_202001_connectedvehicles.pdf. Also note that according to Art. 1(a) of Directive 2008/63/EC, terminal equipment is defined as “equipment directly or indirectly connected to the interface of a public telecommunications network to send, process or receive information; in either case (direct or indirect), the connection may be made by wire, optical fibre or electromagnetically; a connection is indirect if equipment is placed between the terminal and the interface of the network; (b) satellite earth station equipment”. See the Directive here: https://eur-lex.europa.eu/legal-content/EN/ALL/?uri=CELEX%3A32008L0063. Following this logic, the European Data Protection Board has determined that “the connected vehicle and every device connected to it shall be considered as a ‘terminal equipment’ (just like a computer, a smartphone or a smart TV) and provisions of art. 5(3) ePrivacy Directive must apply where relevant.” See European Data Protection Board Guidelines 1/2020, p. 5.
16 The NISD, applicable to operators of essential services and digital service providers, ensures the security of network and information systems vital to economic and societal activities and to the functioning of the internal EU market. Also see Recital (1) NISD.
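The interplay between the consent required under Art. 5(3) ePrivacy Directive for storing or accessing information on the vehicle (see footnote 15) and the separate legal basis needed for any subsequent use of that information (e.g., the pay-as-you-drive scenario discussed in the abstract) can be pictured as two distinct checks. The snippet below is a minimal, purely illustrative sketch: all helper names, purposes and data structures are hypothetical, do not reflect any particular CAV platform, and do not amount to a complete legal analysis.

```python
# Illustrative sketch only: hypothetical names, not an implementation of any specific
# CAV platform and not legal advice. It separates the two checks discussed above:
# (1) consent for storing/accessing information on the vehicle as "terminal equipment"
#     (Art. 5(3) ePrivacy Directive), and
# (2) a distinct legal basis (e.g., performance of a contract) for any subsequent use.

from dataclasses import dataclass, field
from typing import Dict, Optional, Set


@dataclass
class ConsentRecord:
    purpose: str
    granted: bool
    withdrawn: bool = False  # consent must remain freely withdrawable (Art. 7(3) GDPR)

    @property
    def valid(self) -> bool:
        return self.granted and not self.withdrawn


@dataclass
class UserLegalBases:
    consents: Dict[str, ConsentRecord] = field(default_factory=dict)
    contract_purposes: Set[str] = field(default_factory=set)


def may_access_terminal_equipment(user: UserLegalBases) -> bool:
    """Gate storage of / access to information in the CAV on specific consent."""
    record = user.consents.get("access_vehicle_storage")
    return record is not None and record.valid


def legal_basis_for_use(user: UserLegalBases, purpose: str) -> Optional[str]:
    """Identify a separate legal basis for the subsequent use of the information."""
    if purpose in user.contract_purposes:  # e.g., a pay-as-you-drive insurance contract
        return "contract"
    record = user.consents.get(purpose)
    if record is not None and record.valid:  # otherwise a second, specific consent
        return "consent"
    return None  # no lawful basis identified: do not process


# Example: telematics analysis for pay-as-you-drive billing needs BOTH checks to pass.
user = UserLegalBases(
    consents={"access_vehicle_storage": ConsentRecord("access_vehicle_storage", granted=True)},
    contract_purposes={"pay_as_you_drive_billing"},
)
if may_access_terminal_equipment(user):
    print(legal_basis_for_use(user, "pay_as_you_drive_billing"))  # -> contract
```

Keeping the two checks separate mirrors the reading, discussed above, that the consent for storage/access and the legal basis for subsequent processing must be assessed independently of one another.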
In the context of CAVs, adherence to the rules set forth in the GDPR is fundamental for conformity with the applicable legal framework,17 which establishes compliance requirements for entities that process personal data, i.e., information relating to individuals who can be either identified or identifiable.18 Consequently, when designing and developing CAVs, it is crucial to have an overview of the difficulties that arise as a result of the implementation of the GDPR, and to have an approach to adequately address them. Furthermore, the ePrivacy Directive is also directly relevant to CAVs in that it specifies standards for the storage of and access to information stored in “terminal equipment of a subscriber or user”19 in the European Economic Area, notably imposing the need to collect specific GDPR-compliant consent20 from CAV users for these activities. This particular requirement, as seen previously, may generate technical and legal difficulties for certain subsequent uses of information gathered from CAVs. The NIS Directive has instead established further obligations to ensure the security of network and information systems of essential services of a given Member State, classified as “operators of essential services” (such as telecommunications, healthcare or transportation services). The European Union Agency for Cybersecurity (ENISA) has identified intelligent transport systems (ITS) as Essential Service Operators in the road transport sub-sector,21 and it can therefore be concluded that the NISD is also applicable in the context of CAVs. At the moment, ENISA is in the process of addressing the security of smart cars in order to contribute to the existing regulatory framework. It is therefore safe to assume that, as the adoption of CAVs reaches a critical mass, specific ITS operators will be designated as operators of essential services.

As mere examples, CAV manufacturers, including vehicle and equipment manufacturers, developers, service providers and other relevant third parties (also collectively referred to as “CAV stakeholders”) currently face multiple legal hurdles to overcome, including the necessity to fulfil controller and/or processor obligations in complex data processing scenarios22 and tensions with the GDPR’s principle of purpose limitation23 (which comes at odds with the CAV’s autonomous processing of personal data through AI, which may be based on a (re)interpretation of goals, or, possibly, a shift in focus from the original goal for which personal data was collected).24 Additionally, the overall need for relatively large datasets to properly train and leverage AI functionalities leads to conflict with the principle of data minimization.25 When applied to AI systems, data protection by design and by default also presents difficulties, as data protection by default is possible only when the necessary personal data is processed for a specific purpose.26 Concerns from the data security27 perspective are also highly relevant, notably due to the lack of shared security standards in the CAV domain and the increase of the potential attack surface caused by the interconnection of different CAV components.28

17 European Data Protection Board, Guidelines 1/2020 on processing personal data in the context of connected vehicles and mobility related applications, Version 1.0, Adopted on 28 January 2020, p. 5.
18 Under Art. 4(1) GDPR, data subject is defined as “an identified or identifiable natural person”. Please see footnote 55 for the explanation provided by the GDPR of an “identifiable natural person”. Also see a recent landmark case of the Court of Justice of the European Union which clarifies that the concept of personal data is to be extended to cases where even only a third party has additional data necessary to identify the data subject (Case C-582/14, Patrick Breyer v Bundesrepublik Deutschland). Available at: http://curia.europa.eu/juris/document/document.jsf?text=&docid=184668&pageIndex=0&doclang=en&mode=lst&dir=&occ=first&part=1&cid=1116945.
19 See Article 5(3) ePrivacy Directive.
20 See Recital 17 ePrivacy Directive which, with the entering into force of the GDPR, should be read as referring to the GDPR’s requirements on consent (and not those of its predecessor, Directive 95/46/EC).
21 “In light of the NIS Directive, in which road authorities and intelligent transport systems are among the entities identified as Essential Service Operators in the road transport sub-sector, there is a growing need for addressing the security of smart cars.” See European Union Agency for Cybersecurity, ENISA Programming Document 2020-2022, November 2019, p. 32. Available at: https://www.enisa.europa.eu/publications/corporate-documents/enisa-programming-document-202020132022.
22 Under the GDPR, there are two main roles that an organization can take on regarding an activity which involves the processing of personal data: that of controller, or that of processor. Article 4(7) GDPR defines controller as “the natural or legal person, public authority, agency or other body which, alone or jointly with others, determines the purposes and means of the processing of personal data”; where two or more controllers jointly determine the purposes and means of a given processing activity, they will be considered as “joint controllers” under Article 26 GDPR. Article 4(8) GDPR defines processor as “a natural or legal person, public authority, agency or other body which processes personal data on behalf of the controller”. Depending on the data protection role which is applicable to an organization, its obligations will change, as can be better seen in Articles 25 to 28. Also see European Data Protection Board, Guidelines 07/2020 on the concepts of controller and processor in the GDPR, Version 1.0, Adopted on 2 September 2020. Available at: https://edpb.europa.eu/our-work-tools/public-consultations-art-704/2020/guidelines-072020-concepts-controller-and-processor_en.
23 According to Article 5(1)(b) GDPR, personal data must be “collected for specified, explicit and legitimate purposes and not further processed in a manner that is incompatible with those purposes”. Also see Article 29 Data Protection Working Party, Opinion 03/2013 on purpose limitation, 2 April 2013. Available at: https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2013/wp203_en.pdf.
24 European Commission, White Paper on Artificial Intelligence – A European approach to excellence and trust, February 2020, p. 17. Available at: https://ec.europa.eu/info/sites/info/files/commission-white-paper-artificial-intelligence-feb2020_en.pdf, p. 16.
25 The principle of data minimization requires that personal data is to be processed to the extent to which it is “adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed”, according to Article 5(1)(c) GDPR. See European Data Protection Board, Guidelines 1/2020 on processing personal data in the context of connected vehicles and mobility related applications, Version 1.0, 28 January 2020, p. 14. Available at: https://edpb.europa.eu/sites/edpb/files/consultation/edpb_guidelines_202001_connectedvehicles.pdf.
26 Commission Nationale Informatique & Libertes, Compliance Package: Connected vehicles and personal data. October 2017. Available at: https://www.cnil.fr/sites/default/files/atoms/files/cnil_pack_vehicules_connectes_gb.pdf.
27 The risk-based approach is promoted by the GDPR, which encourages organizations to evaluate the risks inherent in the processing activities and to then implement a framework to mitigate such risks.
28 European Union Agency for Cybersecurity, Good Practices for Security of Smart Cars, November 2019, pp. 6-7.
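To make the data minimization and purpose limitation tension described above concrete, the following minimal sketch (illustrative only; the field and purpose names are hypothetical and not taken from any real CAV data model) shows how a telematics payload could be reduced, by design, to the fields declared for a specific purpose before it leaves the vehicle, and how an undeclared purpose is refused rather than silently repurposed.

```python
# Illustrative sketch only (hypothetical field and purpose names): mapping each declared
# purpose to the minimum set of telematics fields, so that a payload is reduced to what
# is "adequate, relevant and limited to what is necessary" (Art. 5(1)(c) GDPR) before it
# leaves the vehicle.

FIELDS_PER_PURPOSE = {
    # purpose                    -> fields strictly necessary for that purpose
    "remote_diagnostics":        {"vin", "engine_fault_codes", "odometer_km"},
    "pay_as_you_drive_billing":  {"vin", "trip_distance_km", "trip_timestamps"},
    "navigation_service":        {"gps_position", "heading", "speed_kmh"},
}


def minimise(record: dict, purpose: str) -> dict:
    """Return only the fields declared as necessary for the given purpose."""
    try:
        allowed = FIELDS_PER_PURPOSE[purpose]
    except KeyError:
        # Purpose not declared in advance: refuse rather than repurpose the data
        # (purpose limitation, Art. 5(1)(b) GDPR).
        raise ValueError(f"No declared purpose: {purpose!r}")
    return {key: value for key, value in record.items() if key in allowed}


raw = {
    "vin": "WVWZZZ...",            # truncated identifier, illustrative only
    "gps_position": (50.85, 5.69),
    "speed_kmh": 47,
    "cabin_audio_sample": b"...",  # captured by an in-cabin sensor but never needed here
    "engine_fault_codes": ["P0301"],
    "odometer_km": 60421,
}

print(minimise(raw, "remote_diagnostics"))
# {'vin': 'WVWZZZ...', 'engine_fault_codes': ['P0301'], 'odometer_km': 60421}
```

The design choice worth noting is that the purpose-to-field mapping is fixed at design time, so a later analytics feature cannot simply read whatever happens to be available in the vehicle.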
At the same time, while European data protection legislation such as the GDPR provides a minimum level of legal safeguards for citizens, it may not suffice to maximize CAV benefits for users while minimizing their potential negative impact on society.29 In order to properly and comprehensively address the risks brought about by CAVs, ethics and human rights concerns must therefore take a central role in every stage of the CAV development lifecycle, embedding the notions of fairness, transparency, and security into design processes. Transparency30 is situated between the legal and ethical dimensions and is challenged by the complexity of AI systems, as well as the inherent autonomy and flexibility of automated decision-making, and is key in the development of the framework as a prerequisite for trustworthy, ethical, and fair data processing.

This paper explores the closely linked legal principles and ethical aspects that should be taken into consideration by stakeholders in the CAV landscape and provides a roadmap to be used by CAV researchers, developers, and all those who seek to create and implement technologies to process personal data within such domain in a compliant, fair and trustworthy manner. The protection of the rights, freedoms and interests of data subjects is at the heart of this discussion, though the perspective of technology service developers and providers – who carry the burden of implementing measures to ensure compliance with the existing legal framework – is also addressed.31 On the basis of the analysis laid out in this paper, we provide suggestions and recommendations in order to assist manufacturers, developers and service providers in designing and developing CAVs. In fact, the adoption of a holistic approach to data protection can assist in overcoming both the ethical and the legislative and regulatory challenges in this complex environment. This approach, which goes beyond minimum legal requirements and proposes the application of a multidisciplinary framework, can be defined as Data Protection as a Corporate Social Responsibility in accordance with the Maastricht methodology in this domain.32

2 PRIMARY LEGAL AND ETHICAL CONCERNS IN THE CAV ENVIRONMENT

The automotive industry of the 21st century has transformed traditional cars into intelligent objects of transportation.33 The technology of Connected and Automated Vehicles cannot be separated from the persons that use them, whether for their private use or as part of services of a public system of transportation. It cannot be denied that the human-machine relationship, the human to the CAV, itself presents a number of ethical paradoxes and concerns which range from safety and even the loss of human life, and liability questions, to economic, environmental, security, privacy and data protection concerns.34 Furthermore, the absence of a specific applicable legislative framework and appropriate consideration of the ethical implications of such new technologies presents a challenge both for manufacturers in the development and steering of their work and to society with respect to the benefits that can be reaped from such technologies, whether they be increased road safety, lessened environmental impact and better mobility, or the potential improvement of European economic strength, growth, and competitiveness.35 As is often the case with new technologies, which make rapid and significant progress in terms of development and adoption, policymaking and the applicable regulatory frameworks are often less advanced than the technology itself, a notion which also holds true in the area of CAVs.36 This, together with the fact that such new technologies present both significant opportunities and risks, underlines the necessity of adequate regulation.37

29 European Data Protection Supervisor, Report Towards a digital ethics – EDPS Ethics Advisory Group, 25 January 2018. Available at: https://edps.europa.eu/sites/edp/files/publication/18-01-25_eag_report_en.pdf.
30 Articles 13 and 14 GDPR require controllers to provide clear information to data subjects when their personal data is obtained from them, including, e.g., information on the identity and contact details of the controller, the purposes of the processing, the categories, recipients and storage of personal data.
31 Also see Recital 78 GDPR.
32 On the concept of Data Protection as a Corporate Social Responsibility (DPCSR) and the Maastricht University DPCSR project of the European Centre on Privacy and Cybersecurity (ECPC), see footnote 14 above.
33 European Union Agency for Cybersecurity, Good Practices for Security of Smart Cars, November 2019, p. 7.
34 German Federal Ministry of Transport and Digital Infrastructure, Ethics Commission Automated and Connected Driving. June 2017. Available at: https://www.bmvi.de/SharedDocs/EN/publications/report-ethics-commission.pdf?__blob=publicationFile. Also see European Parliamentary Research Service, Study of the Panel for the Future of Science and Technology, The impact of the General Data Protection Regulation (GDPR) on artificial intelligence. June 2020. Available at: https://www.europarl.europa.eu/RegData/etudes/STUD/2020/641530/EPRS_STU(2020)641530_EN.pdf; and ICT Legal Consulting’s contribution to nIoVe Deliverable 2.1. https://niove.eu/.
35 Government of The Netherlands, Self-driving cars, 2020. https://www.government.nl/topics/mobility-public-transport-and-road-safety/self-driving-vehicles.
36 This concept is furthered in the June 2018 report of the Task Force on Ethical Aspects of Connected and Automated Driving (Ethics Task Force) established by the 2nd High Level Structural Dialogue in Frankfurt/M. on 14 and 15 September 2017. Available at: https://www.bmvi.de/SharedDocs/EN/publications/report-ethics-task-force-automated-driving.pdf?__blob=publicationFile.
37 European Commission, White Paper on Artificial Intelligence – A European approach to excellence and trust, European Commission, February 2020, pp. 3, 17. Available at: https://ec.europa.eu/info/sites/info/files/commission-white-paper-artificial-intelligence-feb2020_en.pdf.
CAVs operate in a complex communications ecosystem whose interactions can largely be divided into vehicle-to-vehicle (V2V), vehicle-to-infrastructure (V2I), and vehicle-to-everything (V2E or V2X),38 involving actors that range from the driver, passenger and pedestrian to smart city infrastructure managers, law enforcement, and infotainment service providers.39 Connected vehicles furthermore include numerous additional characteristics and innovative technologies, and entail the constant processing of (personal) data for the improvement of driving, requiring the transmission of data relating to the car, its surroundings and the individuals inside it.40 Connected vehicles largely function by collecting various types of information, depending on their design, through built-in or external sensors (e.g., external devices, such as a smartphone). Autonomous vehicles use such sensors and AI in order to autonomously perform driving functions under varied conditions.41 The inherent nature of CAVs therefore involves the collection of massive amounts of data, for an uncertain number of purposes (which may not be disclosed from the start of the use of the CAV), and their security involves a considerably large ecosystem of devices (both internal and external to the CAV).42 For such reasons, CAVs have introduced novel regulatory and legal compliance challenges43 which remain to be fully addressed by policymakers.44

The current legislation governing the processing of personal data45 relating to individuals – those who drive or ride in CAVs – is the General Data Protection Regulation (GDPR),46 which applies to “the processing of personal data wholly or partly by automated means”,47 and the ePrivacy Directive, which creates specific consent requirements applicable to the storage of, and access to, information stored in CAVs.48 Data subjects are afforded a variety of rights under the GDPR, which also establishes important principles – this also applies to CAV manufacturers, service developers/providers and users, where such services require the use of personal data. As such, compliance with the principles of the GDPR relating to the processing of personal data, including the principles of lawfulness, fairness and transparency,49 purpose limitation,50 data minimisation, accuracy, storage limitation, the principles of integrity and confidentiality (data security) and the principle of accountability,51 is required.

2.1 Data Processing Scenarios and Data Processing Roles

An initial hurdle in the development of CAVs is the fact that the processing of personal data is often carried out by machines managed by different organisations, each of them using computational capacity provided by cloud service developers/providers, and the processing can also involve analytic software programmes supplied by the related vendors.52 This exponentially increases the number of parties involved in data processing activities and the difficulties in clearly allocating data processing roles (controller or processor) to each one. Such grey areas create both compliance and ethical complications53 with respect to accountability, where stakeholders feel that the responsibility for data protection compliance lies with another entity, and thus may feel free to process personal data in ways that they deem more convenient or beneficial, perhaps to the detriment of the individuals concerned.

Under the GDPR, when processing personal data, there are two main roles that an organization can take on: that of the data controller and that of the data processor. The data controller is the party that determines the purposes (the why) and the means (the what and how) of the processing.54 For example, service providers that process vehicle data to send the driver messages, insurance companies or vehicle manufacturers tend to take the role of a data controller, since they collect data on vehicle use for their own purposes (i.e., service provision and quality improvement).55 In this category, two or more data controllers may also jointly determine the purposes and means of the processing, and they will then be considered as joint controllers.56 The data processor is the entity which processes personal data on behalf of the controller, based on the instructions of the controller.57 Suppliers and equipment manufacturers who may process data on behalf of CAV manufacturers may be processors in this context.58

38 WSP Canada Group Limited and Ontario Centres of Excellence, Ontario CAV Ecosystem Analysis, 2019, p. 4. Available at: https://www.oce-ontario.org/docs/default-source/publications/avin-ecosystem-analysis-final-report-2019.pdf?sfvrsn=2.
39 European Data Protection Board, Guidelines 1/2020 on processing personal data in the context of connected vehicles and mobility related applications, Version 1.0, 28 January 2020, p. 3.
40 European Data Protection Supervisor, Connected Cars, TechDispatch, Issue 3, 2019, p. 1.
41 European Union Agency for Cybersecurity, Good Practices for Security of Smart Cars, November 2019, p. 13.
42 European Data Protection Board, Guidelines 1/2020 on processing personal data in the context of connected vehicles and mobility related applications, Version 1.0, 28 January 2020, p. 14. Available at: https://edpb.europa.eu/sites/edpb/files/consultation/edpb_guidelines_202001_connectedvehicles.pdf.
43 The Article 29 Data Protection Working Party in its Opinion 08/2014 on the Recent Developments on the Internet of Things (16 September 2014) has linked IoT to the notions of “pervasive” and “ubiquitous” computing, thereby “clearly [raising] new and significant personal data protection and privacy challenges”.
44 European Union Agency for Cybersecurity, Towards a framework for policy development in cybersecurity, security and privacy considerations in autonomous agents, 14 March 2019, p. 17. Available at: https://www.enisa.europa.eu/publications/considerations-in-autonomous-agents.
45 Article 4(1) of Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016, on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (available at: https://eur-lex.europa.eu/eli/reg/2016/679/oj) defines personal data as “any information relating to an identified or identifiable natural person”, further specifying that “an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person”. Article 4(2) of the same Regulation defines processing as “any operation or set of operations which is performed on personal data or on sets of personal data, whether or not by automated means, such as collection, recording, organisation, structuring, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, restriction, erasure or destruction”.
46 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016, on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC. Available at: https://eur-lex.europa.eu/eli/reg/2016/679/oj. Manufacturers, and service developers/providers may further be subjected to rules arising from the European Union’s Directive on Security of Network and Information Systems (NISD), depending on the types of services they provide. In particular, when involved in the provision of crucial services for the functioning of a given Member State, such as telecommunications, healthcare or transportation services, these developers/providers may be classified as “operators of essential services” (OESs). (See Directive (EU) 2016/1148 of the European Parliament and of the Council of 6 July 2016 concerning measures for a high common level of security of network and information systems across the Union. Available at: https://eur-lex.europa.eu/eli/dir/2016/1148/oj, and Arts. 4(4) and 5(2), as well as Annex II NISD.) These operators are subject to further obligations under the NISD, intended to promote and ensure the security of network and information systems deemed vital to economic and societal activities, and in particular to the functioning of the European Union’s internal market. (Recital (1) NISD).
47 Article 2(1) General Data Protection Regulation.
48 See Article 5(3) ePrivacy Directive and European Data Protection Board, Guidelines 1/2020 on processing personal data in the context of connected vehicles and mobility related applications, Version 1.0, 28 January 2020, Section 1.2. Available at: https://edpb.europa.eu/sites/edpb/files/consultation/edpb_guidelines_202001_connectedvehicles.pdf.
49 Article 29 Working Party, Guidelines on transparency under Regulation 2016/679, 29 November 2017, as last Revised and Adopted on 11 April 2018. Available at: https://ec.europa.eu/newsroom/article29/item-detail.cfm?item_id=622227.
50 Article 29 Working Party, Opinion 03/2013 on purpose limitation, 2 April 2013. Available at: https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2013/wp203_en.pdf.
51 Article 29 Working Party, Opinion 03/2010 on the principle of accountability, 13 July 2010. Available at: https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2010/wp173_en.pdf.
52 Article 29 Data Protection Working Party, Opinion 8/2014 on Recent Developments on the Internet of Things, 16 September 2014, p. 11. Available at: https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf. See also European Data Protection Supervisor, EDPS response to the Commission public consultation on the regulatory environment for platforms, online intermediaries, data and cloud computing and the collaborative economy, 16 December 2015, p. 4. Available at: https://edps.europa.eu/sites/edp/files/publication/15-12-16_online_platforms_en.pdf.
53 According to the Horizon 2020 Commission Expert Group, “a broader and more proactive ethical approach will also help to reveal new perspectives on the often-asked question of who is responsible for the behaviour of CAVs.” See Horizon 2020 Commission Expert Group to advise on specific ethical issues raised by driverless mobility (E03659). Ethics of Connected and Automated Vehicles: recommendations on road safety, privacy, fairness, explainability and responsibility. 2020. Publication Office of the European Union: Luxembourg, p. 20. Available at: https://ec.europa.eu/info/sites/info/files/research_and_innovation/ethics_of_connected_and_automated_vehicles_report.pdf.
54 Article 4(7) General Data Protection Regulation.
55 European Data Protection Board, Guidelines 1/2020 on processing personal data in the context of connected vehicles and mobility related applications, Version 1.0, 28 January 2020, p. 9.
56 On the concept of joint controllers, see European Data Protection Board, Guidelines 07/2020 on the concepts of controller and processor in the GDPR, Version 1.0, Adopted on 2 September 2020; also see relevant Court of Justice of the European Union Case C-210/16 (Unabhängiges Landeszentrum für Datenschutz Schleswig-Holstein v Wirtschaftsakademie Schleswig-Holstein GmbH). Available at: http://curia.europa.eu/juris/document/document.jsf?text=&docid=202543&pageIndex=0&doclang=EN&mode=lst&dir=&occ=first&part=1&cid=1206721; Case C-25/17 (Tietosuojavaltuutettu v Jehovan todistajat). Available at: http://curia.europa.eu/juris/document/document.jsf?docid=203822&doclang=EN; and Case C-40/17 (Fashion ID GmbH & Co. KG v Verbraucherzentrale NRW eV), which concerns the joint controllership relationship between Facebook and website operators that embed the Facebook “Like” button on their site. Available at: http://curia.europa.eu/juris/liste.jsf?num=C-40/17.
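One way to keep the role allocation problem described above tractable is to record, activity by activity, which party acts as controller, joint controller or processor. The sketch below is purely illustrative (hypothetical party and activity names) of such a register; the contractual frameworks discussed in the following paragraphs would need to formalise exactly this kind of mapping.

```python
# Illustrative sketch only: hypothetical parties and activities. A simple register that
# records, per CAV processing activity, which organisation acts as controller, joint
# controller or processor, reflecting that the same party can hold different roles
# depending on the activity and lifecycle stage.

from dataclasses import dataclass
from enum import Enum
from typing import Dict


class Role(Enum):
    CONTROLLER = "controller"              # determines purposes and means (Art. 4(7) GDPR)
    JOINT_CONTROLLER = "joint controller"  # purposes and means determined jointly (Art. 26 GDPR)
    PROCESSOR = "processor"                # processes on the controller's behalf (Art. 4(8) GDPR)


@dataclass
class ActivityRoles:
    activity: str
    roles: Dict[str, Role]  # party name -> role held for this specific activity


register = [
    ActivityRoles("telematics-based driver messaging",
                  {"VehicleMaker A": Role.CONTROLLER, "Cloud Provider B": Role.PROCESSOR}),
    ActivityRoles("pay-as-you-drive premium calculation",
                  {"Insurer C": Role.CONTROLLER, "VehicleMaker A": Role.PROCESSOR}),
    ActivityRoles("shared mobility usage analytics",
                  {"VehicleMaker A": Role.JOINT_CONTROLLER,
                   "Mobility Platform D": Role.JOINT_CONTROLLER}),
]


def roles_of(party: str) -> Dict[str, Role]:
    """List every activity in which the party appears, with the role it holds there."""
    return {entry.activity: entry.roles[party]
            for entry in register if party in entry.roles}


print(roles_of("VehicleMaker A"))
# e.g. {'telematics-based driver messaging': <Role.CONTROLLER: 'controller'>, ...}
```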
processes personal data on behalf of the controller, based on the                             In this respect, a level playing field for CAV-collected and shared
instructions of the controller.57 Suppliers and equipment manufac-                            data can create greater certainty between the actors and greater
turers who may process data on behalf of CAV manufacturers may                                assurances for lawful processing towards data subjects.
be processors in this context.58
   The European Commission has clarified that the principle of                                2.2      Data Protection by Design and by Default
accountability lies with the actor(s) best placed to address risks.59                         Adherence to the concept of data protection by design and by de-
Therefore, the criteria60 of who is in the best position to address                           fault65 represents a fundamental prerequisite in the design of CAV,
risks can help CAV manufacturers, service providers, and develop-                             requiring controllers from the design phase to implement tech-
ers take the appropriate data processing role in the different stages                         nical and organisational measures within products and services
of the lifecycle (i.e., developers, users and third parties will there-                       to ensure compliance with the GDPR and the protection of data
fore be responsible at different stages of the lifecycle). According                          subjects’ rights, “both at the time of the determination of the means
to Art. 28(3) GDPR, controllers must regulate their relationship                              for processing and at the time of the processing itself ” (data protec-
with processors through a contractual agreement, which is incon-                              tion by design66 ). Data protection by design and by default should
sistent with the reality that the same actors can have different data                         not simply be considered as a principle, but rather, as a means to
processing roles depending on the stage of the lifecycle.                                     achieve compliance with the specific principles stipulated in Arti-
   More conventional agreements to regulate data processing re-                               cle 5 GDPR, and generally, all duties and obligations set forth in
lationships, such as Data Processing Agreements61 and joint con-                              the GDPR. Art. 25(2) GDPR more specifically requires the imple-
trollership agreements62 may prove impractical to deal with these                             mentation of measures to make the principle of data minimisation
intricate relationships, as they may not suffice to cover all different                       effective, by only allowing the processing of personal data which
roles which each of the parties involved in CAV data processing                               is strictly necessary for the processing purposes which have been
plays. In order to address the grey area and the need to sign sev-                            identified for a given activity (data protection by default67 ). In order
eral Data Processing Agreements, stakeholders should consider                                 to apply data protection by design and by default, organizations
engaging each other through more complex contractual frame-                                 should first develop a comprehensive risk assessment, where
works (Data Management Agreements63 ), identifying the specific                               the intended activity is mapped out from the personal data per-
CAV data processing activities which they intend to perform and                               spective.68 Additionally, according to the principle of integrity and
their respective roles and obligations for each activity identified.64                        confidentiality, personal data must be processed in a manner that
the European Union Case C-210/16 (Unabhängiges Landeszentrum für Daten-                       ensures appropriate security of the personal data, including the
schutz Schleswig-Holstein v Wirtschaftsakademie Schleswig-Holstein GmbH). Avail-              protection against unauthorised or unlawful processing and against accidental
able at: http://curia.europa.eu/juris/document/document.jsf?text=&docid=202543&
pageIndex=0&doclang=EN&mode=lst&dir=&occ=first&part=1&cid=1206721; Case
C-25/17 (Tietosuojavaltuutettu v Jehovan todistajat). Available at: http://curia.europa.      platform operators and other service providers should identify themselves as data
eu/juris/document/document.jsf?docid=203822&doclang=EN; and Case C-40/17 (Fash-               controllers (or [joint controllers]) in the information they provide to users whose data
ion ID GmbH & Co. KG v Verbraucherzentrale NRW eV), which concerns the joint                  they process. They can identify their position as controllers based on the mere fact
controllership relationship between Facebook and website operators that embed the             that they are processing personal data for their own purposes. This approach ensures
Facebook “Like” button on their site. Available at: http://curia.europa.eu/juris/liste.jsf?   that businesses act responsibly and in compliance with the Directive and that liability
num=C-40/17.                                                                                  is efficiently allocated”.
57 Article 4(8) General Data Protection Regulation.                                           65 For more information on the concept of data protection by design
58 European Data Protection Board, Guidelines 1/2020 on processing personal data in the
                                                                                              and by default, please see the United Kingdom Information Commis-
context of connected vehicles and mobility related applications, Version 1.0, 28 January      sioner’s      checklist,    available     here:     https://ico.org.uk/for-organisations/
2020, p. 9.                                                                                   guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/
59 European Commission, White Paper on Artificial Intelligence – A European approach
                                                                                              accountability-and-governance/data-protection-by-design-and-default/; the Spanish
to excellence and trust, February 2020, p. 22. Available at: https://ec.europa.eu/info/       Data Protection Authority’s (AEPD) Guidelines on Data Protection by Default
sites/info/files/commission-white-paper-artificial-intelligence-feb2020_en.pdf.               – “Guía de Protección de Datos por Defecto” (October 2020), available here: https:
60 European Data Protection Board, Guidelines 07/2020 on the concepts of con-
                                                                                              //www.aepd.es/sites/default/files/2020-10/guia-proteccion-datos-por-defecto.pdf; and
troller and processor in the GDPR, Version 1.0, 2 September 2020. Avail-                      the AEPD’s Guidelines on Privacy by design - “Guía de Privacidad desde el Diseño”
able at: https://edpb.europa.eu/our-work-tools/public-consultations-art-704/2020/             (October 2019), available here: https://www.aepd.es/sites/default/files/2019-11/
guidelines-072020-concepts-controller-and-processor_en.                                       guia-privacidad-desde-diseno.pdf.
61 Agreements entered into between a controller and a processor, to regulate the pro-         66 As noted by the European Data Protection Supervisor in his Opinion
cessor’s processing of personal data on behalf of the controller, meeting the minimum         5/2018 – Preliminary Opinion on privacy by design (31 May 2018, available
requirements of Art. 28 GDPR.                                                                 at: https://edps.europa.eu/sites/edp/files/publication/18-05-31_preliminary_opinion_
62 Agreements entered into between joint controllers, to regulate their respective data       on_privacy_by_design_en_0.pdf), the obligation of data protection by design, under
protection responsibilities under the GDPR in a consistent and transparent manner,            Art. 25 GDPR, can be broken down into 4 dimensions: (1) personal data processing
meeting the minimum requirements of Art. 26 GDPR.                                             should always be the outcome of a design project; (2) a risk management approach
63 Multi-part structured agreements which include terms applicable to controller-             must be followed in the selection and implementation of measures for effective protec-
to-processor, joint controllership and independent controllership relationships and           tion; (3) measures selected must be appropriate and effective; and (4) the identified
identify the scenarios in which each relevant party will be bound by each set of terms.       measures/safeguards must be integrated into the processing activity itself.
64 This builds upon the recommendation made by the European Data Protection Su-               67 In particular, as noted by Art. 25(2) GDPR, this obligation “applies to the amount

pervisor in its EDPS response to the Commission public consultation on the regulatory         of personal data collected, the extent of their processing, the period of their storage and
environment for platforms, online intermediaries, data and cloud computing and the            their accessibility”, and measures taken to address this obligation “shall ensure that by
collaborative economy, 16 December 2015, p. 5. Available at: https://edps.europa.eu/          default personal data are not made accessible without the individual’s intervention to an
sites/edp/files/publication/15-12-16_online_platforms_en.pdf: “The most effective reg-        indefinite number of natural persons.”
                                                                                              68 On this, see European Data Protection Board, Guidelines 4/2019 on Arti-
ulatory response, in the above respect, consists of applying in a coherent way the
Data Protection Directive, which identifies the controller as ‘the natural or legal per-      cle 25 Data Protection by Design and by Default, 13 November 2019. Avail-
son, public authority, agency or any other body which alone or jointly with others            able at: https://edpb.europa.eu/sites/edpb/files/consultation/edpb_guidelines_201904_
determines the purposes and means of the processing of personal data’ and assigns             dataprotection_by_design_and_by_default.pdf. In particular, see pp. 13 et seq. which
to it the fulfilment of a number of duties designed to protect the individual’s rights        provide a checklist which organisations can use to measure their level of implementa-
to privacy and data protection. Therefore, before engaging into any data processing,          tion of each Art. 5 GDPR principle.
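As a purely illustrative aid to the risk-assessment step described above (mapping the intended activity "from the personal data perspective"), the sketch below records, for a hypothetical CAV processing activity, the data categories involved, the risks to data subjects and the design-phase measures selected against them. The activity, categories, risks and measures are invented examples, not requirements drawn from the GDPR or the cited guidelines.

```python
# Illustrative sketch only: mapping a hypothetical CAV processing activity from
# the personal data perspective, as a first data protection by design step.
from dataclasses import dataclass, field


@dataclass
class ProcessingActivityMap:
    activity: str
    purpose: str
    personal_data: list[str]
    risks_to_data_subjects: list[str]
    planned_measures: list[str] = field(default_factory=list)


activity_map = ProcessingActivityMap(
    activity="Driver drowsiness detection",
    purpose="road safety alerts",
    personal_data=["in-cabin camera frames", "steering behaviour"],
    risks_to_data_subjects=["intrusive monitoring",
                            "unauthorised access to cabin images"],
    planned_measures=["on-board processing only", "no raw image retention",
                      "encryption of any exported signals"],
)

# Measures are selected against the identified risks and embedded from the
# design phase, rather than added as an afterthought.
for risk in activity_map.risks_to_data_subjects:
    print(f"risk: {risk}")
for measure in activity_map.planned_measures:
    print(f"measure: {measure}")
```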
loss, destruction or damage, using appropriate technical or organi-                         User empowerment by design approach,74 and the monitoring of
sational measures.69                                                                        the AI system.75 Human oversight should ensure the ability for
   The European Commission has established that AI applications                             humans to intervene in real time through deactivation during op-
should be considered high-risk if the application is employed in a                          eration, for example, a stop button or a procedure when a human
sector where significant risks for individuals can be expected, in-                         determines that car operation is not safe.76 In the designing of
cluding the transportation sector.70 Thus, CAV design should start                        the CAV, operational constraints can be implemented, for example
by carrying out a Data Protection Impact Assessment (DPIA),                               for the CAV to stop operating in critical weather conditions. Fur-
which would acknowledge and assess the risks posed to data sub-                           thermore, a supervision centre should interact with the vehicle to
jects. DPIAs should be complemented with extensive Security Risk                          monitor its status, request actions and perform remote administra-
Assessments in order to identify threats and risks to IT systems                          tion tasks.77 The ethical question of connected and autonomous
and assess whether security measures in place provide an adequate                         vehicles, in fact, is also dependent on “the conditions in which they
level of protection, also taking into account that the magnitude and                      are used and the way in which they are designed,”78 solidifying the
seriousness of the security risks increase with the large attack surface.                 relationship between the legal prescription and the ethical aspects
This integrated approach can be found in the Maastricht DPCSR                               of CAVs.
principle of Data security by design,71 which calls for the imple-
mentation of a risk-based approach to data processing that aids in                          2.3      Fairness by Design
the management of data security in order to optimize economic and                           Following the logic of data protection by design and default, in the
social benefits of product deployment and use. Such an approach to                          development process of CAVs, car manufacturers and relevant tech-
data security is necessary in order for society to take full advantage                      nology providers should closely enforce the concept of “fairness
and benefit from technological advancements in transportation in                            by design” by which fairness is related to balanced and proportion-
the CAV context, by first successfully mitigating the risks posed by                        ate data processing.79 Fairness by design requires the balancing
such technologies in terms of security.72                                                     of fundamental rights and freedoms, which should be built into the
   The described regime of Data security by design leads to the obli-                       very design of connected vehicles, their components, and more
gation of developers and manufacturers to take a sound approach                              generally, all the related data processing activities in the CAV envi-
to security, based on an analysis of the risks associated with the individ-                  ronment, forming an integral part of the algorithms that underpin
uals involved, namely drivers, passengers, and pedestrians. The                             the related processing activities. In this way, Fairness by design
mitigations envisioned for such risks should be transposed into                             acts as a further specification of the concept of data protection by
documented policies and procedures, also in view of a comprehen-                            design, complementing both the legal and the ethical dimensions
sive future risk analysis that is aimed at identifying threats that may
                                                                                            74 Maastricht DPCSR Principle 1, Rule 2: Implement Ethics by design and User Empow-
pose risks to CAV systems. It essentially calls for CAVs to be safe by
                                                                                            erment by design, actively empowering individuals with respect to their data, calls for
design, taking into account “known patterns of use by CAV users,                            organizations to 1) establish a multi-stakeholder ethics and user empowerment board
including deliberate or inadvertent misuse, as well as tendencies                           led by the person(s) charged with ensuring compliance with the Maastricht DPCSR
toward inattention, fatigue and cognitive over/under-load.”73                               Framework and involving the C-suite, researchers and developers/engineers, legal and
                                                                                            marketing functions, as well as others deemed to be relevant by the organization; 2)
   Solutions for mitigating the high risks relating to CAVs should                          Establish an internal-external multi-stakeholder ethics, user empowerment, accessi-
include human oversight, the adoption of an Ethics by design and                            bility, and functionality group which, through testing procedures and protocols and
                                                                                            the inclusion of individuals outside of the organization including, e.g. users, ethicists,
                                                                                            consumer and professional associations, disability rights activists, and other relevant
                                                                                            stakeholders, can ensure that the objectives of the established procedures and protocols
69 Art.  5(1)(f) GDPR. For more information on the principle of security,                   concerning ethics and user empowerment are met. 3) Develop personalized ethics
see, e.g., the United Kingdom Information Commissioner’s Office, Security,                  and user empowerment by design policies and procedures (testing and verification
available       at:    https://ico.org.uk/for-organisations/guide-to-data-protection/       protocols/ impact assessments), including ethics and user empowerment impact assess-
guide-to-the-general-data-protection-regulation-gdpr/security/.                             ments which ensure that the objectives of the procedures and protocols with respect
70 European Commission, White Paper on Artificial Intelligence – A European approach        to ethics by design and user empowerment by design are met.
                                                                                            75 Principle 3, Rule 3 of the Maastricht DPCSR framework, calls for organizations to
to excellence and trust, February 2020, p. 17. Available at: https://ec.europa.eu/info/
sites/info/files/commission-white-paper-artificial-intelligence-feb2020_en.pdf.             “Establish trusted data processing activities (for example, for use in AI and big data
71 See Maastricht DPCSR Principle 1, Rule 1: Implement Data Security by Design. The         analytics) that actively oppose bias and discrimination. The Organization shall actively
Organization shall implement Data Security by Design into its data processing activities.   seek not only to not cause harm, but to oppose bias and discrimination”. To this end, the rule
Available at: https://www.maastrichtuniversity.nl/ecpc/csr-project/csr-publications.          focuses on establishing trusted data processing activities that actively oppose
72 In the context of CAVs, which make use of Artificial Intelligence, the security of       bias and discrimination, and requires having in place checks and balances to prevent
the algorithm is of fundamental concern, also to protect human life, where malicious        bias and discrimination in all levels of data processing activities, with specific reference
actors could take control of vehicles or slowly divert algorithms to go off course or       to AI and algorithms. It is closely related to the concept of Fairness by design (Principle
even intentionally cause accidents. According to ENISA, specific risks from attacks         1, Rule 3) and can also implicate data sharing, which is further explored in Principle
may include, “vehicle immobilization, road accidents, financial losses, disclosure of       4, Publish relevant findings based on statistical/anonymized data to improve society.
sensitive and/or personal data, and even endanger road users’ safety. Thus, appropriate     Available at: https://www.maastrichtuniversity.nl/ecpc/csr-project/csr-publications.
                                                                                            76 European Commission, White Paper on Artificial Intelligence – A European approach
security measures need to be implemented to mitigate the potential risks, especially as
these attacks threaten the security, safety and even the privacy of vehicle passengers      to excellence and trust, February 2020, p. 21. Available at: https://ec.europa.eu/info/
and all other road users, including pedestrians.” See ENISA Good practices for security     sites/info/files/commission-white-paper-artificial-intelligence-feb2020_en.pdf.
                                                                                            77 Ibid.
of Smart Cars, 25 November 2019, p. 5. Available at: https://www.enisa.europa.eu/
                                                                                            78 German Federal Ministry of Transport and Digital Infrastructure, Ethics Commission
publications/smart-cars.
73 Horizon 2020 Commission Expert Group to advise on specific ethical issues                Automated and Connected Driving. June 2017, p. 6. Available at: https://www.bmvi.de/
raised by driverless mobility (E03659). Ethics of Connected and Automated Vehi-             SharedDocs/EN/publications/report-ethics-commission.pdf?__blob=publicationFile.
cles: recommendations on road safety, privacy, fairness, explainability and responsi-       79 See Paolo Balboni, “The Automated Vehicle Consortium and Fairness by De-
bility. 2020. Publication Office of the European Union: Luxembourg, p. 8. Avail-            sign”, May 2019. Available at: https://www.paolobalboni.eu/index.php/2019/05/08/
able at: https://ec.europa.eu/info/sites/info/files/research_and_innovation/ethics_of_      the-automated-vehicle-safety-consortium-and-fairness-by-design/. Also see ICT Le-
connected_and_automated_vehicles_report.pdf.                                                gal Consulting’s contribution to nIoVe Deliverable 2.1.
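To make the human-oversight and operational-constraint measures described in this sub-section more concrete, the following sketch (with invented thresholds and field names) checks a stop-button override and a critical-weather constraint before permitting continued automated operation, and exposes a minimal status payload that a supervision centre could poll. It is an illustration of the idea only, not an implementation drawn from any of the cited documents.

```python
# Illustrative sketch only: hypothetical operational-constraint and human-override
# checks (stop button, refusing to operate in critical weather, reporting status
# to a supervision centre). Thresholds and names are invented.
from dataclasses import dataclass


@dataclass
class VehicleStatus:
    visibility_m: float        # estimated visibility in metres
    wind_speed_kmh: float
    human_stop_requested: bool


def may_continue_automated_operation(status: VehicleStatus) -> bool:
    """Return False whenever a human override or a critical-weather constraint applies."""
    if status.human_stop_requested:        # real-time human intervention ("stop button")
        return False
    if status.visibility_m < 50 or status.wind_speed_kmh > 90:  # invented thresholds
        return False                       # operational constraint: critical weather
    return True


def report_to_supervision_centre(status: VehicleStatus) -> dict:
    """Minimal status payload a supervision centre might poll to monitor the vehicle."""
    return {
        "automated_operation_permitted": may_continue_automated_operation(status),
        "visibility_m": status.visibility_m,
        "wind_speed_kmh": status.wind_speed_kmh,
    }


print(report_to_supervision_centre(
    VehicleStatus(visibility_m=30.0, wind_speed_kmh=40.0, human_stop_requested=False)))
```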
of privacy and personal data protection for the development of a                                 requirements of Fairness by design as they are established in the
healthy and democratic digitalized society.                                                      Maastricht DPCSR framework, stakeholders in the CAV environ-
   In line with this principle, developers and manufacturers should                              ment can actively seek to embed the seeds of fairness directly into
take into account the interests and reasonable expectations of pri-                              the design of connected vehicles.
vacy of data subjects from the design phase of the CAV. The pro-
cessing of personal data in CAVs should not unreasonably intrude                                 2.4      Principle of Purpose Limitation
on the privacy, autonomy and integrity of data subjects nor pres-                                must be collected for specified, explicit and legitimate purposes,
sure data subjects to provide their personal data, collecting only                               must be collected for a specified, explicit and legitimate purposes,
what is strictly necessary for the operation of the vehicle. Fairness                            without further processing in a manner that is incompatible with
by Design80 leverages the highly relevant81 principle of fairness                                those purposes.85 As noted by the Article 29 Data Protection Work-
embedded in Article 5(1)(a) GDPR, aiming to both regulate and to                                 ing Party, “any processing following collection, whether for the pur-
prevent any harm which may arise as a consequence of algorithmic                                 poses initially specified or for any additional purposes, must be con-
processing of the CAV.                                                                           sidered ’further processing’ and must thus meet the requirement of
   On the part of manufacturers and developers, the implementa-                                  compatibility.”86 This notion of compatibility includes a criteria
tion of Fairness by design can be realized by way of integration of                              to be assessed by a controller in order to establish if a further
five recommendations into the CAV lifecycle. These include car-                                  processing purpose is compatible with the initial purpose for data
rying out: 1) Human rights impact assessments;82 2) drawing red                                  collection:87 1) whether there is any link between these purposes; 2)
lines for certain types of processing that due to their nature repre-                            the context in which the personal data was collected; 3) the nature
sent too high of a threat to human rights and risk posing a severe                               of the personal data in question; 4) the possible consequences of the
and irreversible impact on fundamental rights and societal welfare                               intended further processing for data subjects; and 5) the existence
in general; 3) providing for reinforced transparency and the right                               of appropriate safeguards, such as encryption or pseudonymisa-
to algorithmic explanation;83 4) the provision of effective redress                              tion.88 Based on a factual assessment of the initial purpose of data
mechanisms (procedural fairness and algorithmic due process);84                                  collection and the intended further purpose, controllers can theo-
and 5) ensuring independent oversight. By implementing the above                                 retically arrive at a conclusion as to whether the further purpose
80 The Maastricht Data Protection as a Corporate Social Responsibility Working Group             is compatible with the initial one89 – and therefore that it does
has established that Fairness by Design embodies Principle 1, Rules 1 (Data Security             not require an additional, specific legal basis to be identified for it
by Design) and 2 (Ethics by design and User Empowerment by design) and integrates                – or is instead incompatible,90 and must be supported by its own
the legal dimension of the GDPR. ICTLC Senior Associate Davide Baldini has actively
contributed to the definition of Principle 1, Rule 3, Fairness by design and its relative five   specific legal basis. As a result, organizations have an obligation to
requirements as they are described above. Together these three rules form the initial            map the purposes for which they collect personal data, and avoid
triad on which the Maastricht DPCSR Framework is built. Also see Paolo Balboni and
Kate Francis, “Data Protection as a Corporate Social Responsibility: From Compliance
                                                                                                 reuse, combination or repurposing of those data for incompatible
to Sustainability to Generate Both Social and Financial Value.” European Centre on               purposes.91
Privacy and Cybersecurity (ECPC), Maastricht University Faculty of Law website.
27 October 2020. Available at: https://www.maastrichtuniversity.nl/ecpc/csr-project/             to the algorithmic decision should be presented with clear indications on how to make
csr-publications.                                                                                use of the designated redress mechanism made available by the organization.
81 Art. 8(2) of the Charter of Fundamental Rights of the European Union provides                 85 Art. 5(1)(b) GDPR. For more information on the principle of purpose limita-

that “Such data must be processed fairly for specified purposes and on the basis of              tion, see, e.g., the United Kingdom Information Commissioner’s Office, Prin-
the consent of the person concerned or some other legitimate basis laid down by law.             ciple (b): Purpose limitation, available at: https://ico.org.uk/for-organisations/
Everyone has the right of access to data which has been collected concerning him or              guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/
her, and the right to have it rectified.”                                                        principles/purpose-limitation/.
82 Organizations which make use of new technologies such as algorithms for process-              86 Article 29 Data Protection Working Party, Opinion 03/2013 on purpose limitation, 2

ing personal data should be able to demonstrate that the data processing they are                April 2013, p. 21. Available at: https://ec.europa.eu/justice/article-29/documentation/
undertaking does not violate the fundamental rights or legitimate interests and expec-           opinion-recommendation/files/2013/wp203_en.pdf.
                                                                                                 87 Note that Art. 6(4) GDPR generally allows further processing to take place, even in
tations of data subjects. Where this may be the case, it should be established that any
identified impact is adequately offset by an advancement in other rights and interests,          the absence of compatibility with the original processing purposes, where consent is
where the “essence” of all fundamental rights involved is respected (see art. 52 par. 1 of       relied on as a legal basis for the further processing, or where the further processing is
the Charter). The Council of Europe recommends carrying out Human rights impact                  authorised by Union or Member State law.
                                                                                                 88 Please note that the Article 29 Data Protection Working Party Opinion 03/2013 on
assessments (HRIAs) and they also form part of the European Commission legislative
proposal for AI. Fundamental Rights Impact Assessments (FRIAs), on the other hand,               purpose limitation refers to four steps, while the GDPR includes five steps. Nevertheless,
have been suggested in the AI context by the European Commission’s High-Level Ex-                substantially the steps are not changed.
pert Group on AI. In such assessments, the risks and potential impact on human rights            89 For example, by applying the compatibility test factors within the Data Protection
implicit in the use of the AI are identified alongside the relevant mitigatory measures          Directive – Directive 95/46/EC – which are similar to those within Article 6(4) GDPR,
taken by the organization, which have to be documented and updated throughout the                the Article 29 Data Protection Working Party presented a scenario where a car manu-
duration of the processing according to the principle of accountability.                         facturer’s further use of public vehicle registry data to notify car owners of malfunction
83 Algorithmic transparency, meaning the possibility for an individual to understand             and recall affected cars as compatible with the initial purpose for which those vehicle
how and why a decision affecting them was made by an algorithm, is an essential                  registry data were collected. See Example 11 of Annex 4 of Article 29 Data Protec-
prerequisite to guarantee fairness in the related data processing activity. Individuals          tion Working Party, Opinion 03/2013 on purpose limitation, 2 April 2013. Available
who have been adversely affected by an automated process have the right to understand            at: https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/
when a decision that impacts them was taken, also in order for them to challenge the             files/2013/wp203_en.pdf.
decision. Article 13(2)(f) and 14(2)(g) GDPR explicitly mandates for data controllers            90 As noted by the Article 29 Data Protection Working Party in their Opinion 03/2013
who implement automatic decision-making systems referred to in Article 22 GDPR                   on purpose limitation, further processing of personal data for tracking and profiling for
to provide “meaningful information about the logic involved, as well as the significance         marketing purposes can usually only be considered as compatible if there is a lawful
and the envisaged consequences of such processing for the data subject”.                         basis for the processing such as genuine, unambiguous, freely given and informed
84 Players in the CAV environment should provide for easily accessible and transparent           consent (see Example 10 of Annex 4).
solutions to individuals for them to challenge the algorithmic decision. The functioning         91 Purpose limitation and minimisation should “be interpreted in such a way that they
of the redress mechanism should be disclosed in advance, e.g., by means of the privacy           do not exclude the use of personal data for machine learning purposes. They should
policy or within an ad hoc information notice, and individuals who have been subject             not preclude the creation of training sets and the construction of algorithmic models,
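The five compatibility factors listed in sub-section 2.4 lend themselves to being documented as a structured assessment. The sketch below (with hypothetical field names and an invented example loosely echoing the vehicle-recall scenario in footnote 89) only shows how such an assessment could be recorded for accountability purposes; the compatibility conclusion itself remains a legal judgement for the controller.

```python
# Illustrative sketch only: recording the five compatibility factors as a
# documented assessment that can be retained to demonstrate accountability.
from dataclasses import dataclass, asdict


@dataclass
class CompatibilityAssessment:
    initial_purpose: str
    further_purpose: str
    link_between_purposes: str           # factor 1
    context_of_collection: str           # factor 2
    nature_of_data: str                  # factor 3
    consequences_for_data_subjects: str  # factor 4
    safeguards: str                      # factor 5 (e.g. encryption, pseudonymisation)
    conclusion_compatible: bool          # outcome of the controller's assessment


# Hypothetical example; the content is invented for illustration.
assessment = CompatibilityAssessment(
    initial_purpose="vehicle registration",
    further_purpose="notify owners of a safety recall",
    link_between_purposes="same vehicles, safety-related follow-up",
    context_of_collection="mandatory public registry",
    nature_of_data="contact details, vehicle identifiers (not special categories)",
    consequences_for_data_subjects="beneficial: safety notification only",
    safeguards="access restricted to the recall campaign, data deleted afterwards",
    conclusion_compatible=True,
)
print(asdict(assessment))
```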
   While the compatibility test under Article 5(1)(b) and 6(4) GDPR                          communications enhance the CAV stakeholders’ possibility to col-
appears, in theory, to potentially expand the possibilities under                            lect and process personal data. It therefore becomes a tempting
which collected CAV data may be re-used for subsequent purposes,                             ecosystem for exploiting all plausible data collected by the CAV,
there is a potential conflict between this test and the ePrivacy Di-                         at the risk of violating the rights and freedoms of data subjects.
rective’s specific requirement for consent to be obtained from CAV                           Further, the needs for relatively large datasets to properly train
users for the storage of, and access to, CAV information. Given                              Further, the need for relatively large datasets to properly train
this strict consent requirement, and the fact that GDPR-compliant                            interpreted as a violation of the principle of data minimisation.
consent must be specific (i.e., referring to a specific, explicit and legit-                 However, there are numerous solutions that can be implemented
imate processing purpose, to the exclusion of other purposes),92 the                         in order to ensure that data minimisation is complied with.
EDPB has interpreted the GDPR’s compatibility test as largely in-                                The starting point of vehicle and equipment manufacturers, ser-
applicable to CAV data processing, with CAV stakeholders needing                             vice providers and developers must be to have a clear overview of
to seek additional consent (or otherwise, to identify an applicable                          the categories of data they need from a CAV by utilising the two
legal obligation) to support any further processing of CAV data for                          following criteria: 1) it should be relevant for the intended specific
subsequent purposes.93                                                                       processing, 2) is it necessary for the intended specific processing.97
   This obligation comes at odds with the CAV’s autonomous pro-                              Although this is a subjective test, all CAV stakeholders should aim
cessing of personal data through AI, which may be based on a                                 to carry out this assessment prior to the collection of personal data
(re)interpretation of goals, or, possibly, a shift in focus from the                         and be in the position of demonstrating that they have done so
original goal for which the personal data was collected. Even though                         – specific obligations to perform data minimisation assessments
this can pose several barriers to the use of personal data for other                          (either specifically, or as part of wider data protection impact as-
purposes, the flexible application of “compatibility” allows for the                         sessments), and to properly document and make those assessments
reuse of personal data, when it is not incompatible with the original                        available, can be assigned to the relevant parties in the context of
purpose. Additionally, the CAV stakeholders can reuse the personal                           the Data Management Agreements which may be entered between
data for statistical purposes, unless it involves unacceptable risks                         them to regulate CAV data processing (as noted above). For exam-
for the data subject.94                                                                      ple, collecting location data is invasive towards a data subject, since
                                                                                             it can reveal essential and sensitive information98 relating to them,
2.5      Principle of Data Minimisation                                                      such as information relating to their personal preferences, travel
The principle of data minimisation requires that personal data is                            habits and relationships with others. On the one hand, manufactur-
processed in an adequate and relevant way, limited to what is                                ers, developers and service providers could differentiate between
necessary in relation to the purposes for which it is processed.95                            the CAV.100 This could ensure that the personal data collected has a
This principle requires processing only the personal data which                               97 European Data Protection Board, Guidelines 1/2020 on processing personal data in the
is strictly necessary for the purposes of the processing. The CAV
                                                                                             97 European Data Protection Board, Guidelines 1/2020 on processing personal data in the
collects a great deal of information as a result of the functions it
                                                                                             context of connected vehicles and mobility related applications, Version 1.0, 28 January
offers, such as infotainment systems (e.g., seat entertainment), or                          2020, p. 14.
through the telematics ecosystem (e.g., Global Navigation Satellite                          98 “Sensitive data or data of a highly personal nature: this includes special categories

Systems data), and can exchange that data with any other entity                              of personal data as defined in Article 9 (for example information about individu-
                                                                                             als’ political opinions), as well as personal data relating to criminal convictions or
(V2X communication), such as traffic signals, smart homes, or other                          offences as defined in Article 10. An example would be a general hospital keeping
vehicles.96 The number of sensors, connected devices and network                             patients’ medical records or a private investigator keeping offenders’ details. Be-
                                                                                             yond these provisions of the GDPR, some categories of data can be considered as
                                                                                             increasing the possible risk to the rights and freedoms of individuals. These personal
whenever the resulting AI systems are socially beneficial and compliant with data            data are considered as sensitive (as this term is commonly understood) because they
protection rights.” Furthermore, the use of data in training sets for algorithmic models     are linked to household and private activities (such as electronic communications
should not be precluded when their inclusion is compliant with data protection rights        whose confidentiality should be protected), or because they impact the exercise of
and is socially beneficial. See European Parliamentary Research Service, The impact of       a fundamental right (such as location data whose collection questions the freedom
the General Data Protection Regulation (GDPR) on artificial intelligence, June 2020, p. IV   of movement) or because their violation clearly involves serious impacts in the data
and p. 46. Available at: https://www.europarl.europa.eu/RegData/etudes/STUD/2020/            subject’s daily life (such as financial data that might be used for payment fraud). In
641530/EPRS_STU(2020)641530_EN.pdf.                                                          this regard, whether the data has already been made publicly available by the data
92 European Data Protection Board, Guidelines 05/2020 on consent under Regulation
                                                                                             subject or by third parties may be relevant. The fact that personal data is publicly
2016/679, Version 1.1, 4 May 2020, Section 3.2. Available at: https://edpb.europa.eu/        available may be considered as a factor in the assessment if the data was expected to
sites/edpb/files/files/file1/edpb_guidelines_202005_consent_en.pdf.                          be further used for certain purposes”, as explained in the Article 29 Working Party’s
93 See European Data Protection Board, Guidelines 1/2020 on processing personal data
                                                                                             Guidelines on Data Protection Impact Assessment (DPIA) and determining whether pro-
in the context of connected vehicles and mobility related applications, Section 1.5.3.       cessing is “likely to result in a high risk” for the purposes of Regulation 2016/679, 4
94 European Parliamentary Research Service, The impact of the General Data Protection        April 2017, As last Revised and Adopted on 4 October 2017, pp. 9-10. Available at:
Regulation (GDPR) on artificial intelligence, Study of the Panel for the Future of Sci-      https://ec.europa.eu/newsroom/article29/item-detail.cfm?item_id=611236.
ence and Technology, June 2020, p. 45. Available at: https://www.europarl.europa.eu/         99 For example, training data sets could include synthetic data – meaning a simulated
RegData/etudes/STUD/2020/641530/EPRS_STU(2020)641530_EN.pdf.                                 environment replicating the relevant features of the real world. Therefore, synthetic
95 See Article 5(1)(c) General Data Protection Regulation. Furthermore, note that this       data allows for CAV manufacturers and service providers to simulate scenarios that
principle necessitates the processing of only the personal data which is strictly neces-     are difficult to observe or replicate in real-life. This approach does not only serve for
sary for the purposes of the processing. For more information on the principle of data       the safety of the CAV, but also for the utmost protection of data subjects’
minimisation, please see the European Data Protection Board, Guidelines 4/2019 on            rights and freedoms, as explained by the Organisation for Economic Co-operation and
Article 25 Data Protection by Design and by Default, 13 November 2019, p. 19. Avail-         Development (OECD), Artificial Intelligence in Society, 11 June 2019, OECD Publishing:
able at: https://edpb.europa.eu/our-work-tools/public-consultations-art-704/2019/            Paris, p. 98. Available at: https://doi.org/10.1787/eedfee77-en.
guidelines-42019-article-25-data-protection-design_en.                                       100 European Commission, White Paper on Artificial Intelligence – A European approach
96 European Union Agency for Cybersecurity, Good Practices for Security of Smart Cars,       to excellence and trust, February 2020, p. 19. Available at: https://ec.europa.eu/info/
November 2019, p. 13.                                                                        sites/info/files/commission-white-paper-artificial-intelligence-feb2020_en.pdf.
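As a rough illustration of the two-criteria minimisation test and the training/deployment separation discussed in this sub-section, the sketch below screens invented candidate data categories for relevance and necessity before collection, and tags each retained category with the lifecycle stage for which it is used. The categories and flags are hypothetical examples only.

```python
# Illustrative sketch only: screening candidate CAV data categories against the
# two minimisation criteria (relevant? necessary?) before collection, keeping
# training and deployment data apart. Examples are invented.
from dataclasses import dataclass


@dataclass
class DataCategory:
    name: str
    relevant_for_purpose: bool   # criterion 1: relevant for the intended specific processing
    necessary_for_purpose: bool  # criterion 2: necessary for the intended specific processing
    stage: str                   # "training" or "deployment"


candidates = [
    DataCategory("GNSS location trace", relevant_for_purpose=True,
                 necessary_for_purpose=True, stage="deployment"),
    DataCategory("Seat infotainment viewing history", relevant_for_purpose=False,
                 necessary_for_purpose=False, stage="deployment"),
    DataCategory("Synthetic driving scenarios", relevant_for_purpose=True,
                 necessary_for_purpose=True, stage="training"),
]

# Only categories passing both criteria should be collected; the rest are documented
# as excluded, which supports demonstrating that the assessment was carried out.
collected = [c for c in candidates if c.relevant_for_purpose and c.necessary_for_purpose]
excluded = [c for c in candidates if c not in collected]

for c in collected:
    print(f"collect ({c.stage}): {c.name}")
for c in excluded:
    print(f"excluded after minimisation assessment: {c.name}")
```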
specific and legitimate purpose, minimising its use in other parts                         “the transparency of the technology and services in how that tech-
of the CAV’s lifecycle. Additionally, limiting the use of the data in                          nology handles data, as well as providing choice for the user”.106
the deployment phase could help mitigate potential harms to data                               Individuals should be able to clearly understand the purposes and
subjects. On the other hand, according to recent research, synthetic                         expected level of accuracy with respect to the envisioned purpose
data can be used for the training of the CAVs’ models, if                                    and the conditions under which they can function as intended.108
it is available and affordable.101 Synthetic input data would be able                        This is particularly complex in a situation where it “must always
to train the AI system on complex situations and ensure safer                                be possible to reduce the AI system’s computations to a form com-
and more accurate CAVs.102                                                                     be possible to reduce the AI system’s computations to a form com-
                                                                                               prehensible by humans”109 and advanced technologies “should be
2.6         Transparency                                                                       equipped with a ‘black box’ which records data on every transaction
Transparency103 is closely related to the concept of fairness and                              carried out by the machine, including the logic that contributed to
the principle of accountability under the GDPR, where Article 5(2)                             its decisions.”110
GDPR104 requires that the controller must always be able to demon-                                 A key element of transparency is found in Article 12 GDPR
strate that personal data are processed in a transparent manner in                             on Transparent information,111 communication and modalities for
relation to the data subject. Transparency is furthermore a funda-                             the exercise of the rights of the data subject. Article 12(1), in fact
mental enabler of user-centric processing105 because it allows the                             requires that information relating to the processing should be pro-
data subject to understand and potentially challenge data process-                             vided to the “data subject in a concise, transparent, intelligible and
ing that involves them. Without transparency and awareness of                                  easily accessible form, using clear and plain language”.112 This is
data processing activities data subjects cannot be in control of their                         no easy feat, however, as the actual audience may be different than
data and exercise their rights. Being transparent involves the qual-                           the intended audience of the processing and adjustments may be
ity, accessibility and comprehensibility of the information provided.                          necessary, especially over time, in situations where, e.g., the driver
The information furthermore must be explainable. Transparency                                  may not be the owner or regular user of the CAV. One way through
also acts as a promoter of trust in processes, which is required in                            which this may be mitigated is the use of standardized icons113
order to ensure product uptake.
   The principle of transparency is of particular relevance in the                             106 Baldini et al. Ethical Design in the Internet of Things, p. 905. Sci Eng Ethics (2018)

CAV environment insofar as the rationale behind automated de-                                  24:905–925. https://link.springer.com/content/pdf/10.1007/s11948-016-9754-5.pdf.
                                                                                               107 European Parliamentary Research Service,          The impact of the General Data
cisions may significantly affect individuals and could even lead                               Protection Regulation (GDPR) on artificial intelligence, June 2020. Available
to life and death scenarios. In fact, ethical design is dependent on                           at:      https://www.europarl.europa.eu/RegData/etudes/STUD/2020/641530/EPRS_
                                                                                               STU(2020)641530_EN.pdf.
101 Ibid.                                                                                      108 European Commission, White Paper on Artificial Intelligence – A European approach
102 Organisation for Economic Co-operation and Development, Artificial Intelligence in         to excellence and trust, February 2020. Available at: https://ec.europa.eu/info/sites/info/
Society, 11 June 2019, OECD Publishing, Paris, p. 98. Available at: https://doi.org/10.        files/commission-white-paper-artificial-intelligence-feb2020_en.pdf.
                                                                                               109 See Report with recommendations to the Commission on Civil Law Rules on Robot-
1787/eedfee77-en.
103 See Article 29 Working Party, Guidelines on transparency under Regulation 2016/679,        ics (2015/2103(INL)), A8-0005/2017, 27.1.2017, p. 10. Available at: http://www.europarl.
29 November 2017, as last Revised and Adopted on 11 April 2018. Available at: https:           europa.eu/doceo/document/A-8-2017-0005_EN.pdf.
                                                                                               110 Ibid. Also see ICT Legal Consulting’s contribution to nIoVe deliverable 2.1, section
//ec.europa.eu/newsroom/article29/item-detail.cfm?item_id=622227.
     Furthermore, note that the principle of transparency permeates throughout the             4.3 on the Legal & Ethical Compliance of Defence Concepts. For more information
diverse rules of the Maastricht DPCSR Framework. These specifically include the                about the project, see the nIoVe website. Available at: https://niove.eu/.
rules which fall under Principle 1: Embed data protection and data security in the             111 Note that the Article 29 Working Party in its Guidelines on Transparent information to
design of processes, under which the rules of data security, ethics by design and user         this end suggest making use of user panels to test the “intelligibility of the information
empowerment by design, and fairness by design are situated; and Principle 2: Be                and effectiveness of user interfaces/ notices/ policies etc.” (See Article 29 Working Party,
transparent with citizens about the collection of their data, which suggests using icons       Guidelines on transparency under Regulation 2016/679, p. 8). Further to this requirement,
to signal that data processing activities are taking place. See Paolo Balboni and Kate         the Article 29 Working Party notes the importance of the data subject being “able
Francis, “Data Protection as a Corporate Social Responsibility: From Compliance                to determine in advance what the scope and consequences of the processing entails”
to Sustainability to Generate Both Social and Financial Value.” European Centre on             which fundamentally means explaining the potential actual effects on the rights and
Privacy and Cybersecurity (ECPC), Maastricht University Faculty of Law website.                freedoms of data subject, not limited to best case scenarios, but instead those which
27 October 2020. Available at: https://www.maastrichtuniversity.nl/ecpc/csr-project/           actually may severely affect individuals.
csr-publications.                                                                              112 The requirement of intelligible information “means that it should be understood by
104 Recital 39 GDPR states, “. . . that (1) any information and communication relating to      an average member of the intended audience” (see Article 29 Working Party, Guidelines
the processing of those personal data be easily accessible and easy to understand, and that    on transparency under Regulation 2016/679, p. 7) and therefore requires the organization
clear and plain language be used. That principle concerns, in particular, information to       to “identify the intended audience and ascertain the average member’s level of under-
the data subjects on the identity of the controller and the purposes of the processing and     standing.” (see Article 29 Working Party, Guidelines on transparency under Regulation
further information to ensure fair and transparent processing in respect of the natural        2016/679, p. 8).
persons concerned and their right to obtain confirmation and communication of personal         113 The European Data Protection Board’s Guidelines on CAVs suggest “informing
data concerning them which are being processed. (2) Natural persons should be made             the user that geolocation has been activated, in particular by using icons (e.g., an
aware of risks, rules, safeguards and rights in relation to the processing of personal data    arrow that moves across the screen)”, p. 13. Also see the CNIL’s Compliance Package
and how to exercise their rights in relation to such processing. In particular, the specific   on connected vehicles and personal data, which “recommends that the data subjects
purposes for which personal data are processed should be explicit and legitimate and           be informed by: concise and easily-understandable clauses in the contract of sale
determined at the time of the collection of the personal data. The personal data should be     of the vehicle and / or in the contract for the provision of services; and by using
(3) adequate, relevant and limited to what is necessary for the purposes for which they        distinct documents (e.g., the vehicle’s maintenance record book or manual) or the
are processed. This requires, in particular, ensuring that the period for which the personal   onboard computer; and using standardised icons in vehicles. The Commission strongly
data are stored is limited to a strict minimum.”                                               encourages the implementation of those icons to inform the data subjects in a clear,
105 See Principle 1, Rule 2 Maastricht DPCSR, Implement Ethics by design and User              summarised, and easily-understandable manner of the processing of their data. In
Empowerment by design, actively empowering individuals with respect to their data,             addition, the Commission emphasises the importance of standardising those icons, so
which calls for designers and producers of technologies and services to go beyond              that the user finds the same symbols regardless of the make or model of the vehicle.”
the requirements of user-centric design in order to actively empower individuals with          Available at: https://www.cnil.fr/sites/default/files/atoms/files/cnil_pack_vehicules_
respect to their data.                                                                         connectes_gb.pdf.
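Purely as an illustration of the standardized-icon approach discussed here, the sketch below describes in-vehicle transparency signals in a machine-readable form, with notice wording adjustable to the audience on board. The icon identifiers, sound names and notice texts are invented and do not correspond to any standardized icon set.

```python
# Illustrative sketch only: a machine-readable description of in-vehicle
# transparency signals (icon, sound, audience-specific notice). All identifiers
# and texts are invented examples.
import json

transparency_signals = {
    "geolocation": {
        "icon": "moving-arrow",   # e.g. an arrow moving across the screen
        "sound": "short-chime",
        "notice": {
            "driver": "Location is being processed for navigation. See the vehicle "
                      "manual for purposes, retention periods and your rights.",
            "passenger": "This vehicle processes location data while driving.",
            "child": "The car knows where it is so it can find the way.",
        },
    }
}

# A display layer could render the icon whenever the processing is active and pick
# the notice wording that matches the audience currently detected in the vehicle.
print(json.dumps(transparency_signals["geolocation"], indent=2))
```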
and sounds in CAVs pursuant to Maastricht DPCSR Principle 2, Be                           prevent harm as a result of the algorithmic processing of the CAV,
transparent with citizens about the collection of their data, Rule 1:                     embedding fairness into the CAV itself (see sub-section 2.3).
Before processing. The organization shall use icons (and sounds) for an                      Conflicts with the principle of purpose limitation may be bal-
easily visible, intelligible and clearly legible provision of information                 anced by applying compatibility tests when personal data needs to
concerning the intended processing. Electronically presented icons                        be used for other purposes, allowing for the reuse of personal data
should be machine-readable.114                                                            when the further purpose is compatible with the original purpose
                                                                                          (see sub-section 2.4). Hand-in-hand with the principle of purpose
3    CONCLUSION: DATA PROTECTION AS A                                                     limitation is the challenge posed to the principle of data minimiza-
     CORPORATE SOCIAL RESPONSIBILITY                                                      tion; whereby CAV manufacturers, developers and service providers
                                                                                          are faced with the possibility of processing massive amounts of
As reflected in the above sections, the road towards compliance and ethical deployment requires CAV manufacturers, developers and service providers to come up with ingenious and novel solutions when designing and developing CAVs. Consideration of the GDPR-based obligations and the ePrivacy Directive requirements on consent and their application to CAVs unveils several unresolved issues, as seen in conflicts between restrictive legal principles, rules and requirements, on the one hand, and innovation on the other. As a result, organisations may struggle to fully meet their legal obligations. This paper has therefore sought to help address this problem by identifying several areas that are to be considered as potential priorities in designing and developing CAVs.
Concerning the GDPR predefined data processing roles, it is recommended to mirror the complex data processing relationships through multi-part Data Management Agreements which should aim to identify and regulate the variety of activities and relationships that exist between CAV stakeholders (see sub-section 2.1). By properly configuring a Data Management Agreement, stakeholders can 1) map out the different types of CAV data processing activities which they are to perform, 2) identify the role or roles – independent controller, processor or joint controller – which apply to them in relation to each processing activity, and 3) set out the obligations to which each of them is bound as a result of the roles identified for each specific activity. This greatly reduces the risk of “grey areas” or undefined loopholes and ensures greater comprehensiveness and clarity of regulation of these complex processing relationships, for the benefit of CAV stakeholders and the data subjects concerned.
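To make the three steps above concrete, the following minimal sketch (with hypothetical activity names, stakeholders and obligation lists, not drawn from any actual Data Management Agreement) shows how a register of CAV processing activities could record, for each activity, the stakeholders involved, the GDPR role assigned to each, and the obligations attached to that role.

from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class RoleAssignment:
    """GDPR role of one stakeholder for one processing activity."""
    stakeholder: str                       # e.g. vehicle manufacturer, service provider
    role: str                              # "controller", "joint controller" or "processor"
    obligations: List[str] = field(default_factory=list)

@dataclass
class ProcessingActivity:
    """One entry of a hypothetical CAV processing-activity register."""
    name: str
    purpose: str
    assignments: List[RoleAssignment] = field(default_factory=list)

# Illustrative register reflecting steps 1)-3): map activities, identify roles,
# and attach the obligations that follow from each role.
register: Dict[str, ProcessingActivity] = {
    "infotainment-personalisation": ProcessingActivity(
        name="Infotainment personalisation",
        purpose="Tailor media suggestions to the driver profile",
        assignments=[
            RoleAssignment("vehicle manufacturer", "joint controller",
                           ["transparency towards data subjects",
                            "joint-controllership arrangement (Art. 26 GDPR)"]),
            RoleAssignment("streaming service provider", "joint controller",
                           ["transparency towards data subjects",
                            "joint-controllership arrangement (Art. 26 GDPR)"]),
            RoleAssignment("cloud hosting provider", "processor",
                           ["process only on documented instructions (Art. 28 GDPR)",
                            "assist with security measures"]),
        ],
    ),
}

for activity in register.values():
    print(activity.name)
    for a in activity.assignments:
        print(f"  {a.stakeholder}: {a.role} -> {', '.join(a.obligations)}")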
As a way to ensure data protection by design and by default, organizations should prioritise carrying out data protection impact assessments and IT security risk assessments prior to any processing activity, and subsequently map the technical and organizational security measures necessary to mitigate any high risks posed towards the rights and freedoms of data subjects (see sub-section 2.2). A Data security by design approach should entail integrating security best practices into the practices of the CAV stakeholder’s organization on both the organizational and technical levels; human oversight and monitoring should also be ensured.
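By way of illustration only, the toy risk register below (all risk descriptions, scores and measures are hypothetical) shows one way the output of such an impact assessment could be mapped to the technical and organizational measures intended to mitigate the risks identified as high.

from dataclasses import dataclass
from typing import List

@dataclass
class Risk:
    """One risk identified in a hypothetical DPIA for a CAV processing activity."""
    description: str
    likelihood: int   # 1 (remote) .. 3 (likely)
    severity: int     # 1 (limited) .. 3 (severe)
    measures: List[str]

    @property
    def level(self) -> str:
        score = self.likelihood * self.severity
        return "high" if score >= 6 else "medium" if score >= 3 else "low"

risks = [
    Risk("Unauthorised access to in-vehicle location history", 2, 3,
         ["encrypt data at rest", "role-based access control", "short retention period"]),
    Risk("Re-identification of anonymised driving statistics", 1, 2,
         ["aggregation thresholds", "periodic re-identification testing"]),
]

# High risks must be mitigated (or the supervisory authority consulted, Art. 36 GDPR)
# before the processing starts.
for r in risks:
    print(f"[{r.level}] {r.description}")
    if r.level == "high":
        for m in r.measures:
            print(f"   -> planned measure: {m}")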
CAVs should be designed in adherence to an Ethics by design and User empowerment by design approach which aims to actively oppose harm and empower users with respect to their data, ensuring that the positive societal benefits of CAVs can be reaped. More specifically, CAV stakeholders must consider Fairness by Design to regulate as well as prevent harm as a result of the algorithmic processing of the CAV, embedding fairness into the CAV itself (see sub-section 2.3).
Conflicts with the principle of purpose limitation may be balanced by applying compatibility tests when personal data needs to be used for other purposes, allowing for the reuse of personal data when the further purpose is compatible with the original purpose (see sub-section 2.4). Hand-in-hand with the principle of purpose limitation goes the challenge posed to the principle of data minimization, whereby CAV manufacturers, developers and service providers are faced with both the possibility of processing massive amounts of personal data and the obligation to process only the personal data which is relevant and necessary for the envisioned processing activity (see sub-section 2.5). Beyond assessing which categories of personal data are relevant and necessary to process, CAV stakeholders could also differentiate between the personal data used for the training stage and that used for the deployment and monitoring of the vehicle.
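The compatibility test mentioned above can be pictured as a structured checklist; the sketch below is a simplified, non-authoritative rendering of the factors listed in Article 6(4) GDPR (link between the purposes, context of collection, nature of the data, possible consequences, and existence of safeguards), with a hypothetical decision rule.

from dataclasses import dataclass

@dataclass
class CompatibilityAssessment:
    """Simplified checklist inspired by the Art. 6(4) GDPR compatibility factors."""
    linked_to_original_purpose: bool      # link between original and further purpose
    within_reasonable_expectations: bool  # context of collection / data subject expectations
    involves_special_categories: bool     # nature of the personal data
    adverse_consequences_likely: bool     # possible consequences for data subjects
    safeguards_in_place: bool             # e.g. pseudonymisation, encryption

    def is_compatible(self) -> bool:
        # Hypothetical decision rule: every factor must point towards compatibility.
        return (self.linked_to_original_purpose
                and self.within_reasonable_expectations
                and not self.involves_special_categories
                and not self.adverse_consequences_likely
                and self.safeguards_in_place)

# Example: reusing navigation data, collected for routing, to improve map accuracy.
reuse = CompatibilityAssessment(True, True, False, False, True)
print("further use compatible:", reuse.is_compatible())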
Furthermore, transparency requirements are especially intensified in the CAV environment, since the GDPR expects a data subject to be able to understand the entire CAV processing (including the purposes, risks, recipients of personal data, etc.), as well as to accurately comprehend the AI’s computations and automated decision-making. Creative solutions will be required on the side of CAV manufacturers, developers and service providers in order to overcome this transparency obstacle, by way of standardized icons, sounds, and the possibility of adapting the content and provision of information to the audience at hand (e.g., drivers, passengers and children on board) (see sub-section 2.6).
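As a final, purely hypothetical illustration of audience-adapted transparency, the snippet below selects a different wording and presentation channel for the same processing notice depending on who is on board; the audience categories and notice texts are invented for the example.

# Hypothetical audience-adapted provision of the same transparency information.
NOTICES = {
    "driver": {
        "channel": "dashboard text + icon",
        "text": "Your location is processed for navigation. Tap the icon for details.",
    },
    "passenger": {
        "channel": "infotainment screen",
        "text": "This vehicle processes trip data to provide on-board services.",
    },
    "child": {
        "channel": "audio message + simplified icon",
        "text": "This car remembers where it drives so it can find the way.",
    },
}

def notice_for(audience: str) -> dict:
    """Return the notice variant for the detected audience, defaulting to the driver version."""
    return NOTICES.get(audience, NOTICES["driver"])

for audience in ("driver", "child"):
    n = notice_for(audience)
    print(f"{audience}: [{n['channel']}] {n['text']}")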
While compliance with applicable data protection obligations represents an important starting point towards the lawful deployment of CAVs, reliance on existing legal privacy, data protection, and security frameworks is not enough to ensure a sustainable and beneficial proliferation of automated vehicles. In the case of new technologies and economic models that are constantly propelled into being by perpetually transforming innovations, regulation seems to fall short of providing genuine protection for the fundamental rights and freedoms of Europeans and of effectively mitigating the risks they present. Due to the particularly high-risk nature of transportation and the extensive data processing operations that take place within the CAV, both to make it function and to make it enjoyable (infotainment), it is in fact necessary that ethical concerns be adequately incorporated into the processes of the developers, manufacturers, and service providers active in this environment.
The need for such an approach to data protection, one that can be considered “ethical” in that it weds value-based models to a newly virtuous form of compliance going beyond what is prescribed by EU data protection law (the ePrivacy Directive and the GDPR), has already been confirmed by the European Data Protection Supervisor,115 the European Commission,116 and the Council of Europe,117 among others. In the area of new technologies such as CAVs, this regulatory gap can be met through the application of the principles outlined in the Maastricht University Data Protection as a Corporate Social Responsibility Framework. By following the Maastricht DPCSR Framework, operators in the automated vehicle sector can help to ensure compliance not only with what is enshrined in the law, but also aim to provide added benefits for society, seeking not only to avoid causing harm, but to “do good” in the digital arena.
114 Paolo Balboni and Kate Francis, “Data Protection as a Corporate Social Responsibility: From Compliance to Sustainability to Generate Both Social and Financial Value.” European Centre on Privacy and Cybersecurity (ECPC), Maastricht University Faculty of Law website. 27 October 2020. Available at: https://www.maastrichtuniversity.nl/ecpc/csr-project/csr-publications.
115 European Data Protection Supervisor, Report Towards a digital ethics – EDPS Ethics Advisory Group. 25 January 2018. Available at: https://edps.europa.eu/sites/edp/files/publication/18-01-25_eag_report_en.pdf.
116 European Commission, European Group on Ethics in Science and New Technologies, Statement on Artificial Intelligence, Robotics and ‘Autonomous’ Systems. March 2018. Available at: https://ec.europa.eu/research/ege/pdf/ege_ai_statement_2018.pdf.
117 Council of Europe, Guidelines on the protection of individuals with regard to the processing of personal data in a world of Big Data, adopted January 2017. Available at: https://rm.coe.int/16806ebe7a.
REFERENCES
 [1] A Novel Adaptive Cybersecurity Framework for the Internet-of-Vehicles (nIoVe). H2020 Project. Grant agreement ID: 833742. Deliverable 2.1. https://niove.eu/
 [2] Agencia Española de Protección de Datos, Guía de Privacidad desde el Diseño. October 2019. Available at: https://www.aepd.es/sites/default/files/2019-11/guia-privacidad-desde-diseno.pdf
 [3] Agencia Española de Protección de Datos, Guía de Protección de Datos por Defecto. October 2020. Available at: https://www.aepd.es/sites/default/files/2020-10/guia-proteccion-datos-por-defecto.pdf
 [4] Article 29 Data Protection Working Party, Guidelines on Data Protection Impact Assessment (DPIA) and determining whether processing is “likely to result in a high risk” for the purposes of Regulation 2016/679. 4 April 2017. As last Revised and Adopted on 4 October 2017. Available at: https://ec.europa.eu/newsroom/article29/item-detail.cfm?item_id=611236
 [5] Article 29 Data Protection Working Party, Guidelines on transparency under Regulation 2016/679. 22 August 2018. Available at: https://ec.europa.eu/newsroom/article29/item-detail.cfm?item_id=622227
 [6] Article 29 Data Protection Working Party, Opinion 03/2010 on the principle of accountability. 13 July 2010. Available at: https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2010/wp173_en.pdf
 [7] Article 29 Data Protection Working Party, Opinion 03/2013 on purpose limitation. 2 April 2013. Available at: https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2013/wp203_en.pdf
 [8] Article 29 Data Protection Working Party, Opinion 08/2014 on the Recent Developments on the Internet of Things. 16 September 2014. Available at: https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf
 [9] Balboni, Paolo, “The Automated Vehicle Consortium and Fairness by Design.” Blog post. May 2019. Available at: https://www.paolobalboni.eu/index.php/2019/05/08/the-automated-vehicle-safety-consortium-and-fairness-by-design/
[10] Balboni, Paolo and Kate Francis, “Data Protection as a Corporate Social Responsibility: From Compliance to Sustainability to Generate Both Social and Financial Value.” European Centre on Privacy and Cybersecurity (ECPC), Maastricht University Faculty of Law website. 27 October 2020. Available at: https://www.maastrichtuniversity.nl/ecpc/csr-project/csr-publications
[11] Baldini, Gianmarco et al., “Ethical Design in the Internet of Things.” Science and Engineering Ethics vol. 24, 3 (2018): 905-925. doi:10.1007/s11948-016-9754-5
[12] Charter of Fundamental Rights of the European Union. Available at: https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=celex:12012P/TXT
[13] Commission Nationale Informatique & Libertes, Compliance Package: Connected vehicles and personal data. October 2017. Available at: https://www.cnil.fr/sites/default/files/atoms/files/cnil_pack_vehicules_connectes_gb.pdf
[14] Council of Europe, Guidelines on the protection of individuals with regard to the processing of personal data in a world of Big Data, adopted January 2017. Strasbourg, France: Council of Europe, 2017. Available at: https://rm.coe.int/16806ebe7a
[15] Court of Justice of the European Union, C-582/14, Patrick Breyer v Bundesrepublik Deutschland, ECLI:EU:C:2016:779. Available at: http://curia.europa.eu/juris/document/document.jsf?text=&docid=184668&pageIndex=0&doclang=en&mode=lst&dir=&occ=first&part=1&cid=1116945
[16] Court of Justice of the European Union, C-210/16, Unabhängiges Landeszentrum für Datenschutz Schleswig-Holstein v Wirtschaftsakademie Schleswig-Holstein GmbH, ECLI:EU:C:2018:388. Available at: http://curia.europa.eu/juris/document/document.jsf?text=&docid=202543&pageIndex=0&doclang=EN&mode=lst&dir=&occ=first&part=1&cid=1206721
[17] Court of Justice of the European Union, C-25/17, Tietosuojavaltuutettu v Jehovan todistajat, ECLI:EU:C:2018:551. Available at: http://curia.europa.eu/juris/document/document.jsf?docid=203822&doclang=EN
[18] Court of Justice of the European Union, C-40/17, Fashion ID GmbH & Co. KG v Verbraucherzentrale NRW eV, ECLI:EU:C:2019:629. Available at: http://curia.europa.eu/juris/liste.jsf?num=C-40/17
[19] Directive 2009/136/EC of the European Parliament and of the Council of 25 November 2009 amending Directive 2002/22/EC on universal service and users’ rights relating to electronic communications networks and services, Directive 2002/58/EC concerning the processing of personal data and the protection of privacy in the electronic communications sector and Regulation (EC) No 2006/2004 on cooperation between national Supervisory Authorities responsible for the enforcement of consumer protection laws (Text with EEA relevance). Available at: https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:02009L0136-20091219
[20] Directive 2008/63/EC of 20 June 2008 on competition in the markets in telecommunications terminal equipment (Codified version) (Text with EEA relevance). Available at: https://eur-lex.europa.eu/legal-content/EN/ALL/?uri=CELEX%3A32008L0063
[21] Directive (EU) 2016/1148 of the European Parliament and of the Council of 6 July 2016 concerning measures for a high common level of security of network and information systems across the Union. Available at: https://eur-lex.europa.eu/legal-content/EN/TXT/?toc=OJ:L:2016:194:TOC&uri=uriserv:OJ.L_.2016.194.01.0001.01.ENG
[22] European Commission, White Paper on Artificial Intelligence – A European approach to excellence and trust. February 2020. Available at: https://ec.europa.eu/info/sites/info/files/commission-white-paper-artificial-intelligence-feb2020_en.pdf
[23] European Commission, European Group on Ethics in Science and New Technologies, Statement on Artificial Intelligence, Robotics and ‘Autonomous’ Systems. March 2018. Brussels, Belgium: European Commission. Available at: https://ec.europa.eu/research/ege/pdf/ege_ai_statement_2018.pdf
[24] European Commission, Independent High-Level Expert Group on Artificial Intelligence, Ethics Guidelines for Trustworthy AI. 8 April 2019. Available at: https://ec.europa.eu/digital-single-market/en/news/ethics-guidelines-trustworthy-ai
[25] European Data Protection Board, Guidelines 1/2020 on processing personal data in the context of connected vehicles and mobility related applications, Version 1.0, Adopted on 28 January 2020. Available at: https://edpb.europa.eu/sites/edpb/files/consultation/edpb_guidelines_202001_connectedvehicles.pdf
[26] European Data Protection Board, Guidelines 05/2020 on consent under Regulation 2016/679, Version 1.1, 4 May 2020. Available at: https://edpb.europa.eu/sites/edpb/files/files/file1/edpb_guidelines_202005_consent_en.pdf
[27] European Data Protection Board, Guidelines 4/2019 on Article 25 Data Protection by Design and by Default. 13 November 2019. Available at: https://edpb.europa.eu/sites/edpb/files/consultation/edpb_guidelines_201904_dataprotection_by_design_and_by_default.pdf
[28] European Data Protection Board, Guidelines 07/2020 on the concepts of controller and processor in the GDPR, Version 1.0, Adopted on 02 September 2020. Available at: https://edpb.europa.eu/our-work-tools/public-consultations-art-704/2020/guidelines-072020-concepts-controller-and-processor_en
[29] European Data Protection Supervisor, Connected Cars, TechDispatch, Issue 3, 2019. Available at: https://edps.europa.eu/data-protection/our-work/publications/techdispatch/techdispatch-3-connected-cars_en
[30] European Data Protection Supervisor, EDPS response to the Commission public consultation on the regulatory environment for platforms, online intermediaries, data and cloud computing and the collaborative economy. 16 December 2015. Available at: https://edps.europa.eu/sites/edp/files/publication/15-12-16_online_platforms_en.pdf
[31] European Data Protection Supervisor, Opinion 5/2018 – Preliminary Opinion on privacy by design. 31 May 2018. Available at: https://edps.europa.eu/sites/edp/files/publication/18-05-31_preliminary_opinion_on_privacy_by_design_en_0.pdf
[32] European Data Protection Supervisor, Report Towards a digital ethics – EDPS Ethics Advisory Group. 25 January 2018. Available at: https://edps.europa.eu/sites/edp/files/publication/18-01-25_eag_report_en.pdf
[33] European Parliament. Charter of Fundamental Rights of the European Union. Luxembourg: Office for Official Publications of the European Communities. 26 October 2012. Available at: https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=celex:12012P/TXT
[34] European Parliament, Report with recommendations to the Commission on Civil Law Rules on Robotics (2015/2103(INL)), A8-0005/2017. 27 January 2017. Available at: http://www.europarl.europa.eu/doceo/document/A-8-2017-0005_EN.pdf
[35] European Parliamentary Research Service, Study of the Panel for the Future of Science and Technology, The ethics of artificial intelligence: Issues and initiatives. March 2020. Available at: https://www.europarl.europa.eu/RegData/etudes/STUD/2020/634452/EPRS_STU(2020)634452_EN.pdf
[36] European Parliamentary Research Service, Study of the Panel for the Future of Science and Technology, The impact of the General Data Protection Regulation (GDPR) on artificial intelligence. June 2020. Available at: https://www.europarl.europa.eu/RegData/etudes/STUD/2020/641530/EPRS_STU(2020)641530_EN.pdf
[37] European Union Agency for Cybersecurity, Good Practices for Security of Smart Cars, November 2019. Available at: https://www.enisa.europa.eu/publications/smart-cars
[38] European Union Agency for Cybersecurity, ENISA Programming Document 2020–2022 Including Multiannual planning, Work programme 2020 and Multiannual staff planning. Luxembourg: Publication Office of the European Union, 2020. Available at: https://www.enisa.europa.eu/publications/corporate-documents/enisa-programming-document-202020132022
[39] European Union Agency for Cybersecurity, Towards a framework for policy development in cybersecurity, security and privacy considerations in autonomous agents. March 2019. Available at: https://www.enisa.europa.eu/publications/considerations-in-autonomous-agents
[40] German Federal Ministry of Transport and Digital Infrastructure, Ethics Commission Automated and Connected Driving. June 2017. Available at: https://www.bmvi.de/SharedDocs/EN/publications/report-ethics-commission.pdf?__blob=publicationFile
[41] German Federal Ministry of Transport and Digital Infrastructure, Task Force on Ethical Aspects of Connected and Automated Driving (Ethics Task Force) established by the 2nd High Level Structural Dialogue in Frankfurt/M. on 14 and 15 September 2017. June 2018. Available at: https://www.bmvi.de/SharedDocs/EN/publications/report-ethics-task-force-automated-driving.pdf?__blob=publicationFile
[42] Government of The Netherlands, Self-driving cars. Available at: https://www.government.nl/topics/mobility-public-transport-and-road-safety/self-driving-vehicles
[43] Horizon 2020 Commission Expert Group to advise on specific ethical issues raised by driverless mobility (E03659). Ethics of Connected and Automated Vehicles: recommendations on road safety, privacy, fairness, explainability and responsibility. 2020. Publication Office of the European Union: Luxembourg. Available at: https://ec.europa.eu/info/sites/info/files/research_and_innovation/ethics_of_connected_and_automated_vehicles_report.pdf
[44] Maastricht University. Developing a New Dimension of Data Protection as a Corporate Social Responsibility (DPCSR). Website. Available at: https://www.maastrichtuniversity.nl/ecpc/csr-project
[45] Organisation for Economic Co-operation and Development, Artificial Intelligence in Society, 11 June 2019. OECD Publishing: Paris. Available at: https://doi.org/10.1787/eedfee77-en
[46] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016, on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC. Available at: https://eur-lex.europa.eu/eli/reg/2016/679/oj
[47] United Kingdom Information Commissioner’s Office, Data protection by design and by default. Available at: https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/accountability-and-governance/data-protection-by-design-and-default/
[48] United Kingdom Information Commissioner’s Office, Principle (b): Purpose limitation. Available at: https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/principles/purpose-limitation/
[49] United Kingdom Information Commissioner’s Office, Security. Available at: https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/security/
[50] WSP Canada Group Limited and Ontario Centres of Excellence, Ontario CAV Ecosystem Analysis. 2019. Available at: https://www.oce-ontario.org/docs/default-source/publications/avin-ecosystem-analysis-final-report-2019.pdf?sfvrsn=2