
Targeted and ex ante assessments: Protecting the right to privacy and the protection of personal data and enhancing the overall data processing mechanism within and outside the TEMA project


Prof. Ioannis Pitas, TEMA-Project Manager at Aristotle University of Thessaloniki

Dr. Georgios Bouchagiar, TEMA-Project Management Support Team at Aristotle University of Thessaloniki

In a recent publication[1], AUTH (TEMA coordinator) proposed a fully fledged privacy assessment that could be applied to future uses of Autonomous Systems (AS) for Natural Disaster Management (NDM) purposes. After exploring whether and how certain implementations may interfere with the right to privacy and the protection of personal data, the authors identified key challenges stemming from non-compliance with the General Data Protection Regulation (GDPR). They then subjected the use of AS to the European Court of Human Rights’ (ECtHR’s) Legality – Legitimacy – Necessity testing (LLN-check) and, on this basis, recommended a targeted and ex ante privacy assessment to address the legal uncertainty resulting from the GDPR’s tech-neutrality and from case law’s ex post (after the harm) adjudication. The recommended scheme, involving independent experts from various disciplines (an independent committee), would apply before the actual use of any AS and deliver a ‘go’, a ‘go-in-condition’ or a ‘no-go’ decision.

The structure of the proposed assessment is illustrated below, using the example of an autonomous drone tasked with monitoring urban areas to manage the effects of heavy rain.


Assessing privacy-compliance of an autonomous drone-use in urban areas to manage heavy rain 

______________________________________________________________________________________________

DATA ON THE REQUESTED AS-USE FOR NDM GOALS

Model: Autonomous Drone v.2

Developer: AI Drones Inc 

Requested use: In urban areas prone to be severely affected by heavy rain

Requesting entity: State (civil protection entities) 

Input: Sensor data capturing public infrastructure and personal data of passers-by  

Output: Prediction of urban locations that could be affected by flood

PRIVACY ASSESSMENT

a. LEGALITY test

Legal basis for requested use: National law permits civil protection authorities to manage natural disasters in an effective and efficient way. This expressly includes the use of autonomous drones. 

Accessibility of the legal basis: The national law has been published in the official journal of the government. 

Foreseeability of the legal basis as to the categories of the people liable to be surveilled: The national law mentions that civil protection entities can, in certain conditions, monitor for NDM purposes in urban areas, as well as gather citizens’ personal data. 

Foreseeability of the legal basis as to the circumstances triggering surveillance: The national law allows for the use of autonomous systems in specific areas that are prone to be affected by certain phenomena (including heavy rain), especially when these phenomena are likely to occur in the near future.

Foreseeability of the legal basis as to limitations on duration of surveillance: The national law only permits the use of autonomous drones within specific time periods during the day.

Foreseeability of the legal basis as to the procedures concerning data processing: The national law describes in detail the procedures on secure storing, requires good encryption mechanisms enhancing confidentiality and provides for adequate privacy-enhancing safeguards.

Foreseeability of the legal basis as to precautions regarding communicating data: The national law imposes restrictions on data sharing. Solely authorised (civil protection) entities can process information collected by drones. 

Foreseeability of the legal basis as to the circumstances for data erasure: The national law demands automated deletion of personal data or, where the data remain necessary to perform the NDM task, their anonymisation/pseudonymisation.

Detailedness of the legal basis: The national law makes detailed reference to examples and scenarios, clarifying drone uses. It also includes annexes with authorised uses in specific pre-determined areas. 

b. LEGITIMACY test

Legitimacy of the goal pursued by the requested AS-use: The requested AS-use is targeted at implementations in specific urban areas that are considered prone to be severely affected by heavy rain. Therefore, the requested use serves the protection of public safety and health.

c. NECESSITY test 

Proportionality of the requested AS-use (balancing the individual right to privacy against the public interest in protecting public safety and health; existence of sufficient legal and technical safeguards): The targeted urban areas have been considered extremely high-risk, since they have been repeatedly affected by heavy rain since last year. In this case, the public interest in protecting public safety and health can be prioritised over privacy, provided that adequate safeguards are in place to protect the right to privacy and the protection of personal data. Concerning safeguards, the law is adequately detailed (e.g., with many privacy-enhancing mechanisms and numerous technical and organisational measures promoting security, integrity and confidentiality). However, the model under assessment, as used elsewhere, has demonstrated certain limitations. In one case, it was found to fall short of minimum specification requirements (data flows resulted in the transfer of data to unauthorised entities in the United States of America). In another case, the same model failed to satisfy enhanced encryption standards (it crashed on the ground, and an unknown person decrypted the stored information, accessed the data and published them online). In addition, its developer is a private entity and has not made its handbooks public (including reviews of the drone’s reliability and accuracy in predicting heavy rain).

In light of these considerations, the use of Autonomous Drone v.2 by civil protection authorities in urban areas prone to be seriously affected by heavy rain, for the goal of managing such rain (by predicting urban locations that may be affected by flood, after processing sensor data capturing public infrastructure and personal data of passers-by), is prohibited until:

- The developer (AI Drones Inc) and the (future) users (civil protection entities) demonstrate before this Committee that all specifications requirements are met to avoid data flows; 

- The developer (AI Drones Inc) shows adequate accuracy and reliability of its model’s predictions, and, more precisely, that it satisfies the minimum threshold of flood prediction accuracy; 

- The state (civil protection entities as future data controllers) demonstrates before this Committee compliance with the data processing principles contained in the GDPR. More concretely, on lawfulness, the state must show that the processing is based on one of the grounds of Article 6 of the GDPR. On fairness and transparency, it must show that it uses systems that are, to the extent desirable, accessible and transparent (except where intellectual property rights are justifiable and do not unfairly limit accessibility and transparency); and that human experts are in place to intervene and explicate autonomous operations and decision-making. On purpose limitation, the state must prove adoption of all necessary measures (e.g., good encryption schemes or confidentiality mechanisms) limiting the goals pursued to those that are strictly necessary to manage heavy rain and making it technologically impossible to direct the drone to other incompatible goals (e.g., by automatically deleting personal data or anonymising such items of information). Concerning data minimisation, the state must demonstrate that it only processes data that are considered absolutely necessary to achieve the NDM goal at hand. Regarding accuracy, the state must show that the information it processes is accurate (and that it can rectify erroneous data). On storage limitation, the state must show that data are deleted, when no longer necessary. Last, on security (integrity and confidentiality), it must demonstrate high confidentiality standards and enhanced protection against attackers. 

________________________________________________________________________________________________
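The logic of the worked example above can be sketched in code. The following is an illustrative sketch only, not an official TEMA tool: it shows how the three LLN tests could combine into a single committee decision, mirroring the drone case (legality and legitimacy satisfied, necessity not, with remediable conditions attached). All function and parameter names, and the boolean structure, are assumptions made for illustration.

```python
from typing import List, Tuple

def lln_decision(legality_ok: bool, legitimacy_ok: bool, necessity_ok: bool,
                 remediable_gaps: List[str]) -> Tuple[str, List[str]]:
    """Return ('go' | 'go-in-condition' | 'no-go', attached conditions)."""
    if not (legality_ok and legitimacy_ok):
        # Without a lawful, foreseeable basis and a legitimate goal,
        # the requested use cannot be authorised at all.
        return ("no-go", [])
    if necessity_ok:
        return ("go", [])
    if remediable_gaps:
        # Proportionality currently fails, but the shortcomings can be
        # cured: authorise only once each condition is demonstrated.
        return ("go-in-condition", remediable_gaps)
    return ("no-go", [])

# The drone example: legality and legitimacy are satisfied, necessity is
# not, and three remediable conditions are identified by the committee.
decision, conditions = lln_decision(
    legality_ok=True, legitimacy_ok=True, necessity_ok=False,
    remediable_gaps=[
        "meet all specification requirements (no unauthorised data flows)",
        "demonstrate the minimum threshold of flood-prediction accuracy",
        "demonstrate compliance with the GDPR processing principles",
    ],
)
# decision == "go-in-condition"
```

Under this sketch, a ‘go-in-condition’ outcome carries its conditions with it, so the committee’s prohibition remains in force until each listed condition is demonstrated.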

Having addressed critical privacy-related challenges, AUTH, along with TEMA partners, will further elaborate a minimum checklist to be applied to intended experiments and missions within the TEMA project. The checklist will rely upon procedures and mechanisms already described in the TEMA Data Management Plan. It will be a step-by-step assessment of critical issues, including: 

  • the design/setting up of the data processing experiment/mission or other data processing operation; 
  • the approval of the experiment, mission or other operation by TEMA partners’ legal departments, Data Protection Officers and competent authorities (if so required); 
  • the entry of the experiment, mission or other operation in a mission logfile (internal document per partner); 
  • the preparation of consent forms (if so required);
  • data storage in a safe and secure repository (internal per partner);
  • the definition of data sharing procedures and rights (e.g., IPR) for sharing with other TEMA partners and/or third parties; 
  • the entry of data and details in the TEMA data xls files; or  
  • making data publicly available through the project’s trusted repository (ZENODO).
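As a hypothetical sketch, the steps above could be recorded as an ordered, per-mission checklist. The step names paraphrase the bullet points; the list, the helper function and its name are assumptions for illustration, not part of the TEMA Data Management Plan.

```python
from typing import Optional, Set

# Ordered steps of the minimum checklist (paraphrased from the article).
TEMA_CHECKLIST = [
    "design/set up the data processing experiment, mission or operation",
    "obtain approval (legal departments, DPOs, competent authorities)",
    "enter the operation in the partner's internal mission logfile",
    "prepare consent forms (if required)",
    "store data in a safe and secure internal repository",
    "define data sharing procedures and rights (e.g., IPR)",
    "enter data and details in the TEMA data xls files",
    "make data publicly available through ZENODO",
]

def next_step(completed: Set[str]) -> Optional[str]:
    """Return the first step not yet completed, or None when all are done."""
    for step in TEMA_CHECKLIST:
        if step not in completed:
            return step
    return None
```

A per-mission record of completed steps would then make it straightforward to see, at any point, which checklist item is outstanding before data can be shared or published.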

Once finalized, the minimum checklist will be used in all experiments, missions and other data processing operations to protect the right to privacy and the protection of personal data, ensure respect for IPR and other rights and shield and enhance the overall data processing mechanism within and outside the TEMA project. 

[1] Georgios Bouchagiar, Vasileios Mygdalis and Ioannis Pitas, ‘Privacy-Shielding Autonomous Systems for Natural Disaster Management (NDM): Targeted Regulation of the Use of Autonomous Systems for Natural Disaster Management Goals Before the Materialization of the Privacy Harm’ (2023) 29 (4) European Public Law 355 https://kluwerlawonline.com/journalarticle/European+Public+Law/29.1/EURO2023020