Key takeaways:
About Schrems II:
- When engaging a US-based provider, set forth in detail, in the documentation provided, the technical and organizational measures used by the processor. It is not enough to state that "they can be obtained by the owner upon request".
- For exporters: it must be clear from the documents on file that the exporter has assessed the effectiveness and sufficiency of the measures adopted to guarantee compliance with the obligations undertaken by the importer in signing the aforementioned clauses, in light of the legislation of the third country to which the data are transferred.
- Especially when transferring data to the US, you need evidence, in the context of the stipulation of the standard contractual clauses, that you have expressly assessed and envisaged, if necessary, "the adoption of additional measures by the data controller to ensure compliance with this level of protection".
- Encryption by itself is not a sufficient supplemental measure if the US-based provider is able to access the information in clear text, even for a limited time, in order to process it, with the information being re-encrypted only afterwards.
- The Garante notes that "pseudonymisation" is not the same as data anonymization, but does not provide an analysis of why or how pseudonymisation can be used as a supplemental measure.
About the DPIA:
- Data minimization: It is not enough to say that you believe the data processed are adequate, relevant and limited to what is necessary with respect to the purposes pursued, and that no data are collected beyond those necessary for carrying out the purpose. You need to present a specific assessment of the adequacy, relevance and proportionality of each category of data processed, with particular regard to data relating to the analysis of behavior, and explain why you consider it adequate.
- Accuracy: It is not enough to say that "the personal data - common and biometric - must be exact, otherwise the indicated purposes could not be pursued". You need to detail an adequate assessment of the actual reliability of the processing tool, with regard, as applicable, both to the facial recognition functions (for example, verifying that malfunctions do not occur due to skin color or somatic traits linked to the ethnic origin of the data subjects) and to the mechanisms by which the risk indices are defined, and to assess the possible repercussions for the data subjects in the event of errors or false positives/negatives. You also have to consider a prior assessment of the reliability of the algorithms used by the system.
- Necessity and proportionality: It is not enough to say that "the processing is necessary for the pursuit of the purposes set out. The data collected are, in fact, adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed". You have to explain the assessments you made regarding why it was necessary to adopt this tool (a supervisory tool for remote exams equipped with facial recognition functions and able to profile the behavior of students during the test), as well as why it was not possible to use alternative supervision tools that are less invasive of the students' rights and freedoms.
- When assessing potential harms and risks posed by the processing, do not consider only physical damage but also emotional harm (especially where profiling and facial recognition are concerned).
Article provided by INPLP member: Odia Kagan (Fox Rothschild, United States)
Dr. Tobias Höllwarth (Managing Director INPLP)