If you ask GDPR professionals how biometric data are regulated in the European Union (EU), it is almost certain that all of them will tell you that “biometric data for the purpose of uniquely identifying a natural person” are deemed a special category of personal data (namely, “sensitive data”). This is also the case in some Latin American countries, where biometric data are explicitly protected as sensitive data.
Mexico, however, does not explicitly regulate biometric data as sensitive data in its two main privacy laws: the Federal Law on the Protection of Personal Data Held by Private Parties and the General Law on the Protection of Personal Data Held by Obliged Subjects (together, the “Data Protection Laws”).
The Data Protection Laws do not define “biometric data”, but they do define “sensitive data”. These definitions use a mixed approach to explain what sensitive data are:
a) On the one hand, the definitions provide that sensitive data include personal data that may reveal aspects such as racial or ethnic origin, present or future state of health, genetic information, religious, philosophical and moral beliefs, trade union affiliation, political opinions, and sexual preference.
b) On the other hand, the same “definition” provides that sensitive data also include personal data that affect the most intimate sphere of a data subject, or whose improper use may give rise to discrimination or entail a serious risk for the data subject.
Thus, this definition includes three criteria (the “Sensitivity Criteria”) that may be used (or misused) to conclude that certain personal data shall be deemed “sensitive”:
(i) If personal data affect the most intimate sphere of a data subject, they shall be deemed sensitive; or
(ii) If the improper use of personal data may give rise to discrimination, they shall be deemed sensitive; or
(iii) If the improper use of personal data entails a serious risk for the data subject, they shall be deemed sensitive.
Under these criteria, it can be argued that a last name or a postal code shall be deemed sensitive data if a given controller may use them to exclude individuals from a recruitment procedure or from governmental benefit procedures because the data subjects have certain last names or live in certain postal zones.
However, we should also consider that “discrimination” does not refer only to the most common (and negative) meaning of that action; it also includes “the ability to see the difference between two things or people”, and every selection exercise is basically a discrimination process.
On the other hand, what does “the most intimate sphere” of an individual mean, and what constitutes it? Does it mean the same for you as for me? Does it mean the same in different countries? The Data Protection Laws (as you may guess) do not define or explain this concept.
And finally, we want to note that the notion of “serious risk” is also tricky, given that processing personal data carries an implicit risk that, under certain circumstances, may be deemed low, medium, or high. When, then, does an “improper” use of personal data entail a serious risk for the data subject?
With this legal definition at hand, in 2018 the Mexican data protection authority (known as the “INAI”) issued the “Guidelines for the Processing of Biometric Data”. Drawing on European sources such as the Article 29 Working Party’s “Opinion 3/2012 on developments in biometric technologies” and the GDPR, the Guidelines established a reference document explaining what biometric data are, but remained unclear on the specific situations in which biometric data shall also be deemed sensitive. The most significant advance comes from a series of examples in which the Mexican data protection authority states that:
“For example, the biometric data of the iris could be considered sensitive in those cases in which it allows obtaining information about the data subject's health status. Likewise, a fingerprint could be considered sensitive if, through its improper use, it could provide access to privileged information that could jeopardize the security or stability of the data subject's assets.”
In this scenario, and however difficult the Sensitivity Criteria may be to explain and apply, the Mexican data protection authority has used them jointly in several sanctioning procedures to conclude that all biometric data are sensitive data, even when they are not used for the purpose of uniquely identifying a natural person (e.g., when biometric technology is used for authentication purposes). This has led to several fines in cases where data controllers did not correctly categorize the type of data they were processing and, for that reason, failed to obtain the mandatory consent to process sensitive data (in Mexico, express and written consent).
In practice, Mexican data controllers and processors have started to understand (some the hard way) that processing biometric data is a serious matter, and that the Mexican data protection authority will enforce the Data Protection Laws by treating such data as sensitive data.
In the meantime, the most ambitious data controllers have opted to carry out Data Protection Impact Assessments (DPIAs) before implementing this technology in their new projects, even though such assessments are not (yet) mandatory for data controllers under the current legal framework applicable to “private parties”.
In the end, two things are certain: (i) under the guidance of the INAI, biometric data are and will be protected as sensitive data, following European standards; and (ii) an update of the Data Protection Laws is necessary to clarify the scope of protection of certain data and to make specific measures, such as carrying out DPIAs, mandatory. After all, Mexico is a non-European signatory of Convention 108 for the Protection of Individuals with regard to Automatic Processing of Personal Data, and we shall honor our obligations.
Article provided by INPLP member: Héctor E. Guzmán Rodríguez (BGBG, Mexico)
Dr. Tobias Höllwarth (Managing Director INPLP)