
When AI met privacy


How can data protection obligations be fulfilled when using artificial intelligence? One of the main issues of concern is the use of personal information by algorithms. The European legislator encourages self-regulation in the AI industry.

As the development of Artificial Intelligence accelerates, one of the main issues of concern is the use that can be made of personal information through the application of algorithms and machine learning.

In fact, the General Data Protection Regulation (GDPR) requires the data controller to implement appropriate technical and organisational measures to ensure the protection of such personal data. The controller must also be able to demonstrate that the processing activities carried out are in accordance with the law.

So, what options are available to fulfil these data protection obligations when using artificial intelligence?

In that sense, Articles 40 et seq. of the GDPR refer to the need to manage risk. To that end, the GDPR requires that the impact of data processing be mitigated through good practices, for instance approved codes of conduct, as well as other options such as certifications or guidance provided by a data protection officer.

As can be seen, the European legislator seems to trust the effectiveness of self-regulation as a means of demonstrating accountability in the processing of personal data.

It therefore encourages it by promoting the drawing up of codes of conduct, allowing companies themselves to define the obligations of data controllers and processors, taking into account the likely risks to the rights and freedoms of natural persons.

In the case of Artificial Intelligence, this approach can be particularly valuable for managing the privacy risks of such a complex technology: only the companies involved in developing and applying it know which issues it is likely to raise in practice, as well as the problems, risks, threats and possible controversies that can arise from its application in the market.

Returning to Article 40(2) of the GDPR, it contemplates several scenarios that may be of great interest for achieving effective self-regulation of issues that have long been raised in the field of artificial intelligence. To cite an example, the principles of fairness and transparency set out in point (a) of that article are also crucial points in the development of artificial intelligence regulation.

As regards fines for the misuse of this technology, voluntary adherence to codes of conduct by the offending entity is an element that must be taken into account when sanctioning data processing that infringes any of the data protection principles laid down in the Regulation.

This reference is expressly included in Article 24(3) of the GDPR, which states that adherence to approved codes of conduct may be used as an element to demonstrate compliance with the controller's obligations, which may be of great help in modulating any liability that the regulator may impose on the data controller at any given time.

Finally, it should also be noted that, with regard to data protection impact assessments, adherence to codes of conduct is an element that must be taken into account when assessing the impact of processing operations carried out by controllers or processors. Some countries, such as Spain, have included this provision in their national data protection legislation, so that codes of conduct are expressly covered by Organic Law 3/2018 on the Protection of Personal Data and the Guarantee of Digital Rights.

 

Article provided by INPLP member: Francisco Pérez Bes and Esmeralda Saracíbar (ECIX, Spain)
