
The privacy risks of the “virtual friend”: the Italian DPA clamps down on Replika


The Italian Data Protection Authority (“Garante Privacy”) has ordered a limitation on the data processing activities carried out through the Replika chatbot, citing data protection concerns relating to minors.

Replika

Replika is an artificial intelligence chatbot, with a written and voice interface, that generates a “virtual friend” which the user can configure as a friend, a mentor or even a romantic partner.

There are many benefits that Replika aims to bring to its users: improving mood and emotional well-being, helping them understand their thoughts and feelings, calming anxiety and working toward goals such as positive thinking, stress management and finding love. Replika, in fact, falls under the branch of artificial intelligence called “affective computing”, which is capable of recognizing and expressing human emotions.

The chatbot’s description leaves no doubt about the massive amount of personal data – “very personal”, as the Italian Data Protection Authority put it in a press release – that Luka Inc., the company that created Replika, collects: confessions, problems, doubts, joys, desires and pains.

These aspects raised the concerns of the Italian DPA, which recognized serious risks to minors and, more generally, to emotionally vulnerable individuals.

The absence of an age verification mechanism

Besides providing children with answers that are completely inappropriate for their level of development and self-awareness, the App lacked any mechanism for verifying the age of its users.

According to the Italian DPA, it is not sufficient that the App’s terms of service state that children under 13 are banned from using it and that users under 18 must be authorized in advance by a parent or legal guardian. Rather, mechanisms for verifying the age of users must be effectively implemented, either in the form of filters for minors or by blocking the App when a user reveals that they are underage. Such mechanisms are also needed to comply with Article 8(2) GDPR – entitled “Conditions applicable to child's consent in relation to information society services” – which requires the data controller to make reasonable efforts to verify that consent is given or authorized by the holder of parental responsibility over the child, taking into consideration available technology.
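By way of illustration only, the gating logic described by the Authority could be sketched along the following lines; the thresholds mirror those mentioned in the App’s terms of service, while the function and field names are hypothetical and do not reflect Replika’s actual implementation or any mechanism endorsed by the Garante.

```python
# Illustrative sketch of an age-gate check at sign-up. All names, thresholds
# and the parental-consent flag are assumptions made for this example only.

from dataclasses import dataclass

MIN_AGE = 13    # below this age, the terms of service ban use of the App
ADULT_AGE = 18  # below this age, prior parental authorization is required


@dataclass
class SignupRequest:
    declared_age: int
    parental_consent_verified: bool = False  # e.g. confirmed via a separate parental flow


def evaluate_age_gate(request: SignupRequest) -> str:
    """Return 'block', 'require_parental_consent' or 'allow'."""
    if request.declared_age < MIN_AGE:
        # Block the App when the user reveals they are under the minimum age.
        return "block"
    if request.declared_age < ADULT_AGE and not request.parental_consent_verified:
        # Article 8(2) GDPR: make reasonable efforts to verify that consent is
        # given or authorized by the holder of parental responsibility.
        return "require_parental_consent"
    return "allow"


print(evaluate_age_gate(SignupRequest(declared_age=12)))  # -> block
print(evaluate_age_gate(SignupRequest(declared_age=16)))  # -> require_parental_consent
```

A self-declared age is, of course, only a starting point: the Garante’s point is precisely that such declarations must be backed by verification measures proportionate to the available technology.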

Critical aspects of the privacy policy

Further concerns arose with regard to the App’s privacy policy, which was not compliant with the transparency principles and obligations set out in the GDPR, as it failed to disclose any information on the key elements of the processing at issue, in particular on the use of children’s personal data.

Indeed, the privacy policy failed to identify a proper legal basis for the processing activities carried out by the chatbot, which, in any case, cannot be based on:

• consent of the data subject, in the absence of the conditions set out in Article 8(2) GDPR;

• the performance of a contract to which the data subject is party, since children lack the capacity under Italian law to conclude a contract for the supply of services such as the one in the present case, which involves the provision of a significant amount of personal data.

In the light of the aforementioned circumstances, the processing of personal data of Replika users breached Articles 5, 6, 8, 9 and 25 GDPR.

Based on these violations, the Italian DPA, as a matter of urgency, ordered Luka Inc. to temporarily limit the processing of all personal data relating to users in the Italian territory. The Authority also required the company to provide, within 20 days, information on any steps taken to comply with the GDPR, under penalty of a fine of up to EUR 20 million or up to 4% of its total worldwide annual turnover.

Conclusions

Guido Scorza, member of the Italian DPA, commented on the matter in these terms: “handing over part of personal identity to an artificial intelligence is something of which a minor is unable to appreciate the impact, effects and consequences”.

The online safety of minors is a particularly important theme for the Italian Authority, which has repeatedly advocated for the introduction of effective age verification mechanisms in all platforms and digital services.

In the opinion of the Italian DPA, these mechanisms could be implemented through different methodologies: for instance, (i) using artificial intelligence that, through machine learning algorithms, produces a plausible guess of the user’s age, or (ii) through so-called “crowdsearching”, a scheme that relies on users to identify violations of the age threshold and report them to the platform operator.
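As a purely illustrative sketch of approach (i), a platform might train a simple supervised text classifier to produce a plausible guess of whether an account belongs to a minor or an adult; the training examples, labels and feature choices below are invented for demonstration and do not correspond to any system proposed by the Garante or deployed by Replika.

```python
# Hypothetical age-group estimation from user-written text, for illustration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented labelled examples: message text -> age group.
train_texts = [
    "my teacher gave us homework about my favourite cartoon",
    "after school i play games with my friends",
    "my mortgage payment went up again this quarter",
    "the commute to the office was terrible today",
]
train_labels = ["minor", "minor", "adult", "adult"]

# TF-IDF features plus logistic regression yield a probability, i.e. a
# "plausible guess" of the user's age group rather than a certainty.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(train_texts, train_labels)

new_message = ["i have a maths test at school tomorrow"]
print(model.predict(new_message))        # e.g. ['minor']
print(model.predict_proba(new_message))  # class probabilities, useful for escalation
```

In practice, such a probabilistic estimate would at most flag an account for further verification or for the protective filters mentioned above, rather than serve as proof of age.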

 

Article provided by INPLP member: Chiara Agostini (RP Legal & Tax, Italy)

 

 
