London – Instagram, which had enjoyed a relatively quiet period after the 2021 crisis triggered by a former employee's revelations about the platform's negative impact on young people, is making headlines again with a huge penalty in Ireland, where its European headquarters is located.
This Monday (5), the platform’s holding company, Meta, was ordered to pay €405 million (R$2.1 billion) by the Irish Data Protection Commission (DPC) for violating European bloc legislation on children’s privacy in Instagram’s settings.
The fine stemmed from the display of the phone numbers and email addresses of users aged 13 to 17 who held business accounts, and from accounts for this age group being set to public by default. Meta reported that these settings are no longer in place.
Instagram fine is Meta’s biggest ever in Europe
The fine, the result of a two-year investigation, is the highest ever imposed on Meta, owner of Instagram, Facebook and WhatsApp, and the second largest ever imposed on a digital platform.
In September of last year and in March of this year, the holding company was fined €225 million (R$1.1 billion) and €17 million (R$87 million), respectively, for “serious” violations in the protection of WhatsApp users’ data.
In July 2021, Amazon was ordered to pay €746 million.
Also Read | WhatsApp fined BRL 1.3 billion for violating data privacy law in Europe
The Irish data protection regulator is responsible for overseeing the activities of platforms that keep their European headquarters in Ireland.
In 2021, Meta faced a storm that severely damaged its image after former employee Frances Haugen made public internal research showing that the platform harmed the body image, self-esteem and emotional state of young users.
Haugen first leaked the presentations anonymously to American newspapers, then came forward publicly and was invited to testify before committees investigating digital platforms in the US Congress and in the British and European Parliaments.
Also Read | Eight US states open investigations into Facebook and Instagram for causing emotional harm to teens
Faced with the fallout from the disclosures, Meta backed away from introducing end-to-end encryption in Messenger and Instagram direct messages, a feature that authorities and child protection agencies see as a risk.
Without the ability to identify people who abuse children online, these crimes become nearly impossible to investigate, according to the organizations.
Also Read | Under pressure, Facebook delays launch of end-to-end encryption on Messenger and Instagram
Meta told Politico that it intends to appeal the fine from the Irish regulator, saying the investigation focused on old settings that were updated more than a year ago.
According to the company, features have since been launched to keep teens safe and their information private, and users under 18 now have their accounts set to private automatically when they sign up for Instagram.
Alongside Instagram, TikTok has been on the radar of regulators and child protection organizations over issues such as privacy and viral challenges that have already caused deaths.
Also Read | Child deaths in ‘blackout’ challenge and China’s access to data: new TikTok crises in the US
Source: Noticias