London – A coroner's inquest in London, which concluded on Friday (30), directly blamed social media for the death of 14-year-old Molly Russell, capping a fight of more than five years by the girl's father, Ian Russell, to show that the content she saw online influenced her decision.
Molly took her own life in 2017, becoming a symbol of the risks social media poses to children and teens, and a reference for those advocating stricter regulations to punish platforms that allow harmful content to circulate.
Even Prince William, who follows the monarchy's protocol and rarely speaks on controversial topics, called yesterday for more online safety for children. He said on Twitter:
“No parent should have to endure what Ian Russell and his family went through. They were incredibly brave. Online safety for our children and teens should be a prerequisite, not an afterthought. W”
– The Prince and Princess of Wales (@KensingtonRoyal) 30 September 2022
The message was signed with the ‘W’ used to mark posts personally written by William rather than by aides. He has three young children and has supported Ian Russell’s cause in the past.
Coroner says social media contributed to suicide
Molly was found dead in her bedroom in November 2017. Although seemingly healthy and balanced, the girl followed topics like suicide and self-harm on social media.
The father embarked on a bold journey to challenge the power of digital platforms, establishing a foundation named after his daughter to raise awareness of the risk of suicide among young people and the danger posed by content that is freely available on social media to people of all ages.
During the two weeks of the inquest, in which evidence was presented and witnesses testified, the family’s lawyers showed that of the 16,300 posts the girl had saved, shared or liked on Instagram in the six months before her death, 2,100 were about depression, self-harm, or suicide.
Andrew Walker, the coroner who led the inquest, recorded the cause of death as self-harm.
In his final ruling, he said that the girl had been “exposed to material that may have adversely affected her”, and that “what started as depression turned into a more serious depressive illness.”
Walker also said the content he found “particularly striking” included “romanticized acts of self-harm” and material “normalizing her situation”, with a focus on a “limited and irrational vision without any balance of normalcy”.
What do digital platforms say?
Representatives of Instagram owner Meta and of Pinterest testified at the inquest and took different positions in the Molly Russell case.
Pinterest executive Judson Hoffman said he “deeply regrets” the posts Molly saw and would not show them to his own children. He apologized, admitting that the platform was “not safe” at the time the young woman used it.
Meta went the other way. Elizabeth Lagone, the company’s head of health and well-being, flew from the US to testify at the inquest, even though Meta’s head of corporate affairs, Nick Clegg, is a former British politician living in the UK.
On Monday (25), after spending an hour reviewing posts about suicide, drugs, alcohol, depression and self-harm that Molly had accessed, the executive of Instagram’s parent company said she considered it safe for people to express themselves, and on that basis justified the posts the girl had seen.
She rated the posts as “generally permissible” but acknowledged that two of them violated the platform’s policies. Even so, she defended the thesis that “it is important to give a voice” to people with suicidal thoughts.
During the deposition, the coroner questioned Meta’s right to decide what material is safe for children to see. The executive said the decisions were made in collaboration with experts.
She also cited Instagram’s guidelines at the time of Molly’s death, which authorized content related to suicide and self-harm as a way to make it easier for users to come together in support of one another, as long as it did not encourage the act.
Well prepared, Lagone was evasive when asked whether the content had driven Molly to the extreme, and she avoided distinguishing between teenagers and adults on the grounds that all Instagram users are over 13.
The testimonies revealed differing views on the subject, even across platforms. Meta’s position, in effect, treats adults and young people alike, without taking into account the inherent insecurities of adolescence that can make it difficult to distinguish between “supporting” and “encouraging” suicide.
The inquest’s conclusion was clear. Coroner Andrew Walker said the online content Molly Russell saw was “not safe” and “should not be open for a child to see”.
Father of girl who took her own life does not forgive social media
Ian Russell sharply criticized Meta’s position, as he has done throughout his battle against social media since his daughter’s suicide:
“We heard a senior executive at the company describe this deadly stream of content, sent to Molly by the platform’s algorithms, as ‘safe’ and not in violation of platform policies.
“If this demented trail of life-sucking content was safe, my daughter Molly would probably still be alive, and instead of being a grieving family of four, there would be five of us looking forward to the life full of purpose and promise that lay ahead for our beautiful Molly.
“It’s time to change the toxic corporate culture at the heart of the world’s largest social media platform.”
Russell accused the platforms of “turning suffering into money” and addressed Meta CEO Mark Zuckerberg directly: “Listen to the people using your platform, listen to the coroner’s conclusions, and then do something about it.”
The inquest is not a criminal conviction, but it influences future court decisions and has had a huge impact in the UK. Local media followed the case daily and gave each statement ample coverage.
Britain’s children’s commissioner, Rachel de Souza, told Sky News that, following the inquest’s findings, social media giants should remove this kind of harmful content from their platforms, calling it “despicable” for companies to “put profits before children’s safety”.
While the bill pending in Parliament to regulate social networks provides for fines and even imprisonment of executives, de Souza questioned the need to wait for those penalties:
“Why can’t these companies take this content down right away?”
She said that in her periodic meetings with platform leaders she often asks: “How many children do you have online? Are you taking this material down?” but regrets that they avoid the questions and, in her view, “have not done enough”.
“They need to have a moral compass and sort this out now; they can.”
At its core, the issue is online safety
On the day the inquest’s conclusion was announced, the Molly Rose Foundation issued a statement highlighting the issue of online safety and the need for regulation:
“The inquest has shown very clearly the significant dangers posed by social media platforms such as Instagram, Twitter and Pinterest in the absence of effective regulation.
“[…] If the government and technology platforms act on the issues raised in the inquest, it will have a positive effect on the mental well-being of young people, which is the Molly Rose Foundation’s main goal.
“Platforms must stop profiling children in order to serve them harmful content, even if they have the right to host that content for adult users, who may also be vulnerable.
“Things need to change. Social media platforms have shown that they will not protect users without legislative action, so it is vital that the Online Safety Bill be passed without delay and that an era of accountability begin. For social media, the Wild West era is over.”
Social media bill stalled in Parliament
Social networks draw criticism for many reasons. In the UK, the central issue has been the risk to children, heightened by the Molly Russell case.
The press is closely following signals from new Prime Minister Liz Truss’s government about changes to the Online Harm Bill.
The bill was introduced in 2020 as a bold initiative to make the country the safest place in the world for children on the internet, and reached Parliament in 2021. But it has stalled.
The change being considered to the original text is a loosening that has displeased groups calling for a harder line on child harassment and on children’s access to harmful content: platforms would no longer be required to remove risky but legal content.
Those who want the change argue that giving platforms the responsibility to remove content hands them excessive power.
For the NGO Open Rights, the British bill in its current form “effectively outsources policing of the internet, shifting it from the police, courts and Parliament to Silicon Valley”.
Those in favor say it is up to the platforms to remove content in the greater interest of society, without depending on the constituted powers, as Rachel de Souza and the Molly Rose Foundation argue. In its statement on the inquest’s outcome, the foundation said:
“Almost five years have passed since Molly’s death, and we are still waiting for the promised government legislation. We cannot wait any longer; we should not aim for perfection, too many lives are at risk.
“The regulatory framework can be refined in the months and years ahead.”
The Labour Party, in opposition to the governing Conservatives, is also pressing for quick approval of the law. After the inquest ended, the party’s shadow secretary for digital and culture, Lucy Powell, argued that the law should be passed as soon as possible.
Labour’s shadow digital and culture Secretary Lucy Powell, said: “Enough is enough. Children must be kept safe online. The government should bring forward the full Online Safety Bill at the earliest opportunity.” https://t.co/HRbjaF4mTo pic.twitter.com/qePGz7i8KK
— Molly Rose Foundation (@mollyroseorg) September 30, 2022
Not everyone has the patience to wait. Two weeks ago the ICO, the UK’s data protection regulator, notified TikTok that it faces a £27 million fine for breaching data protection law by failing to guarantee the privacy of children using the platform.