Social networks must take responsibility for protecting teenagers

Can social networks be responsible for the deterioration of minors’ mental health? A recent UK inquest determined that, in part, they can. Platforms must commit to protecting underage and vulnerable users, and regulation must demand more of them.

Molly Russell, a London teenager, took her own life at the age of 14 in 2017. A review of her social media accounts found that the girl had been sucked into an online black hole of posts related to depression, self-harm and anxiety.

Molly’s father started a campaign demanding greater protection on the platforms, and an inquest chaired by coroner Andrew Walker has now determined that the content the teenager viewed on Instagram and Pinterest contributed to her death.

It is one of the few official acknowledgments — although the companies were not taken to court, only called to attend some of the inquest sessions — of the negative impact that social networks can have on mental health, a reality that many take for granted but that others are cautious about when it comes to establishing causality.

A complex phenomenon

Which came first: the chicken or the egg? That is the question that seems to arise when talking about the impact of social networks on adolescents, because mental health problems — and suicide in particular — are complex, multifactorial phenomena that can almost never be traced to a single cause.

For example, a study published in the journal New Media & Society found that people who saw self-harming content on Instagram showed “more behaviors related to self-harm and suicide.” The study was not conclusive as to the cause: it may be due to the exposure itself, or to the fact that people who see this type of material are already at higher risk and are more likely to seek out self-injury posts.

“There is no definitive study that conclusively links social media with suicide risk,” says Matt Navarra, a social media expert and consultant who worked for the UK government as a digital communications specialist. “There are so many elements that contribute to this that it is difficult to be sure of the level of relationship between these things,” he reflects.

However, he continues: “I believe that it can be said that social networks can have a negative effect on the mental health of young and vulnerable users. I think there are enough examples to suggest that because of the way the apps are designed, the way the algorithm works, and the way human psychology operates, there is a level of addiction to these platforms that can, in itself, be dangerous for certain groups of users”.

That said, if you type the word “suicide” into a social network, it will most likely redirect you to help resources, which is a very positive feature of these platforms.

However, the algorithm is also designed to hook the user, and it does this by weaving a small spider’s web out of the content it believes may interest them. It is guided not only by the interests the user has explicitly shown, but also by more subtle signals, which end up filling the user’s feed with increasingly addictive posts.

That is what the British inquest concluded happened in Molly Russell’s case, and it points to a deeper, more global problem with social networks.

“The content she sought out, and that these platforms recommended to her, was clearly a contributing factor in her tragic death, which might have been avoided had she not been using these platforms, or if there had been more filters to protect people like Molly Russell. What is certain is that these platforms can and should do more to protect all their users, especially the youngest and most vulnerable,” concludes Navarra.

Harmful content is easily accessible

The thing is, harmful posts are just a click away. A survey by the University of Georgia concluded that Instagram posts related to self-harm increased over the course of 2017, surpassing 110,000 in December.

“Yesterday I went on Instagram and wanted to see how long it would take for an explicit image with blood and obvious self-injury to appear,” explained Amanda Giordano, lead author of the study and associate professor at the Mary Frances Early College of Education. “It took about a minute and a half.”

Why does this happen? Beyond how the algorithm works, platforms also struggle to know whether a user is a minor.

“One of the many challenges of these platforms is that age verification is ineffective at stopping underage users from registering, or at letting the networks know a person’s real age in order to block certain types of content,” explains Navarra.

On the other hand, the moderation systems designed to prevent certain content from being displayed also fall short.

“The systems that exist are a mixture of human moderators and artificial intelligence technologies, which do not act with the efficiency that would be necessary,” the specialist explains.

This is nothing new. In September 2021, leaked internal reports scandalized many users by revealing that Facebook (now Meta) knew that its Instagram network is “toxic” for teenagers and that its protection protocols “are failing”. Among the conclusions, one revealing statistic: of adolescents who reported having suicidal thoughts, 13% of users in the United Kingdom and 6% in the United States attributed the problem to the aforementioned social network.

Everyone’s responsibility

In light of the findings of the Molly Russell case, Elizabeth Lagone, Meta’s head of health and well-being policy, said: “We’re sorry that Molly saw content that violated our policies, and we don’t want that on the platform.”

An executive at Pinterest, another platform where Molly spent a lot of time before her death, also acknowledged that the site was not safe when the teenager was using it.

Nevertheless, Navarra maintains that it is regulation that must make user protection a requirement for platforms, with a sanctions regime that forces them to change.

Although the inquest’s conclusions in the Russell case are recent, her death occurred in 2017, and the platforms claim to have made changes since then: new policies, improvements to certain aspects of their moderation systems and, above all, tools for managing the time users spend on the platforms.

Nevertheless, Navarra questions the real effectiveness of these measures and insists that platforms still need stricter content controls to prevent cases like Molly Russell’s.

“The same algorithms still exist, and there are still many underage users who shouldn’t be able to access certain types of content. There has been no improvement in age verification, so I think there were changes, but they were small and subtle,” he concludes.

However, parents cannot be relieved of their responsibility to protect minors: “I believe that parents have a role to play with younger users in terms of digital literacy and the ability to have a healthy relationship with electronic devices and social networks. They cannot simply trust that platforms will be completely safe at all times, or that regulators will protect their children.”

Finally, Navarra also believes that individual responsibility can force change: “I believe that people’s relationship with social media has changed, and they are more aware than ever of the dangers and risks. The appetite for these platforms, as well as what is expected of them, is changing for the better, and this will drive transformation faster than anything else.”

It is hard to deny that social networks can harm the mental health of their youngest and most vulnerable users. It is their responsibility to ensure that children are protected on their platforms, but it is also society’s responsibility to reflect on how it got to this point and to demand the measures that will drive change.

WHERE TO LOOK FOR HELP

CAPS and Basic Health Units (Family Health clinics, health posts and health centers)

UPA 24h, SAMU 192, emergency rooms and hospitals

CVV – Life Appreciation Center

Website: http://www.cvv.org.br

Email: atendimento@cvv.org.br

Telephone: 188

Vita Alere Institute for Suicide Prevention and Postvention

Website: https://vitaalere.com.br/

Email: contato@vitaalere.com.br

Brazilian Association of Suicide Studies and Prevention – ABEPS

Website: https://www.abeps.org.br/

Email: faleconosco@abeps.org.br

September Yellow

Website: http://www.setembroamarelo.org.br/

©2022 ACEPRENSA. Published with permission. Original in Spanish.