Emily started using Instagram in her mid-teens and found it useful early on. She used the photo-sharing app to follow fitness influencers, but what started as a constructive relationship with the platform escalated into a crisis centred on body image. At 19, she was diagnosed with an eating disorder.
“I felt like my body wasn’t good enough, because even though I went to the gym a lot, my body still never looked like the bodies of these influencers,” says Emily, now a 20-year-old student in recovery.
Emily, who preferred not to use her real name, now uses Instagram sparingly. She is one of many Instagram users whose experiences came to prominence this week with revelations that the platform’s owner, Facebook, appeared to know the app was damaging teenage mental health.
According to internal research leaked to the Wall Street Journal (WSJ), the app made body image problems worse for one in three girls, and in a Facebook study of teenagers in the UK and US, more than 40% of Instagram users who said they felt “unattractive” said the feeling began while using the app.
Instagram has more than a billion users worldwide and around 30 million in the UK, with Kim Kardashian, Selena Gomez and Ariana Grande among the accounts with hundreds of millions of followers between them. In the UK, the Love Island couple Liam Reardon and Millie Court have already amassed almost 3 million combined followers since winning the show in 2021.
Two in five (40%) girls aged 11 to 16 in the UK say they have seen images online that made them feel insecure or less confident. That figure rises to half (50%) among girls and young women aged 17 to 21, according to Girlguiding’s annual survey of girls’ attitudes.
Sonia Livingstone, professor of social psychology in the Department of Media and Communications at LSE, describes adolescence for teenage girls as an “arc” that tends to begin with childhood interests such as pets, painting or playing with younger siblings, and end with a more confident young woman ready to face the world. But it is the experience in the middle of this arc that presents a particular challenge, and where Instagram can be most unsettling.
“This is when they are bombarded with many answers to their dilemmas, and one prominent answer right now is that what they look like, and what they buy, is what matters,” says Livingstone, who is due next week to give evidence to the MPs and peers scrutinising the UK’s Online Safety Bill, which places a duty of care on social media companies to protect users from harmful content.
Facebook’s extensive research into the photo-sharing app found that Instagram had a deeper effect on teenage girls because it focused more on the body and lifestyle, compared with TikTok’s emphasis on performance videos such as dancing, and Snapchat’s jokey face filters. “Social comparison is worse on Instagram,” the Facebook research said. The leaked research pointed to the app’s Explore page, where an algorithm tailors the photos and videos a user sees, potentially creating a spiral of harmful content.
“Aspects of Instagram exacerbate each other to create a perfect storm,” the research said.
Livingstone says a key feature of the Online Safety Bill will be its provisions on regulating algorithms, which constantly adapt and change what you see to suit your perceived needs and tastes, and can push teenage girls into that vortex of content detrimental to self-esteem. “There is a lot to do on algorithms and AI [artificial intelligence].”
Beeban Kidron, the crossbench peer who sits on the joint committee scrutinising the Online Safety Bill and was behind the recent introduction of a privacy code for children, said Ofcom, the UK communications regulator, would play a key role in scrutinising algorithms.
“The value of algorithmic oversight to regulators is that the decisions tech companies make will become transparent, including decisions made by Facebook to allow Instagram to target teenage girls with images and features that ended in anxiety, depression and suicidal thoughts. Algorithmic oversight is the key to society regaining control.”
A spokesperson for the Department for Digital, Culture, Media and Sport said the bill would address those concerns. “As part of their duty of care, companies will need to mitigate the risks of their algorithms promoting illegal or harmful content, especially for children. Ofcom will have a range of powers to ensure they do so, including the ability to request information and to enter company premises to access data and equipment.”
For others, it is about teaching young people to navigate a world dominated by social media. Deana Puccio, co-founder of The Rap Project, which visits schools across the UK and abroad to discuss issues such as consent, online and offline safety, and building body confidence and self-esteem, says the bill should be accompanied by an education campaign.
“We as parents, educators and politicians need to equip our young people with the tools, the analytical skills, to make healthy choices for themselves. Because they will have access to whatever they want. They are better at navigating the online world than we are.”
Puccio adds that teenagers should be encouraged to make their social media posts reflect a more realistic view of the world. “We need to start building people’s confidence to post the real highs and lows.”
The head of Instagram, Adam Mosseri, risked stoking criticism of the app on Thursday with comments comparing the impact of social media on society to that of cars. “We know more people die than they otherwise would from car accidents, but by and large, cars create far more value in the world than they destroy. And I think social media is similar,” he said.
Facebook referred the Guardian to a blog post by Karina Newton, head of public policy at Instagram, who said the internal research showed “our commitment to understanding complex and difficult issues young people may face, and informs all the work we do to help those experiencing these issues”.
The Instagram revelations came as part of a WSJ investigation into Facebook, in which the newspaper reported that Facebook gives special treatment to high-profile users, that changes to its News Feed algorithm in 2018 made users of the platform angrier and more confrontational, and that employees had warned Facebook was being used by drug cartels and human traffickers in developing countries.
In response to the algorithm and drug cartel allegations, Facebook said societal divisions existed long before its platform appeared, and that it had a “comprehensive strategy” to keep people safe in countries at risk of conflict and violence.