Exclusive: Former senior engineer says the social media firm has had the means to do more since Molly Russell’s death

Mark Zuckerberg’s Meta has failed to adequately protect children in the years since Molly Russell’s death, according to a whistleblower. Arturo Béjar, a former senior engineer at and consultant for Instagram and Facebook’s parent company, contends that the social media giant already has the infrastructure needed to shield teenagers from harmful content.

Béjar argues that if the company had truly learned from Molly’s death and the subsequent inquest, it would have implemented measures to create a safer experience for young users. In his research on Instagram users, 8.4% of 13- to 15-year-olds reported seeing someone harm themselves, or threaten to do so, in the past week.

Béjar told The Guardian that if the company had absorbed the lessons of Molly Russell’s death, it would build a product that is safe for 13- to 15-year-olds: one in which it is not the case that one in 12 of them sees someone harm themselves, or threaten to do so, in a given week, and in which most of them feel supported when they do encounter self-harm content.

Molly Russell, a 14-year-old from Harrow, north-west London, took her own life in 2017 after viewing harmful content related to suicide, self-harm, depression and anxiety on Instagram and Pinterest. In a landmark verdict in 2022, the inquest into her death concluded that Molly “died from an act of self-harm while suffering from depression and the negative effects of online content”.

According to Béjar, Zuckerberg has the means to make Instagram safer, particularly for teenagers, but the company has chosen not to make those changes.

Either Meta needs a different CEO, Béjar said, or Zuckerberg needs to wake up tomorrow and declare that this kind of content is not allowed on the platform, since the company already has the infrastructure and tools to make such content impossible to find.

Béjar’s research at Instagram, and his attempts to prompt action from the company, feature in a lawsuit filed against Meta by Raúl Torrez, the New Mexico attorney general. The lawsuit alleges that Meta fails to protect children from sexual abuse, predatory approaches and human trafficking. Unredacted documents from the suit reveal that Meta employees warned the company was “defending the status quo” after Molly’s death, a position they said was unacceptable to the media and to many affected families, and likely to prove unacceptable to the broader public.

As an engineering director, Béjar was responsible for child safety tools and for helping children deal with harmful content such as bullying. After leaving his senior engineering role in 2015, he returned as a consultant in 2019 for a two-year stint. During that period he conducted research showing that one in eight children aged 13 to 15 on Instagram had received unwanted sexual advances, one in five had experienced bullying on the platform, and 8% had viewed self-harm content.

The former Meta employee has urged the company to set goals for reducing harmful content, arguing that doing so would create an incentive structure for long-term work on the problem. He has called for a series of changes, including making it easier for users to flag unwanted content and explain why they are flagging it, running regular user surveys on Meta platforms, and simplifying the process for users to submit reports about their experiences on Meta services.

Béjar continues to monitor Instagram and says harmful content, including material related to self-harm, persists on the app, alongside accounts that are clearly run by underage users, despite Instagram’s minimum age of 13.

This week Béjar has met politicians, regulators and campaigners in the UK, including Molly’s father, Ian Russell, whose Molly Rose Foundation facilitated the visit. Last year Béjar testified before Congress about his time at the company and the “awful experiences” of his teenage daughter and her friends on Instagram, including unwanted sexual advances and harassment.

Béjar said Meta could address the problem of self-harm content within three months, given that the company already has all the necessary tools. What is needed, he argued, is the will, and a policy decision, to create a genuinely safe environment for teenagers, with transparent measurement and public reporting.

A Meta spokesperson said: “Every day, numerous individuals both within and outside Meta are dedicated to finding ways to enhance the online safety of young people. Collaborating with parents and experts, we have implemented over 30 tools and resources aimed at supporting teenagers and their families in fostering safe and positive online experiences. Our commitment to this ongoing effort remains steadfast.”

Meta points to a range of safety measures, such as automatically setting accounts of under-16s to private when they join Instagram, preventing adults from sending private messages to teenagers who do not follow them, and providing tools for reporting bullying, harassment and sexual activity on Instagram.
