Meta Takes Measures to Shield Teens: Hides Suicide and Eating Disorder Posts on Instagram and Facebook

Meta, the parent company of Facebook and Instagram, says teen users’ feeds will no longer surface content related to eating disorders, suicide, or self-harm. In a blog post, the company framed the decision as part of its commitment to giving teens age-appropriate experiences. Meta will proactively block the recommendation and display of “age-inappropriate” content in teen users’ feeds, even when it is shared by accounts they follow.


Safety Measures for Teen Users

To enhance safety for teen users, Meta will place their accounts in its most restrictive content settings by default. Teenagers, provided they stated their age truthfully during registration, will also face limits on search terms that could surface harmful content. The change aligns with Meta’s stated goal of creating a safer digital environment for young users.

Meta acknowledges the complexity of certain narratives, such as posts detailing ongoing struggles with self-harm. While these stories can help destigmatize critical issues, Meta recognizes that they may not be suitable for all young audiences. Critics, however, argue that the changes do not go far enough.

The announcement comes as Meta faces legal challenges from numerous U.S. states accusing it of designing Instagram and Facebook features that contribute to youth mental health problems. Critics, including parents who have lost children to online harms, consider the measures inadequate to the severity of those concerns and view the company’s effort to shield teens as reactive rather than proactive.


Ongoing Challenges and Future Impact

Fairplay, a children’s online advocacy group, calls Meta’s announcement a “desperate attempt to avoid regulation” and questions the timing of the changes. Fairplay Executive Director Josh Golin specifically faults Meta for waiting until 2024 to act against pro-suicide and eating disorder content. The critique underscores the ongoing tension between online platforms and advocacy groups seeking stronger protections for young users.

As Meta rolls out these initiatives, their effectiveness and long-term consequences remain uncertain. The evolving regulatory landscape, public criticism, and ongoing litigation will all shape how Meta approaches its responsibility to its users. The company’s ability to balance open dialogue on difficult topics with protection of vulnerable users will remain a focal point of debates about digital platforms.

