
Meta's "historical reluctance" to protect children on Instagram revealed in court documents

03:35 18.01.2024

Newly unredacted documents from New Mexico's lawsuit against Meta, the parent company of Facebook and Instagram, have revealed the company's "historical reluctance" to prioritize the safety of children on its platforms. The lawsuit, filed by New Mexico Attorney General Raúl Torrez in December, accuses Meta of failing to protect young users from exposure to child sexual abuse material and of allowing adults to solicit explicit imagery from them.

The recently unredacted passages from the lawsuit, which were made public on Wednesday, include internal employee messages and presentations from 2020 and 2021 that highlight Meta's awareness of issues such as adult strangers contacting children on Instagram, the sexualization of minors on the platform, and the dangers of its "people you may know" feature, which recommends connections between adults and children. However, the documents reveal that Meta was slow to address these concerns.

One internal document referenced in the lawsuit shows Meta treating as urgent an incident in which the 12-year-old child of an Apple executive was solicited on the platform. The document acknowledges that such incidents could anger Apple to the point of threatening to remove Instagram from the App Store. According to the complaint, Meta was willing to treat the issue as urgent only when it faced potential consequences.

In a July 2020 document titled "Child Safety - State of Play (7/20)," Meta listed "immediate product vulnerabilities" that could harm children. The document confirmed that safeguards available on Facebook were not always present on Instagram. Meta's reasoning at the time was that it did not want to prevent parents and older relatives on Facebook from reaching out to their younger relatives on Instagram. The report's author, however, criticized this reasoning as "less than compelling" and accused Meta of sacrificing children's safety for the sake of growth.

It was not until March 2021 that Instagram announced restrictions on adults messaging minors. However, the lawsuit reveals an internal chat from July 2020 in which an employee asked about Meta's efforts to combat child grooming, to which another employee responded that child safety was not a priority during that period.

The complaint also highlights Instagram's failure to address the issue of inappropriate comments under posts by minors. Former Meta engineering director Arturo Bejar testified about his own daughter's troubling experiences with Instagram, including receiving unwanted sexual advances and harassment.

A child safety presentation from March 2021 acknowledged that Meta was "underinvested in minor sexualization on Instagram." The presentation emphasized that sexualized comments on content posted by minors not only create a terrible experience for creators and bystanders but also provide an opportunity for bad actors to identify and connect with each other.

These newly unredacted documents further underscore Meta's historical reluctance to implement appropriate safeguards on Instagram, even when similar safeguards were available on Facebook. Meta, based in Menlo Park, California, has faced increasing pressure from lawmakers to improve child safety on its platforms. Last week, the company announced plans to hide inappropriate content related to suicide, self-harm, and eating disorders from teenagers' accounts on Instagram and Facebook.

New Mexico's lawsuit follows a similar legal action taken by 33 states in October, which accuses Meta of contributing to the youth mental health crisis by intentionally designing features on Instagram and Facebook that addict children to their platforms. Meta CEO Mark Zuckerberg, along with the CEOs of Snap, Discord, TikTok, and X (formerly Twitter), is scheduled to testify before the U.S. Senate on child safety at the end of January. Critics argue that Meta has not done enough to address the concerns surrounding child safety on its platforms.


