Meta had a 17-strike policy for sex trafficking, former safety leader claims

According to testimony from Vaishnavi Jayakumar, Meta’s former head of safety and well-being, the company allowed accounts involved in “human trafficking for sex” to accrue up to 16 violations before suspending them. The claim, along with other allegations that Meta prioritized engagement over addressing serious harms, appears in a court filing tied to a child safety lawsuit brought by school districts across the United States.

In her deposition, Jayakumar stated that accounts could rack up 16 violations for prostitution and sexual solicitation before being suspended on the 17th offense. According to the lawsuit, she described this as an “excessively high strike threshold” by industry standards, and lawyers cite internal documentation that purportedly corroborates the policy’s existence.

Time magazine reported that the unredacted court documents contained further troubling allegations against Meta, including the claim that Instagram lacked a dedicated mechanism for users to report child sexual abuse material (CSAM). Jayakumar reportedly raised this concern multiple times but was allegedly told that building and staffing such a system would be too labor-intensive.

The court filing also details various instances where Meta is accused of minimizing the negative impacts of its platforms to enhance user engagement. In 2019, Meta allegedly considered defaulting all teen accounts to private to block unsolicited messages but ultimately dismissed the idea due to potential negative effects on engagement growth. Last year, the company did start defaulting teens on Instagram to private accounts.

Further allegations in the lawsuit suggest that Meta researchers discovered that concealing likes on posts could significantly reduce users’ negative self-perceptions. However, the company reportedly abandoned this initiative after determining it would adversely affect Facebook metrics. Similarly, Meta is accused of reintroducing beauty filters in 2020 despite findings that they could contribute to body dysmorphia in young girls. The lawsuit claims Meta believed removing these filters might harm growth by pushing users to other platforms.

Meta’s spokesperson, Andy Stone, strongly refuted the allegations, stating in an email to The Verge that the claims rely on selective quotes and uninformed opinions to create a misleading narrative. Stone emphasized that Meta has been attentive to parental concerns, conducted research on critical issues, and implemented meaningful changes over the past decade to safeguard teenagers. This includes the introduction of Teen Accounts with inherent protections and parental controls to help manage their children’s online experiences.
