Newly unsealed court documents allege that Meta, the parent company of Facebook and Instagram, took a lax approach to managing accounts linked to sex trafficking. Testimony from Vaishnavi Jayakumar, Instagram's former head of safety, indicates that when she joined the company in 2020, Meta operated a "17x" strike policy for accounts involved in trafficking humans for sex: users could accumulate up to 16 violations for activities such as prostitution and solicitation before their account was suspended on the 17th offense. She described this threshold as "very, very high" by industry standards.
The lawsuit, filed in the Northern District of California, alleges that Meta was aware of significant risks posed to younger users on its platforms but chose to downplay these threats. The filings claim that millions of adults on Meta’s platforms were actively engaging with minors, exacerbating mental health issues among teenagers. Furthermore, it is alleged that content related to suicide, eating disorders, and child sexual abuse was frequently detected but not adequately removed.
The legal action also implicates other major social media companies, including Snapchat, TikTok, and YouTube, alleging that these firms ignored the harmful effects their platforms had on children in pursuit of profit. Attorney Previn Warren, representing the plaintiffs, drew a parallel to the tobacco industry: "Meta has designed social media products and platforms that it is aware are addictive to kids." These addictions, Warren said, contribute to serious mental health problems. "They did it anyway, because more usage meant more profits for the company."
The suit further alleges that Meta has targeted younger users since 2017, despite internal studies warning that its platforms could be detrimental to children's wellbeing, and that company executives shut down initiatives aimed at mitigating those harms.
In response to the lawsuit, a spokesperson for Meta firmly rejected the claims, stating, “We strongly disagree with these allegations, which rely on cherry-picked quotes and misinformed opinions in an attempt to present a deliberately misleading picture.” The representative further noted that the company has been proactive in listening to parents and addressing issues of concern, highlighting the introduction of Teen Accounts with built-in protections and parental controls.
Despite these assertions, the lawsuit raises serious questions about the responsibility of social media companies to safeguard their users, particularly vulnerable populations such as minors. As more details emerge from the ongoing litigation, Meta's approach to user safety will face continued scrutiny, and the outcome could carry significant implications not only for the company but for accountability across the broader social media landscape.







































