December 1, 2025

Imagine a major U.S. company—let’s call it “Big Corp.” It is one of the highest-valued businesses in the world, yet its entire profit model depends on addicting children to its products. And the consequences of this addiction are catastrophic.

Big Corp’s own internal studies show that this addiction fuels anxiety, depression, eating disorders, and suicidal ideation among minors. But when those findings surface, the company buries the research and lies to Congress about the results.


At the same time, Big Corp exposes children to unthinkable harms, failing, and often outright refusing, to stop the abuse.

Such appalling behavior from a U.S. company would demand immediate accountability. Yet, according to a recent brief filed by more than 1,800 plaintiffs, including parents and children, Meta has allegedly committed these very offenses for years. The only difference: Instead of throwing our children to the wolves in the physical world, the Big Tech giant has allegedly sold out children’s safety for profit in the virtual space.

The brief, first reported by TIME, provides just the latest evidence documenting how tech companies such as Meta, which owns Facebook and Instagram, have targeted children with addictive features—no matter the consequences.

According to the lawsuit and Vaishnavi Jayakumar, Instagram’s former head of safety and well-being, Meta has a policy of not removing individuals who engage in sex trafficking on its platforms until users have reported the offender at least 17 times.

In September, the Senate Judiciary Subcommittee on Privacy, Technology, and the Law, which I chair, heard from courageous former Meta employees like Jayakumar who worked on youth safety research. In a bombshell hearing, they described how Meta suppressed internal research showing that children using its virtual reality headsets have been sexually propositioned by adults in the company’s Metaverse.

For the child, the psychological and physiological harms of such abuse are no different than if it happened in person. Yet, according to the whistleblowers and hundreds of pages of internal documents, Meta sought “plausible deniability” by eliminating any evidence that would force the company to act. At one point, executives even warned researchers against referring to “kids” on their VR platforms. Their preferred euphemism? “Alleged minors with young sounding voices who may be underage.”

Although the company has said it “stands by its record” and denies any fault, such damaging actions have long been routine at Meta. Earlier this year, the Federal Trade Commission revealed that in 2019, Instagram encouraged known “groomers” to connect with minors through algorithmic follow recommendations. Although the company was aware of these dangerous interactions, CEO Mark Zuckerberg reportedly chose not to strengthen the platform’s safety teams to save money.

With Meta and other Big Tech platforms, we’ve seen this movie over and over: algorithms connect children with drug dealers and flood their feeds with pro-suicide content; AI chatbots sexualize children in role-playing fantasies; design features allow children to share their precise real-time location on a map with anyone, including predators eager to track them down.

The hard truth is that Big Tech companies cannot be trusted to make their platforms safe by design because meaningful safety measures would cut into their bottom lines. Congress must step up and ensure that these companies finally face accountability for the harm they have inflicted on an entire generation of children.

Earlier this year, I reintroduced the bipartisan Kids Online Safety Act (KOSA), which would ensure children are afforded the same protections from harm in the virtual world that they have in the physical world. The Senate’s version of KOSA would establish a clear duty of care for online platforms to prevent specific threats to minors, including sexual abuse, illicit drugs, and the promotion of suicide and eating disorders. Requiring Big Tech companies to take responsibility for making their own products safer is essential to protecting kids and giving parents peace of mind.

The legislation has overwhelming bipartisan support, passing through the Senate last year by a vote of 91-3. This year, the legislation has already regained its veto-proof majority with 67 Senate co-sponsors. 

At a Senate Judiciary Committee hearing last year, Meta’s Zuckerberg faced dozens of parents who had lost their children to harms on social media. “I’m sorry for everything you’ve all gone through,” he told them. “No one should have to go through the things that your families have suffered.”

But it isn’t the first time he has issued such an apology. And for parents living with unimaginable loss, any apology without action is meaningless. 

They deserve accountability. The Kids Online Safety Act could finally deliver it.
