Why tech giants sacrifice children for growth
Description
From Roblox’s CEO calling child predators an “opportunity” to Meta’s internal research showing Instagram harms teen girls, a pattern emerges across every major platform: companies know their products damage children and choose profits anyway. This report examines the evidence across Roblox, Meta, TikTok, and AI companies, revealing why self-regulation has failed and what parents need to understand about the forces shaping their children’s digital lives.
The timing is critical. In November 2025, Roblox CEO David Baszucki’s disastrous interview exposed the mindset driving Silicon Valley’s approach to child safety. But Baszucki isn’t an outlier—he’s representative of an industry that has systematically prioritized growth metrics over children’s wellbeing for over a decade.
The Roblox case reveals an industry-wide pattern
When journalists Casey Newton and Kevin Roose, hosts of the New York Times podcast Hard Fork, asked Roblox CEO David Baszucki about “the problem of predators” on his platform—used by roughly 150 million daily active users, most of them children—his response shocked listeners: “We think of it not necessarily just as a problem, but an opportunity as well.”
The interview, published November 21, 2025, became what the hosts called “the craziest interview we’ve ever done.” Baszucki grew increasingly combative, dismissing questions about child safety, interrupting hosts with sarcastic “high-fives,” and suggesting he wanted to discuss “fun stuff” instead. He even floated adding prediction markets—essentially gambling—to Roblox for children, calling it “a brilliant idea if it can be done legally.”
This tone-deaf performance came as Roblox faces nearly 60 lawsuits alleging the platform facilitated child sexual exploitation. Texas Attorney General Ken Paxton’s lawsuit accused Roblox of “putting pixel pedophiles and profits over the safety of Texas children.” Louisiana, Kentucky, and Florida have filed similar suits, while the SEC and FTC have opened investigations whose scope remains undisclosed.
The Hindenburg Research report from October 2024 provided the most damning evidence. The short-seller’s in-game investigation found what it called “an X-rated pedophile hellscape, exposing children to grooming, pornography, violent content and extremely abusive speech.” Key findings included:
* 38 groups openly trading child pornography on the platform
* Games accessible to under-13 accounts titled “Escape to Epstein Island” and “Diddy Party”
* Robux (virtual currency) used by predators as a bargaining tool to exploit children
* Safety moderation outsourced to Asian call centers paying workers $12 per day
* Over 13,000 reported instances of child exploitation in a single year
Roblox dismissed the report, noting Hindenburg was a short-seller (the firm has since shut down). But the company’s response—relying on vague AI promises while cutting trust and safety spending—exemplifies the industry’s playbook: acknowledge problems exist, claim technology will fix them, and resist any accountability.
Meta knew Instagram harmed teens and chose growth anyway
Meta’s internal research, leaked by whistleblower Frances Haugen in 2021, showed the company knew Instagram harmed teen girls’ mental health—and chose growth anyway.