Podcast Episodes

Tech CTO Has 99.999% P(Doom) — “This is my bugout house” — Louis Berman, AI X-Risk Activist

Louis Berman is a polymath who brings unique credibility to AI doom discussions. He's been coding AI for 25 years, served as CTO of major tech companies, recorded the first visual sighting of what be…


Published 7 hours ago

Rob Miles, Top AI Safety Educator: Humanity Isn’t Ready for Superintelligence!

Rob Miles is the most popular AI safety educator on YouTube, with millions of views across his videos explaining AI alignment to general audiences. He dropped out of his PhD in 2011 to focus entirely…


Published 5 days, 23 hours ago

Debate with Vitalik Buterin — Will “d/acc” Protect Humanity from Superintelligent AI?

Vitalik Buterin is the founder of Ethereum, the world's second-largest cryptocurrency by market cap, currently valued at around $500 billion. But beyond revolutionizing blockchain technology, Vitalik…


Published 2 weeks, 2 days ago

Why I'm Scared GPT-9 Will Murder Me — Liron on Robert Wright’s Nonzero Podcast

Today I’m sharing my interview on Robert Wright’s Nonzero Podcast from last May. Rob is an especially sharp interviewer who doesn't just nod along; he had great probing questions for me.

This intervie…


Published 3 weeks ago

The Man Who Might SOLVE AI Alignment — Dr. Steven Byrnes, AGI Safety Researcher @ Astera Institute

Dr. Steven Byrnes, UC Berkeley physics PhD and Harvard physics postdoc, is an AI safety researcher at the Astera Institute and one of the most rigorous thinkers working on the technical AI alignment …


Published 3 weeks, 6 days ago

Top Professor Condemns AGI Development: “It’s Frankly Evil” — Geoffrey Miller

Geoffrey Miller is an evolutionary psychologist at the University of New Mexico, bestselling author, and one of the world's leading experts on signaling theory and human sexual selection. His book "M…


Published 1 month ago

Zuck’s Superintelligence Agenda is a SCANDAL | Warning Shots EP1

I’m doing a new weekly show on the AI Risk Network called Warning Shots. Check it out!

I’m only cross-posting the first episode here on Doom Debates. You can watch future episodes by subscribing to th…


Published 1 month, 1 week ago

Rationalist Podcasts Unite! — The Bayesian Conspiracy ⨉ Doom Debates Crossover

Eneasz Brodski and Steven Zuber host the Bayesian Conspiracy podcast, which has been running for nine years and covers rationalist topics from AI safety to social dynamics. They're both OG rationalis…


Published 1 month, 1 week ago

His P(Doom) Doubles At The End — AI Safety Debate with Liam Robins, GWU Sophomore

Liam Robins is a math major at George Washington University who is diving deep into AI policy and rationalist thinking.

In Part 1, we explored how AI is transforming college life. Now in Part 2, we rid…


Published 1 month, 2 weeks ago

AI Won't Save Your Job — Liron Reacts to Replit CEO Amjad Masad

Amjad Masad is the founder and CEO of Replit, a full-featured AI-powered software development platform whose revenue reportedly just shot up from $10M/yr to $100M/yr+.

Last week, he went on Joe Rogan …


Published 1 month, 2 weeks ago




