Episode Details
The AI Model Built for What LLMs Can't Do
Description
Most AI companies are racing to build bigger LLMs. Eve Bodnia thinks that's the wrong approach.
Eve is the founder and CEO of Logical Intelligence, which is developing an alternative to the transformer-based models dominating the industry. Her argument: LLMs’ architecture makes them fundamentally unsuited for some mission-critical tasks. A system that generates output one token at a time, with no ability to inspect its own reasoning mid-process or guarantee its results, shouldn't be trusted to design chips, analyze financial data, or even fly a plane. Her alternative is the energy-based model (EBM), a form of AI rooted in the physics principle of energy minimization, not language prediction. Rather than guessing the next probable word, an EBM maps every possible outcome across a mathematical landscape, where likely states settle into valleys and improbable ones sit on peaks.
Dan Shipper talked with Bodnia for AI & I about why she believes LLM progress is plateauing, what it means for AI to actually understand data rather than just pattern-match across it, and how her team is building toward formally verified code generated in plain English—no C++ required.
If you found this episode interesting, please like, subscribe, comment, and share!
Head to http://granola.ai/every and get 3 months free with the code EVERY
To hear more from Dan Shipper:
Subscribe to Every: https://every.to/subscribe
Follow him on X: https://twitter.com/danshipper
Timestamps:
00:00:51 - Introduction
00:02:09 - Why correctness and verifiability matter in AI
00:09:33 - What an energy-based model is
00:14:21 - How EBMs construct energy landscapes to understand data
00:19:00 - Why modeling intelligence through language alone is a flawed approach
00:26:54 - What it means for a model to "understand" data
00:37:21 - How EBMs solve the vibe coding problem and enable formally verified code
00:43:21 - Why LLM progress is plateauing
00:49:54 - Why mission-critical industries haven't adopted LLMs, and how EBMs could fill that gap