Beyond Boosted Trees: Christoph Molnar on the Rise of Tabular Foundation Models

Published 6 hours ago
Description

As the AI landscape evolves, the methods we use to process structured data are undergoing a silent revolution. Join us to explore how Tabular Foundation Models (TFMs) are challenging the decade-long reign of tree-based algorithms, why the traditional "train and predict" workflow is being replaced by "in-context learning," and what this shift means for the future of resilient modeling.

To guide us, Christoph Molnar, a renowned expert in machine learning interpretability and author of the Mindful Modeler newsletter, joins us to share his perspective on the emergence of tabular transformers, the surprising power of synthetic data, and how to maintain model safety in a world without parameter updates.

  • The decline of the "fit and predict" paradigm in tabular data
  • Transformer architectures vs. traditional models like XGBoost and LightGBM
  • In-context learning: Predicting without traditional training steps
  • The role of Structural Causal Models (SCMs) in generating training data
  • Why models trained on "math and probability" succeed on real-world datasets
  • Hardware accessibility and running foundation models on local MacBooks
  • Integrating SHAP values and conformal prediction for model interpretability
  • The future of the data science workflow: One tool among many or a total shift?
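
The "in-context learning" topic above can be illustrated with a toy sketch: instead of a fit step that updates parameters, the labeled table is handed to the model alongside the query row at prediction time. (A real tabular foundation model conditions a pretrained transformer on that context; here a simple nearest-neighbour vote stands in for that step, purely for illustration.)

```python
# Toy illustration of in-context learning for tabular data:
# no gradient updates happen; the labeled rows are simply passed
# along with the query at prediction time.

def predict_in_context(context_X, context_y, query):
    """Predict the label of `query` using the labeled rows as context."""
    def sq_dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    # No training step: just find the closest context row to the query.
    nearest = min(range(len(context_X)), key=lambda i: sq_dist(context_X[i], query))
    return context_y[nearest]

# A small labeled table, supplied as context only at inference time.
X = [[0.0, 0.1], [0.2, 0.0], [1.0, 0.9], [0.8, 1.1]]
y = ["a", "a", "b", "b"]

print(predict_in_context(X, y, [0.1, 0.05]))  # -> a
print(predict_in_context(X, y, [0.9, 1.0]))   # -> b
```

The point of the sketch is the workflow, not the predictor: swapping in a different context table changes the predictions with no retraining, which is the shift away from "fit and predict" the episode discusses.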

This episode is full of technical insights and forward-looking predictions that are sure to change how you approach your next dataset. As we move into a new era of AI, it’s the perfect time to explore the fundamentals of the next frontier!
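
One of the safety tools mentioned in the topic list, conformal prediction, pairs naturally with models that make point predictions without parameter updates. A minimal split-conformal sketch, assuming a held-out calibration set and a stand-in point predictor (all names here are illustrative, not from the episode):

```python
# Minimal split-conformal sketch: wrap any point predictor so it
# returns an interval with approximate 1 - alpha coverage.
import math

def conformal_interval(predict, cal_X, cal_y, x_new, alpha=0.1):
    # Absolute residuals on the held-out calibration set.
    residuals = sorted(abs(y - predict(x)) for x, y in zip(cal_X, cal_y))
    n = len(residuals)
    # Conformal quantile with the finite-sample correction (n + 1).
    k = math.ceil((n + 1) * (1 - alpha)) - 1
    q = residuals[min(k, n - 1)]
    p = predict(x_new)
    return (p - q, p + q)

predict = lambda x: 2.0 * x  # stand-in for any fitted (or in-context) model
cal_X = [1.0, 2.0, 3.0, 4.0, 5.0]
cal_y = [2.1, 3.8, 6.3, 7.9, 10.2]
lo, hi = conformal_interval(predict, cal_X, cal_y, 6.0)
print(lo, hi)  # an interval around the point prediction 12.0
```

Because the wrapper only needs predictions, not model internals, it applies equally to gradient-boosted trees and to foundation models that never update their weights.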

What did you think? Let us know.

Do you have a question or a discussion topic for the AI Fundamentalists? Connect with them to comment on your favorite topics:

  • LinkedIn - Episode summaries, shares of cited articles, and more.
  • YouTube - Was it something that we said? Good. Share your favorite quotes.
  • Visit our page - see past episodes and submit your feedback! Your feedback continues to inspire future episodes.