Episode Details
How algorithms automate human prejudice
Description
In the early 1980s, a computer at St. George's Hospital Medical School in London was automatically rejecting qualified applicants, not because they lacked credentials, but because their names sounded foreign. The algorithm had learned to discriminate by studying past human admissions decisions, and the bias went unnoticed for years. This episode examines how automated systems don't just reflect human prejudice: they industrialize it.
We start with the St. George's case as a concrete entry point into algorithmic discrimination, then expand outward. Automated decision-making systems in hiring, criminal justice, healthcare, and financial services have repeatedly been caught replicating patterns of human prejudice, and scaling them at speeds and volumes no individual decision-maker could match.
This deep dive distinguishes itself from surface-level coverage by examining the specific technical pathways through which prejudice enters automated systems. We cover how training data encodes historical discrimination, how proxy variables allow algorithms to discriminate on protected characteristics without explicitly using them, how feedback loops compound initial biases over time, and why the mathematical structure of optimization itself can produce discriminatory outcomes even when designers have good intentions.
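The proxy-variable pathway described above can be made concrete with a small simulation. The following toy sketch (hypothetical data, not from the episode) shows a "blind" model that never sees the protected group attribute, yet still produces unequal outcomes, because a correlated proxy variable carries the historical bias forward:

```python
import random

random.seed(0)

def make_history(n=10_000):
    """Simulate past human decisions that were biased against group B."""
    rows = []
    for _ in range(n):
        group = random.choice("AB")
        # A proxy (e.g. a postcode band) correlates strongly with group.
        proxy = group if random.random() < 0.9 else ("A" if group == "B" else "B")
        qualified = random.random() < 0.5  # both groups equally qualified
        # Biased humans: qualified group-B applicants were often rejected anyway.
        accepted = qualified and (group == "A" or random.random() < 0.4)
        rows.append((group, proxy, accepted))
    return rows

history = make_history()

# "Train" a model that sees ONLY the proxy: the learned acceptance
# rate for each proxy value, estimated from the biased history.
rate = {}
for p in "AB":
    subset = [acc for (_, proxy, acc) in history if proxy == p]
    rate[p] = sum(subset) / len(subset)

# Apply the proxy-only model to fresh applicants; measure outcomes by group.
accept_prob = {g: 0.0 for g in "AB"}
counts = {g: 0 for g in "AB"}
for group, proxy, _ in make_history():
    counts[group] += 1
    accept_prob[group] += rate[proxy]
for g in "AB":
    accept_prob[g] /= counts[g]

print(f"Model acceptance probability, group A: {accept_prob['A']:.2f}")
print(f"Model acceptance probability, group B: {accept_prob['B']:.2f}")
```

Even though `group` is never an input to the model, group-B applicants end up with a markedly lower acceptance probability, purely because the proxy encodes the discrimination in the training data. Removing the protected attribute from the feature set is therefore not, by itself, a fix.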
We also explore the regulatory and technical responses emerging around the world, from the EU AI Act to algorithmic auditing frameworks, and discuss why transparency, accountability, and diverse development teams are necessary but insufficient conditions for fair AI. Whether you're on the receiving end of automated decisions, working in AI development, or studying the ethics of technology, this episode provides a thorough, grounded examination of one of the most consequential problems in modern computing.
Source credit: Research for this episode included Wikipedia articles accessed 4/2/2026. Wikipedia text is licensed under CC BY-SA 4.0; content here is summarized/adapted in original wording for commentary and educational use.