Tokens All the Way Down: How LLMs Actually Think
That chatbot isn't 'thinking'. It's predicting the next tiny chunk of text, one token after another, using self-attention, the mechanism at the core of the 2017 transformer architecture, which lets every word spy on every other word simultaneously. We break down transformers, hallucinations, and why GPT-5 now decides how long to think before answering.
April 17, 2026 · 17:48 · Episode 2
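
For listeners who want to poke at the idea before pressing play, here is a minimal Python sketch of scaled dot-product attention, the mechanism the blurb alludes to. The function name, array shapes, and toy data are illustrative assumptions, not any production model's code.

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention.

    Q, K, V: (seq_len, d) arrays of query, key, and value vectors.
    Returns a (seq_len, d) array where each row is a weighted mix of
    the value vectors: this is how every token "spies on" every other.
    """
    d = Q.shape[-1]
    # Pairwise scores: how much each query matches each key.
    scores = Q @ K.T / np.sqrt(d)            # (seq_len, seq_len)
    # Softmax each row into attention weights that sum to 1.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                        # (seq_len, d)

# Tiny example: 3 tokens with 4-dimensional embeddings.
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
print(attention(x, x, x).shape)  # -> (3, 4)
```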
