Episode #490 from 48:05
AI Scaling Laws: Are they dead or still holding?
I guess the big question here is—we talked quite a bit here on the architecture behind the pre-training—are the scaling laws holding strong across pre-training, post-training, inference, context size, data, and synthetic data? I'd like to start with the technical definition of a scaling law…
February 1, 2026 · 26 chapters · Lex Fridman