Episode #475 from 1:03:01
Scaling laws
Do you think the scaling laws are holding strong across pre-training, post-training, and test-time compute? And on the flip side of that, do you anticipate AI progress hitting a wall?

We certainly feel there's a lot more room just in the scaling, and actually at all steps: pre-training, post-training, and inference time. So there are three scalings happening concurrently. And again, it's about how innovative you can be, and we pride ourselves on having the broadest and deepest research bench. We have amazing, incredible researchers, people like Noam Shazeer, one of the inventors of the Transformer, and David Silver, who led the AlphaGo project, and so on.