Anthropic Head of Pretraining on Scaling Laws, Compute, and the Future of AI

Published on Oct 1
1 hour 4 minutes
Y Combinator Startup Podcast
Ever wonder what it actually takes to train a frontier AI model? YC General Partner Ankit Gupta sits down with Nick Joseph, Anthropic's Head of Pre-training, to explore the engineering challenges behind training Claude—from managing thousands of GPUs and debugging cursed bugs to balancing compute between pre-training and RL. We cover scaling laws, data strategies, team composition, and why the hardest problems in AI are often infrastructure problems, not ML problems.