Dario Amodei — "We are near the end of the exponential"


Dwarkesh Podcast

2026-02-14

2 hours 22 minutes

Episode description

Dario Amodei thinks we are just a few years away from AGI, or as he puts it, from having "a country of geniuses in a data center". In this episode, we discuss what to make of the scaling hypothesis in the current RL regime, why task-specific RL might lead to generalization, and how AI will diffuse throughout the economy. We also dive into Anthropic's revenue projections, compute commitments, path to profitability, and more. Watch on YouTube; read the transcript.

Sponsors

* Labelbox can get you the RL tasks and environments you need. Their massive network of subject-matter experts ensures realism across domains, and their in-house tooling lets them continuously tweak task difficulty to optimize learning. Reach out at labelbox.com/dwarkesh.
* Jane Street sent me another puzzle… this time, they've trained backdoors into 3 different language models, and they want you to find the triggers. Jane Street isn't even sure this is possible, but they've set aside $50,000 for the best attempts and write-ups. They're accepting submissions until April 1st at janestreet.com/dwarkesh.
* Mercury's personal accounts make it easy to share finances with a partner, a roommate… or OpenClaw. Last week, I wanted to try OpenClaw for myself, so I used Mercury to spin up a virtual debit card with a small spend limit, and then I let my agent loose. No matter your use case, apply at mercury.com/personal-banking.

Timestamps

(00:00:00) - What exactly are we scaling?
(00:12:36) - Is diffusion cope?
(00:29:42) - Is continual learning necessary?
(00:46:20) - If AGI is imminent, why not buy more compute?
(00:58:49) - How will AI labs actually make profit?
(01:31:19) - Will regulations destroy the boons of AGI?
(01:47:41) - Why can't China and America both have a country of geniuses in a datacenter?

Get full access to Dwarkesh Podcast at www.dwarkesh.com/subscribe

Episode transcript

  • So, we talked three years ago.

  • I'm curious in your view, what has been the biggest update of the last three years?

  • What has been the biggest difference between what I expected three years ago and what has actually happened?

  • Yeah, I would say the underlying technology, the exponential of the technology, has gone broadly speaking about as I expected it to go.

  • There's plus or minus a year or two here, plus or minus a year or two there.

  • I don't know that I predicted the specific direction of code, but when I look at the exponential, it is roughly what I expected in terms of the march of the models from smart high school student to smart college student to beginning to do PhD-level and professional work, and in the case of code, reaching beyond that.

  • So the frontier is a little bit uneven, but it's roughly what I expected.

  • I will tell you though what the most surprising thing has been.

  • The most surprising thing has been the lack of public recognition of how close we are to the end of the exponential.

  • To me, it is absolutely wild that you have people, within the bubble and outside the bubble, talking about the same tired old hot-button political issues while all around us we are near the end of the exponential.

  • I want to understand what that exponential looks like right now.