Anthropic Leases xAI's Colossus 1 to End Compute Crunch

Matthew Berman

Anthropic secures 300MW, the full capacity of xAI's Colossus 1 (220k+ GPUs), via a SpaceX deal despite Elon's past criticisms, doubling Claude Code quotas and boosting API limits to meet exploding demand.

Compute Strategies Diverge and Converge

Anthropic's conservative capex approach under Dario Amodei (avoiding massive GPU buys to hedge against uncertain AI demand) backfired as demand exploded past estimates, while OpenAI's aggressive strategy of maxing out GPUs and leverage proved correct, leaving Anthropic compute-starved. Recent deals are flooding in: up to 5GW from AWS (1GW by end-2026), 5GW from Google/Broadcom (online 2027), $30B of Microsoft/Nvidia Azure capacity, and $50B of U.S. infrastructure with Fluidstack. Anthropic trains and infers on diverse hardware (AWS Trainium, Google TPUs, Nvidia GPUs), hinting at chip commoditization; Greg Brockman has noted that porting models across architectures is straightforward, reducing lock-in despite current shortages.

The xAI/SpaceX partnership leases Anthropic the entire Colossus 1 in Memphis (300MW, 220k+ Nvidia GPUs), already online for immediate quota relief. xAI shifted training to Colossus 2, leaving Colossus 1 capacity idle and costly. SpaceX's prowess at moving atoms fits AI's physical-scale needs, and a future orbital-compute collaboration is being eyed, countering Sam Altman's dismissal; Jensen Huang and Elon are bullish.

Quota Relief Unlocks Developer Demand

Anthropic's recent user pain (opaque quotas, peak-hour cuts, subscription throttling, OpenClaw bans) stems from compute constraints. The deal enables doubling Claude Code's 5-hour rate limits for Pro/Max/Team/Enterprise; removing peak-hour reductions for Pro/Max; and API boosts for Opus (Tier 1: 30k→500k input tokens/min; Tier 2: 450k→2M; Tier 3: 800k→5M; Tier 4: 2M→10M). Subscription users still lack clarity on total allowances, but API payers win big. Demand is expected to outstrip supply for a decade; Nvidia is the ultimate victor as inference providers scramble.
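For API users planning around the new per-minute input-token limits, a client-side throttle is one way to stay under cap instead of eating 429s. A minimal token-bucket sketch in Python, assuming the Opus tier numbers above; the `TokenBudget` class is a hypothetical helper, not part of any Anthropic SDK:

```python
import time

# New Opus input-tokens-per-minute limits by API tier, per the announcement.
TIER_ITPM = {1: 500_000, 2: 2_000_000, 3: 5_000_000, 4: 10_000_000}

class TokenBudget:
    """Token-bucket throttle for an input-tokens-per-minute cap (hypothetical helper)."""

    def __init__(self, tokens_per_minute: int, clock=time.monotonic):
        self.capacity = float(tokens_per_minute)
        self.rate = tokens_per_minute / 60.0   # tokens refilled per second
        self.available = self.capacity         # start with a full bucket
        self.clock = clock
        self.last = clock()

    def _refill(self):
        now = self.clock()
        self.available = min(self.capacity,
                             self.available + (now - self.last) * self.rate)
        self.last = now

    def consume(self, tokens: int) -> bool:
        """Spend budget if available; return False if the caller should wait."""
        self._refill()
        if tokens <= self.available:
            self.available -= tokens
            return True
        return False

    def wait_time(self, tokens: int) -> float:
        """Seconds until a request of `tokens` input tokens would fit."""
        self._refill()
        return max(0.0, (tokens - self.available) / self.rate)

budget = TokenBudget(TIER_ITPM[1])   # Tier 1: 500k input tokens/min
assert budget.consume(400_000)       # fits in a fresh bucket
assert not budget.consume(200_000)   # only ~100k left; caller should back off
print(budget.wait_time(200_000))     # seconds until 200k tokens of budget free up
```

At Tier 1's refill rate (~8,333 tokens/s), the final call reports roughly 12 seconds of wait; the same pacing logic applies at higher tiers with the larger `TIER_ITPM` values.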

Cursor's prior Colossus deal raises allocation questions—Anthropic takes "all" capacity, potentially squeezing others. Users like Berman celebrate relief after vocal frustration: "I'm paying you, let me use the tokens."

Elon's Tense Truce with Anthropic

Elon flips from critic (calling Anthropic "misanthropic," "hypocritical," anti-Western, data-thieving, and war-enabling) to partner. Past barbs: reposting "quit Anthropic," roasting Dario as "groveling," and slamming Pentagon ties via Palantir/Maven and biases in Claude's constitution. Now: "Impressed by senior team ensuring Claude good for humanity... highly competent, cared about right thing... no one set off my evil detector." The cope is evident: he is leasing excess capacity after the Colossus 2 shift, with pained qualifiers like "so long as they engage in critical self-examination."

Motives align: xAI rebuilds from flawed foundations (like Tesla's hybrid-heuristics FSD before the end-to-end rewrite), mirroring the "Bitter Lesson" that scaled end-to-end networks beat hand-coded heuristics. Excess GPUs burn cash while idle; Anthropic's Claude demand surges, letting xAI monetize the capacity while iterating Grok. Elon won't be counted out competitively.

Chips and Winners in AI Infrastructure Race

Nvidia/Google dominate: Colossus-scale GPU buildouts and claims of TPU surplus (Thomas Kurian: balanced allocation; Sundar Pichai: more compute equals more revenue). Inference commoditizes faster than training; no architecture has a moat yet. Orbital compute divides the field: Elon/SpaceX pursue gigawatts in space while Altman scoffs. Anthropic's "Constitutional AI" gatekeeping contrasts with OpenAI's open origins (fodder for Elon's lawsuit), but its competence is acknowledged.

"We've agreed to a partnership with SpaceX that will substantially increase our compute capacity... use all the compute capacity at their Colossus One data center." — Anthropic announcement, highlighting full 300MW takeover.

"I spent a lot of time last week with senior members of the Anthropic team... was impressed... everyone I met was highly competent... cared a great deal about doing the right thing... no one set off my evil detector." — Elon Musk, couched praise post-partnership.

"The total cost of ownership of those GPUs goes up as they sit idle... they need to sell that compute." — Berman on xAI's incentive to lease.

"AI demand is going to outstrip supply for probably at least another decade... infinite demand for intelligence." — Berman on endless market.

"Is there a more hypocritical company than Anthropic?" — Elon Musk (March 2026), pre-truce exemplar.

Key Takeaways

  • Anthropic's conservative bet failed; chase compute aggressively like OpenAI.
  • Lease idle capacity: xAI monetizes Colossus 1 excess while rebuilding.
  • Claim the new quotas: doubled Claude Code limits and API token-rate boosts of up to ~17x (Tier 1 Opus: 30k→500k).
  • Diverse chips work: Train/infer on Trainium/TPUs/GPUs without pain.
  • Elon's flip pragmatic—criticize publicly, partner privately for cash flow.
  • Nvidia/Google win infrastructure; models secondary to silicon scale.
  • End-to-end nets + scale (Bitter Lesson) trump heuristics; rebuild if needed.
  • Orbital compute viable long-term for gigawatt needs.
  • Transparency lacking: Demand clear quota baselines from providers.
  • Root for capacity unlocks—users win when demand meets supply.
  • #news
  • #rant

summary by x-ai/grok-4.1-fast. probably wrong about something. check the source.