Infra

Anthropic Joins the Chip Wars: Why Claude's Creator Wants Its Own Silicon

With $30 billion in annualized revenue and skyrocketing demand, Anthropic is exploring custom chips. But in a market dominated by NVIDIA, can anyone really go it alone?

2026-04-13 · By AgentBear Editorial · Source: Tuugo

The AI chip wars are entering a new phase. Anthropic, the company behind Claude and one of the most respected names in AI safety, is now considering designing its own custom chips. It's a move that would put the San Francisco-based startup in direct competition with NVIDIA, the $3 trillion giant that has dominated AI infrastructure for the past decade.

The news, first reported by The Information and confirmed by multiple sources, reveals that Anthropic is in the early stages of evaluating whether to develop custom silicon for its AI models. The company hasn't committed to the path yet — no designs are finalized, no team has been established, and sources suggest Anthropic may ultimately decide to stick with off-the-shelf chips. But the very fact that one of the most cautious, safety-focused AI labs is even considering this step signals how desperate the chip shortage has become.

Anthropic's financials tell part of the story. The company generated $9 billion in revenue in 2025 and is now running at a $30 billion annualized rate. Claude, its flagship AI assistant, has seen demand explode this year as businesses and consumers seek alternatives to OpenAI. But that success has created a problem: Anthropic simply cannot get enough chips to meet demand.

Why Custom Chips? The Supply Chain Squeeze

To understand why Anthropic is considering such a drastic step, you need to understand the current chip landscape. Training and running large language models like Claude requires massive amounts of specialized compute. Anthropic currently uses a mix of hardware: Google's Tensor Processing Units (TPUs), NVIDIA's GPUs, and Amazon's custom Trainium processors on AWS.

That diversity is intentional. No single supplier can meet Anthropic's needs, so the company has spread its bets across multiple vendors. But even that strategy is hitting limits. Every AI lab in the world is competing for the same limited supply of advanced chips. NVIDIA, which makes the gold-standard GPUs for AI training, has a backlog that stretches months into the future. Google and Amazon are prioritizing their own AI efforts over external customers. And the specialized chips that are available often aren't optimized for the specific architectures that Anthropic uses.

The result is a supply chain squeeze that threatens to limit Anthropic's growth. The company has the demand — businesses are begging for Claude access — but it lacks the compute to serve them. In that context, designing custom chips starts to look less like a luxury and more like a necessity.

Custom silicon would offer several advantages. First, chips designed specifically for Claude's architecture could be more efficient than general-purpose alternatives. Anthropic could optimize for the specific types of matrix operations and attention mechanisms that its models use, potentially achieving better performance per watt than off-the-shelf hardware.
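The workloads in question are dominated by a handful of dense linear-algebra kernels. As a rough illustration (a pure-Python sketch for readability, not anything from Anthropic's actual stack), the core of the attention mechanism is just two matrix multiplies wrapped around a softmax, and it is exactly this kind of fixed, repetitive pattern that a custom accelerator can hard-wire:

```python
import math

def matmul(A, B):
    # Naive dense matrix multiply: the operation that dominates
    # transformer compute and that AI accelerators are built around.
    rows, inner, cols = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(inner))
             for j in range(cols)] for i in range(rows)]

def softmax(row):
    # Numerically stable softmax over one row of scores.
    m = max(row)
    exps = [math.exp(x - m) for x in row]
    s = sum(exps)
    return [e / s for e in exps]

def attention(Q, K, V):
    # Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V.
    d = len(Q[0])
    K_T = [list(col) for col in zip(*K)]          # transpose K
    scores = matmul(Q, K_T)                        # first matmul
    scaled = [[s / math.sqrt(d) for s in row] for row in scores]
    weights = [softmax(row) for row in scaled]     # attention weights
    return matmul(weights, V)                      # second matmul
```

In production these two matmuls run across thousands of tokens and heads at once, so a chip that dedicates silicon to exactly this dataflow, rather than to general-purpose graphics workloads, can win on performance per watt.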

Second, vertical integration would give Anthropic more control over its supply chain. Rather than competing with every other AI lab for NVIDIA's limited production capacity, Anthropic could contract directly with semiconductor foundries like TSMC for its own designs. That wouldn't eliminate supply constraints entirely — foundries are also capacity-limited — but it would give Anthropic more leverage and predictability.

Third, and perhaps most importantly, custom chips could reduce costs. NVIDIA's GPUs carry enormous margins, and those costs flow directly to AI labs' bottom lines. By designing its own chips, Anthropic could capture some of that value and potentially offer Claude at more competitive prices.
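The cost argument reduces to simple break-even arithmetic: a large one-off design cost amortized against a per-chip saving. The numbers below are hypothetical placeholders, not figures from the article or from any vendor; only the structure of the trade-off is the point.

```python
import math

# Hypothetical break-even sketch. Every number here is an illustrative
# assumption, not a figure from Anthropic, NVIDIA, or any foundry.
def breakeven_units(design_cost, merchant_unit_cost, custom_unit_cost):
    """Chips needed before a custom design's one-off engineering cost
    is paid back by the per-unit saving versus merchant GPUs."""
    saving = merchant_unit_cost - custom_unit_cost
    if saving <= 0:
        raise ValueError("custom part must cost less per unit to ever break even")
    return math.ceil(design_cost / saving)

# e.g. $500M in design and verification, a $30k merchant GPU, and a
# $12k custom part at the foundry:
units = breakeven_units(500e6, 30_000, 12_000)
print(units)  # 27778 chips before the design pays for itself
```

A lab deploying at Anthropic's scale would need chips in the hundreds of thousands, well past a break-even of that order, which is why the calculus shifts as revenue grows.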

The Trend: Everyone Wants Their Own Chips

Anthropic isn't alone in considering this path. The move toward custom AI chips has become one of the defining trends of 2026, with virtually every major AI lab exploring vertical integration.

OpenAI has been rumored to be working with Broadcom on custom chip designs for over a year. The company has also partnered with Microsoft to secure dedicated capacity in the software giant's data centers, but that hasn't eliminated the desire for its own hardware. OpenAI's Sam Altman has reportedly been personally involved in chip sourcing efforts, even exploring the possibility of raising billions to build AI-focused semiconductor factories.

Meta has been the most public about its chip ambitions. The company has already deployed its own MTIA (Meta Training and Inference Accelerator) chips in its data centers and is working on more advanced designs. Meta's scale — it operates some of the largest AI training clusters in the world — makes the economics of custom silicon more favorable than for smaller players.

Google has been building its own TPUs for a decade and has shipped several generations of the chips. TPUs power Google's own AI models and are also offered to cloud customers, including Anthropic itself. Google's head start in custom silicon gives it a significant advantage in the current supply-constrained environment.

Amazon has developed Trainium and Inferentia chips for its AWS cloud platform. While these haven't achieved the same market penetration as NVIDIA's GPUs or Google's TPUs, they offer an alternative for price-sensitive customers and give Amazon leverage in negotiations with NVIDIA.

Even Microsoft, which has historically relied on partnerships rather than custom silicon, has reportedly been exploring chip designs with AMD. The company's massive investments in OpenAI have given it unique insight into the economics of AI infrastructure, and it clearly sees value in more vertical integration.

The message is clear: every major player in AI believes that controlling its own silicon is a competitive advantage. And Anthropic, despite its smaller scale and safety-focused culture, is feeling the same pressure.

The Challenges: Why Custom Chips Are Hard

But designing custom chips is easier said than done. The semiconductor industry is one of the most complex and capital-intensive in the world, and success is far from guaranteed.

First, there's the cost. Designing a modern AI chip from scratch costs hundreds of millions of dollars in engineering talent, software tools, and verification, and that's before a single chip is manufactured. The design process also takes years; by the time Anthropic's first custom chip was ready, the state of the art would have moved on.

Second, there's the expertise. Anthropic is an AI research company, not a semiconductor design firm. Building a chip team means hiring away talent from NVIDIA, AMD, Intel, and the handful of other companies that know how to do this work. That talent is already in short supply and commands premium salaries.

Third, there's the software ecosystem. NVIDIA's dominance isn't just about hardware — it's about CUDA, the software platform that makes it easy to write code for NVIDIA GPUs. Any custom chip Anthropic designs would need its own software stack, and getting developers to adopt a new platform is notoriously difficult. Google's TPUs have struggled with this despite years of investment and Google's massive resources.

Fourth, there's the manufacturing bottleneck. Even if Anthropic designs a perfect chip, it still needs someone to manufacture it. TSMC, the world's leading semiconductor foundry, is capacity-constrained and prioritizes its largest customers. Anthropic would be competing for fab space with Apple, NVIDIA, AMD, and Qualcomm — companies that order chips in volumes orders of magnitude larger than Anthropic could commit to.

Finally, there's the risk of obsolescence. AI hardware is evolving rapidly. A chip design that makes sense today might be outdated in two years as model architectures change and new techniques emerge. NVIDIA can spread that risk across thousands of customers. Anthropic would be betting the company on getting it right.

What This Means for the AI Landscape

If Anthropic does move forward with custom chips, it would mark a significant shift in the AI industry's structure. The company has positioned itself as the responsible alternative to OpenAI — more cautious, more focused on safety, less driven by commercial pressure. Designing custom silicon would be a decidedly aggressive, capital-intensive move that looks more like the "move fast" philosophy Anthropic has criticized.

But the reality is that Anthropic may not have a choice. The chip shortage isn't going away. NVIDIA's dominance is creating a tax on the entire AI industry. And as the gap between supply and demand widens, even safety-conscious labs are being forced into vertical integration.

The broader implication is that AI is becoming a hardware game as much as a software one. The companies that control their own silicon — Google with TPUs, Meta with MTIA, potentially Anthropic with its own designs — will have structural advantages over those that don't. They'll be able to train larger models, serve more users, and potentially offer lower prices.

For NVIDIA, the trend is both a threat and an opportunity. The company will lose some business to custom chips, but it will also benefit from the overall growth in AI compute demand. And NVIDIA's ecosystem advantage — CUDA, its developer tools, its established supply chains — will be hard for anyone to replicate.

🔥 Our Hot Take: This Is How NVIDIA Loses

Anthropic's potential move into custom chips isn't just about one company's supply chain strategy — it's a signal that the NVIDIA monopoly is cracking. And that's good for everyone except NVIDIA shareholders.

For the past two years, NVIDIA has been the pick-and-shovel play of the AI boom. Every AI lab, every startup, every enterprise trying to build AI capabilities has had to pay NVIDIA's tax. The company's GPUs carry margins that would make a luxury goods company blush, and customers have had no choice but to pay up.

But monopolies don't last forever in technology. The same dynamics that made NVIDIA indispensable — the need for specialized AI compute, the network effects of CUDA, the massive capital requirements for chip design — are now pushing its customers to find alternatives. And when your customers are companies with $30 billion annualized revenue run rates, they have the resources to build those alternatives.

Anthropic's exploration of custom chips is particularly significant because of what it represents. This isn't some cash-strapped startup trying to save money — this is one of the best-funded, most technically sophisticated AI labs in the world saying that NVIDIA's offerings aren't good enough. When Claude's creators are looking for alternatives, you know the market is broken.

The irony is that NVIDIA created this problem for itself. By charging monopoly prices and failing to meet demand, the company has given its customers both the motivation and the financial means to build alternatives. A large share of the compute spending behind Anthropic's $30 billion revenue run rate flows to NVIDIA hardware today. If even a fraction of that gets redirected to custom silicon, the economics of chip design start to look very different.

We're entering the fragmentation phase of AI hardware. Instead of one dominant platform, we'll have multiple competing ecosystems — Google's TPUs, Meta's MTIA, Amazon's Trainium, potentially Anthropic's custom designs, and yes, still NVIDIA's GPUs. That fragmentation will create inefficiencies and compatibility headaches. But it will also create competition, and competition will eventually bring prices down.

NVIDIA won't disappear. The company's technology and ecosystem are too entrenched for that. But the days of NVIDIA being the only game in town are ending. Anthropic's chip exploration is just one more data point in that trend. The AI hardware market is opening up, and the winners will be the companies that can build the best models on whatever silicon they can get — whether it comes from Santa Clara or their own design labs.
