
Amazon’s Chip Business Valued at $50 Billion, Could Expand Sales

Summary

– Amazon’s custom chip business (Graviton, Trainium, Nitro) generates over $20 billion in annualized revenue and is growing at triple-digit rates.
– CEO Andy Jassy defends a planned $200 billion in 2026 capital expenditure, stating it is backed by existing customer commitments like a major deal with OpenAI.
– The custom silicon, particularly Trainium AI accelerators, offers significant price-performance advantages and is largely sold out or pre-subscribed for future versions.
– Jassy signals Amazon may begin selling its custom chips directly to third parties, a shift from only offering them through its cloud services.
– The letter frames this within broader AI growth, noting AWS’s AI revenue run rate exceeded $15 billion and its Bedrock service saw rapidly increasing use.

In a recent communication to shareholders, CEO Andy Jassy revealed the staggering scale and strategic importance of Amazon’s internal silicon division. The custom chip operation, encompassing the Graviton, Trainium, and Nitro product lines, now generates over $20 billion in annualized revenue and is expanding at a triple-digit pace. Jassy posited that if this unit operated as an independent entity, its market value could approach $50 billion. He further hinted at a potential strategic pivot, suggesting Amazon may eventually sell these chips directly to other companies, a move that would fundamentally alter its competitive stance in the semiconductor industry.

Jassy opened his financial discussion by directly confronting skepticism around the company’s massive spending plans. He firmly stated that Amazon’s projected $200 billion capital expenditure for 2026 is not speculative. This investment, he argued, is a calculated bid for market leadership that will ultimately drive significantly larger future business, operating income, and free cash flow. This justification comes after a year where Amazon’s free cash flow declined substantially, primarily due to a major increase in capital spending focused on AI infrastructure.

The foundation for this aggressive investment, according to Jassy, is pre-existing customer demand. He indicated that a large portion of the planned 2026 capital expenditure is already backed by firm commitments. A key example cited is OpenAI, which has committed over $100 billion to AWS. This agreement expands a prior partnership and includes OpenAI’s plan to utilize roughly two gigawatts of Trainium capacity through Amazon’s cloud. With SoftBank, a major OpenAI investor, helping finance this infrastructure build-out, a significant portion of the demand Jassy points to is effectively underwritten, validating the company’s expenditure strategy.

Amazon’s custom silicon strategy is built on three pillars. The Graviton custom CPU offers what Jassy claims is 40% better price-performance than comparable x86 processors from Intel and AMD. Adoption is widespread, with 98% of the top EC2 customers now using it. Demand is so robust that two major clients recently inquired about purchasing all available Graviton capacity for the coming year, a request Amazon denied.

The Trainium accelerator is Amazon’s answer to Nvidia in the AI training and inference arena. The Trainium2 chip is largely sold out, while the newer Trainium3, offering further performance gains, is nearly fully subscribed, with clients like Uber migrating workloads. Even Trainium4, which is about 18 months from broad release and will feature compatibility with Nvidia’s NVLink technology, has seen substantial advance reservations. Jassy emphasized that at scale, using Trainium will save the company “tens of billions” in annual capital expenditure and provide a major operating margin advantage compared to relying on third-party chips for inference. This structural cost advantage is central to the rationale for the massive infrastructure spending, especially as inference workloads make up a growing share of AI compute. The third pillar, Nitro, is the custom chip that secures and manages AWS’s virtualization layer.

While outlining Amazon’s ambitions, Jassy was measured in discussing Nvidia. He affirmed a “strong partnership” and acknowledged that many customers will continue to choose Nvidia. However, he stated that a “new shift has started,” driven by customer desire for better price-performance. Amazon’s chips compete from within its own cloud ecosystem rather than on the open market. The design of Trainium4, which incorporates Nvidia’s NVLink Fusion, is telling: it creates a bridge for customers to use both Trainium and Nvidia GPUs together, preserving flexibility for those invested in Nvidia’s software.

Perhaps the most significant signal in the letter was a brief comment on future strategy. Jassy noted that due to intense demand, it is “quite possible” Amazon will begin selling racks of chips to third parties. Currently, the company only monetizes its silicon through its EC2 cloud services. A shift to direct sales would place Amazon squarely in the merchant silicon market alongside competitors like Nvidia and AMD, allowing its chip business to be evaluated separately from its cloud revenue.

The shareholder letter positioned the chip business within a broader AI narrative. Amazon Bedrock, the service providing access to foundation models, processed more tokens in the first quarter of this year than in all previous periods combined, with inference volumes surging monthly. AWS’s AI revenue run rate surpassed $15 billion, growth Jassy described as 260 times faster than AWS experienced at a similar stage in its history.

Jassy also highlighted Amazon Leo, the satellite internet service, as a direct competitor to SpaceX’s Starlink, noting contracts with major airlines, telecoms, and NASA. The disclosures around both satellites and chips support a common argument, that Amazon is constructing infrastructure at a scale and across domains that the market has not fully appreciated. While the letter did not address emerging legal challenges, such as a class action regarding AI training data, its core message is clear. As AI infrastructure became the paramount capital allocation question for the tech industry last year, Jassy’s communication asserts that Amazon identified and committed to the winning strategy earlier and more decisively than most observers have recognized.

(Source: The Next Web)
