
Peak XV Backs C2i to Solve AI’s Power Bottleneck

Summary

– Power, not compute, is now the main bottleneck for scaling AI data centers, creating a market for efficiency solutions.
– The Indian startup C2i Semiconductors has raised $15 million to develop integrated, system-level power delivery systems to reduce energy losses.
– Data center power demand is projected to surge dramatically, with current power conversion processes wasting 15-20% of energy.
– C2i’s integrated “grid-to-GPU” platform aims to cut energy losses by about 10%, significantly improving data center economics and total cost of ownership.
– The startup’s technology and market thesis will be tested in the coming months as its first silicon designs return for validation with potential customers.

The race to build more powerful artificial intelligence is hitting a fundamental wall: electricity. Access to power, rather than raw compute, is fast becoming the primary bottleneck for scaling AI data centers. This shift has drawn the attention of major investors like Peak XV Partners, which is now backing C2i Semiconductors, an Indian startup developing integrated power solutions aimed at slashing energy waste and improving the financial viability of massive AI infrastructure.

C2i, which stands for control, conversion, and intelligence, recently secured $15 million in a Series A funding round. Peak XV Partners led the investment, with contributions from Yali Deeptech and TDK Ventures. This brings the two-year-old company’s total funding to $19 million. The urgency behind this investment is clear. Global data center energy demand is skyrocketing. According to a BloombergNEF report, electricity consumption from these facilities could nearly triple by 2035. Goldman Sachs Research offers a similarly staggering projection, estimating a 175% surge in data center power demand by 2030 compared to 2023 levels, an increase equivalent to adding a new top-ten power-consuming nation to the global grid.

A significant portion of this strain stems not from generating electricity, but from inefficiently converting it within the data center itself. High-voltage power must be stepped down by a factor of thousands, through a chain of conversion stages, before it can safely reach the GPUs that drive AI workloads. This complex conversion process currently wastes between 15% and 20% of all incoming energy, according to C2i’s co-founder and CTO, Preetam Tadeparthy. He notes that power delivery voltages are already climbing from 400 volts to 800 volts and are likely to go even higher, exacerbating the efficiency challenge.
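To see why a chain of conversions wastes that much energy, it helps to note that modest per-stage losses compound multiplicatively. The sketch below is a rough, hypothetical illustration, not C2i’s architecture: the stage names and per-stage efficiencies are assumptions chosen to show how a handful of conversions, each in the mid-to-high 90s of percent efficiency, can land in the 15-20% loss range the article cites.

```python
# Back-of-envelope sketch (not C2i data): how per-stage conversion losses
# compound across a multi-stage power delivery chain.
# Stage names and efficiencies are illustrative assumptions.

stages = {
    "UPS / rectification": 0.97,
    "480V -> 48V rack conversion": 0.96,
    "48V -> 12V intermediate bus": 0.97,
    "12V -> ~1V point-of-load (VRM)": 0.93,
}

end_to_end = 1.0
for name, eff in stages.items():
    end_to_end *= eff
    print(f"{name:<32} stage efficiency {eff:.0%}, cumulative {end_to_end:.1%}")

print(f"\nEnd-to-end efficiency: {end_to_end:.1%}")
print(f"Energy lost before reaching the GPU: {1 - end_to_end:.1%}")
# With these illustrative numbers, roughly 16% of incoming energy is lost,
# in line with the 15-20% range cited above.
```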

To tackle this problem, C2i is taking a holistic approach. Founded in 2024 by a team of former Texas Instruments power executives, the company is redesigning power delivery as a single, plug-and-play “grid-to-GPU” system. This integrated platform spans the entire journey from the data center’s power bus to the processor itself. By treating power conversion, control, and advanced packaging as one cohesive unit, C2i estimates it can cut end-to-end energy losses by roughly 10 percentage points. In practical terms, this means saving about 100 kilowatts for every megawatt consumed, which also lowers associated cooling costs and can improve overall GPU utilization. These efficiency gains translate directly into lower total cost of ownership, higher revenue potential, and improved profitability for data center operators.
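A quick back-of-envelope calculation makes the per-megawatt figure concrete. The numbers below are illustrative assumptions (an average industrial electricity price and round-the-clock operation), not figures from C2i or its investors.

```python
# Illustrative arithmetic behind the "100 kilowatts per megawatt" claim.
# The electricity price and utilization below are assumptions, not article figures.

facility_load_mw = 1.0      # per megawatt drawn from the grid
savings_fraction = 0.10     # ~10% of incoming power recovered, per the article
hours_per_year = 8760       # assumes continuous, round-the-clock operation
price_per_kwh = 0.08        # assumed average industrial electricity price, USD

power_saved_kw = facility_load_mw * 1000 * savings_fraction
energy_saved_kwh = power_saved_kw * hours_per_year
annual_savings_usd = energy_saved_kwh * price_per_kwh

print(f"Power saved per MW:  {power_saved_kw:.0f} kW")
print(f"Energy saved per MW: {energy_saved_kwh / 1e6:.2f} GWh per year")
print(f"Cost saved per MW:   ${annual_savings_usd:,.0f} per year, before cooling savings")
```

Scaled to a 100-megawatt AI campus, the same assumptions would put the figure at roughly $7 million a year before counting the reduced cooling load.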

For investors like Peak XV Partners, the economic imperative is undeniable. After the massive upfront capital investment in servers and facilities, energy costs become the dominant ongoing expense for data centers. Rajan Anandan, Managing Director at Peak XV, emphasized that even incremental efficiency improvements hold enormous value at scale. Reducing energy costs by 10 to 30 percent represents tens of billions of dollars in potential savings, making it a critical frontier for AI infrastructure economics.
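To get a rough sense of where a figure in the tens of billions could come from, the same arithmetic can be scaled up to projected global data center demand. Every input in the sketch below is an illustrative assumption, chosen to be broadly consistent with the near-tripling projection cited above, rather than a figure from Peak XV.

```python
# Rough, hypothetical fleet-scale sketch of the savings opportunity.
# Both inputs are illustrative assumptions, not figures from the article.

projected_annual_twh = 1200   # assumed global data center consumption, mid-2030s
price_per_kwh = 0.10          # assumed average electricity price, USD

annual_energy_bill_usd = projected_annual_twh * 1e9 * price_per_kwh  # 1 TWh = 1e9 kWh

for cut in (0.10, 0.20, 0.30):
    savings = annual_energy_bill_usd * cut
    print(f"{cut:.0%} lower energy costs -> ~${savings / 1e9:.0f}B per year")
```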

The startup’s claims are about to be put to the test. C2i expects its first two silicon designs to return from fabrication between April and June. Following this, the company plans to validate performance with data center operators and hyperscalers who have already expressed interest in reviewing the results. Based in Bengaluru, C2i has assembled a team of roughly 65 engineers and is establishing customer-facing operations in the United States and Taiwan to prepare for early deployments.

Breaking into the data center power delivery market is notoriously difficult. It is a segment dominated by large, entrenched incumbents with deep financial resources and qualification cycles that can last for years. While many newer companies focus on optimizing individual components, C2i’s end-to-end redesign requires the simultaneous coordination of silicon design, packaging, and system architecture, a capital-intensive strategy few startups attempt. Anandan acknowledges that execution is now the key question, with all startups facing inherent technology, market, and team risks. However, he believes the feedback loop for C2i will be relatively short, suggesting that the coming six months, marked by silicon returns and early customer validation, will be decisive.

This investment also highlights the maturation of India’s semiconductor design ecosystem. Anandan draws a parallel to the early days of e-commerce in the country, suggesting the semiconductor sector is just beginning a similar growth trajectory. He points to India’s deep pool of engineering talent, home to a growing share of the world’s chip designers, and government-backed design-linked incentives that have lowered the cost and risk of producing new silicon. These conditions are making it increasingly viable for Indian startups to build globally competitive semiconductor products from the ground up, rather than serving solely as offshore design centers for foreign companies.

The coming months will determine if these favorable conditions can yield a world-class product. As C2i begins validating its system-level power solutions with potential customers, the industry will be watching closely to see if this ambitious approach can successfully loosen AI’s tightening power bottleneck.

(Source: TechCrunch)

Topics

power efficiency, semiconductor startup, data centers, AI infrastructure, power conversion, industry evolution, energy demand, venture capital, system-level design, Indian semiconductor ecosystem