Infineon Raised Its Full-Year Guidance. The AI Power Ramp Made It Real.
Infineon just gave the clearest revenue proof yet that the AI power infrastructure buildout is real — and it's betting its future on it.
The German chipmaker reported Q2 FY2026 revenue of €3.812 billion on May 6, up 6 percent year-on-year, and raised its full-year guidance to above €16 billion with segment margins expected to reach around 20 percent — up from the high-teens previously forecast. Adjusted free cash flow guidance moved up to €1.65 billion from €1.4 billion. These aren't the numbers of a company struggling to find its footing in the AI era.
The more interesting number is in the forward guidance: Infineon expects around €1.5 billion in revenue from AI data center applications in fiscal 2026, rising to approximately €2.5 billion in fiscal 2027. That's a 67 percent jump in twelve months. For a segment that didn't exist as a named line item until recently, that's a rate of build that makes the power-infrastructure layer a concrete business, not a slide.
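The 67 percent figure follows directly from the two guidance numbers. A quick back-of-the-envelope check, using only the figures reported above:

```python
# Sanity check on the guided AI data center revenue ramp.
# The two inputs come from Infineon's stated guidance; the rest is arithmetic.
fy2026 = 1.5  # EUR billions, guided AI data center revenue, fiscal 2026
fy2027 = 2.5  # EUR billions, guided for fiscal 2027

growth = (fy2027 - fy2026) / fy2026
print(f"Guided year-on-year growth: {growth:.0%}")
```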
"The expansion of power infrastructure is gaining momentum and is becoming an increasingly important growth driver for our industrial business," CEO Jochen Hanebeck said on the earnings call. He wasn't underselling it: "Our power supply solutions for AI data centers are in very high demand."
The structural signal came in the same release. Effective July 1, Infineon will reorganize from four divisions to three — Automotive, Power Systems, and Edge Systems. Power Systems, which will account for roughly 30 percent of revenue on a pro forma basis, is the new center of gravity. The company is folding what used to be Green Industrial Power and the old Sensing Systems divisions into a structure built around where the growth is.
This matters because of what power semiconductors actually do in an AI data center. A facility running tens of thousands of GPUs requires a cascade of power conversion: from grid AC down to the precise DC voltages that GPUs, memory, and networking hardware need, with each conversion stage requiring switching components that manage heat, efficiency, and reliability. It's not a glamorous supply chain, but it's a critical one, and Infineon is one of the few companies that makes the full stack.
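The cascade described above is multiplicative: losses at each conversion stage compound, and every lost percentage point becomes heat the facility must also spend energy removing. The sketch below illustrates the compounding; the stage names and efficiency figures are illustrative assumptions, not Infineon specifications or actual data-center measurements.

```python
# Illustrative model of a data-center power delivery chain. Each stage's
# efficiency is an assumed placeholder value; real figures vary by topology
# and vendor. The point is that end-to-end efficiency is the product of
# all stages, so small per-stage losses compound.
stages = {
    "grid AC -> high-voltage DC rectification": 0.975,
    "HVDC -> 48 V rack distribution":           0.97,
    "48 V -> 12 V intermediate bus":            0.96,
    "12 V -> ~1 V GPU core rail":               0.92,
}

end_to_end = 1.0
for name, efficiency in stages.items():
    end_to_end *= efficiency
    print(f"{name}: {efficiency:.1%} (cumulative: {end_to_end:.1%})")

# Whatever fraction is lost (1 - end_to_end) leaves the chain as heat,
# which the facility's cooling plant must then remove at further cost.
print(f"Power lost to conversion: {1 - end_to_end:.1%}")
```

With these assumed numbers, roughly one watt in six never reaches the silicon, which is why faster, cooler switching components are a buyable line item for hyperscalers rather than a rounding error.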
The materials question — GaN versus SiC — is where the real engineering trade-offs live. Gallium nitride switches faster and handles higher frequencies, which matters for the dense power delivery networks that AI racks require. Silicon carbide operates at higher voltages and temperatures, better suited for the grid interconnection side. Both are in Infineon's portfolio, both are ramping, and the earnings report is the first concrete revenue evidence that the supply chain for these materials actually exists at commercial scale.
Not everything is running at full speed. High-voltage power semiconductors for e-mobility — roughly 7 percent of automotive revenue — were described on the call as delivering unacceptably low profitability. The broader automotive business is growing, driven by software-defined vehicles, but EV power electronics remain a drag that Infineon hasn't yet solved.
The geopolitical dimension is worth noting. Infineon is a European company with manufacturing in Germany, Austria, and Malaysia. Unlike Nvidia or AMD, it doesn't sell into AI data centers through a single flagship product — it sells the components that make the infrastructure work. As the AI race drives capital expenditure at Microsoft, Amazon, Google, and Meta into the hundreds of billions, the companies supplying the power conversion layer are the picks-and-shovels play. Infineon just reported that the picks-and-shovels business is growing at 67 percent per year.
The reorganization takes effect July 1. The AI power ramp takes longer to build out — but it's no longer a projection. It's a line item.