AMD CEO: processors with Xilinx AI engines expected in 2023

This article is part of TechXchange: Cutting-edge AI.

AMD said it plans to integrate Xilinx’s differentiated IP into future generations of its central processing units (CPUs), in a bid to better compete with Intel and NVIDIA in data centers and even in the PC market.

The Santa Clara, Calif.-based company will integrate Xilinx’s artificial intelligence (AI) engines into its processor portfolio to bolster its chips’ AI inference capabilities, AMD CEO Lisa Su said on the company’s second-quarter conference call with analysts. The first chips in the series are expected to roll out in 2023.

The move is the latest step in AMD’s ambitions to become an even more sprawling semiconductor giant, following its $35 billion deal to buy Xilinx, which became part of the company in January. Su said Xilinx helps diversify its offerings with a wide range of computing engines — those suited to artificial intelligence, for example, or data center networking — and gives it a longer list of customers to sell to.

“We now have the industry’s best portfolio of high-performance, adaptive computing engines, and we see opportunities to leverage our broad technology portfolio to deliver even better products,” she said.

Intelligent engines

AI engines already ship in Xilinx’s Versal family of Adaptive Compute Acceleration Platforms (ACAPs) to take on Intel, Marvell and NVIDIA in markets such as cloud servers and networking.

The AI engines also equip purpose-built chips to run real-time workloads such as image recognition in embedded and edge devices, ranging from cars and industrial and medical robots to aerospace and defense systems such as satellites.

The Versal series contains scalar engines (Arm CPU cores), intelligent engines (AI engines and DSP blocks), and the same kind of programmable logic at the heart of its FPGAs. A programmable network-on-chip (NoC) ties everything together on a single system-on-chip (SoC), along with hard cores for connectivity (PCIe and CCIX), networking (Ethernet), memory, security, and I/O.

Xilinx’s Vivado ML software stack is used to reconfigure the programmable logic in Versal chips, while its Vitis and Vitis AI development tools aim to fine-tune software to run on the Versal platform.
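To give a rough feel for how software targets those AI engines today, here is a minimal sketch using the Python bindings of the Vitis AI Runtime (VART), which run a compiled model on a deployed accelerator. The model file name, the DPU subgraph lookup, and the all-zero placeholder input are illustrative assumptions, not details from the article.

```python
# Minimal Vitis AI Runtime (VART) inference sketch.
# Assumes a target image with the xir and vart Python packages installed
# and a model already compiled to an .xmodel file ("model.xmodel" is a placeholder).
import numpy as np
import vart
import xir

# Load the compiled graph and pick the subgraph mapped to the DPU (the AI-engine accelerator).
graph = xir.Graph.deserialize("model.xmodel")
subgraphs = graph.get_root_subgraph().toposort_child_subgraph()
dpu_subgraph = next(
    s for s in subgraphs
    if s.has_attr("device") and s.get_attr("device").upper() == "DPU"
)

# Create a runner bound to that subgraph.
runner = vart.Runner.create_runner(dpu_subgraph, "run")

# Allocate input/output buffers with the shapes the model expects.
# The data type depends on how the model was quantized; int8 is used here as an assumption.
in_tensor = runner.get_input_tensors()[0]
out_tensor = runner.get_output_tensors()[0]
input_data = np.zeros(tuple(in_tensor.dims), dtype=np.int8)   # placeholder input
output_data = np.zeros(tuple(out_tensor.dims), dtype=np.int8)

# Submit the job asynchronously and wait for completion.
job_id = runner.execute_async([input_data], [output_data])
runner.wait(job_id)
print("output shape:", output_data.shape)
```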

“We have this AI engine that is already deployed in production in a number of embedded applications and edge devices,” said Victor Peng, former CEO of Xilinx and now president of AMD’s Adaptive and Embedded Computing business. “The same architecture can be scaled and integrated into the CPU product portfolio.”

He said the combined company is working to build a unified software stack to help customers leverage the AI prowess of Xilinx and AMD chips for inference and training in data centers and at the edge.

The Xilinx deal gives the semiconductor giant a “much broader set of offerings” in the AI hardware market, particularly on the inference side, complementing the AI acceleration offered by its CPUs and GPUs for data centers, Su added. “AI is a huge opportunity for us.”

Gaining ground

Business is booming at AMD. It has gained a performance edge over Intel with its latest generations of processors, in part by outsourcing production to TSMC, giving it access to process technologies years ahead of Intel’s fabs.

Overall, the company posted revenue of $5.9 billion in the second quarter, up 71% from the same quarter a year ago. Su pointed out that each of its businesses saw double-digit growth in the last quarter.

But it has regained some of its luster with investors in recent years by gobbling up market share in PC processors and GPUs, as well as in video game consoles, including Microsoft’s Xbox Series X, Sony’s PlayStation 5, and Valve’s Steam Deck.

AMD is also gaining traction in the data center market, where Intel has long dominated. It has stolen market share from Intel as its biggest rival scrambles to reclaim its chipmaking prowess, industry analysts say.

AMD says sales of its server processors have more than doubled in eight of the past 10 quarters, highlighting increased demand for EPYC processors from customers in the cloud, enterprise, and high-performance computing (HPC) markets.

The company’s sales in the cloud computing market are also soaring as tech giants such as Amazon, Google, and Microsoft in the United States and Alibaba and Baidu in China increase their investments in server hardware.

While Intel has struggled to move to more advanced technology nodes, AMD is throwing its weight behind TSMC and other third-party foundries to ramp up production of its most advanced PC and data center chips. The Silicon Valley company is also gearing up to roll out its next generation of processors, codenamed Genoa, later this year. It hopes the upgrade will help it gain more market share from Intel.

The prospect of strong demand for server processors, along with the addition of Xilinx, prompted the company to raise its full-year revenue growth outlook to 61%. It previously predicted sales would rise 31% in 2022.

Su said the company’s core business is still the processors and graphics chips used in laptops, desktops, and data centers. But Xilinx’s families of programmable chips give it “many levers for growth” now and in the future.

Diversification plan

One of its broader strategies of late has been to amass a wide range of chips that can be mixed and matched to improve performance and power efficiency for customers with widely varying compute needs.

She explained, “What we see in terms of growth going forward is that there will be more customization around solutions for large customers, whether they are cloud enterprises, large telecom operators or even edge opportunities.”

To continue its diversification strategy, AMD is also acquiring network chip startup Pensando Systems in a $1.9 billion deal. The deal would add a line of data processing units (DPUs) and a software stack that would give it a wide range of storage, security and networking technologies suitable for the data center.

“AMD’s whole strategy is to have the best compute engines and then assemble them into solutions for specific end markets,” Su said. “I think [having] the processors, GPUs, FPGAs, adaptive SoCs, and then the DPUs that we add from Pensando gives us an incredible range of capabilities.”

Su added, “Having all these compute engines will allow us to optimize these solutions together.”

The company still faces questions about whether it can continue to compete with Intel and execute on its strategy given continued supply chain turmoil and capacity limitations at TSMC and other foundries.

But it is working through those supply challenges, AMD executives said. “We are working with the larger scale of AMD to try to bring more supply on board and continue to increase overall capacity to support the next few very strong quarters,” Su said.

