Source – marktechpost.com
AMD’s AI Chip Launch Targets Nvidia’s Stronghold
On Thursday, AMD introduced a new artificial intelligence (AI) chip, the Instinct MI325X, marking a strategic move to compete with Nvidia's dominant data center graphics processing units (GPUs). The chip is expected to begin production by the end of 2024, AMD announced during the product launch event. The release is seen as a direct challenge to Nvidia, which has maintained a commanding lead in the GPU market as demand for AI technology continues to rise.
GPUs, like those produced by Nvidia, are essential for training and running advanced generative AI models such as OpenAI’s ChatGPT. These models require large data centers filled with GPUs to handle their massive computational demands. While Nvidia currently dominates this sector, AMD holds the second position and is determined to gain a larger share of the market, which it predicts will reach $500 billion by 2028.
During the event, AMD’s CEO, Lisa Su, expressed optimism about the growing demand for AI technology. “AI demand has actually continued to take off and exceed expectations. It’s clear that the rate of investment is growing everywhere,” Su said. While no new major cloud or internet customers were revealed, AMD has previously mentioned that companies like Meta, Microsoft, and OpenAI use its AI GPUs for specific applications.
Competition Heats Up Between AMD and Nvidia
The launch of the MI325X is a significant step in AMD’s efforts to close the gap with Nvidia, especially as both companies continue to push the boundaries of AI chip technology. The MI325X will compete directly with Nvidia’s forthcoming Blackwell chips, set to start shipping in early 2025. However, a key hurdle for AMD lies in CUDA, Nvidia’s proprietary programming platform, which has become the industry standard for AI developers. This ecosystem effectively locks developers into Nvidia’s hardware, making it difficult for them to transition to AMD products.
In response, AMD has been enhancing its own software stack, known as ROCm, which aims to simplify the process of moving AI models onto its chips. According to AMD, its accelerators offer a competitive edge in inference use cases, where trained AI models generate content or make predictions. One of the highlights of the new chip is its ability to outperform some Nvidia chips in serving Meta’s Llama AI model, delivering up to 40% more inference performance on specific tasks, Su noted.
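To illustrate the portability argument AMD is making with ROCm, here is a minimal, hypothetical PyTorch sketch (not taken from AMD’s materials). ROCm builds of PyTorch expose the familiar torch.cuda device API via HIP, so high-level inference code written with Nvidia GPUs in mind can often run on AMD accelerators without source changes; the model layers and tensor sizes below are placeholders.

```python
# Minimal sketch: the same PyTorch code path can target Nvidia (CUDA) or AMD
# (ROCm) hardware, because ROCm builds of PyTorch expose the torch.cuda API
# via HIP. Model architecture and sizes here are illustrative only.
import torch
import torch.nn as nn

# On a CUDA build this resolves to an Nvidia GPU; on a ROCm build, to an AMD
# accelerator such as an Instinct GPU. No AMD-specific code changes are needed.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = nn.Sequential(
    nn.Linear(4096, 4096),
    nn.ReLU(),
    nn.Linear(4096, 4096),
).to(device)

# Inference-style forward pass -- the "generating content or making
# predictions" workload the article refers to.
with torch.no_grad():
    batch = torch.randn(8, 4096, device=device)
    output = model(batch)

print(output.shape, device)
```

In practice, the harder part of switching vendors tends to sit below this level, in hand-written CUDA kernels and Nvidia-specific libraries, which is the gap ROCm tooling is meant to narrow.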
AMD’s strategy of releasing new AI chips annually, starting with the MI325X, is part of its broader effort to capitalize on the AI boom and compete more aggressively with Nvidia. The company plans to follow with the MI350 in 2025 and the MI400 in 2026.
Financial Impact and AMD’s Broader Plans
Despite the excitement surrounding the new chip, AMD’s stock dropped by 4% following the announcement, while Nvidia’s stock rose by 1%. AMD’s market presence in the AI chip sector remains significantly smaller than Nvidia’s, with Nvidia controlling over 90% of the data center AI chip market. However, investors may be drawn to AMD’s expanding role in AI, particularly if the MI325X launch proves successful.
In addition to AI chips, AMD remains a major player in central processing units (CPUs), which are at the core of nearly every server. The company reported that its data center sales more than doubled over the past year, reaching $2.8 billion in the June quarter. Of that, AI chips accounted for roughly $1 billion.
As part of its broader strategy to expand its footprint in data centers, AMD also introduced a new line of CPUs, called EPYC 5th Gen, which comes in various configurations. These CPUs range from low-cost, 8-core chips priced at $527 to 192-core processors intended for high-performance supercomputers, costing up to $14,813 per chip. According to AMD, these new CPUs are particularly well-suited for feeding data into AI workloads, a critical function for the growing AI landscape.
AMD’s efforts to compete with Nvidia, especially in AI technology, may face challenges, but with continued innovation and a steady cadence of product releases, the company is positioning itself to play a larger role in the future of AI and data centers.