AMD Announces AI Chips to Rival Nvidia


AMD has officially announced the MI300X, a chip designed for AI-oriented servers that will take on NVIDIA, with next year shaping up to be an important milestone. AMD is undoubtedly well positioned to benefit from the AI industry. The new chip's most distinctive feature is its 192 gigabytes (GB) of ultra-advanced memory, known as HBM3.

Many companies, including Microsoft, OpenAI, and Meta, have already committed to using the chip for AI training and inference. Next to NVIDIA, AMD still looks like a second-tier AI chipmaker, though the announcement reminded investors of the company's strong position in the technology industry. NVIDIA, whose surrounding software ecosystem remains richer, is up 225% since the start of the year.

AMD has announced new artificial intelligence chips.

NVIDIA has been leading the AI industry for some time, and many tech companies are looking for comparable alternatives at a lower cost and with more flexibility. Among the chip giant's challenger's early customers: OpenAI will use the MI300 with its Triton 3.0 programming framework, Meta will deploy it in its data centers, and Microsoft will use it in its cloud computing segment, Azure.

The MI300X is said to deliver the memory bandwidth that generative AI demands, along with leadership performance for LLM training and inference. AMD also presented AI-enabled hardware for personal computers (PCs). The PC market has been weak in recent years; however, AMD expects a recovery in this area driven by interest in AI. To that end, AMD shared details of the Ryzen 8040 series for next-generation machines, including the Ryzen 9 8945HS SoC, which it says offers 64% faster video editing and 37% faster rendering than its competitors.

AMD says the chips' industry-leading memory capacity and bandwidth are on par with existing competition for running data-model training, and its SoCs are expected to outperform existing AI chips. Finally, Oracle will also adopt AMD's accelerators, which could be enough to lift confidence in the company's AI development and processing story. AMD shares traded around $128.77, down less than 1%.

AMD will challenge Nvidia

Furthermore, Bloomberg reported that the AI chip industry could surpass $400 billion within four years. With this in mind, AMD aims to be a key player in the AI ecosystem by offering hardware that enables efficient processing of AI tasks, from generative workloads that produce realistic images and videos to personalized chatbots built on machine learning.

The event highlights the potential of the fast-growing AI market and the company's ability to drive new growth opportunities in the coming years. It also underscores the ongoing transformation of the computing industry, particularly around generative AI, as trillions of dollars of global data-center infrastructure are expected to shift from general-purpose to accelerated computing.

These new advanced AI SoCs are designed to handle massive workloads in AI applications. AMD is directly challenging NVIDIA's H100 AI chip, which is currently the market leader. According to reports, the US AI market reached a valuation of $103.7 billion last year and is expected to reach $594 billion by 2032, representing a compound annual growth rate of 19.1% from 2023.
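The market-sizing figures above can be sanity-checked with the standard CAGR formula. The snippet below is a minimal sketch, assuming the cited $103.7 billion figure refers to 2022 and the projection runs ten years to 2032, which is what makes the stated 19.1% rate come out:

```python
def cagr(start, end, years):
    """Compound annual growth rate between two values."""
    return (end / start) ** (1 / years) - 1

# Values in billions of USD, from the article; the 10-year span
# (2022 -> 2032) is an assumption, not stated explicitly in the text.
rate = cagr(103.7, 594.0, 10)
print(f"Implied CAGR: {rate:.1%}")  # roughly 19.1%
```

With a 9-year span (2023 to 2032) the implied rate would be closer to 21%, so the 10-year reading appears to be the one the report intends.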

MI300A accelerated processing unit (APU)

The company says the chip is built for training and running LLMs. It has 1.5x more memory capacity than the previous MI250X and, AMD claims, competes with NVIDIA's H100 as the highest-performing accelerator: at least 1.4x better performance and 1.6x the memory capacity when running Meta's 70-billion-parameter Llama 2 model. Aimed primarily at data centers, it combines CPU and GPU for faster processing, expanding AMD's total addressable market to $45 billion.

AMD also claims a 30x improvement in energy efficiency. The chip is now in production and is being built into systems such as the El Capitan supercomputer at Lawrence Livermore National Laboratory, built by Hewlett Packard Enterprise. The next generation of Strix Point NPUs is planned for 2024.

CEO Lisa Su called the MI300 the world's most powerful accelerator for generative AI, and she predicted that the market for AI accelerator chips in data centers will grow 70% annually to more than $400 billion by 2027. The company already faces fresh competition: Intel hosts its "AI Everywhere" event next week, with the main aim of showcasing its new data-center processors.
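The projection above can be cross-checked against the $45 billion total addressable market cited earlier. This is a minimal sketch, assuming that $45B figure is the 2023 starting point for Su's data-center accelerator forecast:

```python
# Compound the assumed $45B 2023 market at exactly 70% per year through 2027
# to see how close that lands to the $400B projection.
market = 45.0  # billions of USD, 2023 baseline (assumption)
for year in range(2024, 2028):
    market *= 1.70
print(f"2027 market at 70%/yr: ${market:.0f}B")  # about $376B
```

A flat 70% rate lands just under $400 billion, which is consistent with the "grow 70% annually to more than $400 billion" phrasing only if growth slightly exceeds 70%; the implied rate to hit $400B exactly is about 73% per year.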

Machine learning dominates financial headlines. Following the announcement, AMD shares rose 9.9% to $128.37, the highest since May this year. AMD could capture around 10% of the total AI chip market, versus NVIDIA's estimated share of more than 80%. AMD reportedly forecasts $2 billion in AI GPU sales in 2024, while NVIDIA posted more than $16 billion in data center sales in its latest quarter.
