AMD Instinct MI325X Accelerators Lead AI Performance
AMD Instinct MI325X accelerators are engineered to deliver outstanding performance and efficiency for demanding AI tasks spanning inference, fine-tuning, and foundation model training.
AMD Instinct MI325X accelerators deliver industry-leading memory capacity and bandwidth: 256GB of HBM3E supporting 6.0TB/s offers 1.8X the capacity and 1.3X the bandwidth of the H200.
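The quoted ratios follow directly from the two accelerators' published memory specifications. A minimal sketch of the arithmetic, assuming the H200's commonly cited figures of 141GB of HBM3e and 4.8TB/s (those numbers come from NVIDIA's spec sheet, not from this announcement):

```python
# MI325X figures as stated in this announcement.
mi325x_capacity_gb = 256
mi325x_bandwidth_tbs = 6.0

# H200 figures, assumed from NVIDIA's published specifications.
h200_capacity_gb = 141
h200_bandwidth_tbs = 4.8

capacity_ratio = mi325x_capacity_gb / h200_capacity_gb        # ~1.82, quoted as "1.8X"
bandwidth_ratio = mi325x_bandwidth_tbs / h200_bandwidth_tbs   # 1.25, quoted as "1.3X"

print(f"capacity: {capacity_ratio:.2f}X, bandwidth: {bandwidth_ratio:.2f}X")
```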
This leadership memory and compute delivers up to 1.3X the inference performance of the H200 on Mistral 7B at FP16, 1.2X on Llama 3.1 70B at FP8, and 1.4X on Mixtral 8x7B at FP16.
AMD Instinct MI325X accelerators are expected to be widely available in systems from a broad set of platform providers beginning in Q1 2025.
In addition, the AMD Instinct MI325X offers 1.3X higher peak theoretical FP16 and FP8 compute performance than the H200.
The AMD Instinct MI350 series will continue this memory-capacity leadership, offering up to 288GB of HBM3E memory per accelerator.
AMD continues to invest in software and the open ecosystem, bringing powerful new features and capabilities to the AMD ROCm open software stack.
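One practical consequence of the ROCm stack is that mainstream frameworks can target AMD accelerators with little code change. A minimal sketch of how a developer might check for a ROCm-enabled PyTorch build; this assumes a PyTorch ROCm wheel is installed, and degrades gracefully when it is not:

```python
# On ROCm builds, PyTorch reuses the torch.cuda namespace for AMD GPUs,
# and torch.version.hip is set (it is None on CUDA/CPU-only builds).
try:
    import torch
    has_rocm = torch.version.hip is not None
except ImportError:
    # PyTorch is not installed in this environment.
    has_rocm = False

if has_rocm and torch.cuda.is_available():
    for i in range(torch.cuda.device_count()):
        # Would report the accelerator name, e.g. an Instinct-series GPU.
        print(torch.cuda.get_device_name(i))
else:
    print("No ROCm-enabled GPU detected by this environment")
```

Because ROCm reuses the familiar `torch.cuda` API surface, existing CUDA-oriented training and inference scripts typically run unmodified on Instinct accelerators.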