Instinct™ MI325X
Supermicro unleashes the power of large-scale infrastructure with a server built on our proven AI building-block architecture and powered by 5th Gen AMD EPYC™ processors and AMD Instinct™ MI325X GPUs. This industry-standard universal baseboard (UBB 2.0) system houses 8 AMD Instinct MI325X accelerators with a total of 2 TB of HBM3E memory to help process the most demanding AI models.
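As a minimal sketch of what the 2 TB aggregate figure looks like from software, the snippet below enumerates the accelerators visible to a ROCm build of PyTorch (which exposes AMD GPUs through the torch.cuda interface) and sums their HBM capacity. Device names, counts, and reported sizes depend on the actual node configuration.

```python
import torch

def report_gpus():
    """Print each visible GPU and the aggregate on-package memory."""
    total_bytes = 0
    for i in range(torch.cuda.device_count()):
        props = torch.cuda.get_device_properties(i)
        total_bytes += props.total_memory
        print(f"GPU {i}: {props.name}, {props.total_memory / 2**30:.0f} GiB")
    # On a fully populated 8-GPU MI325X baseboard this approaches 2 TB.
    print(f"Aggregate GPU memory: {total_bytes / 2**40:.2f} TiB")

if __name__ == "__main__":
    report_gpus()
```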
MI325X Air- and Liquid-Cooled Systems
MI325X Air Cooled
MI325X Liquid Cooled

DP AMD 8U System with AMD Instinct MI325X 8-GPU

DP AMD 4U System with AMD Instinct MI325X 8-GPU and liquid cooling
Instinct™ MI300X/A
The MI300X 8-GPU system uses AMD Infinity Fabric™ Links to deliver up to 896 GB/s of peak theoretical P2P I/O bandwidth on an open-standard platform, with an industry-leading 1.5 TB of HBM3 GPU memory in a single server node. Native sparse matrix support is designed to save power, lower compute cycles, and reduce memory use for AI workloads.
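As an illustrative sketch of the GPU-to-GPU path described above, the snippet below copies a buffer directly between two devices on the same node. When peer-to-peer access is available over the Infinity Fabric links, the transfer stays on the GPU interconnect rather than being staged through host memory. It assumes a ROCm build of PyTorch and at least two visible devices; sizes and device indices are arbitrary examples.

```python
import torch

def p2p_copy(size_mib: int = 1024):
    """Copy a buffer from GPU 0 to GPU 1 and report P2P capability."""
    assert torch.cuda.device_count() >= 2, "need at least two GPUs"
    src = torch.empty(size_mib * 2**20, dtype=torch.uint8, device="cuda:0")
    dst = src.to("cuda:1")  # direct device-to-device transfer
    torch.cuda.synchronize()
    print("P2P capable:", torch.cuda.can_device_access_peer(0, 1))
    print(f"Copied {dst.numel() / 2**20:.0f} MiB to {dst.device}")

if __name__ == "__main__":
    p2p_copy()
```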
Data center-optimized multi-processor systems for HPC and AI inferencing. Supermicro's air-cooled 4U and liquid-cooled 2U quad-APU systems supporting the AMD Instinct™ MI300A, which combines CPUs and GPUs in a single package, leverage Supermicro's expertise in multiprocessor system architecture and cooling design, finely tuned to tackle the convergence of AI and HPC.
MI300X/MI300A Air- and Liquid-Cooled Systems
MI300X Air Cooled
MI300X Liquid Cooled
MI300A Air Cooled
MI300A Liquid Cooled

DP 8U 4-GPU OAM architecture with CPU-to-GPU interconnect

DP 4U 4-GPU OAM architecture with liquid cooling solution

DP 4U 4-APU architecture with AMD Infinity Fabric™ Link
