At Computex 2024, AMD provided greater insight into its AI hardware roadmap across both the data center and PCs. On the accelerator side, AMD introduced an annual release cadence (matching NVIDIA's) and plans to make the Instinct MI325X available in Q4, with a focus on memory capacity via 288GB of ultra-fast HBM3E. AMD also highlighted that the MI350 Series will arrive in 2025, delivering up to a 35x boost in AI inference performance versus the MI300 Series, while the MI400 accelerators will launch in 2026. Its Turin EPYC CPUs will be available in the second half of 2024. On the device side, AMD showcased its Ryzen AI 300 Series processors for Copilot+ PCs. Although AMD remains well behind NVIDIA, it is expected to generate at least $4B in GPU server revenue in CY24 during its first full year selling the offering, largely supported by its partnership with Microsoft. It remains to be seen whether AMD's GPU accelerators can gain traction with other cloud providers (e.g., Alphabet and AWS), which would help drive greater revenue in the category.