Qualcomm announced its AI200 and AI250 accelerators for data-center inference, with commercial availability targeted for 2026 and 2027, respectively, according to a company announcement.

Qualcomm says the AI250 introduces a near-memory computing architecture that delivers greater than 10x effective memory bandwidth compared with its predecessor.

The products are positioned for rack-scale inference deployments optimized for efficiency, with support for direct liquid cooling, PCIe, and Ethernet, according to Qualcomm’s data-center product materials. Each accelerator card supports up to 768 GB of LPDDR memory.

In a separate company release, Qualcomm said Saudi AI firm HUMAIN plans to deploy up to 200 megawatts of AI200/AI250 rack solutions in Saudi Arabia starting in 2026.

Qualcomm said the launch expands its enterprise AI infrastructure offerings alongside existing platforms; no pricing or third-party benchmarks were disclosed in the press materials.