Microsoft Azure's AI inference accelerator, Maia 200, aims to outperform Google's TPU v7 and AWS Inferentia with 10 petaflops of FP4 compute.
Nvidia’s data center chips have become the default engine for modern artificial intelligence, but they are not just faster versions of gaming graphics cards. The company’s AI accelerators strip away ...
Advanced Micro Devices is rapidly gaining share in the AI accelerator market, with 2024 data center revenue nearly doubling to $12.6B. The MI350 series, launching mid-2025 with 35x inference ...
China’s race to build homegrown artificial intelligence chips has collided head-on with Nvidia’s H200, the United States company’s latest workhorse for training and running large models. The result is ...
Qualcomm has announced its new AI accelerator chips, the AI200 and AI250. With this move into a new segment of the semiconductor industry, the company aims to compete directly with ...