Microsoft’s new Maia 200 inference accelerator enters this overheated market with a chip that aims to cut the price ...
While much attention has been on AMD's traditional server chips lately, investors will still be focused on what AMD says ...
On Wednesday, China approved imports of Nvidia’s H200 artificial intelligence chips for three of its largest technology ...
The Maia 200 AI chip is described as an inference powerhouse, meaning it could help AI models apply their knowledge to ...
These chip designers have been great investment options.
Today, we’re proud to introduce Maia 200, a breakthrough inference accelerator engineered to dramatically improve the economics of AI token generation. Maia 200 is an AI inference powerhouse: an ...
Intel's latest results make it clear that it still has work to do to become a notable player in AI chips.
OpenAI is racing to bring its own artificial intelligence chips into production, a move that could loosen Nvidia’s grip on ...
Microsoft is taking on Amazon and Google with the launch of its Maia 200 AI accelerator chip, calling the processor “the most performant, first-party silicon from any hyperscaler.” ...
Here are our picks for the top 10 edge AI chips with a bright future, across applications from vision processing to multimodal LLMs.
Last year, Nvidia agreed to install Navitas' GaN and SiC chips into its own AI data centers to support its next-gen 800 V ...
Data centers are eating up computing resources and pushing chipmakers toward AI-grade memory, tightening supply for Nintendo ...