Amazon is designing custom microchips, Trainium and Inferentia, for training and running generative AI models. These chips offer an alternative to Nvidia GPUs for building large language models. However, Microsoft and Google have moved faster to capture business from the generative AI boom: Microsoft has invested $13 billion in OpenAI, and Google has launched its own large language model, Bard. Even so, Amazon's custom silicon could give it an edge in generative AI over the long run. AWS remains the world's largest cloud computing provider, with a 40% market share in 2022.
The post How Amazon is racing to catch Microsoft and Google in generative A.I. with custom AWS chips appeared first on Balanced News Summary.