
How Amazon is racing to catch Microsoft and Google in generative A.I. with custom AWS chips

Amazon is designing custom microchips for generative AI: Trainium for training models and Inferentia for accelerating inference. These chips offer an alternative to Nvidia GPUs for training and running large language models. However, Microsoft and Google have moved faster to capture business from the generative AI boom, with Microsoft investing $13 billion in OpenAI and Google launching its own large language model, Bard. Despite this, Amazon's custom silicon could give it an edge in generative AI in the long run: AWS is the world's largest cloud computing provider, with 40% market share in 2022.
