
Nvidia launches chip aimed at data center economics

Semiconductor manufacturer Nvidia Corp has announced a new chip that can be digitally split up to run several different programs on one physical chip, a first for the company and one that matches a key capability found on many chips from Intel Corp and Advanced Micro Devices.

KEY POINTS

  • Nvidia announces a new chip that can be split to run several different programs on one physical chip
  • The new functionality matches a key capability of chips from Intel Corp and Advanced Micro Devices

The notion behind what the Santa Clara, California-based company calls its A100 chip is simple: Help the owners of data centres get every bit of computing power possible out of the physical chips they purchase by ensuring the chip never sits idle. The same principle helped power the rise of cloud computing over the past two decades and helped Intel build a massive data centre business.

When software developers turn to a cloud computing provider such as Amazon.com or Microsoft Corp for computing power, they do not rent a full physical server inside a data centre. Instead they rent a software-based slice of a physical server called a "virtual machine."

Such virtualisation technology came about because software developers realised that powerful and pricey servers often ran far below full computing capacity. By slicing physical machines into smaller virtual ones, developers could cram more software on to them, similar to the puzzle game Tetris. Amazon, Microsoft and others built profitable cloud businesses out of wringing every bit of computing power from their hardware and selling that power to millions of customers.
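To make the "Tetris" analogy concrete, here is a minimal Python sketch of the idea: virtual-machine requests are packed onto physical servers with a simple first-fit rule so that less purchased capacity sits idle. The server size, VM sizes and function names are illustrative assumptions, not any cloud provider's real scheduler.

```python
# Toy first-fit packing of virtual machines onto physical servers.
# All capacities and request sizes below are illustrative assumptions.

SERVER_CORES = 32  # assumed capacity of one physical server

def pack_vms(vm_requests):
    """Place each VM (sized in cores) on the first server with room, adding servers as needed."""
    servers = []  # each entry is the number of cores already allocated on that server
    for cores in vm_requests:
        for i, used in enumerate(servers):
            if used + cores <= SERVER_CORES:
                servers[i] = used + cores
                break
        else:
            servers.append(cores)  # no existing server fits; provision another

    return servers

requests = [4, 8, 2, 16, 4, 8, 2, 2, 12, 6]  # hypothetical VM sizes in cores
servers = pack_vms(requests)
utilisation = sum(servers) / (len(servers) * SERVER_CORES)
print(f"{len(requests)} VMs packed onto {len(servers)} servers, "
      f"{utilisation:.0%} of purchased capacity in use")
```

With these made-up numbers, ten VMs fit on two servers at full utilisation; without slicing, each workload might otherwise have occupied a lightly loaded machine of its own.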

But the technology has been mostly limited to processor chips from Intel and similar chips such as those from Advanced Micro Devices Inc. Nvidia has said that its new A100 chip can be split into seven "instances."

For Nvidia, that solves a practical problem. Nvidia sells chips for artificial intelligence tasks. The market for those chips breaks into two parts. "Training" requires a powerful chip to, for example, analyse millions of images to train an algorithm to recognise faces. But once the algorithm is trained, "inference" tasks need only a fraction of the computing power to scan a single image and spot a face.

Nvidia is hoping the A100 can replace both, serving as a single big chip for training and then being split into smaller chips for inference.
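The sketch below illustrates that dual-use pattern in plain Python. The Accelerator class and its method names are illustrative assumptions rather than Nvidia's actual software interface; only the seven-way split comes from the article.

```python
# Conceptual sketch (assumed names, not Nvidia's real API) of one chip being used
# either as a single large device for training or as several small inference slices.
from dataclasses import dataclass

@dataclass
class Accelerator:
    name: str
    max_instances: int = 7  # the seven-way split Nvidia describes for the A100

    def configure(self, mode: str):
        """Return the logical devices the chip presents in the given mode."""
        if mode == "training":
            return [f"{self.name}/full"]  # one big device for training
        if mode == "inference":
            return [f"{self.name}/instance-{i}"  # several small devices for inference
                    for i in range(self.max_instances)]
        raise ValueError(f"unknown mode: {mode}")

chip = Accelerator("a100-0")
print(chip.configure("training"))   # ['a100-0/full']
print(chip.configure("inference"))  # seven independent inference slices
```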

Customers who want to test the theory will pay a steep price of $200,000 for Nvidia's DGX server built around the A100 chips. In a call with reporters, CEO Jensen Huang argued the math will work in Nvidia's favour, saying the computing power in the DGX A100 was equal to that of 75 traditional servers that would cost $5,000 each.

"Because it's fungible, you don't have to buy all these different types of servers. Utilisation will be higher," he said. "You've got 75 times the performance of a $5,000 server, and you don't have to buy all the cables."

Via our content partners at Reuters. Reporting by Stephen Nellis in San Francisco. Editing by Leslie Adler.

This article was published by Platform Executive, the home of the platform economy.


