
IBM Develops New Chip for AI Inference at Greater Efficiency

IBM has announced the development of a new mixed-signal chip that combines phase-change memory and digital circuits. The company claims that this chip will match GPU performance in AI inference while operating at a much higher efficiency. The potential implications of this development are significant. If this chip gains popularity, it could help alleviate the increasing demand for GPUs used in AI processing and make them more available for gaming purposes.

This is not the first time IBM has produced a chip of this nature, but this one operates at a much larger scale. The chip demonstrates many of the building blocks needed to create a low-power analogue AI inference accelerator.

One of the major challenges in AI inference is the slow data transfer between memory and processing units, which consumes power and hampers processing speed. IBM’s chip tackles this problem differently: phase-change memory cells store the inference weights as analogue values and perform computations in place. This approach, known as analogue in-memory computing, puts compute and memory storage in the same location, resulting in reduced data transfer, lower power consumption, and improved performance.
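To make the idea concrete, here is a minimal simulation sketch of analogue in-memory matrix-vector multiplication (an illustration of the general technique, not IBM's actual design; the noise model and all function names are assumptions): weights live in a crossbar of memory cells as conductances, inputs arrive as voltages, and the product emerges as summed currents, so the weights never move to a separate compute unit.

```python
# Sketch of analogue in-memory computing: a matrix-vector product
# computed where the weights are stored. Illustrative only -- not
# IBM's implementation; the Gaussian programming noise is an
# assumed, simplified model of analogue imprecision.
import numpy as np

rng = np.random.default_rng(0)

def program_weights(weights, noise_std=0.01):
    """Store weights as cell conductances. Programming an analogue
    cell is imprecise, so add a small Gaussian perturbation."""
    return weights + rng.normal(0.0, noise_std, size=weights.shape)

def analog_mvm(conductances, inputs):
    """Apply inputs as voltages along the rows. By Ohm's law each
    cell passes voltage * conductance; by Kirchhoff's law each
    column sums those currents -- i.e. a dot product, computed
    in the memory array itself with no weight transfer."""
    return conductances.T @ inputs

weights = np.array([[0.2, -0.5],
                    [0.8,  0.1],
                    [-0.3, 0.4]])
g = program_weights(weights)   # weights "written" into the array
x = np.array([1.0, 0.5, -1.0])
y = analog_mvm(g, x)           # slightly noisy analogue result
y_exact = weights.T @ x        # ideal digital reference
```

The analogue result `y` differs from the exact product only by the small programming noise, which is the trade-off such chips accept in exchange for skipping the memory-to-compute data movement entirely.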

The weight matrices the chip supports are large and complex. More importantly, the efficiency case is clear: AI processing power consumption has become excessive, with AI inferencing racks reportedly consuming nearly 10 times the power of a regular server rack. A more efficient solution would therefore find a ready market.

As for gamers, the immediate implication is that if this in-memory computing technology takes off for AI inference, companies like Microsoft and Google may buy fewer GPUs from Nvidia, which could rekindle Nvidia’s interest in gaming and gamers.

The timeline for turning this chip into a commercial product that buyers could choose instead of GPUs remains uncertain, as IBM has not provided specific guidance. However, given the growing prominence of AI, an alternative to GPUs would be enthusiastically welcomed by gamers who made it through the cryptocurrency-mining GPU shortage only to face the power-hungry demands of AI inference.

The post IBM Develops New Chip for AI Inference at Greater Efficiency appeared first on TS2 SPACE.


