
Energy Consumption Rises Dramatically in Data Centers and AI: Can AI Go Green?

Artificial intelligence is anticipated to wield a transformative influence on virtually all aspects of life, rivaling the profound impact of the internet’s emergence. This sentiment is notably echoed on Wall Street, where the tech-centric Nasdaq index has surged by 26% year-to-date, driven by the fervor surrounding AI-related stocks.

Nevertheless, the remarkable ascent of AI is accompanied by a significant drawback: substantially increased energy consumption.

Consider OpenAI’s ChatGPT, for instance. Research conducted by the University of Washington indicates that the usage of ChatGPT for hundreds of millions of queries can consume approximately 1 gigawatt-hour daily, equivalent to the energy consumption of 33,000 households in the United States.
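The comparison above is easy to sanity-check. A minimal back-of-the-envelope calculation (using the figures cited in the research, plus the EIA’s average US household usage of roughly 29 kWh per day) looks like this:

```python
# Sanity check of the figures above: 1 GWh per day spread across
# 33,000 households implies ~30 kWh per household per day, which is in
# line with average US household electricity use (~29 kWh/day per EIA).
gwh_per_day = 1.0
households = 33_000

kwh_per_day = gwh_per_day * 1_000_000  # 1 GWh = 1,000,000 kWh
kwh_per_household = kwh_per_day / households
print(f"{kwh_per_household:.1f} kWh per household per day")  # ~30.3
```

The two figures are consistent, which lends the comparison some credibility.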

The Growing Energy Problem

According to Sajjad Moazeni, a professor of electrical and computer engineering at the University of Washington, a ChatGPT query is likely 10 to 100 times more energy-intensive than an email query, as he told Yahoo Finance.

Industry insiders argue that we are just scratching the surface of what lies ahead. Arijit Sengupta, the founder and CEO of Aible, a company specializing in enterprise AI solutions, remarked, “We are perhaps only at 1% of the AI adoption that will occur in the next two to three years. The world is on the brink of a significant energy crisis due to AI unless we address a few critical issues.”

Data centers are the central hubs of advanced computing: physical facilities housing the many processing units and servers essential to the cloud computing sector, primarily operated by industry giants like Google, Microsoft, and Amazon.

An energy-hungry Facebook data center under construction. Image Credit: Separ Blog

Energy Demands of Data Centers and the Role of GPUs

According to Angelo Zino, VP and senior equity analyst at CFRA Research, when contemplating the transition towards these larger foundational models, it becomes evident that data centers will ultimately demand significantly greater energy resources as a collective entity. He shared this opinion with Yahoo Finance.

In recent years, data centers have progressed from simple central processing units (CPUs) to more complex graphics processing units (GPUs). GPUs, such as those produced by Nvidia, are the most power-hungry devices in these facilities.

Over the coming years, GPUs will continue to be an important part of AI infrastructure. As Patrick Ward, vice president of marketing at the technology consultancy Formula Monks, points out, each processing cycle of a GPU requires 10 to 15 times the power of a CPU, making GPUs very energy-intensive.

Ward also noted that the world’s energy consumption will rise markedly, mainly because of AI, though in his view the resulting gains in people’s efficiency and output make the cost worthwhile.

A Massive Infrastructure Cost

Research conducted by Benjamin C. Lee, a professor specializing in electrical engineering and computer science at the University of Pennsylvania, along with Professor David Brooks from Harvard, has demonstrated that between 2015 and 2021, energy consumption in data centers increased by an average of 25% annually.

This increase occurred prior to the widespread recognition of generative AI and the surge in ChatGPT usage.

In the same timeframe, data from the US Energy Information Administration indicated a 7% annual growth rate in renewable energy deployment, although this figure is anticipated to rise with initiatives like the Inflation Reduction Act.

Addressing Energy Challenges in Cloud Computing and the Quest for Carbon Neutrality

According to Lee, “There is already a significant disparity in growth rates between data center energy consumption and the deployment of renewable energy.”

“We call it cloud computing; it feels like there’s no cost associated with it,” said Lee. “But there’s a huge infrastructure cost.”

To counteract such consumption, the major cloud providers like Google Cloud, Microsoft Azure, and Amazon Web Services all invest in renewable energy to match their annual electricity consumption. They hold net-zero pledges, meaning they remove as much carbon as they emit.

Microsoft’s Azure has touted its 100% carbon neutral status since 2012 and says that by 2030 it will be carbon negative. Amazon has said it expects to power its operations with 100% renewable energy by 2025, as part of its goal to reach net-zero carbon emissions by 2040. For its part, Google aims to achieve net-zero emissions across all of its operations by 2030.

Net zero doesn’t necessarily equate to complete carbon neutrality. “There will be moments during the day when there is insufficient sunlight or wind, necessitating drawing energy directly from the grid, which will supply a blend of energy sources,” explained Lee.

He is currently researching methods by which data centers can adapt or reschedule their computing tasks based on the availability of carbon-free energy.

“For instance, one approach could involve performing more intensive computations during daylight hours when ample solar energy is available, and reducing computational activity during nighttime,” Lee suggested.
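The scheduling idea Lee describes can be sketched in a few lines. This is a hypothetical illustration, not his actual research code; the hourly carbon-free fractions and the 50% threshold below are made-up example values.

```python
from datetime import datetime, timedelta

# Made-up grid profile: ~20% carbon-free energy at night, ~70% during
# the midday solar window (hours 9-16).
CARBON_FREE_BY_HOUR = {h: 0.2 for h in range(24)}
CARBON_FREE_BY_HOUR.update({h: 0.7 for h in range(9, 17)})

def schedule(job_is_flexible: bool, now: datetime, threshold: float = 0.5) -> datetime:
    """Run urgent jobs immediately; defer flexible jobs until the grid's
    carbon-free fraction crosses the threshold (looking one day ahead)."""
    if not job_is_flexible:
        return now
    t = now
    for _ in range(24):
        if CARBON_FREE_BY_HOUR[t.hour] >= threshold:
            return t
        t += timedelta(hours=1)
    return now  # no green window found; run anyway

start = schedule(True, datetime(2024, 1, 1, 2, 0))
print(start.hour)  # 9 -- a 2 a.m. flexible job is deferred to the solar window
```

Real systems would replace the static table with live grid carbon-intensity data, but the core trade-off (latency for flexible jobs in exchange for cleaner energy) is the same.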

Companies Pioneering Energy-Saving Solutions

As expected, the challenge of energy consumption has spurred the emergence of a new wave of companies dedicated to finding more efficient ways of utilizing AI models.

“We can achieve a remarkable one-third reduction in energy consumption for these kinds of AI workloads simply by transitioning to serverless technology,” explained Aible’s Sengupta to Yahoo Finance. Serverless technology operates by utilizing resources on-demand, enhancing energy efficiency.
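The intuition behind that claim is that an always-on model server burns power even while idle, whereas on-demand execution only pays for busy time. A minimal sketch with made-up wattage and utilization numbers (not figures from Aible) shows the shape of the saving:

```python
# Illustrative only: why on-demand ("serverless") execution can cut
# energy for bursty AI workloads. All numbers are invented examples.
def daily_kwh_always_on(idle_w: float, busy_w: float, util: float) -> float:
    """Energy for a server that idles whenever it is not busy."""
    hours_busy = 24 * util
    return (busy_w * hours_busy + idle_w * (24 - hours_busy)) / 1000

def daily_kwh_serverless(busy_w: float, util: float, overhead: float = 1.10) -> float:
    """Energy when paying only for busy time, plus ~10% cold-start overhead."""
    return busy_w * 24 * util * overhead / 1000

always_on = daily_kwh_always_on(idle_w=120, busy_w=400, util=0.25)
serverless = daily_kwh_serverless(busy_w=400, util=0.25)
print(f"always-on: {always_on:.2f} kWh/day, serverless: {serverless:.2f} kWh/day")
```

With these example numbers the on-demand model uses well under two-thirds of the always-on energy; the actual saving depends entirely on how bursty the workload is and how much the server draws at idle.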

Analysts emphasize that the drive to reduce expenses is naturally pushing the industry towards energy-efficient solutions.

“Whether it’s driven by environmental concerns, financial optimizations, investor pressures, or any other factor, we observe companies increasingly exploring strategies for improved efficiency. It’s a matter of operational costs. The more efficient you become, the lower your operational expenses,” noted Tegan Keele, the US climate data and technology leader at KPMG.

As noted by CFRA Research’s Zino, the beneficiaries in this sector are the operators of data centers. Zino emphasized that the aggregation of actual data utilization and the integration of these processes will likely be dominated by a select few companies.

He stated, “An increasing number of businesses are essentially opting to lease cloud space rather than undertaking the capital-intensive endeavor of constructing their own data centers. This trend is primarily due to the anticipated future increase in costs.”

Conclusion

The rapid rise of artificial intelligence (AI) comes with a significant energy consumption challenge, as evidenced by the growth of AI models like ChatGPT and data centers. The energy-intensive nature of AI raises concerns about sustainability.

Major cloud providers are investing in renewables, aiming for net-zero emissions, but challenges remain in matching energy supply with demand. Efforts to reduce energy consumption through serverless technology are emerging, and companies are focusing on energy-efficient AI solutions.

The industry is at the cusp of an energy-efficiency revolution to address the impending energy crisis.

The post Energy Consumption Dramatically Increase in Data Centers and AI: Can AI Go Green? first appeared on Business d'Or.


