Google Cloud identified the substantial energy demands of AI processing as a constraint early on, prompting the Alphabet unit to plan deliberately around energy procurement and consumption, according to CEO Thomas Kurian.
TL;DR
- Google Cloud CEO Thomas Kurian identified energy as a major constraint for AI processing.
- AI data centers can require as much electricity as 100,000 homes, with some facilities using 20 times that amount.
- Google Cloud's strategy includes diversifying energy sources, maximizing efficiency, and developing new energy-generation technologies.
- Power availability is crucial for AI advancement, with data center construction also posing a bottleneck.
Speaking at the Coins2Day Brainstorm AI conference in San Francisco on Monday, Kurian said the company, a key enabler of AI infrastructure, had been working on AI long before the advent of large language models and had taken a strategic, long-term view.
“We also knew that the most problematic thing that was going to happen was going to be energy, because energy and data centers were going to become a bottleneck alongside chips,” Kurian told Coins2Day’s Senior Editor. “So we designed our machines to be super efficient.”
The International Energy Agency has estimated that some AI-focused data centers require as much electricity as 100,000 homes, and a few of the largest facilities now under construction could use up to 20 times that amount.
At the same time, worldwide data center capacity will increase by 46% over the next two years, equivalent to a jump of almost 21,000 megawatts, according to real estate consultancy Knight Frank.
At the Brainstorm event, Kurian outlined Google Cloud's three-part strategy for ensuring it has enough energy to meet anticipated demand.
First, the company aims to diversify the energy sources powering AI operations as widely as possible. Many people assume any form of energy will do, but that assumption is wrong, Kurian said.
“If you’re running a cluster for training and you bring it up and you start running a training job, the spike that you have with that computation draws so much energy that you can’t handle that from some forms of energy production,” Kurian explained.
Second, he said, a key component of Google Cloud's approach is maximizing efficiency, for example by recycling energy within its data centers.
The company even uses AI in its monitoring systems to track the thermodynamic transfers needed to capture power already delivered to its data centers.
Third, Google Cloud is developing “some new fundamental technologies to actually create energy in new forms,” Kurian said, offering no additional details.
On Monday, utility NextEra Energy and Google Cloud announced an expanded partnership to build new U.S. data centers, along with new power-generation sites.
Technology executives have warned that power availability, alongside breakthroughs in chips and more advanced language models, is essential for AI's continued progress.
The ability to build data centers also presents a potential bottleneck; Nvidia CEO Jensen Huang recently highlighted China's edge over the U.S. in that area.
“If you want to build a data center here in the United States, from breaking ground to standing up an AI supercomputer is probably about three years,” he said at the Center for Strategic and International Studies in late November. “They can build a hospital in a weekend.”