Monday, November 03, 2025

Can We Afford to Power AI?

 

That is the question posed by Stephen Witt's article in this week's New Yorker, "Information Overload."  Witt, a biographer of Nvidia co-founder Jensen Huang, has toured several of the giant data centers operated by CoreWeave, the leading independent data-center operator in the U. S.  From these visits and interviews he brings news that may affect everyone in the country, whether or not you think you use artificial-intelligence (AI) services:  the need to power the growing number of data centers may send electric-power costs through the roof.

 

Witt gives a fascinating inside view of what actually goes on in the highly secure, anonymous-looking giant buildings that house the Nvidia equipment used by virtually all large-scale AI firms.  Inside are rack after rack of "nodes," each node holding four water-cooled graphics-processing units (GPUs), which are the workhorse silicon engines of current AI operations.  Thousands of these nodes are housed in each data center, and each runs at top speed, emphasizing performance over energy conservation. 

 

Gigawatts of power go into the training phase of AI implementation, which feeds on gigabytes of raw data (books, images, etc.) to develop the weights that AI networks use to perform "inferences," which basically means answering queries.  Both training and inference use power, although training appears to demand more intense bursts of energy. 

 

The global economy has taken a headlong plunge into AI, making Nvidia the first company worldwide to surpass $5 trillion in market capitalization.  (Just in case anybody's thinking about a government takeover of Nvidia, $5 trillion would pay off only about one-eighth of the U. S. government's debt, but it's still a lot of money.)  With about 90% of the market share for GPU chips, Nvidia is in a classic monopoly position, and significant competitors are not yet on the horizon.
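The "one-eighth" figure is easy to check with back-of-the-envelope arithmetic (the debt figure here is an assumption, roughly $38 trillion as of late 2025):

```python
# Rough sanity check: what fraction of the U.S. federal debt would
# Nvidia's market capitalization cover?  The debt figure is an
# assumption, approximately $38 trillion as of late 2025.
nvidia_market_cap = 5.0   # trillion USD
us_federal_debt = 38.0    # trillion USD (approximate)

fraction = nvidia_market_cap / us_federal_debt
print(f"Nvidia's market cap covers about 1/{us_federal_debt / nvidia_market_cap:.1f} "
      f"of the debt ({fraction:.0%})")
```

Five divided by thirty-eight comes out to about 13 percent, or roughly one part in eight, which matches the figure above.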

 

Simple rules of supply and demand tell us that the other commodity needed by data centers, namely electric power, will also rise in price unless a lot of new suppliers rush into the market.  Unfortunately, increasing the supply of electricity overnight is well-nigh impossible. 

 

The U. S. electric-utility industry is coming off a period when demand was increasing slowly, if at all.  Because both generation and transmission systems take years to plan and build, the industry was expecting only gradual increases in demand for both, and planned accordingly.  But only in the last couple of years has it become clear that data centers are the hungry baby birds of the utility industry:  mouths always open, demanding more. 

 

This is putting a severe strain on the existing power grid, and promises to get worse if the rate of data-center construction keeps up its current frenetic pace.  Witt cites an analysis by Bloomberg showing that wholesale electricity costs near data centers have about doubled in the last five years.  And at some point, the power simply won't be available no matter what companies like CoreWeave are willing to pay.  If that happens, the data centers may move offshore to more hospitable climes where power is cheaper, such as China.  As China's power still comes largely from coal, that would bode no good for the climate.
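A doubling over five years is worth translating into an annual rate, since that is how utility customers experience it (a quick sketch, taking the Bloomberg "roughly doubled in five years" figure at face value):

```python
# If wholesale electricity prices near data centers roughly doubled
# over five years, the implied compound annual growth rate is the
# fifth root of two, minus one.
years = 5
growth_factor = 2.0  # prices doubled

annual_rate = growth_factor ** (1 / years) - 1
print(f"Implied annual price growth: {annual_rate:.1%}")
```

That works out to about 15 percent per year, compounding, which is far above the historical rate of electricity price inflation.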

 

Witt compares the current data-center boom to the U. S. railroad-building boom of the nineteenth century, which consumed even more of the country's GNP than data centers are doing now.  That boom resulted in overbuilding and a crash, and there are signs that something similar may be in the offing with AI.  Something that can't go on forever must eventually stop, and besides limitations in power production, another limit that may be even harder to overcome is the finite amount of data available for training.  Witt says there are concerns that in less than a decade, AI developers could use up the entire "usable supply of human text."  Of course, once that's done, the AI systems can still process the existing data in more sophisticated ways.  But lawyers are starting to go to work suing AI developers for copyright infringement.  Recently, one AI developer named Anthropic agreed to pay $1.5 billion to settle a class-action suit brought by authors whose material was used without permission.  That is a drop in the bucket of trillions that are sloshing around in the AI business, but once the lawyers get in their stride, the copyright-infringement leak in the bucket might get bigger.

 

The overall picture is of a disruptive new technology wildly boosted by sheep-like business leaders to the point of stressing numerous more traditional sectors, and causing indirect distress to electricity consumers, namely everybody else.  Witt cites Jevons's Paradox in this connection, which says that increasing the efficiency with which a resource is used can cause the resource to be used even more.  A good example is the use of electricity for lighting.  When only expensive single-use batteries were available to power noisy, inefficient arc lamps, electric lighting was confined to the special-effects department of the theatrical world.  But when Edison and others developed both efficient generators and more efficient incandescent lamps, the highly price-sensitive market embraced electric lighting, which underwent a boom comparable in some ways to the current AI boom. 
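The mechanism behind Jevons's Paradox can be made concrete with a toy demand model (everything here, from the constant-elasticity demand curve to the numbers, is invented purely for illustration):

```python
# Toy illustration of Jevons's Paradox: when demand for a service
# (say, lighting) is elastic, making the service more efficient --
# and therefore cheaper per unit -- can INCREASE total consumption
# of the underlying resource.  The constant-elasticity demand curve
# and the parameter values are assumptions for illustration only.

def resource_use(efficiency, elasticity, k=1.0):
    """Resource consumed, given how efficiently it delivers a service."""
    # Cost per unit of service falls as efficiency rises...
    cost_per_service = 1.0 / efficiency
    # ...demand for the service responds with constant elasticity...
    service_demanded = k * cost_per_service ** (-elasticity)
    # ...and resource use is service delivered divided by efficiency.
    return service_demanded / efficiency

before = resource_use(efficiency=1.0, elasticity=1.5)
after = resource_use(efficiency=2.0, elasticity=1.5)  # twice as efficient
print(f"Resource use before: {before:.2f}, after: {after:.2f}")
```

With elasticity greater than one, doubling efficiency raises total resource use (here from 1.00 to about 1.41); with inelastic demand, the paradox disappears and efficiency gains do reduce consumption.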

 

Booms always overshoot to some degree, and we don't yet know what overbuilding or saturation looks like in AI development.  The market for AI is so new that pricing structures are still uncertain, and many firms are operating at a loss in an attempt to gain market share.  That can't go on forever either, and so five years from now we will see a very different picture in the AI world than the one we see now. 

 

Whether it will be one of only modest electric-price increases and a stable stock of data centers, or a continuing boom in some out-of-the-way energy-rich and regulation-poor country remains to be seen.  Independent of the morality and social influences of AI, the sheer size of the hardware footprint needed and its insatiable demand for fresh human-generated information may place natural limits on it.  After the novelty wears off, AI may be like a new guest at a party who has three good jokes, but after that can't say anything that anybody wants to listen to.  We will just have to wait and see.

 

Sources:  Stephen Witt's article "Information Overload" appeared on pp. 20-25 of the Nov. 3, 2025 issue of The New Yorker.  I also referred to an article from the University of Wisconsin at https://ls.wisc.edu/news/the-hidden-cost-of-ai and Wikipedia articles on Nvidia and Jevons Paradox.
