Recently, the Lincoln Journal Star reported that a proposed Google data center could require more than three times the amount of electricity the entire city of Lincoln uses at peak demand during the summer.

I know… this is an article about AI, not data centers. But stay with me.

Most of us have come to understand that data centers consume an incredible amount of electricity. But the amount of electricity consumed by AI is on another level entirely. So, if a single large data center has people so concerned, then the energy demand of AI itself should be a four-alarm fire.

AI is projected to drive electricity growth not seen since the post-World War II era – and much of that demand will come quietly, from existing customers, not just new facilities.

The massive electricity consumption from AI is attributable to one simple reason – AI servers are not normal servers.

The processors in AI servers are GPUs (graphics processing units), which handle thousands of tasks at one time (in parallel) and consume far more electricity when running than traditional CPUs – which handle tasks sequentially.

In addition, AI servers operate continuously at or near full load, 24×7. Traditional servers operate only when tasks are launched – and typically sit idle for many hours per day.

Finally, AI servers cannot be cooled by a simple fan. Anyone who has felt their laptop overheating understands the basic problem. To manage the incredible heat generated by AI servers, data centers increasingly rely on liquid cooling, immersion cooling, or highly specialized HVAC systems, all of which require substantial additional electricity.

To make this concrete, consider something we all understand: a Google Search. According to widely cited industry estimates, an AI-powered prompt that analyzes data, compares sources, and synthesizes meaning can require dozens of times – or in some cases more than a hundred times – the computing power of a traditional Google Search, which primarily retrieves information from an index.

Behind those AI-powered prompts are the AI servers. In the data center environment, a rack of AI servers can consume 10 to 20 times more electricity than a traditional server rack dedicated to business applications, websites, or data storage.

The fire alarms should be going off because the spike in demand is already here.

Multiple large organizations in Lincoln that own and operate their own data centers are already adding AI servers to existing racks. In addition, dozens of smaller businesses renting space in Lincoln’s colocation data centers are also rapidly adding AI servers or requesting the capacity to do so. While power usage per rack in these facilities has traditionally been capped by the data center, those limits are beginning to face pressure as AI adoption accelerates.

The immediate concern for businesses is simple: where is this additional electricity going to come from? New generation from LES is years away from producing power. And historical options – such as purchasing power from Central Nebraska Public Power or the Southwest Power Pool – may also prove difficult.

Trying to buy power in an environment where everyone is facing increased demand could be like trying to buy water during a drought. It raises uncomfortable questions:

• Why would neighboring utilities sell power they also need?

• If they did, what price would they demand?

• How volatile could regional power markets become in an AI-driven electricity shortage?

The bottom line is this: AI is not just a software revolution. It is an energy revolution. And like all revolutions that push rapid, fundamental, and disruptive change, it is not only changing what businesses can do – or how they do it – it is changing what it costs to do business.

Electricity is rapidly becoming a strategic constraint. If Lincoln wants to remain competitive, we must treat energy planning with the same seriousness we apply to workforce, infrastructure, and capital investment.

The time to ask hard questions is now – while we still have choices.