Cutline for photo: During the Federal Reserve Bank of Dallas Powering AI conference, Assistant Vice President of Energy Programs Garrett Golding, left, moderates a panel with Clark O’Niell, Boston Consulting Group; Bernadette Johnson, Enverus; and Ning Lin, Bureau of Economic Geology at the University of Texas at Austin.
BY GARRETT GOLDING | ASSISTANT VICE PRESIDENT, FEDERAL RESERVE BANK OF DALLAS
The Federal Reserve Bank of Dallas hosted a two-day conference March 4-5 examining how the United States will power its artificial intelligence (AI) ambitions. A leading constraint to the buildout of AI infrastructure is access to reliable and affordable power. A surge in power generation and transmission investment at this scale has not occurred in decades, and the stakes of winning the AI arms race could not be higher.
The Powering AI event brought together energy, technology, and financial executives and experts to address the intersection of explosive AI demand for electricity and constraints in power generation, financing, and grid management.
The conference revealed that powering AI is less about technological impossibility and more about coordination challenges across infrastructure, finance, policy, and communities. The industry has the generation capacity, the capital, and increasingly the demand flexibility to continue its growth. But unlocking this potential requires transparent commitments, regulatory reform, community engagement, and workforce development. As one panelist emphasized: "This is a national security issue." The ability to deploy AI infrastructure will determine competitive positioning in the global technology race.
Below are the five most critical takeaways.
1. Timing risk exceeds volume risk; supply constraints are the bottleneck.
There is remarkable consensus that AI will drive a historic increase in demand for power, with realistic demand from data centers estimated at 50-60 gigawatts by 2030 (far below the speculative 400-plus gigawatts currently under review in interconnection queues nationwide). Aligning the construction timelines of data centers, generation and transmission to meet that demand will be challenging.
2. Demand flexibility is the golden nugget for grid integration.
Data centers are evolving from inflexible baseload consumers to flexible loads that can respond to price signals and grid conditions.
3. Existing infrastructure has the potential to meet near-term demand through higher utilization.
Contrary to headlines suggesting looming power shortages, existing generating capacity coupled with load flexibility can accommodate substantial growth.
4. Capital is abundant but requires firm commitments to unlock investment.
Infrastructure investment for AI is unprecedented in scale: Hyperscalers plan $700 billion to $900 billion in annual capital expenditures (2026-2027). These investments are financially sustainable, as hyperscalers generate operating cash flows covering around 80 percent of capex. The $1.5 trillion financing gap beyond hyperscaler balance sheets can be filled through diversified debt markets.
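The cash-flow coverage figure implies a meaningful annual external financing need even before the cumulative $1.5 trillion gap is considered. A minimal back-of-envelope sketch, using only the round numbers cited above (not independent data):

```python
# Illustrative arithmetic only; inputs are the article's cited round numbers.
annual_capex_low = 700e9    # low end of planned annual hyperscaler capex, 2026-2027
annual_capex_high = 900e9   # high end
cash_flow_coverage = 0.80   # operating cash flow covers roughly 80% of capex

for capex in (annual_capex_low, annual_capex_high):
    # The share not covered by operating cash flow must come from outside capital.
    external_need = capex * (1 - cash_flow_coverage)
    print(f"capex ${capex / 1e9:.0f}B -> external financing ~${external_need / 1e9:.0f}B/yr")
```

Under these assumptions, roughly $140 billion to $180 billion per year would need to be raised externally, which is the flow that diversified debt markets would be filling.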
5. Political and social dynamics will determine success or failure.
The greatest risk may not be technical but political, as NIMBY resistance intensifies. Public opinion runs against AI by a roughly 2.5:1 margin, with 31 percent of respondents saying AI is not a net positive for society versus 12 percent who say it is.