While AI has undoubtedly cemented its position as the main driver of technology, one tends to forget the immense energy and infrastructure required to train, deploy, and scale its systems. 

And here’s the truth: AI’s hunger for energy is unsustainable at current rates, and the infrastructure supporting it is racing to adapt. 

The energy problem

Training modern AI models, especially large ones like foundation models or generative transformers, requires huge amounts of computing power.

Training a single large language model (LLM) like GPT-3 is estimated to have consumed about 1,300 megawatt-hours (MWh), or enough to power roughly 130 U.S. homes for a year. Projections suggest AI could push data centers' share of global electricity demand steadily higher over the coming years, putting their consumption on par with that of small nations.
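The household comparison above is simple arithmetic, and a quick back-of-envelope check makes it concrete. The average-household figure below (~10.7 MWh per year, in line with EIA averages) is an assumption introduced here, not a number from the article:

```python
# Back-of-envelope check of the household comparison above.
# Assumes ~10.7 MWh average annual electricity use per U.S. home
# (an EIA-style average; this figure is an assumption, not from the article).
TRAINING_MWH = 1_300      # reported GPT-3 training energy
HOME_MWH_PER_YEAR = 10.7  # assumed average U.S. household consumption

homes = TRAINING_MWH / HOME_MWH_PER_YEAR
print(f"Equivalent to powering ~{homes:.0f} homes for a year")
```

This lands at roughly 120 homes, close to the ~130 cited above; the exact figure depends on which household average you assume.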

As AI models grow larger, from billions to trillions of parameters, their compute and energy requirements have been doubling every few months, far outpacing Moore's Law.

Comparison of energy use: AI vs traditional computing

AI workloads use far more energy than traditional computing tasks. Everyday applications like word processing or web hosting need only modest CPU power, while AI tasks, especially deep learning, require massive parallel processing. A simple cloud search query consumes only a small amount of energy, whereas an AI-powered query against a large language model uses many times more. Training a large model like GPT-3 consumed hundreds of megawatt-hours, a footprint often compared to the lifetime emissions of several cars. Even inference (using trained models for real-time tasks), though far cheaper than training, still demands much more power than typical software, especially at global scale. This widening energy gap is why traditional infrastructure is being redesigned to serve AI more sustainably and efficiently.
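The per-query gap compounds quickly at scale. The sketch below uses commonly cited but debated estimates (a fraction of a watt-hour for a conventional search, a few watt-hours for an LLM-backed query) and a hypothetical query volume; all three numbers are illustrative assumptions, not figures from the article:

```python
# Rough scale comparison of inference energy per query.
# All three figures below are illustrative assumptions: the per-query
# estimates are commonly cited but contested, and the volume is hypothetical.
SEARCH_WH = 0.3                    # assumed Wh per conventional web search
LLM_WH = 3.0                       # assumed Wh per LLM-backed query
QUERIES_PER_DAY = 1_000_000_000    # hypothetical global daily query volume

def daily_mwh(wh_per_query: float, queries: int) -> float:
    """Convert per-query watt-hours at a given volume into MWh per day."""
    return wh_per_query * queries / 1_000_000  # 1 MWh = 1,000,000 Wh

gap = daily_mwh(LLM_WH, QUERIES_PER_DAY) - daily_mwh(SEARCH_WH, QUERIES_PER_DAY)
print(f"Extra demand at this volume: ~{gap:,.0f} MWh/day")
```

At a billion queries a day, a few extra watt-hours per query adds up to thousands of megawatt-hours of additional daily demand, more than the entire GPT-3 training run, every day.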

Growth trajectory: exponential increase in energy requirements

The energy requirements of AI aren't just increasing; they're growing at an exponential rate. As AI models scale from millions to billions, and now trillions, of parameters, their training demands rise sharply. OpenAI's analysis of training runs between 2012 and 2018 found that the compute used to train the largest AI models doubled roughly every 3.4 months, a pace much faster than Moore's Law. This rapid growth isn't limited to training: AI's widespread use across industries is also increasing real-time inference workloads. As more businesses adopt AI, the overall energy demand continues to grow, raising important concerns about power supply, data center efficiency, and environmental sustainability. In short, AI's rapid growth is outpacing the development of traditional infrastructure, creating both challenges and major investment opportunities.
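The difference between those two doubling rates is easy to understate. A quick comparison over a five-year window, using the 3.4-month figure cited above against Moore's Law's classic ~24-month doubling, shows the gap:

```python
# Cumulative growth over five years under two doubling periods:
# the ~3.4-month compute doubling OpenAI observed for 2012-2018
# versus Moore's Law's classic ~24-month transistor doubling.
MONTHS = 60  # five years

ai_compute_growth = 2 ** (MONTHS / 3.4)   # ~3.4-month doubling
moore_growth = 2 ** (MONTHS / 24)         # ~24-month doubling

print(f"AI compute over 5 years: ~{ai_compute_growth:,.0f}x")
print(f"Moore's Law over 5 years: ~{moore_growth:.1f}x")
```

Five years of 3.4-month doublings yields growth several orders of magnitude beyond what Moore's Law delivers in the same window, which is why efficiency gains in chips alone cannot absorb the trend.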

The critical role of data centers

AI-centric data centers are specifically designed to meet the heavy computational needs of machine learning workloads, setting them apart from traditional data centers. While standard centers handle tasks like web hosting or email services with CPUs, AI-focused centers rely on GPUs and accelerators (like TPUs) for large-scale parallel processing. These systems generate more heat and consume much more power per rack—often 30–50 kW per rack, compared to 5–10 kW in legacy setups. 

To tackle this, AI data centers use advanced cooling solutions such as liquid cooling and immersion cooling. They also require power redundancy, low-latency connections, and high-speed networks, as AI training involves massive data movement. This infrastructure is more complex, energy-hungry, and capital-intensive, making the choice of location, energy sources, and facility design key factors for investment.
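The rack-density figures above translate directly into facility-level power budgets. The sketch below is illustrative only: the rack counts and PUE (power usage effectiveness, the ratio of total facility draw to IT load) values are assumptions chosen for the example, though the per-rack figures match the ranges cited above:

```python
# Illustrative facility-level power draw: AI hall vs. legacy hall.
# Rack counts and PUE values are assumptions for illustration;
# per-rack power matches the 30-50 kW vs. 5-10 kW ranges cited above.
def facility_kw(racks: int, kw_per_rack: float, pue: float) -> float:
    """Total facility draw = IT load * PUE (PUE >= 1; overhead is mostly cooling)."""
    return racks * kw_per_rack * pue

legacy_kw = facility_kw(racks=1_000, kw_per_rack=8, pue=1.6)   # air-cooled legacy hall
ai_kw = facility_kw(racks=1_000, kw_per_rack=40, pue=1.2)      # liquid-cooled AI hall

print(f"Legacy hall: {legacy_kw / 1_000:.1f} MW")
print(f"AI hall:     {ai_kw / 1_000:.1f} MW")
```

Even granting the AI hall a better PUE thanks to liquid cooling, its total draw is nearly four times that of the legacy facility at the same rack count, which is why siting and power contracts dominate the investment case.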

Geographic and strategic considerations

  • Proximity to renewable energy - Co-locating AI-centric data centers near renewable energy sources such as solar, wind, hydro, or nuclear power provides both environmental and financial benefits. It helps reduce the carbon footprint, aligns with sustainability goals, and ensures stable, long-term energy costs. Renewable energy sources tend to have more predictable pricing than fossil fuels, protecting data centers from market fluctuations. This strategic placement also positions data centers to meet future regulatory standards and increasing public demand for green energy, making it a strong investment for those focused on both environmental impact and cost efficiency.

  • Latency-sensitive applications - Facilities supporting latency-sensitive AI applications, such as autonomous systems, financial trading, or real-time consumer services, must be located close to end-users or edge nodes. This minimizes latency, ensuring fast, reliable performance. For investors, this presents opportunities in regions with high demand for low-latency services, such as financial hubs or tech centers, where real-time data processing is crucial for maintaining a competitive advantage.

  • Regulatory environments - Favorable zoning laws, tax incentives, carbon regulations, and energy policies are key to the long-term success and ROI of AI-centric data centers. A supportive regulatory environment can lower operational costs, speed up development, and attract investment. For investors, regions with favorable policies offer a competitive edge, reducing risks related to compliance and supporting sustainable, cost-effective growth.

Sustainability pressures 

Sustainability pressures are playing a growing role in the future of AI-centric data centers, emphasizing energy efficiency, emissions reduction, and the integration of green energy. As environmental concerns rise, companies are under pressure to reduce their carbon footprint, optimize power usage, and adopt renewable energy sources. Meeting these demands often involves substantial investments in energy-efficient hardware, advanced cooling systems, and renewable energy contracts. For investors, backing green initiatives not only supports global sustainability goals but also positions data centers to comply with stricter regulations and attract eco-conscious clients, offering a competitive edge in an increasingly ESG-driven market.

Risks & benefits 

  • Risks - AI's rapid growth presents several risks, especially for grid operators and power providers. The surge in energy demand from AI-driven data centers is putting strain on existing infrastructure, with many grids struggling to keep up with the rising power consumption. Insufficient power capacity and outdated infrastructure can lead to reliability issues and potential blackouts, particularly in regions with already high energy demands. Additionally, there are gaps in the infrastructure needed to distribute clean, renewable energy for these energy-intensive operations. Private capital has a crucial role in addressing these challenges by investing in grid modernization, energy storage solutions, and decentralized power systems. Such investments can help balance AI’s growing energy demands with sustainability goals, ensuring a stable and reliable energy future.

  • Benefits - The increasing demand for AI is driving the need for next-generation data centers and supporting energy assets, such as microgrids and on-site renewable energy sources. These innovations improve efficiency, reduce reliance on traditional power grids, and help meet sustainability targets by lowering carbon footprints. As AI continues to transform industries, it is also pushing the evolution of energy infrastructure, promoting the development of smart grids, energy storage, and predictive systems. For investors, this presents significant opportunities to support both the AI and energy sectors, fostering innovation while ensuring scalable, sustainable growth.

The bottom line 

As AI continues to evolve and reshape industries, the infrastructure that supports it—especially data centers and energy systems—must also adapt. The demand for efficient, sustainable, and scalable solutions is driving innovation in both the technology and energy sectors. As AI adoption accelerates, the need for cutting-edge infrastructure will only grow, creating substantial opportunities for investors. The intersection of AI and energy infrastructure is set to be a cornerstone of the next wave of innovation, with immense potential to drive economic growth and operational efficiency across industries.