Data centers are the new factories of the 21st century. Like any factory, they need power, and a lot of it. Big technology companies build and operate large data centers to provide services such as video streaming and gaming and to train their AI models. The problem is that these facilities need more and more energy, which translates into ever-growing demand for power.
While some companies are betting on nuclear energy and a revival of fossil fuels, one study argues that the future lies in off-grid solar. The catch is that, as good as the solution sounds, it doesn’t exist yet.
Hyperscalers. This is an important term. Technology companies that operate global cloud computing infrastructure are known as hyperscalers. Their data centers are critical to developing digital services, big data, and AI systems. The term “hyperscaler” reflects the fact that these data centers can scale quickly and on demand.
Depending on the business’s needs and the reach it wants to cover, this scalability translates into more storage, faster processing, and more network bandwidth.
Demand. The major players are Google, Microsoft, Meta, and Amazon. While they can scale their data centers, they face a huge problem: the resources those centers consume. Water consumption has always been an issue in large server farms, one that companies have tackled in various ways to be more environmentally responsible. The arrival of AI, however, has changed the scale of the problem.
Training and maintaining these models consumes enormous amounts of energy. In addition to the water used to dissipate the servers’ heat, the facilities require large amounts of power. While companies are building more sustainable data centers, demand is so intense that they still need coal and natural gas to meet the load. Some, like Google and Meta, plan to use nuclear power to cover their needs.
Off-grid energy. Putting the problem in context and recognizing that these energy demands work against decarbonization, researchers from companies such as Stripe, Paces, and Scale Microgrids set out to determine the best way to power these data centers sustainably. They presented their findings in a study estimating total power demand for AI development at between 30 and 300 GW by 2030.
Demand from the centers where AI training takes place could range from 15 to 150 GW. That’s a huge span, but the solution the researchers propose as optimal is building off-grid, solar-powered microgrids. According to the study, systems that get 44% of their energy from solar are already cost-competitive with gas-only systems, and those that reach 90% renewables can be cheaper than nuclear projects such as Microsoft’s Three Mile Island restart.
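For a sense of scale, a quick back-of-envelope conversion of those power figures into annual energy helps. The arithmetic below is ours, not the study’s, and assumes the load runs 24/7.

```python
# Back-of-envelope: what a constant load in GW means in annual energy terms.
# The 24/7 assumption and the example loads are illustrative, not from the study.
HOURS_PER_YEAR = 8760

def annual_energy_twh(load_gw: float) -> float:
    """Annual energy (TWh) for a constant load (GW) running all year."""
    return load_gw * HOURS_PER_YEAR / 1_000  # GWh -> TWh

for load_gw in (15, 150):  # the study's low and high AI-training estimates
    print(f"{load_gw} GW constant load ~ {annual_energy_twh(load_gw):,.0f} TWh/year")
```

Even the low end, about 131 TWh a year, is comparable to the annual electricity consumption of a country like Sweden.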

Build where the sun shines. The advantage of this approach is that it can be built quickly: there’s no nuclear plant to restart. Companies aren’t tied to the swings of the energy market, and geopolitical conflicts can’t leave them without power. The energy is clean, and solar panels keep getting cheaper. Most importantly, the system is easily scalable: if more power is needed, it’s as simple as adding more panels. Still, the most essential part of the equation is that companies can build these centers in optimal locations.
Unlike user-facing server centers, which need to be close to end users to provide fast service, data centers for AI training have geographic flexibility. They can be located wherever sunlight is optimal and land is inexpensive.

Optimal areas. The study identified parcels of land in the U.S. with potential for up to 1,200 GW of off-grid solar with gas backup, thanks to favorable geography: large expanses with strong solar radiation for much of the year. California, Nevada, Arizona, New Mexico, and West Texas, for example, would be ideal for data centers running 90% on off-grid solar, with the remaining 10% backed by gas.
In addition, the study notes that most of the suitable land is privately owned and could be purchased for development, and many sites fall within areas eligible for subsidies. If construction started today, the build would take between 12 and 24 months. It all looks promising, but it isn’t happening.
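To see what a 90% solar, 10% gas design implies in hardware, here is a minimal sizing sketch. The 100 MW load, the 25% capacity factor, and the omission of battery sizing are our simplifying assumptions, not figures from the study.

```python
# Rough sizing for a hypothetical off-grid AI campus. All parameters are
# illustrative assumptions; battery sizing, a major cost driver, is omitted.
LOAD_MW = 100            # assumed constant data center load
SOLAR_FRACTION = 0.90    # share of annual energy served by solar + storage
CAPACITY_FACTOR = 0.25   # plausible for fixed-tilt solar in the Southwest
HOURS_PER_YEAR = 8760

annual_demand_mwh = LOAD_MW * HOURS_PER_YEAR           # 876,000 MWh
solar_energy_mwh = annual_demand_mwh * SOLAR_FRACTION  # 788,400 MWh
gas_energy_mwh = annual_demand_mwh - solar_energy_mwh  # 87,600 MWh

# Nameplate solar capacity needed to generate the solar share over a year.
solar_nameplate_mw = solar_energy_mwh / (HOURS_PER_YEAR * CAPACITY_FACTOR)

print(f"Solar nameplate: {solar_nameplate_mw:,.0f} MW (~{solar_nameplate_mw / LOAD_MW:.1f}x the load)")
print(f"Gas backup: {gas_energy_mwh:,.0f} MWh/year")
```

The oversizing, about 3.6 times the load in this sketch, is the point: panels only produce during daylight, so an off-grid design needs far more nameplate capacity than the load itself, plus batteries to carry it overnight. That is exactly why cheap, sunny, empty land matters.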
If it’s so good, why isn’t it being done? According to the researchers, three issues come into play, two of them closely tied to the fact that AI training is still a recent phenomenon. The first is reliability: data center designers have historically hesitated to untether from the grid because they can’t afford to lose power for even a second. The second is inertia: it has never been done before. Even though today’s technology makes it possible to run entirely on renewable energy (as some countries already do), the legacy mindset lingers. The third is cost: an estimated premium of $23 per megawatt-hour over a gas-only system. While panels are becoming more affordable, skipping them is still cheaper. The researchers point out, however, that this premium would be recouped in the short term through avoided emissions and carbon-offset expenses.
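To make that premium concrete, here is the arithmetic for a hypothetical 100 MW facility, assuming the $23 figure applies to every megawatt-hour consumed (our simplification, not the study’s accounting).

```python
# Annual cost of a $23/MWh premium for a hypothetical 100 MW facility.
# Assumes the premium applies to all energy consumed; illustrative only.
LOAD_MW = 100
PREMIUM_USD_PER_MWH = 23
HOURS_PER_YEAR = 8760

annual_premium_usd = LOAD_MW * HOURS_PER_YEAR * PREMIUM_USD_PER_MWH
print(f"Extra cost: ~${annual_premium_usd / 1e6:.0f}M per year")  # ~$20M/year
```

Roughly $20 million a year for a 100 MW site: real money, but small next to the capital cost of the data center itself.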
So, these off-grid solar microgrids appear to be a fast track to powering large data centers. While the technology is mature, it looks like it will be a while before we see data centers powered 90% by renewable energy. In short, the technology is ready—but the technologists? Not so much.
Image | Jackery Power Station (Unsplash)