Akash founder warns AI training may trigger global energy crisis

The Growing Energy Crisis in AI Training

Artificial intelligence development is hitting a serious energy wall, and the situation appears to be worsening faster than many realize. According to Greg Osuri, founder of Akash Network, training large AI models may soon require energy on the scale of a nuclear reactor's output. He shared these concerns during an interview at Token2049 in Singapore, noting that the industry consistently underestimates how quickly computing demands are doubling.

Data centers already consume hundreds of megawatts of fossil fuel power, a trend that could trigger broader energy crises. Household power bills are already feeling the impact: in some areas near data centers, wholesale electricity costs have surged 267% over five years. Osuri made the striking observation that “We’re getting to a point where AI is killing people,” pointing to health impacts from concentrated fossil fuel use around data hubs.

Decentralization as a Potential Solution

The alternative, Osuri suggests, lies in decentralization. Instead of concentrating computing power and energy consumption in massive data centers, distributed training across networks of smaller GPUs could unlock both efficiency and sustainability. This approach would draw on everything from high-end enterprise chips to gaming cards in home computers.

“Once incentives are figured out, this will take off like mining did,” Osuri predicted. He envisions a future where home computers could earn tokens by providing spare computing power for AI training tasks. This concept bears similarities to early Bitcoin mining, where ordinary users contributed processing power and received rewards, though this time the work would involve training AI models rather than solving cryptographic puzzles.

Technical and Incentive Challenges Remain

While the potential benefits are clear, significant challenges still exist. Training large-scale models across diverse GPU networks requires breakthroughs in software coordination and distributed computing technology. Osuri noted that while several companies have demonstrated aspects of distributed training in recent months, no one has successfully integrated all components to run a complete model.

Perhaps the bigger challenge lies in creating fair incentive systems. “The hard part is incentive,” Osuri explained. “Why would someone give their computer to train? What are they getting back? That’s a harder challenge to solve than the actual algorithm technology.”

Despite these obstacles, Osuri insists that decentralized AI training isn’t just an option—it’s becoming a necessity. By spreading workloads across global networks, AI development could ease pressure on energy grids, reduce carbon emissions, and create a more sustainable foundation for the AI economy. The timeline for these developments remains uncertain, but the urgency of addressing AI’s growing energy footprint seems increasingly clear.
