Stanford University’s AI Lab has chosen to harness decentralized cloud computing with Theta Labs’ EdgeCloud for its large language model (LLM) research. The lab, under the leadership of Assistant Professor Ellen Vitercik, plans to use the platform for research on discrete optimization and algorithmic reasoning with LLMs.
This move is indicative of a growing trend in academia, with an increasing number of institutions turning to decentralized platforms for research. Theta Labs reports that other adopters of their EdgeCloud include Seoul National University, Korea University, the University of Oregon, and Michigan State University, among others.
Traditionally, big tech companies like Microsoft, Amazon, and Google have been the primary investors in computing infrastructure, particularly AI-centric infrastructure. Microsoft invested $3.3 billion in a Wisconsin-based data center in 2024, a project backed by the Biden administration. Amazon, meanwhile, has announced plans to spend $11 billion on data centers in Indiana. Google is expanding globally, investing $1.1 billion in a data center in Finland and building another in Malaysia for $2 billion.
However, the centralized data centers owned by big tech are no longer the only model vying for AI workloads. Theta EdgeCloud, unlike most traditional LLM services, operates as a decentralized cloud computing platform. Its infrastructure is geographically distributed, meaning it does not rely on massive centralized data centers for computing power.
Theta EdgeCloud employs blockchain technology to reward smaller GPU providers based on the revenue they generate from users. This model allows Theta to operate with lower capital expenditures and scale more quickly than traditional models. As a result, it can offer a more affordable infrastructure for users, making it an attractive option for academic institutions with tight budgets.
Originally, the Theta Network was a blockchain protocol designed for decentralized video streaming. However, it has since evolved to provide decentralized infrastructure for cloud computing, with a specific emphasis on AI applications. With the adoption of such decentralized platforms by prestigious institutions like Stanford University, it’s clear that the future of AI research may well be in the decentralized cloud.