
LambdaClass and Nous Research: Distributed Systems for Decentralized AI

LambdaClass collaborates with Nous Research on decentralized AI training infrastructure. Nous has raised $65M in a round led by Paradigm.

Image: Psyche Opening the Golden Box by John William Waterhouse

Nous Research, an open-source AI lab focused on decentralized training, has raised $65 million in a round led by Paradigm [1][2]. LambdaClass is collaborating with Nous on the distributed systems infrastructure underlying decentralized AI training [3].

The problem

Training a frontier AI model requires tens of thousands of GPUs concentrated in a few data centers controlled by a few companies. Decentralized training distributes this computation across many participants. Instead of one organization owning a massive cluster, a network of contributors provides compute and is compensated for participation.

The demand for AI compute is growing faster than any single organization can build capacity. Distributed training is one way the field scales past the current constraint.

What LambdaClass contributes

Decentralized training is a distributed systems problem: reaching consensus on model state, communicating efficiently between nodes with widely varying bandwidth, detecting faulty or malicious participants, and running coordination protocols that work at scale.
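As a toy illustration of one of these problems, the sketch below (a hypothetical example, not Nous or Psyche code) aggregates gradient updates from many contributors with a coordinate-wise trimmed mean, so that a small number of faulty participants cannot skew the shared update:

```python
# Toy sketch: robust gradient aggregation in decentralized training.
# Hypothetical illustration only; not the actual Nous/Psyche protocol.

def trimmed_mean(updates, trim=1):
    """Coordinate-wise trimmed mean: for each coordinate, drop the
    `trim` largest and `trim` smallest reported values before
    averaging, so up to `trim` faulty participants cannot dominate."""
    if len(updates) <= 2 * trim:
        raise ValueError("need more than 2 * trim participants")
    dim = len(updates[0])
    aggregate = []
    for i in range(dim):
        coords = sorted(u[i] for u in updates)
        kept = coords[trim:len(coords) - trim]  # discard extremes
        aggregate.append(sum(kept) / len(kept))
    return aggregate

# Four honest nodes report similar gradients; one faulty node
# reports a wildly wrong value, which the trimmed mean discards.
updates = [
    [0.10, -0.20],
    [0.12, -0.18],
    [0.11, -0.22],
    [0.09, -0.21],
    [99.0, 99.0],  # faulty participant
]
print(trimmed_mean(updates, trim=1))
```

Real systems combine robust aggregation like this with bandwidth-aware communication and incentive mechanisms, but the core idea is the same: the protocol must produce a correct shared result even when some participants misbehave.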

LambdaClass has been solving these problems in the blockchain context for years: Ethereum execution clients, consensus implementations, zero-knowledge proof systems. The collaboration with Nous applies this expertise to a different domain. The underlying engineering challenges are similar: reliable distributed computation with economic incentives.

  1. https://www.finsmes.com/2025/04/nous-research-raises-65m-in-funding.html

  2. https://www.theblock.co/post/352000/paradigm-leads-50-million-usd-round-decentralized-ai-project-nous-research

  3. https://nousresearch.com/nous-psyche/