Paradigm Invests $50 Million in Decentralized AI Startup Nous Research
(Originally posted on: Crypto News – iGaming.org)
Paradigm, one of the biggest names in venture capital, is making a significant move into decentralized artificial intelligence by backing Nous Research. According to Fortune, Paradigm is investing close to $50 million in the AI startup, pushing its token valuation to $1 billion.
Good to know
- Paradigm is investing about $50 million into Nous Research.
- Nous Research focuses on decentralized AI training using global spare computing capacity.
- The project runs on Solana but has not yet confirmed if rewards will be paid in SOL.
Nous Research is building human-centric language models with the goal of challenging large players like OpenAI and DeepSeek. Instead of relying on traditional centralized data centers, Nous Research plans to use spare computing power from around the world to train its AI models.
Karan Malhotra, co-founder of Nous Research, explained the idea to Fortune, highlighting that he worked alongside a former founding member of OpenAI to develop a way to decentralize AI training. Malhotra stated, “We think of the incentive mechanism behind crypto to push people to actually utilize their idle compute [of] less as a donation but more as a transaction… We do not want to get kind of bogged down by the traditional view of how crypto operates when we are a very serious research lab and an academic lab. This is really the only way in which we can make such a massive training run and such a democratic thing possible.”
The project is built on the Solana blockchain. However, Nous Research has not yet confirmed a timeline for when its AI training platform will go live, and it remains undecided whether rewards for contributors will be paid out in SOL tokens.
Paradigm’s investment in Nous Research reflects growing interest in projects that combine decentralized networks with AI innovation. Tapping into unused computing power from a wide range of users could make training large language models cheaper, more democratic, and less reliant on a handful of major tech companies.