Ritual, An AI Infrastructure Firm, Raises $25 Million To Bridge Gaps Between AI And Crypto

Ritual, a decentralized AI network, has emerged from stealth and announced a $25 million Series A round led by Archetype. The project is building an AI-powered system meant to perform the sophisticated computation and reasoning that currently sits beyond the reach of smart contracts. Even as AI adoption trends upward across industries, the full potential of the current AI stack is held back by problems such as high compute costs, centralized APIs, and restricted hardware access. As Ritual’s introduction post explains: “The goal is to develop Infernet into an extensible suite of execution layers that interface with other fundamental infrastructure in the ecosystem, enabling Ritual to be used as an AI Coprocessor by any protocol or application, thereby becoming the Schelling point for AI in the web3 space.”
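
The announcement itself contains no code, but the “AI Coprocessor” idea can be sketched roughly: an application submits an inference request to an off-chain network of model-serving nodes and later consumes the result (plus some attestation) on-chain. The TypeScript sketch below is purely illustrative; every interface, function, and field name is an assumption made for the example and does not reflect Ritual’s or Infernet’s actual API.

```typescript
// Hypothetical sketch of the "AI coprocessor" pattern described above.
// None of these names come from Ritual's SDK; they only illustrate the
// request/response flow between an application and an off-chain inference
// network whose output is later consumed on-chain.

interface InferenceRequest {
  model: string;            // identifier of the model to run off-chain
  input: string;            // serialized prompt or feature vector
  callbackContract: string; // address that will receive the result on-chain
}

interface InferenceResult {
  output: string; // model output, to be posted on-chain
  proof: string;  // placeholder for whatever attestation the network provides
}

// Stand-in for a node in the off-chain inference network.
async function requestInference(req: InferenceRequest): Promise<InferenceResult> {
  // A real system would route the job to model-serving nodes and return the
  // output together with an attestation; here we simply stub it.
  return { output: `result for ${req.input}`, proof: "0x..." };
}

async function main(): Promise<void> {
  const result = await requestInference({
    model: "risk-classifier-v1",
    input: JSON.stringify({ utilization: 0.82, volatility: 0.35 }),
    callbackContract: "0x0000000000000000000000000000000000000000",
  });
  // A protocol would then submit `result.output` (and its proof) to the
  // callback contract, letting on-chain logic act on the model's answer.
  console.log(result.output);
}

main().catch(console.error);
```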

The Funds Will Go Toward Seeding Ritual’s Initial Network And Growing Its Developer Ecosystem

AI models like these can help address new use cases, such as automatically adjusting risk parameters for lending protocols based on real-time market conditions. The models can be embedded throughout crypto, from core infrastructure up to applications. Ritual’s protocol diagram shows modular execution layers built around AI models; the GMP layer “facilitates interop between the Ritual Superchain and existing blockchains, and acts as an AI coprocessor for all blockchains,” including layer 1s, rollups, and sovereign chains.

The $25 million Series A round also drew investors including Balaji Srinivasan, Robot Ventures, Accomplice, Dialectic, Accel, Avra, Anagram, and Hypersphere. The funds will go toward seeding Ritual’s initial network and growing its developer ecosystem.

Separately, the AI community has raised concerns that the Biden administration’s recent executive order on AI safety could stifle innovation, given its ambiguous language and broad mandates such as “accelerating the development and use of privacy-preserving techniques” and “providing authorities for emerging enterprises with access to safety test findings.”
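
To make the lending-protocol example mentioned above more concrete, here is a small, hypothetical sketch of how a model (or a simple heuristic standing in for one) might map real-time market conditions to a risk parameter such as a maximum loan-to-value ratio. The names, thresholds, and formula are illustrative assumptions, not anything published by Ritual or by any lending protocol.

```typescript
// Hypothetical illustration of the lending-protocol use case: map market
// conditions to a maximum loan-to-value (LTV) ratio. Values are toy numbers.

interface MarketConditions {
  volatility30d: number;   // annualized 30-day volatility of the collateral asset
  liquidityDepth: number;  // USD depth available near the mid price
}

// Toy policy: start from a base LTV and tighten it as volatility rises
// or as on-chain liquidity thins out.
function suggestMaxLtv(m: MarketConditions): number {
  const base = 0.80;
  const volPenalty = Math.min(0.30, m.volatility30d * 0.5);
  const liqPenalty = m.liquidityDepth < 1_000_000 ? 0.10 : 0;
  return Math.max(0.40, base - volPenalty - liqPenalty);
}

// A governance module or coprocessor callback could take this value and
// propose it as the protocol's new collateral factor.
console.log(suggestMaxLtv({ volatility30d: 0.6, liquidityDepth: 500_000 }));
// -> 0.4 (high volatility plus thin liquidity push LTV to the floor)
```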