Decentralized AI learning platform FLock (no, that capital “L” is not a mistake) has teamed up with io.net to develop the world’s first Proof-of-AI (PoAI) consensus mechanism for validating the integrity of nodes operating on a distributed compute network. The release is intended to make AI-powered computation more resource-efficient across a range of use cases.
FLock, io.net announce partnership, tease Proof-of-AI concept
GPU management platform io.net and federated AI learning service FLock have shared the details of their long-term strategic collaboration. The initiative is expected to give the AI and Web3 segments entirely new development and computation tools.
1/ @ionet X FLock partnership breakthrough 🚀
Developing world-first Proof of AI (PoAI) consensus mechanism together.
Why? To validate integrity of DePIN nodes in decentralised compute networks.
👇Find out more about this AI-native Proof of Work.
— FLock.io (@flock_io) August 29, 2024
Namely, the two will be working together on the world’s first Proof of AI (PoAI) consensus mechanism for validating the integrity of nodes operating on a decentralized compute network.
With PoAI, decentralized physical infrastructure networks (DePINs) can verify the integrity of their nodes by having them complete compute-intensive AI training tasks. PoAI is an AI-native Proof of Work that directs verification resources toward meaningful AI tasks. It allows nodes to earn block rewards from both the DePIN and the AI training network, in this case io.net and FLock.io, respectively.
Jiahao Sun, founder and CEO of FLock, highlights that the new release will be of critical importance to the DePIN, AI and Web3 segments:
AI engineers and end users alike need to trust the quality of the compute resources they are provided with, and Proof of AI is the key to achieving that. Compute underpins the entire AI development process, which is why we’re starting with that, and we are delighted to join forces with io.net, a true leader in its field.
The mechanism, which substantiates the integrity of DePIN nodes in a decentralized and AI-native way, involves an engine that continuously issues challenges, aggregates responses and supplies the necessary statistics (e.g., latency, score deviation, data correctness) to io.net nodes so they can compile judgments.
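Neither FLock nor io.net has published the engine’s interfaces, but the general shape of such a challenge-response loop can be sketched as follows. This is a minimal illustration only: the class, field and threshold names are assumptions, not FLock or io.net APIs.

```python
import statistics
from dataclasses import dataclass

# Hypothetical sketch of a PoAI-style challenge-response judgment; class,
# field and threshold names are illustrative assumptions, not real APIs.

@dataclass
class ChallengeResult:
    node_id: str
    latency_s: float     # wall-clock time the node took to return the challenge
    score: float         # quality score of the completed AI training task
    data_correct: bool   # whether the returned artifacts passed integrity checks

def judge_node(results: list[ChallengeResult],
               min_score: float = 0.8,
               max_score_deviation: float = 0.05,
               max_latency_s: float = 30.0) -> bool:
    """Aggregate a node's challenge stats and decide whether it looks honest."""
    if not results or not all(r.data_correct for r in results):
        return False
    scores = [r.score for r in results]
    latencies = [r.latency_s for r in results]
    return (statistics.mean(scores) >= min_score
            and statistics.pstdev(scores) <= max_score_deviation
            and statistics.mean(latencies) <= max_latency_s)

# A node with consistent scores and latency plausible for its claimed hardware
# passes; one that misreports its resources should fail on latency or deviation.
history = [
    ChallengeResult("node-42", latency_s=12.1, score=0.91, data_correct=True),
    ChallengeResult("node-42", latency_s=11.8, score=0.90, data_correct=True),
    ChallengeResult("node-42", latency_s=12.4, score=0.92, data_correct=True),
]
print(judge_node(history))  # True
```

The key design point the article describes is that the verification work itself is a useful AI task, so passing the challenge both proves node integrity and contributes to training.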
Pushing the boundaries of AI model training with Web3
io.net CEO and cofounder Tory Green is excited by the scope of opportunities the new collaboration unlocks for AI’s various use cases:
The arrival of Proof of AI will assuredly lead to tremendous improvements in AI model training and inference over decentralized compute networks. I am confident GPU node operators, as well as the wider AI/ML developer community, will welcome Proof of AI with open arms.
Synthetic data has proven extremely useful for model training, although synthesizing and cleaning 15 trillion tokens (the amount used to train Llama 3) is a nontrivial undertaking. Consequently, FLock Data Generation will utilize idle GPU resources to perform batch inference on LLMs requested by the FLock Task Creator and Training Node.
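FLock’s Data Generation pipeline itself is not public; the sketch below only illustrates the underlying pattern of batched LLM inference for synthetic data on an otherwise idle GPU. The model choice and prompts are placeholder assumptions.

```python
# Minimal sketch of batched LLM inference for synthetic-data generation.
# Model name and prompts are placeholders; this is not FLock's actual pipeline.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # stand-in; a production run would use a Llama-class model
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token   # GPT-2 has no pad token by default
tokenizer.padding_side = "left"             # left-pad for decoder-only generation
device = "cuda" if torch.cuda.is_available() else "cpu"
model = AutoModelForCausalLM.from_pretrained(model_name).to(device)

prompts = [
    "Write a short customer-support dialogue about a delayed order.",
    "Summarise the rules of chess in three sentences.",
    "Describe a sunny day at the beach in one paragraph.",
]

# Tokenize the whole batch at once so the idle GPU processes all prompts together.
inputs = tokenizer(prompts, return_tensors="pt", padding=True).to(device)
with torch.no_grad():
    generated = model.generate(**inputs, max_new_tokens=64,
                               pad_token_id=tokenizer.eos_token_id)
synthetic_samples = tokenizer.batch_decode(generated, skip_special_tokens=True)
print(f"Generated {len(synthetic_samples)} synthetic samples.")
```

Batching keeps the GPU saturated between tasks, which is the point of routing synthetic-data jobs to hardware that would otherwise sit idle.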
Decentralized AI GPU networks are critical to decentralized AI’s long-term success, yet many dishonest actors still seek to game the system. A common tactic is to trick the network into believing a node has more computing resources than it actually does.
Without robust deterrence measures in place, node operators can act dishonestly to win network rewards even when their contributions are minimal. Node integrity verification is hard to solve because bad actors can misrepresent the resources they hold and collect rewards without doing any of the work.