
Focus on Allora’s structure and vision: How can blockchain solve the long tail problem of artificial intelligence?

Allora aims to build a self-improving, decentralized AI infrastructure and to support projects that want to securely integrate AI into their services.

Author: Tranks, DeSpread


Disclaimer: The content of this report reflects the views of its authors and is for reference only; it does not constitute a recommendation to buy or sell tokens or to use any protocol. Nothing in this report constitutes investment advice.

1. Introduction

Since the emergence of generative AI, exemplified by ChatGPT, AI technology has developed rapidly, and enterprises' participation and investment in the AI industry have continued to grow. AI now performs well not only at producing specific outputs but also at large-scale data processing, pattern recognition, statistical analysis, and predictive modeling, continually expanding its range of applications across industries.

  • JP Morgan: Employs more than 600 ML engineers to develop and test over 400 AI use cases, including algorithmic trading, fraud prediction, and cash flow forecasting.

  • Walmart: Analyzes seasonal and regional sales history to forecast product demand and optimize inventory.

  • Ford Motors: Analyzes vehicle sensor data to predict part failures and notifies customers, preventing accidents caused by those failures.

Recently, the trend of combining the blockchain ecosystem with AI has become increasingly pronounced, and DeFAI, the field where DeFi protocols and AI converge, has attracted particular attention.

In addition, there are a growing number of cases where AI is integrated directly into a protocol's operating mechanism, making risk prediction and management for DeFi protocols more efficient and introducing financial products and services that were previously impossible.

Extended reading: “AI narratives are heating up, how can DeFi benefit from it?”

However, because of the massive training data required and the high entry barrier of specialized AI expertise, building AI models dedicated to specific functions remains monopolized by a handful of large companies and AI experts.

As a result, other industries and small startups face significant difficulties in adopting AI, and dApps in the blockchain ecosystem face the same limitations. Since dApps must preserve the core value of trustlessness, that is, not relying on third parties, a decentralized AI infrastructure is required so that more protocols can adopt AI and provide services users can trust.

In this context, Allora aims to build a self-improving, decentralized AI infrastructure and to support projects that want to securely integrate AI into their services.

2. Allora, a Decentralized Inference Synthesis Network

Allora is a decentralized inference network that predicts and provides future values for specific topics requested by different entities. There are two main methods for implementing decentralized AI inference:

  • Single model / decentralized processing: Carry out model training and inference in a decentralized manner to build a single decentralized AI model.

  • Multiple models / inference synthesis: Collect inferences from multiple pre-trained AI models and synthesize them into a single inference result.


Of these two methods, Allora adopts multi-model inference synthesis. AI model operators can freely join the Allora network to perform inferences for prediction requests on specific topics, and the protocol responds to the requester based on the inferences these operators produce.

When synthesizing the models' inferences, Allora does not simply average the inferences drawn by each model; instead, it assigns a weight to each model to arrive at the final inference. Allora then compares each model's inference with the actual outcome for the topic and self-improves by giving higher weights and rewards to models whose inferences were close to the actual results, improving inference accuracy over time.
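As a rough illustration of this weighting loop, the following Python sketch combines worker inferences by weight and then shifts weight toward the workers whose inferences landed closest to the realized value. The update rule and learning rate here are illustrative assumptions, not Allora's actual algorithm.

```python
# Illustrative sketch (not Allora's actual formulas): weighted synthesis
# of worker inferences, followed by a weight update that favors workers
# whose inferences were closest to the realized ground truth.

def synthesize(inferences: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted average of worker inferences."""
    total = sum(weights[w] for w in inferences)
    return sum(inferences[w] * weights[w] for w in inferences) / total

def update_weights(inferences, weights, actual, lr=0.5):
    """Shift weight toward workers with smaller absolute error."""
    errors = {w: abs(v - actual) for w, v in inferences.items()}
    max_err = max(errors.values()) or 1.0
    new = {}
    for w in weights:
        accuracy = 1.0 - errors[w] / max_err   # 1.0 for the best worker
        new[w] = (1 - lr) * weights[w] + lr * accuracy
    norm = sum(new.values())
    return {w: v / norm for w, v in new.items()}

inferences = {"A": 101.0, "B": 98.0, "C": 90.0}
weights = {"A": 1 / 3, "B": 1 / 3, "C": 1 / 3}
print(synthesize(inferences, weights))          # equal weights at first
weights = update_weights(inferences, weights, actual=100.0)
# Worker A (closest to 100.0) now carries the highest weight.
print(weights)
```

Over repeated rounds, a loop like this concentrates weight on consistently accurate models, which is the self-improvement mechanism the paragraph describes.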

Through this approach, Allora can produce more specialized, topic-specific inferences than AI built via the single-model/decentralized-processing approach. To encourage more AI models to participate in the protocol, Allora provides the open-source Allora MDK (Model Development Kit), which helps anyone easily build and deploy AI models.

In addition, Allora provides two SDKs for users who want to consume Allora inference data: the Allora Network Python and TypeScript SDKs. These SDKs give users an environment in which they can easily integrate and use the data Allora provides.

Allora's goal is to become an intermediate layer connecting AI models with protocols that need inference data, giving AI model operators opportunities to generate revenue while building an unbiased data infrastructure for services and protocols.

Next, we will discuss Allora’s communication protocol architecture to gain a closer understanding of Allora’s operating methods and characteristics.

2.1. Communication protocol architecture

In Allora, anyone can create and deploy a specific topic, and four participants are involved in performing inferences and producing the final inference for a topic:

  • Consumers: Pay fees to request inferences on specific topics.

  • Workers: Operate AI models using their own data and perform inferences on the topics consumers request.

  • Reputers: Evaluate workers' inferences by comparing them with the actual outcomes.

  • Validators: Operate Allora network nodes to process and record transactions generated by each participant.

The Allora network is thus structured around inference executors, evaluators, and validators, centered on the network token $ALLO. $ALLO serves as the fee for inference requests and the reward for performing inference, connecting network participants while securing the network through staking.


Next, we will review the interactions between various participants in detail based on the functions of each Layer, including the inference consumption layer, the inference synthesis layer, and the consensus layer.

2.1.1. Inference consumption layer

The inference consumption layer handles interactions between protocol participants and Allora, including topic creation, topic participant management, and inference requests.

Users who want to create topics interact with Allora's Topic Coordinator, paying a certain amount of $ALLO and defining rules for what is to be inferred, how actual outcomes are verified, and how the inferences workers produce are evaluated.

Once a topic is established, workers and Reputers can pay a registration fee in $ALLO to register as inference participants for the topic. Reputers must additionally stake $ALLO on the topic, exposing themselves to slashing if they submit malicious results.

Once a topic is created and workers and Reputers have registered, consumers can pay $ALLO to the topic to request inferences, and workers and Reputers receive these request fees as rewards for performing inference.
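The topic lifecycle above can be sketched as follows. The class, method names, and fee amounts are hypothetical and only model the flow of $ALLO described in this section, not Allora's actual API:

```python
# Minimal sketch of the topic lifecycle (hypothetical names/amounts):
# create a topic, register participants for a fee, stake as a Reputer,
# and fund an inference request whose fee later rewards participants.

class Topic:
    def __init__(self, rules: str, creation_fee: float):
        self.rules = rules
        self.fees_collected = creation_fee       # $ALLO paid by the creator
        self.workers: set[str] = set()
        self.reputer_stake: dict[str, float] = {}
        self.reward_pool = 0.0

    def register_worker(self, who: str, fee: float):
        self.workers.add(who)
        self.fees_collected += fee

    def register_reputer(self, who: str, fee: float, stake: float):
        # The stake is at risk of slashing for malicious evaluations.
        self.reputer_stake[who] = stake
        self.fees_collected += fee

    def request_inference(self, fee: float):
        # Consumer fees fund rewards for workers and Reputers.
        self.reward_pool += fee

topic = Topic("BTC price in 1 hour", creation_fee=10.0)
topic.register_worker("worker-a", fee=1.0)
topic.register_reputer("reputer-x", fee=1.0, stake=50.0)
topic.request_inference(fee=2.0)
print(topic.reward_pool)   # fees available to reward workers and Reputers
```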

2.1.2. Inference and synthesis layer

The inference and synthesis layer is the core layer where Allora generates decentralized inferences. Here, workers perform inferences, Reputers evaluate performance, and inferences are weighted and synthesized based on those evaluations.

Workers in the Allora network not only submit inferences for the topics consumers request, but also evaluate the accuracy of other workers' inferences and derive "predicted losses" from those assessments. These predicted losses feed into the weight calculation used for inference synthesis. When a worker's own inference is accurate and it also accurately predicts the accuracy of other workers' inferences, that worker receives a higher reward. Through this structure, Allora can derive synthesis weights that account for varying conditions, rather than relying only on workers' past performance.


Workers' inference accuracy prediction for context awareness

Source: Allora Docs

For example, in a topic predicting the price of Bitcoin one hour ahead, assume workers A and B behave as follows:

  • Worker A: Average inference accuracy is high, around 90%, but accuracy decreases in unstable markets.

  • Worker B: Average inference accuracy is 80%, but relatively high accuracy is maintained despite market fluctuations.

If the market is currently highly volatile, and multiple workers predict that Worker B, which holds the advantage under volatile conditions, will err by only about 5% while Worker A is expected to err by about 15%, then Allora will give Worker B's inference the higher weight for this prediction despite its lower average historical performance.

The Topic Coordinator synthesizes the final inference using the weights derived through this process and delivers the final inference value to the consumer. Confidence intervals are also calculated and provided based on the distribution of inference values the workers submitted. Afterwards, Reputers compare the actual outcome with the final inference to evaluate each worker's inference performance and its accuracy in predicting other workers' inferences, and worker weights are adjusted through stake-weighted consensus.
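A minimal sketch of this synthesis step, using the 5% / 15% predicted errors from the Bitcoin example above. The softmax-style weighting, the temperature, and the interval calculation are illustrative assumptions, not Allora's exact specification:

```python
# Hedged sketch: turn each worker's predicted loss into a weight
# (lower predicted loss -> higher weight), synthesize a final
# inference, and report the spread of submissions as a rough interval.
import math

def weights_from_losses(losses: dict[str, float], temperature: float = 0.05):
    """Softmax over negative predicted losses."""
    exps = {w: math.exp(-l / temperature) for w, l in losses.items()}
    total = sum(exps.values())
    return {w: e / total for w, e in exps.items()}

btc_inferences = {"A": 97_000.0, "B": 98_500.0}
predicted_losses = {"A": 0.15, "B": 0.05}   # expected relative error

w = weights_from_losses(predicted_losses)
final = sum(btc_inferences[k] * w[k] for k in w)
lo, hi = min(btc_inferences.values()), max(btc_inferences.values())
print(f"weights = {w}")
print(f"final inference ≈ {final:.0f}, interval [{lo:.0f}, {hi:.0f}]")
# Worker B dominates this round despite A's better average track record.
```

The key design point mirrored here is that weights come from *predicted* losses for the current conditions, not historical averages, which is what lets Worker B outweigh Worker A in the volatile-market example.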

Through this method of synthesizing and evaluating inferences, and in particular its "context-aware" structure in which each worker evaluates the inference accuracy of the others, Allora can derive optimal inferences for a variety of conditions and improve inference accuracy. Moreover, as data on workers' inference performance accumulates, the context-awareness function operates more efficiently, allowing Allora's inference capability to improve itself more effectively.


Allora's inference synthesis process

Source: Allora Docs

Allora's consensus layer is where topic weights are calculated, network rewards are distributed, and participant activity is recorded. It is built with the Cosmos SDK, using CometBFT and a DPoS consensus mechanism.

Users can participate in the Allora network as validators by staking $ALLO tokens and operating nodes, collecting the transaction fees submitted by Allora participants as compensation for operating and securing the network. Users who do not operate a node can still earn a share of these rewards indirectly by delegating their $ALLO to a validator.

A further feature of Allora is how it distributes $ALLO rewards to network participants: 75% of newly unlocked $ALLO is distributed to the workers and Reputers participating in topic inference, and the remaining 25% goes to validators. Emissions follow a structure in which the amount unlocked is gradually halved, and once all $ALLO has been released these inflation rewards stop.

When the 75% share of inflation rewards is distributed to workers and Reputers, the distribution depends not only on worker performance and Reputer stake but also on topic weight. Topic weights are calculated from the stake of the Reputers participating in a topic and its fee revenue, motivating workers and Reputers to keep participating in topics with high demand and stability.
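The emission split above can be sketched as follows. The topic-weight formula (a geometric mean of Reputer stake and fee revenue) is a hypothetical stand-in for Allora's actual calculation; only the 75/25 split comes from the text:

```python
# Sketch of the reward distribution described above: 25% of each
# epoch's emission goes to validators, 75% is split across topics
# in proportion to a topic weight derived from Reputer stake and
# fee revenue. The weight formula itself is an assumption.

def topic_weight(reputer_stake: float, fee_revenue: float) -> float:
    # Hypothetical: geometric mean of stake and fee revenue.
    return (reputer_stake * fee_revenue) ** 0.5

def distribute(emission: float, topics: dict[str, tuple[float, float]]):
    validator_share = emission * 0.25
    inference_pool = emission * 0.75
    weights = {t: topic_weight(*sf) for t, sf in topics.items()}
    total = sum(weights.values())
    per_topic = {t: inference_pool * w / total for t, w in weights.items()}
    return validator_share, per_topic

# topic -> (Reputer stake, fee revenue), amounts in $ALLO
topics = {"btc-1h": (10_000.0, 400.0), "eth-apy": (2_500.0, 100.0)}
validators, per_topic = distribute(1_000.0, topics)
print(validators)   # 250.0 — the 25% validator share
print(per_topic)    # the high-demand topic earns the larger share
```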

3. From on-chain to every industry

3.1. The upcoming Allora main network

Allora established the Allora Foundation on January 10, 2025, and is accelerating the launch of its main network after completing a public test network with more than 300,000 participating workers. As of February 6, Allora has been running the Allora Model Forge Competition to select AI model creators for the upcoming main network.


Allora Model Forge Competition categories

Source: Allora Model Forge Competition

In addition, Allora has established partnerships with multiple projects ahead of the main network launch. Allora's main partnerships and the functions it provides are as follows:

  • Plume: Provides RWA prices, real-time APY, and risk forecasts on the Plume website.

  • Story Protocol: Provides IP value assessment and potential analysis, price information for illiquid on-chain assets, and Allora inferences for Story Protocol-based DeFi projects.

  • Monad: Provides price information for illiquid on-chain assets and Allora inferences for Monad-based DeFi projects.

  • 0xScope: Uses Allora's context-awareness function to support the development of the on-chain assistant AI Jarvis.

  • Virtuals Protocol: Enhances agent performance by integrating Allora inference with Virtuals Protocol's G.A.M.E framework.

  • Eliza OS (formerly ai16z): Enhances agent performance by integrating Allora inference with Eliza OS's Eliza framework.

Currently, Allora's partners are mainly AI and cryptocurrency projects, reflecting two key factors: 1) cryptocurrency projects' need for decentralized inference, and 2) AI models' need for the on-chain data required to perform inferences.


For its early main network launch, Allora is expected to allocate large inflation rewards to attract participants, and to keep those participants active it needs to maintain an appropriate value for $ALLO. However, as inflation rewards gradually diminish over time, the long-term challenge will be to generate sufficient on-chain transaction fees, by increasing demand for inference, to incentivize continued participation in the protocol.

Therefore, the key to assessing Allora's potential success lies in its short-term strategy for supporting $ALLO's value and its ability to grow inference demand to secure stable, long-term fee revenue.

4. Conclusion

With the advancement and growing practicality of AI technology, the adoption and implementation of AI inference are progressing actively across industries. However, the resource intensity of adopting AI is widening the competitive gap between large companies that have successfully introduced AI and smaller companies that have not. In this environment, demand for Allora's ability to provide topic-optimized inferences and self-improve accuracy through decentralization is expected to grow steadily.

Allora's goal is to become a decentralized inference infrastructure that can be widely adopted across industries, and achieving this requires demonstrating the effectiveness and sustainability of its functionality. To prove this, Allora needs to attract enough workers and Reputers after the main network launches and ensure these network participants receive sustainable rewards.

If Allora successfully overcomes these challenges and is adopted across industries, it will not only demonstrate the potential of blockchain as important AI infrastructure, but also serve as an important example of how AI and blockchain technology can be combined to deliver real value.
