
DeepSeek’s impact on Web3 AI upstream and downstream protocols

DeepSeek has burst the last bubble in the Agent track, DeFAI may give birth to new life, and the industry's financing methods are set to change.

Written by Kevin, Researcher at BlockBooster

TL;DR

  • The emergence of DeepSeek has shattered the computing power moat, and computing power optimization led by open source models has become a new direction;

  • DeepSeek benefits the model layer and the application layer upstream and downstream in the industry, but it has a negative impact on computing power protocols at the infrastructure layer;

  • DeepSeek's good news inadvertently burst the last bubble in the Agent track, and DeFAI is the most likely place for new life to emerge;

  • The zero-sum game of project financing is expected to come to an end, and a new financing method of community launches plus a small number of VCs may become the norm.

DeepSeek's impact will reverberate through the upstream and downstream of the AI industry this year. DeepSeek has enabled home consumer graphics cards to complete large-model training tasks that previously required large numbers of high-end GPUs. Computing power, the first moat around AI development, has begun to collapse. When algorithm efficiency improves at a rate of 68% per year while hardware performance climbs only linearly along Moore's Law, the entrenched valuation models of the past three years no longer apply. AI's next chapter will be opened by open source models.

Although Web3's AI protocols are completely different from Web2's, they will inevitably feel the impact of DeepSeek, which will create new use cases across the upstream and downstream of Web3 AI: the infrastructure layer, middleware layer, model layer, and application layer.

Sorting out the collaborative relationships between upstream and downstream protocols

By analyzing technical architecture, functional positioning, and actual use cases, I divide the ecosystem into an infrastructure layer, a middleware layer, a model layer, and an application layer, and sort out their dependencies:

Infrastructure layer

The infrastructure layer provides decentralized underlying resources (computing power, storage, L1), among which computing power protocols include Render, Akash, io.net, etc.; storage protocols include Arweave, Filecoin, Storj, etc.; L1s include NEAR, Olas, Fetch.ai, etc.

Computing power protocols support model training, inference, and framework operation; storage protocols store training data, model parameters, and on-chain interaction records; L1s optimize data transmission efficiency through dedicated nodes and reduce latency.

Middleware layer

The middleware layer is a bridge connecting the infrastructure and upper-level applications, providing development frameworks, data services, and privacy protection. Among them, data annotation protocols include Grass, Masa, Vana, etc.; development framework protocols include Eliza, ARC, Swarms, etc.; privacy computing protocols include Phala, etc.

The data service layer provides fuel for model training; the development frameworks rely on the computing power and storage of the infrastructure layer; and the privacy computing layer protects data security during training and inference.

Model layer

The model layer is used for model development, training, and distribution, and includes the open source model training platform Bittensor.

The model layer relies on the computing power of the infrastructure layer and the data of the middleware layer; models are deployed on-chain through development frameworks; and model marketplaces deliver training results to the application layer.

Application layer

The application layer consists of AI products for end users, among which Agent protocols include GOAT, AIXBT, etc.; DeFAI protocols include Griffain, Buzz, etc.

The application layer calls pre-trained models from the model layer, relies on privacy computing in the middleware layer, and, for complex applications, requires real-time computing power from the infrastructure layer. A minimal sketch of these dependencies follows.
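To make the layering concrete, here is a minimal, illustrative sketch in Python of the four-layer stack and the protocols named above. The layer map mirrors the article; the dependency-resolution helper and its names are my own simplification, not any protocol's actual API.

```python
# Illustrative only: a simplified map of the four-layer Web3 AI stack
# described above. Protocol lists mirror the article; the dependency
# resolution logic is a hypothetical sketch, not any real scheduler.

LAYERS = {
    "infrastructure": {
        "compute": ["Render", "Akash", "io.net"],
        "storage": ["Arweave", "Filecoin", "Storj"],
        "l1": ["NEAR", "Olas", "Fetch.ai"],
    },
    "middleware": {
        "data": ["Grass", "Masa", "Vana"],
        "framework": ["Eliza", "ARC", "Swarms"],
        "privacy": ["Phala"],
    },
    "model": {"training": ["Bittensor"]},
    "application": {"agent": ["GOAT", "AIXBT"], "defai": ["Griffain", "Buzz"]},
}

# Each layer depends on the layers below it, as described in the text.
DEPENDS_ON = {
    "application": ["model", "middleware", "infrastructure"],
    "model": ["middleware", "infrastructure"],
    "middleware": ["infrastructure"],
    "infrastructure": [],
}


def full_dependency_chain(layer: str) -> list[str]:
    """Return every layer a given layer transitively relies on."""
    chain, stack = [], list(DEPENDS_ON[layer])
    while stack:
        dep = stack.pop(0)
        if dep not in chain:
            chain.append(dep)
            stack.extend(DEPENDS_ON[dep])
    return chain


if __name__ == "__main__":
    # An application-layer DeFAI protocol ultimately touches every layer below it.
    print(full_dependency_chain("application"))  # ['model', 'middleware', 'infrastructure']
```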

DeepSeek may have a negative impact on decentralized computing power

According to a sample survey, about 70% of Web3 AI projects actually call OpenAI or centralized cloud platforms; only 15% use decentralized GPUs (such as Bittensor subnet models), and the remaining 15% use hybrid architectures (sensitive data is processed locally while common tasks go to the cloud).

The actual utilization of decentralized computing power protocols is far lower than expected and does not match their market value. There are three reasons for the low utilization: Web2 developers keep their original tool chains when migrating to Web3; decentralized GPU platforms have not yet achieved price advantages; and some projects evade data compliance reviews in the name of "decentralization" while their actual computing power still relies on centralized clouds.

AWS/GCP hold more than 90% of the AI computing power market share, while Akash's equivalent computing power is only 0.2% of AWS's. The moat of centralized cloud platforms includes cluster management, RDMA high-speed networking, and elastic scaling. Decentralized cloud platforms offer Web3-flavored versions of these technologies, but they have flaws that cannot be fixed: latency, since communication latency between distributed nodes is six times that of centralized clouds; and tool chain fragmentation, since PyTorch/TensorFlow do not natively support decentralized scheduling.

DeepSeek reduces computing power consumption by 50% through sparse training, and dynamic model pruning enables consumer-grade GPUs to train models with tens of billions of parameters. Short-term market demand expectations for high-end GPUs have been significantly lowered, and the market potential of edge computing has been revalued. As the figure above showed, before the emergence of DeepSeek, the vast majority of protocols and applications in the industry used platforms such as AWS, and only a very small number of use cases were deployed on decentralized GPU networks; those use cases valued the latter's consumer-grade price advantage and ignored the impact of latency.
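The pruning claim can be made concrete with a toy example. The sketch below uses PyTorch's built-in magnitude pruning to zero out half of a dense layer's weights; it is only an illustration of the general technique, not DeepSeek's actual sparse-training or pruning method, and the 50% ratio simply mirrors the cost-reduction figure cited above.

```python
# A toy illustration of magnitude-based weight pruning, the general family
# of techniques the article refers to. This is NOT DeepSeek's actual method;
# it only shows how pruning shrinks the number of active parameters.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

layer = nn.Linear(4096, 4096)          # a single dense layer as a stand-in
total = layer.weight.numel()

# Zero out the 50% of weights with the smallest absolute value
# (the 50% figure mirrors the cost reduction cited in the article).
prune.l1_unstructured(layer, name="weight", amount=0.5)

active = int(layer.weight.count_nonzero())
print(f"active weights: {active}/{total} ({active / total:.0%})")
# Sparse weights mean fewer effective multiply-adds, which is why
# consumer-grade GPUs can handle workloads that previously needed
# clusters of high-end cards.
```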

The situation for decentralized computing power may deteriorate further with the emergence of DeepSeek. DeepSeek has lifted the constraints on long-tail developers, and low-cost, efficient inference models will spread at an unprecedented rate. In fact, the centralized cloud platforms mentioned above and many countries have already begun deploying DeepSeek, and the sharp drop in inference costs will produce a large number of front-end applications with huge demand for consumer-grade GPUs. Facing this huge upcoming market, centralized cloud platforms will launch a new round of competition for users, competing not only with the leading platforms but also with countless small centralized clouds. The most direct way to compete is to cut prices. It is foreseeable that 4090 prices on centralized platforms will fall, which is a disaster for Web3 computing power platforms. When price is no longer the latter's only moat and the industry's computing power platforms are also forced to cut prices, the result is more than io.net, Render, and Akash can bear. A price war will destroy their remaining valuation ceiling, and the death spiral of falling revenue and user churn may force decentralized computing power protocols to pivot in a new direction.

The specific significance of DeepSeek for the industry's upstream and downstream protocols

[Figure: DeepSeek's impact on Web3 AI upstream and downstream protocols]

As shown in the figure, I think DeepSeek will have different impacts on the infrastructure layer, model layer and application layer. In terms of positive impacts:

  • The application layer will benefit from a significant reduction in inference costs; more applications can keep Agents online around the clock and complete tasks in real time at low cost;

  • At the same time, low-cost model overhead like DeepSeek's allows DeFAI protocols to form more complex swarms. With thousands of Agents working on a single use case, the division of labor for each Agent becomes fine-grained and clear, which greatly improves the user experience and prevents user input from being incorrectly decomposed and executed due to model errors (a minimal sketch of such a task pipeline appears after this list);

  • Developers at the application layer can fine-tune models and feed DeFi-related AI applications with price feeds, on-chain data and analytics, and protocol governance data without having to pay high license fees;

  • After the birth of DeepSeek, the open source model layer has proven its worth. Opening high-end models to long-tail developers can spark a broad wave of development;

  • The high wall of computing power built around high-end GPUs over the past three years has been completely broken down. Developers have more choices, and the direction for open source models has been set. In the future, AI models will compete on algorithms rather than computing power, and this shift in belief will become the cornerstone of confidence for open source model developers;

Subnets built specifically around DeepSeek will emerge one after another, model parameters will grow under the same computing power, and more developers will join the open source community.
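Below is a minimal, hypothetical sketch of the task-pipeline idea referenced in the swarm bullet above: one user instruction is decomposed into narrow tasks, each handled by a single-purpose Agent. Every class and function name here is invented for illustration and does not reflect the actual architecture of Griffain, Buzz, or any other DeFAI protocol.

```python
# A hypothetical sketch of the "swarm" division of labor described above:
# one user instruction is decomposed into a pipeline of narrow tasks,
# each handled by a single-purpose agent. All names are invented.
from dataclasses import dataclass
from typing import Callable


@dataclass
class Task:
    kind: str       # e.g. "fetch_price", "check_slippage", "submit_swap"
    payload: dict


class Agent:
    """A narrowly scoped agent that handles exactly one kind of task."""

    def __init__(self, kind: str, handler: Callable[[dict], dict]):
        self.kind = kind
        self.handler = handler

    def run(self, task: Task) -> dict:
        assert task.kind == self.kind, "each agent only sees its own task type"
        return self.handler(task.payload)


def decompose(instruction: str) -> list[Task]:
    """Stand-in for the model call that splits a user instruction into
    single-agent tasks (hard-coded here purely for illustration)."""
    return [
        Task("fetch_price", {"pair": "SOL/USDC"}),
        Task("check_slippage", {"max_bps": 30}),
        Task("submit_swap", {"amount": 10}),
    ]


if __name__ == "__main__":
    swarm = {
        "fetch_price": Agent("fetch_price", lambda p: {**p, "price": 100.0}),
        "check_slippage": Agent("check_slippage", lambda p: {**p, "ok": True}),
        "submit_swap": Agent("submit_swap", lambda p: {**p, "tx": "0xabc..."}),
    }
    for task in decompose("swap 10 SOL to USDC with low slippage"):
        print(task.kind, "->", swarm[task.kind].run(task))
```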

In terms of negative impact:

  • The inherent usage latency of computing power protocols at the infrastructure layer cannot be optimized away;

  • A hybrid network of A100s and 4090s places higher demands on coordination algorithms, which is not a strength of decentralized platforms.

DeepSeek bursts the last bubble in the Agent track, DeFAI may give birth to new life, and industry financing methods will change

Agents are the industry's last hope for AI. The emergence of DeepSeek has lifted the computing power constraint and sketched out expectations of an application explosion. This should have been a huge boon for the Agent track, but because of the strong correlation between the industry, U.S. stocks, and Federal Reserve policy, the remaining bubble burst and the track's market value fell to the bottom.

In the wave of integration between AI and the industry, technological breakthroughs and market games always go hand in hand. The chain reaction triggered by the shock to Nvidia's market value is like a demon-revealing mirror, reflecting the deep dilemmas of the industry's AI narrative: from on-chain Agents to DeFAI engines, beneath the seemingly complete ecological map lie weak technical infrastructure, hollow value logic, and a capital-dominated reality. The seemingly prosperous on-chain ecosystem hides problems: large numbers of high-FDV tokens compete for limited liquidity, stale assets survive on FOMO sentiment, and developers are trapped in PvP infighting that consumes their capacity to innovate. With incremental capital and user growth hitting a ceiling, the entire industry has fallen into the "innovator's dilemma": eager for breakthrough narratives yet unable to shake off path dependence. This state of tension provides a historic opportunity for AI Agents: not merely an upgrade of the technical toolbox, but a reconstruction of the value-creation paradigm.

Over the past year, more and more teams in the industry have found that the traditional financing model is failing: the playbook of giving VCs small allocations, maintaining tight control of the float, and waiting for a pump is no longer sustainable. Under the triple pressure of tighter VC purses, retail investors refusing to buy in, and high listing thresholds at major exchanges, a new playbook better suited to the bear market is emerging: partnering with head KOLs plus a small number of VCs, large-proportion community launches, and low-market-cap cold starts.

Innovators represented by Soon and Pump Fun are opening a new path through "community launches" backed by head KOL endorsements: distributing 40%-60% of tokens directly to the community, launching projects at valuations as low as $10 million FDV, and raising millions of dollars. This model builds consensus and FOMO through KOL influence, lets the team lock in earnings early, and trades high liquidity for market depth. Although it gives up the advantage of short-term control, the team can repurchase tokens at low prices during the bear market through a compliant market-making mechanism. In essence, this is a paradigm shift in the power structure: from a VC-led game of pass-the-parcel (institutions take allocations, project teams dump, retail pays the bill) to transparent pricing by community consensus, with project teams and communities forming a new symbiotic relationship around the liquidity premium. As the industry enters a cycle of transparency, projects clinging to the old logic of control may become relics of the era amid this transfer of power.

The short-term pain in the market only confirms the irreversibility of the long technological wave. When AI Agents reduce the cost of on-chain interaction by two orders of magnitude, and when adaptive models keep optimizing the capital efficiency of DeFi protocols, the industry may finally see the long-awaited mass adoption. This change does not rely on conceptual hype or capital force-feeding; it is rooted in technological penetration driven by real needs. Just as the electricity revolution was never derailed by the bankruptcy of light bulb companies, Agents will eventually become a real golden track after the bubble bursts. DeFAI may be the fertile ground for new life. When low-cost inference becomes an everyday commodity, we may soon see use cases in which hundreds of Agents are combined into a single swarm. With equivalent computing power, the significant increase in model parameters means Agents in the open source model era can be more fully fine-tuned; even complex user instructions can be decomposed into task pipelines that individual Agents can execute end to end. Each Agent's optimization of on-chain operations may increase overall DeFi protocol activity and liquidity. More complex DeFi products led by DeFAI will emerge, and this is where new opportunities arise after the last bubble bursts.
