Original Author: PonderingDurian, Researcher at Delphi Digital
Original compilation: Pzai, Foresight News
Given that cryptocurrency is essentially open-source software with built-in economic incentives, and AI is upending how software is written, AI will have a huge impact on the entire blockchain space.
AI x Crypto Overall Stack
DeAI: Opportunities and Challenges
In my opinion, the biggest challenge facing DeAI lies at the infrastructure layer, because building foundation models is extremely capital-intensive and the returns to scale in data and compute are high.
Given scaling laws, tech giants have a natural advantage: during the Web2 era they earned monopoly profits by aggregating consumer demand, reinvested those profits in cloud infrastructure through a decade of artificially low rates, and are now trying to capture the AI market by locking up its key inputs - data and compute:
Comparison of large-model training token volumes
Due to the capital intensity and high bandwidth requirements of large-scale training, unified superclusters remain the best option - giving tech giants the best closed-source models - which they plan to rent out at monopoly margins, reinvesting the proceeds into each subsequent generation.
However, the moats in AI are proving shallower than Web2 network effects, and frontier models depreciate quickly relative to the field - especially with Meta open-sourcing advanced models such as Llama 3.1, which reached SOTA on a multi-billion-dollar investment.
Llama 3 Model Rating
At this point, emerging research on low-latency distributed training methods, combined with the potential commoditization of (some) frontier models, may shift competition (at least partially) from hardware superclusters (favoring tech giants) to software innovation (slightly favoring open source / crypto) as the price of intelligence falls.
Capability index (quality) vs. training price
Given the computational efficiency of mixture-of-experts architectures and of model synthesis/routing, we are likely heading not toward a world of 3-5 giant models, but toward a world of millions of models with different cost/performance trade-offs: an intertwined network of intelligence - a hive.
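To see where that computational efficiency comes from, here is a minimal, purely illustrative sketch of mixture-of-experts routing (not any real framework's API): only the top-k experts run per input, so compute cost scales with k rather than with the total number of experts - which is what makes a "hive" of many specialized models economical.

```python
# Toy mixture-of-experts sketch: a gate scores all experts, but only the
# top-k actually execute, so compute grows with k, not the expert count.

def top_k_experts(scores, k=2):
    """Indices of the k highest-scoring experts for one input."""
    ranked = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    return ranked[:k]

def moe_forward(x, experts, gate_scores, k=2):
    """Gate-weighted combination of the selected experts' outputs."""
    chosen = top_k_experts(gate_scores, k)
    total = sum(gate_scores[i] for i in chosen)
    return sum(gate_scores[i] / total * experts[i](x) for i in chosen)

# Four toy "experts", each a simple function of the input.
experts = [lambda x, m=m: m * x for m in (1.0, 2.0, 3.0, 4.0)]
out = moe_forward(10.0, experts, gate_scores=[0.1, 0.5, 0.1, 0.3], k=2)
# Only experts 1 and 3 run; experts 0 and 2 cost nothing.
```

The same top-k pattern generalizes from experts inside one model to routing across many independent models.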
This creates a huge coordination problem - one that blockchains and cryptocurrency incentive mechanisms should be well positioned to solve.
Core DeAI Investment Areas
Software is eating the world. AI is eating software. And AI is basically data and computation.
Delphi is optimistic about various components in this stack:
Simplified AI x Crypto Stack
Infrastructure
Given that AI's power comes from data and compute, DeAI infrastructure is dedicated to procuring both as efficiently as possible, often using cryptocurrency incentives. As mentioned earlier, this is the hardest part of the race, but given the size of the end market it may also be the most rewarding.
Compute
So far, distributed training protocols and GPU marketplaces have been constrained by latency, but they aim to coordinate latent heterogeneous hardware to offer lower-cost, on-demand compute to those priced out of the giants' integrated offerings. Projects such as Gensyn, Prime Intellect, and Neuromesh are pushing distributed training forward, while io.net, Akash, and Aethir are enabling low-cost inference closer to the edge.
Project landscape, mapped by aggregate supply positioning
Data
In a world of ubiquitous intelligence built on smaller, more specialized models, data assets are becoming ever more valuable and monetizable.
So far, DePIN (decentralized physical infrastructure networks) has been widely praised for building hardware networks at lower cost than capital-intensive incumbents such as telecom companies. But DePIN's largest potential market may emerge in collecting novel datasets that flow into on-chain intelligence systems: the agent protocols (discussed later).
In this world, labor - the largest market of all - is being replaced by data and compute. DeAI infrastructure offers non-technical people a way to own the means of production and contribute to the coming network economy.
Middleware
The ultimate goal of DeAI is effective composability. Like DeFi's money legos, DeAI compensates for today's lack of absolute performance with permissionless composability, incentivizing an open ecosystem of software and compute primitives that compounds over time and, hopefully, eventually surpasses the incumbents.
If Google represents the extreme of "integration", DeAI represents the extreme of "modularity". As Clayton Christensen observed, in emerging industries integrated approaches tend to lead by reducing friction in the value chain, but as the field matures, modular value chains gain ground through greater competition and cost efficiency at each layer of the stack.
Integrated vs Modular AI
We are very optimistic about several categories that are crucial for realizing this modular vision:
Router
In a world of fragmented intelligence, how do you choose the right model at the right time, at the best price? Demand-side aggregators have always captured value (see aggregation theory), and the routing function is critical for optimizing the Pareto curve between performance and cost in a networked world of intelligence.
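That Pareto optimization can be made concrete with a small sketch. This is a hypothetical router, not any specific protocol's logic; the model names and quality/cost numbers are made up for illustration. It keeps only the non-dominated models, then picks the cheapest one clearing a quality bar.

```python
# Hypothetical model router: filter candidates to the Pareto frontier
# (no other model is both better and cheaper), then choose the cheapest
# model that meets the caller's minimum quality requirement.

def pareto_frontier(models):
    """Models not dominated by any other (better quality AND lower cost)."""
    return [
        m for m in models
        if not any(
            (o["quality"] >= m["quality"] and o["cost"] < m["cost"])
            or (o["quality"] > m["quality"] and o["cost"] <= m["cost"])
            for o in models
        )
    ]

def route(models, min_quality):
    """Cheapest frontier model clearing the quality threshold, else None."""
    ok = [m for m in pareto_frontier(models) if m["quality"] >= min_quality]
    return min(ok, key=lambda m: m["cost"]) if ok else None

# Illustrative candidates (all numbers invented).
models = [
    {"name": "frontier-xl", "quality": 0.95, "cost": 10.0},
    {"name": "mid-v2",      "quality": 0.85, "cost": 2.0},
    {"name": "edge-small",  "quality": 0.70, "cost": 0.1},
    {"name": "old-large",   "quality": 0.80, "cost": 5.0},  # dominated by mid-v2
]
choice = route(models, min_quality=0.8)   # selects "mid-v2"
```

Real routers add context awareness and learned quality estimates, but the economic core - pick along the cost/performance frontier - is the same.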
Bittensor has led among first-generation products, but a number of dedicated competitors have emerged.
Allora runs competitions between models across different "topics", with "context awareness" and self-improvement over time, informing future predictions based on historical accuracy under specific conditions.
Morpheus aims to be the "demand-side router" for Web3 use cases - essentially an open-source local agent that understands a user's relevant context and routes queries efficiently through DeFi or the emerging "composable compute" infrastructure of Web3: an open "Apple Intelligence".
Agent interoperability protocols such as Theoriq and Autonolas aim to push modular routing to its limit, turning composable ecosystems of flexible agents and components into fully fledged on-chain services.
In short, in a world of rapidly fragmenting intelligence, supply- and demand-side aggregators will be extremely powerful. If Google became a $2 trillion company by indexing the world's information, then the winning demand-side router - whether Apple, Google, or a Web3 solution - the agent that compiles the index of intelligence, should reach even greater scale.
Coprocessor
Given their decentralization, blockchains are severely limited in data and compute. How do you bring the compute- and data-intensive AI applications users want on-chain? Through coprocessors!
Coprocessor in the Application Layer of Crypto
These coprocessors each offer different techniques to "verify" the underlying data or model being used - in effect, verified "oracles" that minimize new on-chain trust assumptions while greatly expanding capabilities. So far, projects have used zkML, opML, TeeML, and cryptoeconomic approaches, each with its own trade-offs:
Co-processor Comparison
At a higher level, coprocessors are crucial for making smart contracts intelligent - providing "data warehouse"-like solutions for querying toward more personalized on-chain experiences, or verifying that a given inference was performed correctly.
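To make the verification idea concrete, here is an illustrative sketch (not any specific protocol's implementation) of the optimistic pattern behind opML-style coprocessors: an operator posts a commitment to an inference result, and during a challenge window anyone can recompute and dispute it, so the chain only does heavy work when there is a disagreement.

```python
# Illustrative optimistic-verification sketch. commit() is what an operator
# would post on-chain; verify() is what a challenger runs off-chain to decide
# whether to dispute. All names are hypothetical.

import hashlib

def commit(model_id: str, input_data: str, output: str) -> str:
    """Hash commitment binding a model, an input, and a claimed output."""
    payload = f"{model_id}|{input_data}|{output}".encode()
    return hashlib.sha256(payload).hexdigest()

def verify(model_id, input_data, claimed_output, commitment, run_model) -> bool:
    """Challenger: check the commitment, then recompute the inference."""
    if commit(model_id, input_data, claimed_output) != commitment:
        return False  # claim does not match what was posted
    return run_model(model_id, input_data) == claimed_output

# Toy deterministic "model": uppercases its input.
run = lambda mid, x: x.upper()
c = commit("toy-v1", "hello", "HELLO")
honest = verify("toy-v1", "hello", "HELLO", c, run)   # passes
cheat = verify("toy-v1", "hello", "HELLO!", c, run)   # fails: mismatch
```

zkML replaces the re-execution step with a succinct proof, and TEE approaches replace it with a hardware attestation; the on-chain interface - post a claim, accept it only if it survives verification - is structurally similar.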
TEE (Trusted Execution Environment) networks, such as SUPER, Phala, and Marlin, have become increasingly popular recently due to their practicality and ability to support large-scale applications.
Overall, coprocessors are crucial for bridging highly deterministic but low-performance blockchains with high-performance but probabilistic intelligence. Without them, AI would not arrive in this generation of blockchains.
Developer Incentives
One of the biggest problems in open-source AI development is the lack of incentive mechanisms to make it sustainable. AI development is highly capital-intensive, and the opportunity cost of both compute and AI knowledge work is very high. Without proper incentives to reward open-source contributions, the field will inevitably lose to hyper-capitalized supercomputers.
From Sentient to Pluralis, Sahara AI, and Mira, these projects aim to launch networks that enable decentralized networks of individuals to contribute to network intelligence while being properly incentivized for it.
With compensation built into the business model, open-source compounding should accelerate - giving developers and AI researchers a global alternative to big tech, one where they can expect to be rewarded generously for the value they create.
While achieving this is very difficult and the competition is becoming increasingly intense, the potential market here is huge.
GNN Model
Large language models identify patterns in vast text corpora and learn to predict the next word; graph neural networks (GNNs) process, analyze, and learn from graph-structured data. Since on-chain data largely consists of complex interactions between users and smart contracts - in other words, a graph - GNNs seem a natural fit for on-chain AI use cases.
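The graph intuition can be sketched in a few lines. This is a toy, pure-Python illustration - not a real GNN library - where wallets and contracts are nodes, transactions are edges, and one round of message passing blends each node's feature with its neighborhood, which is the basic operation GNN layers stack.

```python
# Toy message-passing step over an on-chain interaction graph.
# Nodes: addresses/contracts; edges: transactions between them.
# Each node's new feature is the mean of itself and its neighbors.

def message_pass(features, edges):
    """One aggregation round over an undirected interaction graph."""
    neighbors = {n: [] for n in features}
    for a, b in edges:
        neighbors[a].append(b)
        neighbors[b].append(a)
    return {
        n: (features[n] + sum(features[m] for m in neighbors[n]))
           / (1 + len(neighbors[n]))
        for n in features
    }

# Two wallets interacting with one DeFi pool; feature = activity score.
feats = {"wallet_a": 1.0, "wallet_b": 3.0, "dex_pool": 5.0}
edges = [("wallet_a", "dex_pool"), ("wallet_b", "dex_pool")]
out = message_pass(feats, edges)
# wallet_a's score now reflects the pool it touched; stacking rounds
# propagates information across the whole transaction graph.
```

Real GNNs replace the mean with learned transformations, but the locality - a node is characterized by what it interacts with - is exactly what on-chain data provides.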
Projects like Pond and RPS are trying to build foundational models for Web3, which may be applied to use cases such as trading, DeFi, and even social:
AI finance: integration with existing DeFi applications, advanced yield strategies and liquidity utilization, better risk management/governance
On-chain marketing: more targeted airdrops and positioning, recommendation engines based on on-chain behavior
These models will lean heavily on data warehouse solutions such as Space and Time, Subsquid, Covalent, and Hyperline, which I am also very bullish on.
GNNs may prove essential to blockchain foundation models, with Web3 data warehouses as their indispensable complement, supplying OLAP (online analytical processing) capabilities to Web3.
Application
In my view, on-chain agents may be the key to solving crypto's notorious user-experience problems. But more importantly: over the past decade we have poured billions of dollars into Web3 infrastructure, yet demand-side utilization remains pitifully low.
Don’t worry, Agents are here…
AI test scores rising across dimensions of human capability
It also seems logical for these agents to leverage open, permissionless infrastructure - spanning payments and composable compute - to achieve more complex end goals. In the coming networked intelligence economy, economic flows may no longer be B -> B -> C but user -> agent -> compute network -> agent -> user. The end result of this flow is the agent protocol: application- or service-based businesses with limited overhead, running primarily on on-chain resources and meeting the needs of end users (or of each other) in a composable network at far lower cost than traditional enterprises. Just as Web2's application layer captured most of its value, I am a believer in the "fat agent protocol" thesis for DeAI: over time, value capture should shift up the stack.
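The user -> agent -> compute network -> agent -> user flow described above can be sketched as code. Everything here is hypothetical and deliberately naive; the point is structural - the "enterprise" in the middle is just a thin agent coordinating permissionless resources rather than owning them.

```python
# Sketch of an agent-protocol flow: the user talks to a thin agent, which
# buys inference from a permissionless compute market and assembles the
# answer. No data center, no payroll - just coordination logic.

def compute_network(task: str) -> str:
    """Stand-in for a permissionless compute market serving one task."""
    return f"result({task})"

def agent(user_request: str) -> str:
    """Thin agent: decompose the request, purchase compute, combine results."""
    subtasks = user_request.split(" and ")      # deliberately naive split
    results = [compute_network(t) for t in subtasks]
    return " + ".join(results)

answer = agent("summarize portfolio and rebalance yield")
# The agent's only assets are its routing logic and its user relationship,
# which is exactly why value capture migrates up the stack.
```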
Value Accumulation in Generative AI
The next Google, Facebook, or BlackRock may well be agent protocols, and the components to make that happen are emerging.
DeAI Endgame
AI will reshape our economy. Today, the market expects that value to be captured by a handful of large companies on the West Coast of North America. DeAI represents a different vision: an open, composable network of intelligence that recognizes and rewards even the smallest contributions, with more collective ownership and governance.
While some of DeAI's claims are overblown, and many projects trade well above what current traction justifies, the scale of the opportunity is considerable. For the patient and far-sighted, DeAI's endgame of truly composable compute may prove the justification for blockchains themselves.