Decentralized AI Research: Concepts, Architecture, and Industry Review

Introduction

In the digital age, artificial intelligence (AI) has become a key force driving technological innovation and social progress. The development of AI is not only a technological advancement but also an extension of human intelligence, and AI has been a hot topic in venture capital and the capital markets in recent years.

With the development of blockchain technology, Decentralized AI has emerged. This article explains its definition and architecture, and how it intersects with the artificial intelligence industry.

Definition and Architecture of Decentralized AI

Decentralized AI uses decentralized computing resources and data storage to train and serve AI models in a distributed manner, enhancing privacy and security. Its architecture consists of four layers:

• **Model Layer:** Supports decentralized AI model development, sharing, and trading, promoting collaboration and innovation on a global scale. Representative projects in this layer include Bittensor, which uses blockchain technology to create a global platform for AI model sharing and collaboration.

• **Training Layer:** Uses smart contracts and decentralized technology to reduce the cost of AI model training and simplify the process, improving training efficiency. The challenge at this layer is how to effectively use distributed computing resources for efficient model training.

• **Data Layer:** Uses blockchain technology to store and manage data, ensuring its security and immutability while giving users full control over their data. Applications at this layer include decentralized data marketplaces, which use blockchain to enable transparent data transactions and ownership confirmation.

• **Computing Power Layer:** Provides decentralized GPU computing platforms and bandwidth support, supplying distributed computing resources for efficient training and inference of AI models. Technological advances at this layer, such as edge computing and distributed GPU networks, offer new options for model training and inference.

Decentralized AI Representative Projects

Decentralized AI Industry Overview: Model Layer

Model Layer: The parameter counts of large models are growing exponentially and model performance is improving significantly, but the returns from further scaling are gradually diminishing. This trend requires us to rethink the direction of AI model development and how to reduce cost and resource consumption while maintaining performance.

The development of large AI models follows the ‘scaling law’, which describes a predictable relationship between model performance and parameter count, dataset size, and computational resources.

When a model grows past a certain scale, its performance on specific tasks can improve suddenly and significantly. However, as parameter counts continue to increase, the marginal gains in performance shrink. Balancing parameter scale against model performance will be a key factor in future development.
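To make the notion of diminishing returns concrete, a commonly cited Chinchilla-style parameterization of the scaling law (an illustrative form, not one given in this article) models loss as a function of parameter count $N$ and training tokens $D$:

$$
L(N, D) = E + \frac{A}{N^{\alpha}} + \frac{B}{D^{\beta}}
$$

Here $E$ is the irreducible loss and $A$, $B$, $\alpha$, $\beta$ are fitted constants; because the power-law terms flatten as $N$ and $D$ grow, each additional order of magnitude of parameters or data buys a smaller improvement.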

We have seen API price competition for large AI models intensify, with many vendors cutting prices to gain market share. However, as the performance of large models becomes increasingly homogeneous, the sustainability of API revenue is being questioned. Maintaining high user stickiness and growing revenue will be a major challenge in the future.

On-device deployment of models will be achieved by reducing numerical precision and adopting Mixture-of-Experts (MoE) architectures. Model quantization can compress 32-bit floating-point weights into 8 bits, significantly reducing model size and memory consumption. In this way, models can run efficiently on end-user devices, driving the further popularization of AI technology.
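As a minimal sketch of the quantization idea above (symmetric linear INT8 quantization; the function names and the NumPy-based setup are illustrative assumptions, not code from any project mentioned here):

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Symmetric linear quantization: map float32 weights to int8 plus one scale."""
    scale = np.max(np.abs(weights)) / 127.0          # largest magnitude maps to +/-127
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize_int8(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover an approximation of the original float32 weights."""
    return q.astype(np.float32) * scale

w = np.random.randn(4096, 4096).astype(np.float32)  # toy weight matrix
q, scale = quantize_int8(w)
w_hat = dequantize_int8(q, scale)

print(f"float32 size: {w.nbytes / 1e6:.1f} MB, int8 size: {q.nbytes / 1e6:.1f} MB")
print(f"max abs error: {np.max(np.abs(w - w_hat)):.5f}")
```

Storage drops roughly 4x (8 bits instead of 32 per weight) at the cost of a small, bounded rounding error, which is what makes on-device inference practical.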

Summary: Blockchain helps improve transparency, collaboration, and user participation at the model layer of AI.

Decentralized AI Industry Overview: Training Layer

Training Layer: Large-model training requires high-bandwidth, low-latency communication, yet decentralized computing networks still have the potential to take on large-model training. The challenge at this layer is how to optimize the allocation of communication and computing resources for more efficient model training.

Decentralized computing power networks have great potential for large-scale model training. Despite the challenge of heavy communication overhead, training efficiency can be significantly improved by optimizing scheduling algorithms and compressing transmitted data. However, overcoming network latency and data-transmission bottlenecks in real-world environments remains a major challenge for decentralized training.

To address the bottlenecks of large-model training on decentralized computing power networks, techniques such as data compression, scheduling optimization, and local updates with periodic synchronization can be employed. These methods reduce communication overhead and improve training efficiency, making decentralized computing power networks a viable option for large-model training.
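A minimal sketch of the "local updates with periodic synchronization" idea (often called local SGD): each worker trains on its own data shard, and parameters are averaged only every few steps, cutting communication roughly by that factor. The setup below (NumPy, a toy linear-regression problem, and all function names) is an illustrative assumption, not code from any project discussed here.

```python
import numpy as np

def local_sgd(shards, w0, lr=0.1, steps=100, sync_every=10):
    """Each worker runs SGD on its own shard; parameters are averaged
    only every `sync_every` steps instead of after every step."""
    workers = [w0.copy() for _ in shards]
    for t in range(1, steps + 1):
        for k, (X, y) in enumerate(shards):
            grad = X.T @ (X @ workers[k] - y) / len(y)   # least-squares gradient
            workers[k] -= lr * grad                      # purely local update
        if t % sync_every == 0:                          # infrequent communication round
            avg = np.mean(workers, axis=0)
            workers = [avg.copy() for _ in workers]
    return np.mean(workers, axis=0)

# Toy setup: 4 workers, each holding a shard of the same linear-regression problem.
rng = np.random.default_rng(0)
true_w = rng.normal(size=8)
shards = []
for _ in range(4):
    X = rng.normal(size=(256, 8))
    shards.append((X, X @ true_w + 0.01 * rng.normal(size=256)))

w = local_sgd(shards, w0=np.zeros(8))
print("recovery error:", np.linalg.norm(w - true_w))
```

Compared with synchronizing after every step, communication volume drops by roughly `sync_every` times, at the price of some drift between workers between synchronization rounds.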

Zero-Knowledge Machine Learning (zkML) combines zero-knowledge proofs and machine learning technologies, allowing model verification and inference without exposing training data and model details. This technology is particularly suitable for industries with high confidentiality requirements for data, such as medical and financial sectors, ensuring data privacy while verifying the accuracy and reliability of AI models.
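To make the zkML workflow concrete, here is a toy sketch of the property a proof is meant to establish: the model owner publishes a commitment to its weights plus a claimed inference result, and a verifier checks that the two are consistent. In a real zkML system, a succinct zero-knowledge proof replaces the naive "reveal and recompute" check below, so the weights and data never leave the prover; every function here is an illustrative assumption, not the API of any zkML library.

```python
import hashlib
import json

def commit(model_weights: list[float]) -> str:
    """Publish a binding commitment to the model without revealing the weights."""
    return hashlib.sha256(json.dumps(model_weights).encode()).hexdigest()

def infer(model_weights: list[float], x: list[float]) -> float:
    """The private computation: a dot product standing in for a real model."""
    return sum(w * xi for w, xi in zip(model_weights, x))

# Prover side: holds private weights, publishes only a commitment and a claimed output.
weights = [0.3, -1.2, 0.7]
commitment = commit(weights)
claimed_output = infer(weights, [1.0, 2.0, 3.0])

# Verifier side: in real zkML, a succinct proof shows that *some* weights matching
# `commitment` produce `claimed_output` on this input, WITHOUT revealing them.
# Here we cheat and reveal the weights, only to show what the proof must establish.
def naive_verify(revealed_weights, commitment, x, claimed_output) -> bool:
    return (commit(revealed_weights) == commitment
            and abs(infer(revealed_weights, x) - claimed_output) < 1e-9)

print(naive_verify(weights, commitment, [1.0, 2.0, 3.0], claimed_output))  # True
```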

Decentralized AI Industry Overview: Data Layer

Data privacy and security have become key issues in the development of AI. Decentralized data storage and processing technologies offer new ways to address these problems.

Data storage, data indexing, and data application are all key components in keeping a decentralized AI system running. Decentralized storage platforms such as Filecoin and Arweave provide new solutions for data security and privacy protection while reducing storage costs.

Decentralized Storage Cases:

  • Arweave’s data storage scale has grown rapidly since 2020, driven primarily by demand from the NFT marketplace and Web3 applications. Through Arweave, users can achieve decentralized permanent data storage, solving the problem of long-term data storage.
  • The AO project further enhances the Arweave ecosystem, providing users with more powerful computing capabilities and a wider range of application scenarios.

Comparing the two decentralized storage projects Arweave and Filecoin: Arweave enables permanent storage with a one-time payment, while Filecoin uses a recurring monthly payment model focused on flexible storage services. Each has its own advantages in technical architecture, business scale, and market positioning, and users can choose the solution that fits their specific needs.
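A back-of-the-envelope way to compare the two payment models, using entirely hypothetical prices (actual Arweave endowment pricing and Filecoin deal pricing are market-driven and not quoted in this article): compare a one-time payment against the present value of recurring payments over a chosen horizon.

```python
def one_time_cost(price_per_gib_once: float, gib: float) -> float:
    """Pay once, store indefinitely (Arweave-style endowment model)."""
    return price_per_gib_once * gib

def recurring_cost(price_per_gib_month: float, gib: float,
                   months: int, monthly_discount_rate: float = 0.0) -> float:
    """Present value of paying every month over a fixed horizon (Filecoin-style deals)."""
    return sum(price_per_gib_month * gib / (1 + monthly_discount_rate) ** m
               for m in range(1, months + 1))

gib = 100  # hypothetical dataset size
print("one-time :", one_time_cost(5.00, gib))                                   # e.g. $5/GiB once
print("10 years :", recurring_cost(0.02, gib, months=120, monthly_discount_rate=0.003))
```

Which model is cheaper depends on how long the data must persist and on the discount rate applied: permanent storage front-loads the cost, while pay-as-you-go keeps it flexible.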

Decentralized AI Industry Overview: Computing Power Layer

Computing Power Layer: As the complexity of AI models increases, the demand for computing resources also grows. The emergence of decentralized computing power networks provides a new way of allocating resources for the training and inference of AI models.

Decentralized computing networks (including specialized networks for training and inference) are currently the most active and fastest-growing area of the DeAI field. This mirrors the real world, where infrastructure providers capture much of the value in the AI industry chain. With the ongoing shortage of computing resources such as GPUs, manufacturers of computing hardware are entering this field one after another.

Aethir Case:

Business Model: A bilateral market for computing power leasing

The decentralized computing marketplace essentially uses Web3 technology to extend the concept of grid computing into an economically incentivized, trustless environment. By incentivizing resource providers (owners of idle CPUs and GPUs) to contribute computing power to the network, a decentralized computing power service market of meaningful scale can take shape; it also connects demanders of computing resources (such as model providers), offering them computing services at lower cost and with greater flexibility. The decentralized computing power market is also a challenge to the centralized, quasi-monopolistic cloud service providers (a minimal matching sketch follows the list below).

  • The decentralized computing power market can be further divided into general-purpose and dedicated networks. General-purpose computing networks operate like decentralized clouds, providing computing resources for a wide range of applications. Dedicated computing networks are built for particular workloads and tailored to specific use cases. For example, Render Network is a dedicated computing network focused on rendering workloads; Gensyn is a dedicated computing network focused on ML model training; and io.net is an example of a general-purpose computing network.
  • For DeAI, one important challenge in training models on decentralized infrastructure is high latency, caused by the limits of large-scale computing power and bandwidth and by the use of heterogeneous hardware from different vendors around the world. A dedicated AI computing network can therefore provide more AI-adapted functionality than a general-purpose one. Currently, centralized training of ML models remains the most efficient and stable approach, but it requires significant capital strength from project teams.
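
As promised above, here is a toy sketch of how a bilateral (two-sided) compute-leasing market might match supply and demand: providers post idle GPU-hours with an asking price, demanders post bids, and orders are filled greedily whenever a bid meets an ask. The data structures and matching rule are illustrative assumptions, not the actual mechanism of Aethir, io.net, or any other project named here.

```python
from dataclasses import dataclass

@dataclass
class Ask:            # a provider offering idle GPU-hours
    provider: str
    gpu_hours: int
    price: float      # asking price per GPU-hour

@dataclass
class Bid:            # a model developer requesting compute
    demander: str
    gpu_hours: int
    price: float      # maximum price per GPU-hour

def match(asks: list[Ask], bids: list[Bid]):
    """Greedy double-auction-style matching: cheapest asks are filled against highest bids."""
    asks = sorted(asks, key=lambda a: a.price)
    bids = sorted(bids, key=lambda b: -b.price)
    fills = []
    for bid in bids:
        need = bid.gpu_hours
        for ask in asks:
            if need == 0 or ask.price > bid.price:
                break
            take = min(need, ask.gpu_hours)
            if take > 0:
                fills.append((bid.demander, ask.provider, take, ask.price))
                ask.gpu_hours -= take
                need -= take
    return fills

asks = [Ask("node-A", 100, 0.40), Ask("node-B", 50, 0.55)]
bids = [Bid("model-X", 120, 0.60), Bid("model-Y", 30, 0.45)]
for demander, provider, hours, price in match(asks, bids):
    print(f"{demander} rents {hours} GPU-hours from {provider} at ${price:.2f}/h")
```

What distinguishes a Web3 version of this market from a conventional cloud brokerage is on-chain settlement, verification that the rented compute was actually delivered, and token incentives for providers to keep contributing idle hardware.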

Epilogue

As an emerging technology trend, decentralized AI is gradually demonstrating its advantages in data privacy, security, and cost-effectiveness. In the next article, we will explore the risks and challenges faced by decentralized AI, as well as future development directions.
