X releases the source code of its "Phoenix" recommendation algorithm! In a data-feedback loop with Grok, Elon Musk chooses to keep the model weights confidential


Elon Musk rewrote the X recommendation algorithm into a Transformer architecture and open-sourced it, but did not release the model weights or training data.
(Background: Elon Musk's public campaign over OpenAI: the former "top supporter" angrily demanded 134 billion USD, but in the end sentiment lost out to business?)
(Additional context: The ultimate guide to earning money by writing articles on the X platform: Musk launches X Articles with doubled revenue, audience targeting, fact presentation, word reduction, subscription promotion…)

Table of Contents

  • From Hard Rules to AI Prediction: A Complete Shift
  • Recommendation Score Formula Reveals “Attention Economy”
  • Open-Source Scope and Black Box Boundaries

On January 20th, US time, Elon Musk uploaded the new code “Phoenix” for the X platform’s recommendation algorithm to GitHub. The documentation shows that the system has fully transitioned from manual feature engineering to an AI architecture centered on Transformers, but the model weights and training data have not been released together.

From Hard Rules to AI Prediction: A Complete Shift

Over the past decade, tweet ranking on X (formerly Twitter) mainly relied on engineer-defined “if… then…” rules, such as keywords, follower relationships, or dwell time. According to the current code structure disclosed by X, Phoenix has removed most manual features and instead uses a Transformer-based analysis, similar to xAI’s Grok, to interpret user behavior sequences.

Likes, shares, blocks, and browsing duration are treated as a continuous stream of events. The model predicts the next action as a probability distribution, which determines how widely and to whom content is exposed.
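The idea described above can be sketched as follows. This is a minimal toy illustration, not Phoenix itself: the action vocabulary, the logit values, and the softmax head are all assumptions, since the real model and its weights are not public.

```python
import math

# Illustrative action vocabulary — the real Phoenix vocabulary is not disclosed.
ACTIONS = ["like", "share", "block", "dwell"]

def softmax(logits):
    """Turn raw model scores into a probability distribution over next actions."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Pretend a transformer consumed a user's recent event sequence and
# emitted one logit per candidate next action (values are made up).
event_sequence = ["like", "dwell", "share"]   # recent behavior, newest last
logits = [2.0, 1.0, -1.0, 0.5]                # hypothetical model output

probs = dict(zip(ACTIONS, softmax(logits)))
print(probs)  # probabilities over the next action; values sum to 1
```

The point is only the shape of the pipeline: behavior sequence in, probability distribution over the next action out; those probabilities then feed the scoring step described below.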

Recommendation Score Formula Reveals “Attention Economy”

In the repository documentation, the core calculation logic is simplified as:

Score = Σ (Probability × Weight)

This indicates that the system estimates the probability of various behaviors triggered by a single tweet, then multiplies by platform-assigned weights.

For example, if the like probability is 60% and the block probability is 5%, and the platform assigns positive weight to “like” and negative weight to “block,” the final score will directly influence whether the tweet enters the recommendation flow. The document notes that dwell time can even be quantified to the second, meaning content creators are more algorithmically encouraged to “keep users engaged.” The specific weights for each behavior are not disclosed in the codebase.
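The worked example above can be written out directly. The 60% like and 5% block probabilities come from the article; the weight magnitudes are illustrative assumptions, since X does not disclose the real per-behavior weights.

```python
# Sketch of the published formula: Score = Σ (Probability × Weight).
# Probabilities are from the article's example; weights are assumed
# (positive for "like", negative for "block", magnitudes invented).
predicted_probs = {"like": 0.60, "block": 0.05}
weights = {"like": 1.0, "block": -4.0}

score = sum(predicted_probs[a] * weights[a] for a in predicted_probs)
print(score)  # 0.60*1.0 + 0.05*(-4.0) = 0.40
```

Under these assumed weights the negative "block" term drags the score down even at low probability, which is how a single formula can encode both engagement rewards and behavior penalties.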

Open-Source Scope and Black Box Boundaries

Although the code is available for review, the actual model parameters and complete training data have not been made public. Market analysis suggests that compared to TikTok or Meta’s fully closed systems, Phoenix at least provides the computation process; but without weights, external developers cannot verify the recommendation effectiveness or reproduce the model.

This echoes the partial disclosure seen when X first open-sourced its algorithm in 2023. Musk has answered transparency concerns raised on Slashdot and elsewhere with this kind of "demonstrative open-source," while keeping the core business moat intact.

Phoenix also symbolizes the integration of the X and xAI tech stacks. The X platform feeds Grok with vast amounts of real-time interaction data, and Grok in turn takes a dominant role in traffic allocation, forming a closed loop.
