Just realized something worth paying attention to if you're tracking where the real infrastructure shifts are happening. The quantum computing space had an unusually busy 2024 — and I mean genuinely busy, not the usual hype cycle. Three separate breakthroughs from different companies using completely different approaches all landed within months of each other. That's the kind of pattern that usually signals a field is actually moving forward instead of recycling the same narrative.
Let me break down what actually happened and why it matters for anyone watching the convergence of quantum tech and digital infrastructure.
Google dropped Willow in December: a 105-qubit superconducting processor fabricated at Google's Santa Barbara facility. The headline sounds standard until you understand what they actually demonstrated. As they added more qubits, the error rate went down instead of up. That's been the central problem for quantum systems for nearly 30 years: more qubits always meant more noise, more cascading errors, less reliability. Willow flipped that relationship. They called it "below-threshold" operation, the architectural proof point that scaling actually helps rather than hurts.
The benchmark they published alongside it got instant attention: random circuit sampling computation completed in under five minutes that would take classical supercomputers 10 septillion years. But here's the honest part — that's still a narrow test case. It proves certain computations are classically intractable on this chip. It doesn't mean Willow is running drug discovery or climate models yet. What it does show is that large-scale error-corrected quantum computing isn't theoretical anymore. It's an engineering path that works.
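The "below-threshold" idea has a simple classical analogy you can simulate yourself. This sketch uses a majority-vote repetition code, not Willow's surface code, and the 5% flip probability is an illustrative number I've chosen; the point is just that when the per-unit error rate sits below the code's threshold, adding redundancy drives the encoded error rate down instead of up.

```python
import random

def logical_error_rate(p, n, trials=100_000):
    """Monte Carlo estimate for a classical majority-vote repetition code:
    each of n redundant bits flips independently with probability p; the
    encoded value is lost only when more than half of them flip."""
    fails = 0
    for _ in range(trials):
        flips = sum(random.random() < p for _ in range(n))
        if flips > n // 2:
            fails += 1
    return fails / trials

random.seed(0)
# Below the threshold (here p = 0.05), more redundancy means fewer
# encoded errors -- the same qualitative behavior Willow demonstrated.
for n in (1, 3, 5, 7):
    print(f"n={n}: logical error ~= {logical_error_rate(0.05, n):.4f}")
```

Run it and the estimated error rate falls sharply with each added unit of redundancy; flip the probability above 0.5 and the trend reverses, which is what "above threshold" looks like.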
Meanwhile, Microsoft and Quantinuum had already moved the needle in April with something that got less press but more researcher attention. They demonstrated logical qubits with error rates 800 times lower than the physical qubits they were built from. This is the real dividing line in quantum progress. Physical qubits are noisy hardware units. Logical qubits combine multiple physical qubits to encode information redundantly so errors can be detected and corrected. The overhead always made it impractical. An 800x improvement changes that calculus completely.
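Rough back-of-envelope on what an 800x reduction buys. The 10⁻³ physical error rate below is my illustrative assumption (not a figure from the source), and the "depth until failure odds hit 50%" model treats errors as independent, which is a simplification; the point is that usable circuit depth scales roughly inversely with the error rate.

```python
import math

p_phys = 1e-3         # assumed physical error per operation (illustrative)
p_log = p_phys / 800  # the 800x logical improvement Microsoft/Quantinuum reported

# With independent errors, success probability after d operations is
# (1 - p)^d, which crosses 50% around d ~ ln(2) / p.
def depth_to_coin_flip(p):
    return math.log(2) / p

print(f"physical qubits: ~{depth_to_coin_flip(p_phys):,.0f} operations")
print(f"logical qubits:  ~{depth_to_coin_flip(p_log):,.0f} operations")
```

Hundreds of reliable operations versus hundreds of thousands: that's the difference between toy circuits and algorithms with real depth, and why the logical-qubit milestone drew more researcher attention than the press coverage suggested.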
Then Microsoft extended it further in November, working with Atom Computing to create and entangle 24 logical qubits using ultracold neutral ytterbium atoms. Different hardware architecture entirely from Google's approach. Single-qubit gate fidelities hit 99.963%. Two-qubit operations at 99.56%. By December, Quantinuum had pushed it to 50 entangled logical qubits. That's the kind of progress pattern that matters — multiple viable paths advancing simultaneously rather than the field betting everything on one approach.
IBM's contribution was quieter but equally significant if you're thinking about where practical quantum computing actually emerges. Heron R2 processor in November: 156 qubits, 2-qubit gate errors down to 8×10⁻⁴, execution of circuits with up to 5,000 two-qubit gates. Workloads that took 120+ hours were running in 2.4 hours. That's measured, reproducible progress — the kind that actually gets deployed to enterprise clients.
But the most technically significant IBM result was their new error correction code. Conventional quantum error correction requires roughly 3,000 physical qubits to encode a single reliable logical qubit. IBM's bivariate bicycle qLDPC code achieves comparable error suppression with only 288 qubits total. That's a 10x efficiency gain. Suddenly fault-tolerant quantum computing looks less like a distant goal and more like an engineering problem with a defined solution.
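The arithmetic behind that efficiency claim, using the article's own figures:

```python
conventional = 3000  # physical qubits per reliable logical qubit (article's figure)
qldpc = 288          # total qubits in IBM's bivariate bicycle code (article's figure)

gain = conventional / qldpc
print(f"efficiency gain: ~{gain:.1f}x")  # ~10.4x, i.e. the "10x" in the text
```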
Here's what made 2024 genuinely different: the field stopped progressing in one direction and started progressing in all directions simultaneously. Hardware improvements, error correction breakthroughs, logical qubit milestones, software efficiency, cryptographic standards. It shifted from theoretical physics toward engineering discipline.
On the cryptography side, and this is directly relevant for blockchain infrastructure, NIST formally published its post-quantum cryptography standards in August 2024: the ML-KEM key-encapsulation and ML-DSA signature algorithms (FIPS 203 and 204), designed to resist quantum attacks. This wasn't an academic exercise. It was the first concrete acknowledgment that quantum computers capable of breaking current encryption are no longer purely theoretical. Governments and enterprises need to start transitioning now. The deployment timeline from standard publication to widespread adoption typically runs a decade or more. NIST effectively started that clock in 2024.
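For readers who haven't met a KEM before, this is the three-operation interface that ML-KEM standardizes: generate a keypair, encapsulate a fresh shared secret against the public key, decapsulate it with the private key. To keep the sketch self-contained I've used an insecure hash-based stand-in for the lattice math; every function here is a toy of my own, not the real ML-KEM construction, and only the shape of the data flow is meant to be accurate.

```python
import secrets
import hashlib

# TOY, INSECURE stand-in -- illustrates the KEM interface shape only.
def keygen():
    sk = secrets.token_bytes(32)                   # private decapsulation key
    pk = hashlib.sha256(b"pk" + sk).digest()       # public encapsulation key
    return pk, sk

def encapsulate(pk):
    r = secrets.token_bytes(32)                    # sender's fresh randomness
    shared = hashlib.sha256(pk + r).digest()       # sender's copy of the secret
    return r, shared                               # r plays the "ciphertext" role

def decapsulate(sk, ct):
    pk = hashlib.sha256(b"pk" + sk).digest()
    return hashlib.sha256(pk + ct).digest()        # receiver derives same secret

pk, sk = keygen()
ct, k_sender = encapsulate(pk)
k_receiver = decapsulate(sk, ct)
assert k_sender == k_receiver  # both sides now share a symmetric key
```

In a post-quantum migration, this handshake replaces the RSA or elliptic-curve key exchange that Shor's algorithm would break, while the symmetric key it produces keeps protecting the session as before.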
For digital asset security specifically, this matters. Current asymmetric encryption protecting wallets, transactions, and smart contracts will eventually need quantum-resistant alternatives. We're not talking about immediate threat — but the infrastructure transition is now officially underway.
The honest assessment: quantum computing didn't "arrive" in 2024. Willow isn't running commercial applications yet. Logical qubits can detect errors, but full error correction is still being worked through. Neutral atom systems require sophisticated laser infrastructure that doesn't exist at scale. But what 2024 proved is more important than what it didn't. The year's breakthroughs established that large-scale error-corrected quantum systems are possible across multiple hardware approaches. The question shifted from "is this possible?" to "which approach scales fastest, and when do the applications justify the investment?"
Looking at the trajectory, Google's next milestone is achieving fault-tolerant operation beyond the benchmark demonstrations. Microsoft is targeting 50-100 entangled logical qubits in commercial deployments within a few years. IBM's Starling processor, projected for 2029, aims for 100 million gates across 200 error-corrected qubits. The consistent direction across all three: we're past the theoretical phase. The engineering phase is what matters now.
For anyone tracking how quantum computing and digital infrastructure converge, 2024 was the year the field moved from speculation to measurable progress. That's worth watching closely.