Quantum computing sparks a wave of IPOs, and Jensen Huang's ambitions can no longer be hidden

Written by: Miao Zheng

A few years ago, quantum mechanics was often treated as a punchline: when in doubt, blame quantum mechanics.

But now, the joke has turned into an IPO prospectus.

In the past few months, three quantum computing companies—Infleqtion, Xanadu, and Horizon Quantum—have gone public one after another, with several more lining up to enter Nasdaq.

A project that once belonged only to laboratories and science fiction movies has suddenly been pushed into the public market.

The question is, has quantum computing really reached the eve of commercial explosion?

I don’t think so.

The most interesting part of this wave of IPOs isn’t that it proves quantum computing has matured, but that it exposes the true state of the industry.

Although everyone calls it quantum computing, the technical approaches are wildly different.

Not only that, but when you comb through these companies' financial reports, you'll find that only a handful of general-purpose quantum computers have actually been sold. Instead, it is the peripheral products of quantum computing that sustain their operations.

Moreover, although this business is still in its early stages, Nvidia has already entered the scene.

As early as 2021, Nvidia used GPUs to help researchers simulate quantum circuits on classical computers.
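
To see why GPUs help here, consider what "simulating a quantum circuit" means: tracking all 2^n complex amplitudes of an n-qubit register and updating them for every gate. Below is a minimal sketch in plain NumPy, a stand-in for GPU libraries such as Nvidia's cuQuantum rather than their actual API:

```python
# Minimal statevector simulation sketch (NumPy stand-in for GPU libraries
# like Nvidia's cuQuantum). An n-qubit register is a vector of 2**n complex
# amplitudes, which is why classical simulation hits a wall fast: 30 qubits
# already need ~16 GB of memory at complex128 precision.
import numpy as np

n = 3                                    # number of qubits
state = np.zeros(2**n, dtype=complex)    # amplitudes for all 2**n basis states
state[0] = 1.0                           # start in |000>

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

def apply_single_qubit_gate(state, gate, target, n):
    """Apply a 2x2 gate to one qubit by reshaping the statevector."""
    psi = state.reshape([2] * n)           # one axis per qubit
    psi = np.moveaxis(psi, target, 0)      # bring target qubit's axis front
    psi = np.tensordot(gate, psi, axes=1)  # matrix-multiply along that axis
    psi = np.moveaxis(psi, 0, target)      # restore axis order
    return psi.reshape(-1)

state = apply_single_qubit_gate(state, H, 0, n)
print(np.round(state, 3))  # equal amplitudes on |000> and |100>
```

Every added qubit doubles both the memory and the work per gate, which is exactly the kind of massively parallel linear algebra GPUs are built for.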

Later, it invested in multiple quantum startups. At GTC 2025, Jensen Huang even announced the establishment of the NVIDIA Accelerated Quantum Research Center (NVAQC) in Boston.

However, Huang’s goal isn’t the quantum computer itself; he wants to turn Nvidia into the underlying gateway of the quantum era.

Just like in the AI era: Nvidia doesn't sell the models; it sells the computing power needed to train and run them.

Whether Nvidia can replicate this success remains uncertain for now. But before that, let’s first understand what the current state of quantum computing really looks like.

Technical Approaches

Although everyone calls it quantum computing, the technical routes are vastly different. There are four main approaches, each based on completely different physical principles.

Superconducting quantum computing is currently the fastest route to industrialization.

Big companies like IBM, Google, and Rigetti are all on this path.

Its principle is to use Josephson junctions to build artificial qubits. This requires extremely low temperatures, reaching near absolute zero.

A niche fact: the operating environment superconducting quantum computing needs is colder than outer space, whose background temperature is about 2.7 kelvin.

The advantage of superconducting quantum computing is that its fabrication is close to traditional semiconductor processes, giving it strong scalability; the drawbacks are short coherence times and high noise.

This route has the largest funding scale, but its reliance on refrigeration systems makes costs high—dilution refrigerators cost several million dollars each.

IBM’s “Goldeneye” dilution refrigerator costs over $800k, with annual electricity costs exceeding $100k.

Refrigerators for larger machines, like those supporting Rigetti’s 500-qubit systems, can cost over $2 million. Cooling accounts for over 90% of the total cost of a superconducting quantum computer.

Ion trap quantum computing is another approach.

Currently, IonQ and Quantinuum are working on this. They use charged ions as qubits, manipulated with lasers to perform quantum gates. This approach has the highest fidelity for quantum gates.

It’s like a big abacus: the charged ions are the beads, and each laser pulse is like moving a bead. High fidelity means more accurate operations and fewer errors.

IonQ announced in October 2025 that it had achieved 99.99% two-qubit gate fidelity, a world record. Quantinuum had already passed 99.9% in 2024. Ion traps also have the longest coherence times, from 0.2 seconds up to 600 seconds, far beyond the tens of microseconds typical of superconducting qubits.
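
Why fight for a fourth nine? Because gate errors compound with circuit depth. A crude back-of-envelope model, assuming independent errors and no error correction (real systems are messier):

```python
# Back-of-envelope: why the jump from 99.9% to 99.99% matters. If each
# two-qubit gate succeeds with fidelity f and errors compound independently
# (a crude model ignoring error correction and correlated noise), a circuit
# with g gates succeeds with probability roughly f**g.
for f in (0.999, 0.9999):
    for g in (1_000, 10_000):
        print(f"fidelity={f}, gates={g:>6}: success ~ {f**g:.3f}")

# fidelity=0.999,  gates=  1000: success ~ 0.368
# fidelity=0.999,  gates= 10000: success ~ 0.000
# fidelity=0.9999, gates=  1000: success ~ 0.905
# fidelity=0.9999, gates= 10000: success ~ 0.368
```

An extra nine buys roughly ten times the circuit depth at the same success rate, which is why these records are worth announcing.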

But the problem with ion traps is the difficulty in scaling the number of qubits.

The more ions there are, the harder they are to control. So, increasing qubit count isn’t just about adding more ions; it requires more complex control systems. This makes ion trap quantum computing prone to hitting a scalability ceiling.

Neutral atom quantum computing has emerged only in the past two years but is currently the hottest approach, with Infleqtion, Pasqal, and QuEra leading the way.

Its principle is to use optical lattices to trap arrays of neutral atoms, using focused laser beams—optical tweezers—to hold the atoms in place. Its biggest advantage is that it can easily reach thousands of qubits, with relatively long coherence times.

Infleqtion has already achieved an array of 1,600 physical qubits, setting a record. Its entanglement fidelity reaches 99.73%, the highest among neutral atom companies.

Infleqtion went public in February 2026. CEO Matthew Kinsella stated, “Neutral atoms are moving from scientific progress toward commercial relevance.”

Finally, there’s photonic quantum computing, which is perhaps the easiest to understand.

Xanadu, mentioned earlier, is pursuing this approach.

Its principle is to use photons as information carriers. Its biggest advantage is that it operates at room temperature, without vacuum or refrigeration systems, making it naturally suitable for integrating quantum communication and computing.

Xanadu became the first photonic quantum company to go public in March 2026. Its Aurora system claims to be the first modular, networked photonic quantum computer with real-time error correction, aiming to reach 500 logical qubits by 2029-2030.

Aurora consists of four independent server racks interconnected via fiber optics, containing 12 qubits, 35 photonic chips, and 13 kilometers of fiber. It operates at room temperature, with only the photon detectors needing cooling.

This is a natural advantage of photonic quantum computing.

However, the fidelity of photonic gates is far below that of superconducting and ion trap systems.

Photons don’t naturally interact with each other; two photons can pass through each other without disturbance. This makes implementing deterministic two-qubit gates very difficult. Additionally, photons experience loss during transmission, leading to information loss.
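
To put numbers on the loss problem: fiber attenuation is quoted in dB/km, and a photon's survival probability falls off exponentially with distance. An illustrative calculation (the 0.2 dB/km figure is typical of telecom fiber at 1550 nm, not a number from Xanadu):

```python
# Illustrative photon-loss arithmetic, not Xanadu's actual figures.
# Telecom fiber attenuates roughly 0.2 dB/km at 1550 nm; a photon's
# survival probability over a length L (km) is 10**(-alpha * L / 10).
alpha = 0.2  # attenuation in dB/km (typical value, assumed)

for km in (1, 13, 50):
    survival = 10 ** (-alpha * km / 10)
    print(f"{km:>3} km: {survival:.1%} of photons survive")

#   1 km: 95.5% of photons survive
#  13 km: 54.9% of photons survive
#  50 km: 10.0% of photons survive
```

Every lost photon is lost information, and the error correction needed to paper over that loss eats into the qubit budget.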

In other words, achieving the same computational power with photonic quantum computers is significantly more challenging than other approaches.

Which is more reliable? In terms of technological maturity, superconducting and ion trap are closest to commercialization, while neutral atoms and photonics are still in the “potential” stage.

But the immediate question is, which approach offers the best cost-performance ratio? That involves considering performance, costs, deployment, and other factors.

The essence of this wave of IPOs is that the capital market is being forced to vote on different technical routes for the first time. Investors are no longer satisfied with the grand narrative that “quantum computing is important”; they want to see costs and revenues.

Xanadu’s stock rose 15% on its first day but fell over 10% after hours. Horizon Quantum dropped 18% after hours. Infleqtion was valued at $1.8 billion at its February IPO and peaked at a market cap of $3.8 billion, but by April its market cap had fallen to around $800 million.

Nvidia’s Quantum Ambitions

When it comes to computing, Nvidia can’t be ignored.

Nvidia’s quantum strategy is very clear: it aims to replicate the success of CUDA by creating CUDA-Q, the quantum version of CUDA.

But before explaining, I need to introduce a concept—fault-tolerant quantum computing.

The qubits we discussed earlier are extremely fragile. Temperature fluctuations, vibrations, electromagnetic noise, photon loss, and even imperfect gate operations can all cause quantum states to decohere.

Fault-tolerant quantum computing adds a whole set of protective mechanisms on top of these fragile building blocks.

It combines many unreliable physical qubits into a more reliable “logical qubit.” Even if some physical qubits fail, the system can detect the errors, correct them, and keep computing.

It’s like handing the same message to 100 messengers: some will forget it or garble it, but enough will remember that the original can be recovered.
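
The classical version of this idea is a repetition code with majority voting. A minimal sketch (real quantum codes can't simply copy states because of the no-cloning theorem; they build the same redundancy out of entanglement and parity checks instead):

```python
# The classical intuition behind error correction: a 3-bit repetition code.
import random

def encode(bit):
    return [bit, bit, bit]          # one logical bit -> three physical bits

def noisy_channel(bits, p_flip=0.1):
    # Flip each physical bit independently with probability p_flip.
    return [b ^ (random.random() < p_flip) for b in bits]

def decode(bits):
    return int(sum(bits) >= 2)      # majority vote recovers the logical bit

random.seed(0)
trials = 100_000
errors = sum(decode(noisy_channel(encode(1))) != 1 for _ in range(trials))
# Theory: the logical bit is wrong only if 2+ physical bits flip, i.e.
# 3*p**2*(1-p) + p**3 = 0.028 for p = 0.1 -- well below the raw 10% rate.
print(f"logical error rate: {errors / trials:.3f}")
```

Three unreliable bits make one more reliable bit; scale the same idea up and you get logical qubits built from hundreds or thousands of physical ones.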

On the hardware side, Nvidia has developed the NVQLink platform architecture. It enables microsecond-level low-latency connections between GPUs and quantum processors via RDMA over Ethernet, with latency below 4 microseconds. This latency is critical for quantum error correction.

For the most advanced quantum processors, each error correction decoding window is only a few microseconds. NVQLink allows GPUs to perform error correction decoding within the QPU’s clock cycle, which is essential for fault-tolerant quantum computing.
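
The reasoning behind that number is a simple budget: the link round trip plus the decoder's compute time must fit inside one decoding window, or the correction arrives too late to apply. A toy check (only the 4-microsecond link bound comes from the article; the other figures are hypothetical):

```python
# Toy latency budget for real-time quantum error correction.
# Only the link round-trip bound reflects Nvidia's NVQLink claim;
# the decode time and window size below are hypothetical placeholders.
link_round_trip_us = 4.0   # stated NVQLink upper bound
decode_compute_us = 1.5    # hypothetical GPU decoding time
window_us = 6.0            # hypothetical decoding window for one QEC round

total = link_round_trip_us + decode_compute_us
verdict = "fits" if total <= window_us else "too slow"
print(f"budget used: {total:.1f} / {window_us:.1f} us -> {verdict}")
```

If the window shrinks or the decoder slows down, corrections queue up and errors accumulate faster than they can be fixed.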

On the software side, Nvidia has developed the CUDA-Q platform and CUDA-Q QEC library, providing a unified programming interface.

Developers can write hybrid quantum-classical applications in the same environment without worrying about hardware differences. The latest CUDA-Q QEC 0.6 version, released in April 2026, has deep integration with NVQLink, supporting real-time GPU decoding.
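
For a flavor of what that unified interface looks like, here is a minimal Bell-pair program sketched against CUDA-Q's published Python API (details may vary across versions):

```python
# Minimal CUDA-Q sketch: prepare and sample a Bell pair. Based on Nvidia's
# published Python examples; exact API details may vary by version.
import cudaq

@cudaq.kernel
def bell():
    qubits = cudaq.qvector(2)      # allocate two qubits in |00>
    h(qubits[0])                   # put the first qubit in superposition
    x.ctrl(qubits[0], qubits[1])   # entangle them with a controlled-NOT
    mz(qubits)                     # measure both qubits

# Swap the target string to retarget the same kernel at a GPU simulator
# or a hardware backend; the kernel itself doesn't change.
cudaq.set_target("qpp-cpu")
result = cudaq.sample(bell, shots_count=1000)
print(result)  # expect roughly half "00" and half "11"
```

The pitch is the CUDA playbook all over again: one programming model in front, interchangeable hardware behind it.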

On the ecosystem front, Nvidia collaborates with over a dozen supercomputing centers worldwide, including Japan’s G-QuAT and Singapore’s National Quantum Computing Center, integrating quantum processors into existing HPC infrastructure.

Quantinuum has announced that its latest Helios QPU and all future processors will be integrated with Nvidia GPUs via NVQLink. The Helios QPU is equipped with Nvidia’s GH200 Grace Hopper as a real-time host for quantum error correction.

Today, quantum computing is at a turning point—from “lab prototypes” to “requiring large-scale classical computing support.” Quantum error correction, calibration, and hybrid algorithms all demand powerful classical computing in real time, which is Nvidia’s forte.

But here’s a problem: quantum computing isn’t AI.

AI exploded because deep learning turned out to be a killer app for GPUs: only GPUs could run it well, and CPUs couldn't keep up. That's what made Nvidia dominant.

So far, quantum computing still lacks a killer app.

Applications that would make enterprises willing to pay for quantum computing time are still unclear.

Industry predictions for when fault-tolerant quantum computers will arrive range from 5 to 10 years out. Nvidia, already betting heavily on physical AI and digital twins, may not have the time or energy to double down on quantum computing for that long.

In September 2025, Nvidia invested in Quantinuum, QuEra, and PsiQuantum—covering ion traps, neutral atoms, and photonics. This shows Nvidia is casting a wide net but also indicates uncertainty about which route will ultimately succeed.

If the coherence time of quantum processors improves significantly or if new architectures that don’t rely on real-time error correction emerge, NVQLink could become obsolete.

Nvidia is betting that “quantum computing will inevitably become fault-tolerant, and fault tolerance will require powerful classical support.”

This assumption seems reasonable for now but isn’t the only possible path.

AI moved from lab to commercialization in about ten years—from AlexNet in 2012 to ChatGPT in 2022.

But quantum computing is still in an earlier stage. If it takes 10 years to reach commercialization, can Nvidia wait that long?

The Industry's Real Situation

Looking into the quantum industry, few companies are actually selling universal quantum computers. Most revenue now comes from peripheral products.

This is also the most noteworthy aspect of this wave of IPOs.

Most quantum companies today aren’t generating revenue from their main product—general-purpose quantum computers. Instead, they profit from quantum sensors, quantum clocks, control chips, software stacks, and HPC integration services.

A mature, scalable commercial market for universal quantum computers has yet to form.

To put it plainly, the industry is using sideline products to fund a long-term main line of business.

Infleqtion’s main revenue comes from optical atomic clocks, quantum RF receivers, and inertial sensors, used in energy and space sectors.

As of June 2025, Infleqtion had sold three quantum computers and hundreds of quantum sensors. Its revenue in the past 12 months was about $29 million, with a compound annual growth rate of roughly 80% over two years. Revenue in 2026 is projected to reach $40 million.

Prices for quantum sensors range from tens of thousands to hundreds of thousands of dollars. Research-grade atomic clocks and gravimeters can cost over $500k.

As manufacturing scales up, costs are expected to drop by an order of magnitude over the next decade, much as solid-state LiDAR once cost tens of thousands of dollars and now sells for around $2,000.

Xanadu’s situation is similar; most revenue comes from quantum computing peripherals, primarily from its top three clients.

Additionally, nearly all listed quantum companies receive substantial government funding.

Xanadu has secured support from DARPA projects and Canada’s “Quantum Champion” program. Infleqtion, IonQ, and Rigetti all have contracts with the U.S. Department of Defense and Energy.

The key question is: how long can this peripheral-revenue model last?

The market for quantum sensing is limited.

Products like atomic clocks and inertial sensors mainly serve defense, aerospace, and scientific research, markets that aren't large enough to support multi-billion-dollar valuations. Even government contracts can't grow indefinitely; as the saying goes, even the landlord's granary runs out.

Cloud services for quantum computers, before reaching “quantum supremacy,” will also struggle to scale. Currently, quantum computers are far from cost-effective compared to traditional computers.

You might say, SpaceX initially relied on launch services to fund its Mars plans, and Tesla used carbon credits to subsidize EV R&D.

But don’t forget: SpaceX’s launch service is itself a huge market, and rocket technology is universal—launching satellites or going to Mars uses the same tech. Tesla’s EVs, though initially loss-making, are at least products that consumers want, with real market demand.

Quantum computing is different. No matter how many quantum sensors a company sells, that alone can't sustain a multi-billion-dollar valuation over the long term.

The current situation in the quantum industry is somewhat awkward. Technologically, progress is real, but the path to true commercialization is still very long—so long that even entrepreneurs can’t give a precise timeline.

How far this model can go depends on two factors. One is the speed of technological breakthroughs. If a major breakthrough occurs—such as coherence time increasing by an order of magnitude or error correction efficiency improving significantly—the industry’s commercialization process will accelerate.

The second is the patience of capital markets. Investors who dared to back AI ten years ago, and have since watched OpenAI and Anthropic pay off, are probably more willing to wait on quantum computing now.

In my view, this wave of IPOs isn't really the start of quantum computing's commercialization; it's more like a stress test of the capital market's confidence in the industry. If you can afford to wait, there's no harm in getting in now.
