A quantum processor is not defined by its best qubit; it is limited by its worst. A week ago, I celebrated the 1.68 ms "hero qubit" T1 time from the Princeton Nature paper. It's an incredible materials-science achievement. But for fault tolerance, we don't need one perfect qubit. We need thousands of "good enough" qubits that are uniform and stable. So let's flip the story and look at the spread in that same paper.

**The Device-to-Device Spread**

The paper transparently reports data on 45 qubits across nine chips. If you look at the time-averaged quality factor (Q_avg), the variation is significant.

- The best: Qubit 34 hits an average quality factor of 15.2 million.
- The worst: Qubit 27 reaches only 5.5 million.

That is a 3x spread in performance on devices made from the same "recipe."

**The Time-Fluctuation Spread**

Even more critical is the temporal instability. A qubit's T1 is not a fixed number; it fluctuates. The paper reports a mean T1 fluctuation span of 36%. And this 36% span, monitored over 88 hours, might not even capture the full picture: it misses faster, millisecond-scale fluctuations that the measurement protocol wasn't designed to resolve. Most notably, this 36% span is "similar to that observed in previous generations of qubits, despite previous qubits having much lower overall coherence."

**Why This Matters**

The platform is so clean that it confirms a fundamental truth: we have successfully raised the ceiling for T1, but we have not yet solved the instability problem. The UHV fabrication step, for example, did not improve T1 but did improve T2E (echo coherence). So we tackled dephasing noise, not relaxation. Better materials alone won't save us, at least not in the near term. We clearly need to double down on real-time control that can track and adapt to these fluctuations.

And are we even seeing the full picture? How fast are you measuring your T1s?

📸 Credits: Bland et al., *Nature* 647, 343–348 (2025). Andrew Houck, Nathalie de Leon
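The quality-factor and fluctuation-span arithmetic above is easy to sanity-check numerically. Here is a minimal sketch; the 5 GHz qubit frequency is an assumed typical transmon value, not a number from the paper, and the example T1 samples are invented for illustration:

```python
import math

def quality_factor(freq_hz, t1_s):
    """Q = omega * T1 = 2*pi*f*T1: coherent oscillation cycles before decay."""
    return 2 * math.pi * freq_hz * t1_s

def fluctuation_span(t1_samples):
    """(max - min) / mean of repeated T1 measurements, as a fraction."""
    mean = sum(t1_samples) / len(t1_samples)
    return (max(t1_samples) - min(t1_samples)) / mean

# A 1.68 ms T1 at an assumed 5 GHz frequency gives Q on the order of 5e7.
print(f"Q = {quality_factor(5e9, 1.68e-3):.2e}")

# The reported best/worst time-averaged Q values imply a ~3x spread.
print(f"spread = {15.2e6 / 5.5e6:.2f}x")  # -> spread = 2.76x

# A qubit whose T1 wanders between 0.8 ms and 1.2 ms around a 1.0 ms mean
# shows a 40% fluctuation span, comparable to the reported 36%.
print(f"span = {fluctuation_span([0.8e-3, 1.0e-3, 1.2e-3]):.0%}")
```

The point of `fluctuation_span` is that it is normalized by the mean: a 36% span on a millisecond-class qubit and on a 100 µs qubit represent the same *relative* instability, which is exactly the comparison the paper draws.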
Evaluating Quality Over Volume in Quantum Hardware
Summary
Evaluating quality over volume in quantum hardware means focusing not just on increasing the number of qubits in a quantum computer, but ensuring each qubit performs reliably and consistently. In quantum computing, quality refers to how stable, accurate, and predictable each qubit and operation is, which is crucial for solving real-world problems.
- Prioritize uniform performance: Build quantum systems that maintain consistently high reliability across all qubits, not just a few standout performers.
- Monitor environmental factors: Regularly check for hidden influences like temperature or calibration drift that can impact how quantum hardware runs.
- Stack practical methods: Combine multiple simple, efficient techniques to create robust solutions, making quantum experiments more repeatable and accessible for broader teams.
Quantum hardware is fragile, and consistent performance in experiments rarely comes from one "silver bullet" or hyped method. **It comes from a stack.**

Many frontier research demonstrations still depend on a single technique plus "more resources" (more shots, more classical overhead, more executions, etc.). This can push the cost of some quantum experiments to tens or hundreds of thousands of dollars. That can work for a demo, but it doesn't scale to broader teams or high-volume exploration.

The practical path may be the opposite: we recover much of that performance by composing multiple lightweight techniques into a single, fast, repeatable pipeline, compounding state-of-the-art efficiency at each step. Haiqu's infrastructure enables exactly this: flexible construction and operation of increasingly complex middleware stacks. We believe that is where #quantum #utility starts to get discovered, similar to how modern #AI became feasible once tooling like PyTorch or TensorFlow made complex models repeatable and fast.
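Haiqu's actual middleware is not public, so the following is only a generic sketch of the compose-lightweight-passes idea. Everything here is a hypothetical illustration (the pass names, and a circuit modeled as a plain list of gate labels), not any vendor's API:

```python
from functools import reduce

# Each "pass" is a function circuit -> circuit; a circuit here is just
# a list of gate labels. Both passes below are hypothetical examples.

def drop_identities(circuit):
    """Remove explicit identity gates, which cost time but do nothing."""
    return [g for g in circuit if g != "I"]

def cancel_adjacent_inverses(circuit):
    """Drop adjacent pairs of self-inverse gates (X, Z, H) that cancel."""
    out = []
    for g in circuit:
        if out and out[-1] == g and g in {"X", "Z", "H"}:
            out.pop()  # the pair cancels to identity
        else:
            out.append(g)
    return out

def compose(*passes):
    """Chain lightweight passes left-to-right into one pipeline."""
    return lambda circuit: reduce(lambda c, p: p(c), passes, circuit)

pipeline = compose(drop_identities, cancel_adjacent_inverses)
print(pipeline(["H", "I", "X", "X", "Z"]))  # -> ['H', 'Z']
```

The design point is that each pass is cheap and independently testable, while the stack as a whole compounds their savings, which is the "stack, not silver bullet" claim in miniature.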
To start addressing quantum hardware, we can look at five essential aspects that best describe and characterize it, particularly for superconducting or trapped-ion systems:

1️⃣ **Qubit Coherence Times**
- Average T1 (relaxation time): the time it takes for a qubit to decay from the excited state |1⟩ to the ground state |0⟩. Measures energy loss to the environment.
- Average T2 (dephasing time): the time over which a qubit maintains phase coherence in a superposition; sensitive to both energy loss and environmental noise.
➡️ These define how long quantum information can be stored reliably.
Ref: Schlosshauer, *Decoherence, the measurement problem, and interpretations of quantum mechanics*, Rev. Mod. Phys. 2005

2️⃣ **Gate Fidelity**
Measures the accuracy of quantum gate operations (1-qubit and 2-qubit). Includes average gate fidelity, process fidelity, and randomized benchmarking.
➡️ Reflects the precision and control over quantum operations.
Ref: Magesan et al., *Scalable and Robust Randomized Benchmarking*, Phys. Rev. Lett. 2011

3️⃣ **Readout Fidelity (or Readout Error)**
The probability of correctly measuring a qubit's state. Affected by the signal-to-noise ratio in detectors, crosstalk, and measurement backaction.
➡️ Critical for interpreting results from quantum computation.
Refs: Gambetta et al., *Quantum trajectory approach to circuit QED*, Phys. Rev. A 2008 (for superconducting); Wiseman & Milburn, *Quantum Measurement and Control*

4️⃣ **Qubit Connectivity (or Coupling Map)**
Defines which qubits can directly interact, i.e., apply 2-qubit gates like CNOT or CZ. Represented as a graph; limited connectivity leads to longer circuits after transpilation, affecting fidelity.
➡️ Influences circuit depth and efficiency on real hardware.
Ref: Linke et al., *Experimental comparison of two quantum computing architectures*, PNAS 2017

5️⃣ **Crosstalk (and Noise Characteristics)**
- Crosstalk: undesired interaction between qubits or control lines.
- Noise spectral density: the frequency distribution of environmental or control noise affecting qubits.
➡️ Affects the scalability and stability of computations over large systems.
Ref: Breuer & Petruccione, *The Theory of Open Quantum Systems*
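As a concrete example of the first metric: in practice, T1 is estimated by preparing |1⟩, waiting a variable delay, and fitting the measured excited-state population to an exponential decay. Here is a minimal stdlib-only sketch using simulated data; the 100 µs true T1 and the noise level are assumed illustrative values, not from any particular device:

```python
import math
import random

random.seed(0)
t1_true = 100e-6  # assumed "true" relaxation time: 100 microseconds
ts = [i * 20e-6 for i in range(16)]  # delays from 0 to 300 us

# Simulated excited-state populations: P(t) = exp(-t/T1) plus readout noise.
ps = [math.exp(-t / t1_true) + random.gauss(0, 0.005) for t in ts]

# log(P) is linear in t with slope -1/T1, so an ordinary least-squares
# line fit recovers T1. Keep only points safely above the noise floor.
pts = [(t, math.log(p)) for t, p in zip(ts, ps) if p > 0.05]
mx = sum(t for t, _ in pts) / len(pts)
my = sum(y for _, y in pts) / len(pts)
slope = sum((t - mx) * (y - my) for t, y in pts) / sum((t - mx) ** 2 for t, _ in pts)
t1_est = -1.0 / slope

print(f"estimated T1 = {t1_est * 1e6:.1f} us")  # close to the true 100 us
```

Real experiments typically fit `A * exp(-t/T1) + B` directly (e.g. with nonlinear least squares) to handle readout offsets; the log-linear shortcut above is just the simplest self-contained version of the idea.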
It's interesting to revisit the history of quantum computing through the "quantity - quality - speed" lens.

Until not so long ago (a couple of years, maybe), the industry was all about quantity: "Let's have large amounts of qubits, and then hopefully we can make something useful out of them. After all, a system with lots of qubits can't be classically simulated."

Now, since the results of this approach have been underwhelming and since quantum error correction no longer looks so infeasible, the industry has pivoted to quality: "Let's make good qubits, and let's make enough of them, so that we can run the deep circuits that are required to yield quantum advantage."

Then, when we are finally able to run big circuits and solve meaningful problems, we will start realizing that a big circuit takes a long time to run. At that point, the focus will shift towards speed. And that will be a good sign, because it will mean the industry is finally reaching maturity. After all, speed is the ultimate reason why we're building quantum computers.

Now, is this all oversimplified? Yes, of course. Nobody focuses solely on quantity, quality, or speed. You need them all to succeed, and they're all somehow interconnected. Still, it can't be denied that some metrics are emphasized more than others: large (physical) qubit counts no longer impress as much as they once did, and few (apart maybe from Google?) put the focus on the speed of their current devices.

So, watch out for the change of narrative. When more manufacturers start talking about speed, it might be that something's cooking.

Thanks to Nils Löwhagen for the discussion we had this morning, which enabled me to flesh out these ideas!