Addressing Reliability Issues in Quantum Devices

Explore top LinkedIn content from expert professionals.

-

MIT Sets Quantum Computing Record with 99.998% Fidelity

Researchers at MIT have achieved a world-record single-qubit gate fidelity of 99.998% using a superconducting qubit known as fluxonium. This breakthrough represents a significant step toward practical quantum computing by addressing one of the field’s greatest challenges: mitigating the noise and control imperfections that lead to operational errors.

Key Highlights:

1. The Problem: Noise and Errors
• Qubits, the building blocks of quantum computers, are highly sensitive to noise and to imperfections in their control mechanisms.
• Such disturbances introduce errors that limit the complexity and duration of quantum algorithms. “These errors ultimately cap the performance of quantum systems,” the researchers noted.

2. The Solution: Two New Techniques
To overcome these challenges, the MIT team developed two innovative techniques:
• Commensurate Pulses: timing quantum pulses precisely so that counter-rotating errors become uniform and correctable.
• Circularly Polarized Microwaves: a synthetic version of circularly polarized light that gives finer control over the qubit’s state, further enhancing fidelity.
“Getting rid of these errors was a fun challenge for us,” said David Rower, PhD ’24, one of the study’s lead researchers.

3. Fluxonium Qubits and Their Potential
• Fluxonium qubits are superconducting circuits with unique properties that make them more resistant to environmental noise than traditional qubits.
• By applying the new error-mitigation techniques, the team unlocked fluxonium’s potential to operate at near-perfect fidelity.

4. Implications for Quantum Computing
• Achieving 99.998% fidelity significantly reduces errors in quantum operations, paving the way for more complex and reliable quantum algorithms (a rough sense of scale is sketched below).
• This milestone represents a major step toward scalable quantum computing systems capable of solving real-world problems.

What’s Next?
The team plans to expand its work by exploring multi-qubit systems and integrating the error-mitigation techniques into larger quantum architectures. Such advancements could accelerate progress toward error-corrected, fault-tolerant quantum computers.

Conclusion: A Leap Toward Practical Quantum Systems
MIT’s achievement underscores the importance of innovation in error correction and control for overcoming the fundamental challenges of quantum computing. This breakthrough brings us closer to large-scale quantum systems that could transform fields such as cryptography, materials science, and complex optimization.
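What does 99.998% fidelity buy in practice? A minimal sketch (my own illustration, not a calculation from the MIT paper), treating each gate as failing independently with probability ε = 1 − F:

```python
# Rough illustration: how single-qubit gate fidelity caps usable circuit depth.
# Assumes independent, identically distributed gate errors, which is a
# simplification, not the error model used in the MIT study.

fidelity = 0.99998             # reported single-qubit gate fidelity
error_per_gate = 1 - fidelity  # ~2e-5 error per gate

for n_gates in (1_000, 10_000, 100_000):
    success = (1 - error_per_gate) ** n_gates
    print(f"{n_gates:>7} gates -> ~{success:.1%} chance of no error")
```

Even at record fidelity, the error probability compounds over tens of thousands of gates, which is why error mitigation and, ultimately, error correction remain on the roadmap.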
-
Isolating fragile quantum states relies on specific mathematical boundaries. Scaling quantum hardware involves eliminating correlations between a local system and its surrounding environment. When a bipartite quantum state undergoes a unitary operation followed by a decoupling map, the objective is to make the resulting system independent of environmental noise. Past approaches to calculating decoupling error limits relied on approximations and smoothing techniques.

A joint research initiative between RWTH Aachen University and National Taiwan University introduces a one-shot decoupling theorem. This study defines the decoupling error bound through exact mathematical structures rather than general estimations. The research was conducted by Mario Berta, Yongsheng Yao, and Hao-Chung Cheng.

Consider the technical parameters of this published theorem:
→ It utilizes the quantum relative entropy distance instead of the standard trace-distance criterion.
→ It provides a precise characterisation of the one-shot decoupling error without smoothing techniques or additive terms.
→ It delivers a single-letter expression for exact error exponents in quantum state merging.
→ It outlines achievability bounds for entanglement distillation assisted by local operations and classical communication.

These mathematical limits apply directly to system performance. For coding rates below the first-order asymptotic capacity, the error decays exponentially in the blocklength. This provides a large-deviation characterisation that is mathematically stronger than conventional first-order approaches.

Relative entropy operates as the primary metric for defining the capacity of these operational tasks. The bounds formulated under relative entropy convert directly into purified-distance statements via standard entropy-fidelity inequalities. This establishes a strict performance criterion for applications like quantum channel simulation and secure channel coding.

The current theorem primarily addresses scenarios involving identical, independently distributed quantum states. The next phase of research requires applying these refined entropy bounds to complex systems featuring correlated noise and memory. This research supplies experimental physicists with a defined mathematical framework for future quantum architectures.

How do you evaluate the transition from theoretical limits to functional quantum hardware? Reply in the comments.
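For readers who want the objects behind the terminology, a short sketch in standard notation (the definitions are textbook; the precise theorem statement is in the paper):

```latex
% Quantum relative entropy, the distance measure the theorem adopts
% in place of the usual trace-distance criterion:
\[
  D(\rho \,\|\, \sigma) \;=\; \operatorname{Tr}\!\left[\rho \left(\log \rho - \log \sigma\right)\right].
\]
% One-shot decoupling: after a unitary on A and a map from A to B,
% the output should be (nearly) independent of the environment E:
\[
  \mathcal{T}_{A \to B}\!\left(U_A \,\rho_{AE}\, U_A^{\dagger}\right)
  \;\approx\; \tau_B \otimes \rho_E ,
\]
% with the approximation error measured in relative entropy and, per the
% paper's claim, bounded without smoothing or additive correction terms.
```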
-
I’ve definitely done this before: placing wirebonds across the resonator to connect ground planes. Back then, it seemed harmless—maybe even necessary.

But it turns out, a single wirebond can form a parasitic Josephson junction with the oxidized aluminum pad beneath it. And if that junction happens to be enclosed in a superconducting loop—formed by other bond wires or traces—it becomes a parasitic RF-SQUID. And then things start to break.

This parasitic SQUID can cause:
• 𝗦𝘁𝗿𝗼𝗻𝗴 𝗗𝗖 𝗺𝗮𝗴𝗻𝗲𝘁𝗶𝗰 𝗰𝗼𝘂𝗽𝗹𝗶𝗻𝗴 to nearby flux-tunable transmons, modulating the qubit frequency in a hysteretic, sawtooth-like pattern.
• 𝗗𝗶𝘀𝗽𝗲𝗿𝘀𝗶𝘃𝗲 𝗔𝗖 𝗰𝗼𝘂𝗽𝗹𝗶𝗻𝗴 to the readout resonator, producing sharp, asymmetric dips in frequency at regular intervals.
• 𝗖𝗼𝗺𝗽𝗹𝗲𝘁𝗲 𝘀𝘂𝗽𝗽𝗿𝗲𝘀𝘀𝗶𝗼𝗻 𝗼𝗳 𝗾𝘂𝗯𝗶𝘁 𝗳𝘂𝗻𝗰𝘁𝗶𝗼𝗻𝗮𝗹𝗶𝘁𝘆 in some cases.

All of this—from a wirebond! And what’s worse: the entire effect can vanish the moment the wirebond is removed. It’s the kind of issue that’s easy to miss, especially in early-stage experiments where manual bonding is common and attention is focused on the qubits.

But it’s a crucial reminder: in superconducting quantum circuits, the entire assembly 𝘪𝘴 the device. Wirebonds, airbridges, packaging—none of it is outside the quantum system.

We spend enormous effort optimizing gates, fidelities, and calibration routines. But sometimes, the root cause of instability isn’t in the software—or even in the circuit design. It’s in the loop you didn’t mean to make.

📸 Image Credits: B. Berlitz et al. (2025, arXiv:2505.20458)
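Whether such an accidental loop actually shows hysteretic, sawtooth behavior comes down to the textbook RF-SQUID screening parameter β_L = 2πLI_c/Φ₀. A back-of-envelope sketch; the inductance and critical current below are illustrative assumptions, not values from the referenced paper:

```python
# Back-of-envelope check (assumed numbers): an RF-SQUID becomes hysteretic
# when its screening parameter beta_L = 2*pi*L*Ic / Phi0 exceeds 1.
import math

PHI0 = 2.067833848e-15   # magnetic flux quantum, Wb

def beta_L(loop_inductance_H: float, critical_current_A: float) -> float:
    """Screening parameter of an RF-SQUID; > 1 implies hysteretic flux states."""
    return 2 * math.pi * loop_inductance_H * critical_current_A / PHI0

# Assumed, plausible values for a mm-scale bond-wire loop and a weak
# parasitic junction on an oxidized pad:
L_loop = 1e-9   # ~1 nH loop inductance (assumption)
I_c    = 1e-6   # ~1 uA parasitic critical current (assumption)

print(f"beta_L = {beta_L(L_loop, I_c):.1f}")   # > 1 -> hysteretic flux states
```

With these assumed values β_L comes out around 3, comfortably in the hysteretic regime, which is consistent with the sawtooth-like frequency modulation described above.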
-
“But in a new study, published May 7 in the journal Nature Communications Materials, researchers proposed using a new, pure form of silicon — the semiconductor material used in conventional computers — as the basis for a qubit that is far more scalable than existing technologies.

Building qubits from semiconducting materials like silicon, gallium or germanium has advantages over superconducting metal qubits, according to the quantum computing company QuEra. The coherence times are relatively long, they are cheap to make, they operate at higher temperatures and they are extremely tiny — meaning a single chip can hold huge numbers of qubits. But impurities in semiconducting materials cause decoherence during computations, which makes them unreliable.

In the new study, the scientists proposed making a qubit out of silicon-28 (Si-28), which they described as the "world's purest silicon," after stripping away the impurities found in natural silicon. These silicon-based qubits would be less prone to failure, they said, and could be fabricated to the size of a pinhead.

Natural silicon is normally made up of three isotopes, or atoms of different masses — Si-28, Si-29 and Si-30. Natural silicon works well in conventional computing due to its metalloid properties, but problems arise when using it in quantum computing. Si-29 in particular, which makes up 5% of natural silicon, causes a "nuclear flip-flopping effect" that leads to decoherence and the loss of information. In the study, the scientists got around this by developing a new method to engineer silicon without Si-29 and Si-30 atoms.

"Now that we can produce extremely pure silicon-28, our next step will be to demonstrate that we can sustain quantum coherence for many qubits simultaneously," project co-supervisor David Jamieson, professor of physics at the University of Melbourne, said in the statement. "A reliable quantum computer with just 30 qubits would exceed the power of today's supercomputers for some applications."

https://lnkd.in/gAUmAcdd
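To see why even a few percent of Si-29 matters, a back-of-envelope estimate (my own, not from the study): count the spin-carrying Si-29 nuclei expected within a few nanometres of a qubit in natural silicon, using the standard silicon atomic density.

```python
# Rough scale of the problem: expected number of Si-29 nuclear spins near a
# qubit in natural silicon. Radii below are illustrative assumptions.
import math

N_SI = 5.0e22          # silicon atoms per cm^3 (standard value)
ABUNDANCE_SI29 = 0.05  # ~5% Si-29 in natural silicon, as quoted in the post

def si29_within(radius_nm: float, abundance: float = ABUNDANCE_SI29) -> float:
    """Expected number of Si-29 atoms inside a sphere of the given radius."""
    radius_cm = radius_nm * 1e-7
    volume = (4 / 3) * math.pi * radius_cm**3
    return N_SI * volume * abundance

for r in (2, 5, 10):
    print(f"r = {r:>2} nm -> ~{si29_within(r):,.0f} Si-29 nuclear spins")
```

Even a 5 nm sphere holds on the order of a thousand flip-flopping nuclear spins, which is why stripping out Si-29 and Si-30 changes the coherence picture so dramatically.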
-
By driving a quantum processor with laser pulses arranged according to the Fibonacci sequence, physicists observed the emergence of an entirely new phase of matter—one that displays extraordinary stability in a domain where fragility is the norm.

Quantum computers operate using qubits, which differ radically from classical bits. A qubit can exist in superposition, occupying multiple states at once, and can become entangled with others across space. These properties enable immense computational power, but they come with a cost: quantum states are notoriously short-lived. Environmental noise, microscopic imperfections, and edge effects rapidly degrade coherence, limiting how long quantum information can survive.

Seeking a new way to protect fragile quantum states, scientists at the Flatiron Institute, instead of applying laser pulses at regular intervals, used a rhythm governed by the Fibonacci sequence—an ordered but non-repeating pattern long known to appear in biological growth, crystal structures, and wave interference. The experiment was carried out on a chain of ten trapped-ion qubits, driven by precisely timed laser pulses. The result was the formation of what is described as a time quasicrystal. Unlike ordinary crystals, which repeat periodically in space, a time quasicrystal exhibits structure in time without repeating in a simple cycle. The Fibonacci-based driving created a temporal order that resisted disruption, allowing the quantum system to remain coherent far longer than expected.

The improvement was significant. Under standard conditions, the quantum state persisted for roughly 1.5 seconds. When driven by the Fibonacci pulse sequence, coherence times stretched to approximately 5.5 seconds—more than a threefold increase.

Even more intriguing was the system’s temporal behavior. Measurements indicated that the quantum dynamics unfolded as if time itself possessed two independent structural directions. This does not imply time flowing backward, but rather that the system’s evolution followed two intertwined temporal pathways—an emergent property arising purely from the Fibonacci drive.

The researchers propose that the non-repeating structure of the Fibonacci sequence suppresses errors that typically accumulate at the boundaries of quantum systems. By distributing disturbances in a highly ordered yet aperiodic way, the sequence stabilizes the collective behavior of the qubits. In effect, a mathematical pattern found throughout nature acts as a self-organizing error-management protocol.

The findings suggest a powerful new strategy for quantum control. Rather than fighting noise solely with complex correction algorithms, future quantum technologies may harness structured patterns—drawn from mathematics and natural order—to achieve resilience at a fundamental level.

https://lnkd.in/dVxp7R8J
https://lnkd.in/dDVNRsPk
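For the curious, a minimal sketch (my own illustration) of the aperiodic schedule itself: the Fibonacci word is generated by the substitution rule A → AB, B → A, and a drive of the kind described above applies its two pulse types in this never-repeating order.

```python
# Generate the Fibonacci word, the ordered-but-aperiodic pattern behind
# the pulse schedule. "A" and "B" stand for the two pulse types.

def fibonacci_word(generations: int) -> str:
    """Return the Fibonacci word after the given number of substitution steps."""
    word = "A"
    for _ in range(generations):
        word = "".join("AB" if ch == "A" else "A" for ch in word)
    return word

seq = fibonacci_word(7)
print(seq)        # ABAABABAABAAB... never settles into a repeating cycle
print(len(seq))   # lengths follow the Fibonacci numbers: 34 here
```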
-
CONTINUOUS-VARIABLE QUANTUM KEY DISTRIBUTION IN AN INTEGRATED PHOTONIC GLASS PLATFORM

Recent progress in continuous‑variable (CV) quantum information processing has accelerated the development of high‑rate QRNGs and QKD systems. CV approaches, based on coherent detection of optical field quadratures, offer strong compatibility with existing telecom infrastructure and can achieve Gbit/s‑level performance. However, the scalability and reliability of these systems depend critically on the optical front‑end — particularly on minimizing loss, ensuring polarization stability, and enabling precise, tunable coherent detection.

While silicon photonics and InP platforms have driven major advances in integrated quantum receivers, both exhibit intrinsic polarization dependence and higher propagation losses. These constraints become limiting in quantum regimes where attenuation directly translates into irreversible information loss.

In this context, femtosecond‑laser micromachining (FLM) in borosilicate glass has emerged as a compelling alternative. FLM enables direct writing of three‑dimensional waveguides with low propagation loss (~0.1 dB/cm at 1550 nm) and excellent mode matching to standard telecom fibers, resulting in interface losses as low as 0.2 dB. Importantly, the weak residual birefringence of laser‑written waveguides (10⁻⁶–10⁻⁵) ensures polarization‑insensitive operation, a key requirement for robust CV‑QKD and CV‑QRNG implementations.

Using this platform, researchers have demonstrated a fully tunable, low‑loss heterodyne receiver integrated directly into borosilicate glass. The device incorporates:
• Fixed and tunable beam splitters
• Thermo‑optic phase shifters for quadrature control
• 3D waveguide crossings
• Polarization‑independent directional couplers
• Stable SNR over 8+ hours

These metrics match or exceed many silicon‑based coherent receivers, while eliminating the need for polarization management.

Demonstrated Performance: QRNG and QKD on a Single Chip
The same glass‑based photonic front‑end supports two state‑of‑the‑art CV quantum technologies:
• Source‑device‑independent CV‑QRNG → secure random‑bit generation rate of 42.7 Gbit/s (record‑setting)
• QPSK‑based CV‑QKD → secret‑key rate of 3.2 Mbit/s over a simulated 9.3‑km fiber link

This dual‑function capability highlights the versatility and robustness of glass‑integrated coherent detection.

Advantages for Scalable Quantum Networks
Glass‑based PICs offer several practical benefits:
• Environmental and thermal stability
• Low‑loss fiber coupling
• True 3D routing flexibility
• Cost‑effective, rapid prototyping
• Polarization‑insensitive operation without active control

While borosilicate glass is ideal for low‑loss, polarization‑independent coherent detection, sapphire introduces additional capabilities for mission‑critical environments, particularly high‑radiation, high‑temperature, or high‑power quantum receivers in hybrid stacks.

https://lnkd.in/ewj93RtA
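The loss figures quoted above translate directly into a chip's transmission budget. A small sketch using only the numbers in the post (the chip lengths are my own assumed examples):

```python
# Quick insertion-loss budget from the quoted figures: ~0.1 dB/cm propagation
# loss at 1550 nm and ~0.2 dB per fiber interface. Chip lengths are assumed.

PROP_LOSS_DB_PER_CM = 0.1
INTERFACE_LOSS_DB = 0.2

def insertion_loss_db(length_cm: float, n_interfaces: int = 2) -> float:
    """Total chip insertion loss in dB for a given waveguide length."""
    return PROP_LOSS_DB_PER_CM * length_cm + INTERFACE_LOSS_DB * n_interfaces

for length in (2, 5, 10):
    loss = insertion_loss_db(length)
    transmission = 10 ** (-loss / 10)
    print(f"{length:>2} cm chip: {loss:.1f} dB -> {transmission:.0%} transmission")
```

Keeping a multi-centimetre chip above ~80% transmission is exactly the regime where CV protocols stay viable, since in the quantum regime that attenuation cannot be amplified away.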
-
A new cryogenic amplifier cuts quantum computer heat emissions by 10,000 times, offering a breakthrough in cooling and efficiency for next-generation machines.

Heat is a major challenge in quantum computing, as excess energy disrupts qubits and causes errors. Reducing emissions is essential for scaling up powerful quantum systems. This device operates at extremely low temperatures, maintaining qubits in stable states while drastically minimizing unwanted thermal noise, allowing longer computations with higher accuracy.

It could be launched as early as 2026, potentially revolutionizing how quantum computers are built, cooled, and deployed, making them more practical for real-world applications. Controlling heat at this scale reminds us that engineering solutions, combined with quantum science, are key to unlocking the full potential of quantum computing, enabling faster, more reliable, and energy-efficient machines.

Thank YOU — Quantum Cookie

The device is a cryogenic traveling-wave parametric amplifier (TWPA) made with specialized "quantum materials." Traditional amplifiers used for reading out qubit signals in superconducting quantum computers generate noticeable heat (even if small in absolute terms), which adds thermal noise, raises the cooling burden on dilution refrigerators, and limits how many qubits can be packed into one cryostat. Qubic's version reportedly cuts thermal output by a factor of 10,000, bringing it down to practically zero (on the order of 1–10 microwatts), while also reducing overall power consumption by about 50%.

Why this matters for quantum computing
- Heat is a core scaling bottleneck: Qubits (especially superconducting ones) must operate at millikelvin temperatures (~10–50 mK). Even tiny amounts of heat from readout electronics or control lines can cause decoherence, increase error rates, and require more powerful (and expensive) cryogenic systems.
- The amplifier's role: It boosts the faint microwave signals from qubits without adding much noise. Conventional semiconductor-based amplifiers at cryogenic stages dissipate more heat; this new TWPA minimizes that, potentially allowing twice as many qubits per dilution refrigerator by easing the thermal load and simplifying cabling.
- Potential impact: Lower cooling demands could cut operational costs and energy use significantly, making larger, more practical quantum systems feasible for real-world applications rather than just lab prototypes.

Timeline and status
The company has received grant funding and aims for commercialization/launch in 2026. As of early 2026 reports, development is ongoing with targets like 20 dB gain over a 4–12 GHz bandwidth. No major contradictions or retractions have appeared in credible coverage.
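To make the scaling argument concrete, here is a purely illustrative heat-budget sketch; every number below is an assumption for the sake of the example, not a Qubic specification or a real cryostat figure:

```python
# Illustration only: how per-amplifier dissipation limits readout channels
# at a cryostat stage with a fixed cooling-power budget. All values assumed.

def max_amplifiers(stage_budget_uW: float, dissipation_per_amp_uW: float) -> int:
    """Number of amplifiers a stage can host on heat budget alone."""
    return int(stage_budget_uW // dissipation_per_amp_uW)

STAGE_BUDGET_UW = 400.0  # assumed cooling power available at the amplifier stage

conventional = max_amplifiers(STAGE_BUDGET_UW, 100.0)               # assumed per-amp heat
low_dissipation = max_amplifiers(STAGE_BUDGET_UW, 100.0 / 10_000)   # 10,000x lower

print(f"conventional amplifiers per stage: {conventional}")
print(f"with 10,000x lower dissipation:    {low_dissipation}")
```

In practice, cabling and signal routing would become the binding constraint long before the raw heat budget, which is why the post's "twice as many qubits per dilution refrigerator" is far more conservative than the 10,000x ratio alone would suggest.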
-
A new quantum machine learning technique based on graph neural networks to produce QPU-aware quantum kernels could be an interesting path to explore. A recent paper, "Hardware-Aware Quantum Kernel Design Based on Graph Neural Networks," introduces an innovative framework called HaQGNN. This work addresses a critical challenge in quantum machine learning: designing effective quantum kernels that adapt both to specific tasks and to the limitations of near-term quantum hardware.

HaQGNN could offer several key advantages for advancing practical quantum kernel design on NISQ devices:

* Hardware awareness: It explicitly integrates real-device topology and noise characteristics during both the quantum circuit generation and performance prediction stages. This makes the generated circuits highly compatible with different quantum hardware backends and enhances their fidelity.

* Efficient performance estimation with GNNs: HaQGNN employs two specialized graph neural networks (GNNs) for rapid and accurate evaluation of candidate quantum circuits. The first GNN predicts the Probability of Successful Trials (PST), a metric correlated with circuit fidelity, allowing early rejection of low-fidelity circuits affected by hardware noise. The second GNN estimates the Kernel-Target Alignment (KTA), a reliable surrogate metric for classification accuracy (a minimal sketch of KTA follows below).

* Noise robustness: By filtering out noisy, low-fidelity circuits early in the pipeline, HaQGNN effectively reduces the impact of gate errors and decoherence, leading to more reliable kernel performance on noisy devices.

More details here: https://lnkd.in/dvGd9sMs

#qml #quantum #machinelearning #datascience
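Since KTA is the quantity the second GNN learns to estimate, here is a minimal NumPy sketch of its standard textbook definition (my own illustration; in the paper the GNN predicts this value rather than computing the kernel on hardware):

```python
# Kernel-target alignment (KTA): how well a kernel matrix K matches the
# ideal "label kernel" y y^T. Values near 1 suggest good class separation.
import numpy as np

def kernel_target_alignment(K: np.ndarray, y: np.ndarray) -> float:
    """KTA = <K, y y^T>_F / (||K||_F * ||y y^T||_F), labels y in {-1, +1}."""
    Y = np.outer(y, y)
    return float(np.sum(K * Y) / (np.linalg.norm(K) * np.linalg.norm(Y)))

# Toy example: a kernel that perfectly separates the two classes aligns to 1.
y = np.array([1, 1, -1, -1])
K_perfect = np.outer(y, y).astype(float)
print(kernel_target_alignment(K_perfect, y))   # 1.0
```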
-
State of the art in fault-tolerant quantum computing – Questions and issues
https://lnkd.in/e-EptrmB

A report from the French Académie des technologies.

This report reviews the construction and potential use of FTQC (fault-tolerant quantum computing) computers, which aim to reliably perform complex calculations by overcoming the problems posed by the errors and noise inherent in quantum systems. After recalling the case for quantum advantage and what achieving it requires, the report describes the use of error-correcting codes in the design of FTQC computers. It then reports on the progress of the five most advanced physical technologies in the world for building such computers and the obstacles they must overcome to achieve the scale-up necessary for executing useful applications. Finally, it discusses the technical and economic environment for quantum computers, how their performance can be compared and evaluated, and their future coexistence with other computing technologies (3D silicon, AI) or with supercomputers.
-
While it was initially thought that we would not see reliable quantum computers until the late 2030s, recent breakthroughs have led many experts to believe that early fault-tolerant machines will be a reality sooner than expected – we're now looking at years, not decades.

The key to unlocking that reality – and one of the biggest challenges in the quantum community – is quantum error correction (QEC). Present-day qubits are fragile and susceptible to quantum noise, which causes high error rates and prevents today’s intermediate-scale quantum computers from achieving practical advantage.

Microsoft’s qubit-virtualization system combines advanced runtime error diagnostics with computational error correction to significantly reduce the noise of physical qubits and enable the creation of reliable logical qubits – which are fundamental to resilient quantum computing. Think of it like noise-cancelling headphones, but for quantum disruption! Just love that visual!

In April, we applied our qubit-virtualization system and Quantinuum’s ion-trap hardware to achieve an 800x improvement on the error rate of physical qubits, demonstrating the most reliable logical qubits on record.

As we continue this groundbreaking work, we are getting closer to the era of fault-tolerant quantum computing and our goal of building a scalable hybrid supercomputer. What’s next? Stay tuned!

#QuantumComputing #QEC #AzureQuantum
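As a rough illustration of what an 800x error-rate improvement means for circuit depth (generic assumed numbers, not Microsoft's or Quantinuum's published rates):

```python
# Illustration only: success probability over many operations, before and
# after an 800x error-rate improvement. The physical rate below is assumed.

p_physical = 1e-3              # assumed per-operation physical error rate
p_logical = p_physical / 800   # the reported 800x improvement

for n_ops in (1_000, 5_000):
    phys = (1 - p_physical) ** n_ops
    logi = (1 - p_logical) ** n_ops
    print(f"{n_ops:>5} ops: physical ~{phys:.1%} vs logical ~{logi:.1%} success")
```

Under these assumptions a 5,000-operation circuit goes from essentially always failing to succeeding about 99% of the time, which is the practical meaning of "reliable logical qubits."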