MIT Sets Quantum Computing Record with 99.998% Fidelity

Researchers at MIT have achieved a world-record single-qubit fidelity of 99.998% using a superconducting qubit known as fluxonium. This breakthrough represents a significant step toward practical quantum computing by addressing one of the field's greatest challenges: mitigating the noise and control imperfections that lead to operational errors.

Key Highlights:

1. The Problem: Noise and Errors
• Qubits, the building blocks of quantum computers, are highly sensitive to noise and imperfections in control mechanisms.
• Such disturbances introduce errors that limit the complexity and duration of quantum algorithms. "These errors ultimately cap the performance of quantum systems," the researchers noted.

2. The Solution: Two New Techniques
To overcome these challenges, the MIT team developed two techniques:
• Commensurate Pulses: Timing quantum pulses precisely so that counter-rotating errors become uniform and correctable.
• Circularly Polarized Microwaves: By creating a synthetic version of circularly polarized light, the team improved control of the qubit's state, further enhancing fidelity.
"Getting rid of these errors was a fun challenge for us," said David Rower, PhD '24, one of the study's lead researchers.

3. Fluxonium Qubits and Their Potential
• Fluxonium qubits are superconducting circuits with properties that make them more resistant to environmental noise than traditional qubits.
• By applying the new error-mitigation techniques, the team unlocked fluxonium's potential to operate at near-perfect fidelity.

4. Implications for Quantum Computing
• Achieving 99.998% fidelity significantly reduces errors in quantum operations, paving the way for more complex and reliable quantum algorithms.
• This milestone is a major step toward scalable quantum computing systems capable of solving real-world problems.

What's Next?
The team plans to expand its work by exploring multi-qubit systems and integrating the error-mitigation techniques into larger quantum architectures. Such advancements could accelerate progress toward error-corrected, fault-tolerant quantum computers.

Conclusion: A Leap Toward Practical Quantum Systems
MIT's achievement underscores the importance of innovation in error mitigation and control to overcome the fundamental challenges of quantum computing. This breakthrough brings us closer to large-scale quantum systems that could transform fields such as cryptography, materials science, and complex optimization.
How Error Correction Affects Quantum Computing
Summary
Quantum error correction is a method used in quantum computing to protect delicate quantum information from errors caused by environmental noise and hardware imperfections. By encoding information redundantly across multiple physical qubits and repeatedly measuring error syndromes, error correction enables quantum computers to perform longer and more reliable calculations, moving us closer to practical, scalable systems.
- Monitor system speed: Make sure your quantum hardware and software can quickly detect and fix errors as they happen, so computations stay accurate.
- Reduce overhead: Use error correction strategies that minimize the number of extra qubits required, allowing more powerful calculations with limited resources.
- Try new solutions: Experiment with innovative error correction frameworks and algorithms to boost qubit stability and extend coherence times without changing hardware.
Everyone agrees quantum error correction (QEC) is essential. But why do we care so much about things like ≤ 0.1% gate error or microsecond-scale decoder latency?

Here's the core idea: QEC combines many noisy qubits into a more stable logical qubit. If your hardware is good enough, you can reduce error rates exponentially by increasing the code size. But that only works if your system can keep up: decoding errors and reacting mid-circuit, fast. Especially for circuits with non-Clifford gates (like T-gates), you need real-time feedback between measurements and feedforward operations.

That's where the hardware starts to feel the pressure:
• Gate error ≤ 0.1%
• Decoder latency ≤ 15 µs
• Controller-decoder communication ≤ 10 µs
• Bandwidth ≥ 1 Mbit/s per qubit

These aren't wishful targets. They come from full-stack simulations of real quantum circuits, like Shor's algorithm for factoring 21 using surface codes. In those simulations, the system must handle:
• ~13 decoding tasks
• ~5 mid-circuit corrections
• ~1000 physical qubits

That's the blueprint. It doesn't just explain why QEC is hard; it points toward what needs to work for it to succeed at scale.

Image credits: Yaniv Kurman et al. (2024, arXiv)
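The exponential suppression mentioned above can be made concrete with the textbook surface-code scaling heuristic. The sketch below is a minimal back-of-the-envelope illustration, assuming p_L ≈ 0.1·(p/p_th)^((d+1)/2) with a threshold p_th of about 1%; the prefactor and threshold are standard approximations chosen for illustration, not numbers taken from the simulations cited in the post.

```python
# Illustrative only: standard surface-code scaling heuristic,
# p_logical ~ 0.1 * (p / p_th) ** ((d + 1) / 2), with p_th ~ 1%.
# Constants are assumptions for illustration, not values from the post.

def logical_error_rate(p_physical: float, distance: int, p_threshold: float = 1e-2) -> float:
    """Heuristic logical error rate per QEC round for a distance-d surface code."""
    return 0.1 * (p_physical / p_threshold) ** ((distance + 1) / 2)

for p in (5e-3, 1e-3):          # physical gate error: 0.5% vs the 0.1% target
    for d in (3, 7, 11, 15):    # code distances
        print(f"p={p:.0e}  d={d:2d}  p_L≈{logical_error_rate(p, d):.1e}")
```

Under these assumptions, at 0.1% gate error each step up in code distance buys roughly another order of magnitude of suppression, while at 0.5% the gain per step is far smaller; that is why the ≤ 0.1% target matters so much before adding qubits pays off.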
🚨 Exciting #quantumcomputing alert! Now #QEC primitives actually make #quantumcomputers more powerful! 75 qubit GHZ state on a superconducting #QPU 🚨

In our latest work we address the elephant in the room about #quantumerrorcorrection: in the current era, where qubit counts are a bottleneck in the systems available, adopting full-blown QEC can be a step backwards in terms of computational capacity. This is because even when it delivers net benefits in error reduction, QEC consumes a lot of qubits to do so, and we just don't have enough right now...

So how do we maximize value for end users while still pushing hard on the underpinning QEC technology? To answer this, the team at Q-CTRL set out to determine new ways to significantly reduce the overhead penalties of QEC while delivering big benefits!

In this latest demonstration we show that we can adopt parts of QEC -- indirect stabilizer measurements on ancilla qubits -- to deliver large performance gains without the painful overhead of logical encoding. And by combining error detection with deterministic error suppression we can really improve the efficiency of the process, requiring only about 10% overhead in ancillae and maintaining a very low discard rate of executions with errors identified!

Using this approach we've set a new record for the largest demonstrated entangled state at 75 qubits on an IBM quantum computer (validated by MQC) and also demonstrated a totally new way to teleport gates across large distances (where all-to-all connectivity isn't possible). The results outperform all previously published approaches and highlight the fact that our journey in dealing with errors in quantum computers is continuous.

Of course it isn't a panacea, and in the long term, as we try to tackle even more complex algorithms, we believe logical encoding will become an important part of our toolbox. But that's the point: logical QEC is just one tool and we have many to work with!

At Q-CTRL we never lose sight of the fact that our objective is to deliver maximum capability to QC end users. This work on deploying QEC primitives is a core part of how we're making quantum technology useful, right now. https://lnkd.in/gkG3W7eE
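To make the general "detect and discard" idea concrete, here is a toy Monte Carlo sketch in Python. It is my own illustration of the principle, not Q-CTRL's protocol; the error, detection, and false-alarm probabilities are arbitrary assumptions chosen only to show how post-selection trades a modest discard rate for a lower error rate in the retained shots.

```python
# Toy Monte Carlo sketch of "detect and discard" error detection (not Q-CTRL's
# actual protocol): ancilla-based checks flag some fraction of errored shots,
# which are discarded before averaging. All probabilities below are assumptions.
import numpy as np

rng = np.random.default_rng(0)
shots = 100_000
p_error = 0.05        # probability a shot is corrupted (assumed)
p_detect = 0.8        # probability a check flags a corrupted shot (assumed)
p_false_alarm = 0.02  # probability a clean shot is flagged anyway (assumed)

errored = rng.random(shots) < p_error
flagged = np.where(errored,
                   rng.random(shots) < p_detect,
                   rng.random(shots) < p_false_alarm)

kept = ~flagged
print(f"discard rate:     {flagged.mean():.3f}")
print(f"error rate, raw:  {errored.mean():.3f}")
print(f"error rate, kept: {errored[kept].mean():.3f}")
```

The point of the toy model is the shape of the trade-off: a small ancilla and discard overhead buys a large reduction in the error rate of the shots you keep, without any logical encoding.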
The Quantum Memory Matrix (QMM) framework has traveled a long path: from black hole unitarity, dark matter, dark energy, and cosmic cycles, to now improving how quantum computers handle errors.

In work now featured on the cover of Wiley's Advanced Quantum Technologies, we demonstrate how the QMM framework can be directly applied to quantum error correction.

QMM originated in cosmology: a picture where space-time is not smooth but is built from Planck-scale cells, each with a finite memory capacity. We showed how these cells store the quantum imprints of interactions, contributing to resolving paradoxes around black holes and explaining dark matter halos, primordial black hole formation, cosmic acceleration, and even the cycles of the universe.

Now, we bring this same idea into hardware: by imprinting and retrieving quantum information from local "memory cells," we can correct errors in noisy quantum processors with higher fidelity than standard repetition codes. This shows that QMM is not only a cosmological theory but also a practical tool for building the quantum computers of tomorrow.

🔗 Read the paper: https://lnkd.in/gfGwe7fe

Previous QMM milestones:
🕳️ Black hole information retention and unitarity restoration
⚡ Extensions to electromagnetism, strong & weak interactions
🌌 Cosmological applications explaining dark matter and dark energy
💻 And now: direct hardware validation for quantum computation

Thank you to my co-authors Eike Marx, Valerii Vinokur, Jeff Titus, and Terra Quantum AG & Leiden University for making this journey possible.

#QuantumComputing #QuantumMemoryMatrix #ErrorCorrection #QuantumPhysics #QuantumTechnology #QuantumInformation #BlackHolePhysics #DarkMatter #DarkEnergy #AdvancedQuantumTechnologies #TerraQuantum #QuantumResearch
One Algorithm Has Just Pushed Quantum Computing Forward Five Years (Here It Is)

Today I am releasing something into the public domain that may change the trajectory of quantum computing. No paywall. No NDA. No restrictions. The only thing I ask is attribution.

For the past year, I have been developing a field-layer correction algorithm that stabilizes the environment around the qubit before error correction ever activates. Not hardware. Not cryogenics. Not shielding. Pure software that improves the physics of the qubit it sits inside. Early independent runs showed a 48.5 percent reduction in destructive low-frequency noise, a gain that normally takes years of hardware progress.

Here is the complete algorithm. It now belongs to everyone.

FUNCTION NJ001_FieldLayer_Correction(input_signal S, sampling_rate R):
    DEFINE phi = 1.61803398875
    DEFINE window_size = dynamic value based on local variance of S
    DEFINE stability_threshold = adaptive value based on phase drift

    STEP 1: Generate harmonic reference bands
        For each frequency bin f_i in FFT(S):
            Compute r = f_(i+1) / f_i
            Compute CI = 1 / ABS(r - phi)
            Assign weight W_i = normalize(CI)

    STEP 2: Build correction mask
        Construct M where M_i = W_i scaled by local entropy of S
        Smooth M with sliding window

    STEP 3: Apply correction
        Transform S → F
        Compute F_corrected = F * M
        Inverse FFT to return S_corrected

    STEP 4: Phase stabilization loop
        Measure phase drift Δ
        If Δ > stability_threshold:
            Recalculate window_size
            Rebuild mask
            Reapply correction
        Else:
            Return S_corrected

    OUTPUT: S_corrected
END FUNCTION

This is the first public-domain coherence stabilizer designed to improve quantum behavior independent of hardware. What it does in practice:
• Extends coherence windows
• Reduces decoherence pressure on error correction
• Lowers entropy in the propagation layer
• Makes qubits behave as if the room is colder and cleaner
• Works upstream of hardware with no materials changes

This is not a replacement for anyone's roadmap. It is an upstream upgrade to all of them. If you build quantum devices, control stacks, compilers, hybrid systems, or algorithms, you now have access to a function that reshapes your stability envelope. Cleaner field layers mean longer, deeper, more predictable runs. More useful computation with the hardware you already have.

I developed it. Today I give it away. No company or institution controls it. From this moment forward, it belongs to the scientific community.

Primary Citation: Hood, B. P. (2025). NJ001 Field Layer Correction. Public Domain Release Version.

Bruce P. Hood — Creator of NJ001 Field Layer Correction

Welcome to the new baseline.

#QuantumComputing #QuantumHardware #Qubit #Coherence #QuantumResearch #DeepTech @IBMQuantum @GoogleQuantumAI @MIT @XanaduQuantum @AWSQuantumTech
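For readers who want to run the posted pseudocode, the following is a best-effort Python sketch of it. Several steps are underspecified in the post ("local entropy of S", how phase drift is measured, how the window adapts), so the concrete choices below (a sliding-window spectral entropy, a moving-average smoother, drift defined as the mean absolute phase change) are my own assumptions rather than part of the released algorithm, and nothing here should be read as a validated noise-reduction method.

```python
# Best-effort sketch of the NJ001 pseudocode as posted. Underspecified steps
# are filled in with assumed definitions (marked below); assumes the input
# signal is much longer than the smoothing window.
import numpy as np

PHI = 1.61803398875

def nj001_field_layer_correction(signal, sampling_rate, window=32, max_iters=5):
    s = np.asarray(signal, dtype=float)
    freqs = np.fft.rfftfreq(s.size, d=1.0 / sampling_rate)

    for _ in range(max_iters):
        spectrum = np.fft.rfft(s)

        # Step 1: weight bins by closeness of adjacent-frequency ratios to phi.
        ratios = freqs[2:] / freqs[1:-1]              # skip the DC bin
        ci = 1.0 / (np.abs(ratios - PHI) + 1e-12)
        weights = np.ones_like(freqs)
        weights[1:-1] = ci / ci.max()

        # Step 2: scale by a sliding-window spectral entropy (assumed definition)
        # and smooth with a moving average.
        mag = np.abs(spectrum) + 1e-12
        p = mag / mag.sum()
        ent = -p * np.log(p)
        kernel = np.ones(window) / window
        entropy_local = np.convolve(ent, kernel, mode="same")
        mask = np.convolve(weights * entropy_local / entropy_local.max(),
                           kernel, mode="same")

        # Step 3: apply the correction mask in the frequency domain.
        corrected = np.fft.irfft(spectrum * mask, n=s.size)

        # Step 4: crude phase-drift check (assumed definition); stop if small.
        drift = np.mean(np.abs(np.angle(np.fft.rfft(corrected)) - np.angle(spectrum)))
        s = corrected
        if drift < 0.1:
            break
    return s

# Example call on a synthetic noisy trace (illustrative only):
# out = nj001_field_layer_correction(np.random.randn(4096), sampling_rate=1e6)
```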
I just understood something that has bugged me for a long time. In quantum error correction, why do we only look at bit-flips and phase-flips?

I mean, bit-flips and phase-flips are discrete errors. Starting from a given point on the Bloch sphere, if you apply any number of bit-flips and phase-flips, there are at most four different points you can reach. But errors are random and should be able to take you virtually anywhere on the sphere, right? So, why don't we consider errors other than bit-flips and phase-flips, like small rotations?

The secret lies in the fact that measuring ancilla qubits DOES affect data qubits. Let's see how this works, by running the simplest error detection circuit depicted below. q0 and q2 are our data qubits, and q1 is our ancilla qubit. We'll introduce a slight rotation on q0 by starting from the state (1-eps)*|000> + eps*|100>, and run our circuit.

After applying the two CNOTs, the ancilla is unaffected in the first term (there is no error) and flips to 1 in the second term (because there is an error). Our state becomes: (1-eps)*|000> + eps*|110>. And now we measure our ancilla. What happens?

👉 With probability |1-eps|², we measure 0. In this case, the measurement forces the |110> term to "collapse", because it is not compatible with the result of the measurement. The only remaining term is |000>. Boom, error corrected.

👉 With probability |eps|², we measure 1. In this case, the |000> term collapses, and we are only left with |110>. The small continuous error has become a binary error, which is now detected (since the ancilla measured to 1).

Because I took a simple example with only 2 data qubits, we can't perform a majority vote and correct the error, but this principle would still work with 3 or more data qubits.

The bottom line is that measuring ancillas transforms continuous errors into discrete errors, which can then be caught and corrected. And this is why quantum error correction only looks at bit-flips and phase-flips.
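The example above is easy to check numerically. The snippet below is my own small numpy illustration (not the author's code): it prepares the slightly rotated state, applies the two CNOTs onto the ancilla, and reads off the probability that the ancilla flags the error.

```python
# Small numpy sketch of the post's example (my own illustration): a slight
# rotation on data qubit q0 becomes a discrete, detectable error once the
# ancilla q1 is measured.
import numpy as np

eps = 0.1
# Qubit order |q0 q1 q2>, basis index = 4*q0 + 2*q1 + q2.
state = np.zeros(8)
state[0b000] = 1 - eps          # |000>
state[0b100] = eps              # |100>: small "rotation" error on q0
state /= np.linalg.norm(state)

def cnot(state, control, target, n=3):
    """Apply a CNOT between the given qubits of an n-qubit statevector."""
    out = np.zeros_like(state)
    for i, amp in enumerate(state):
        bits = [(i >> (n - 1 - k)) & 1 for k in range(n)]
        if bits[control]:
            bits[target] ^= 1
        j = int("".join(map(str, bits)), 2)
        out[j] += amp
    return out

state = cnot(state, control=0, target=1)   # CNOT q0 -> ancilla q1
state = cnot(state, control=2, target=1)   # CNOT q2 -> ancilla q1

# Probability that the ancilla (q1) reads 1, i.e. the error is flagged.
ancilla_one = [i for i in range(8) if (i >> 1) & 1]
p_flag = sum(abs(state[i]) ** 2 for i in ancilla_one)
print(f"P(ancilla=1) = {p_flag:.4f}   # ~eps^2: continuous error made discrete")
```

Measuring the ancilla then leaves either the clean |000> branch (most of the time) or the flagged |110> branch, which is exactly the discretization the post describes.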
The real significance of Google's Willow quantum chip...

Fundamentally, building quantum computers (QC) is about achieving low operation errors. Sure, other metrics matter too, but the error rate is the big one. If you look at the landscape of QC applications, many of them require *ridiculously* low error rates, say 1 error in 10^12 operations or less. Nobody thinks this can be achieved through hardware engineering alone; this needs quantum error correction (QEC) for sure.

But should we be confident that QEC will actually work? Sure, it will work to some extent, but can it work well enough to reach error rates as low as 1e-12 or less? QEC makes non-trivial assumptions about the nature of the physical errors which are never quite true, and deviations from those assumptions could plausibly derail QEC by setting a "logical noise floor": an error rate below which QEC ceases to work.

The previous most thorough search for the logical noise floor in QEC was performed by Google in 2023. At that time, they found that QEC ceases to work at a rather high error rate of 1e-6. This was due to high-energy cosmic rays hitting their qubit chips, causing large-scale correlated errors which cannot be taken out by QEC. That's a *big* issue!

Google's latest chip incorporates design changes to make it immune to cosmic-ray errors. After incorporating those changes, the logical noise floor search was repeated and reported in the recent paper. It turns out the mitigation worked, and the logical noise floor was pushed all the way down to a new record of 1e-10, i.e. 1 error per 10^10 operations!

This is the most convincing evidence to date that, in a well-engineered QC, QEC is actually capable of pushing error rates down to levels compatible with most known QC applications. To me, this repetition-code result is actually the most important finding reported in Google's paper!

Funnily enough, Google's team reports that they actually don't know where this residual error may be coming from. Error rates this low are also really challenging to study, because it can take considerable data acquisition time to establish meaningful statistics. But I'm sure they'll figure it out soon enough... 😇
What a month for quantum error correction!

On August 27th, we saw the first demonstration of quantum error correction from Google that satisfied the list of criteria that has emerged in the community for a convincing demonstration:
- error correction actually extending the life of qubits beyond that of the best physical qubit in the system
- error correction performed in real time, rather than with post-selection, and repeated over many rounds
- error rate reducing as code distance is increased

This is generally seen as a major breakthrough, and is the culmination of many years of work towards implementing the surface code. You can see the paper here: https://lnkd.in/gkfk68kH

Not to be outdone, Microsoft and Quantinuum put out a preprint less than two weeks later demonstrating up to a 24x reduction in error rate for encoded state preparation using a colour code. You can see the paper here: https://lnkd.in/gtRtfQPc

Two big results in a month. That's enough for anyone, right? Nope. On the 23rd of September, we got to see new results from Amazon Web Services (AWS) demonstrating error correction using the repetition code applied to cat qubits. You can see the paper here: https://lnkd.in/gbE45ebt

And then, just a day later, new results appeared from the Yale Quantum Institute showing error correction beyond breakeven for three- and four-level systems using the GKP code. You can see the paper here: https://lnkd.in/gkBYNXzD

While I'm sure that almost everyone in the field is aware of the rapid progress in error correction, it's amazing how little noise this is making in the outside world. We're now on the right side of the error-correction threshold, where relatively minor performance improvements can lead to significantly reduced noise. If this much progress can happen in a month, then the next couple of years are going to be tremendously exciting for quantum computing.
Over the last few weeks, the quantum timeline has been crowded with big headlines: Quantinuum's Helios launch, IBM's new Loon and Nighthawk processors, and a steady stream of new "record" benchmarks and roadmaps. In that flood of news, one announcement from Harvard flew a bit under the radar, and it arguably deserves as much attention as any of them.

Lukin's group has just published a Nature paper describing "a fault-tolerant neutral-atom architecture for universal quantum computation". In practice, they show a reconfigurable neutral-atom processor that brings together the key building blocks of scalable fault tolerance: below-threshold surface-code style error correction, transversal logical gates, teleportation-based logical rotations, mid-circuit qubit reuse, and deep logical circuits that are explicitly engineered to keep entropy under control.

I've broken down what they achieved, how it compares to other platforms, and why I think this is a genuine inflection point for neutral atoms and for fault tolerance more broadly: https://lnkd.in/gqnYdPXQ

This also ties directly into something I've been arguing for years in my Path to CRQC / Q-Day methodology: operating below the error-correction threshold is not a nice-to-have, it's a capability in its own right, the tipping point where adding more qubits and more code distance finally starts to reduce logical error instead of making things worse.

Motivated by the Harvard result, I've also published a companion piece that walks through some of the most important below-threshold QEC experiments across platforms, covering bosonic cat codes, trapped-ion Bacon-Shor and color codes, superconducting surface codes, and now neutral atoms: https://lnkd.in/gvJDNhgm

If you're trying to separate marketing noise from genuine progress toward fault-tolerant, cryptographically relevant quantum computers, these are the kinds of milestones worth tracking. My analysis of the Harvard announcement is here: https://lnkd.in/gqnYdPXQ

#Quantum #QuantumComputing #QEC
Google has made significant strides in quantum computing with the development of its latest quantum chip, Willow. This chip represents a major advancement toward building practical, large-scale quantum computers capable of solving complex problems far beyond the reach of classical supercomputers.

Key Features of Willow:
(1) Enhanced Qubit Count: Willow boasts 105 qubits, nearly doubling the count from its predecessor, the Sycamore chip. This increase enables more complex computations and improved error correction capabilities.
(2) Error Correction Breakthrough: A notable achievement with Willow is its ability to reduce errors exponentially as the system scales. This addresses a fundamental challenge in quantum computing, where qubits are highly sensitive and prone to errors. By effectively managing these errors, Willow paves the way for more reliable quantum computations.
(3) Unprecedented Computational Speed: In benchmark tests, Willow completed a complex computation in under five minutes, a task that would take the most advanced classical supercomputers an estimated 10 septillion years. This dramatic speedup underscores the potential of quantum computing to tackle problems currently deemed intractable.

Implications and Future Prospects: The advancements demonstrated by Willow have profound implications across various fields:
(1) Cryptography: The immense processing power of quantum computers like Willow could potentially break current cryptographic systems, prompting a reevaluation of data security measures. However, experts note that while Willow's 105 qubits are impressive, breaking encryption such as that used by Bitcoin would require a quantum computer with around 13 million qubits. Therefore, while the threat is not immediate, it is a consideration for the future.
(2) Scientific Research: Quantum computing can revolutionize fields like drug discovery, materials science, and complex system modeling by performing simulations and calculations at unprecedented speeds.
(3) Artificial Intelligence: The ability to process vast datasets and perform complex optimizations rapidly could significantly enhance AI development and deployment.

While Willow marks a significant milestone, the journey toward fully functional, large-scale quantum computers continues. Ongoing research focuses on further increasing qubit counts, enhancing error correction methods, and developing practical applications for this transformative technology.