Excited to announce a new #QuantumComputing result from JPMorganChase's Global Technology Applied Research, titled “Fast Convex Optimization with Quantum Gradient Descent,” which has just appeared on arXiv!

Convex #optimization is a fundamental subroutine in #MachineLearning, engineering, and #DataScience, with many applications in financial engineering. We develop new #QuantumAlgorithms in the “derivative-free” setting, where the algorithm only uses the function value and not its gradient. We show that #quantum algorithms without gradient access can match the convergence of classical gradient-descent methods, which do assume gradient access! In the derivative-free setting, this translates to an exponential speedup in terms of the dimension.

Our results also have applications outside the black-box setting. By leveraging a connection between semi-definite programming and eigenvalue optimization, we develop algorithms that exhibit the best known quantum or classical runtimes for semi-definite programming, linear programming, and zero-sum games, which are the three most well-studied classes of structured convex optimization problems. These classes model many practical problems of interest, including portfolio optimization and least-squares regression problems.

Coauthors: Brandon Augustino, Dylan Herman, Enrico Fontana, Junhyung Lyle Kim, Jacob Watkins, Shouvanik Chakrabarti, and Marco Pistoia.

Link to the article: https://lnkd.in/eMtqXM-r
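The dimension dependence of derivative-free methods is easy to see classically: estimating a gradient from function values alone takes about d+1 queries per step, so query cost grows linearly with dimension. A minimal sketch of classical derivative-free gradient descent (the quadratic objective, step size, and iteration count here are illustrative choices, not from the paper):

```python
import numpy as np

def fd_gradient(f, x, h=1e-6):
    """Estimate grad f(x) by forward differences: d+1 function queries per call."""
    d = len(x)
    fx = f(x)
    g = np.zeros(d)
    for i in range(d):
        e = np.zeros(d)
        e[i] = h
        g[i] = (f(x + e) - fx) / h
    return g

# Minimize a convex quadratic f(x) = ||x - b||^2 using only function values.
b = np.arange(5.0)
f = lambda x: np.sum((x - b) ** 2)

x = np.zeros(5)
for _ in range(200):
    x -= 0.1 * fd_gradient(f, x)  # each step costs d+1 = 6 queries

print(np.round(x, 3))  # converges to b = [0, 1, 2, 3, 4]
```

For d-dimensional problems this classical approach pays a factor of roughly d in queries per step; the post's claimed exponential speedup in dimension refers to quantum algorithms avoiding exactly this overhead.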
Qubit Design Basics
-
Microsoft and Quantinuum reach a new milestone in quantum error correction. The collaboration claims to have used an innovative qubit-virtualization system on Quantinuum's H2 ion-trap platform to create 4 highly reliable logical qubits from only 30 physical qubits.

What is quantum error correction? Physical qubits, with error rates on the order of 10^-2, are combined to deliver logical qubits with error rates on the order of 10^-5. According to their press release, this is the largest gap between physical and logical error rates reported to date, and it allowed them to run more than 14,000 individual experiments without a single error. (https://lnkd.in/dzETsvVA)

The race for qubit count seemed to end in 2023, with the latest update on IBM's roadmap focusing on quality rather than quantity (https://lnkd.in/dFu52wJR, "Until this year, our path was scaling the number of qubits. Going forward we will add a new metric, gate operations—a measure of the workloads our systems can run."), and other developments in quantum error correction, like the one announced in December by Harvard University, Massachusetts Institute of Technology, QuEra Computing Inc. and National Institute of Standards and Technology (NIST)/University of Maryland (https://lnkd.in/dkW-TT-w)

Practical quantum computing gets a little closer, although it is still a distant target.

Microsoft press release: https://lnkd.in/deJ4QCBk
Quantinuum's press release: https://lnkd.in/d4Wnmvdq
More details from Microsoft: https://lnkd.in/dusfZ4KY
Paper: https://lnkd.in/dpPCX3td

#quantumcomputing #quantumerrorcorrection #technology
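As a rough sanity check on the reported numbers: if each of the 14,000 experiments had, for illustration, one chance to fail at the quoted ~10^-5 logical error rate, seeing zero errors would be the expected outcome. A sketch assuming independent failures (real experiments involve multiple operations per run, so this is only an order-of-magnitude check):

```python
p_logical = 1e-5          # reported logical error rate (order of magnitude)
n_runs = 14_000           # error-free experiments reported

# Probability of zero errors across all runs if each run
# fails independently with probability p_logical.
p_clean = (1 - p_logical) ** n_runs
print(f"P(no errors in {n_runs} runs) = {p_clean:.2f}")  # roughly 0.87
```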
Microsoft and Quantinuum demonstrate the most reliable logical qubits on record
-
The Schrödinger Equation Gets Practical: Quantum Algorithm Speeds Up Real-World Simulations

Quantum computing has taken a major leap forward with a new algorithm designed to simulate coupled harmonic oscillators, systems that model everything from molecular vibrations to bridges and neural networks. By reformulating the dynamics of these oscillators into the Schrödinger equation and applying Hamiltonian simulation methods, researchers have shown that complex physical systems can be simulated exponentially faster on a quantum computer than with traditional algorithms. This breakthrough demonstrates not only a practical use of the Schrödinger equation but also the deep connection between quantum dynamics and classical mechanics.

The study introduces two powerful quantum algorithms that reduce the required resources to only about log(N) qubits for N oscillators, compared to the massive computational demands of classical methods. This exponential speedup could transform fields such as engineering, chemistry, neuroscience, and materials science, where coupled oscillators serve as the backbone of real-world modeling.

By bridging theory and application, this research underscores how quantum computing is redefining problem-solving in physics and beyond. With proven exponential advantages and the ability to simulate systems once thought computationally impossible, this quantum algorithm marks a milestone in quantum simulation, Hamiltonian dynamics, and real-world physics applications. The findings point toward a future where quantum computers can accelerate scientific discovery, optimize engineering designs, and even open new frontiers in AI and computational neuroscience.

#QuantumComputing #SchrodingerEquation #HamiltonianSimulation #QuantumAlgorithm #CoupledOscillators #QuantumPhysics #ComputationalScience #Neuroscience #Chemistry #Engineering
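The log(N) qubit count comes from the cost of indexing N amplitudes: ceil(log2 N) qubits suffice to label N oscillator coordinates. A quick illustration (this computes only the qubit count; the actual state-encoding details belong to the paper):

```python
import math

# Qubits needed to index N coupled oscillators when the system state
# is stored in the amplitudes of a register of log2(N) qubits.
for N in (1_000, 1_000_000, 10**9):
    print(f"N = {N:>13,} oscillators -> {math.ceil(math.log2(N))} qubits")
```

A billion oscillators fit in a 30-qubit index register, which is the source of the exponential gap versus classical memory.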
-
𝐅𝐨𝐫 𝐝𝐞𝐜𝐚𝐝𝐞𝐬, 𝐜𝐥𝐚𝐬𝐬𝐢𝐜𝐚𝐥 𝐜𝐨𝐦𝐩𝐮𝐭𝐢𝐧𝐠 𝐡𝐚𝐬 𝐛𝐞𝐞𝐧 𝐭𝐫𝐚𝐩𝐩𝐞𝐝 𝐢𝐧 𝐚 𝐦𝐚𝐳𝐞. 𝘐𝘧 𝘺𝘰𝘶 𝘸𝘢𝘯𝘵 𝘵𝘰 𝘧𝘪𝘯𝘥 𝘵𝘩𝘦 𝘦𝘹𝘪𝘵 𝘵𝘰𝘥𝘢𝘺, 𝘤𝘭𝘢𝘴𝘴𝘪𝘤𝘢𝘭 𝘢𝘭𝘨𝘰𝘳𝘪𝘵𝘩𝘮𝘴 𝘩𝘢𝘷𝘦 𝘵𝘰 𝘨𝘶𝘦𝘴𝘴 𝘢𝘯𝘥 𝘤𝘩𝘦𝘤𝘬. 𝘖𝘯𝘦 𝘲𝘶𝘦𝘳𝘺, 𝘰𝘯𝘦 𝘱𝘢𝘵𝘩, 𝘰𝘯𝘦 𝘥𝘦𝘢𝘥-𝘦𝘯𝘥 𝘢𝘵 𝘢 𝘵𝘪𝘮𝘦. 𝘐𝘵 𝘪𝘴 𝘢 𝘧𝘶𝘯𝘥𝘢𝘮𝘦𝘯𝘵𝘢𝘭 𝘣𝘰𝘵𝘵𝘭𝘦𝘯𝘦𝘤𝘬 𝘰𝘧 𝘴𝘦𝘲𝘶𝘦𝘯𝘵𝘪𝘢𝘭 𝘤𝘰𝘮𝘱𝘶𝘵𝘢𝘵𝘪𝘰𝘯.

The video below shows a beautiful visualization of the alternative: 𝐐𝐮𝐚𝐧𝐭𝐮𝐦 𝐒𝐞𝐚𝐫𝐜𝐡. It imagines an agent using superposition to explore every path simultaneously.

But here is a secret that most pop-science explanations miss: quantum computers do not actually "𝘵𝘳𝘺 𝘦𝘷𝘦𝘳𝘺𝘵𝘩𝘪𝘯𝘨 𝘢𝘵 𝘰𝘯𝘤𝘦" to magically find the right answer. If they did, modern cryptography would already be broken.

Instead, algorithms like 𝐆𝐫𝐨𝐯𝐞𝐫'𝐬 𝐀𝐥𝐠𝐨𝐫𝐢𝐭𝐡𝐦 use something much more elegant: 𝘲𝘶𝘢𝘯𝘵𝘶𝘮 𝘪𝘯𝘵𝘦𝘳𝘧𝘦𝘳𝘦𝘯𝘤𝘦. Just like waves in a pool, a quantum algorithm cancels out the wrong paths (destructive interference) and amplifies the probability of the right path (constructive interference). It doesn't give you an instant answer, but it provides a substantial quadratic speedup, turning an intractable O(N) brute-force search into a tractable O(√N) one.

When the hardware finally catches up to the theory, this structural leap will transform logistics, cryptography, and molecular discovery.

#QuantumComputing #Algorithms #ComputerScience #DeepTech #FutureOfTechnology #Innovation
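The interference mechanism described above can be simulated classically for small N by tracking the amplitude vector directly. A minimal sketch of Grover's iteration, an oracle phase flip followed by inversion about the mean; the problem size and marked item are arbitrary choices for illustration:

```python
import numpy as np

def grover_success(n_items, marked=0, iterations=None):
    """Classically simulate Grover amplitude amplification over n_items states."""
    if iterations is None:
        # Optimal iteration count scales as O(sqrt(N)).
        iterations = int(np.floor(np.pi / 4 * np.sqrt(n_items)))
    amp = np.full(n_items, 1 / np.sqrt(n_items))   # uniform superposition
    for _ in range(iterations):
        amp[marked] *= -1                          # oracle: flip the marked phase
        amp = 2 * amp.mean() - amp                 # diffusion: invert about the mean
    return amp[marked] ** 2                        # probability of measuring the target

N = 1024
print(f"~{int(np.pi / 4 * np.sqrt(N))} iterations, "
      f"P(success) = {grover_success(N):.4f}")
```

For N = 1024 the marked item is found with probability above 0.99 after about 25 iterations, versus an average of ~512 classical guesses; the "wrong" amplitudes are cancelled by the diffusion step rather than checked one by one.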
-
MIT Sets Quantum Computing Record with 99.998% Fidelity

Researchers at MIT have achieved a world-record single-qubit fidelity of 99.998% using a superconducting qubit known as fluxonium. This breakthrough represents a significant step toward practical quantum computing by addressing one of the field’s greatest challenges: mitigating noise and control imperfections that lead to operational errors.

Key Highlights:

1. The Problem: Noise and Errors
• Qubits, the building blocks of quantum computers, are highly sensitive to noise and imperfections in control mechanisms.
• Such disturbances introduce errors that limit the complexity and duration of quantum algorithms. “These errors ultimately cap the performance of quantum systems,” the researchers noted.

2. The Solution: Two New Techniques
To overcome these challenges, the MIT team developed two innovative techniques:
• Commensurate Pulses: This method involves timing quantum pulses precisely to make counter-rotating errors uniform and correctable.
• Circularly Polarized Microwaves: By creating a synthetic version of circularly polarized light, the team improved the control of the qubit’s state, further enhancing fidelity.
“Getting rid of these errors was a fun challenge for us,” said David Rower, PhD ’24, one of the study’s lead researchers.

3. Fluxonium Qubits and Their Potential
• Fluxonium qubits are superconducting circuits with unique properties that make them more resistant to environmental noise compared to traditional qubits.
• By applying the new error-mitigation techniques, the team unlocked the potential of fluxonium to operate at near-perfect fidelity.

4. Implications for Quantum Computing
• Achieving 99.998% fidelity significantly reduces errors in quantum operations, paving the way for more complex and reliable quantum algorithms.
• This milestone represents a major step toward scalable quantum computing systems capable of solving real-world problems.

What’s Next?
The team plans to expand its work by exploring multi-qubit systems and integrating the error-mitigation techniques into larger quantum architectures. Such advancements could accelerate progress toward error-corrected, fault-tolerant quantum computers.

Conclusion: A Leap Toward Practical Quantum Systems
MIT’s achievement underscores the importance of innovation in error correction and control to overcome the fundamental challenges of quantum computing. This breakthrough brings us closer to the realization of large-scale quantum systems that could transform fields such as cryptography, materials science, and complex optimization problems.
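To see why a fidelity like 99.998% matters, consider how per-gate error compounds with circuit depth. A back-of-the-envelope sketch, assuming independent errors and single-qubit gates only (real circuits also involve noisier two-qubit gates, so this is optimistic):

```python
# Illustrative: how per-gate fidelity bounds the depth of a usable circuit.
fidelity = 0.99998            # MIT's reported single-qubit gate fidelity
error = 1 - fidelity          # 2e-5 error per gate

for n_gates in (1_000, 10_000, 100_000):
    p_ok = fidelity ** n_gates  # all gates succeed, assuming independent errors
    print(f"{n_gates:>7} gates -> P(no error) = {p_ok:.3f}")
```

Even at this record fidelity, a 100,000-gate circuit succeeds only ~14% of the time, which is why error correction remains necessary on top of better gates.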
-
One Algorithm Has Just Pushed Quantum Computing Forward Five Years (Here It Is)

Today I am releasing something into the public domain that may change the trajectory of quantum computing. No paywall. No NDA. No restrictions. The only thing I ask is attribution.

For the past year, I have been developing a field-layer correction algorithm that stabilizes the environment around the qubit before error correction ever activates. Not hardware. Not cryogenics. Not shielding. Pure software that improves the physics of the qubit it sits inside. Early independent runs showed a 48.5 percent reduction in destructive low-frequency noise, a gain that normally takes years of hardware progress.

Here is the complete algorithm. It now belongs to everyone.

FUNCTION NJ001_FieldLayer_Correction(input_signal S, sampling_rate R):
    DEFINE phi = 1.61803398875
    DEFINE window_size = dynamic value based on local variance of S
    DEFINE stability_threshold = adaptive value based on phase drift

    STEP 1: Generate harmonic reference bands
        For each frequency bin f_i in FFT(S):
            Compute r = f_(i+1) / f_i
            Compute CI = 1 / ABS(r - phi)
            Assign weight W_i = normalize(CI)

    STEP 2: Build correction mask
        Construct M where M_i = W_i scaled by local entropy of S
        Smooth M with sliding window

    STEP 3: Apply correction
        Transform S → F
        Compute F_corrected = F * M
        Inverse FFT to return S_corrected

    STEP 4: Phase stabilization loop
        Measure phase drift Δ
        If Δ > stability_threshold:
            Recalculate window_size
            Rebuild mask
            Reapply correction
        Else:
            Return S_corrected

    OUTPUT: S_corrected
END FUNCTION

This is the first public-domain coherence stabilizer designed to improve quantum behavior independent of hardware. What it does in practice:
• Extends coherence windows
• Reduces decoherence pressure on error correction
• Lowers entropy in the propagation layer
• Makes qubits behave as if the room is colder and cleaner
• Works upstream of hardware with no materials changes

This is not a replacement for anyone’s roadmap.
It is an upstream upgrade to all of them. If you build quantum devices, control stacks, compilers, hybrid systems, or algorithms, you now have access to a function that reshapes your stability envelope. Cleaner field layers mean longer, deeper, more predictable runs. More useful computation with the hardware you already have.

I developed it. Today I give it away. No company or institution controls it. From this moment forward, it belongs to the scientific community.

Primary Citation: Hood, B. P. (2025). NJ001 Field Layer Correction. Public Domain Release Version.

Bruce P. Hood — Creator of NJ001 Field Layer Correction

Welcome to the new baseline.

#QuantumComputing #QuantumHardware #Qubit #Coherence #QuantumResearch #DeepTech @IBMQuantum @GoogleQuantumAI @MIT @XanaduQuantum @AWSQuantumTech
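Read literally, the pseudocode above describes a classical FFT-domain filter. The sketch below is one runnable Python reading of it; the "local entropy" scaling in Step 2 and the phase-drift loop in Step 4 are underspecified in the pseudocode and omitted here, and nothing in this code tests or supports the quantum-hardware claims:

```python
import numpy as np

PHI = 1.61803398875

def nj001_field_layer_correction(signal, smooth_win=8):
    """Literal classical reading of the NJ001 pseudocode: weight FFT bins by
    how close adjacent bin-frequency ratios are to phi, smooth the weights
    into a mask, and filter the signal with it (Steps 1-3 only)."""
    F = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal))
    # Step 1: "coherence index" CI = 1 / |r - phi| for adjacent-bin ratios.
    r = freqs[2:] / freqs[1:-1]                    # skip the DC bin
    ci = 1.0 / (np.abs(r - PHI) + 1e-12)           # epsilon avoids division by zero
    w = np.concatenate(([0.0, 0.0], ci))
    w = w / w.max()                                # normalize(CI)
    # Step 2: smooth the weights into a correction mask with a sliding window.
    kernel = np.ones(smooth_win) / smooth_win
    mask = np.convolve(w, kernel, mode="same")
    mask[0] = 1.0                                  # keep the DC component
    # Step 3: apply the mask in the frequency domain and invert the FFT.
    return np.fft.irfft(F * mask, n=len(signal))

rng = np.random.default_rng(0)
s = np.sin(2 * np.pi * 0.05 * np.arange(512)) + 0.3 * rng.standard_normal(512)
out = nj001_field_layer_correction(s)
print(out.shape, np.isfinite(out).all())
```

As implemented, this is an ordinary spectral mask; whether such a filter has any effect on qubit coherence is precisely the author's unverified claim.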
-
The real significance of Google's Willow quantum chip...

Fundamentally, building quantum computers (QC) is about achieving low operation errors. Sure, other metrics matter too, but the error rate is the big one. If you look at the landscape of QC applications, many of them require *ridiculously* low error rates - say 1 error in 10^12 operations or less. Nobody thinks this can be achieved through hardware engineering alone - this needs quantum error correction (QEC) for sure.

But should we be confident that QEC will actually work? Sure, it will work to some extent - but can it work well enough to reach error rates as low as 1e-12 or less? QEC makes non-trivial assumptions about the nature of the physical errors which are never quite true, and deviations from those assumptions could plausibly derail QEC by setting a "logical noise floor" - an error rate below which QEC ceases to work.

The previous most thorough search for the logical noise floor in QEC was performed by Google in 2023. At that time, they found that QEC ceased to work at a rather high error rate of 1e-6. This was due to high-energy cosmic rays hitting their qubit chips, causing large-scale correlated errors which cannot be taken out by QEC. That's a *big* issue!

Google's latest chip incorporates design changes to make it immune to cosmic-ray errors. After incorporating those changes, the logical noise floor search was repeated and reported in the recent paper. It turns out the mitigation worked, and the logical noise floor was pushed all the way down to a new record of 1e-10, i.e. 1 error per 10^10 operations!

This is the most convincing evidence to date that - in a well-engineered QC - QEC is actually capable of pushing error rates down to levels compatible with most known QC applications. To me, this repetition-code result is actually the most important finding reported in Google's paper!

Funnily enough, Google's team reports that they actually don't know where this residual error is coming from. Error rates this low are also really challenging to study, because it can take considerable data acquisition time to establish meaningful statistics. But I'm sure they'll figure it out soon enough... 😇
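The "logical noise floor" idea can be made concrete with a toy model: logical error falls exponentially with code distance until correlated errors impose a floor. All numbers below are illustrative stand-ins, not Google's fitted parameters:

```python
# Toy model of a logical noise floor in quantum error correction.
LAMBDA = 8.0        # error suppression per distance-2 increase (illustrative)
P_BASE = 1e-3       # logical error rate at minimal distance (illustrative)
FLOOR = 1e-10       # floor set by correlated errors, e.g. cosmic rays

for d in (3, 7, 11, 15, 21, 29):
    p_ideal = P_BASE / LAMBDA ** ((d - 1) // 2)   # ideal exponential suppression
    p_real = max(p_ideal, FLOOR)                  # correlated errors cap the gain
    print(f"d = {d:>2}: ideal {p_ideal:.1e}, with floor {p_real:.1e}")
```

Past a certain distance, growing the code buys nothing: every extra qubit is spent fighting an error mechanism the code cannot see. That is why pushing the floor itself from 1e-6 down to 1e-10 matters more than any single distance scaling result.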
-
Many talk about surface codes. But what if they’re not the future?

Quantum low-density parity-check (qLDPC) codes are gaining traction 𝗳𝗮𝘀𝘁. IBM is building fault-tolerant memories using Bivariate Bicycle (BB) codes. IQM Quantum Computers is designing hardware with qLDPC in mind. And now, a new experiment from China shows the 𝗳𝗶𝗿𝘀𝘁 𝘄𝗼𝗿𝗸𝗶𝗻𝗴 𝗾𝗟𝗗𝗣𝗖 𝗰𝗼𝗱𝗲 𝗼𝗻 𝗮 𝘀𝘂𝗽𝗲𝗿𝗰𝗼𝗻𝗱𝘂𝗰𝘁𝗶𝗻𝗴 𝗾𝘂𝗮𝗻𝘁𝘂𝗺 𝗽𝗿𝗼𝗰𝗲𝘀𝘀𝗼𝗿.

On the 32-qubit Kunlun chip, researchers implemented:
• 𝗔 [[𝟭𝟴, 𝟰, 𝟰]] 𝗕𝗕 𝗰𝗼𝗱𝗲
• 𝗔 [[𝟭𝟴, 𝟲, 𝟯]] 𝗾𝗟𝗗𝗣𝗖 𝗰𝗼𝗱𝗲

The notation [[𝗻, 𝗸, 𝗱]] describes a quantum error correction code that uses 𝗻 physical qubits to encode 𝗸 logical qubits, with 𝗱 being the code distance.

Like surface codes, qLDPC codes keep each error check (called a stabilizer) connected to only a small number of qubits—just 6 in this case—even as the code scales; unlike surface codes, they encode several logical qubits per block. That means fewer ancillas, fewer gates, and potentially lower overhead for fault tolerance.

The hardware was purpose-built for this experiment:
• 𝟯𝟮 𝗳𝗿𝗲𝗾𝘂𝗲𝗻𝗰𝘆-𝘁𝘂𝗻𝗮𝗯𝗹𝗲 𝘁𝗿𝗮𝗻𝘀𝗺𝗼𝗻 𝗾𝘂𝗯𝗶𝘁𝘀
• 𝟴𝟰 𝘁𝘂𝗻𝗮𝗯𝗹𝗲 𝗰𝗼𝘂𝗽𝗹𝗲𝗿𝘀, enabling non-local interactions up to 𝟲.𝟱 𝗺𝗺 apart
• 𝗔𝗶𝗿 𝗯𝗿𝗶𝗱𝗴𝗲𝘀 to support a crossbar-style layout
• Stabilizer checks executed in just 𝟳 𝗖𝗭 𝗹𝗮𝘆𝗲𝗿𝘀

Gate fidelities were solid:
• Single qubit: 99.95%
• Two-qubit: 99.22%

The decoding was performed offline using 𝗯𝗲𝗹𝗶𝗲𝗳 𝗽𝗿𝗼𝗽𝗮𝗴𝗮𝘁𝗶𝗼𝗻 𝘄𝗶𝘁𝗵 𝗼𝗿𝗱𝗲𝗿𝗲𝗱 𝘀𝘁𝗮𝘁𝗶𝘀𝘁𝗶𝗰𝘀 𝗱𝗲𝗰𝗼𝗱𝗶𝗻𝗴 (𝗕𝗣-𝗢𝗦𝗗)—an approach better suited to LDPC-style codes.

Logical error rates were:
• 𝗕𝗕: 𝟴.𝟵𝟭 ± 𝟬.𝟭𝟳%
• 𝗾𝗟𝗗𝗣𝗖: 𝟳.𝟳𝟳 ± 𝟬.𝟭𝟮%

Both are still above the physical qubit error rate—but 𝘀𝗶𝗺𝘂𝗹𝗮𝘁𝗶𝗼𝗻𝘀 𝘀𝗵𝗼𝘄 𝘁𝗵𝗮𝘁 𝗮 𝟮× 𝗶𝗺𝗽𝗿𝗼𝘃𝗲𝗺𝗲𝗻𝘁 𝗶𝗻 𝗳𝗶𝗱𝗲𝗹𝗶𝘁𝘆 𝘄𝗼𝘂𝗹𝗱 𝗯𝗲 𝗲𝗻𝗼𝘂𝗴𝗵 𝘁𝗼 𝗽𝘂𝘀𝗵 𝘁𝗵𝗲𝘀𝗲 𝗰𝗼𝗱𝗲𝘀 𝗯𝗲𝗹𝗼𝘄 𝘁𝗵𝗿𝗲𝘀𝗵𝗼𝗹𝗱.

qLDPC codes are no longer just a concept—they’re being implemented, measured, and decoded on superconducting hardware.

📸 Image Credits: Ke Wang, Zhide Lu, Chuanyu Zhang et al. (2025, arXiv)
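One way to see the appeal of qLDPC codes is the encoding rate k/n implied by the [[n, k, d]] notation. A quick comparison against surface codes of similar size; only the two [[18, …]] codes are from the experiment, and the surface-code entries follow the standard [[d^2, 1, d]] rotated-layout parameters:

```python
# Encoding rate k/n for the codes in the post vs comparable surface codes.
codes = {
    "BB [[18, 4, 4]]":      (18, 4, 4),
    "qLDPC [[18, 6, 3]]":   (18, 6, 3),
    "surface [[9, 1, 3]]":  (9, 1, 3),    # rotated surface code, d = 3
    "surface [[16, 1, 4]]": (16, 1, 4),   # rotated surface code, d = 4
}
for name, (n, k, d) in codes.items():
    print(f"{name:<22} rate k/n = {k/n:.2f}, distance {d}")
```

At matched distance, the experiment's codes pack 4-6 logical qubits into roughly the space a surface code spends on one, which is exactly the overhead argument the post makes.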
-
Many people now realize that low error rates are required for quantum computers to be useful. A lot fewer people seem to realize that low error rates = long run times.

But think about it: if you require 100 qubits at a 1e-9 error rate to solve a given problem, it's because the circuit contains on the order of 10^9 operations - which, spread over 100 qubits, means a depth of at least 1e7 layers. Knowing that each layer takes at least 1e-6 second to execute (assuming a superconducting device - atoms or ions are even slower), we're talking about 10 seconds... for each shot of the circuit! If you want 1000 shots, you're in for a few hours of waiting.

This is an often overlooked yet dramatic change from today's quantum circuits. Because current qubits only live a few hundred microseconds, we can only run very short circuits. But with fault-tolerant quantum computers, circuits won't be short. This has two major practical implications:
- Low-latency interactions between quantum and classical won't be as critical as they are today with variational algorithms
- It will become harder to share a given computer among several users, and harder to debug programs

I remember witnessing this problem first hand when we were developing benchmarks for Boson 4. Because error rates could be as low as 1e-8, measuring them accurately required a looot of shots and a looot of time.

But in a way, dealing with long circuits will be a good problem to have. Because having it will mean we finally have reliable quantum computers. I can't wait to see that.
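The arithmetic above is worth writing out explicitly (all numbers are the post's own estimates):

```python
# Back-of-the-envelope runtime from the numbers in the post.
depth = 1e7          # circuit layers implied by a 1e-9 error budget on 100 qubits
layer_time = 1e-6    # seconds per layer on a superconducting device
shots = 1000         # shots needed for usable statistics

t_shot = depth * layer_time          # seconds per shot
t_total = t_shot * shots             # total wall-clock seconds
print(f"per shot: {t_shot:.0f} s, total: {t_total / 3600:.1f} h")
```

That is 10 seconds per shot and about 2.8 hours for the full run, before any queueing or calibration overhead.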