Ok, that's huge: "Exponential quantum advantage in processing massive classical data" by Haimeng Zhao, Alexander Zlokapa, Hartmut Neven, Ryan Babbush, John Preskill, Jarrod R. McClean, Hsin-Yuan (Robert) Huang

Abstract: Broadly applicable quantum advantage, particularly in classical data processing and machine learning, has been a fundamental open problem. In this work, we prove that a small quantum computer of polylogarithmic size can perform large-scale classification and dimension reduction on massive classical data by processing samples on the fly, whereas any classical machine achieving the same prediction performance requires exponentially larger size. Furthermore, classical machines that are exponentially larger yet below the required size need superpolynomially more samples and time. We validate these quantum advantages in real-world applications, including single-cell RNA sequencing and movie review sentiment analysis, demonstrating four to six orders of magnitude reduction in size with fewer than 60 logical qubits. These quantum advantages are enabled by quantum oracle sketching, an algorithm for accessing the classical world in quantum superposition using only random classical data samples. Combined with classical shadows, our algorithm circumvents the data loading and readout bottleneck to construct succinct classical models from massive classical data, a task provably impossible for any classical machine that is not exponentially larger than the quantum machine. These quantum advantages persist even when classical machines are granted unlimited time or if BPP=BQP, and rely only on the correctness of quantum mechanics. Together, our results establish machine learning on classical data as a broad and natural domain of quantum advantage and a fundamental test of quantum mechanics at the complexity frontier.

Link: https://lnkd.in/gmA-ntVU
#quantummachinelearning #quantumcomputing #research #paper #bigdata #logicalqubits
Proving Performance in Quantum Computing
Summary
Proving performance in quantum computing means demonstrating that quantum computers can solve certain real-world problems faster or more efficiently than traditional (classical) computers. This often involves carefully designed experiments and mathematical proofs to show that a quantum advantage exists, especially in fields like data analysis, chemistry, and machine learning.
- Show measurable advantage: Focus on tasks where quantum computers can outperform classical systems in speed or memory usage, such as simulating molecules or processing massive datasets.
- Prioritize trustworthy results: Build experiments where the output can be independently verified to ensure the results aren't just theoretical but practical and reliable for real applications.
- Address hardware challenges: Recognize the importance of error correction, circuit design, and integration with classical workflows to make proven performance meaningful outside the lab.
Lockheed and IBM Use Quantum Computing to Solve Chemistry Puzzle Once Thought Impossible

Introduction: Cracking a Chemical Code with Quantum Power
In a breakthrough for quantum chemistry, Lockheed Martin and IBM have successfully used quantum computing to model the complex electronic structure of an "open-shell" molecule, a challenge that has defied classical computing for years. This marks the first application of the sample-based quantum diagonalization (SQD) method to such systems and signals a significant advance in the practical application of quantum computing for scientific research.

Key Highlights from the Collaboration
• The Molecule: Methylene (CH₂): Methylene is an open-shell molecule, meaning it has unpaired electrons that lead to complex quantum behavior. These molecules are notoriously difficult to simulate accurately because electron correlations create exponentially growing complexity for classical algorithms.
• The Innovation: Sample-Based Quantum Diagonalization (SQD): The team used IBM's quantum processor to implement SQD for the first time in an open-shell system. SQD is a hybrid algorithm that leverages quantum sampling to solve eigenvalue problems in quantum chemistry, reducing computational burdens.
• Why Classical Methods Fall Short: Traditional high-performance computing (HPC) platforms struggle with electron correlation in multi-electron systems. Approximation techniques become prohibitively expensive as system size increases, especially for reactive or radical species like methylene.
• Quantum Advantage in Practice: Quantum processors can represent electron configurations using entangled qubits, offering more scalable solutions. By simulating the electronic structure directly, quantum methods could help scientists design new materials, catalysts, and pharmaceuticals faster and more efficiently.

Why It Matters: Pushing Past the Limits of Classical Chemistry
• Industrial and Scientific Impact: Simulating open-shell systems is vital for battery design, combustion processes, and metalloprotein modeling. The success of SQD opens the door to accurate modeling of previously inaccessible molecules, potentially accelerating innovations in energy, health, and aerospace.
• Defense and Aerospace Relevance: Lockheed Martin's involvement reflects strategic interest in applying quantum computing to defense-grade materials and mission-critical chemistry.
• Quantum Chemistry as a Flagship Use Case: This achievement underscores how quantum computing is beginning to deliver real results in scientific domains where classical methods hit their ceiling. As quantum hardware improves, the number of solvable molecular systems will expand.

Quantum computing just helped humanity take a critical step into the chemical unknown, proving its value not just in theory, but in practice.

Keith King
https://lnkd.in/gHPvUttw
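The core move in SQD, diagonalizing the Hamiltonian within a subspace spanned by quantum-sampled electron configurations, can be illustrated classically. Below is a minimal numpy sketch under toy assumptions: a small dense Hermitian matrix stands in for the molecular Hamiltonian, and a hypothetical list of sampled basis-state indices stands in for the configurations drawn from the quantum processor.

```python
import numpy as np

def sqd_ground_energy(hamiltonian, sampled_states):
    """Project `hamiltonian` onto the subspace spanned by the sampled
    computational-basis states and return its lowest eigenvalue."""
    idx = sorted(set(sampled_states))       # deduplicate sampled bitstrings
    sub_h = hamiltonian[np.ix_(idx, idx)]   # subspace Hamiltonian
    return np.linalg.eigvalsh(sub_h).min()

# Toy 4x4 Hermitian "Hamiltonian" over two qubits (hypothetical values).
rng = np.random.default_rng(0)
a = rng.normal(size=(4, 4))
h = (a + a.T) / 2

exact = np.linalg.eigvalsh(h).min()
# Sampling every basis state recovers the exact ground energy; a partial
# sample gives a variational upper bound (Cauchy interlacing).
approx = sqd_ground_energy(h, [0, 1, 2, 3])
partial = sqd_ground_energy(h, [0, 2])
```

The projection is variational, so adding more sampled configurations can only lower the estimate toward the true ground energy; the quantum processor's job is to concentrate the samples on the configurations that matter.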
-
Yesterday, Google announced it had achieved something called a "verifiable quantum advantage". The announcement might sound like marketing mush, but it's not. It represents one of the most interesting inflection points in the story of computing since the transistor.

For decades, the dream of quantum computing has dangled like science fiction: machines that use the strange rules of quantum mechanics to solve problems that would take supercomputers millennia. In 2019, Google claimed quantum supremacy, meaning their quantum computer solved a problem no classical computer could feasibly do in a reasonable timeframe. But that problem was a glorified dice roll: random number sampling. A proof of principle, not of purpose.

Their latest claim, quantum advantage, goes further. It says a quantum machine has outperformed the best classical algorithms on a task that's scientifically meaningful. In their experiment, Google's Willow processor, a 105-qubit superconducting chip, ran an algorithm called Quantum Echoes to model how information spreads and decoheres - essentially, how order unravels into chaos inside quantum systems. That's the kind of math that underpins chemistry, materials science, and condensed-matter physics. Willow completed the task 13,000x faster than the world's best supercomputers, while remaining verifiable - that is, its output could be independently checked. In other words, the machine wasn't playing a party trick anymore; it was doing science.

Every era of computing begins with a strange, narrow demo that later looks obvious in hindsight.
➰ The Wright brothers' first flight lasted 12 seconds - not exactly air travel.
➰ The first transistor amplified a single signal - not exactly an iPhone.
➰ The first webpage looked like a grocery list - not exactly the internet.
Google's quantum milestone feels the same. A narrow, technical victory that, decades later, we'll point to and say: that's when the impossible started to feel inevitable.
Of course, the hype shouldn't outrun the hardware. Quantum systems face 3 towering challenges:
▪️ Error correction: Qubits are noisy - one stray photon can flip a bit of reality.
▪️ Scalability: Doubling qubits isn't like doubling transistors; coherence decays exponentially.
▪️ Integration: Quantum systems must coexist with classical infrastructure - data movement, cooling, algorithms, verification.

For now, the near horizon is hybrid quantum-classical computing, where quantum processors handle intractable subproblems inside classical workflows. For the past 80 years, computing has been about logic - zeros and ones manipulating symbols. Quantum computing is about reality itself: entanglement, superposition, uncertainty. It represents a paradigm where the map is the territory - where we use the universe's own rules to understand the universe. In that sense, the shift from quantum supremacy to advantage mirrors the shift from theory to instrument - from "it works" to "it works for us."
-
Thought you knew which #quantumcomputers were best for #quantum optimization? The latest results from Q-CTRL have reset expectations for what is possible on today's gate-model machines.

Q-CTRL today announced newly published results that demonstrate a boost of more than 4X in the size of an optimization problem that can be accurately solved, and show for the first time that a utility-scale IBM quantum computer can outperform competitive annealer and trapped-ion technologies. Full, correct solutions at 120+ qubit scale for classically nontrivial optimizations!

Quantum optimization is one of the most promising quantum computing applications, with the potential to deliver major enhancements to critical problems in transport, logistics, machine learning, and financial fraud detection. McKinsey suggests that quantum applications in logistics alone could be worth $200-500B per year by 2035 - if the quantum sector can successfully solve them.

Previous third-party benchmark quantum optimization experiments have indicated that, despite their promise, gate-based quantum computers have struggled to live up to their potential because of hardware errors. In previous tests of optimization algorithms, the outputs of the gate-based quantum computers were little different than random outputs or provided modest benefits under limited circumstances. As a result, an alternative architecture known as a quantum annealer was believed - and shown in experiments - to be the preferred choice for exploring industrially relevant optimization problems. Today's quantum computers were thought to be far away from being able to solve quantum optimization problems that matter to industry.

Q-CTRL's recent results upend this broadly accepted industry narrative by addressing the error challenge. Our methods combine innovations in the problem's hardware execution with the company's performance-management infrastructure software run on IBM's utility-scale quantum computers.
This combination delivered improved performance previously limited by errors, with no changes to the hardware. Direct tests showed that using Q-CTRL's novel technology, a quantum optimization problem run on a 127-qubit IBM quantum computer was up to 1,500 times more likely than an annealer to return the correct result, and over 9 times more likely to achieve the correct result than previously published work using trapped ions. These results enable quantum optimization algorithms to more consistently find the correct solution to a range of challenging optimization problems at larger scales than ever before. Check out the technical manuscript! https://lnkd.in/gRYAFsRt
-
A new preprint from Google Quantum AI, Caltech, MIT, and Oratomic just made a bold claim: a quantum computer with fewer than 60 logical qubits can outperform any classical machine with exponentially more memory on real machine learning tasks (arXiv:2604.07639). Actual sentiment analysis on IMDb reviews and cell-type classification of scRNA-seq data.

The core idea is called quantum oracle sketching. Instead of loading an entire dataset into quantum memory at once (which has always been the Achilles heel of quantum ML), the algorithm streams data one sample at a time. Each sample drives a tiny rotation of the quantum state, a phase gate whose angle is proportional to the data value divided by the total number of samples. After processing enough samples, these microscopic rotations accumulate into an approximate quantum oracle for the full dataset, without ever storing the dataset itself. Because the circuit is deterministically constructed from data rather than trained by gradient descent, it also sidesteps the barren plateau problem that plagues variational quantum approaches.

That said, some challenges remain. The circuit depth scales linearly with the dataset size, which means wall-clock runtime grows with N even as memory stays tiny. And on conventional fault-tolerant hardware, those arbitrary-angle rotation gates must be approximated through T gate synthesis and magic state distillation, an overhead the paper does not account for and one that could easily dwarf the rest of the computation.

If you followed my post on the STAR architecture two weeks ago, you might already see where this is going. STAR's native support for arbitrary-angle rotations (small angles) removes precisely the magic state distillation overhead that makes quantum oracle sketching look expensive. The linear depth challenge remains open. Together they sketch a more credible path toward practical fault-tolerant quantum machine learning than either suggests alone.
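The streaming idea can be emulated classically in a few lines. This is a schematic sketch, not the paper's algorithm: each hypothetical sample is an (index, value) pair that contributes a tiny phase of value/N to one basis state, and the accumulated phases define an approximate diagonal oracle without the dataset ever being stored whole.

```python
import numpy as np

def stream_oracle(samples, dim):
    """Accumulate per-basis-state phases one sample at a time.
    Each (index, value) sample applies a tiny rotation of value / N,
    so only the running phase accumulator is kept in memory."""
    n = len(samples)
    phases = np.zeros(dim)
    for idx, value in samples:      # one streamed sample at a time
        phases[idx] += value / n    # microscopic rotation per sample
    # Apply the accumulated diagonal phase oracle to a uniform superposition.
    return np.exp(1j * phases) / np.sqrt(dim)

# Hypothetical toy data: (basis index, feature value) pairs.
samples = [(0, 0.4), (1, 1.2), (0, 0.8), (3, -0.5)]
state = stream_oracle(samples, dim=4)
```

Note the linear-depth caveat from the post shows up here too: the loop touches every sample once, so runtime grows with N even though the accumulator stays tiny.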
#QuantumComputing #QuantumML #AI #FTQC #QuantumResearch #EmergingTech
-
Is this the first real-world use case for quantum computers?

True randomness is hard to come by. And in a world where cryptography and fairness rely on it, "close enough" just doesn't cut it. A new paper in Nature claims to present a demonstrated, certified application of quantum computing, not in theory or simulation, but in the real world.

Led by Quantinuum, JPMorganChase, Argonne National Laboratory, Oak Ridge National Laboratory, and The University of Texas at Austin, the team successfully ran a certified randomness expansion protocol on Quantinuum's 56-qubit H2 quantum computer, and validated the results using over 1.1 exaflops of classical computing power. TL;DR: certified randomness - the kind of true, verifiable unpredictability that's essential to cryptography and security - was generated by a quantum computer and validated by the world's fastest supercomputers.

Here's why that matters: True randomness is anything but trivial. Classical systems can simulate randomness, but they're still deterministic at the core. And for high-stakes environments such as finance, national security, or fairness in elections, you don't want pseudo-anything. You want cold, hard entropy that no adversary can predict or reproduce.

Quantum mechanics is probabilistic by nature. But just generating randomness with a quantum system isn't enough; you need to certify that it's truly random and not spoofed. That's where this experiment comes in. Using a method called random circuit sampling, the team:
⚇ sent quantum circuits to Quantinuum's 56-qubit H2 processor,
⚇ had it return outputs fast enough to make classical simulation infeasible,
⚇ verified the randomness mathematically using the Frontier supercomputer,
⚇ all while the quantum device was accessed remotely, proving a future where secure, certifiable entropy doesn't require trusting the hardware in front of you.

The result? Over 71,000 certifiably random bits generated in a way that proves they couldn't have come from a classical machine.
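The certification step rests on cross-entropy benchmarking: samples honestly drawn from the circuit's output distribution score near 1, while classically spoofed (e.g., uniform) samples score near 0. A toy numpy sketch, assuming a Porter-Thomas-like distribution stands in for a real random circuit's output probabilities:

```python
import numpy as np

def linear_xeb(ideal_probs, bitstrings):
    """Linear cross-entropy score: D * (mean ideal probability of the
    returned bitstrings) - 1. Near 1 for an honest sampler, near 0 for
    a uniform classical spoofer."""
    d = len(ideal_probs)
    return d * np.mean(ideal_probs[bitstrings]) - 1.0

rng = np.random.default_rng(7)
d = 2 ** 10
# Porter-Thomas-like ideal distribution standing in for a random circuit.
p = rng.exponential(size=d)
p /= p.sum()

honest = rng.choice(d, size=20000, p=p)    # samples from the true distribution
spoofed = rng.integers(0, d, size=20000)   # uniform classical guessing

honest_score = linear_xeb(p, honest)
spoof_score = linear_xeb(p, spoofed)
```

In the real protocol the scoring is done against exactly simulated circuit amplitudes on Frontier, and the circuits are hard enough that no classical machine can produce high-scoring samples in time, which is what makes the randomness certifiable rather than merely plausible.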
And it's commercially viable. Certified randomness may sound niche, but it's highly relevant to modern cryptography. This could be the start of the earliest true "quantum advantage" that actually matters in practice. And later this year, Quantinuum plans to make it a product.

It's a shift:
from demos to deployment,
from supremacy claims to measurable utility,
from the theoretical to the trustworthy.

read more from Matt Swayne at The Quantum Insider here --> https://lnkd.in/gdkGMVRb
peer-reviewed paper --> https://lnkd.in/g96FK7ip
#QuantumComputing #CertifiedRandomness #Cryptography
-
Google's Quantum Leap just crossed a line we've waited a long time for.

After years of headlines that overpromised, quantum computing finally delivered something real. Google's new Willow chip and Quantum Echoes algorithm just solved a scientific problem 13,000 times faster than the most powerful supercomputer alive. And for the first time, another quantum machine could verify the result. Speed matters, but what really changed is trust.

So What Happened?
Google's quantum team ran an experiment where their chip sent a pulse into its qubits, nudged one slightly, then reversed the process and "listened" for an echo. That echo revealed details about how particles interact - details classical computers can only approximate. Supercomputers can try to mimic this kind of behavior, but they run out of steam fast. The Willow chip finished the job in minutes, and when another quantum system checked the result, it matched. That's not just a milestone in speed. It's a sign of scientific reliability, something quantum computing has struggled to prove.

Why It Matters
Quantum breakthroughs used to sound impressive until you looked closer and realized the results couldn't be verified. This one holds up. Two independent quantum systems reached the same conclusion, which means scientists can finally start to trust these results, not just admire them. That trust is what turns quantum from a lab experiment into a usable scientific tool.

What This Unlocks
The possibilities are huge, even if most people won't notice the change yet.
• Medicine: New drugs could be modeled atom by atom instead of through endless testing.
• Energy: Better batteries and superconductors could be designed in simulation before they're built.
• Climate: Atmospheric chemistry can be mapped at a level that makes prediction meaningful.
• AI: Future systems might use quantum simulations to learn faster, with less data.
Think of it as science getting a microscope upgrade.
Every field that depends on modeling or simulation - from drug labs to materials research - just got new reach.

💪 A Shift in Confidence
Back in 2019, Google claimed "quantum supremacy." That was about beating supercomputers in speed. This time, it's about credibility. The Willow experiment doesn't solve every problem, but it shows the path forward is real. Quantum computing is moving from noisy promise to something you can measure, verify, and build upon. We may not feel it yet, but this is one of those quiet breakthroughs that change how science works. The world's best computers just hit their limits. Quantum didn't just pass them. It proved it could be trusted to tell the truth.
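The echo protocol described above (evolve forward, nudge one qubit, reverse, and listen) can be mimicked on a toy state vector. This is a minimal sketch with a random unitary standing in for Willow's circuit; all sizes and parameters are hypothetical, and the point is only that the unperturbed echo returns perfectly while a single-qubit "butterfly" flip scrambles it.

```python
import numpy as np

def echo_fidelity(n_qubits, perturb, seed=0):
    """Forward-evolve, optionally flip one qubit, reverse the evolution,
    and measure the overlap with the initial state (a toy Loschmidt echo)."""
    rng = np.random.default_rng(seed)
    dim = 2 ** n_qubits
    # Random unitary via QR decomposition of a complex Gaussian matrix.
    z = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    u, _ = np.linalg.qr(z)
    psi0 = np.zeros(dim, dtype=complex)
    psi0[0] = 1.0
    psi = u @ psi0                        # forward evolution
    if perturb:
        x = np.array([[0, 1], [1, 0]])    # flip the first qubit ("butterfly")
        psi = np.kron(x, np.eye(dim // 2)) @ psi
    psi = u.conj().T @ psi                # time reversal
    return abs(np.vdot(psi0, psi)) ** 2

perfect = echo_fidelity(4, perturb=False)    # unperturbed echo returns exactly
scrambled = echo_fidelity(4, perturb=True)   # perturbed echo is strongly damped
```

How fast that echo decays under a tiny perturbation is exactly the information-scrambling signal the real experiment measures, and it is brutally expensive to compute classically at 105 qubits.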
-
Google's quantum computer achieved a measurable advantage over classical computers for molecular analysis. Their Quantum Echoes algorithm represents progress toward practical quantum computing applications in chemistry and materials science.

The research details:
↳ Published in Nature with peer review
↳ 13,000x performance improvement on specific calculations
↳ Tested on molecules with 15 and 28 atoms
↳ Results verified against established Nuclear Magnetic Resonance data

The algorithm functions as a "molecular ruler" that can measure atomic distances and interactions. It uses quantum interference effects to amplify measurement signals, providing sensitivity that classical computers struggle to achieve efficiently.

Current applications being explored include:
↳ Drug development for understanding molecular binding
↳ Materials research for battery and polymer characterization
↳ Chemical analysis for determining molecular structures
↳ Nuclear Magnetic Resonance enhancement for laboratory use

Google worked with UC Berkeley to validate the approach. The quantum computer analyzed molecular structures and provided information that traditional methods either missed or required significantly more computational time to obtain.

The research addresses a practical problem in computational chemistry where molecular modeling requires substantial computing resources. Quantum computers may offer efficiency advantages for these specific types of calculations. This work follows Google's established quantum computing research program, building on their previous demonstrations of quantum error correction and computational complexity advantages.

Which scientific fields do you think will adopt quantum-enhanced analysis methods first?
-
Google has unveiled its latest quantum computing chip, Willow, marking a significant breakthrough in the field. This new chip features 105 superconducting qubits and demonstrates unprecedented performance across several metrics[8].

Key Achievements
1. Willow can reduce errors exponentially as it scales up using more qubits, addressing a challenge that has persisted in quantum computing for nearly 30 years[2][9].
2. The chip performed a standard benchmark computation in under five minutes that would take one of today's fastest supercomputers approximately 10 septillion (10^25) years to complete[1][3].

Willow operates using superconducting transmon qubits, which are tiny electrical circuits exhibiting quantum behavior at extremely low temperatures. These circuits are engineered to function like artificial atoms in a quantum state[2]. The chip's qubits demonstrate coherence times nearly five times better than previous designs. This improvement, combined with advanced machine learning algorithms, enables real-time error correction and exponential error suppression as qubit lattices scale from 3x3 to 7x7 grids[8].

Implications
While Willow represents a significant step forward in quantum computing, experts caution that practical applications remain years away[8]. However, this advancement paves the way for future developments in areas such as drug discovery, fusion energy, and battery design[2].

Citations:
[1] https://lnkd.in/gQMCS3vc
[2] https://lnkd.in/gbZfsHBk
[3] https://lnkd.in/gGjj4Hhm
[4] https://lnkd.in/gxsSRqP5
[5] https://lnkd.in/guXJm6DS
[6] https://lnkd.in/giPxf_h4
[7] https://lnkd.in/gGrVP76u
[8] https://lnkd.in/gfWyEFFh
[9] https://lnkd.in/gcbe4HMU
[10] https://lnkd.in/g_xZDv3j
[11] https://lnkd.in/gmEJVSAX
[12] https://lnkd.in/gzaFGSKt
[13] https://lnkd.in/g3Ff3--S
[14] https://lnkd.in/gMhmgfRS
[15] https://lnkd.in/gnAy5puH
[16] https://lnkd.in/gfTGSXH3
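The exponential error suppression from 3x3 to 7x7 lattices follows the standard surface-code scaling law: below the error threshold, each increase in code distance divides the logical error rate by a roughly constant factor. The sketch below uses hypothetical parameter values, not Google's fitted numbers, purely to illustrate the scaling.

```python
# Illustrative surface-code scaling law: below threshold, the logical
# error rate falls exponentially with code distance d:
#   p_L(d) ~ A * (p / p_th) ** ((d + 1) // 2)
# All parameter values here are hypothetical placeholders.

def logical_error_rate(p, p_th=1e-2, a=0.1, d=3):
    """Logical error rate per cycle for physical error rate p at distance d."""
    return a * (p / p_th) ** ((d + 1) // 2)

p = 3e-3  # assumed physical error rate, safely below the assumed threshold
rates = {d: logical_error_rate(p, d=d) for d in (3, 5, 7)}
# Each distance step suppresses errors by roughly Lambda = p_th / p.
lam = rates[3] / rates[5]
```

This is why "errors fall as you add qubits" is the headline: as long as the hardware stays below threshold, growing the lattice from 3x3 to 5x5 to 7x7 multiplies the suppression factor rather than merely adding redundancy.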