"Exponential quantum advantage in processing massive classical data" by Haimeng Zhao, Alexander Zlokapa, Hartmut Neven, Ryan Babbush, John Preskill, Jarrod R. McClean, Hsin-Yuan (Robert) Huang

Abstract: Broadly applicable quantum advantage, particularly in classical data processing and machine learning, has been a fundamental open problem. In this work, we prove that a small quantum computer of polylogarithmic size can perform large-scale classification and dimension reduction on massive classical data by processing samples on the fly, whereas any classical machine achieving the same prediction performance requires exponentially larger size. Furthermore, classical machines that are exponentially larger yet below the required size need superpolynomially more samples and time. We validate these quantum advantages in real-world applications, including single-cell RNA sequencing and movie review sentiment analysis, demonstrating four to six orders of magnitude reduction in size with fewer than 60 logical qubits. These quantum advantages are enabled by quantum oracle sketching, an algorithm for accessing the classical world in quantum superposition using only random classical data samples. Combined with classical shadows, our algorithm circumvents the data loading and readout bottleneck to construct succinct classical models from massive classical data, a task provably impossible for any classical machine that is not exponentially larger than the quantum machine. These quantum advantages persist even when classical machines are granted unlimited time or if BPP = BQP, and rely only on the correctness of quantum mechanics. Together, our results establish machine learning on classical data as a broad and natural domain of quantum advantage and a fundamental test of quantum mechanics at the complexity frontier.

Link: https://lnkd.in/gmA-ntVU
#quantummachinelearning #quantumcomputing #research #paper #bigdata #logicalqubits
Quantum Advantage in Classical Data Processing
More Relevant Posts
-
> Sharing Resource < Ok, that's huge: "Exponential quantum advantage in processing massive classical data" by Haimeng Zhao, Alexander Zlokapa, Hartmut Neven, Ryan Babbush, John Preskill, Jarrod R. McClean, Hsin-Yuan (Robert) Huang. Abstract shared in full above. Link: https://lnkd.in/gmA-ntVU #quantummachinelearning #quantumcomputing #research #paper #bigdata #logicalqubits
-
Ever since John Preskill and Hsin-Yuan Huang joined oratomic they are dropping one bomb after another 🙀 “A small quantum computer of polylogarithmic size (i.e. its size grows only polynomially in the logarithm of the data size) can perform large-scale classification and dimension reduction on massive classical datasets, processing data samples on the fly. Any classical machine achieving the same prediction performance requires exponentially larger size. This is a provable, rigorous separation — not a heuristic claim.” Popular summary by Claude.
> Sharing Resource < Ok, that's huge: "Exponential quantum advantage in processing massive classical data" by Haimeng Zhao, Alexander Zlokapa, Hartmut Neven, Ryan Babbush, John Preskill, Jarrod R. McClean, Hsin-Yuan (Robert) Huang. Abstract shared in full above. Link: https://lnkd.in/gmA-ntVU #quantummachinelearning #quantumcomputing #research #paper #bigdata #logicalqubits
-
Anyone else trying to wrap their head around this paper? 🤯 John Preskill, Ryan Babbush, Hartmut Neven, and Jarrod McClean just published on exponential quantum advantage for classical data processing.

The claim: a quantum computer with <60 logical qubits can do classification tasks that would require a classical machine exponentially larger. And they say they've circumvented the data loading bottleneck.

I'll be honest, I'm an experimentalist, and the theory here is way above my pay grade. But if these results hold, the implications are massive. Current quantum computing market estimates (~$1T) assume advantages in simulation, optimization, cryptography. This paper suggests quantum might become essential for scaling AI/ML itself. That's… a much bigger deal.

They validated on RNA sequencing and movie sentiment analysis with 4-6 orders of magnitude improvements. Real applications, not toy problems.

So here's my question: Is this the moment quantum computing becomes about way more than we thought? Or am I reading too much into this? Would love to hear from folks who've dug into the math. Or better, from the authors themselves 😊 📄 https://lnkd.in/eys4B8Kw
-
A new quantum paper makes a very bold claim: Small quantum machines may compress and classify massive datasets in ways classical systems cannot match efficiently.

Why is that a big deal? Because quantum machine learning has had a credibility problem for years. Not because the ideas were not interesting. Because the obvious question was always: how do you load massive classical datasets into a quantum computer without losing the advantage? That has been one of the field’s weakest points.

This paper tries to go straight at that problem. The authors argue that a small quantum system can build a much more compact representation of large classical datasets, using random samples rather than relying on unrealistic assumptions about perfect large-scale quantum memory.

If that holds up, the implication is huge. It would mean quantum advantage may not be limited to chemistry, materials, or cryptography. It may also extend to one of the biggest problems in computing: finding useful patterns in massive amounts of classical data. And that matters because this is a much larger and more commercially relevant domain than many of the areas where quantum advantage is usually discussed.

To me, that is what makes this paper so interesting. Not that it proves practical quantum AI is here. It does not. But it is one of the clearest attempts I have seen to answer the hardest objection in quantum ML with a concrete framework instead of hand-waving.

This is also not a fringe claim. The paper comes from a highly credible group spanning Caltech and Google Quantum AI, including John Preskill, Hartmut Neven, Ryan Babbush, Jarrod McClean, Haimeng Zhao, Alexander Zlokapa, and Hsin-Yuan Huang. That does not make the result automatically right. But it absolutely makes it worth taking seriously.

Would love to host some of the authors on Beyond the Qubit for a deep dive on the assumptions, implications, and what this could mean for the future of quantum machine learning.
https://lnkd.in/expVegEy Submitted 8 April. #QuantumComputing #QuantumMachineLearning #DeepTech #QuantumTechnology #MachineLearning #BeyondTheQubit
-
Interesting new preprint by Haimeng Zhao, Alexander Zlokapa, Hartmut Neven, Ryan Babbush, John Preskill, Jarrod McClean, and Hsin-Yuan (Robert) Huang: “Exponential quantum advantage in processing massive classical data” (arXiv:2604.07639). The idea: Small quantum computers may process massive classical datasets with exponentially less memory than classical systems. What makes this especially notable is that the authors do not rely on QRAM. Instead, they introduce quantum oracle sketching, a way to build quantum access from streamed classical samples, combined with efficient classical readout. If this holds up, it could move the quantum advantage discussion beyond niche problems and closer to real machine learning and data-processing tasks. Still early and theoretical, but definitely worth watching. #QuantumComputing #QuantumAI #MachineLearning #DataScience #Innovation
-
"Exponential quantum advantage in processing massive classical data" by Haimeng Zhao, Alexander Zlokapa, Hartmut Neven, Ryan Babbush, John Preskill, Jarrod McClean, and Hsin-Yuan (Robert) Huang. https://lnkd.in/eUTvGHaX (abstract shared in full above)
-
[#arXivaria] Happy to read that processing classical data on quantum computers might finally be saved. Until today, quantum ML with classical data was virtually impossible due to the massive data loading (I/O) bottleneck. As proven theoretically, an exponential separation is enabled by loading classical data on the fly: each data point applies a multi-qubit phase gate to a small quantum register and is then discarded. Through a clever algorithmic construction, noise accumulation is avoided, allowing fewer than 60 logical qubits to match the performance of classical machines that need 10^4 to 10^6 times as much memory (measured in floating-point numbers). https://lnkd.in/efsiTrFp
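The on-the-fly phase-gate idea above can be caricatured with a small state-vector simulation. To be clear, this is an illustrative toy, not the paper's quantum oracle sketching algorithm: the phase profile, the 1/T scaling, and all variable names below are assumptions chosen purely for demonstration. The point is only the memory shape of the computation: each streamed sample touches a register of 2**n_qubits amplitudes and is then thrown away, so memory stays O(2**n_qubits) regardless of stream length.

```python
import numpy as np

n_qubits = 4                      # tiny register: 2**4 = 16 amplitudes
dim = 2 ** n_qubits
rng = np.random.default_rng(0)

# Start in the uniform superposition |+...+>.
state = np.full(dim, 1 / np.sqrt(dim), dtype=complex)

# A fixed phase profile over basis states (illustrative choice, not from the paper).
profile = rng.standard_normal(dim)

# Stream 10,000 classical samples; each applies one diagonal multi-qubit
# phase gate to the register and is discarded immediately.
T = 10_000
for _ in range(T):
    x = rng.standard_normal()                 # one classical data sample
    state *= np.exp(1j * (x / T) * profile)   # diagonal phase gate, then drop x

# Phases are unit-modulus, so the diagonal gates are unitary and the norm
# is preserved; the accumulated phase encodes an average over the stream.
# (A classical-shadows-style readout of the register would come next.)
print(round(np.vdot(state, state).real, 6))   # prints 1.0
```

Note the contrast the post is pointing at: the classical stream here has 10,000 entries, but the register never grows past 16 complex numbers.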
-
🌐⚛️ Happy World Quantum Day! Today, on 4/14 (the first digits of Planck's constant, h ≈ 4.14 × 10⁻¹⁵ eV·s — yes, that's intentional 🤓), I want to celebrate a breakthrough that genuinely excites us. A new paper from researchers at Caltech, Google, and collaborating institutions just answered one of quantum computing's most stubborn questions: Can quantum computers actually help with real-world AI — or are they just hyped-up tools for niche physics problems? The answer, it turns out, is a resounding YES. And here's why it matters, in plain English:

🔬 The old problem: Quantum computers are powerful, but they need data fed to them in a very special way — kind of like a very picky chef who only works with pre-prepared, perfectly formatted ingredients. Real-world data (your Netflix reviews, your medical records, your DNA scans) doesn't come that way. Getting it into a quantum computer efficiently has been an unsolved bottleneck for decades.

💡 The new solution: "Quantum Oracle Sketching." Instead of loading a mountain of data all at once, this technique feeds data like a stream — each piece nudging the quantum system slightly, building up a complete picture without needing massive memory storage. Elegant, efficient, and provably optimal.

📊 The results?
→ 10,000 to 1,000,000x reduction in memory usage — demonstrated on real datasets
→ A 300-qubit quantum computer could outperform a classical machine made from every atom in the observable universe for certain tasks
→ Tested on movie reviews and single-cell biology data with fewer than 60 logical qubits

This isn't science fiction. This is rigorous, provable mathematics (in a preprint, so peer review is still pending) — and it suggests quantum-enhanced AI isn't just possible, it's likely coming sooner than most people think. We may be standing at the same historical moment as the early days of deep learning — right before everything changed.

To every physicist, ML engineer, and curious mind out there: the quantum + AI frontier is wide open. Happy World Quantum Day.
🚀 🔗 Full article: https://lnkd.in/geqiRbU7 📄 Paper: arxiv.org/abs/2604.07639 #WorldQuantumDay #QuantumComputing #QuantumAI #MachineLearning #AI #Qubits #FutureOfComputing #Caltech #Research
-
The future of data science might not be purely classical. Sara A. Metwalli shares insights on how quantum computing could reshape the field and its possibilities.