Exponential quantum advantage in processing massive classical data
Shared by John Preskill: https://lnkd.in/eUTvGHaX

Abstract
Broadly applicable quantum advantage, particularly in classical data processing and machine learning, has been a fundamental open problem. In this work, we prove that a small quantum computer of polylogarithmic size can perform large-scale classification and dimension reduction on massive classical data by processing samples on the fly, whereas any classical machine achieving the same prediction performance requires exponentially larger size. Furthermore, classical machines that are exponentially larger yet below the required size need superpolynomially more samples and time. We validate these quantum advantages in real-world applications, including single-cell RNA sequencing and movie review sentiment analysis, demonstrating four to six orders of magnitude reduction in size with fewer than 60 logical qubits. These quantum advantages are enabled by quantum oracle sketching, an algorithm for accessing the classical world in quantum superposition using only random classical data samples. Combined with classical shadows, our algorithm circumvents the data loading and readout bottleneck to construct succinct classical models from massive classical data, a task provably impossible for any classical machine that is not exponentially larger than the quantum machine. These quantum advantages persist even when classical machines are granted unlimited time or if BPP = BQP, and rely only on the correctness of quantum mechanics. Together, our results establish machine learning on classical data as a broad and natural domain of quantum advantage and a fundamental test of quantum mechanics at the complexity frontier.
Validating Quantum Speedup in Algorithms
Summary
Validating quantum speedup in algorithms means proving that quantum computers can solve certain complex problems much faster than classical computers, marking a significant leap in computing power. This process involves using real-world tasks and rigorous testing to show that quantum algorithms truly outperform their classical counterparts, moving the idea of quantum advantage from theory into practical application.
- Focus on verification: Prioritize experiments and benchmarks that can be mathematically checked, ensuring that quantum performance claims are transparent and trustworthy.
- Explore real-world tasks: Apply quantum algorithms to practical problems like machine learning, chemistry, or data analysis to demonstrate their value beyond theoretical speedups.
- Compare across platforms: Assess performance using both quantum and classical machines on the same problem to highlight genuine speedup and pinpoint areas where quantum computing shines.
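The cross-platform comparison advice above can be made concrete: a fair speedup claim compares how runtime *grows* with problem size, not a single timing. A minimal Python sketch, using made-up timing data as a stand-in for real benchmark measurements:

```python
# Sketch: estimating empirical scaling exponents from benchmark timings.
# The timing arrays below are illustrative placeholders, not real data.
import numpy as np

def scaling_exponent(sizes, runtimes):
    """Fit runtime ~ c * n^k on a log-log scale and return the exponent k."""
    k, _ = np.polyfit(np.log(sizes), np.log(runtimes), 1)
    return k

sizes = np.array([8, 16, 32, 64, 128])
classical_t = 0.001 * sizes**3.0  # hypothetical cubic-scaling classical solver
quantum_t = 0.05 * sizes**1.0     # hypothetical linear-scaling quantum routine

print(scaling_exponent(sizes, classical_t))  # ~3.0
print(scaling_exponent(sizes, quantum_t))    # ~1.0
```

Fitting exponents this way exposes a genuine asymptotic gap even when the quantum device is slower in absolute terms at small sizes.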
Google Demonstrates a Practical Quantum Algorithm That Beats a Supercomputer

Introduction
Google and a broad academic collaboration have unveiled a quantum algorithm that delivers a clear, time-based quantum advantage while pointing toward real scientific utility. Known as "quantum echoes," the approach dramatically outperforms classical supercomputers on specific calculations and moves the field beyond symbolic milestones toward meaningful applications.

From Quantum Supremacy to Quantum Advantage
The field has shifted focus from raw quantum supremacy to two stronger benchmarks: quantum utility and quantum advantage. Quantum advantage requires a quantum system to complete a task vastly faster than classical machines using the best known algorithms. Google's new work demonstrates such an advantage in elapsed time, not just theoretical complexity.

How Quantum Echoes Work
The algorithm evolves a quantum system forward in time, applies a small randomized perturbation, then evolves it backward. Forward and backward evolutions interfere quantum mechanically, producing measurable effects known as out-of-time-order correlations. Repeating these "echoes" many times reveals probability distributions that are extremely costly to simulate classically. A task completed on Google's quantum computer in about two hours would take the Frontier supercomputer an estimated 3.2 years.

Demonstrating Advantage and Early Utility
The quantum echo algorithm was run on up to 65 qubits, exploiting entanglement and interference effects inaccessible to classical simulation at scale. While classical supercomputers can simulate a single instance, repeated sampling quickly becomes infeasible. Google extended the concept to nuclear magnetic resonance experiments, showing how quantum echoes could probe long-range molecular structure. This opens a potential path to improving NMR analysis by extracting structural information currently beyond classical modeling.

Limits and Open Questions
Current demonstrations involved small molecules that remain classically simulable, meaning quantum advantage and quantum utility were shown separately rather than simultaneously. Modeling molecules fully beyond classical reach will require further improvements in qubit fidelity by a factor of three to four. Verification remains challenging, as quantum echoes cannot be easily checked by classical means and no other quantum system currently matches the required scale and accuracy.

Why This Matters
Quantum echoes represent a concrete step toward useful quantum computation. They show that quantum computers can already outperform the world's best classical machines in time-to-solution while illuminating real physical systems. Even with open questions around verification and scale, this work signals that quantum advantage is no longer just theoretical: it is beginning to intersect with practical science and experimentation.

Keith King https://lnkd.in/gHPvUttw
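The forward-perturb-backward structure described above can be sketched in a few lines of linear algebra. This is an illustrative toy, not Google's implementation: a random unitary on 3 qubits stands in for the real scrambling circuit, and a single Pauli X plays the butterfly perturbation:

```python
# Toy echo protocol: evolve forward with U, perturb with B, evolve
# backward with U†, and measure the overlap with the initial state.
# The Heisenberg operator W(t) = U† B U underlies the OTOC picture.
import numpy as np

rng = np.random.default_rng(0)
n = 3
dim = 2**n

# Random "scrambling" unitary via QR decomposition of a complex Gaussian matrix.
A = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
U, _ = np.linalg.qr(A)

# Butterfly perturbation: Pauli X on the last qubit.
X = np.array([[0, 1], [1, 0]], dtype=complex)
I = np.eye(2, dtype=complex)
B = np.kron(np.kron(I, I), X)

psi = np.zeros(dim, dtype=complex)
psi[0] = 1.0  # start in |000>

# Echo amplitude <psi| U† B U |psi>; its squared magnitude is the echo fidelity.
echo = psi.conj() @ (U.conj().T @ B @ U @ psi)
fidelity = abs(echo)**2
print(fidelity)  # lies in [0, 1]; decay below 1 signals scrambling
```

Sampling such echo fidelities over many randomized perturbations builds up the probability distributions that become classically intractable at scale.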
"Researchers from USC and Johns Hopkins used two IBM Eagle quantum processors to pull off an unconditional, exponential speedup on a classic 'guess-the-pattern' puzzle, proving—without assumptions—that quantum machines can now outpace the best classical computers."

"What makes a speedup 'unconditional,' Lidar explains, is that it doesn't rely on any unproven assumptions. Prior speedup claims required the assumption that there is no better classical algorithm against which to benchmark the quantum algorithm. Here, the team led by Lidar used an algorithm they modified for the quantum computer to solve a variation of 'Simon's problem,' an early example of quantum algorithms that can, in theory, solve a task exponentially faster than any classical counterpart, unconditionally."

https://lnkd.in/ec39PXwv

"The goal of demonstrating an algorithmic quantum speedup, i.e., a quantum speedup that scales favorably as the problem size grows, is central to establishing the utility of quantum computers. Simon's problem is an early example of the Abelian hidden subgroup problem and a precursor to Shor's factoring algorithm. It requires exponential time to solve on a classical computer but only linear time on a noiseless quantum computer, assuming we count oracle queries but do not account for the actual resources spent on executing the oracle. Here, we studied a modified version of Simon's problem, which restricts the allowed Hamming weight of the hidden bitstring to w ≤ n. The classical solution of this version scales as n^(w/2). Our goal was to determine whether NISQ devices are capable of providing an algorithmic quantum speedup in solving this version of Simon's problem. We ran restricted-HW Simon's algorithm demonstrations on the IBM Quantum platform and demonstrated that two 127-qubit devices, Sherbrooke and Brisbane, exhibited an exponential algorithmic quantum speedup, which extended to larger HW values when we incorporated suitably optimized DD protection."
DOI: 10.1103/PhysRevX.15.021082
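For readers unfamiliar with Simon's problem, a toy classical solver makes the gap concrete. The oracle below is a hypothetical stand-in: a 2-to-1 function with hidden XOR period s (so f(x) = f(x XOR s)), which a classical machine can only crack by collision search, exponential in n in the worst case, while Simon's quantum algorithm needs only O(n) oracle queries:

```python
# Classical side of Simon's problem: brute-force collision search.
# Finding x, y with f(x) == f(y) reveals the hidden period s = x XOR y.

def make_oracle(n, s):
    """Build a 2-to-1 function on n-bit inputs with hidden XOR period s (s != 0)."""
    f, v = {}, 0
    for x in range(2**n):
        if x not in f:
            f[x] = f[x ^ s] = v  # pair x with its partner x XOR s
            v += 1
    return lambda x: f[x]

def classical_simon(n, oracle):
    """Recover s by collision search; worst case ~2^(n/2) queries (birthday bound)."""
    seen = {}
    for x in range(2**n):
        v = oracle(x)
        if v in seen:
            return seen[v] ^ x  # collision: the XOR of the pair is s
        seen[v] = x
    return 0

n, s = 6, 0b101101
print(classical_simon(n, make_oracle(n, s)) == s)  # True
```

Restricting the hidden bitstring's Hamming weight to w, as in the study above, shrinks the classical search space to roughly n^(w/2) but leaves the quantum query count low, which is what makes the scaling comparison feasible on NISQ hardware.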
> Sharing Resource < Ok, that's huge: "Exponential quantum advantage in processing massive classical data" by Haimeng Zhao, Alexander Zlokapa, Hartmut Neven, Ryan Babbush, John Preskill, Jarrod R. McClean, Hsin-Yuan (Robert) Huang (abstract reproduced above).
Link: https://lnkd.in/gmA-ntVU
#quantummachinelearning #quantumcomputing #research #paper #bigdata #logicalqubits
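The paper's abstract leans on classical shadows for readout. As a flavor of that primitive, here is a minimal single-qubit sketch (random Pauli measurements with the standard inverse-channel estimator); the simulated state and sample count are illustrative choices, not the paper's protocol:

```python
# Single-qubit classical shadows: measure in a random Pauli basis, then
# apply the inverse measurement channel M^-1(|b><b|) = 3|b><b| - I.
# Averaging Tr(O * snapshot) over many snapshots estimates Tr(O * rho).
import numpy as np

rng = np.random.default_rng(1)

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

rho = np.array([[1, 0], [0, 0]], dtype=complex)  # state |0><0|

def snapshot(rho):
    """One classical-shadow snapshot of rho."""
    P = [X, Y, Z][rng.integers(3)]          # random Pauli basis
    _, evecs = np.linalg.eigh(P)
    probs = np.real([v.conj() @ rho @ v for v in evecs.T])
    probs = np.clip(probs, 0, None)          # guard tiny negative round-off
    k = rng.choice(2, p=probs / probs.sum()) # simulate the measurement outcome
    b = evecs[:, k:k+1]
    return 3 * (b @ b.conj().T) - I2         # inverse of the measurement channel

N = 20000
est = np.mean([np.real(np.trace(Z @ snapshot(rho))) for _ in range(N)])
print(est)  # converges to <Z> = 1 for |0>
```

Each snapshot is a tiny classical description of the quantum state, which is what lets a small quantum device hand succinct, classically usable summaries back to the outside world.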
Quantum Andhra Series - A Milestone Moment in Quantum Computing

For years, quantum advantage was debated. Now, it has been verified. Google's Willow Quantum Chip (105 qubits) has officially demonstrated a verifiable quantum advantage, confirmed by Nature. This is not hype. This is science that can be checked and trusted.

🔹 What actually happened?
Using a new algorithm called Quantum Echoes, the Willow processor simulated extremely complex quantum interactions known as Out-of-Time-Order Correlators (OTOCs).
👉 The result: ~13,000× faster than the world's fastest classical supercomputer.
• Quantum computer: ~2 hours
• Classical supercomputer: > 3 years

🔹 Why this result matters more than earlier claims
Earlier quantum advantage demonstrations were hard to verify. This one is mathematically cross-checkable. That means:
✅ Accuracy can be validated
✅ Results are scientifically reliable
✅ Quantum computing moves from theory to practice

🔹 Performance highlights
• Single-qubit fidelity: ~99.97%
• Two-qubit fidelity: ~99.88%
• Readout accuracy: ~99.5%
This level of precision is critical for real-world quantum simulations.

🔹 The bigger picture
This breakthrough opens doors to:
• Quantum-accelerated chemistry
• Advanced materials science
• Understanding complex magnetic and molecular systems
Not full-scale molecular design yet, but a decisive step toward it.

For the first time, a quantum computer didn't just run faster: it produced science classical computers can barely touch. This is a turning point for quantum simulation.

#QuantumAndhra #QuantumComputing #QuantumAdvantage #WillowChip #QuantumSimulation #FutureOfComputing #QuantumPhysics #ScienceBreakthrough #TechInnovation #QuantumAlgorithms #NextGenTechnology