This image is from an Amazon Braket slide deck that just did the rounds of all the Deep Tech conferences I've been at recently (this one from Eric Kessler). It's more profound than it might seem.

As technical leaders, we're constantly evaluating how emerging technologies will reshape our computational strategies. Quantum computing is prominent in these discussions, but clarity on its practical integration is... emerging. It's becoming clear, however, that the path forward isn't about quantum versus classical, but about how quantum and classical work together. This will be a core theme for the year ahead.

As someone now on the implementation partner side of this work, getting the chance to work on specific quantum-classical hybrid workloads, I think of it this way: Quantum Processing Units (QPUs) are specialised engines capable of tackling calculations that are currently intractable for even the largest supercomputers. That's the "quantum 101" explanation you've heard over and over. What's missing from that usual story is that QPUs require significant classical infrastructure for:

- Control and calibration
- Data preparation and readout
- Error mitigation and correction frameworks
- Executing the parts of algorithms not suited for quantum speedup

The near-to-medium-term future therefore involves integrating QPUs as accelerators within a broader classical computing environment. Much like GPUs accelerate specific AI and graphics tasks alongside CPUs, QPUs are a promising resource for accelerating specific quantum-suited operations within larger applications.

What does this mean for technical decision-makers?

Focus on Integration: Strategic planning should center on identifying how and where quantum capabilities can be integrated into existing or future HPC workflows, not on replacing them entirely.

Identify Target Problems: The key is pinpointing high-value business or research problems where the unique capabilities of quantum computation could provide a substantial advantage.

Prepare for Hybrid Architectures: Consider architectures and software platforms designed explicitly to manage these complex hybrid workflows efficiently.

PS: Some companies, like Quantum Brilliance, have focused on this space from the hardware side from the outset, working with the Pawsey Supercomputing Research Centre and Oak Ridge National Laboratory. On the software side there's the likes of Q-CTRL, Classiq Technologies, Haiqu and Strangeworks, all tackling the challenge of managing actual workloads (at different levels of abstraction). Speaking to these teams will give you a good feel for the topic and the approaches. Get to it.

#QuantumComputing #HybridComputing #HPC
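The QPU-as-accelerator pattern described above is typically realised as a classical optimisation loop that repeatedly calls out to a quantum device. A minimal sketch of that loop in plain Python, with a one-qubit analytic formula standing in for the QPU call (the function names, learning rate, and starting angle are illustrative, not from any real SDK):

```python
import math

def qpu_expectation(theta):
    # Stand-in for a QPU call: the expectation value <Z> of the state
    # RY(theta)|0>, which is cos(theta). In a real hybrid workflow this
    # would dispatch a parameterised circuit to quantum hardware.
    return math.cos(theta)

def hybrid_minimize(theta=0.5, lr=0.4, steps=100):
    # Classical outer loop: gradient descent, with the gradient estimated
    # from two extra QPU evaluations via the parameter-shift rule.
    for _ in range(steps):
        grad = (qpu_expectation(theta + math.pi / 2)
                - qpu_expectation(theta - math.pi / 2)) / 2
        theta -= lr * grad
    return theta

theta = hybrid_minimize()
print(round(qpu_expectation(theta), 3))  # -1.0, the minimum of <Z>
```

The division of labour mirrors the post: the QPU evaluates the quantum-suited piece (circuit expectation values), while everything else, including the optimiser and its state, stays classical.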
Quantum Processor vs Classical Processor Capabilities
Summary
Quantum processors use the principles of quantum mechanics to tackle certain complex problems far more efficiently than classical processors, which rely on traditional electronic circuits. Compared to classical processors, quantum processors are uniquely suited to tasks like simulating quantum systems, solving certain large-scale mathematical problems, and handling massive data sets with fewer resources.
- Pinpoint quantum advantages: Identify specific challenges—such as advanced molecular simulations or large-scale data analysis—where quantum processors can offer substantial speed and efficiency benefits over classical processors.
- Explore hybrid solutions: Consider integrating quantum processors with classical computing systems to harness the strengths of each, especially for workloads that require both rapid quantum calculations and reliable classical processing.
- Monitor evolving economics: Stay informed about breakthroughs that demonstrate quantum processors outperforming classical systems in targeted tasks, since these shifts may alter investment strategies and reshape future technology infrastructure.
-
Dear Prof Feynman,

Since your 1982 paper "Simulating Physics with Computers," quantum computing has developed from speculation into experimental reality. Here's where we stand in June 2025.

Your insight that classical computers cannot efficiently simulate quantum systems proved correct - this became the foundation for building quantum computers. Ion trapping techniques developed in the 1980s now control dozens of trapped ions as quantum bits, enabling high accuracy in single quantum operations and extended coherence times. Josephson junctions became artificial atoms: superconducting circuits that manipulate quantum states at millikelvin temperatures. Current superconducting processors include Google's Willow chip and IBM's advanced systems. Two-qubit gate accuracies approach 99%, though environmental noise still limits algorithmic applications to dozens of useful qubits working together.

Shor's factoring algorithm works on small numbers but would need millions of error-corrected quantum bits for practical cryptography. Google's 2019 quantum demonstration solved a sampling problem faster than classical computers, though the practical advantage is close to nil.

Scientists have built logical quantum bits that actually last longer and make fewer errors than the physical quantum bits they're made from. However, fault-tolerant computation requires significant overhead, necessitating many physical quantum bits per logical quantum bit. IBM plans to develop 200-logical-qubit systems by 2029, utilizing advanced error correction codes.

Your original challenge persists. Quantum many-body systems remain exponentially hard to simulate classically, yet building quantum simulators requires controlling thousands of quantum components with extraordinary precision.
-
Quantum Disruption Signal: Small System Challenges Massive AI Infrastructure Economics

A new breakthrough suggests that quantum computing may begin to challenge the economic foundations of large scale artificial intelligence systems. Researchers have demonstrated that a compact quantum setup can match or exceed the performance of a far larger classical AI network in a real world task. The study showed that a system built on just nine interacting quantum elements was able to perform multi step weather prediction at a level comparable to a classical reservoir computing network with 10,000 nodes. This is significant because traditional AI systems require extensive computational infrastructure, often costing tens or hundreds of millions of dollars to achieve similar outcomes.

The key advantage lies in how quantum systems process information. By leveraging quantum interactions, the system can capture complex patterns and dynamics more efficiently than classical architectures. This allows smaller systems to deliver competitive performance in specific domains without the need for massive data center scale resources.

While the results are task specific and not yet generalizable across all AI applications, they provide an early indication that quantum enhanced computing could reduce the cost and scale required for certain types of advanced modeling. This introduces a potential shift in how future AI infrastructure is designed and deployed.

The implications are strategic and far reaching. If quantum systems can consistently deliver high performance at a fraction of the cost, the current trajectory of building ever larger AI data centers may be challenged. This could reshape investment priorities, accelerate hybrid quantum AI architectures, and redefine competitive advantage in both technology and national capability.
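For readers unfamiliar with the classical baseline in the study: a reservoir computer is a fixed random recurrent network whose internal states are combined by a trained linear readout. A toy classical echo-state sketch below (100 nodes rather than 10,000, and next-step prediction of a sine wave standing in for the weather task; all sizes and parameters are illustrative, and this is not the quantum system from the study):

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed random reservoir: input weights and a recurrent matrix whose
# spectral radius is scaled below 1 so the dynamics stay stable.
N = 100
W_in = rng.uniform(-0.5, 0.5, N)
W = rng.normal(0.0, 1.0, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    # Drive the reservoir with the input sequence and record its states.
    x = np.zeros(N)
    states = []
    for ut in u:
        x = np.tanh(W @ x + W_in * ut)  # simple tanh state update
        states.append(x.copy())
    return np.array(states)

# Task: one-step-ahead prediction of a sine wave.
t = np.arange(400)
u = np.sin(0.1 * t)
X = run_reservoir(u[:-1])
y = u[1:]                                      # target: the next value
W_out = np.linalg.lstsq(X, y, rcond=None)[0]   # only the readout is trained
pred = X @ W_out
print(float(np.mean((pred - y) ** 2)))         # small mean squared error
```

The study's point is about scale: the quantum version reportedly needs only nine interacting elements to match what a classical reservoir of this kind achieves with thousands of nodes.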
-
A quantum computer recently solved a problem in just four minutes that would take even the most advanced classical supercomputer billions of years to complete. This breakthrough was achieved using a 76-photon photonic quantum computer prototype called Jiuzhang.

Unlike traditional computers, which rely on electrical circuits, this quantum computer uses an intricate system of lasers, mirrors, prisms, and photon detectors to process information. It performs calculations using a technique known as Gaussian boson sampling, which detects and counts photons. By detecting up to 76 photons, the system goes far beyond earlier photonic experiments, which were limited to counting around five.

Beyond being a scientific milestone, this technique has real-world potential. It could help solve highly complex problems in quantum chemistry and advanced mathematics, and even contribute to developing a large-scale quantum internet. For example, quantum computers could help scientists design new medicines by simulating how molecules interact at the quantum level, something that classical computers struggle to do efficiently. This could lead to faster discoveries of life-saving drugs and treatments.

While both quantum and classical computers are used to solve problems, they function very differently. Quantum computers take advantage of the unique properties of quantum mechanics, such as superposition and entanglement, to perform certain calculations at incredible speeds. This makes them especially powerful for solving problems that would be nearly impossible for traditional computers, bringing exciting new possibilities for scientific and technological advancement.

As the Gaelic saying goes, "Tús maith leath na hoibre" ("A good start is half the work"). Quantum computing is still in its early stages, but its potential to reshape science, medicine, and technology is already clear.
-
> Sharing Resource <

Ok, that's huge: "Exponential quantum advantage in processing massive classical data" by Haimeng Zhao, Alexander Zlokapa, Hartmut Neven, Ryan Babbush, John Preskill, Jarrod R. McClean, Hsin-Yuan (Robert) Huang

Abstract: Broadly applicable quantum advantage, particularly in classical data processing and machine learning, has been a fundamental open problem. In this work, we prove that a small quantum computer of polylogarithmic size can perform large-scale classification and dimension reduction on massive classical data by processing samples on the fly, whereas any classical machine achieving the same prediction performance requires exponentially larger size. Furthermore, classical machines that are exponentially larger yet below the required size need superpolynomially more samples and time. We validate these quantum advantages in real-world applications, including single-cell RNA sequencing and movie review sentiment analysis, demonstrating four to six orders of magnitude reduction in size with fewer than 60 logical qubits. These quantum advantages are enabled by quantum oracle sketching, an algorithm for accessing the classical world in quantum superposition using only random classical data samples. Combined with classical shadows, our algorithm circumvents the data loading and readout bottleneck to construct succinct classical models from massive classical data, a task provably impossible for any classical machine that is not exponentially larger than the quantum machine. These quantum advantages persist even when classical machines are granted unlimited time or if BPP=BQP, and rely only on the correctness of quantum mechanics. Together, our results establish machine learning on classical data as a broad and natural domain of quantum advantage and a fundamental test of quantum mechanics at the complexity frontier.

Link: https://lnkd.in/gmA-ntVU

#quantummachinelearning #quantumcomputing #research #paper #bigdata #logicalqubits
-
If you've been doubting whether quantum computers will ever do anything useful beyond breaking encryption, this one's for you. A quantum computer with fewer than 60 logical qubits can run AI on massive real-world datasets using ten thousand to a million times less memory than any classical machine. Movie review sentiment analysis. Cell type classification from RNA sequencing. Real AI tasks, real data.

This is not a storage trick. The quantum computer runs the full ML pipeline. An algorithm called quantum oracle sketching streams data through the processor one sample at a time. Each sample applies a small quantum rotation, then gets discarded. The accumulated rotations build a compressed quantum model of the entire dataset in a handful of qubits. Quantum algorithms then run classification and dimensionality reduction directly on that model. A readout protocol extracts the results. Data in, model built, inference done, predictions out. All on a tiny quantum chip.

A classical machine matching this provably needs exponentially more memory, and that proof is unconditional. It relies only on quantum superposition being real. It holds even if you give classical machines unlimited time.

Think about what this means for the age of AI. The world generates more data every day than it can store. Every sensor, every device, every interaction. Classical AI has to choose: store less and learn worse, or build bigger data centers and burn more energy. A quantum ML pipeline that learns from streaming data without storing it sidesteps that tradeoff entirely.

But to be clear: this is a theoretical proof validated through numerical simulations. It has not been demonstrated on actual quantum hardware. Yet fewer than 60 logical qubits is in the range that near-term error-corrected machines are targeting. We are finally getting the use-case evidence this field needed.
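The one-pass, bounded-memory pattern described above has a familiar classical analogue. The sketch below is NOT the paper's quantum oracle sketching; it is a plain classical streaming update that illustrates the same discipline: each sample updates a small fixed-size summary and is then thrown away, so memory never grows with the dataset.

```python
def streaming_mean(samples):
    # One pass over the data; only two numbers of state are ever kept.
    mean, n = 0.0, 0
    for x in samples:
        n += 1
        mean += (x - mean) / n  # constant-memory running update
    return mean                 # each sample was discarded after use

print(streaming_mean(range(1, 101)))  # mean of 1..100
```

The paper's claim is that the quantum version of this idea (rotations accumulated in a few qubits instead of a running mean) retains exponentially more useful structure than any comparably small classical summary can.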
📸 Credits: Haimeng Zhao (Caltech), Alexander Zlokapa, Hsin-Yuan (Robert) Huang, John Preskill, Ryan Babbush, Jarrod McClean, Hartmut Neven. Paper on arXiv:2604.07639. Deep dive on this live on X (@drmichaela_e). Newsletter version at 5pm CET today, link on my website.
-
This Quantum Chip Is 10 Septillion Years Faster Than a Supercomputer

Google's Willow chip, the latest milestone in quantum computing, has shattered computational barriers. Performing a complex mathematical task in under five minutes, Willow achieved what would take a classical supercomputer an astonishing 10 septillion years, that is 10,000,000,000,000,000,000,000,000 years, far exceeding the 13.8-billion-year age of the universe.

The Willow chip operates with 105 quantum bits (qubits). Unlike classical bits (which represent data as 0s or 1s), qubits utilize superposition, where a single qubit represents both 0 and 1 simultaneously. This unique property allows Willow to process information exponentially faster than classical computers for certain tasks. For example, 2 qubits can represent 4 states simultaneously, and 105 qubits can represent 2¹⁰⁵ states, which is an astronomically large number.

One of the most significant advancements in the Willow chip is its error correction performance. Willow demonstrated an ability to exponentially reduce errors as more qubits were added, an enormous milestone toward reliable quantum computing. To stabilize its qubits, the Willow system operates at about 460°F below zero, or 10 millikelvin, just above absolute zero. This environment minimizes thermal noise, which destabilizes quantum states.

While Willow's achievement is a benchmark demonstration, its implications are vast for the future in areas such as drug discovery, AI, cryptography, and optimization problems. Of course, it will take some years for such a chip to become commercially available, but the latest advancements show extreme progress.
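The superposition scaling cited in the post is just powers of two, which a one-liner makes concrete:

```python
# The number of basis states an n-qubit register spans grows as 2^n:
# 2 qubits give 4, and 105 qubits give an astronomically large count.
for n in (2, 105):
    print(f"{n} qubits -> 2^{n} = {2 ** n:.2e} basis states")
```

Note the usual caveat: 2^n basis states in superposition does not mean 2^n classical computations are performed in parallel; only cleverly designed algorithms can extract an advantage from that state space.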
-
In my recent posts, I have covered various topics in computing, from the limits of GPUs to alternative approaches like neuromorphic and photonic computing. Today, I want to dive into an area where classical computing simply cannot do the job, not because it's slow or energy-hungry, but because it's fundamentally incapable: modeling the world at its most basic level, down to atoms and molecules. This is where quantum computing becomes relevant.

To truly solve complex problems like curing diseases or designing batteries that work under specific conditions, we need to model biology, chemistry, and physics at the molecular, atomic, and electronic level, down to the spin states of the electrons. This is only possible via quantum computing.

For example, take aspirin (acetylsalicylic acid), a relatively simple molecule with 21 atoms and 94 electrons. Each electron has two spin states, giving 188 spin states in total. To model all possible configurations of these states on a classical computer, you would need 2^188 states, which translates to roughly 6.2 × 10^57 bytes of memory [Stack Exchange, ThoughtCo.]. Quantum computers handle this elegantly, representing each state by a unit of information called a qubit. Modeling aspirin would require only 188 qubits, entirely feasible in the quantum realm. This is why quantum computing is essential for advancing our understanding of the physical and biological world.

The field is still early, but progress is steady. Quantum-inspired software is already being used, and companies such as SandboxAQ are building real applications. Many more startups are entering the space. I shared a deeper explanation and a list of quantum computing startups in my latest newsletter for those who want to explore further: https://lnkd.in/gK3MXSBj

More to come as this series continues.
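The memory figure in the aspirin example follows from storing one complex amplitude per basis state. A quick back-of-the-envelope check (assuming 16 bytes per amplitude, i.e. double-precision complex numbers):

```python
n = 188                          # two-level spin states in the post's estimate
amplitudes = 2 ** n              # complex amplitudes in the full state vector
bytes_needed = amplitudes * 16   # 16 bytes per complex128 amplitude
print(f"{bytes_needed:.1e} bytes")  # roughly 6e57, matching the post's figure
```

The same arithmetic is why state-vector simulators on classical hardware top out around 40-50 qubits, while a quantum device needs only one qubit per two-level system.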
-
Google Unveils Willow: A Leap Forward in Quantum Computing

Google Quantum AI has introduced Willow, a cutting-edge quantum chip designed to address two of the field's most significant challenges: error correction and computational scalability. Willow, fabricated in Google's Santa Barbara facility, achieves state-of-the-art performance, marking a pivotal step toward realizing a large-scale, commercially viable quantum computer. It gets way geekier from here, but if you're with me so far…

Exponential Error Reduction

Julian Kelly, Director of Quantum Hardware at Google, emphasized Willow's ability to exponentially reduce errors as the system scales. Utilizing a grid of superconducting qubits, Willow demonstrated a historic breakthrough in quantum error correction. By expanding arrays from 3×3 to 5×5 and then 7×7 qubits, researchers cut error rates in half with each iteration. This achievement, referred to as being "below threshold," signifies that larger quantum systems can now exhibit fewer errors, a goal pursued since Peter Shor introduced quantum error correction in 1995. The chip also achieved "beyond breakeven" performance, where arrays of qubits outlasted the individual qubits they are built from, which is key to ensuring the feasibility of practical quantum computations.

Ten Septillion Years in Five Minutes

Willow's computational capabilities were validated using the Random Circuit Sampling (RCS) benchmark, a rigorous test of quantum supremacy. According to Google's estimates, Willow completed a task in under five minutes that would take a modern supercomputer ten septillion years, a timescale exceeding the age of the universe. This achievement underscores the rapid, double-exponential performance improvements of quantum systems over classical alternatives. While the RCS benchmark lacks direct commercial applications, it remains a critical indicator of quantum computational power. Kelly noted that surpassing classical systems on this benchmark solidifies confidence in the broader potential of quantum technology.

Building Toward Practical Applications

Google's roadmap aims to bridge the gap between theoretical quantum advantage and real-world utility. The team is now focused on achieving "useful, beyond-classical" computations that solve practical problems. Applications in drug discovery, battery design, and AI optimization are among the potential breakthroughs quantum computing could unlock. Willow's advancements in quantum error correction and computational scalability highlight its transformative potential. As Kelly explained, "Quantum algorithms have fundamental scaling laws on their side," making quantum computing indispensable for tasks beyond the reach of classical systems.

Practical quantum computing is still years away, but this is an exciting milestone. Considering the remarkable rate of technological improvement we're experiencing right now, practical quantum computing (and quantum AI) may be closer than we think. -s
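The "below threshold" claim above has a simple quantitative shape: each step up in code distance (3×3 to 5×5 to 7×7) cuts the logical error rate by a roughly constant factor. A toy model of that scaling, with an illustrative suppression factor of 2 and a hypothetical starting error rate (neither number is from Google's paper):

```python
# Toy model of below-threshold scaling: each increase of the code
# distance d by 2 divides the logical error rate by a factor LAMBDA.
LAMBDA = 2.0   # suppression factor per distance step (illustrative)
e_d3 = 3e-3    # hypothetical logical error rate at distance 3

for d in (3, 5, 7):
    err = e_d3 / LAMBDA ** ((d - 3) // 2)
    print(f"d={d}: logical error ~ {err:.1e}")
```

Being "below threshold" means LAMBDA > 1, so errors shrink exponentially as the code grows; above threshold, adding qubits would make the logical error rate worse instead of better.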
-
Most people still see Quantum Computing as a "someday" technology. But if you zoom out, something important becomes clear: Quantum is not competing with GenAI. It is unlocking the places GenAI cannot reach on classical hardware. Here is why this matters:

1. AI is hitting real physical limits. Frontier models require computations at a scale that would take thousands of years on a single processor. GPUs made this possible through massive parallelism, but even they are beginning to reach practical and economic ceilings.

2. Quantum changes the math itself. Superposition, entanglement and interference are not faster versions of today's chips. They are new computational behaviors that let us explore search spaces, molecular structures and high dimensional patterns in ways classical systems cannot approximate efficiently.

3. This matters for real problems, not theoretical ones:
• Drug discovery with atom level accuracy
• Financial modeling across thousands of variables
• Supply chain design with true combinatorial complexity
• Material science and energy breakthroughs
• And eventually, more efficient building blocks for next generation model training

4. Quantum does not replace AI. It expands what AI can be applied to, especially in domains that are computationally unreachable today. Classical AI is impressive but bounded. Quantum combined with AI opens new frontiers that remain closed on classical hardware.

A grounded nuance: Quantum hardware is still early. Most near term progress will come from hybrid quantum classical workflows, not fully quantum systems. But understanding this shift now gives you a more realistic view of where meaningful breakthroughs may emerge.

If you are serious about the future of AI, pay attention to how Quantum will shape the next wave of models, optimization methods and scientific discovery.
💾 Save this 🔁 Repost to help others see where the AI curve is heading 👉 Follow Gabriel Millien for more clarity on AI, LLM architectures and the technologies shaping the next decade CC: Bhavishya Pandit, give him a follow!