Quantum Computing Developments

Explore top LinkedIn content from expert professionals.

  • View profile for Saanya Ojha
    Saanya Ojha is an Influencer

    Partner at Bain Capital Ventures

    80,178 followers

    Yesterday, Google announced it had achieved something called a “verifiable quantum advantage”. The announcement might sound like marketing mush, but it’s not. It represents one of the most interesting inflection points in the story of computing since the transistor.

    For decades, the dream of quantum computing has dangled like science fiction: machines that use the strange rules of quantum mechanics to solve problems that would take supercomputers millennia. In 2019, Google claimed quantum supremacy, meaning their quantum computer solved a problem no classical computer could feasibly complete in a reasonable timeframe. But that problem was a glorified dice roll: random circuit sampling. A proof of principle, not of purpose.

    Their latest claim, quantum advantage, goes further. It says a quantum machine has outperformed the best classical algorithms on a task that’s scientifically meaningful. In their experiment, Google’s Willow processor, a 105-qubit superconducting chip, ran an algorithm called Quantum Echoes to model how information spreads and decoheres inside quantum systems - essentially, how order unravels into chaos. That’s the kind of math that underpins chemistry, materials science, and condensed-matter physics. Willow completed the task 13,000x faster than the world’s best supercomputers, while remaining verifiable - that is, its output could be independently checked. In other words, the machine wasn’t playing a party trick anymore; it was doing science.

    Every era of computing begins with a strange, narrow demo that later looks obvious in hindsight. ➰ The Wright brothers’ first flight lasted 12 seconds - not exactly air travel. ➰ The first transistor amplified a single signal - not exactly an iPhone. ➰ The first webpage looked like a grocery list - not exactly the internet. Google’s quantum milestone feels the same. A narrow, technical victory that, decades later, we’ll point to and say: that’s when the impossible started to feel inevitable.
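    The "how information spreads" idea behind Quantum Echoes can be glimpsed on a tiny system. The sketch below is my own toy illustration, not Google's experiment: it computes an out-of-time-order correlator (OTOC) for a 4-qubit mixed-field Ising chain (parameter values are a common chaotic choice, assumed here) by exact diagonalization, showing that two operators on opposite ends of the chain commute at t = 0 and stop commuting as information scrambles.

```python
import numpy as np

I2 = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])

def site_op(op, site, n):
    """Embed a single-qubit operator at position `site` in an n-qubit chain."""
    out = np.array([[1.0]])
    for k in range(n):
        out = np.kron(out, op if k == site else I2)
    return out

n, d = 4, 16
# Mixed-field Ising chain, a standard chaotic toy model (J=1, hx=1.05, hz=0.5).
H = sum(site_op(Z, k, n) @ site_op(Z, k + 1, n) for k in range(n - 1))
H = H + sum(1.05 * site_op(X, k, n) + 0.5 * site_op(Z, k, n) for k in range(n))

evals, evecs = np.linalg.eigh(H)
W0 = site_op(Z, 0, n)      # probe operator on the first site
V = site_op(Z, n - 1, n)   # probe on the last site; commutes with W0 at t=0

def otoc(t):
    """C(t) = Tr([W(t),V]^dag [W(t),V]) / d: grows as information spreads."""
    U = evecs @ np.diag(np.exp(-1j * evals * t)) @ evecs.conj().T
    Wt = U.conj().T @ W0 @ U
    M = Wt @ V - V @ Wt
    return np.trace(M.conj().T @ M).real / d

values = [otoc(t) for t in (0.0, 1.0, 2.0, 3.0, 4.0)]
print(values)  # starts at ~0, then grows once the "butterfly" reaches site 3
```

    The decohered-order story in the post is exactly this curve: a flat zero while the operators are causally disconnected, then growth as the commutator spreads through the chain.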
    Of course, the hype shouldn’t outrun the hardware. Quantum systems face 3 towering challenges: ▪️ Error correction: Qubits are noisy - one stray photon can flip a bit of reality. ▪️ Scalability: Doubling qubits isn’t like doubling transistors; coherence decays exponentially. ▪️ Integration: Quantum systems must coexist with classical infrastructure - data movement, cooling, algorithms, verification. For now, the near horizon is hybrid quantum-classical computing, where quantum processors handle intractable subproblems inside classical workflows.

    For the past 80 years, computing has been about logic - zeros and ones manipulating symbols. Quantum computing is about reality itself: entanglement, superposition, uncertainty. It represents a paradigm where the map is the territory - where we use the universe’s own rules to understand the universe. In that sense, the shift from quantum supremacy to advantage mirrors the shift from theory to instrument - from “it works” to “it works for us.”

  • View profile for Jason Zander

    Executive Vice President at Microsoft

    40,931 followers

    Today marks a historic milestone in quantum computing, as Microsoft and Quantinuum demonstrate the most reliable logical qubits on record. This breakthrough, with a logical error rate 800x better than the physical error rate, signifies a giant leap from the noisy intermediate-scale quantum (NISQ) level (Level 1 – Foundational) to Level 2 – Resilient quantum computing.

    This progress is significant as logical qubits are only useful when they have a better error rate than the physical qubits themselves. The number of physical qubits is a misleading metric; it’s not how many qubits, it’s how good they are and how resilient the quantum system is to errors.

    Using the logical qubits we created, we were able to successfully perform multiple rounds of active syndrome extraction, in which errors are diagnosed and corrected without destroying the logical qubits. Active syndrome extraction helps quantum computers stay reliable even when operations are imperfect.

    With the promise of a hybrid supercomputing system powered by these reliable logical qubits, we’re paving the way for scientific and commercial breakthroughs that were once deemed impossible. This achievement is a testament to the power of collaboration and the collective advancement of quantum hardware and software.

    You can learn more from my post on the Official Microsoft Blog https://lnkd.in/gnDfcUV6 and the companion technical post on the Azure Quantum blog by Dennis Tom and Krysta Svore: https://lnkd.in/gMRVPG3s. #quantum #quantumcomputing #azurequantum
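    A classical analogue conveys the core trick of syndrome extraction described above (a minimal sketch of my own, not Microsoft's or Quantinuum's code, which operates on quantum states rather than bits): in a 3-bit repetition code, parity checks locate a flipped bit without ever reading the encoded value itself, so the logical information survives the measurement.

```python
# Three-bit repetition code: logical 0 -> 000, logical 1 -> 111.
# The syndrome is a pair of neighbor parities; it pinpoints a single
# flipped bit without revealing (or disturbing) the logical value.
SYNDROME_TABLE = {
    (0, 0): None,  # no error detected
    (1, 0): 0,     # first bit flipped
    (1, 1): 1,     # middle bit flipped
    (0, 1): 2,     # last bit flipped
}

def encode(bit):
    return [bit, bit, bit]

def extract_syndrome(codeword):
    """Measure only parities, never the encoded bit itself."""
    return (codeword[0] ^ codeword[1], codeword[1] ^ codeword[2])

def correct(codeword):
    """One round of syndrome extraction plus correction, in place."""
    flipped = SYNDROME_TABLE[extract_syndrome(codeword)]
    if flipped is not None:
        codeword[flipped] ^= 1
    return codeword

def decode(codeword):
    return max(codeword, key=codeword.count)  # majority vote

# Every single bit-flip on either logical value is diagnosed and repaired:
for logical in (0, 1):
    for err in range(3):
        cw = encode(logical)
        cw[err] ^= 1  # inject one error
        assert decode(correct(cw)) == logical
```

    Quantum codes do the same thing with entangled ancilla qubits: "active" extraction means these parity measurements run repeatedly during a computation, not just at the end.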

  • View profile for Alex Wang
    Alex Wang is an Influencer

    Learn AI Together - I share my learning journey into AI & Data Science here, 90% buzzword-free. Follow me and let’s grow together!

    1,139,840 followers

    Computing just changed forever. Google Quantum AI has unveiled Willow, their most advanced superconducting quantum chip yet. Small enough to fit in your hand, it can solve problems in minutes that would take even the most advanced supercomputers, like El Capitan, longer than the age of the universe.

    Willow’s breakthrough lies in error correction, reducing the instability that has long held quantum computing back. It has also extended the time qubits can remain stable, a critical step toward making quantum systems more practical. While quantum computing is still in its infancy, advancements like this hint at a future where we solve problems once thought impossible. But alongside the opportunities lie challenges, especially for encryption and cybersecurity.

    What’s your take on quantum’s potential? Are we ready for this leap forward?

    PS. What is quantum computing, and its current applications (4 mins reading) https://lnkd.in/gDD3tXGb
    __________________
    I share my learning journey here. Join me and let's grow together. For more on AI & Machine Learning, please check my previous posts. Alex Wang

  • View profile for Rajat Taneja
    Rajat Taneja is an Influencer

    President, Technology at Visa

    125,343 followers

    We may be standing at a moment in time for Quantum Computing that mirrors the 2017 breakthrough on transformers – a spark that ignited the generative AI revolution 5 years later. With recent advancements from Google, Microsoft, IBM and Amazon in developing more powerful and stable quantum chips, the trajectory of QC is accelerating faster than many of us expected.

    Google’s Sycamore and next-gen Willow chips are demonstrating increasing fidelity. Microsoft’s pursuit of topological qubits using Majorana particles promises longer coherence times, and IBM’s roadmap is pushing towards modular, error-corrected systems. These aren’t just incremental steps; they are setting the stage for scalable, fault-tolerant quantum machines.

    Quantum systems excel at simulating the behavior of molecules and materials at atomic scale, solving optimization problems with exponentially large solution spaces, and modeling complex probabilistic systems – tasks that could take classical supercomputers millennia. For example, accurately simulating protein folding or discovering new catalysts for carbon capture are well within quantum’s potential reach.

    If scalable QC is just five years away, now is the time to ask: what would you do differently today if quantum were real tomorrow? That question isn’t hypothetical – it’s an invitation to start rethinking foundational problems in chemistry, logistics, finance, AI and cryptography.

    Of course, building quantum systems is notoriously hard. Fragile qubits, error correction and decoherence remain formidable challenges. But globally, public and private institutions are pouring resources into cracking these problems. I was in LA today visiting the famous USC Information Sciences Institute, where cutting-edge work on QC is underway and the energy is palpable.

    This feels like a pivotal moment. One where future-shaping ideas are being tested in real labs. Just as with AI, the future belongs to those preparing for it now.

    QC is an area of emphasis at Visa Research, and I hope it is part of how other organizations are thinking about the future too.

  • View profile for Philipp Kozin, PhD, EMBA

    Foresight | Scientific Intelligence | Scientific Partnerships | Innovation Leadership | Emerging Technologies | Open Innovation | External Innovation | Strategy Consulting | MBA ESSEC | PhD | Polymath | Futurist

    43,386 followers

    If you place a ball on a stationary saddle, it will inevitably roll off — the equilibrium point is unstable. But start oscillating or rotating the saddle fast enough, and something paradoxical happens: the ball stays centered. An unstable maximum turns into an effective potential well.

    This is dynamic stabilization. Fast periodic motion creates a time-averaged force that suppresses growing disturbances. Mathematically, this is described by equations with periodic coefficients (Floquet analysis). Physically, it appears as an effective potential that simply doesn’t exist in static conditions.

    The same principle underpins very real technologies:
    • Ion traps (Paul traps): time-varying electric fields confine charged particles where static fields cannot
    • Spin-stabilized systems & fusion concepts: rapid rotation or oscillating fields stabilize plasma
    • Inverted (Kapitza) pendulum: the upright position becomes stable under high-frequency vibration of the pivot

    Key takeaway: time is a resource. What cannot be stabilized in a static field can be stabilized by tuning frequency, amplitude, and phase. A rare case where “shaking” a system makes it more stable, not less. 🤯 #Physics #SystemsThinking #DynamicStabilization #Saddle #ComplexSystems #FuturesThinking #DeepTech #Science #Math #Mathematics #Experiments #NonlinearDynamics
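    The Kapitza pendulum is easy to check numerically. The sketch below is my own illustration with parameters I picked to satisfy the stabilization condition a·ω > √(2gl): it integrates an inverted pendulum whose pivot oscillates vertically. With the drive off, a small tilt from upright blows up; with a fast drive on, the same tilt stays bounded near vertical.

```python
import math

G, L = 9.81, 0.10   # gravity (m/s^2), pendulum length (m)

def accel(t, theta, amp, omega):
    """Angular acceleration; theta is measured from the inverted (upright)
    position. A pivot oscillating as amp*cos(omega*t) contributes an
    oscillating effective-gravity term."""
    return (G - amp * omega**2 * math.cos(omega * t)) / L * math.sin(theta)

def max_tilt(amp, omega, t_end=1.0, dt=2e-5):
    """Integrate with classic RK4 and return the largest |theta| reached."""
    theta, v, t, peak = 0.1, 0.0, 0.0, 0.1
    while t < t_end:
        k1t, k1v = v, accel(t, theta, amp, omega)
        k2t, k2v = v + 0.5*dt*k1v, accel(t + 0.5*dt, theta + 0.5*dt*k1t, amp, omega)
        k3t, k3v = v + 0.5*dt*k2v, accel(t + 0.5*dt, theta + 0.5*dt*k2t, amp, omega)
        k4t, k4v = v + dt*k3v, accel(t + dt, theta + dt*k3t, amp, omega)
        theta += dt * (k1t + 2*k2t + 2*k3t + k4t) / 6
        v += dt * (k1v + 2*k2v + 2*k3v + k4v) / 6
        t += dt
        peak = max(peak, abs(theta))
    return peak

undriven = max_tilt(amp=0.0, omega=0.0)       # static pivot: the tilt blows up
driven = max_tilt(amp=0.005, omega=1000.0)    # fast drive: stays near upright
print(undriven, driven)
```

    This is the "shaking makes it stable" claim in miniature: the drive term averages out to the effective potential well the post describes.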

  • View profile for Sean Connelly🦉
    Sean Connelly🦉 is an Influencer

    Architect of U.S. Federal Zero Trust | Co-author NIST SP 800-207 & CISA Zero Trust Maturity Model | Former CISA Zero Trust Initiative Director | Advising Governments & Enterprises

    22,643 followers

    🚨 New OMB Report on Post-Quantum Cryptography (PQC) 🚨

    The Office of Management and Budget (OMB) has released a critical report detailing the strategy for migrating federal information systems to Post-Quantum Cryptography. This report is in response to the growing threat posed by the potential future capabilities of quantum computers to break existing cryptographic systems.

    Key Points from the Report:
    🔑 Start Migration Early: The report emphasizes the need to begin migration to PQC before quantum computers capable of breaking current encryption become operational. This proactive approach is essential to mitigate risks associated with "record-now-decrypt-later" attacks.
    🔑 Focus on High-Impact Systems: Priority should be given to high-impact systems and high-value assets. Ensuring these critical components are secure is paramount.
    🔑 Identify Early: It's crucial to identify systems that cannot support PQC early in the process. This allows for timely planning and avoids migration delays.
    🔑 Cost Estimates: The estimated cost for this transition is approximately $7.1 billion over the period from 2025 to 2035. This significant investment underscores the scale and importance of the task.
    🔑 Cryptographic Module Validation Program (CMVP): To ensure the proper implementation of PQC, the CMVP will play a vital role. This program will validate that the new cryptographic modules meet the necessary standards.

    The full report outlines a comprehensive strategy and underscores the federal government’s commitment to maintaining robust cybersecurity in the quantum computing era. This is a critical step in safeguarding our digital infrastructure against future threats. #Cybersecurity #PQC #QuantumComputing #FederalGovernment #Cryptography #DigitalSecurity #OMB #NIST

  • View profile for Kiran Kaur Raina

    Founder & CEO @NucleQi | Quantum Security Research Engineer & Evangelist @Vyapti Resonance | AI @IIT Madras | Classiq Brand Ambassador | 2M+ Impressions | Researcher, Speaker, Consultant, Educator & EdTech YouTuber

    20,282 followers

    Trying to enter QML in 2026? This is the path I’d take, step by step.

    A Quantum Machine Learning roadmap should build three pillars in parallel:
    1) Mathematics & Classical ML foundations
    2) Quantum Computing foundations
    3) Hybrid Quantum-Classical ML implementation → Advanced QML models

    Think of QML as ML + Linear Algebra + Quantum Mechanics + Optimization.

    Step 1: Mathematics, Python, ML Stack & ML Basics
    Linear algebra - vectors, matrices, eigenvalues, tensor products
    Probability & statistics - distributions, expectation, variance
    Optimization - gradient descent, loss functions
    Python - NumPy, SciPy, Matplotlib; PyTorch or TensorFlow; Scikit-learn
    Supervised and unsupervised learning; regression, classification, overfitting, regularization; neural networks, CNN basics
    Goal: You should be comfortable building classical ML pipelines.

    Step 2: Quantum Computing Foundations
    Qubits, superposition, measurement, Bloch sphere
    Quantum gates, entanglement and Bell states
    Quantum circuits, interference
    Quantum algorithms - Deutsch-Jozsa, Grover’s algorithm, Quantum Fourier Transform, variational quantum algorithms
    Qiskit, Cirq, or Q# (one of them)
    Goal: You must think in circuits before doing QML.

    Step 3: Bridge to QML
    Parameterized quantum circuits, variational circuits
    Classical-quantum feedback loop, cost functions
    Barren plateaus, expressibility & trainability
    Difference between: quantum data → quantum model vs. classical data → quantum embedding
    PennyLane, TensorFlow Quantum, Qiskit ML
    Goal: Understand that QML is optimization over quantum parameters.

    Step 4: Core QML Models
    Quantum data encoding - angle embedding, amplitude encoding, basis encoding
    Quantum models - Variational Quantum Classifier, quantum neural networks, quantum kernel methods, quantum support vector machines, data re-uploading circuits
    Compare: classical NN vs VQC; classical SVM vs quantum kernel
    Goal: Show measurable learning, not just circuit execution.

    Step 5: Advanced QML Concepts
    Barren plateaus, noise-aware training, hardware-efficient ansatz
    Quantum Convolutional Neural Networks, quantum autoencoders, QGANs, QML for anomaly detection
    NISQ constraints - noise, shot statistics, error mitigation
    Goal: You understand real-world limitations and research gaps.

    Step 6: Research-Grade QML
    Read papers: Schuld & Killoran (quantum ML theory), Havlíček et al. (quantum kernel methods), McClean et al. (barren plateaus), Cerezo et al. (variational algorithms)
    Explore: hybrid classical-quantum architectures, quantum kernels vs classical kernels, data-efficient QML, noise-resilient QML, QML benchmarking
    Build 5–8 serious QML projects. Implement one paper reproduction and one modification or improvement.

    Happy Learning! Save this post for later. Repost ♻️ for Quantum & AI Learners! Check my profile for more resources on Quantum & AI Tech. Follow Kiran Kaur Raina here: 📌LinkedIn: https://lnkd.in/gEpKMQ7z 📌YouTube: https://lnkd.in/gTTv2ewB 📌Topmate: https://lnkd.in/gDj-kmYW 📌Medium: https://lnkd.in/gWBppT7G 📌Instagram: https://lnkd.in/g8qZKHe7
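    The "bridge to QML" steps in the roadmap above can be made concrete without any quantum SDK. The toy below is my own plain-NumPy illustration of a one-qubit variational classifier: angle embedding RY(x) followed by a trainable RY(θ), with the expectation ⟨Z⟩ = cos(x + θ) as the model output and the gradient obtained by the parameter-shift rule (the dataset and the hidden rotation of 0.7 rad are invented for the demo).

```python
import numpy as np

def ry(phi):
    """Single-qubit RY rotation gate."""
    c, s = np.cos(phi / 2), np.sin(phi / 2)
    return np.array([[c, -s], [s, c]])

def expval_z(x, theta):
    """<Z> after RY(theta) RY(x) |0>; analytically equal to cos(x + theta)."""
    state = ry(theta) @ ry(x) @ np.array([1.0, 0.0])
    return state[0] ** 2 - state[1] ** 2

def loss(xs, ys, theta):
    preds = np.array([expval_z(x, theta) for x in xs])
    return np.mean((preds - ys) ** 2)

# Toy dataset: labels generated by a "hidden" rotation of 0.7 rad.
xs = np.linspace(-1.5, 1.5, 40)
ys = np.sign(np.cos(xs + 0.7))

theta, lr = 0.0, 0.1
initial_loss = loss(xs, ys, theta)
for _ in range(300):
    grads = []
    for x, y in zip(xs, ys):
        # Parameter-shift rule: exact gradient from two shifted circuit runs.
        dexp = 0.5 * (expval_z(x, theta + np.pi / 2) - expval_z(x, theta - np.pi / 2))
        grads.append(2 * (expval_z(x, theta) - y) * dexp)
    theta -= lr * np.mean(grads)

accuracy = np.mean(np.sign([expval_z(x, theta) for x in xs]) == ys)
print(theta, accuracy)  # theta converges near the hidden 0.7
```

    This is the classical-quantum feedback loop from Step 3 in miniature: a classical optimizer updates the parameters of a (here, simulated) quantum circuit whose expectation value serves as the model output.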

  • View profile for Keith King

    Former White House Lead Communications Engineer, U.S. Dept of State, and Joint Chiefs of Staff in the Pentagon. Veteran U.S. Navy, Top Secret/SCI Security Clearance. Over 16,000+ direct connections & 43,000+ followers.

    43,796 followers

    IBM to Launch the Largest Quantum Computer Yet in 2025

    Overview: IBM plans to build the largest quantum computer to date by linking multiple smaller quantum chips in parallel. The project, set for 2025, aims to shatter existing records for qubit count, marking a significant leap in quantum computing capabilities. IBM’s goal is to more than triple the size of the largest current quantum machine while advancing practical quantum computing applications.

    Key Details:
    1. IBM’s Quantum Roadmap:
    • IBM’s largest current quantum chip, Condor, contains 1,121 qubits.
    • By 2025, IBM plans to interconnect multiple chips to exceed this number, ultimately aiming to triple the largest existing system.
    2. Milestone Achievements:
    • The company has successfully demonstrated linking two quantum chips, a key step toward building larger, interconnected systems.
    • This modular approach allows IBM to scale quantum systems beyond the physical and error-correction limits of single chips.
    3. Quantum Computing Use Cases:
    • IBM provides cloud access to its quantum systems, with most users currently utilizing about 100 qubits for practical tasks.
    • The expansion to larger systems will enable more complex computations in fields like drug discovery, materials science, and logistics optimization.

    The Significance of More Qubits:
    1. Increased Computational Power: more qubits let quantum systems attack problems whose classical cost grows exponentially with problem size.
    2. Error Correction: scaling qubit counts allows for improved quantum error correction, a critical barrier to achieving reliable quantum computations.
    3. Broader Accessibility: larger systems will allow more researchers and industries to access practical quantum applications via IBM’s cloud platform.

    IBM’s Competition in Quantum Computing:
    1. Atom Computing Holds Current Record: start-up Atom Computing currently holds the record for the largest quantum system, slightly surpassing IBM.
    2. Tech Industry Quantum Race: competitors like Google, Rigetti, and IonQ are also racing to scale up their quantum systems.
    3. IBM’s Modular Strategy: IBM’s approach focuses on scaling through chip interconnection, which could sidestep the limitations of monolithic single-chip systems.

    The Takeaway: IBM’s 2025 quantum computer project aims to break new ground by creating the largest quantum system ever built, leveraging interconnected quantum chips to scale qubit counts. While significant technical challenges remain—particularly around error correction and chip interconnectivity—the initiative marks a critical step toward practical, large-scale quantum computing. With competitors like Atom Computing and Google also advancing rapidly, the race for quantum supremacy intensifies, promising transformative impacts across science, industry, and technology in the near future.
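    One way to see why qubit count matters so much (back-of-envelope arithmetic of my own, not IBM's figures): a dense n-qubit state vector holds 2^n complex amplitudes, so the cost of classically simulating or even describing the machine doubles with every added qubit.

```python
def statevector_bytes(n_qubits, bytes_per_amp=16):
    """Memory to store a dense n-qubit state vector as complex128 amplitudes."""
    return (2 ** n_qubits) * bytes_per_amp

# Each extra qubit doubles the cost:
for n in (30, 40, 50):
    print(f"{n} qubits: {statevector_bytes(n) / 2**30:,.0f} GiB")
# 30 qubits already need 16 GiB; 50 qubits need ~16 million GiB.
```

    That exponential wall is what makes thousand-qubit machines interesting, and it is also why error correction, not raw count, decides whether those qubits are usable.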

  • View profile for Jay Gambetta

    Director of IBM Research and IBM Fellow

    20,557 followers

    A new paper, now published in Nature Computational Science, introduces "Quantum Approximate Multi-Objective Optimization," a breakthrough from researchers at IBM, Los Alamos National Laboratory, and Zuse Institute Berlin. This work represents one of the most promising proposals for near-term demonstrations of quantum advantage in combinatorial optimization, with enormous relevance across industry and science: https://lnkd.in/ew7Pe2K5 Multi-objective optimization is a branch of mathematical optimization that deals with problems involving multiple often conflicting goals—e.g., constructing financial portfolios that minimize risk while maximizing returns. These problems can be extremely challenging for classical methods as the number of objective functions increases, even in cases where the single-objective version of the problem is easily solvable. The study demonstrates how quantum computers can approximate the optimal Pareto front, i.e., the set of all optimal trade-offs between conflicting objectives, showing better scaling than classical algorithms. Sampling good solutions from vast solution spaces is a task at which quantum computers excel, and the researchers take full advantage of that in their work. This marks an important step toward practical quantum advantage in optimization, and shows the value of exploring quantum capabilities beyond conventional problem classes. The paper is the latest outcome from our quantum optimization technical working group, and I encourage you to have a look.
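    For readers new to the terminology: the Pareto front the paper targets is simply the set of non-dominated trade-offs. A small classical sketch (my own illustration of the definition, unrelated to the paper's quantum algorithm) for two objectives that are both minimized:

```python
def pareto_front(points):
    """Return the non-dominated points when minimizing every objective.

    A point p is dominated if some other point q is no worse in every
    objective and strictly better in at least one.
    """
    front = []
    for p in points:
        dominated = any(
            all(qi <= pi for qi, pi in zip(q, p)) and q != p
            for q in points
        )
        if not dominated:
            front.append(p)
    return front

# Toy risk/return-style trade-offs (both objectives minimized):
candidates = [(1, 5), (2, 4), (3, 3), (4, 2), (2, 6), (5, 1), (3, 5)]
print(pareto_front(candidates))
# (2, 6) and (3, 5) are dominated and drop out; the rest form the front.
```

    The hard part, which the paper addresses, is that enumerating candidates like this is infeasible for combinatorial problems with exponentially large solution spaces; the quantum approach instead samples good solutions to approximate this front.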
