Role of Shor Code in Quantum Technology


Summary

The Shor code is a quantum error-correcting code used in quantum technology to protect quantum information from errors caused by hardware noise. Recent advances in optimizing Shor's algorithm, which must run on hardware protected by such error-correcting codes, mean that quantum computers will need fewer physical qubits to potentially break widely used encryption systems like RSA and elliptic curve cryptography, raising concerns about cybersecurity in the near future.

  • Stay informed: Keep up with developments in quantum error correction and algorithmic improvements, as these shift timelines for cryptographic vulnerabilities.
  • Plan migration: Begin preparing for the adoption of post-quantum cryptography to protect sensitive data from rapidly advancing quantum capabilities.
  • Coordinate security: Engage with technology and security teams to align on strategies for migrating systems—especially decentralized ones like blockchain—before quantum threats become reality.
Summarized by AI based on LinkedIn member posts
  • Charles Guillemet, CTO at Ledger

    Today, Google Quantum AI published a research paper that might accelerate the post-quantum migration. Their team has tailored Shor's algorithm to solve the 256-bit Elliptic Curve Discrete Logarithm Problem (ECDLP) using fewer than 1,200 logical qubits and 90 million Toffoli gates. ECDLP is the hard mathematical problem that secures ECDSA, the signature scheme underpinning most blockchains, TLS certificates, and countless authentication systems. Translated to hardware: fewer than 500,000 physical qubits, executing in a few minutes. A few minutes. Less than a Bitcoin block time. Less than two Ethereum epochs. The long-standing argument that public keys can simply remain hidden is now moot.

    What exactly changed? Shor's algorithm has been known since 1994 as a generic quantum approach to factoring integers and computing discrete logarithms. But "known" and "practical" are very different things. The real progress is in the engineering: how many qubits and gates you actually need once you compile the algorithm into a fault-tolerant quantum circuit. The recent algorithmic trendline is clear: every 12 to 18 months, the resource estimates drop significantly. And these are pure algorithmic gains: they compound on top of hardware improvements, which remain a major challenge. As of today, however, we are still far from having such a quantum computer. That hasn't changed.

    Zero-knowledge proof. Here's where it gets interesting. Google chose not to publish their optimized circuits. Instead, they released a zero-knowledge proof that their circuits achieve the claimed resource counts. We have no doubt they know how to do it, but no clue how. The reasons are likely multiple: competitive advantage, national security implications... Regardless, it establishes a powerful (and elegant) precedent. The irony: Google's ZK proof is not itself post-quantum secure.

    What's next? The good news is that we already have the tools: post-quantum cryptography. Now we need to migrate.
    A few days ago, Google announced it is targeting 2029 for full post-quantum readiness. NIST plans to deprecate RSA signatures by 2030 and disallow all legacy algorithms by 2035. Cryptography exists to create mathematical trust in the security of systems. That trust is now being eroded, not by a working attack, but by the increasingly credible prospect of one. In security, the moment you start doubting the foundation is the moment you should be rebuilding it.

    What this means for blockchains. For blockchain ecosystems specifically, the threat is central: ECDSA on the secp256k1 (Bitcoin) and P-256 curves is the cornerstone of security. Unlike traditional systems, where you can rotate certificates behind a corporate firewall, blockchain migration requires coordination across decentralized, permissionless networks. This process will likely take time. I'll be diving deeper into the concrete challenges and strategies for PQC migration on blockchains and secure systems in my keynote this Thursday at the EthCC conference.
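The quantum core of Shor's algorithm is order finding: given a and N, find the smallest r with a^r ≡ 1 (mod N). A minimal classical sketch, brute-forcing the order that a quantum computer would find exponentially faster via phase estimation, shows how the reduction yields a factor. This is an illustration for the toy case N = 15, not anything from Google's optimized circuits:

```python
from math import gcd
from random import randrange

def order(a, n):
    """Brute-force the multiplicative order r of a mod n.
    This loop is the step Shor's algorithm replaces with
    quantum phase estimation."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n):
    """Classical stand-in for Shor's reduction: factor n via order finding."""
    while True:
        a = randrange(2, n)
        g = gcd(a, n)
        if g > 1:
            return g  # lucky draw: a already shares a factor with n
        r = order(a, n)
        if r % 2 == 0:
            y = pow(a, r // 2, n)
            if y != n - 1:          # avoid the trivial square root
                f = gcd(y - 1, n)
                if 1 < f < n:
                    return f

print(shor_factor(15))  # prints a nontrivial factor: 3 or 5
```

For N = 15 this finishes instantly; for a 2048-bit RSA modulus the classical `order` loop is hopeless, which is exactly the gap the quantum subroutine closes.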

  • Alexander Bechtel, Global Head of Digital Products

    When I published my latest column in Frankfurter Allgemeine Zeitung on #quantum computing and its implications for #Bitcoin a few weeks ago (links to the German and English articles in the comments), I didn't expect the next major development to arrive this quickly. Last week, two all-star research teams spanning quantum computing, cryptography, and blockchain published two papers (links in the comments) that dramatically lower the estimated resources needed to break the elliptic curve cryptography (ECC-256) securing virtually every major blockchain.

    🔍 What's the issue? Bitcoin's security rests on asymmetric cryptography, specifically on elliptic curves. Put simply, it is virtually impossible for conventional computers to derive a private key (the password) from a public key (the account number). A sufficiently powerful quantum computer, however, could solve this problem using Shor's algorithm.

    🔍 What did the papers find? Google's paper (Babbush et al.) shows that Shor's algorithm could break ECC-256 with fewer than 500,000 physical superconducting qubits in as little as 9 minutes. That's a 20× improvement in efficiency over prior estimates. A 9-minute window matters enormously: it means not only bitcoins sitting on already-vulnerable addresses are at risk, but so-called "on-spend" attacks become feasible too. These attacks exploit the fact that public keys are briefly exposed when bitcoins are spent, before the transaction settles (typically around 10 minutes). Oratomic's paper (Cain et al.), from Caltech and UC Berkeley, shows that the same cryptography could be broken with as few as 10,000–26,000 physical qubits, albeit over days rather than minutes. While too slow for on-spend attacks, this timeframe would be more than sufficient to target bitcoins sitting on vulnerable addresses where public keys are already permanently exposed.

    🔍 What this does and doesn't mean. To be clear: these papers represent algorithmic and architectural breakthroughs, not hardware breakthroughs. Quantum computers powerful enough to execute these attacks are still as likely (or unlikely) to arrive by the end of this decade as they were before. What has changed is our understanding of how little computing power would actually be needed. The gap between what's required and what's being built just got a lot smaller.
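The "virtually impossible" claim comes down to the elliptic curve discrete logarithm: recovering the private key d from the public key Q = d·G. A toy sketch on a textbook curve over F_17 (assumed illustrative parameters, nothing like secp256k1) shows both the key relation and why brute force only works at toy scale:

```python
# Toy elliptic curve y^2 = x^3 + 2x + 2 over F_17 (a standard textbook
# example). Real ECDSA uses 256-bit curves, where the brute-force search
# below would take on the order of 2^128 group operations.
P_MOD, A = 17, 2
G = (5, 1)  # generator point on the curve

def ec_add(p1, p2):
    """Add two curve points; None represents the point at infinity."""
    if p1 is None: return p2
    if p2 is None: return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None  # inverse points sum to infinity
    if p1 == p2:  # point doubling: tangent slope
        s = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD)
    else:         # distinct points: chord slope
        s = (y2 - y1) * pow(x2 - x1, -1, P_MOD)
    s %= P_MOD
    x3 = (s * s - x1 - x2) % P_MOD
    return (x3, (s * (x1 - x3) - y1) % P_MOD)

def scalar_mult(k, pt):
    """Compute k*pt by repeated addition (fine for a toy curve)."""
    acc = None
    for _ in range(k):
        acc = ec_add(acc, pt)
    return acc

d = 13                  # "private key"
Q = scalar_mult(d, G)   # "public key", safe to reveal classically

# Classical attack: step through k*G until it matches Q.
pt, k = G, 1
while pt != Q:
    pt = ec_add(pt, G)
    k += 1
print(k)  # recovers d = 13
```

On this 17-element field the loop finds d immediately; doubling the key size roughly squares the classical work, while Shor's algorithm scales only polynomially, which is the whole threat.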

  • Duncan Jones, General Manager at Quantinuum

    A recent paper demonstrates how to run Shor's algorithm using considerably fewer quantum gates. Using the new approach, factoring a 2048-bit RSA key might require about 100k gates instead of approximately 4 million. Despite this aggressive improvement, we may not need to panic just yet. There is rarely a free lunch in algorithm design, and this is no exception. One of the trade-offs made in this paper is an increase in the number of qubits required to implement the algorithm. As the authors acknowledge:

    "... an improvement in the number of gates does not necessarily translate into an improved practical implementation. Indeed, in most architectures currently being considered by industry, the space (or number of qubits) plays an important role. ... It therefore remains to be seen whether the algorithm can lead to improved physical implementations in practice."

    Papers like this (see link below) are one of the reasons experts struggle to predict when quantum computers will break modern cryptography. The landscape is always shifting, with both the computers and the algorithms improving every year. Each bright idea potentially brings "Q-Day" closer.

    ℹ️ Over 150 cyber professionals read my new weekly newsletter. Sign up using the link in the first comment 👇 #cybersecurity #cryptography #pqc #quantumcomputing #encryption
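The gate-vs-qubit trade-off the authors describe can be made concrete with a crude "spacetime volume" proxy (gates × qubits). The gate counts below echo the post; the qubit counts are invented for illustration only, since the paper's exact figures aren't quoted here:

```python
# Back-of-envelope sketch: fewer gates does not automatically mean a
# cheaper computation if the qubit count grows. Qubit figures are
# assumed, not taken from the paper.
designs = {
    "prior estimate": {"gates": 4_000_000, "qubits": 10_000},   # qubits assumed
    "new algorithm":  {"gates": 100_000,   "qubits": 600_000},  # qubits assumed
}

volumes = {}
for name, d in designs.items():
    # crude cost proxy: total gates times qubits held alive
    volumes[name] = d["gates"] * d["qubits"]
    print(f"{name}: {d['gates']:>9,} gates x {d['qubits']:>7,} qubits "
          f"= {volumes[name]:.1e} gate-qubits")
```

Under these assumed numbers, a 40× gate reduction is more than cancelled by a 60× qubit increase, which is exactly the caveat the authors raise.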

  • Frédéric Barbaresco, Thales "Quantum Algorithms/Computing" and "AI/Algo for Sensors" Segment Leader

    Shor's algorithm is possible with as few as 10,000 reconfigurable atomic qubits, per a new paper co-authored by John Preskill (Caltech): https://lnkd.in/ethGUK8B

    From the paper's abstract: Quantum computers have the potential to perform computational tasks beyond the reach of classical machines. A prominent example is Shor's algorithm for integer factorization and discrete logarithms, which is of both fundamental importance and practical relevance to cryptography. However, due to the high overhead of quantum error correction, optimized resource estimates for cryptographically relevant instances of Shor's algorithm require millions of physical qubits. Here, by leveraging advances in high-rate quantum error-correcting codes, efficient logical instruction sets, and circuit design, we show that Shor's algorithm can be executed at cryptographically relevant scales with as few as 10,000 reconfigurable atomic qubits. Increasing the number of physical qubits improves time efficiency by enabling greater parallelism; under plausible assumptions, the runtime for discrete logarithms on the P-256 elliptic curve could be just a few days for a system with 26,000 physical qubits, while the runtime for factoring RSA-2048 integers is one to two orders of magnitude longer. Recent neutral-atom experiments have demonstrated universal fault-tolerant operations below the error-correction threshold, computation on arrays of hundreds of qubits, and trapping arrays with more than 6,000 highly coherent qubits. Although substantial engineering challenges remain, our theoretical analysis indicates that an appropriately designed neutral-atom architecture could support quantum computation at cryptographically relevant scales. More broadly, these results highlight the capability of neutral atoms for fault-tolerant quantum computing with wide-ranging scientific and technological applications.

  • Dr. Corey O'Meara, Chief Quantum Scientist @ E.ON | 2x Quantum Computing Innovator of the Year | Co-founder Nova Spraytec

    This could have big implications. Breaking RSA-2048 encryption using only 100k instead of 1 million physical qubits? Yesterday a paper appeared online that is quite rigorous in its resource-estimation analysis, showing that a variant of Shor's algorithm can be run using only 100k physical qubits. That's an order-of-magnitude reduction from Gidney's previous record estimate from 2025. Considering that quantum hardware roadmaps predict fault-tolerant machines with this many qubits (and using the same error-correcting code family) will exist in the early 2030s, it could mean the standard encryption schemes just got a scary amount closer to being obsolete. As an aside, my new favorite unit is the "megaqubit" (10^6 qubits); I look forward to seeing it used in the next era of fault-tolerant resource-estimation papers! Paper link 👉 https://lnkd.in/dtgFiUE2 Congrats to Paul Webster, Larry Cohen and the team from Iceberg Quantum for the nice result. #quantumcomputing #quantumapplications #encryption

  • Ken Wasserman, Assistant Professor at Georgetown University School of Medicine

    Perplexity: Executing Shor's algorithm at a cryptographically relevant scale, around 10,000 qubits, is now feasible thanks to the unique capabilities of reconfigurable neutral-atom arrays. In these systems, qubits are encoded in long-lived clock states trapped by optical tweezers, allowing them to move and reconfigure dynamically between gate operations. This mobility yields massive parallelism and nonlocal connectivity, enabling efficient quantum low-density parity-check (qLDPC) codes. Unlike planar surface codes that require millions of qubits, high-rate qLDPC codes can encode over 1,000 logical qubits in a single block at ~30% efficiency, cutting physical-qubit overhead by up to two orders of magnitude.

    The proposed modular architecture divides the system into functional zones:
      • A memory zone for stable logical data storage.
      • A processor zone for active computations.
      • An operation zone with ancillary qubits performing logical Pauli product measurements for read/write/edit tasks.
      • A resource zone producing "magic states" (e.g., |CCZ⟩ states) for universal computation.

    By confining operations and using verified code surgeries, the design avoids applying complex gates across all memory blocks, drastically improving efficiency. This architecture represents a major leap toward practical cryptographic applications:
      • ECC-256 discrete logarithms can be solved in ≈10 days with ~26k qubits.
      • RSA-2048 factoring can be achieved in ≈97 days with ~102k qubits, or in more compact sequential setups using 11k–14k qubits over ~264 days.

    These results outline a concrete path to utility-scale fault-tolerant quantum computing (FTQC) by integrating flexible neutral-atom hardware with high-rate codes and modular circuit design. Current systems already demonstrate universal FTQC below the error-correction threshold, making million-gate computations on thousands of logical qubits a near-term reality.

    Since its debut in 1994, Shor's algorithm has served as the benchmark driving quantum computation toward scale, proving superpolynomial speedups and motivating innovations in error correction, resource optimization, and reconfigurable hardware. Its continuing refinement now signals that full-scale, industrially relevant quantum computing is within reach.

    https://lnkd.in/eebV4-3a https://lnkd.in/ew4fG-j7 https://lnkd.in/emPME-dJ https://lnkd.in/dszZZNwT https://lnkd.in/eCBUgM7j https://lnkd.in/eeJ3u_H9 https://lnkd.in/eGAe3Zb3

    Listen to the podcast: https://lnkd.in/e_u_NfCJ
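The overhead gap between planar surface codes and high-rate qLDPC codes quoted above can be sketched with simple arithmetic. The code distance and the ~30% encoding rate below are illustrative assumptions, not figures from the paper:

```python
# Rough physical-qubit overhead comparison behind the
# "two orders of magnitude" claim. Distance and rate are assumed.
def surface_code_physical(n_logical, distance=25):
    # roughly 2*d^2 physical qubits (data + measurement) per logical qubit
    return n_logical * 2 * distance ** 2

def qldpc_physical(n_logical, rate=0.30):
    # a high-rate block code stores n_logical qubits at ~rate logical/physical
    return round(n_logical / rate)

n = 1_000  # logical qubits in one block, as quoted above
print(surface_code_physical(n))  # 1,250,000 physical qubits
print(qldpc_physical(n))         # 3,333 physical qubits
```

Under these assumptions the surface code needs hundreds of times more physical qubits for the same logical capacity, which is the saving the neutral-atom architecture exploits.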

  • Michaela Eichinger, PhD, Product Solutions Physicist @ Quantum Machines

    The common estimate for breaking RSA with a quantum computer? Around 20 million qubits. This week, Craig Gidney from Google Quantum AI showed how to do it with less than 1 million.

    Before you panic: this still requires a fault-tolerant quantum computer, something no one has built yet. And it would take a few days to run. So no, your encrypted messages are not in danger.

    What changed? Not the hardware. The trick lies in how the algorithm, Shor's algorithm, is assembled and optimized:
      • Space-time compression: fewer qubits, longer runtime. The new architecture runs over several days instead of hours, but requires far less parallelism.
      • Better building blocks: the paper uses compact arithmetic and smarter magic-state generation to reduce ancilla overhead.
      • Code-level savings: more efficient layout and scheduling of surface-code operations cuts down the number of T factories and long-lived logical qubits.

    Gidney's work is a clear reminder that meaningful progress isn't just happening in the lab; it's also happening in the compiler, in the fault-tolerance architecture, and in how we think about the resources at hand.

    📸 Image credits: Craig Gidney
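The space-time compression point can be illustrated with a qubit-hours estimate comparing the two resource figures above (20 million qubits vs. under 1 million). The runtimes used here, 8 hours for the older design and 5 days for the newer one, are assumed round numbers for the arithmetic, not figures from the papers:

```python
# Illustrative qubit-hours comparison of the two RSA-2048 estimates.
# Runtimes (8 hours vs. 5 days) are assumptions for illustration.
estimates = {
    "older estimate": {"qubits": 20_000_000, "hours": 8},
    "newer estimate": {"qubits": 1_000_000,  "hours": 5 * 24},
}
for name, e in estimates.items():
    volume = e["qubits"] * e["hours"]
    print(f"{name}: {volume:.2e} qubit-hours")
```

Under these assumptions the qubit count drops 20× while the runtime grows only ~15×, so the total qubit-hours shrink: part space-time trade, part genuine algorithmic saving.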
