Quantum Computing Requirements for Integer Factoring


Summary

Quantum computing requirements for integer factoring refer to the hardware and algorithmic resources needed to use quantum computers for breaking encryption by factoring large numbers, a task central to Shor's algorithm. Recent research shows that improvements in quantum circuit design and error correction are rapidly reducing the number of qubits and time required, making the threat to cryptographic standards more tangible.

  • Monitor breakthroughs: Stay updated on advancements in quantum hardware and algorithms, as each new development can drastically reduce resource estimates for factoring large integers.
  • Plan migration: Begin preparing for a shift to quantum-resistant cryptographic standards to protect data against future quantum threats.
  • Assess architectures: Evaluate diverse quantum computing architectures, such as memory-processor separation, to understand how modular designs can lower qubit requirements and improve efficiency.
Summarized by AI based on LinkedIn member posts
  • Frédéric Barbaresco

    THALES "QUANTUM ALGORITHMS/COMPUTING" AND "AI/ALGO FOR SENSORS" SEGMENT LEADER

    31,320 followers

Shor’s algorithm is possible with as few as 10,000 reconfigurable atomic qubits, via John Preskill (Caltech): https://lnkd.in/ethGUK8B

Quantum computers have the potential to perform computational tasks beyond the reach of classical machines. A prominent example is Shor's algorithm for integer factorization and discrete logarithms, which is of both fundamental importance and practical relevance to cryptography. However, due to the high overhead of quantum error correction, optimized resource estimates for cryptographically relevant instances of Shor's algorithm require millions of physical qubits. Here, by leveraging advances in high-rate quantum error-correcting codes, efficient logical instruction sets, and circuit design, we show that Shor's algorithm can be executed at cryptographically relevant scales with as few as 10,000 reconfigurable atomic qubits. Increasing the number of physical qubits improves time efficiency by enabling greater parallelism; under plausible assumptions, the runtime for discrete logarithms on the P-256 elliptic curve could be just a few days for a system with 26,000 physical qubits, while the runtime for factoring RSA-2048 integers is one to two orders of magnitude longer. Recent neutral-atom experiments have demonstrated universal fault-tolerant operations below the error-correction threshold, computation on arrays of hundreds of qubits, and trapping arrays with more than 6,000 highly coherent qubits. Although substantial engineering challenges remain, our theoretical analysis indicates that an appropriately designed neutral-atom architecture could support quantum computation at cryptographically relevant scales. More broadly, these results highlight the capability of neutral atoms for fault-tolerant quantum computing with wide-ranging scientific and technological applications.

  • Duncan Jones

    General Manager at Quantinuum

    9,076 followers

    A recent paper demonstrates how to run Shor's algorithm using considerably fewer quantum gates. Using the new approach, factoring a 2048-bit RSA key might require about 100k gates instead of approximately 4 million. Despite this aggressive improvement, we may not need to panic just yet. There is rarely a free lunch in algorithm design, and this is no exception. One of the trade-offs made in this paper is an increase in the number of qubits required to implement the algorithm. As the authors acknowledge: "... an improvement in the number of gates does not necessarily translate into an improved practical implementation. Indeed, in most architectures currently being considered by industry, the space (or number of qubits) plays an important role.... It therefore remains to be seen whether the algorithm can lead to improved physical implementations in practice." Papers like this (see link below) are one of the reasons experts struggle to predict when quantum computers will break modern cryptography. The landscape is always shifting, with both the computers and the algorithms improving every year. Each bright idea potentially brings "Q Day" closer. -- ℹ️ Over 150 cyber professionals read my new weekly newsletter. Sign up using the link in the first comment 👇 #cybersecurity #cryptography #pqc #quantumcomputing #encryption

  • Zulfikar Ramzan

    Chief Technology and Artificial Intelligence Officer, Point Wild

    5,922 followers

Here’s a crypto post -- but not the kind that involves a ledger. A new paper from Craig Gidney at Google has sharpened the picture around a major research question in cryptography: how hard is it really to break RSA with a quantum computer? A 2019 paper by Gidney & Ekerå showed that a 2048-bit RSA key could be factored with ~20 million noisy qubits, running in about 8 hours. Gidney’s latest estimate cuts that requirement by an order of magnitude: fewer than one million qubits, and a runtime of less than a week. The improvement comes from trading space for time and deploying clever techniques like:
* Approximate residue arithmetic, which compresses modular exponentiation by discarding unneeded precision
* Magic state cultivation, which reduces overhead in fault-tolerant gate operations
* Compact surface code layouts, which store qubits more efficiently while keeping errors in check
The intuition is subtle but powerful: to extract the period associated with modular exponentiation (a key step in Shor’s algorithm), you don’t need a perfect answer—just enough clean signal, handled carefully enough to preserve the interference pattern you're looking for. (Shor's algorithm uses quantum parallelism to create a periodic signal, and then uses (Quantum) Fourier analysis -- via the (Quantum) Fourier Transform -- to determine the period, which effectively turns factoring into a signal-processing problem.) So, what are the implications? First, it's not time to panic (yet). Today’s quantum hardware handles ~100 qubits, all noisy and none fault-tolerant. So, we still need to improve current quantum computing hardware by a factor of at least 10,000. However, progress will continue to be made, and that progress can be non-linear. More concretely, this paper narrows the gap between theoretical risk and engineering feasibility. The requirements for breaking RSA are concrete.
For governments and organizations still relying on cryptosystems like RSA whose security is related to the complexity of factoring large integers, the message is clear: the sky isn’t falling, but the clouds are moving. Migrating between cryptographic algorithms is a slow, fragile, and complicated process. There is no switch one can flip to transition between traditional algorithms and those that are considered quantum safe. And yes, I lied in the first line: if someone builds a quantum computer at this scale, Bitcoin and other cryptocurrencies would likely be among the first targets... 📄 Paper: https://lnkd.in/gGY3JRgw #quantumcomputing #cryptography #postquantum #RSA #security
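The signal-processing framing above can be demonstrated classically on tiny numbers: brute-force the period r of a^x mod N (the step a quantum computer accelerates exponentially), then recover factors via gcd. A minimal sketch; the function names are my own, and the brute-force search is only feasible for toy inputs:

```python
from math import gcd

def order(a, n):
    """Brute-force the multiplicative order r of a mod n: the smallest
    r > 0 with a**r % n == 1. This is the step Shor's algorithm
    replaces with quantum period finding."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, a):
    """Recover nontrivial factors of n from the period of a**x % n."""
    g = gcd(a, n)
    if g != 1:
        return g, n // g          # lucky guess: a already shares a factor
    r = order(a, n)
    if r % 2 == 1:
        return None               # odd period: retry with another a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None               # trivial square root: retry
    p, q = gcd(y - 1, n), gcd(y + 1, n)
    return (p, q) if p * q == n else None

# Factor 15 with base a = 7: the period of 7**x mod 15 is r = 4, and
# gcd(7**2 - 1, 15) and gcd(7**2 + 1, 15) reveal the factors.
print(shor_classical(15, 7))  # (3, 5)
```

The quantum speedup lies entirely in `order`: classically it takes time exponential in the bit length of n, while Shor's algorithm finds the period in polynomial time via the Quantum Fourier Transform.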

  • John Engates

    Multi-Agent AI Researcher & Agentic Engineer | Distributed Systems & Agentic Architectures | 25 years in CTO Roles at Cloudflare / NTT / Rackspace | Board Director | President of SIM Chapter in San Antonio

    9,430 followers

Recent advancements in quantum computing have significantly reduced the estimates for resources required to break RSA-2048 encryption. A 2025 study by Craig Gidney indicates that factoring a 2048-bit RSA integer could now be achieved in under a week using fewer than one million noisy qubits. This marks a substantial decrease from the previous estimate of 20 million qubits proposed in 2019. This massive reduction can be attributed to algorithmic optimizations and more efficient quantum circuit designs. As quantum hardware continues to advance, the feasibility of breaking current encryption standards becomes increasingly plausible. We're quickly closing the gap between theory and reality. The convergence of these software and hardware breakthroughs suggests a mid-2030s "Q-Day” (the date when quantum computers can compromise existing encryption methods) is realistic. This is no longer just a theoretical concern but an impending reality. It’s the modern equivalent of Y2K. In the meantime, attackers are harvesting our data, just waiting for the day quantum computers of sufficient capability arrive. The urgency for post-quantum cryptography is clear. Organizations must prioritize transitioning to quantum-resistant cryptographic standards to safeguard sensitive data against future quantum threats. https://lnkd.in/gY9uTA9g

  • William Munizzi

    Senior QEC Theorist @ Q-Ctrl | UCLA Postdoc | Past-Chair APS FGSA

    5,797 followers

Scaling quantum computing isn’t just about building better qubits; it’s about designing better architectures. ⚛️ Last week, Q-CTRL announced Q-NEXUS, a heterogeneous quantum computing architecture inspired by a familiar idea from classical computing: separating the processor from memory so each component can focus on what it does best. 💡 The motivation is compelling. In algorithms like Shor factoring, qubits remain idle up to 97% of the time. Holding idle data in expensive, actively error-corrected hardware is enormously wasteful. Instead, Q-NEXUS routes idle quantum data to dedicated memory modules which utilize different qubit types and error-correcting codes matched to the task. The result is striking, yielding up to a 138× reduction in physical qubit overhead and a 551× reduction in algorithmic error, compared to a monolithic baseline with comparable runtime. For RSA-2048 factorization, this modular approach reduces the requirement from 900k physical qubits to 190k, with a runtime under 10 days. Perhaps most inspiring is the broader implication that there may not be a single "winning" qubit: superconducting qubits for fast processing, trapped ions or neutral atoms for memory, photonics for interconnects, each playing to their respective strengths within a unified architecture. This philosophy reframes quantum scaling from a race to build one perfect device into a systems engineering problem that mirrors how classical computing evolved and matured. For those of you working on scaling or large-scale architecture, how do you view this approach? 📄 arxiv.org/abs/2604.06319 #Physics #QuantumComputing #FaultTolerance #ErrorCorrection #ComputingArchitecture #Science
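The "idle 97% of the time" observation invites a back-of-envelope model of why memory-processor separation cuts overhead. Everything below is an illustrative toy calculation with assumed parameters (a distance-25 surface code, a rate-1/4 memory code), not the actual Q-NEXUS design:

```python
def surface_code_qubits(d):
    # Rough surface-code footprint for one logical qubit:
    # d*d data qubits plus (d*d - 1) measurement ancillas.
    return 2 * d * d - 1

def monolithic(n_logical, d):
    """Baseline: every logical qubit lives in actively corrected surface code."""
    return n_logical * surface_code_qubits(d)

def heterogeneous(n_logical, d, idle_frac, memory_rate):
    """Active qubits stay in surface code; idle ones are parked in a
    high-rate memory code storing k logical per n physical qubits
    (memory_rate = k/n)."""
    active = round(n_logical * (1 - idle_frac))
    idle = n_logical - active
    return active * surface_code_qubits(d) + round(idle / memory_rate)

# Toy numbers only: 1,000 logical qubits, distance-25 surface code,
# 97% idle (the figure quoted above), rate-1/4 memory code.
mono = monolithic(1000, 25)
hetero = heterogeneous(1000, 25, idle_frac=0.97, memory_rate=0.25)
print(mono, hetero, round(mono / hetero, 1))  # ~30x fewer physical qubits
```

Even this crude model shows an order-of-magnitude saving; the paper's larger reductions come from additionally matching qubit types and codes to each zone's task.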

  • Meling Mudin

    Chief Information Security Officer

    4,251 followers

The implications of the latest research by Craig Gidney from Google on factoring RSA-2048 are profound (https://lnkd.in/g8mKeqRh). This paper, titled "How to factor 2048 bit RSA integers with less than a million noisy qubits", suggests a pathway to breaking this widely used encryption with under a million noisy qubits, potentially within a week! The finding is a dramatic reduction from previous estimates of 20 million qubits! This shifts the conversation dramatically:
* Reduced Quantum Threat Threshold: The barrier for quantum computers to compromise RSA-2048 has significantly lowered, bringing the threat horizon much closer.
* Increased Urgency for Post-Quantum Cryptography (PQC): The timeline for potential RSA compromise may be accelerating. The need to identify, standardize, and implement robust PQC solutions is now more critical than ever.
* Algorithmic and Architectural Advances: This breakthrough underscores the power of clever algorithms and efficient quantum architectures in tackling complex cryptographic challenges. The path to quantum advantage is being actively researched.
* The Question Has Changed: We're no longer just asking if a quantum computer will break RSA-2048, but when and through which path of algorithmic and hardware development. This research offers a potential answer to the latter, making the former seem increasingly inevitable.
For those of us in cybersecurity, especially in highly regulated sectors like finance, this isn't a theoretical discussion anymore. It's a call to action to understand, prepare for, and actively transition to a post-quantum future. Let's connect and discuss how we can collectively navigate this evolving landscape.

  • Ken Wasserman

    Assistant Professor at Georgetown University School of Medicine

    4,549 followers

Perplexity: Executing Shor’s algorithm at a cryptographically relevant scale—around 10,000 qubits—is now feasible thanks to the unique capabilities of reconfigurable neutral‑atom arrays. In these systems, qubits are encoded in long‑lived clock states trapped by optical tweezers, allowing them to move and reconfigure dynamically between gate operations. This mobility yields massive parallelism and nonlocal connectivity, enabling efficient quantum low‑density parity‑check (qLDPC) codes. Unlike planar surface codes that require millions of qubits, high‑rate qLDPC codes can encode over 1,000 logical qubits in a single block at ~30% efficiency, cutting physical qubit overhead by up to two orders of magnitude.
The proposed modular architecture divides the system into functional zones:
* A memory zone for stable logical data storage.
* A processor zone for active computations.
* An operation zone with ancillary qubits performing logical Pauli product measurements for read/write/edit tasks.
* A resource zone producing “magic states” (e.g., ∣CCZ⟩ states) for universal computation.
By confining operations and using verified code surgeries, the design avoids applying complex gates across all memory blocks, drastically improving efficiency. This architecture represents a major leap toward practical cryptographic applications:
* ECC‑256 discrete logarithms can be solved in ≈10 days with ~26k qubits.
* RSA‑2048 factoring can be achieved in ≈97 days with ~102k qubits, or in more compact sequential setups using 11k–14k qubits over ~264 days.
These results outline a concrete path to utility‑scale fault‑tolerant quantum computing (FTQC) by integrating flexible neutral‑atom hardware with high‑rate codes and modular circuit design. Current systems already demonstrate universal FTQC below the error‑correction threshold, making million‑gate computations on thousands of logical qubits a near‑term reality.
Since its debut in 1994, Shor’s algorithm has served as the benchmark driving quantum computation toward scale—proving superpolynomial speedups and motivating innovations in error correction, resource optimization, and reconfigurable hardware. Its continuing refinement now signals that full‑scale, industrially relevant quantum computing is within reach. https://lnkd.in/eebV4-3a https://lnkd.in/ew4fG-j7 https://lnkd.in/emPME-dJ https://lnkd.in/dszZZNwT https://lnkd.in/eCBUgM7j https://lnkd.in/eeJ3u_H9 https://lnkd.in/eGAe3Zb3 listen to the podcast: https://lnkd.in/e_u_NfCJ
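The qLDPC overhead claim can be sanity-checked with simple arithmetic. A sketch under assumed parameters (a distance-25 surface code at ~2d² physical qubits per logical qubit, versus a block code at the quoted ~30% encoding rate); note this naive ratio ignores the qLDPC block's own ancilla and routing costs, so it overstates the full-system saving:

```python
def physical_per_logical_surface(d):
    # Surface code: one logical qubit per patch of roughly 2*d*d
    # physical qubits (data plus measurement ancillas).
    return 2 * d * d

def physical_per_logical_qldpc(rate):
    # High-rate qLDPC block: rate = k/n logical qubits per physical
    # qubit, so the per-logical cost is simply 1/rate.
    return 1 / rate

surface = physical_per_logical_surface(25)   # 1250 physical per logical
qldpc = physical_per_logical_qldpc(0.30)     # ~3.3 physical per logical
print(f"surface: {surface}, qLDPC: {qldpc:.1f}, naive saving: {surface / qldpc:.0f}x")
```

The raw encoding-rate gap is even larger than "two orders of magnitude"; the quoted figure reflects that real systems must also pay for syndrome extraction, logical operations, and qubit movement.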

  • Rohan Pinto

    Ξ CTO / Founder / 1Kosmos / Security Architect / Blockchain / Identity Management Maven / Cryptography Geek / Investor / Author

    19,165 followers

Google’s quantum researcher Craig Gidney has significantly reduced the estimated quantum resources needed to break RSA-2048 encryption. In his latest paper, he outlines a method to factor a 2048-bit RSA key using fewer than one million noisy qubits, a substantial decrease from the previous estimate of 20 million qubits in 2019. While the runtime for factoring RSA-2048 has increased to less than a week compared to the previous estimate of eight hours, the dramatic reduction in qubit requirements makes the prospect of breaking RSA encryption more feasible as quantum hardware advances. Gidney emphasizes the urgency of transitioning to quantum-resistant cryptographic systems. He suggests deprecating vulnerable systems after 2030 and disallowing them entirely by 2035 to mitigate potential security risks. This development underscores the accelerating progress in quantum computing and its potential to compromise current cryptographic standards. Organizations and governments are advised to proactively adopt post-quantum cryptography to safeguard sensitive information against future quantum threats. Read all about it here: https://lnkd.in/geyPiYE4 #QuantumComputing #CyberSecurity #PostQuantum #RSA2048 #GoogleAI #AI #Cryptography #encryption #RSA

  • Hrant Gharibyan, PhD

    CEO @ BlueQubit | PhD Stanford

    14,201 followers

    🔐 Breaking RSA with ~1M physical qubits? That’s the breakthrough outlined in a recent paper by Craig Gidney at Google: 📄 https://lnkd.in/dQZuNaHt The work proposes optimized circuit constructions and error correction layouts that reduce the qubit requirements for factoring RSA-2048 from ~20 million (2019 estimates) to just 1 million physical qubits—a 20× improvement. This dramatically shifts the horizon for practical quantum attacks on today’s cryptographic standards. ⚠️ If validated, these results substantially accelerate the urgency for quantum readiness—not in theory, but in practice. At BlueQubit, we're focused on developing quantum software solutions that help enterprises and defense organizations prepare for and transition to the post-quantum era. That means tools for identifying cryptographic risk, supporting hybrid classical-quantum architectures, and integrating quantum solutions into existing workflows. 🚀 Algorithmic advances like this reshape timelines, risk models, and strategic priorities. For sectors with long data retention or sensitive infrastructure, now is the time to take quantum threats seriously—and plan accordingly. 🛡️ #QuantumComputing #PostQuantumCryptography #Cybersecurity #QuantumReadiness #BlueQubit #ShorAlgorithm #PQCTools #EnterpriseSecurity #DefenseTech

  • Cecile M. Perrault

    Director of Innovation & Partnerships @ Alice & Bob | European Quantum Strategy Leader | VP at QuIC | DeepTech–Policy Bridge | Board Member | Bridging Industry, Research & EU Sovereignty

    5,962 followers

Massive Progress in Quantum Error Correction: Tackling QEC Overhead
In just the last 10 days, quantum error correction (#QEC) has taken a major leap forward, bringing fault-tolerant quantum computing (#FTQC) closer than ever. When it became accepted in 2023-2024 that FTQC was the only path for large-impact quantum computing, we also accepted that the machines would be bigger. A LOT BIGGER. This is because QEC has traditionally introduced significant overhead. However, last week was full of good news: recent advancements show how quickly QEC overhead can be reduced, making large-scale quantum systems more practical and accessible. Here are two announcements worth looking at and an insight to take into account:
From 20 Million to Less Than 1 Million Qubits: In 2019, Google projected that 20 million qubits would be required for fault-tolerant 2048-bit RSA factoring. Now, thanks to improved QEC techniques, this requirement has dropped to under 1 million qubits!
QuEra's Neutral Atom Array Paper: QuEra Computing Inc.’s work shows how neutral atom arrays can reduce overhead in QEC, completing RSA factoring in just 5.6 days with 19 million qubits. This represents a remarkable reduction from previous estimates.
Shor's Algorithm as the Benchmark: With these advances, Shor's algorithm is fast becoming a de facto benchmark for testing the true capabilities of quantum error correction. As we reduce the qubit overhead and increase error correction efficiency, Shor’s algorithm provides a clear metric for measuring progress toward fault tolerance in quantum systems.
The progress in QEC over the last 10 days is mind-blowing. By reducing the overhead required for error correction, we are accelerating the path to practical, fault-tolerant quantum systems. The rapid pace of innovation promises that FTQC is no longer a distant dream: it's becoming increasingly achievable.
#QuantumComputing #QuantumErrorCorrection #FTQC #ShorsAlgorithm #QuEra #Google #TechAdvancements #QuantumTech
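One way to see why QEC overhead estimates can fall so quickly: under the standard below-threshold scaling ansatz p_L ≈ A(p/p_th)^((d+1)/2), modest improvements in the physical error rate p sharply reduce the code distance d, and hence the ~2d² physical qubits per logical qubit, needed for a target logical error rate. The constants below (A = 0.1, p_th = 1%) are assumptions for illustration only:

```python
def distance_needed(p, p_target, p_th=0.01, A=0.1):
    """Smallest odd surface-code distance d such that the logical error
    rate A * (p / p_th) ** ((d + 1) / 2) falls below p_target.
    The scaling ansatz is standard; A and p_th are assumed values."""
    d = 3
    while A * (p / p_th) ** ((d + 1) / 2) > p_target:
        d += 2
    return d

# Target a 1e-12 logical error rate at a few physical error rates p:
# better physical qubits shrink the distance, and the ~2*d*d footprint
# per logical qubit shrinks quadratically with it.
for p in (5e-3, 2e-3, 1e-4):
    d = distance_needed(p, p_target=1e-12)
    print(f"p={p:g}: distance {d}, ~{2 * d * d} physical qubits per logical qubit")
```

Halving the physical error rate below threshold more than halves the required distance in this toy model, which is why hardware and code improvements compound so dramatically in resource estimates.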
