Managing Risks in Scalable Quantum Architectures

Summary

Managing risks in scalable quantum architectures means identifying and addressing the cybersecurity, operational, and technological challenges that arise as quantum computers grow powerful enough to impact data protection and digital systems. As quantum computing advances, these risks center on protecting sensitive information that could be decrypted by future quantum machines and ensuring new quantum systems are stable, reliable, and secure as they scale up.

  • Catalog critical assets: Start by mapping out all current data, encryption methods, and systems to understand what could be vulnerable if quantum computers become capable of breaking today’s security.
  • Plan for crypto-agility: Build flexibility into your systems so that encryption methods can be updated or swapped out as quantum-safe standards evolve and mature.
  • Monitor technological milestones: Stay aware of key breakthroughs in quantum hardware and error correction so you can adjust your strategy as the technology matures toward practical use.
Summarized by AI based on LinkedIn member posts

  • Jen Easterly

    CEO, RSAC | Cyber + AI | Leader | Keynote Speaker | Innovator | #MoveFast&BuildThings

    125,462 followers

    🔐Word o’ the Day | Year | Decade: Crypto-agility, Baby! Yesterday morning, I did a fun fireside chat with Bethany Gadfield - Netzel at the FIA, Inc. Expo in Chicago. We talked about cyber resilience, artificial intelligence, Rubik’s cubes, and that thing called quantum! A question came up at the end: “What can firms actually do today to begin transitioning to post-quantum cryptography?” So I thought I’d take the opportunity to share my thoughts more broadly on this important, but not super well understood, topic:

    1. Don’t wait. The clock for quantum-safe cryptography is already ticking. NIST released its first set of post-quantum standards last year (https://lnkd.in/esTm8uPw) and CISA put out a “Strategy for Migrating to Automated Post-Quantum Discovery and Inventory Tools” last year as part of its broader Post-Quantum Cryptography (PQC) Initiative (https://lnkd.in/evpF4umv). h/t Garfield Jones, D.Eng.!

    2. Inventory & prioritize. Map all cryptographic usage: what keys, certificates, protocols, and data streams exist today? Which assets hold long-lived value and are at risk of “harvest now, decrypt later”? Build a migration roadmap that prioritizes the highest-risk systems (e.g., financial settlement platforms, inter-bank links, legacy encryption). (A sketch of this inventory step follows after the list.)

    3. Establish crypto-agility. Ensure your architecture supports swapping algorithms, updating certificates, and layering classical + post-quantum primitives without a full system rebuild. This kind of flexibility is key for resilience.

    4. Pilot and migrate. Use the new NIST-approved algorithms; experiment first on less time-sensitive systems, validate performance and interoperability, then scale to mission-critical applications. NIST’s IR 8547 report provides a framework for this transition.

    5. Vendor & supply-chain alignment. Ask your vendors and service providers: “What’s your PQC transition plan? When will you support NIST-approved post-quantum algorithms? Are your update paths crypto-agile?” If the answer isn’t clear or (as a former boss of mine used to say) they look at you like a “pig at a wristwatch,” you’ve got a potentially serious third-party risk.

    6. Board and exec engagement. Position this not as an IT problem but as a fiduciary risk and resilience imperative. The transition to quantum-safe cryptography is multi-year and multi-layered; waiting until it’s urgent means it will be too late.
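
    To make step 2 concrete, here is a minimal sketch of one slice of a cryptographic inventory: scanning a directory of PEM certificates and flagging quantum-vulnerable key types. It assumes the open-source Python `cryptography` package and a hypothetical `./certs` directory; production discovery tools (like those in CISA's strategy above) also cover TLS endpoints, SSH keys, code signing, and data at rest.

    ```python
    # Minimal illustration of one inventory pass: walk a directory of PEM
    # certificates and flag keys that Shor's algorithm would break. A real
    # inventory covers far more surfaces (TLS, SSH, HSMs, data at rest).
    from pathlib import Path

    from cryptography import x509
    from cryptography.hazmat.primitives.asymmetric import ec, rsa

    def inventory_certs(cert_dir: str) -> list[dict]:
        findings = []
        for pem in Path(cert_dir).glob("*.pem"):   # hypothetical layout
            cert = x509.load_pem_x509_certificate(pem.read_bytes())
            key = cert.public_key()
            if isinstance(key, rsa.RSAPublicKey):
                algo, detail = "RSA", f"{key.key_size}-bit"   # Shor-vulnerable
            elif isinstance(key, ec.EllipticCurvePublicKey):
                algo, detail = "ECC", key.curve.name          # Shor-vulnerable
            else:
                algo, detail = type(key).__name__, ""
            findings.append({
                "file": pem.name,
                "subject": cert.subject.rfc4514_string(),
                "algorithm": algo,
                "detail": detail,
                "quantum_vulnerable": algo in ("RSA", "ECC"),
            })
        return findings

    for finding in inventory_certs("./certs"):
        print(finding)
    ```

    The output of a pass like this feeds directly into the roadmap in step 2: anything flagged `quantum_vulnerable` that also protects long-lived data goes to the top of the migration queue.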

  • Prof Dr Ingrid Vasiliu-Feltes

    Quantum-AI Governance Expert I Deep Tech Diplomate I Investor & Tech Sovereignty Architect I Innovation Ecosystem Founder I Strategist I Cyber-Ethicist I Futurist I Board Chair & Advisor I Editor I Vice-Rector I Speaker

    51,801 followers

    EY’s perspective on securing against #quantum #risks emphasizes that quantum #computing is rapidly evolving from a theoretical concern into a material cybersecurity threat that requires immediate strategic action. The core issue lies in the vulnerability of widely used cryptographic algorithms, such as RSA and elliptic curve cryptography, which could be broken by sufficiently advanced quantum computers. This creates a systemic risk to sensitive data, including financial information, intellectual property, and personal records.

    A central concept highlighted is the “harvest now, decrypt later” threat model, in which adversaries collect encrypted data today with the intention of decrypting it in the future as quantum capabilities mature. This makes quantum risk a present-day problem, particularly for data requiring long-term confidentiality.

    EY stresses that organizations must adopt a proactive and structured approach to quantum readiness. A foundational step is to conduct a comprehensive cryptographic inventory, identify sensitive #data, and map existing #encryption methods. This enables organizations to assess which systems are most exposed and prioritize remediation efforts. Transitioning to post-quantum cryptography (PQC) is a complex, multi-year transformation that requires careful planning, integration into existing #technology roadmaps, and alignment with emerging standards. Organizations are encouraged to build crypto-agility, allowing them to adapt encryption methods as technologies and standards evolve.

    EY also highlights the importance of #governance, #compliance, and #workforce readiness. Quantum resilience requires enterprise-wide coordination, including policy development, regulatory alignment, continuous monitoring, and personnel training. EY frames quantum cybersecurity not just as a technical upgrade but as a strategic #transformation initiative. Organizations that act early can strengthen resilience, improve cyber maturity, and gain a competitive advantage, while those that delay risk long-term exposure to data breaches, regulatory challenges, and erosion of #digital #trust.

  • Marin Ivezic

    CEO Applied Quantum | PostQuantum.com | SANS Instructor | Former CISO, Big 4 Partner, Quantum Entrepreneur

    34,175 followers

    There’s already far too much "quantum prediction by vibes": bold dates with no underlying model, or a misleading one. If you want a Q-Day forecast driven by evidence, not intuition, here’s the update.

    Every week I see confident claims that RSA will fall in 2026, or 2065. Most come with no explanation beyond "trust me, I’m a cryptographer." I just released an updated (and renamed) version of my CRQC Quantum Capability Framework and the Q-Day Estimator: https://lnkd.in/g39Sk7QR

    This is the methodology that national security analysts, policy bodies, and researchers have relied on for years. I’ve now expanded it and rewritten parts based on new experiments, vendor roadmaps, and architectural advances. If you genuinely want to understand which quantum computing milestones matter, what needs to mature for a cryptographically relevant quantum computer, and how these capabilities reinforce or constrain each other, this is the guide.

    I cover:
    - Nine core capabilities every CRQC must achieve
    - Deep-dive explainers on QEC, syndrome extraction, thresholds, connectivity, magic states, decoders, continuous operation, scaling, algorithm integration...
    - One cross-cutting engineering capability (manufacturability) that determines if any of this can actually be built
    - Clear descriptions of interdependencies: how progress in one domain shifts requirements in another
    - Updated TRL assessments
    - Historical milestones, failure modes, "gotchas," and realistic bottlenecks
    - A direct mapping to the CRQC Readiness Benchmark (LQC / LOB / QOT) and the fully interactive Q-Day Estimator tool

    This is not a hype piece. The framework and individual capability articles are a long read by design: a technical map for those who actually want to reason about quantum risk instead of repeating soundbites.

    If you really want to track Q-Day properly, you need to know what to look for:
    - When a breakthrough in connectivity shifts logical-cycle times
    - When a new distillation protocol collapses overhead
    - When decoding latency becomes the bottleneck
    - When manufacturability overtakes physics as the gating factor
    - When end-to-end integration becomes possible in practice
    - When engineering constraints matter more than lab demos

    The updated framework helps you recognize these signals and understand what they mean for cryptographic timelines.

    And if you want to play with the numbers, the interactive CRQC Readiness Benchmark & Q-Day Estimator is here: https://lnkd.in/gkcbwuNB

    Again, the full framework is available here: https://lnkd.in/g39Sk7QR

    Happy to discuss, debate, or collaborate. This is an open, evolving piece of work, and the community benefits when we reason from evidence, not vibes. #QDay #Y2Q #PQC #QuantumSecurity #QuantumReadiness #QuantumResilience #QuantumResistance #CRQC
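
    For a feel of the kind of capability arithmetic such a framework formalizes, here is a back-of-the-envelope sketch; this is a generic surface-code scaling heuristic, not Ivezic's actual estimator. All numbers are illustrative assumptions: roughly 6,000 logical qubits for RSA-2048 (the order of magnitude of published resource estimates), a ~1% error threshold, and a 1e-12 target logical error rate per cycle.

    ```python
    # Toy surface-code sizing: how the physical error rate alone swings the
    # resource bill for a CRQC. Illustrative assumptions only, NOT the CRQC
    # Quantum Capability Framework's model.
    def code_distance(p_phys: float, p_th: float = 1e-2,
                      p_target: float = 1e-12) -> int:
        """Smallest odd distance d with 0.1*(p/p_th)^((d+1)/2) <= p_target."""
        d = 3
        while 0.1 * (p_phys / p_th) ** ((d + 1) / 2) > p_target:
            d += 2
        return d

    LOGICAL_QUBITS = 6_000              # assumed ballpark for RSA-2048
    for p_phys in (5e-3, 1e-3, 1e-4):   # physical two-qubit error rates
        d = code_distance(p_phys)
        phys_per_logical = 2 * d * d    # rough surface-code overhead
        total = LOGICAL_QUBITS * phys_per_logical
        print(f"p={p_phys:.0e}: distance {d}, "
              f"~{phys_per_logical} physical/logical, "
              f"~{total / 1e6:.1f}M physical qubits")
    ```

    Even this toy version shows why single-date predictions without a model are meaningless: an order-of-magnitude improvement in physical error rates shifts the required machine size by more than an order of magnitude.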

  • Davide Maniscalco

    Head of Legal, Regulatory & Data Privacy Officer | Special Adv DFIR | Auditor ISO/IEC 27001| 27701 | 42001 | CBCP | Italian Army (S.M.O.M.) Reserve Officer ~ OF-2 |

    19,803 followers

    A recent comprehensive study issued by the German Federal Office for Information Security (BSI) on the Status of #Quantum #Computer #Development provides a sober, evidence-based assessment of progress, risks, and timelines, particularly relevant for #cryptography, #cybersecurity, and strategic planning, with a focus on applications in #cryptanalysis. Key takeaways:

    • Quantum advantage is real, but still narrow. Quantum computers have demonstrated advantage only on highly specialized benchmark problems. Broad, application-relevant superiority remains out of reach.

    • Cryptography is the primary strategic risk driver. Shor’s algorithm continues to pose a credible long-term threat to RSA and elliptic-curve cryptography, while symmetric cryptography (e.g., AES) remains comparatively resilient with appropriate key lengths.

    • Fault tolerance is the true bottleneck. Error rates, not qubit counts, are the dominant constraint. Scalable, fault-tolerant quantum computing requires massive overheads in error correction and infrastructure.

    • Leading hardware platforms are converging. Superconducting qubits, trapped ions, and neutral atoms (Rydberg) currently lead the field, with rapid progress but no clear single winner.

    • #NISQ systems are not a near-term cryptographic threat. Noisy Intermediate-Scale Quantum (NISQ) devices lack the depth and reliability needed for meaningful cryptanalysis, despite frequent hype.

    • A realistic timeline is emerging. Based on verified advances in error correction, a cryptographically relevant quantum computer may be achievable in ~10–15 years: not decades away, but not imminent either.

    • “Harvest now, decrypt later” remains a credible risk. Sensitive data encrypted today may be vulnerable in the future, reinforcing the urgency of post-quantum cryptography migration.

    • Security preparedness must start now. Transition planning, crypto-agility, standards development, and quantum-readiness assessments are no longer optional for governments and critical sectors.

    👉 Bottom line: quantum computing is progressing steadily, not explosively, but its long-term implications for cybersecurity and digital trust demand early, structured, and risk-based action today. https://lnkd.in/eMui-D_W
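
    The resilience of symmetric ciphers in the second takeaway comes down to Grover's algorithm offering only a quadratic speedup over brute-force search. A one-line worked calculation (standard textbook reasoning, not taken from the BSI report itself) shows why doubling the key length restores the margin:

    ```latex
    % Grover search over an n-bit key space needs ~sqrt(2^n) iterations:
    N_{\text{Grover}} \approx \sqrt{2^{n}} = 2^{n/2}
    \qquad\Rightarrow\qquad
    \text{AES-128}: \sim 2^{64}\ \text{iterations (uncomfortable margin)},\quad
    \text{AES-256}: \sim 2^{128}\ \text{iterations (infeasible)}
    ```

    This is why "appropriate key lengths" in practice means preferring AES-256 for long-lived data, while RSA and ECC, facing Shor's exponential speedup, must be replaced outright.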

  • Keith King

    Former White House Lead Communications Engineer, U.S. Dept of State, and Joint Chiefs of Staff in the Pentagon. Veteran U.S. Navy, Top Secret/SCI Security Clearance. Over 16,000+ direct connections & 44,000+ followers.

    43,861 followers

    Harvard Achieves Breakthrough Toward Fault-Tolerant Quantum Computing

    Introduction
    Harvard physicists have unveiled a major leap in quantum error correction, demonstrating an integrated, scalable neutral-atom architecture that brings fault-tolerant quantum computing significantly closer. This Nature-published work combines error detection, mid-circuit correction, and universal gate operations in a 448-qubit system, addressing the fragility long considered the field’s greatest barrier.

    Key Advances and Technical Highlights
    • The system uses laser-cooled rubidium atoms arranged in optical tweezers, enabling dynamic reconfiguration and highly controlled quantum operations.
    • Harvard achieved logical operation error rates below 0.5 percent, surpassing widely recognized thresholds for scalable quantum error correction.
    • Quantum teleportation allows identification and removal of errors without stopping computation, functioning like surgical repair during live operation.
    • Mid-circuit measurements and real-time feedback stabilize qubits against environmental noise, historically the Achilles’ heel of quantum hardware.
    • The architecture integrates detection, correction, and universal computation in one platform, validating concepts first proposed by Shor three decades ago.
    • This work builds on Harvard’s earlier continuously operating machine, now expanded to hundreds of qubits while preserving coherence at scale.
    • The design provides a path to the thousands of logical qubits required for true quantum advantage.

    Industry Context and Strategic Implications
    • Harvard’s neutral-atom system advances alongside global competitors: Google’s Willow chip, Quantinuum’s modular Helios, and major government investments across the US and UK.
    • The research aligns with a rising global race, featured at the Chicago Quantum Summit and reinforced through MIT-Harvard collaborations on multi-thousand-qubit systems.
    • Error-corrected architectures accelerate timelines for applications in drug discovery, advanced materials, logistics optimization, cryptography, and financial modeling.
    • As fault tolerance becomes achievable, quantum systems will hasten the need for post-quantum encryption and reshape cybersecurity strategy.
    • Investors and scientific leaders view 2025 as an inflection point, with quantum technologies poised to influence markets nearing a projected trillion-dollar scale by 2035.

    Why This Breakthrough Matters
    Harvard’s achievement validates a long-sought blueprint for fault-tolerant quantum computing, turning theoretical constructs into a functioning, scalable system. By demonstrating stable computation within a live error-correcting architecture, the work meaningfully shortens the timeline to practical quantum machines. The implications span national security, economic competitiveness, scientific discovery, and the future architecture of global computing.

    Keith King https://lnkd.in/gHPvUttw
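
    The sub-threshold error rates in the highlights are the crux of why this matters: below a code's threshold, adding redundancy (larger code distance d) suppresses logical errors exponentially; above it, redundancy makes things worse. A minimal numeric sketch with illustrative surface-code numbers, not figures from the Nature paper:

    ```python
    # Why sub-threshold error rates matter (illustrative numbers only):
    # below threshold, raising the code distance d suppresses logical errors;
    # above threshold, the same redundancy amplifies them.
    P_TH = 0.01  # commonly quoted surface-code threshold (~1%)

    def logical_error(p_phys: float, d: int) -> float:
        # Standard heuristic scaling for the logical error rate per cycle.
        return 0.1 * (p_phys / P_TH) ** ((d + 1) // 2)

    for p in (0.005, 0.015):  # one physical rate below threshold, one above
        trend = ", ".join(f"d={d}: {logical_error(p, d):.1e}"
                          for d in (3, 5, 7, 9))
        print(f"p={p}: {trend}")
    ```

    Running it shows the two regimes: at p=0.005 the logical error rate falls with every increase in d, while at p=0.015 it grows, which is why crossing the threshold is treated as the watershed for scalability.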

  • Eviana Alice Breuss, MD, PhD

    Founder, President, and CEO @ Tengena LLC | Founder and President @ Avixela Inc | 2025 Top 30 Global Women Thought Leaders & Innovators

    8,243 followers

    QUANTUM COMPUTERS RECYCLE QUBITS TO MINIMIZE ERRORS AND ENHANCE COMPUTATIONAL EFFICIENCY

    Quantum computing represents a paradigm shift in information processing, with the potential to address computationally intractable problems beyond the scope of classical architectures. Despite significant advances in qubit design and hardware engineering, the field remains constrained by the intrinsic fragility of quantum states. Qubits are highly susceptible to decoherence, environmental noise, and control imperfections, leading to error propagation that undermines large-scale reliability.

    Recent research has introduced qubit recycling as a novel strategy to mitigate these limitations. Recycling involves the dynamic reinitialization of qubits during computation, restoring them to a well-defined ground state for subsequent reuse. This approach reduces the number of physical qubits required for complex algorithms, limits cumulative error rates, and increases computational density.

    In particular, Atom Computing’s AC1000 employs neutral atoms cooled to near absolute zero and confined in optical lattices. These cold-atom qubits exhibit extended coherence times and high atomic uniformity, properties that make them particularly suitable for scalable architectures. The AC1000 integrates precision optical control systems capable of identifying qubits that have degraded and resetting them mid-computation. This capability distinguishes it from conventional platforms, which often require qubits to remain pristine or be discarded after use.

    From an engineering perspective, minimizing errors and enhancing computational efficiency requires a multi-layered strategy. At the hardware level, platforms such as cold atoms, trapped ions, and superconducting circuits are being refined to extend coherence times, reduce variability, and isolate quantum states from environmental disturbances. Dynamic qubit management adds resilience, with recycling and active reset protocols restoring qubits mid-computation, while adaptive scheduling allocates qubits based on fidelity to optimize throughput. Error-correction frameworks remain central, combining redundancy with recycling to reduce overhead and enable fault-tolerant architectures. Algorithmic and architectural efficiency further strengthens performance through optimized gate sequences, hybrid classical-quantum workflows, and parallelization across qubit clusters. Looking ahead, metamaterials innovation, machine-learning-driven error mitigation, and modular metasurface architectures promise to accelerate progress toward scalable systems.

    The implications of qubit recycling and these complementary strategies are substantial. By enabling more complex computations with fewer physical resources, they can reduce hardware overhead and enhance reliability. This has direct relevance for domains such as cryptography, materials discovery, pharmaceutical design, and large-scale optimization.
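
    Atom Computing's control stack is proprietary, but the underlying primitive, mid-circuit measurement followed by reset, can be illustrated generically. A minimal sketch using Qiskit's reset instruction, which returns a qubit to |0⟩ mid-computation so the same physical qubit can serve two logical roles:

    ```python
    # Generic illustration of qubit reuse via mid-circuit measurement and
    # reset (a toy stand-in for recycling, not Atom Computing's protocol).
    from qiskit import QuantumCircuit

    qc = QuantumCircuit(1, 2)   # one physical qubit, two classical results
    qc.h(0)                     # first computation on the qubit
    qc.measure(0, 0)            # read out the first result mid-circuit
    qc.reset(0)                 # reinitialize the qubit to |0>
    qc.h(0)                     # reuse the same qubit for a second computation
    qc.measure(0, 1)
    print(qc.draw())
    ```

    On hardware with native mid-circuit reset, this is the primitive that recycling and active-reset protocols build on: one physical qubit yields two results, halving the qubit count for this (trivial) workload.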

  • Benjamin Scott, M.S.

    Director, Critical Infrastructure & OT Strategy & Programs - US Public Sector at Fortinet | Ohio Cyber Reservist | Adjunct Professor

    30,297 followers

    Quantum computing is advancing rapidly, bringing unprecedented processing power that threatens traditional encryption methods. The "collect now, decrypt later" strategy underscores the urgency of preparation: adversaries are already harvesting encrypted data with the intent to decrypt it once large-scale quantum computers become viable. Fortinet is leading the way in quantum-safe security, integrating NIST PQC algorithms, including CRYSTALS-KYBER, into FortiOS to safeguard data from future quantum-based attacks.

    "A recent real-world demonstration by JPMorgan Chase (JPMC) showcased quantum-safe high-speed 100 Gbps site-to-site IPsec tunnels secured using QKD. The test was conducted between two JPMC data centers in Singapore, covering over 46 km of telecom fiber, and achieved 45 days of continuous operation."

    "The network leveraged QKD vendor ID Quantique for the quantum key exchange, Fortinet’s FortiGate 4201F for network encryption, and FortiTester for performance measurement."

    This is not just a theoretical concern: organizations are already deploying quantum-safe encryption solutions. As quantum computing capabilities advance, organizations must adopt quantum-resistant security architectures and take proactive steps now to safeguard their sensitive information against future quantum-enabled attacks. These proactive methods include:

    - adopting hybrid cryptographic approaches, combining classical and PQC algorithms, ensuring interoperability and a phased transition (see the sketch after this list)
    - implementing crypto-agile architectures for seamless updates to encryption mechanisms as new quantum-resistant standards emerge
    - leveraging PQC-capable HSMs and TPMs
    - evaluating network security architectures, such as ZTNA models
    - ensuring authentication and access controls are resistant to quantum threats
    - identifying mission-critical and long-lived data that must remain secure for decades
    - implementing sensitivity-based classification to determine which datasets require the highest level of post-quantum protection
    - conducting risk assessments to evaluate data exposure, storage locations, and current encryption standards
    - transitioning to quantum-resistant encryption algorithms recommended by NIST’s PQC standardization efforts
    - establishing data-at-rest and data-in-transit encryption policies that mandate use of PQC algorithms as they become available
    - strengthening key management practices
    - developing GRC frameworks ensuring adherence to post-quantum security
    - implementing continuous cryptographic monitoring to detect and phase out vulnerable encryption methods
    - enforcing regulatory compliance by aligning with emerging PQC standards
    - establishing incident response plans to handle quantum-driven cryptographic threats proactively

    Fortinet remains committed to pioneering quantum-safe encryption solutions, enabling organizations to stay ahead of emerging cryptographic threats. Read more from Dr. Carl Windsor, Fortinet’s CISO!
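
    As a hedged sketch of the first item on that list, the snippet below derives one session key from both a classical X25519 exchange and an ML-KEM (Kyber) encapsulation, so an attacker must break both primitives. It assumes the open-source liboqs-python bindings (`oqs`) and the Python `cryptography` package, and illustrates the hybrid pattern only; it is not Fortinet's FortiOS implementation.

    ```python
    # Hybrid key establishment sketch: combine a classical and a post-quantum
    # secret through a KDF. Assumes liboqs-python ("oqs"); the algorithm name
    # string ("ML-KEM-768" vs "Kyber768") varies by liboqs version.
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF
    import oqs

    # Classical half: ephemeral X25519 Diffie-Hellman
    alice_x, bob_x = X25519PrivateKey.generate(), X25519PrivateKey.generate()
    classical_secret = alice_x.exchange(bob_x.public_key())

    # Post-quantum half: ML-KEM encapsulation against Bob's KEM public key
    with oqs.KeyEncapsulation("ML-KEM-768") as bob_kem:
        kem_public = bob_kem.generate_keypair()
        with oqs.KeyEncapsulation("ML-KEM-768") as alice_kem:
            ciphertext, pq_secret_alice = alice_kem.encap_secret(kem_public)
        pq_secret_bob = bob_kem.decap_secret(ciphertext)
    assert pq_secret_alice == pq_secret_bob

    # Combine: the session key depends on BOTH secrets
    session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                       info=b"hybrid-kex-demo").derive(
        classical_secret + pq_secret_alice)
    print(session_key.hex())
    ```

    The same concatenate-then-KDF pattern underlies the hybrid key-exchange groups now being standardized for TLS, which is what makes it a practical bridge during the phased transition the post describes.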

  • Albert Evans

    Director, Cybersecurity | CISO Advisory | OT/IT Convergence & AI Security | TCS

    9,751 followers

    The Quantum Security Imperative: Why Your 2025 Data Needs Protection Today

    If you’re still thinking quantum computing is a distant threat, you’ve already missed the window. Recent quantum security research from leading institutions emphasizes a critical reality: the “Harvest Now, Decrypt Later” threat is widely assessed by governments as a credible ongoing risk. Nation-state adversaries are believed to be harvesting encrypted traffic at scale. Once Cryptographically Relevant Quantum Computers arrive (estimated 2030-2035), Shor’s algorithm could retroactively decrypt previously harvested data.

    Mosca’s Theorem makes this concrete: if your data needs secrecy for 10 years, migration takes 5 years, and quantum arrives in 12 years, you’re already 3 years late (worked through in the sketch below). In healthcare, finance, and national security, that inequality has become a critical risk.

    The CASCADE Framework Applied:

    PEOPLE: Your teams need quantum literacy now. CISOs, architects, and developers must understand PQC implications. Start cross-functional quantum readiness teams today.

    DATA: Build your Cryptographic Bill of Materials. You can’t protect what you don’t inventory. Prioritize patient records, financial transactions, and trade secrets with 10+ years of confidentiality requirements.

    PROCESS: Implement crypto-agility as standard architecture. When algorithms break (like SIKE in 2022), you need to swap them without recompiling your stack. Embed PQC into procurement, development lifecycles, and vendor management.

    TECHNOLOGY: Deploy hybrid encryption now. Wrap data in both classical (ECC) and post-quantum (ML-KEM/Kyber) algorithms. NIST finalized FIPS 203, 204, and 205 in August 2024. Start piloting in non-production environments.

    BUSINESS: U.S. government directives, including NSM-10, mandate federal preparation and planning for PQC migration. Under GDPR and HIPAA, retroactive quantum decryption creates significant regulatory and liability risk. Board-level risk committees need PQC on the agenda now.

    The execution framework: Prevent (crypto-agility architecture, quantum-resistant algorithms, vendor PQC roadmaps), Detect (CBOM scanning, automated RSA/ECC discovery, traffic analysis), Recover (hybrid encryption, quantum-resistant backups, re-encryption strategies).

    Early PQC migration planning significantly reduces transition costs. In IoT-heavy industries (automotive, manufacturing, utilities), the cost of physical device replacement escalates exponentially with delay.

    The dual-track strategy: Offensive (pilot quantum computing for portfolio optimization, supply chain logistics, molecular simulation), Defensive (treat PQC migration as critical infrastructure).

    Bottom line: Quantum computing’s promise remains years away. The data-collection phase of the quantum threat is already active. What’s your organization’s crypto-agility roadmap? #QuantumComputing #Cybersecurity #PostQuantumCryptography #RiskManagement #CISO
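
    Mosca's inequality is simple enough to compute directly. A two-line sketch using the post's own numbers (10-year shelf life, 5-year migration, 12 years to a CRQC):

    ```python
    # Mosca's inequality: if shelf_life + migration_time exceeds the years
    # until a cryptographically relevant quantum computer, data harvested
    # today outlives its protection. Numbers below are the post's example.
    def mosca_gap(shelf_life: float, migration: float,
                  years_to_crqc: float) -> float:
        """Positive result = years by which you are already late."""
        return (shelf_life + migration) - years_to_crqc

    print(mosca_gap(shelf_life=10, migration=5, years_to_crqc=12))  # -> 3
    ```

    The useful part is running it per data class from the Cryptographic Bill of Materials: a dataset with a 2-year shelf life can wait, while one with a 25-year shelf life was late before the migration even started.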

  • Gaby Frangieh

    Finance, Risk Management and Banking - Senior Advisor

    29,934 followers

    Published in January 2025 in the British Actuarial Journal by Mr. Muhammad Amjad, and as per its abstract, this paper extends previous research on using quantum computers for risk management to a substantial, real-world challenge: constructing a quantum internal model for a medium-sized insurance company.

    Leveraging the author’s extensive experience as the former Head of Internal Model at a prominent UK insurer, the practical bottlenecks in developing and maintaining quantum internal models are closely examined. The work seeks to determine whether a #quadratic speedup through quantum #amplitude estimation can be realised for problems at an industrial scale. It also builds on previous work that explores the application of quantum computing to the problem of asset-liability management in an actuarial context. Finally, both the obstacles and the potential opportunities that emerge from applying quantum computing to the field of insurance risk management are examined.

    #riskmanagement #internalmodel #modelrisk #quantumcomputing #technology #futurerisk #insurancerisk #solvency #quantitativeriskmanagement #FRM #financialrisk #ALM #assetliabilitymanagement #actuarial #MRM #riskassessment #riskquantification #riskmeasurement #riskmitigation #modelgovernance #liquidityrisk #information #resources #knowledge
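
    The quadratic speedup at stake is a scaling claim: classical Monte Carlo error falls as 1/√N in the number of samples, while quantum amplitude estimation's error falls as roughly 1/N in the number of oracle queries. A tiny illustration of the textbook query counts (generic scaling, not figures from the paper):

    ```python
    # Sample complexity to reach a target standard error eps on an estimated
    # quantity (e.g., a tail-risk probability in an internal model):
    #   Monte Carlo:            error ~ 1/sqrt(N)  ->  N ~ 1/eps^2
    #   Amplitude estimation:   error ~ 1/N        ->  N ~ 1/eps
    for eps in (1e-2, 1e-3, 1e-4):
        n_mc = round(1 / eps**2)   # classical Monte Carlo samples
        n_qae = round(1 / eps)     # quantum amplitude estimation queries
        print(f"eps={eps:.0e}: Monte Carlo ~{n_mc:,} vs QAE ~{n_qae:,}")
    ```

    At eps = 1e-4 that is ~100 million classical samples versus ~10,000 quantum queries, which is exactly why the paper asks whether the speedup survives the practical bottlenecks of an industrial-scale internal model.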
