Challenges in Reproducing Early Quantum Technologies


Summary

The challenges in reproducing early quantum technologies refer to the difficulties scientists and engineers face when trying to create quantum devices that consistently perform as expected, especially as these systems grow in complexity. Quantum computers and related technologies are incredibly sensitive to their environment, making it tough to scale and reliably reproduce their results compared to classical computers.

  • Address material issues: Focus on improving fabrication techniques and materials to reduce inconsistencies in quantum devices, like tackling grain boundary grooving in superconducting qubits.
  • Advance error correction: Invest in developing error correction methods and calibration workflows to minimize noise and boost the reliability of quantum operations.
  • Modernize training and tools: Encourage a new generation of engineers to master multidisciplinary skills and update engineering toolkits for quantum-scale challenges, including tight integration with classical systems.
Summarized by AI based on LinkedIn member posts
  • View profile for Michaela Eichinger, PhD

    Product Solutions Physicist @ Quantum Machines | I talk about quantum computing.

    16,214 followers

    Why can’t we scale superconducting qubits like transistors? Qubits, even those on the same chip or wafer, often show big frequency variations. Here’s the thing: qubit frequency is directly tied to the Josephson Junction (JJ), the core circuit component in superconducting qubits. And while we’ve mastered transistor fabrication at nanometer precision, JJs remain a challenge. Why?

    Turns out, the issue isn’t what you’d expect. It’s something rarely discussed: 𝗚𝗿𝗮𝗶𝗻 𝗕𝗼𝘂𝗻𝗱𝗮𝗿𝘆 𝗚𝗿𝗼𝗼𝘃𝗶𝗻𝗴.

    A Josephson Junction is a trilayer (Al-AlOx-Al), typically made by oxidizing the bottom aluminum layer before depositing the top one. The problem is that aluminum grains form grooves at their boundaries. The oxide layer inherits this roughness, leading to an uneven thickness across the barrier. And that’s where the chaos begins: the barrier thickness sets the critical current, which in turn dictates the qubit frequency. Even tiny variations in the AlOx barrier have a big impact on hitting target frequencies.

    𝗦𝗼, 𝗵𝗼𝘄 𝗱𝗼 𝘄𝗲 𝗳𝗶𝘅 𝗶𝘁? We have quite a few levers to pull. For instance,

    • 𝗙𝗹𝘂𝘅 𝗧𝘂𝗻𝗮𝗯𝗶𝗹𝗶𝘁𝘆: We design the qubit as a SQUID loop to tune the frequency using magnetic flux. This has become the state-of-the-art architecture; however, it adds to the wiring overhead (one line per qubit).

    • 𝗣𝗼𝘀𝘁-𝗙𝗮𝗯𝗿𝗶𝗰𝗮𝘁𝗶𝗼𝗻 𝗧𝗿𝗶𝗺𝗺𝗶𝗻𝗴: We can use techniques like Laser Annealing to permanently trim the junction resistance 𝘢𝘧𝘵𝘦𝘳 fabrication. This allows us to "edit" qubits to hit their target frequency.

    • 𝗕𝗲𝘁𝘁𝗲𝗿 𝗠𝗮𝘁𝗲𝗿𝗶𝗮𝗹𝘀: The field is relentlessly trying to improve the hardware stack. One example is growing epitaxial aluminum films. It’s the superior physical solution, but currently expensive and difficult to integrate into standard fabrication workflows.

    What are you doing to improve qubit reproducibility? Applied Materials imec Quantum Foundry Copenhagen IQM Quantum Computers Infineon Technologies Intel Foundry TSMC
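    The sensitivity described in the post can be sketched numerically. The toy model below is illustrative only (the charging energy, nominal Josephson energy, barrier thickness, and tunneling decay length are all assumed, representative orders of magnitude, not data from any device): it combines an exponential dependence of the Josephson energy on barrier thickness with the standard transmon frequency approximation to show how a sub-angstrom groove shifts the qubit frequency.

```python
import math

# Toy model (illustrative numbers only, not device data): how AlOx
# barrier-thickness variation from grain-boundary grooving propagates
# to the transmon qubit frequency.

h = 6.62607015e-34        # Planck constant, J*s
E_C = 250e6 * h           # assumed charging energy, ~250 MHz
d0 = 0.08e-9              # assumed tunneling decay length, ~0.8 angstrom

def josephson_energy(d, E_J_ref, d_ref):
    """Tunnel critical current (and hence E_J) falls off roughly
    exponentially with barrier thickness d."""
    return E_J_ref * math.exp(-(d - d_ref) / d0)

def transmon_f01(E_J, E_C):
    """Standard transmon approximation: h*f01 = sqrt(8*E_J*E_C) - E_C."""
    return (math.sqrt(8 * E_J * E_C) - E_C) / h

d_ref = 1.0e-9            # assumed nominal 1 nm barrier
E_J_ref = 12e9 * h        # assumed nominal E_J, ~12 GHz

f_nominal = transmon_f01(josephson_energy(d_ref, E_J_ref, d_ref), E_C)
f_grooved = transmon_f01(josephson_energy(d_ref + 0.01e-9, E_J_ref, d_ref), E_C)

print(f"nominal f01:          {f_nominal / 1e9:.3f} GHz")
print(f"+0.1 angstrom groove: {f_grooved / 1e9:.3f} GHz")
```

    In this toy model a 0.1 angstrom thickness change moves the qubit frequency by a few hundred MHz, which illustrates why uncontrolled grooving scatters frequencies across a wafer.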

  • View profile for Mykola Maksymenko

    Co-founder & CTO, Haiqu | Scaling Quantum & AI for Real-World Impact | Deep-Tech R&D & Commercialization

    8,299 followers

    To understand real momentum in #quantum, compare the last 2–3 years, not the last 2–3 months. The progress is inspiring, but are we close to any kind of inflection point?

    𝗪𝗵𝗲𝗻 Richard Givhan 𝗮𝗻𝗱 𝗜 𝘀𝘁𝗮𝗿𝘁𝗲𝗱 𝗛𝗮𝗶𝗾𝘂 𝗶𝗻 𝗲𝗮𝗿𝗹𝘆 𝟮𝟬𝟮𝟯 𝗶𝗻 Creative Destruction Lab, 𝗾𝘂𝗮𝗻𝘁𝘂𝗺 𝗰𝗼𝗺𝗽𝘂𝘁𝗶𝗻𝗴 𝘄𝗮𝘀 𝗹𝗮𝗿𝗴𝗲𝗹𝘆 𝗮 𝘄𝗼𝗿𝗹𝗱 𝗼𝗳 𝘁𝗼𝘆-𝘀𝗰𝗮𝗹𝗲 𝗽𝗿𝗼𝗼𝗳𝘀 𝗼𝗳 𝗰𝗼𝗻𝗰𝗲𝗽𝘁: few-qubit algorithms on simulators, and “real hardware” demos (often limited to tens-of-qubits devices with unstable performance) where the goal was to confirm the ability to extract any signal from the noise rather than to solve anything practical.

    𝗔 𝗳𝗲𝘄 𝗿𝗲𝗮𝗹-𝗹𝗶𝗳𝗲 𝗮𝗻𝗲𝗰𝗱𝗼𝘁𝗲𝘀 𝗼𝗳 𝘁𝗵𝗮𝘁 𝘁𝗶𝗺𝗲. In one of our early benchmarks, a publicly available QPU produced nearly random noise with no sign of its declared performance specs. As we later learned, the device's cooling system was broken, causing significant thermal noise. On another public device, the algorithm's fidelity fluctuated 2x between calibration cycles. 𝗧𝗵𝗲 𝗼𝘂𝘁𝗹𝗼𝗼𝗸 𝗳𝗼𝗿 𝗿𝘂𝗻𝗻𝗶𝗻𝗴 𝘀𝗼𝗺𝗲𝘁𝗵𝗶𝗻𝗴 𝗽𝗿𝗮𝗰𝘁𝗶𝗰𝗮𝗹 𝗶𝗻 𝘁𝗵𝗶𝘀 𝘀𝗲𝘁𝘁𝗶𝗻𝗴 𝗳𝗲𝗹𝘁... 𝗱𝗶𝘀𝘁𝗮𝗻𝘁.

    At the same time, some of the one-off “quantum supremacy” experiments were already hinting at a different path. Even on these noisy QPUs, very shallow circuits can create entangled states that are hard to reproduce classically. The obvious question is: can such states be utilised for any useful computation, without the need for handcrafted deep circuits that hardware noise destroys? This reminds me of early #perception #AI systems: millions of lines of handcrafted logic in computer vision or signal processing were replaced by comparatively “shallow” neural nets, once the right training infrastructure and software stack emerged.

    ⏩ 𝗜𝗻 𝗷𝘂𝘀𝘁 𝗮 𝗰𝗼𝘂𝗽𝗹𝗲 𝗼𝗳 𝘆𝗲𝗮𝗿𝘀: 𝟭𝟬𝟬+ 𝗾𝘂𝗯𝗶𝘁 𝗱𝗲𝘃𝗶𝗰𝗲𝘀 𝗮𝗿𝗲 𝗻𝗼𝘄 𝗿𝗼𝘂𝘁𝗶𝗻𝗲𝗹𝘆 𝗮𝗰𝗰𝗲𝘀𝘀𝗶𝗯𝗹𝗲 (big thanks to IBM Quantum for this move), and 𝘄𝗲’𝘃𝗲 𝘀𝗲𝗲𝗻 𝗮 𝘀𝘂𝗿𝗴𝗲 𝗼𝗳 𝗹𝗮𝗿𝗴𝗲-𝘀𝗰𝗮𝗹𝗲 𝗾𝘂𝗮𝗻𝘁𝘂𝗺 𝗲𝘅𝗽𝗲𝗿𝗶𝗺𝗲𝗻𝘁𝘀 𝗶𝗻 𝗽𝗵𝘆𝘀𝗶𝗰𝘀, 𝗰𝗵𝗲𝗺𝗶𝘀𝘁𝗿𝘆, 𝗼𝗽𝘁𝗶𝗺𝗶𝘇𝗮𝘁𝗶𝗼𝗻, etc. Many of these applications are heuristic and shallow-circuit by design.

    However, running a reliable experiment at utility scale is still hard. Reproducibility, noise, calibration, and cost still limit quantum runs at that scale. That’s the gap we’re closing at Haiqu: 𝘁𝘂𝗿𝗻𝗶𝗻𝗴 𝗲𝘅𝗲𝗰𝘂𝘁𝗶𝗼𝗻 𝗼𝗳 𝗮𝗹𝗴𝗼𝗿𝗶𝘁𝗵𝗺𝘀 𝗼𝗻 𝗤𝗣𝗨𝘀 𝗶𝗻𝘁𝗼 𝗮 𝗿𝗲𝗽𝗲𝗮𝘁𝗮𝗯𝗹𝗲, 𝗯𝘂𝗱𝗴𝗲𝘁𝗮𝗯𝗹𝗲 𝘄𝗼𝗿𝗸𝗳𝗹𝗼𝘄 𝘄𝗶𝘁𝗵 𝗽𝗿𝗲𝗱𝗶𝗰𝘁𝗮𝗯𝗹𝗲 𝗵𝗶𝗴𝗵 𝗽𝗲𝗿𝗳𝗼𝗿𝗺𝗮𝗻𝗰𝗲. We’re doubling down on making this capability accessible to many more researchers and engineers. Even if reliable quantum hardware appears tomorrow, applications for broad commercial adoption still need to be discovered. The inflection point is when prototyping becomes fast and cheap enough to validate practical use cases at scale.
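    The calibration-cycle swings described above suggest a simple reproducibility guardrail: re-run a fixed benchmark circuit after each calibration cycle and flag sessions whose fidelity drifts beyond a tolerated ratio. The sketch below is hypothetical (the fidelity values and the 1.2x threshold are made-up illustrations, not Haiqu's method):

```python
# Hypothetical guardrail for run-to-run reproducibility. In practice
# `fidelities` would come from re-running one fixed benchmark circuit
# after each calibration cycle; the values below are invented.

def drift_ratio(fidelities):
    """Ratio between the best and worst observed fidelity. A 2x ratio
    is the kind of calibration-cycle swing described in the post."""
    return max(fidelities) / min(fidelities)

def is_reproducible(fidelities, max_ratio=1.2):
    """Accept a device session only if fidelity stays within max_ratio."""
    return drift_ratio(fidelities) <= max_ratio

stable = [0.91, 0.89, 0.90, 0.92]       # hypothetical healthy device
drifting = [0.88, 0.61, 0.44, 0.85]     # hypothetical 2x swing

print(is_reproducible(stable))
print(is_reproducible(drifting))
```

    The same check generalizes to any scalar benchmark (success probability, expectation-value error), which is why fixed reference circuits are a common way to compare device sessions.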

  • View profile for Keith King

    Former White House Lead Communications Engineer, U.S. Dept of State, and Joint Chiefs of Staff in the Pentagon. Veteran U.S. Navy, Top Secret/SCI Security Clearance. Over 16,000+ direct connections & 44,000+ followers.

    43,833 followers

    Quantum Computers Have Arrived—But Are They Useful Yet?

    Despite rapid advances in quantum computing, the technology has yet to prove its real-world utility beyond experimental applications. While hundreds of companies, from startups like Alice & Bob to tech giants like Microsoft, are racing to commercialize quantum devices, the key challenges of scalability and error rates remain unresolved.

    The Current State of Quantum Computing

    • Quantum Computers Are “Working”—But Not Practically: Quantum machines can perform calculations, but they haven’t yet demonstrated clear advantages over classical computers in solving real-world problems. “Some 10 years ago, quantum computing was mostly a lab experiment,” says Laurent Prost of Alice & Bob. Now an entire global ecosystem has emerged, but practical breakthroughs remain elusive.
    • Major Challenges: Size and Error Rates
      • Scalability Issues: Useful quantum computing requires thousands to millions of qubits, far more than today’s leading systems.
      • Quantum Errors: Qubits are highly unstable, meaning calculations quickly degrade due to noise and decoherence.
      • Trade-Off Between Size and Stability: Adding more qubits increases computational power, but also amplifies errors, requiring complex error correction.

    When Will Quantum Computers Become Useful?

    • Breakthroughs Needed in Fault Tolerance: Companies like IBM, Google, PsiQuantum, and Microsoft are working on error correction methods to make quantum systems more reliable. Without fault-tolerant qubits, quantum computers will remain stuck in the experimental phase.
    • Industry-Specific Applications Could Come First: Drug discovery, materials science, and financial modeling could see early quantum benefits, even before general-purpose quantum computing matures. Some firms are exploring hybrid quantum-classical approaches, using quantum computers to speed up select subproblems in classical algorithms.
    • Timeline Uncertain: Some experts believe practical quantum computing could arrive within a decade, while others remain skeptical, citing the complexity of large-scale quantum error correction.

    What’s Next?

    • Scaling Quantum Hardware: Companies are racing to build systems with 1,000+ qubits while reducing error rates.
    • Quantum Software & Algorithms: Developing new algorithms that leverage quantum advantages will be critical to unlocking real-world applications.
    • Government & Industry Investment: Nations like the U.S., China, and the EU are heavily funding quantum research to maintain technological and security advantages.

    While quantum computing is no longer just a theoretical pursuit, its practical impact remains unclear. The race is on to overcome technical hurdles and unlock the revolutionary potential of these once-exotic machines.
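    The hybrid quantum-classical pattern mentioned above can be sketched schematically: a classical optimizer steers circuit parameters while a quantum processor evaluates a cost function. In the toy below the QPU call is mocked by a classical cosine (a real workflow would execute a parameterized circuit and measure an expectation value); the gradient uses the parameter-shift rule, a standard way to estimate derivatives of quantum expectation values, and `hybrid_minimize` and its settings are invented for illustration.

```python
import math
import random

# Schematic hybrid quantum-classical loop: classical optimizer outside,
# (mocked) quantum cost evaluation inside.

def qpu_expectation(theta):
    """Stand-in for a QPU call. cos(theta) is the exact Z expectation
    after a single-qubit RY(theta) rotation, so the parameter-shift
    rule below gives the exact gradient for this mock."""
    return math.cos(theta)

def hybrid_minimize(steps=200, lr=0.3, seed=0):
    """Gradient descent using the parameter-shift rule:
    d<E>/dtheta = (E(theta + pi/2) - E(theta - pi/2)) / 2."""
    random.seed(seed)
    theta = random.uniform(0.1, math.pi)   # random initial parameter
    for _ in range(steps):
        grad = 0.5 * (qpu_expectation(theta + math.pi / 2)
                      - qpu_expectation(theta - math.pi / 2))
        theta -= lr * grad
    return theta, qpu_expectation(theta)

theta, cost = hybrid_minimize()
print(f"theta = {theta:.3f}, cost = {cost:.4f}")  # minimum at theta = pi
```

    Only the inner `qpu_expectation` call needs quantum hardware; everything else is ordinary classical optimization, which is why this division of labor suits today's noisy, shallow-circuit devices.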

  • View profile for Dikla Levi

    Data Center and Lab Design Expert

    13,135 followers

    We’re still trying to build tomorrow’s machines with yesterday’s toolboxes. As quantum computing shifts from lab prototypes to scalable systems, one issue keeps coming up across the industry: our current training and engineering frameworks aren’t built for quantum-scale challenges. Experts repeatedly point to the same gaps:

    * Workforce skills aren’t aligned. Quantum development requires a blend of physics, materials science, cryogenics, microwave engineering, and system-level thinking, a mix traditional programs don’t yet produce at scale.
    * Engineering toolchains need to evolve. Scaling superconducting qubits demands wafer-level fabrication, ultra-low-noise environments, and error-mitigation workflows far beyond classical hardware norms.
    * Hybrid systems are the new standard. Quantum processors rely on tight integration with classical electronics, control systems, and software, requiring new models and methods.
    * Collaboration must accelerate. Moving quantum devices from research to manufacturable platforms depends on shared standards, shared data, and real industry-academia alignment.

    These aren’t future predictions. They are today’s bottlenecks. If we expect the next generation to build quantum-era technology, we need toolboxes designed for the quantum era, not the classical one. #QuantumComputing #DeepTech #QuantumEngineering #FutureOfTechnology #QuantumInfrastructure #NextGenInnovation

  • View profile for Abdulla Ahmed Alkaabi

    Senior Director, Technology Consulting @ PwC Middle East | AI & Digital Transformation Leader | MBZUAI Advisory Board | Cornell MBA

    7,491 followers

    Day 5: Overcoming Challenges in Quantum Computing

    Title: “Quantum Hurdles: Addressing the Challenges Ahead”

    As we explore the frontiers of quantum computing, it’s crucial to acknowledge and address the significant challenges that lie ahead. While the potential of quantum computing is vast, the path to realizing this potential is fraught with technical and conceptual hurdles. Here are some of the main challenges facing quantum computing today and the efforts underway to overcome them:

    Error Rates and Qubit Stability: Quantum bits, or qubits, are the fundamental building blocks of quantum computers. However, they are highly susceptible to errors caused by environmental interference. Researchers are actively developing error correction techniques and exploring more stable qubit technologies to mitigate these issues.

    Scalability: Building a quantum computer with a sufficient number of qubits to perform complex calculations is a monumental challenge. Scaling up requires innovations in quantum technology, including the development of new materials and architectures that can support thousands, or even millions, of qubits.

    Quantum Decoherence: The quantum state of qubits needs to be preserved long enough to perform calculations, but qubits tend to lose their quantum properties through a process called decoherence. Extending the coherence time of qubits is a critical area of research, involving advances in qubit isolation and control techniques.

    Software and Algorithms: While hardware challenges are significant, developing the software and algorithms capable of harnessing quantum computing’s power is equally crucial. This includes creating quantum programming languages, developing quantum algorithms, and simulating quantum systems on classical computers.

    Integration with Classical Systems: Even as quantum computers become more powerful, they will need to work in tandem with classical systems for the foreseeable future. Developing effective hybrid quantum-classical systems is essential for practical applications.

    Ethical and Security Implications: As quantum computing advances, it will also pose new ethical and security challenges, particularly in fields like cryptography. Preparing for a post-quantum cryptography world is essential to ensuring data security in the future.

    Call-to-Action: What do you see as the biggest challenge facing quantum computing? How can researchers, industries, and policymakers work together to overcome these hurdles? Join the discussion on navigating the path forward for quantum computing.
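    The decoherence constraint described above can be made concrete with a back-of-the-envelope budget. All numbers in the sketch are assumed, representative orders of magnitude (a T1 of 100 microseconds and 50 nanosecond gates), not figures from any particular device:

```python
import math

# Back-of-the-envelope decoherence budget: how many sequential gates
# fit before an excited qubit is likely to have relaxed. All numbers
# are assumed orders of magnitude for illustration.

T1 = 100e-6        # assumed energy-relaxation time: 100 microseconds
gate_time = 50e-9  # assumed two-qubit gate duration: 50 nanoseconds

def survival_probability(t, T1):
    """Probability that an excited qubit has not relaxed after time t
    (simple exponential T1 decay)."""
    return math.exp(-t / T1)

def gates_within_budget(T1, gate_time, min_survival=0.99):
    """How many sequential gates fit before survival drops below target."""
    max_t = -T1 * math.log(min_survival)
    return int(max_t / gate_time)

depth = gates_within_budget(T1, gate_time)
print(f"~{depth} gates before survival drops below 99%")
```

    Even with these optimistic assumptions, only a few dozen gates fit inside a 99% survival budget, which is why extending coherence times and building error correction are both treated as prerequisites for deep circuits.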
