IBM Successfully Links Two Quantum Chips to Operate as a Single Device

Key Insights:
• IBM has linked two quantum chips to function as a single, cohesive system, enabling calculations beyond the capability of either chip independently.
• This accomplishment supports IBM's modular approach to building scalable quantum computers, a strategy aimed at overcoming the limitations of single-chip architectures.
• The linked chips demonstrated successful cooperation, a step toward larger and more powerful quantum systems capable of addressing complex real-world problems.

The Modular Quantum Computing Approach:
• IBM employs superconducting quantum chips, manufactured with processes similar to traditional semiconductor fabrication, which aids scalability and integration with existing hardware infrastructure.
• Modular quantum systems link smaller quantum processors rather than relying on a single massive chip, reducing fabrication challenges and improving scalability.
• This architecture lets multiple chips share quantum information seamlessly, paving the way for larger quantum systems without exponentially increasing hardware complexity.

Addressing Key Challenges in Quantum Computing:
• Scalability: Connecting multiple chips is a critical step toward scaling quantum computers to thousands or even millions of qubits.
• Error Reduction: Larger quantum systems are more susceptible to errors; modular architectures provide pathways for better error management and correction across linked processors.
• Coherence Across Chips: Maintaining delicate quantum states across separate chips is technically challenging, and IBM's success suggests progress on this issue.

Implications of IBM's Achievement:
• Enhanced Computational Power: Linked quantum chips unlock more complex simulations and problem-solving capabilities.
• Practical Quantum Applications: Industries such as pharmaceuticals, cryptography, and materials science may soon benefit from more robust and scalable quantum computing solutions.
• Competitive Advantage: IBM's progress underscores its leadership in modular quantum computing, positioning it strongly in the competitive quantum technology landscape.

Future Outlook:
IBM's successful demonstration of inter-chip quantum communication validates the modular strategy as a viable path to scaling up systems. Future advancements will likely focus on improving chip-to-chip communication fidelity, increasing the number of interconnected chips, and reducing overall error rates. This breakthrough brings us one step closer to practical, large-scale quantum computing systems capable of solving problems previously intractable for classical computers.
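The idea behind "error management and correction across linked processors" can be illustrated with the simplest quantum code, the three-qubit repetition code. The sketch below is a classical toy model of bit-flips only (my own illustrative functions, not IBM's actual scheme): redundancy plus majority voting pushes the logical error rate well below the physical one whenever the physical rate is small.

```python
import random

def encode(bit):
    # Repetition code: copy the logical bit onto three physical "qubits".
    return [bit, bit, bit]

def apply_noise(qubits, p):
    # Flip each physical bit independently with probability p.
    return [q ^ (random.random() < p) for q in qubits]

def decode(qubits):
    # Majority vote corrects any single bit-flip.
    return int(sum(qubits) >= 2)

def logical_error_rate(p, trials=100_000):
    # Estimate how often the majority vote still gets it wrong.
    errors = 0
    for _ in range(trials):
        if decode(apply_noise(encode(0), p)) != 0:
            errors += 1
    return errors / trials
```

With a physical flip probability of p = 0.05, the logical error rate is roughly 3p² ≈ 0.7%: decoding fails only when two or more bits flip, which is why redundancy helps whenever p is below the code's threshold.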
Scalable Quantum Systems Trends for the Next Five Years
Summary
Scalable quantum systems refer to quantum computers that can grow in size and power without running into fundamental obstacles, making them practical for solving problems that today’s classical computers cannot handle. Over the next five years, advancements such as linking multiple quantum chips, improving error-correction techniques, and deploying collaborative infrastructure are expected to push quantum technology from the lab to real-world industry applications.
- Invest in readiness: Begin educating your teams on quantum applications relevant to your industry so you’re positioned to benefit from new breakthroughs as they become available.
- Monitor integration progress: Keep an eye on developments in chip-to-chip connectivity and error correction, as these will determine when quantum computing moves from experimental to widespread practical use.
- Explore collaborative models: Look into partnerships or shared access programs, as international cooperation is becoming key to building robust quantum infrastructure and sharing expertise.
-
2026 is quantum's deployment year, and the infrastructure layer is wide open.

IBM just confirmed what we've been positioning for: quantum advantage hits next year. Not benchmarks. Not demos. Real problems that classical computing can't solve.

Here's what changed:

The hardware is crossing the utility threshold. IBM's Kookaburra system (1,386 qubits, 5,000+ gate operations) isn't a science project anymore. It's enterprise-ready infrastructure. That's the difference between "interesting" and "investable."

Sovereign capital is flooding in. The UK alone committed £1.67B through 2030, front-loaded. When governments move from grants to infrastructure budgets, the risk profile shifts. This is the semiconductor playbook, circa 1987.

The winner won't be the best qubit; it'll be the best integration layer. The real value capture happens in middleware: error correction, hybrid classical-quantum orchestration, and vertical-specific tooling. That's where our portfolio is concentrated.

The gap between "believers" and "deployers" is closing fast. The companies building quantum-ready workflows now, in pharma simulation, financial modeling, and materials discovery, will own their categories. Everyone else will rent.

LPs positioning today have 18 months of alpha. By the time quantum advantage is proven in production, institutional capital will reprice every adjacent market. We're seeing it already in our pipeline: deal flow quality is up 3x since Q3.

The question for allocators: are you funding quantum research, or are you capturing the infrastructure layer before it gets crowded? We're deploying into the latter. DM if you want the full thesis and access.
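The "hybrid classical-quantum orchestration" middleware mentioned above usually takes the shape of a variational loop: a classical optimizer repeatedly tunes circuit parameters against an expectation value measured on a quantum device. A minimal sketch under stated assumptions: the quantum step is stubbed out as a cosine (a one-qubit RY(θ) circuit measured in Z would give exactly ⟨Z⟩ = cos θ), and all function names are mine, not any vendor's API.

```python
import math

def quantum_expectation(theta):
    # Stand-in for running a parameterized circuit and measuring <Z>;
    # for one qubit after RY(theta), <Z> = cos(theta).
    return math.cos(theta)

def parameter_shift_gradient(theta, shift=math.pi / 2):
    # Parameter-shift rule: the exact gradient from two circuit runs.
    return (quantum_expectation(theta + shift)
            - quantum_expectation(theta - shift)) / 2

def minimize(theta=0.1, lr=0.4, steps=100):
    # Classical outer loop driving the "quantum" inner evaluations.
    for _ in range(steps):
        theta -= lr * parameter_shift_gradient(theta)
    return theta
```

Gradient descent converges toward θ = π, where ⟨Z⟩ = -1 is minimal. In a real deployment, `quantum_expectation` is the only piece that touches quantum hardware; everything else is the classical orchestration layer the post is describing.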
-
We may be standing at a moment in time for Quantum Computing that mirrors the 2017 breakthrough on transformers: a spark that ignited the generative AI revolution five years later. With recent advancements from Google, Microsoft, IBM, and Amazon in developing more powerful and stable quantum chips, the trajectory of QC is accelerating faster than many of us expected.

Google's Sycamore and next-gen Willow chips are demonstrating increasing fidelity. Microsoft's pursuit of topological qubits using Majorana particles promises longer coherence times, and IBM's roadmap is pushing toward modular, error-corrected systems. These aren't just incremental steps; they are setting the stage for scalable, fault-tolerant quantum machines.

Quantum systems excel at simulating the behavior of molecules and materials at atomic scale, solving optimization problems with exponentially large solution spaces, and modeling complex probabilistic systems, tasks that could take classical supercomputers millennia. For example, accurately simulating protein folding or discovering new catalysts for carbon capture are well within quantum's potential reach.

If scalable QC is just five years away, now is the time to ask: what would you do differently today if quantum were real tomorrow? That question isn't hypothetical; it's an invitation to start rethinking foundational problems in chemistry, logistics, finance, AI, and cryptography.

Of course, building quantum systems is notoriously hard. Fragile qubits, error correction, and decoherence remain formidable challenges. But globally, public and private institutions are pouring resources into cracking these problems. I was in LA today visiting the famous USC Information Sciences Institute, where cutting-edge work on QC is underway and the energy is palpable.

This feels like a pivotal moment, one where future-shaping ideas are being tested in real labs. Just as with AI, the future belongs to those preparing for it now. QC is an area of emphasis at Visa Research, and I hope it is part of how other organizations are thinking about the future too.
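The "exponentially large solution spaces" above can be made concrete: a full classical simulation of an n-qubit state must store 2ⁿ complex amplitudes. A back-of-the-envelope sketch (assuming 16 bytes per double-precision complex amplitude; the function name is mine):

```python
def statevector_bytes(n_qubits, bytes_per_amplitude=16):
    # A full n-qubit state vector holds 2**n complex amplitudes;
    # at double precision, each amplitude takes 16 bytes.
    return (2 ** n_qubits) * bytes_per_amplitude

# 30 qubits still fit in a workstation's RAM (~17 GB)...
print(f"{statevector_bytes(30) / 1e9:.1f} GB")
# ...but 100 qubits would need ~2e31 bytes, roughly a hundred
# million times all the data humanity has ever stored.
print(f"{statevector_bytes(100):.2e} bytes")
```

Each added qubit doubles the memory requirement, which is why brute-force classical simulation stalls around 50 qubits and why the tasks listed above are natural quantum targets.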
-
Google and IBM believe the first workable quantum computer is in sight; meanwhile, Europe offers a more collaborative vision.

Yesterday, both Google and IBM signalled that quantum computing is entering its engineering phase:

Google's Willow chip, introduced in December 2024, demonstrated scalable error correction: as more qubits were added, error rates dropped exponentially. It completed a benchmark task in under five minutes, one that would take today's fastest supercomputer an unimaginable 10²⁵ years (10 septillion years).

IBM revealed a detailed blueprint for industrial-scale quantum, outlining a path to building a fault-tolerant quantum supercomputer by late 2029.

Meanwhile, real-world applications are already emerging: IBM and Moderna have collaborated to simulate the longest mRNA sequence (60 nucleotides) ever modelled on a quantum computer, using 80 of the 156 qubits on IBM's Heron chip. They applied a clever algorithm (a CVaR-based VQA) that makes earlier attempts at 42 nucleotides seem modest.

Now contrast that with Europe's collaborative approach. Instead of centralised lab efforts, Europe is deploying nine quantum systems across at least seven countries, spanning superconducting, ion-trap, and annealing technologies, integrated with national supercomputing centres for shared access and resilience. I recently visited the Poznań Supercomputing Centre in Poland to witness one of these systems in action.

Europe's model is about collective strength, diversity, and building long-term quantum infrastructure, demonstrating that the race isn't just about breakthroughs, but also about how you organise for scale and inclusivity.
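"Error rates dropped exponentially" refers to below-threshold scaling: once physical error rates are low enough, each increase of the surface-code distance suppresses the logical error rate by a roughly constant factor Λ. A toy model of that scaling (Λ = 2 is an illustrative value, not Willow's measured figure, and the function is my own sketch):

```python
def logical_error(p_d3, distance, suppression=2.0):
    # Below threshold, the logical error rate falls by a factor
    # ~suppression each time the (odd) code distance grows by 2.
    steps = (distance - 3) // 2
    return p_d3 * suppression ** (-steps)

# Starting from a distance-3 logical error rate of 0.3% per cycle,
# each jump in distance halves the logical error rate:
for d in (3, 5, 7, 9):
    print(d, f"{logical_error(3e-3, d):.2e}")
```

The practical consequence: making the chip bigger makes the logical qubit *better*, which is exactly the behavior that distinguishes an engineering phase from a physics experiment.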
-
⚛️ Two quantum breakthroughs this week just moved us significantly closer to practical quantum computers that could solve real-world problems.

Alice & Bob in Paris achieved something remarkable: their "Galvanic Cat" qubits can now resist errors for over an hour. That's millions of times longer than standard qubits, which typically last only microseconds. This addresses quantum computing's biggest challenge: keeping information stable long enough to perform meaningful calculations.

Meanwhile, Caltech physicists assembled the largest qubit array ever built: 6,100 neutral atoms trapped by 12,000 laser "optical tweezers" with 99.98% accuracy. Think of it as building a quantum city where every atom is perfectly positioned and controlled. 🏗️

Here's why this matters for every industry:
💊 Pharmaceutical companies could simulate molecular interactions in hours instead of years, accelerating drug discovery
🔋 Materials scientists could design better batteries and solar panels by understanding quantum behavior
🧬 Medical researchers could unlock new treatments by modeling complex biological systems
🏦 Financial institutions could optimize portfolios and detect fraud with unprecedented precision

These cat qubits could reduce quantum computer hardware requirements by up to 200 times compared to competing approaches, making quantum computers not just more powerful, but dramatically cheaper and more accessible. 💰

The actionable insight: start preparing your teams now. Companies that understand quantum applications in their field will have a massive competitive advantage when these systems become commercially available in the next 5-7 years.

What quantum applications could transform your industry? Share your thoughts below! 👇

https://lnkd.in/ea4p9Sby
https://lnkd.in/e8Urf97w
-
NVIDIA CEO Jensen Huang recently claimed that practical quantum computing is still 15 to 30 years away and will require NVIDIA #GPUs to build hybrid quantum/classical supercomputers. But both the timeline and the hardware assumption are off the mark.

Quantum computing is progressing much faster than many realize. Google's #Willow device has demonstrated that scaling up quantum systems can exponentially reduce errors, and it achieved a benchmark in minutes that would take classical supercomputers billions of years. While not yet commercially useful, it shows that both quantum supremacy and fault tolerance are possible.

PsiQuantum, a company building large-scale photonic quantum computers, plans to bring two commercial machines online well before the end of the decade. These will be 10,000 times larger than Willow and will not use GPUs, but rather custom high-speed hardware specifically designed for error correction.

Meanwhile, quantum algorithms are advancing rapidly. PsiQuantum recently collaborated with Boehringer Ingelheim to achieve a more than 200-fold improvement in simulating molecular systems. Phasecraft, the leading quantum algorithms company, has developed quantum-enhanced algorithms for simulating materials, publishing results that threaten to outperform classical methods even on current quantum hardware. Algorithms are improving thousands of times faster than hardware, and with huge leaps in hardware from PsiQuantum, useful quantum computing is inevitable and increasingly imminent.

This progress is essential because our existing tools for simulating nature, particularly in chemistry and materials science, are limited. Density Functional Theory (DFT) is widely used to model the electronic structure of materials but fails on many of the most interesting highly correlated quantum systems. When researchers tried to evaluate the purported room-temperature superconductor LK-99, #DFT failed entirely, and they were forced to revert to cook-and-look experimentation to get answers. Even cutting-edge #AI models like DeepMind's GNoME depend on DFT for training data, which limits their usefulness in domains where DFT breaks down. Without more accurate quantum simulations, AI cannot meaningfully explore the full complexity of quantum systems.

To overcome these barriers, we need large-scale quantum computers. Building machines with millions of qubits is a significant undertaking, requiring advances in photonics, cryogenics, and systems engineering. But the transition is already underway, moving from theoretical possibility to construction.

Quantum computing offers a path from discovery to design. It will allow us to understand and engineer materials and molecules that are currently beyond our reach. Like the transitions to the ages of metal, electricity, and semiconductors, the arrival of quantum computing will mark a new chapter in our mastery of the physical world.
-
🚀 Quantum Computing: Transitioning from Lab Theory to Operational Reality

2025 marks a definitive shift as the UN celebrates the International Year of Quantum Science and Technology. We are moving past speculative demos toward productive, operational utility, integrating quantum into core workflows to solve "intractable" problems.

🏆 Top 10 Quantum Achievements of 2025:
1. Verifiable Quantum Advantage: Google's Willow chip achieved a 13,000x speedup over the world's fastest supercomputers using the "Quantum Echoes" algorithm to model real physical experiments.
2. Topological Stability: Microsoft unveiled Majorana 1, achieving a 1,000-fold reduction in error rates using hardware-protected topological qubits.
3. The "Four-Nines" Barrier: IonQ reached a world-record 99.99% two-qubit gate fidelity, dramatically reducing the physical qubits needed for fault-tolerant operations.
4. Operational Scaling: Caltech researchers assembled a 6,100-qubit neutral-atom array, maintaining superposition for 13 seconds without compromising quality.
5. Extended Coherence: Alice & Bob created "cat qubits" that resisted bit-flip errors for more than one hour, essential for long-running operational algorithms.
6. Quantum Internet Breakthrough: T-Labs demonstrated high-fidelity (99%) transmission of entangled photons across 30 km of commercial fiber for 17 days.
7. GPS-Denied Navigation: Q-CTRL achieved the first commercial advantage in sensing, using quantum magnetometers for navigation 100x more accurate than conventional systems without GPS.
8. Continuous Operation: Harvard and QuEra ran a 3,000-qubit array for over two hours by replenishing atoms mid-computation.
9. Standard Hardware Integration: IBM successfully ran quantum error-correction algorithms on commercially available AMD chips, accelerating practical scalability.
10. Modular Interconnects: Oxford University achieved quantum gate teleportation between separate modules, proving that distributed quantum computing is viable.

🛠️ How to Prepare for the Operational Transition:
• Operational Resilience: The timeline for cyber-resilience has accelerated; recent research suggests that 1 million physical qubits could break RSA-2048 encryption in about one week. Experts recommend deprecating vulnerable systems by 2030, making migration to Post-Quantum Cryptography (PQC) a current operational priority.
• Infrastructure Integration: Use hybrid cloud-quantum architectures via Amazon Braket or Azure Quantum to test readiness without heavy capital investment.
• Logistics Optimization: Organizations like D-Wave are already delivering an 80% reduction in scheduling effort for complex supply chains.

Quantum is no longer a "future" tech; it is an operational differentiator for the next decade.

#QuantumComputing #Innovation #SupplyChain #CyberSecurity #CloudComputing #FutureOfTech
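The urgency behind the PQC recommendation is often framed through Mosca's inequality: if data must stay confidential for x years and migration takes y years, you are already late whenever x + y exceeds z, the years until a cryptographically relevant quantum computer (CRQC). A minimal sketch; the example figures are illustrative planning inputs, not predictions:

```python
def migration_is_urgent(shelf_life_years, migration_years, years_to_crqc):
    # Mosca's inequality: if x + y > z, ciphertext harvested today
    # will still matter when a quantum computer can decrypt it.
    return shelf_life_years + migration_years > years_to_crqc

# Records that must stay secret 10 years, a 5-year migration,
# and a CRQC assumed ~8 years out: already past the deadline.
print(migration_is_urgent(10, 5, 8))  # prints True
```

This is why "harvest now, decrypt later" makes PQC a current priority even if large quantum computers are years away: the clock starts at the data's shelf life, not at the machine's arrival.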
-
The quantum landscape is shifting faster than most people realize. In the last 72 hours alone, we've seen three signals that define where the next decade is heading:

1. Industrial quantum manufacturing is no longer theoretical. Companies capable of building repeatable, export-ready quantum systems at scale are separating from the pack. The shift from prototype culture to manufacturing culture is now the real competitive frontier.

2. Frontier materials science just broke a thermal barrier. The University of Southern California's new 1300°F (700°C) memristor demonstrates that computation can survive in environments where silicon dies instantly. That unlocks AI and quantum-adjacent systems for aerospace, geothermal, fusion, and defense applications previously considered impossible.

3. Quantum materials are beginning to harvest energy from the environment. The nonlinear Hall effect (NLHE) work from QUT/NTU shows that imperfections and lattice vibrations can be engineered to convert ambient AC signals directly into DC power. Imagine sensors, chips, and edge devices operating without batteries, powered by the quantum behavior of the material itself.

These aren't isolated breakthroughs. They're converging. Quantum is becoming an industrial ecosystem spanning manufacturing, materials, energy, and computation. And the organizations that understand how these pieces fit together will define the next era of infrastructure.

For teams navigating this transition from national programs to enterprise R&D, I help map these signals into strategy: manufacturing readiness, substrate alignment, deployment pathways, and cross-ecosystem positioning. The next decade belongs to the builders who can see the whole board. 🖤🔥

#QuantumComputing #QuantumHardware #DeepTech #QuantumMaterials #IndustrialQuantum #AIInfrastructure #NextGenElectronics #QuantumEcosystem
-
Quantum scaling just hit its real bottleneck, and it's not qubits: it's fabs. IonQ buying SkyWater Technology is about control, not capacity. And that changes the quantum race.

What stands out isn't the $1.8B price tag. It's the shift from outsourced experimentation to vertically integrated execution. Quantum progress is no longer gated by theory. It's gated by design-build-test speed and secure manufacturing access. IonQ is collapsing that cycle.

Proof point: design-to-sample time for a 256-qubit chip drops from 9 months to 2 months. That's a 4.5× acceleration, not an incremental improvement.

BREAKDOWN
• Cycle time: 9 → 2 months → faster learning beats bigger labs
• Scale target: 200,000-qubit samples by 2028 → early system-level validation
• Supply chain: Category 1A Trusted Foundry → federal-grade security baked in

Quantum is being treated as critical infrastructure, not optional compute. Whoever controls trusted manufacturing controls the roadmap. This deal quietly moves quantum from hype cycles into industrial execution mode.

Does quantum's next breakthrough come from better physics or better fabs?

#QuantumComputing #Semiconductor #SupplyChain #Geopolitics #AdvancedPackaging #TrustedFoundry #DeepTech
-
A quiet but important milestone in quantum computing. While most headlines focus on AI chatbots, something significant just happened: the U.S. Department of Energy (DOE)'s National Quantum Information Science Research Centers announced a major milestone toward building scalable quantum computers.

This isn't a flashy "quantum supremacy" claim. It's something more important: progress toward scalability.

Why scalability is the real challenge: building a few high-quality qubits is hard. Building hundreds or thousands of stable, controllable qubits? That's the real barrier. Scalable systems require:
- Better qubit coherence
- Lower error rates
- Advanced error correction
- Precise control at larger system sizes

DOE-backed research centers are working exactly on these hardware and system-level bottlenecks. And that's what ultimately determines whether quantum becomes practical.

Why does this matter for us? Quantum progress doesn't usually come as a single dramatic breakthrough. It comes as:
- Incremental stability improvements
- Better integration between materials and control systems
- Coordination across national labs, academia, and industry

This milestone suggests that the ecosystem is maturing.

Question for you: what will matter more for quantum progress in the next 3-5 years?
1. Hardware scalability
2. Error correction breakthroughs
3. Commercial use cases
4. Talent & research funding
Comment 1 / 2 / 3 / 4.

More information: https://lnkd.in/gJ2JruDy

#QuantumComputing #DeepTech #QuantumHardware #Innovation #FutureOfComputing