A fault-tolerant quantum computer by 2028 is one of the most ambitious timelines the industry has seen. The U.S. Department of Energy recently announced a grand challenge to deliver the first generation of fault-tolerant quantum computers capable of scientifically relevant calculations within three years. Rather than building the system itself, the agency is inviting quantum computing companies to provide solutions, remaining hardware-agnostic across superconducting qubits, trapped ions, neutral atoms, and other approaches.

The scale of the challenge is significant. Current error correction estimates suggest it could take roughly 1,000 physical qubits to produce a single reliable logical qubit. Most devices today feature only a few hundred physical qubits at best.

There are reasons for optimism. Recent breakthroughs have demonstrated that quantum error correction works in practice, not just in theory. Renewed institutional investment, including $625 million to extend national quantum research centers, signals serious commitment to solving these scientific hurdles.

However, real obstacles remain. A recent industry report highlights a critical talent gap: only an estimated 600 to 700 professionals worldwide specialize in quantum error correction, while the field may need up to 16,000 by the end of the decade. Training these experts can take up to 10 years.

Whether or not 2028 proves achievable, this kind of bold target serves an important purpose. Grand challenges focus attention, attract funding, and accelerate collaboration across the ecosystem. Even if the timeline stretches, the momentum it creates could prove invaluable for the entire quantum computing industry.

#QuantumComputing #QuantumTechnology #QuantumErrorCorrection #Innovation #FutureOfTech
Qubit Value’s Post
More Relevant Posts
Better quantum computing stock: D-Wave Quantum vs. Rigetti Computing - MSN

Recent financial analysis of the quantum technology sector highlights D-Wave Quantum as outperforming Rigetti Computing in commercial bookings, largely due to its specialized hardware approach, though both companies remain unprofitable.

To understand this market, we must look at the underlying science. The foundation of this industry is the qubit. Unlike classical computer bits that process data as strictly 0s or 1s, quantum computers use qubits to leverage the properties of quantum mechanics. This lets them solve, in minutes, certain problems that would take conventional computers centuries to calculate.

Building these systems requires distinct engineering strategies. Rigetti focuses on a gate-based approach using superconducting qubits. While these systems offer immense computational speed, maintaining qubit stability is extremely difficult. The hardware is highly sensitive to its environment, making the system error-prone. Currently, Rigetti achieves around 99.5% two-qubit gate fidelity (a measure of accuracy), showing that error reduction remains a significant hurdle.

D-Wave took a different path called quantum annealing. Instead of building a general-purpose computer, annealing is specialized for complex optimization tasks, such as creating manufacturing schedules. This focus has allowed D-Wave to secure commercial partnerships and generate early revenue. D-Wave is now also expanding into traditional gate-based computing using fluxonium qubits.

What this means: in the nascent quantum hardware race, specialized applications are currently providing a clearer path to revenue than early-stage, general-purpose systems.

What this does not mean: the hardware race is not over. Both companies hold large cash reserves to fund ongoing research, as the industry remains years away from full commercialization.
#QuantumComputing #QuantumTechnology #QuantumScience #Qubits #QuantumHardware #SuperconductingQubits #QuantumAnnealing https://lnkd.in/ers9BqTU
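The bit-versus-qubit distinction in the post above can be made concrete with a minimal sketch: a qubit's state is a pair of complex amplitudes whose squared magnitudes give the measurement probabilities. This is textbook quantum mechanics, not any vendor's API, and the function name here is illustrative:

```python
import math

def measure_probabilities(alpha: complex, beta: complex):
    """Given a qubit state alpha|0> + beta|1>, return the probabilities
    of measuring 0 or 1. Amplitudes must satisfy |alpha|^2 + |beta|^2 = 1."""
    p0, p1 = abs(alpha) ** 2, abs(beta) ** 2
    assert math.isclose(p0 + p1, 1.0, abs_tol=1e-9), "state not normalized"
    return p0, p1

# Equal superposition (|0> + |1>)/sqrt(2): 50/50 measurement outcomes,
# unlike a classical bit, which is definitely 0 or definitely 1.
h = 1 / math.sqrt(2)
p0, p1 = measure_probabilities(h, h)
```

The classical bit corresponds to the special cases (alpha, beta) = (1, 0) or (0, 1); everything in between is what gate-based machines exploit.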
Better quantum computing stock: D-Wave Quantum vs. Rigetti Computing - MSN

Financial analysts recently evaluated D-Wave Quantum and Rigetti Computing, finding that D-Wave is currently capturing more revenue and securing larger contracts. Meanwhile, Rigetti was eliminated from a DARPA program and delayed its new 108-qubit machine due to system fidelity issues.

To understand this contrast, we must look at how quantum hardware operates. Classical computers process information in bits of 0 or 1. Quantum computers use qubits, which leverage superposition to represent 0 and 1 simultaneously. There are different architectures for utilizing qubits.

Rigetti focuses on gate-based quantum computing. Similar to a traditional computer, a gate-based system applies sequences of logic gates to solve algorithms. The challenge is that qubits are extremely fragile. Environmental noise causes them to lose their quantum state, creating calculation errors, which is known as a fidelity problem. Because robust error correction does not yet exist, building large, accurate gate-based systems remains exceedingly difficult.

D-Wave utilizes a specialized approach called quantum annealing. Rather than using step-by-step logic gates, an annealing system maps an optimization problem into a physical energy landscape. The qubits naturally settle into the lowest energy state, which represents the optimal solution. While this method only solves specific optimization problems, such as schedule creation, it is currently easier to commercialize. D-Wave is now leveraging its annealing business to develop its own traditional gate-based systems.

This development means specialized quantum approaches are finding commercial footing faster than traditional gate-based systems. It does not mean the race to build a perfect quantum computer is over. Both companies are unprofitable, and the sector still faces immense technical hurdles before error-free computing becomes a reality.
#QuantumComputing #QuantumTechnology #QuantumScience #Qubits #QuantumHardware #QuantumAnnealing #QuantumErrorCorrection https://lnkd.in/ers9BqTU
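The energy-landscape idea behind quantum annealing has a well-known classical analogue, simulated annealing: wander the landscape, accept uphill moves less often as a "temperature" falls, and settle toward a low-energy configuration. The sketch below applies it to a toy scheduling cost (everything here, including the cooling schedule and the job data, is illustrative, and a quantum annealer works physically rather than by sampling):

```python
import math
import random

def simulated_annealing(cost, neighbor, x0, steps=10_000, t0=1.0):
    """Classical analogue of annealing: random walk over the energy
    landscape, accepting worse states with probability exp(-delta/t),
    so the state settles toward a low-cost configuration."""
    x, fx = x0, cost(x0)
    best, fbest = x, fx
    for k in range(steps):
        t = t0 * (1 - k / steps) + 1e-9          # linear cooling schedule
        y = neighbor(x)
        fy = cost(y)
        if fy <= fx or random.random() < math.exp((fx - fy) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
    return best, fbest

# Toy "manufacturing schedule": order 5 jobs to minimize total
# completion time (a standard single-machine scheduling objective).
random.seed(0)
jobs = [3, 1, 4, 1, 5]          # processing time of each job

def cost(order):                # sum of completion times; smaller is better
    done, total = 0, 0
    for j in order:
        done += jobs[j]
        total += done
    return total

def neighbor(order):            # swap two random positions in the schedule
    a, b = random.sample(range(len(order)), 2)
    o = list(order)
    o[a], o[b] = o[b], o[a]
    return o

best, fbest = simulated_annealing(cost, neighbor, list(range(5)))
```

For this objective the known optimum is shortest-processing-time-first, and the annealer finds it; the point is the shape of the method, mapping a combinatorial problem onto a landscape and letting the state relax, which is what D-Wave's hardware does physically.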
Quantum bits (qubits) are the fundamental building blocks of quantum information processing. A novel qubit platform invented at the U.S. Department of Energy's (DOE) Argonne National Laboratory exhibits noise levels thousands of times lower than those of most traditional qubits. "Noise" refers to disturbances in the environment that diminish a qubit's performance. The platform was built by trapping single electrons on the surface of frozen neon gas. The recent finding positions Argonne's platform as a strong contender in the field of high-performance quantum technologies. https://lnkd.in/eF_bqNeX
A major step toward utility-scale quantum computing has emerged as Monarch Quantum and Oratomic announce a strategic partnership to accelerate the commercialization of #faulttolerant quantum systems. The collaboration brings together two powerful approaches—integrated photonics and neutral atom architectures—to address one of the biggest challenges in quantum computing: scalability.

Monarch Quantum will act as the photonics systems integrator, delivering advanced Quantum Light Engines and enabling large-scale manufacturing and system engineering. By combining #photonic control systems with #neutral atom qubit arrays, the partnership aims to build quantum computers capable of supporting tens of thousands of physical qubits and thousands of #errorcorrected logical qubits by the end of the decade. This significantly reduces earlier assumptions that millions of qubits would be required, marking a critical inflection point in making quantum systems commercially viable.

Beyond technological integration, the alliance reflects a broader industry shift—from experimental prototypes to deployable, scalable quantum infrastructure. If successful, this approach could unlock real-world applications across industries, from #drugdiscovery and #materialsscience to #cryptography and #optimization.

This partnership signals that the race toward practical quantum computing is no longer theoretical—it is entering the phase of engineering, manufacturing, and real-world deployment.

#QuantumComputing #QuantumTechnology #FaultTolerantQuantum #DeepTech #Photonics #NeutralAtoms #Innovation #FutureOfComputing #QuantumEcosystem #Oratomic #MonarchQuantum https://lnkd.in/gTjyai5h
🚀 Quantum Breakthrough Slashes Qubit Requirements, Accelerating Path to Practical Computing

A major advance in quantum computing architecture has dramatically reduced the number of qubits required for error correction, potentially bringing practical, large-scale quantum machines much closer to reality. Researchers from Caltech and startup Oratomic have shown that systems once thought to need millions of physical qubits may now be possible with just tens of thousands.

The core problem in quantum computing has always been error correction. Traditional approaches require roughly 1,000 physical qubits to create one stable logical qubit, an enormous overhead that has blocked scalability. This new architecture slashes that ratio dramatically, in some cases to as few as five physical qubits per logical qubit, an improvement of more than two orders of magnitude.

The breakthrough comes from neutral-atom quantum systems. Individual atoms act as qubits and are precisely manipulated using laser-based optical tweezers. Unlike fixed architectures, these atoms can be dynamically repositioned and connected across larger distances, enabling far more efficient error-correction codes and significantly less redundancy.

The implications are huge:
- Engineering complexity, cost, and physical size of quantum computers could drop dramatically.
- Fully functional systems may now be achievable with just 10,000–20,000 qubits, a range that aligns with current technological roadmaps.
- Real-world applications in cryptography, materials science, drug discovery, and optimization could arrive years earlier than previously expected.

This isn't just incremental progress; it's a fundamental shift from theoretical scalability challenges to practical engineering solutions. By directly tackling one of the biggest bottlenecks in the field, the industry just took a major step toward making quantum computing a deployable, high-impact technology.
What do you think, will this accelerate the quantum timeline more than most people expect? I’d love to hear your perspective in the comments 👇 #QuantumComputing #QuantumBreakthrough #NeutralAtoms #ErrorCorrection #FutureOfComputing #TechInnovation #Caltech
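The "millions" versus "tens of thousands" framing above is just the overhead ratio applied to a target logical-qubit count. A rough back-of-envelope sketch (the 2,000-logical-qubit target is an illustrative figure, not one from the article):

```python
# Qubit-budget comparison under the two error-correction overheads
# discussed above: ~1,000 physical qubits per logical qubit for
# traditional codes, versus ~5:1 claimed for the new architecture.
def physical_needed(logical_qubits: int, overhead: int) -> int:
    """Physical qubits required, assuming a fixed physical-to-logical ratio."""
    return logical_qubits * overhead

LOGICAL_TARGET = 2_000  # illustrative size for a scientifically useful machine

traditional = physical_needed(LOGICAL_TARGET, 1000)  # millions of qubits
new_arch = physical_needed(LOGICAL_TARGET, 5)        # tens of thousands
```

The fixed-ratio assumption is a simplification: real overhead depends on the code, the physical error rate, and the target logical error rate, but it shows why the ratio is the lever that moves roadmaps by orders of magnitude.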
https://lnkd.in/gv9QYt-m

Insider Brief...
• A new qubit platform developed at Argonne National Laboratory uses electrons trapped on solid neon and demonstrates noise levels 10–10,000 times lower than most semiconductor-based qubits, positioning it as a strong candidate for scalable quantum computing.
• The system achieves a coherence time of about 0.1 milliseconds—nearly 1,000 times longer than prior semiconducting qubits—while maintaining high gate fidelity, indicating improved stability and accuracy in quantum operations.
• Researchers attribute the low noise to neon's chemically inert, impurity-free properties, though remaining challenges include mitigating stray electrons and surface imperfections to further optimize performance.

...Image by Xu Han/Argonne National
Citi Research Explores Quantum Innovation For National Security And Infrastructure - quantumzeitgeist.com

Citi Research recently evaluated the transition of quantum technology from theoretical potential to practical applications in national security and infrastructure, featuring insights from Infleqtion.

At the core of this shift are qubits. Unlike classical computing bits that register as strictly 0 or 1, qubits use superposition to exist in combinations of both states. When linked through a property called entanglement, qubits can process highly complex variables simultaneously. Fully fault-tolerant quantum computers remain in development, requiring extensive error correction to protect these fragile qubit states from outside interference. Yet, early hardware is already beginning to run complex algorithms.

However, the immediate breakthrough highlighted in the Citi assessment is quantum sensing. Quantum sensors harness the extreme environmental sensitivity of quantum states to measure physical changes. The exact same fragility that causes data errors in quantum computing makes qubits exceptional sensing instruments. They react to the slightest shifts in motion, time, or magnetic fields.

This development means quantum technology is actively delivering ultra-precise navigation, timing, and threat detection today. These tools provide resilient positioning capabilities for defense and critical infrastructure in environments where classical systems struggle to maintain accuracy.

This does not mean large-scale, error-free quantum computers are currently deployed. Instead, it demonstrates a dual reality: quantum sensing offers immediate, tangible security upgrades, while quantum computing hardware and algorithms steadily advance toward broader commercial utility.

#QuantumComputing #QuantumTechnology #QuantumScience #Qubits #QuantumSensing #NationalSecurity #Infrastructure https://lnkd.in/eErrf-2y
Neutral atom vapor cells can be used not just for timing and #sensors but maybe for #quantum computing. Here's insight from Bernard Murphy. #Rb #Cs #Be #quantumcomputing #vaporcells Nick Werstiuk Marcelo Zuffo Paulo Nussenzveig Josh Moles Wale lawal, Ph. D. https://lnkd.in/geCAtV4n
Is Rigetti Computing the Best Quantum Computing Stock to Buy Right Now? - AOL.com

Rigetti Computing recently achieved a technical milestone: up to 99.9 percent two-qubit gate fidelity. In simple terms, each time a gate operates on a pair of qubits, there is only a one-in-a-thousand chance of an error.

To understand this, we must look at how quantum hardware operates. Quantum computers process information using qubits, the foundational units of quantum systems. To perform algorithms, qubits must interact, which is managed by quantum logic gates. A two-qubit gate coordinates an operation between a pair of qubits to process complex calculations.

The primary hurdle in the quantum computing industry today is accuracy. While logic gates execute calculations, they are highly prone to errors. Fidelity measures this accuracy. High fidelity is necessary to ensure computations produce correct results without data loss or corruption.

While 99.9 percent fidelity is a step forward, it is important to explain the technology's current limitations. As the number of qubits in a system increases, accuracy quickly declines. For example, Rigetti's larger 108-qubit system currently operates at a lower 99 percent two-qubit gate fidelity. Furthermore, competitor IonQ holds a world record of 99.99 percent fidelity, achieved in a research and development lab and slated for a 256-qubit system in 2026.

Ultimately, this development shows progress in gate accuracy, but it highlights that the industry is still working to overcome the severe roadblocks to making quantum computers commercially viable.

#QuantumComputing #QuantumTechnology #QuantumScience #Qubits #QuantumHardware #LogicGates #GateFidelity https://lnkd.in/eqb4XYr9
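Why tenths of a percent of fidelity matter so much becomes clear when errors compound over a whole circuit. A minimal sketch, under the simplifying assumption that gate errors are independent (real devices also suffer correlated errors, readout errors, and decoherence):

```python
def circuit_success(gate_fidelity: float, n_gates: int) -> float:
    """Probability a circuit sees no two-qubit-gate error, assuming
    each gate fails independently with probability 1 - fidelity."""
    return gate_fidelity ** n_gates

# 99.9% fidelity: ~90% success over 100 gates, but only ~37% over 1,000.
# 99% fidelity degrades far faster: ~37% success over just 100 gates.
a = circuit_success(0.999, 100)
b = circuit_success(0.999, 1000)
c = circuit_success(0.99, 100)
```

This exponential decay is why the gap between 99 percent, 99.9 percent, and 99.99 percent fidelity, and ultimately error correction, dominates the industry's roadmaps.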
Quantum Data Transfer Speeds up Using Multiple Continuous Signals

Reducing the time needed to transfer a quantum state from multiple qubits to continuous variables has long presented a computational bottleneck. Now, transferring an n-qubit state has moved from a runtime of O(2^n) using a single qumode to O(2^(n/m)) with m qumodes. This advance unlocks a pathway to realising the n-qubit quantum Fourier transform with a scaling of O(m·2^(n/m)/ε + m^2), offering gains for quantum computing and communication.

#quantum #quantumcomputing #technology https://lnkd.in/ehxWRMA7
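The gain from moving the qubit count into the exponent's denominator can be sanity-checked numerically. A sketch of the two dominant factors only (constant factors and the ε-dependent term are omitted, so only relative growth is meaningful; integer division of n by m is an illustrative simplification):

```python
def single_qumode_cost(n: int) -> int:
    """Dominant factor of the O(2^n) runtime with one qumode."""
    return 2 ** n

def multi_qumode_cost(n: int, m: int) -> int:
    """Dominant factor of the O(2^(n/m)) runtime with m qumodes."""
    return 2 ** (n // m)

# For n = 20 qubits, going from 1 to 4 qumodes shrinks the dominant
# factor from 2^20 = 1,048,576 down to 2^5 = 32.
one = single_qumode_cost(20)
four = multi_qumode_cost(20, 4)
```

Because m sits inside the exponent, even a handful of extra qumodes changes the transfer cost by many orders of magnitude; this is the whole content of the headline speedup.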