A major obstacle in quantum computing may be more reversible than we thought.

One of the persistent challenges in building reliable quantum computers is quantum scrambling, a process where information encoded in qubits spreads across a system and becomes effectively lost. It is a fundamental barrier to performing reliable calculations at scale.

New research published in Physical Review Letters by physicists at the University of California, Irvine reveals that scrambled quantum information may not actually be destroyed. Instead, it disperses in extraordinarily complex ways across many interacting particles, and under the right conditions, it can be recovered.

Here is why this matters: The underlying laws governing quantum systems are, in principle, reversible. The research team demonstrated that with extremely precise control, a carefully tuned intervention can effectively drive a quantum system backward, allowing dispersed information to refocus near its original location.

The key finding is that this reversibility appears to be a universal property across many quantum systems, including quantum computers. That universality is what makes this research particularly significant. It suggests that the path to error resilience may not require avoiding scrambling entirely, but rather learning to undo it.

There is an important caveat. Reversing scrambling demands an exceptionally fine level of system control, which remains a significant engineering challenge. But understanding that recovery is theoretically possible gives the field a concrete target to work toward.

This is the kind of foundational research that quietly reshapes the trajectory of an entire technology.

#QuantumComputing #QuantumPhysics #DeepTech #TechInnovation
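The core idea — that quantum dynamics are unitary and therefore invertible — can be illustrated with a toy numerical sketch (a hypothetical illustration, not the UC Irvine protocol): encode a bit locally in one qubit, "scramble" with a random unitary standing in for chaotic dynamics, then apply the inverse evolution.

```python
# Toy sketch: scrambling as a unitary that its inverse can undo.
# (Illustrative only — not the protocol from the paper.)
import numpy as np

rng = np.random.default_rng(7)

def random_unitary(dim):
    """Random unitary via QR decomposition of a complex Gaussian matrix."""
    z = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))

n_qubits = 4
dim = 2 ** n_qubits

# Encode information locally: qubit 0 in |1>, the rest in |0>.
psi = np.zeros(dim, dtype=complex)
psi[0b1000] = 1.0

U = random_unitary(dim)             # stand-in for scrambling dynamics
scrambled = U @ psi                 # information spreads over all 4 qubits
recovered = U.conj().T @ scrambled  # inverse evolution refocuses it

print(abs(np.vdot(psi, scrambled)) ** 2)  # overlap with original: small
print(abs(np.vdot(psi, recovered)) ** 2)  # ≈ 1.0 — fully recovered
```

The experimental difficulty the post mentions lives entirely in that last line: applying the exact inverse requires control precision that real hardware does not yet have.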
Qubit Value’s Post
More Relevant Posts
New research clarifies where near-term quantum computers hit their limits, and that is actually good news for the field.

A study recently published in Nature Physics by an international team of researchers examined the practical boundaries of quantum computing in the near-term regime, where systems operate without full error correction.

Here is what they found: Quantum computers remain extraordinarily sensitive to environmental disruption. Even the smallest interference can cause decoherence, erasing the computational advantage these systems promise. The research focused on gate fidelity, which measures how accurately a quantum gate performs its intended operation compared to an ideal, noise-free version.

Their conclusion: near-term quantum computing without full fault tolerance can only handle complex calculations to a limited extent. But here is the important nuance. If gate fidelity is high enough, quantum computers can still perform large, practically relevant calculations. This finding does not close a door. It draws a clear map showing exactly where the threshold sits and where the opportunity begins.

Why this matters for the industry: Studies like this help organizations make better decisions about where to invest time and resources. Rather than chasing theoretical possibilities, the quantum ecosystem can focus on pushing gate fidelity higher and identifying applications that fall within demonstrated capabilities.

The work also highlights the growing strength of international collaboration in quantum research, with contributors spanning institutions across Europe and the United States. Clarity about limitations is not a setback. It is the foundation for building something real.

#QuantumTechnology #DeepTech #QuantumResearch #Innovation #QuantumComputing
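For rough intuition on why gate fidelity sets the limit, consider a back-of-envelope model (a simplification assuming independent, uncorrelated gate errors — not the study's actual analysis): if each gate succeeds with fidelity f, a circuit of g gates retains roughly f**g of its signal.

```python
# Back-of-envelope: circuit fidelity ~ f ** num_gates under the
# (simplifying) assumption of independent gate errors.
import math

def circuit_fidelity(gate_fidelity: float, num_gates: int) -> float:
    return gate_fidelity ** num_gates

for f in (0.99, 0.999, 0.9999):
    # Largest circuit that keeps at least 50% overall fidelity.
    max_gates = int(math.log(0.5) / math.log(f))
    print(f"gate fidelity {f}: ~{max_gates} gates before dropping below 50%")
```

Under this crude model, each additional "nine" of gate fidelity buys roughly a tenfold increase in usable circuit size — which is why fidelity, rather than raw qubit count, is the lever this line of research points to.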
If you’ve been curious about quantum computing but found the terminology hard to navigate, this guide offers an accessible entry point. Using a dance metaphor, it explains key concepts like qubits, superposition, and interference in an intuitive way, highlighting how quantum systems behave less like independent components and more like tightly choreographed ensembles. As quantum computing moves from theory toward practical impact, developing a shared vocabulary will be essential, not just for researchers but for business and policy leaders looking to understand where this technology could enable breakthroughs in materials discovery, chemistry, and beyond. #QuantumComputing #DeepTech #Innovation #FutureOfComputing #Qubits #Science #EmergingTechnology #DigitalTransformation #Quantum
Stanford researchers did not set out to find something that breaks the rules. They found it anyway.

A newly discovered crystal has demonstrated quantum properties that existing theoretical models did not predict and cannot fully explain, exhibiting coherence times, entanglement stability, and information retention at room temperature that every previous quantum material required near-absolute-zero conditions to achieve. The crystal maintains its quantum state in environments that should destroy it instantly according to the physics that governed quantum material research until this discovery landed on the table.

Quantum technology has been commercially constrained for years by a single brutal requirement: the systems that make it work need to be cooled to temperatures colder than deep space to function. That requirement makes quantum computers expensive, immobile, and inaccessible at the scale needed to deploy them as practical infrastructure.

A crystal that holds quantum coherence at room temperature does not improve quantum technology. It removes the single biggest barrier between quantum computing and the rest of the world. Stanford did not find a better quantum material. It may have found the one that finally makes quantum technology something ordinary people actually use.

#QuantumCrystal #StanfordResearch #QuantumComputing
Quantum computing: A tech race Europe could win? - BBC

European technology companies are emerging as strong contenders in the global race to develop practical quantum computers. With several promising firms making steady progress, Europe is demonstrating that it can compete in the highly advanced quantum technology sector.

To understand why this field is so intensely competitive, we must examine how quantum systems process information from the ground up. Classical computers rely on bits, which function as microscopic switches set to a definitive 0 or 1. Quantum computers operate using quantum bits, or qubits. Through a core principle called superposition, a qubit can exist in a combination of both 0 and 1 at the same time.

The computational power deepens further through entanglement. When qubits become entangled, the state of one qubit becomes fundamentally linked to another. As researchers add more high-quality qubits to a system, its processing capacity scales exponentially. By applying operations known as quantum gates to these entangled qubits, scientists can run advanced quantum algorithms designed to solve highly complex problems that classical supercomputers simply cannot process.

This recognition of European progress means the global quantum ecosystem is diversifying, which can accelerate innovation in different hardware approaches. However, it does not mean that the race is over or that everyday quantum computing is imminent. Today's qubits are highly sensitive to environmental noise, which introduces calculation errors. Scaling up hardware while successfully developing robust error correction remains a formidable barrier. The pursuit of a fully fault-tolerant quantum computer is a marathon, requiring years of continued scientific research.

#QuantumComputing #QuantumTechnology #QuantumScience #Qubits #EuropeanTech #TechRace #QuantumHardware https://lnkd.in/eYvc5VtQ
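The superposition, entanglement, and exponential scaling described above can be made concrete with a few lines of linear algebra (a standard textbook construction, not something from the BBC article):

```python
# Sketch: superposition, entanglement, and why state space grows as 2**n.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)

# Superposition: a Hadamard gate puts one qubit into (|0> + |1>)/sqrt(2).
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
plus = H @ ket0

# Entanglement: CNOT on (plus ⊗ |0>) yields a Bell state — after this,
# neither qubit can be described independently of the other.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
bell = CNOT @ np.kron(plus, ket0)
print(np.round(bell, 3))  # amplitude only on |00> and |11>

# Exponential scaling: an n-qubit state vector holds 2**n amplitudes.
for n in (2, 10, 50):
    print(f"{n} qubits -> {2**n} amplitudes")
```

The last loop is the whole story of "scales exponentially": 50 qubits already imply more amplitudes than any classical memory can store explicitly.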
How Sensitive Are The Computers Of The Future? - Eurasia Review

A team of researchers, including physicists from Freie Universität, recently published a study in Nature Physics establishing the precise limitations of near-term quantum computing. They found that current noisy systems can only perform complex calculations to a limited extent, fundamentally restricted by how accurately their individual operations function.

Conventional computers process information in classical bits, representing a zero or a one. Quantum computers run on qubits, which can exist as a zero, a one, or a superposition of both. This superposition allows scientists to manipulate many states at once, providing the power to solve problems classical computers cannot, like factorizing incredibly large numbers.

However, quantum systems face a severe sensitivity problem. They are the Goldilocks of technology; everything must be exactly right. The slightest external disruption causes decoherence, a loss of quantum information that nullifies the system's computing advantage. To deal with this, scientists explore the near-term regime, accepting that errors will occur while running systems as reliably as possible despite the noise. The study found this approach is dictated by gate fidelity, which measures how accurately a quantum gate performs its operation compared to an ideal, noise-free version.

What this does and does not mean: This study does not mean near-term quantum computing is a dead end. Instead, it provides a theoretical limit for these systems. It proves that if engineers push gate fidelity high enough, imperfect quantum computers can still execute large, practically relevant calculations, offering a specific direction for future hardware development.

#QuantumComputing #QuantumTechnology #QuantumScience #Qubits #GateFidelity #Decoherence #NaturePhysics https://lnkd.in/eszhXeTQ
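To make "how accurately a gate performs its operation" concrete, one common way to quantify gate fidelity is the overlap F = |Tr(U†V)| / d between the ideal gate U and its noisy implementation V (an illustrative choice of metric; the study's precise definition may differ):

```python
# Illustrative gate-fidelity measure: F = |Tr(U† V)| / d, comparing a
# noisy gate V against its noise-free ideal U. One common convention;
# not necessarily the exact metric used in the Nature Physics study.
import numpy as np

def gate_fidelity(U: np.ndarray, V: np.ndarray) -> float:
    d = U.shape[0]
    return abs(np.trace(U.conj().T @ V)) / d

def rx(theta):
    """Rotation about the X axis by angle theta."""
    return np.array([[np.cos(theta / 2), -1j * np.sin(theta / 2)],
                     [-1j * np.sin(theta / 2), np.cos(theta / 2)]])

ideal = rx(np.pi)          # an ideal X rotation (up to global phase)
noisy = rx(np.pi * 1.01)   # a 1% over-rotation, a typical control error

print(gate_fidelity(ideal, ideal))  # 1.0 — a gate matches itself perfectly
print(gate_fidelity(ideal, noisy))  # slightly below 1
```

Even a 1% calibration error only dents the per-gate fidelity slightly, but as the post above explains, those small dents compound across a long computation.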
QuEra Emphasizes Co-Designed Path to Fault-Tolerant Quantum Computing - TipRanks

QuEra Computing recently shared insights on how their neutral-atom quantum systems are shifting from academic experiments to a structured engineering roadmap. The focus is on building a fault-tolerant system through a tightly co-designed technology stack.

To understand this approach, we must start with the qubit. Qubits hold complex states of information but are highly sensitive to their environment, which leads to physical computation errors. To build reliable systems, scientists must achieve fault tolerance. This involves grouping multiple fragile physical qubits together to form a single, more stable logical qubit. Once formed, logical qubits can detect and correct errors, allowing them to run complex algorithms without losing information.

According to QuEra's chief scientist, achieving this fault tolerance requires coordinated advancements across the entire system rather than isolated breakthroughs. The roadmap highlights several necessary technical steps: maintaining low physical error rates, ensuring analog processes operate with digital-like precision, and extracting entropy to sustain long computations. By developing basic science, engineering, and applications in parallel, the collaboration between QuEra, Harvard, and MIT aims to build a fully integrated ecosystem.

This development means that developers are treating large-scale quantum computing as a cohesive engineering challenge, which could accelerate the transition to scalable hardware and improve prospects for long-term partnerships. However, it is crucial to note the limitations of this update. The shared content is a high-level research strategy. It does not provide concrete timelines, immediate commercial commitments, or clear financial implications. Creating practical quantum computers remains a steady, ongoing scientific effort.
#QuantumComputing #QuantumTechnology #QuantumScience #Qubits #FaultTolerance #NeutralAtoms #LogicalQubits https://lnkd.in/esNkFu-i
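The logical-qubit idea above can be illustrated with its simplest classical analogue, a three-bit repetition code (a toy model only: real quantum codes like those QuEra targets are far more involved, and must also handle phase errors, which no classical code can):

```python
# Toy classical analogue of fault tolerance: one "logical" bit stored
# redundantly in three "physical" bits; majority vote corrects any
# single flip. Real quantum error correction is much richer than this.
import random

def encode(bit):
    return [bit, bit, bit]

def apply_noise(bits, p, rng):
    """Flip each bit independently with probability p."""
    return [b ^ (rng.random() < p) for b in bits]

def decode(bits):
    return int(sum(bits) >= 2)  # majority vote

rng = random.Random(0)
p = 0.05            # physical error rate
trials = 100_000
logical_errors = sum(decode(apply_noise(encode(0), p, rng)) != 0
                     for _ in range(trials))
print(logical_errors / trials)  # ~3*p**2 ≈ 0.007, well below p = 0.05
```

The payoff is the quadratic suppression: two of the three bits must flip before the logical bit is corrupted. That is the same leverage, conceptually, that grouping physical qubits into logical qubits is meant to provide — and it only works if the physical error rate is already low, which is why the roadmap leads with that requirement.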
Scientists find quantum computers forget most of their work! Will this change as quantum error rates improve, or is it a physical limitation? What effect will it have on computation? https://lnkd.in/gR9vjg9E
UC Irvine physicists discover method to reverse ‘quantum scrambling’ - UC Irvine News

Physicists at the University of California, Irvine, recently published a study in Physical Review Letters detailing a method to reverse quantum scrambling, a process that causes information loss in quantum systems and was previously thought to be irreversible.

To understand this, we start with the fundamental unit of a quantum computer: the qubit. While classical computers rely on bits that store data as either a 0 or a 1, a qubit can store information as a 0, a 1, or both at the same time through superposition. Researchers encode data into these individual qubits to perform calculations.

As qubits exchange information within a quantum chip, a challenge emerges. When information is locally encoded into specific qubits, their interactions cause that data to spread across many other qubits. As complexity increases, the data diffuses so widely that it effectively disappears. This spreading is called quantum scrambling, and it prevents the system from retrieving information or completing calculations.

The physicists analyzed how this scrambling emerges and found a method to preserve data that would typically vanish. By discovering a way to reverse the scrambling process, they showed that the original encoded information is not permanently lost and can be successfully retrieved.

This development means there is a potential pathway to overcome a specific source of information loss, aiding in the design of more reliable quantum hardware. It does not mean that all error correction challenges in quantum computing have been solved, but rather that this single mechanism of data dispersion is now reversible.

#QuantumComputing #QuantumTechnology #QuantumScience #Qubits #QuantumScrambling #QuantumInformation #Physics https://lnkd.in/emiPtw6j
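A toy numerical picture of the "effectively disappears" claim (illustrative only, not the paper's experiment): after a random joint unitary, the encoded qubit's local state is close to maximally mixed, so no measurement of that qubit alone can read the bit — yet reversing the global evolution recovers it exactly.

```python
# Toy sketch: locally encoded data becomes unreadable locally after
# scrambling, but survives globally. Not the UC Irvine method itself.
import numpy as np

rng = np.random.default_rng(1)

def random_unitary(dim):
    z = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))

n = 5
dim = 2 ** n
psi = np.zeros(dim, dtype=complex)
psi[0b10000] = 1.0       # encode a "1" locally in qubit 0

U = random_unitary(dim)  # stand-in for scrambling interactions
scrambled = U @ psi

# Reduced density matrix of qubit 0: trace out the other four qubits.
full = np.outer(scrambled, scrambled.conj()).reshape(2, 16, 2, 16)
rho0 = np.einsum('ajbj->ab', full)
print(np.round(rho0.real, 2))  # typically near 0.5*I: locally unreadable

recovered = U.conj().T @ scrambled
print(abs(np.vdot(psi, recovered)) ** 2)  # ≈ 1.0: globally, nothing lost
```

The partial trace is the punchline: locally the bit looks like coin-flip noise, but because the joint evolution is unitary, the information is still there — dispersed, not destroyed.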
A new kind of useful, you say? 60 qubits just outperformed every classical computer; not on speed, but memory. Finally, something in tech that remembers less yet performs better.

A new quantum computing paper from Google, Caltech, and MIT showed that fewer than 60 qubits can solve tasks that no classical machine can match, regardless of how much RAM you give it. The data streams in one point at a time, leaves its mark on a quantum state, and gets discarded. The system builds the full picture without ever storing the dataset, greatly reducing the need for memory storage.

The researchers tested it on real tasks: IMDb movie sentiment analysis and single-cell RNA sequencing. On both tasks, classical methods needed four to six orders of magnitude more memory to match the quantum system's performance.

Even if quantum computers never get faster, that memory advantage over classical computing is going to be hard to beat.
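The constant-memory streaming idea can be sketched in a few lines (purely illustrative; the actual Google/Caltech/MIT algorithm is far more sophisticated than this): each data point rotates a fixed-size simulated quantum register and is then thrown away, so memory never grows with stream length.

```python
# Sketch of the streaming model: each data point nudges a fixed-size
# quantum state and is discarded. Memory stays at 2**n amplitudes no
# matter how long the stream runs. Illustrative only.
import numpy as np

n = 6                          # register size: 2**6 = 64 amplitudes, fixed
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0

def ry(theta):
    """Single-qubit rotation about the Y axis."""
    return np.array([[np.cos(theta / 2), -np.sin(theta / 2)],
                     [np.sin(theta / 2),  np.cos(theta / 2)]], dtype=complex)

def apply_single_qubit(state, gate, qubit, n):
    """Apply a 1-qubit gate to `qubit` of an n-qubit state vector."""
    s = state.reshape([2] * n)
    s = np.moveaxis(s, qubit, 0)
    s = np.tensordot(gate, s, axes=([1], [0]))
    s = np.moveaxis(s, 0, qubit)
    return s.reshape(-1)

rng = np.random.default_rng(0)
for x in rng.normal(size=10_000):   # the "stream": each point seen once
    q = int(rng.integers(n))
    state = apply_single_qubit(state, ry(0.01 * x), q, n)

print(state.size)                   # 64 — memory never grew
print(round(float(np.linalg.norm(state)), 6))  # 1.0 — still a valid state
```

Ten thousand data points pass through, but the register never holds more than 64 numbers — that constant footprint, versus a classical method that must retain (or repeatedly revisit) the data, is the shape of the memory advantage the paper reports.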
Information encoded into qubits spreads across quantum systems and becomes unreadable. Until recently, the honest answer to "can you get it back" was no.

UC Irvine researchers published a paper this month showing that quantum scrambling is reversible. A precisely tuned intervention can drive the system backward and refocus the dispersed information.

I come to this as an engineer, not a physicist. The mechanism exists and the math checks out. What remains is building hardware precise enough to execute it at scale. More in my blog post. https://lnkd.in/eb23uXyF

#QuantumComputing #QuantumInformation #QuantumHardware #Engineering #QuantumErrorCorrection #STEM