UC Irvine physicists discover method to reverse ‘quantum scrambling’ - UC Irvine News

Physicists at the University of California, Irvine, recently published a study in Physical Review Letters detailing a method to reverse quantum scrambling, a process that causes information loss in quantum systems and was previously thought to be irreversible.

To understand this, start with the fundamental unit of a quantum computer: the qubit. While classical computers rely on bits that store data as either a 0 or a 1, a qubit can store information as a 0, a 1, or both at once through superposition. Researchers encode data into these individual qubits to perform calculations.

As qubits exchange information within a quantum chip, a challenge emerges. When information is locally encoded into specific qubits, their interactions cause that data to spread across many other qubits. As complexity increases, the data diffuses so widely that it effectively disappears. This spreading is called quantum scrambling, and it prevents the system from retrieving information or completing calculations.

The physicists analyzed how this scrambling emerges and found a method to preserve data that would typically vanish. By showing how to reverse the scrambling process, they demonstrated that the original encoded information is not permanently lost and can be retrieved.

This development points to a potential pathway around one specific source of information loss, aiding the design of more reliable quantum hardware. It does not mean that all error correction challenges in quantum computing have been solved, but rather that this single mechanism of data dispersion is now known to be reversible.

#QuantumComputing #QuantumTechnology #QuantumScience #Qubits #QuantumScrambling #QuantumInformation #Physics https://lnkd.in/emiPtw6j
UC Irvine Physicists Reverse Quantum Scrambling Process
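As a minimal illustration of why reversal is possible in principle: ideal quantum evolution is unitary, so applying the inverse evolution undoes the spreading exactly. The sketch below uses plain NumPy, with a random unitary standing in for real scrambling dynamics; it is not the UC Irvine protocol, just a demonstration that scrambled amplitudes remain recoverable. It encodes information in one qubit of a three-qubit register, scrambles it, and recovers it.

```python
# Minimal sketch: unitary "scrambling" is undone by the inverse unitary.
# A random unitary stands in for chaotic dynamics; NOT the paper's protocol.
import numpy as np

rng = np.random.default_rng(0)
n_qubits = 3
dim = 2 ** n_qubits

# Encode information locally: qubit 0 in superposition, others in |0>.
plus = np.array([1, 1]) / np.sqrt(2)
zero = np.array([1, 0])
psi0 = np.kron(np.kron(plus, zero), zero).astype(complex)

# Random unitary via QR decomposition of a random complex matrix.
z = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
U, _ = np.linalg.qr(z)

scrambled = U @ psi0                # information spreads over all qubits
recovered = U.conj().T @ scrambled  # the "time-reversed" intervention

fidelity = abs(np.vdot(psi0, recovered)) ** 2
print(f"recovery fidelity: {fidelity:.6f}")  # -> 1.000000 (exactly reversible)
```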
How Sensitive Are The Computers Of The Future? - Eurasia Review

A team of researchers, including physicists from Freie Universität Berlin, recently published a study in Nature Physics establishing precise limits on near-term quantum computing. They found that current noisy systems can perform complex calculations only to a limited extent, fundamentally restricted by how accurately their individual operations function.

Conventional computers process information in classical bits, each representing a zero or a one. Quantum computers run on qubits, which can exist as a zero, a one, or a superposition of both. Superposition allows scientists to manipulate many states at once, providing the power to solve problems classical computers cannot, such as factorizing very large numbers.

However, quantum systems face a severe sensitivity problem. They are the Goldilocks of technology; everything must be exactly right. The slightest external disruption causes decoherence, a loss of quantum information that nullifies the system's computing advantage. To deal with this, scientists explore the near-term regime, accepting that errors will occur while running systems as reliably as possible despite the noise. The study found this approach is dictated by gate fidelity, which measures how accurately a quantum gate performs its operation compared to an ideal, noise-free version.

What this does and does not mean: the study does not mean near-term quantum computing is a dead end. Instead, it provides a theoretical limit for these systems. It shows that if engineers push gate fidelity high enough, imperfect quantum computers can still execute large, practically relevant calculations, offering a specific direction for future hardware development.

#QuantumComputing #QuantumTechnology #QuantumScience #Qubits #GateFidelity #Decoherence #NaturePhysics https://lnkd.in/eszhXeTQ
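For readers unfamiliar with the quantity, gate fidelity can be made concrete with a toy calculation. The sketch below compares a slightly over-rotated X gate against the ideal one using the textbook average-gate-fidelity formula for unitaries; the 1% over-rotation is an invented example, not a figure from the study.

```python
# Toy gate-fidelity calculation: ideal X gate vs. a 1% over-rotated X gate.
# F_avg = (|Tr(U^dag V)|^2 + d) / (d^2 + d) is the standard average gate
# fidelity between two unitaries of dimension d.
import numpy as np

def average_gate_fidelity(U_ideal: np.ndarray, U_noisy: np.ndarray) -> float:
    d = U_ideal.shape[0]
    overlap = abs(np.trace(U_ideal.conj().T @ U_noisy)) ** 2
    return (overlap + d) / (d ** 2 + d)

def rx(theta: float) -> np.ndarray:
    """Rotation about the X axis by angle theta."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -1j * s], [-1j * s, c]])

ideal = rx(np.pi)          # a perfect X gate (up to global phase)
noisy = rx(np.pi * 1.01)   # 1% coherent over-rotation error
print(f"F_avg = {average_gate_fidelity(ideal, noisy):.6f}")
```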
Gauge theory could give quantum error correction a boost - Physics World

Researchers used gauge theory to reduce the number of qubits needed for quantum error correction. Scientists from IBM Quantum and the University of Sydney showed how widespread quantum information can be measured using only local checks, significantly lowering overhead.

Unlike classical bits (0 or 1), quantum computers use qubits, which can exist in a combination of both states at once and become entangled. These properties allow quantum algorithms to solve certain problems faster. However, qubits are highly sensitive to environmental disturbances. This fragility introduces errors, making large-scale hardware difficult to build.

To protect data, researchers use fault-tolerant error correction, storing the information of one logical qubit across many fragile physical qubits. Standard approaches require massive numbers of extra qubits to perform operations and run checks, creating a huge resource cost.

This new work addresses that cost using gauge theory, a physics framework in which local interactions connect distant parts of a system. Instead of running complex global measurements, the researchers add helper qubits to break the process into small, local checks; combining these local outcomes reconstructs the overall result. The extra qubit requirement grows only slightly faster than the measurement size, bypassing the severe overhead of earlier methods.

This means scientists have a flexible approach for a wide class of error-correcting codes. It does not mean the physical sensitivity of qubits is solved or that large-scale quantum computers are ready to build. Rather, it provides a theoretical framework that reduces resource barriers, accelerating the development of practical hardware.

#QuantumComputing #QuantumTechnology #QuantumScience #Qubits #QuantumErrorCorrection #GaugeTheory #FaultTolerance https://lnkd.in/erF5jH6x
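As a heavily simplified, classical cartoon of "combining local outcomes reconstructs the overall result": a global parity over many bits equals the sum modulo 2 of small local parity checks. The real construction is quantum and gauge-theoretic, mediated by helper qubits; everything in the sketch below is an illustrative assumption that only shows how local check outcomes compose.

```python
# Classical cartoon of the local-checks idea: a global parity reconstructed
# from weight-2 local checks. The actual scheme is quantum; this only shows
# how local outcomes combine into a global result.
import numpy as np

rng = np.random.default_rng(1)
data = rng.integers(0, 2, size=8)      # stand-ins for data qubits

# Global check: parity of all 8 bits at once (expensive to measure directly).
global_parity = int(np.sum(data) % 2)

# Local alternative: parity checks on adjacent pairs, combined afterwards.
local_checks = [(data[i] + data[i + 1]) % 2 for i in range(0, 8, 2)]
reconstructed = int(sum(local_checks) % 2)

print(global_parity, reconstructed)    # the two always agree
```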
Recent research highlights the potential of "poor man's Majoranas" (minimal Kitaev chains composed of two quantum dots coupled by a superconductor) as sensitive quantum spin probes. Unlike long chains, which offer topological protection, these short chains are highly responsive to local perturbations, enabling the detection and characterization of nearby quantum spins through their spectral signatures. This approach turns the vulnerability of unprotected Majorana modes into a feature: it offers a practical tool for quantum sensing and suggests new experimental strategies for manipulating quantum states in non-ideal systems, well before robust topological quantum computing platforms are realized.
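The sensitivity described here can be seen in a back-of-envelope model. Below is a sketch, with invented parameter values, of the many-body spectrum of a two-site Kitaev chain: at the sweet spot mu = 0, t = Delta, the even- and odd-parity ground states are degenerate, and a small local shift in mu (such as a nearby spin might induce) splits them, which is the kind of spectral signature a probe could read out.

```python
# Two-site Kitaev chain ("poor man's Majoranas"), many-body spectrum.
# Even-parity basis {|00>, |11>} is coupled by pairing Delta; odd-parity
# basis {|10>, |01>} by hopping t. At mu = 0, t = Delta the two parity
# sectors' ground states are degenerate; perturbing mu splits them.
import numpy as np

def ground_state_splitting(mu: float, t: float, delta: float) -> float:
    H_even = np.array([[0.0, delta], [delta, -2.0 * mu]])
    H_odd = np.array([[-mu, -t], [-t, -mu]])
    e_even = np.linalg.eigvalsh(H_even).min()
    e_odd = np.linalg.eigvalsh(H_odd).min()
    return abs(e_even - e_odd)

t = delta = 1.0
print(ground_state_splitting(0.00, t, delta))  # 0.0 at the sweet spot
print(ground_state_splitting(0.05, t, delta))  # nonzero: the probe response
```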
In the pursuit of powerful and stable quantum computers, researchers at Chalmers University of Technology, Sweden, have developed the theory for an entirely new quantum system. #Engineering #Computing #Research
A major obstacle in quantum computing may be more reversible than we thought.

One of the persistent challenges in building reliable quantum computers is quantum scrambling, a process where information encoded in qubits spreads across a system and becomes effectively lost. It is a fundamental barrier to performing reliable calculations at scale.

New research published in Physical Review Letters by physicists at the University of California, Irvine reveals that scrambled quantum information may not actually be destroyed. Instead, it disperses in extraordinarily complex ways across many interacting particles, and under the right conditions, it can be recovered.

Here is why this matters: the underlying laws governing quantum systems are, in principle, reversible. The research team demonstrated that with extremely precise control, a carefully tuned intervention can effectively drive a quantum system backward, allowing dispersed information to refocus near its original location.

The key finding is that this reversibility appears to be a universal property across many quantum systems, including quantum computers. That universality is what makes this research particularly significant. It suggests that the path to error resilience may not require avoiding scrambling entirely, but rather learning to undo it.

There is an important caveat. Reversing scrambling demands an exceptionally fine level of system control, which remains a significant engineering challenge, as the sketch below illustrates. But knowing that recovery is theoretically possible gives the field a concrete target to work toward.

This is the kind of foundational research that quietly reshapes the trajectory of an entire technology.

#QuantumComputing #QuantumPhysics #DeepTech #TechInnovation
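The precision caveat can be made quantitative in a toy model. In the sketch below, a state is evolved forward under a random Hermitian Hamiltonian H and then "reversed" under a slightly miscalibrated H + eps*V; recovery fidelity degrades as the calibration error eps grows. H, V, the dimension, and the evolution time are all invented stand-ins, not the paper's model.

```python
# Imperfect time reversal: reverse under H + eps*V instead of H and watch
# recovery fidelity fall as the miscalibration eps grows.
import numpy as np

rng = np.random.default_rng(2)
dim = 16  # 4 qubits

def random_hermitian(d: int) -> np.ndarray:
    a = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    return (a + a.conj().T) / 2

def evolve(H: np.ndarray, t: float) -> np.ndarray:
    """Return exp(-i H t) via eigendecomposition."""
    vals, vecs = np.linalg.eigh(H)
    return vecs @ np.diag(np.exp(-1j * vals * t)) @ vecs.conj().T

H, V = random_hermitian(dim), random_hermitian(dim)
psi0 = np.zeros(dim, dtype=complex)
psi0[0] = 1.0

forward = evolve(H, t=5.0) @ psi0
for eps in [0.0, 0.001, 0.01, 0.1]:
    back = evolve(H + eps * V, t=-5.0) @ forward  # imperfect reversal
    print(eps, abs(np.vdot(psi0, back)) ** 2)     # fidelity drops with eps
```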
Harnessing quantum computing for protein analysis offers a significant speed advantage. By encoding protein models into quantum algorithms, a quantum computer can process atomic and electronic interactions simultaneously, drastically accelerating the calculation of the final properties of interest. For instance, a protein's energy can be estimated by encoding the relevant structural information into qubits and reading out the result directly, showcasing the power of quantum computation in scientific research. #QuantumComputing #ComputationalChemistry #ProteinAnalysis #ScientificResearch #Technology
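A toy version of "encode the information into qubits and read out an energy": in the statevector picture, the energy is the expectation value <psi|H|psi> of a Hamiltonian written as a sum of Pauli terms. The two-qubit Hamiltonian and coefficients below are invented for illustration; real molecular Hamiltonians come from quantum chemistry packages, and real devices estimate the expectation by repeated measurement.

```python
# Toy energy readout: expectation value of a made-up 2-qubit Pauli-sum
# Hamiltonian for an encoded trial state.
import numpy as np

I = np.eye(2)
Z = np.diag([1.0, -1.0])
X = np.array([[0.0, 1.0], [1.0, 0.0]])

# Invented Hamiltonian in Pauli-sum form, H = sum_k c_k P_k.
H = (-1.0 * np.kron(Z, I)
     - 0.5 * np.kron(I, Z)
     + 0.3 * np.kron(X, X))

# Encode a trial state into the two qubits (a Bell state here).
psi = np.array([1.0, 0.0, 0.0, 1.0], dtype=complex) / np.sqrt(2)

energy = np.real(np.vdot(psi, H @ psi))
print(f"<H> = {energy:.4f}")  # -> 0.3000 for this state and Hamiltonian
```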
In a new study, researchers including Yihui Quek and Armando Angrisani from EPFL show how the noise in today's quantum computers limits how much work their circuits can really do, and how this limit affects training and simulation. EPFL School of Computer and Communication Sciences Freie Universität Berlin Massachusetts Institute of Technology Helmholtz-Zentrum Berlin Fraunhofer Heinrich Hertz Institute HHI Université Claude Bernard Lyon 1 https://lnkd.in/eehHYeWn
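A back-of-envelope way to see such a limit: if each gate retains roughly a fraction F of the signal, a circuit of N gates retains roughly F^N, so the usable gate count is capped by the gate fidelity. The sketch below runs that arithmetic for a few fidelities; the halving threshold is an arbitrary illustration, not a quantity from the paper.

```python
# Rough depth limit from gate fidelity: the signal decays like F^N, so the
# number of gates before it halves is log(0.5) / log(F).
import math

for F in [0.99, 0.999, 0.9999]:
    n_max = math.log(0.5) / math.log(F)
    print(f"fidelity {F}: ~{n_max:,.0f} gates before the signal halves")
```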
The quantum computing timeline just shifted. A new breakthrough means we might need thousands of qubits, not millions, to build a useful machine.

For years, the consensus was clear: a practical quantum computer would need millions of physical qubits, because a single stable 'logical' qubit capable of real work was assumed to require around a thousand physical ones. The engineering challenge was staggering.

Two recent advances are changing that math. First, researchers at Caltech and startup Oratomic demonstrated that neutral-atom qubits (atoms held in place by lasers) can form a logical qubit from just five physical ones, a massive reduction from the roughly one thousand previously assumed. Second, a team at ETH Zurich showed how to make quantum operations on these atoms more error-resistant by using the geometry of the atoms' motion itself, which is more stable than trying to perfectly time laser pulses.

Together, this means the total qubit requirement for a usable machine could drop from millions to the 10,000–20,000 range; the arithmetic behind that drop is sketched below. Caltech has already built arrays with over 6,000 of these neutral-atom qubits, demonstrating the scalability.

The implications are profound:
🔬 Drug discovery and materials science could accelerate dramatically.
💡 Energy grids and financial models could be optimized in new ways.
🔐 Our current cryptographic security needs a proactive rethink.

This isn't science fiction anymore. It's an engineering problem with a clearer, nearer path. What industry do you think will be transformed first by practical quantum computing?

#QuantumComputing #TechInnovation #FutureOfTech 𝐒𝐨𝐮𝐫𝐜𝐞: https://lnkd.in/gX5W3vNy
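Making the post's overhead arithmetic explicit (all numbers below are the post's claims or common rules of thumb, not established facts):

```python
# Physical-qubit requirement = logical qubits * physical qubits per logical.
logical_needed = 2_000            # an illustrative size for a "useful" machine
physical_per_logical_old = 1_000  # the older rule-of-thumb overhead
physical_per_logical_new = 5      # the claimed neutral-atom code overhead

print(logical_needed * physical_per_logical_old)  # 2,000,000: the old picture
print(logical_needed * physical_per_logical_new)  # 10,000: the claimed new one
```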
Information encoded into qubits spreads across quantum systems and becomes unreadable. Until recently, the honest answer to "can you get it back?" was no.

UC Irvine researchers published a paper this month showing that quantum scrambling is reversible: a precisely tuned intervention can drive the system backward and refocus the dispersed information.

I come to this as an engineer, not a physicist. The mechanism exists, and the math checks out. What remains is building hardware precise enough to execute it at scale. More in my blog post. https://lnkd.in/eb23uXyF

#QuantumComputing #QuantumInformation #QuantumHardware #Engineering #QuantumErrorCorrection #STEM
Quantum information might not be as fragile as we thought.

One of the persistent challenges in quantum computing is quantum scrambling, the process by which information encoded in qubits spreads across a system and becomes effectively lost. It is a fundamental obstacle to reliable quantum computation and data retrieval.

New research published in Physical Review Letters by physicists at the University of California, Irvine, offers a compelling insight: scrambled quantum information may not actually be destroyed. Instead, it disperses in highly complex ways across many interacting particles, and under the right conditions, that process can be reversed.

The key finding rests on a principle rooted in quantum mechanics: at the microscopic level, the laws governing particle interactions are time-reversible. The research team demonstrated that this reversibility extends to many quantum systems, including quantum computers. With extremely precise control, it may be possible to drive a system backward, allowing dispersed information to refocus near its origin.

Why this matters for the industry:
- Quantum error and information loss remain among the biggest barriers to practical quantum computing.
- If scrambling can be systematically reversed, it could open new pathways for preserving qubit coherence and improving computational reliability.
- The finding is described as a universal property, suggesting broad applicability across different quantum architectures.

This is still early-stage research, and the level of fine-tuned control required is significant. However, it represents a meaningful step in understanding how quantum information behaves and how we might protect it. Foundational science like this is what moves quantum computing from promise toward practice.

#QuantumComputing #QuantumPhysics #QuantumTechnology #Innovation
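To make "disperses in highly complex ways" concrete, a standard diagnostic of scrambling is the out-of-time-order correlator (OTOC): the growing commutator between an operator on one qubit evolved in time and a static operator on a distant qubit. The toy spin chain, couplings, and operator choices below are illustrative assumptions, not anything from the UC Irvine paper; the sketch only demonstrates the diagnostic itself.

```python
# OTOC on a small spin chain: ||[W(t), V]||^2 grows as information spreads
# from site 0 (where W acts) toward the far end of the chain (where V acts).
import numpy as np

I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.diag([1.0, -1.0]).astype(complex)

def site_op(op: np.ndarray, site: int, n: int) -> np.ndarray:
    """Embed a single-qubit operator at `site` in an n-qubit chain."""
    out = np.eye(1, dtype=complex)
    for i in range(n):
        out = np.kron(out, op if i == site else I2)
    return out

n = 5
# Toy non-integrable Hamiltonian: Ising couplings + transverse/longitudinal fields.
H = sum(site_op(Z, i, n) @ site_op(Z, i + 1, n) for i in range(n - 1))
H = H + sum(0.9 * site_op(X, i, n) + 0.5 * site_op(Z, i, n) for i in range(n))

vals, vecs = np.linalg.eigh(H)
def U(t: float) -> np.ndarray:
    return vecs @ np.diag(np.exp(-1j * vals * t)) @ vecs.conj().T

W0, V_end = site_op(X, 0, n), site_op(X, n - 1, n)
for t in [0.0, 1.0, 2.0, 4.0]:
    Wt = U(t).conj().T @ W0 @ U(t)                # Heisenberg-picture W(t)
    comm = Wt @ V_end - V_end @ Wt
    otoc = np.trace(comm.conj().T @ comm).real / 2 ** n
    print(f"t={t}: ||[W(t),V]||^2 = {otoc:.4f}")  # 0 at t=0, grows with t
```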