Flip-Flop Qubits Could Revolutionize Quantum Computing
I admit I missed this important piece of research when it was announced in some online sources in September last year; only a recent background article in the December 2017 issue of Scientific American (SCIAM) brought it to my attention. Nonetheless, since it relates to probably the most exciting technology trend in computer architectures - quantum computing - I think it's still worth taking a closer look.
Now we have all certainly heard that quantum computers will, once available at reasonable scale, put an end to a lot of computational limitations as we know them today. Quantum computers are based on two quantum-mechanical effects called superposition and entanglement. Superposition allows elementary particles such as electrons or atoms to exist in multiple states at the same time, each with a given probability - they thereby represent quantum bits (qubits) in a superposition of states. Entanglement links multiple such qubits so that their quantum states depend on each other and can only be described together (you may have heard of Einstein's famous "spooky action at a distance" phrase, which also relates to entanglement). When such entangled qubits are combined in the right way within a quantum computer, the net result is that the number of states the machine can process in parallel grows exponentially with the number of qubits: n qubits span 2^n states. The SCIAM article notes that, conceptually, a quantum computer with about 300 qubits could run more computations in parallel than there are atoms in the whole universe (in case you're interested in that number, it's estimated to be somewhere in the range of 10 to the power of 80, though that of course doesn't factor in dark matter yet ... let's just settle for "a lot").
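As a quick back-of-the-envelope check of that claim (plain Python, nothing quantum-specific - Python's arbitrary-precision integers make it a one-liner):

```python
# n qubits in superposition span 2**n basis states; compare that for
# n = 300 against the article's rough estimate of 10**80 atoms in the
# observable universe.
n_qubits = 300
n_states = 2 ** n_qubits             # exact, thanks to Python's big integers
atoms_in_universe = 10 ** 80

print(len(str(n_states)))            # 91 digits, i.e. roughly 10**90
print(n_states > atoms_in_universe)  # True
```

So 2^300 is roughly 10^90 - about ten billion basis states for every atom in that estimate.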
Now that's a large number, and it is no wonder that a lot of major companies are investing in this research to unleash that compute power. IBM has the "IBM Quantum Experience" live on the internet, where one can test algorithms on an actual quantum computer. The Q Experience site initially featured a system with 5 qubits to run your own tests; this was later upgraded to 16 qubits, a 20-qubit system is now on the horizon, and the main research line even features a 50-qubit (!) prototype. No wonder there is also a lot of interest from intelligence agencies in driving quantum computing forward, since that immense compute power would readily allow cracking encryption algorithms - something far beyond the reach of today's compute architectures. This poses a severe threat to a key foundation of our use of the internet: the secure exchange of data, typically with PKI-based encryption. That in turn has given rise to yet another research area called post-quantum cryptography, which looks at encryption methods considered "quantum resistant".
Now one might still be tempted to think: well, this is research, it will take years and years to put this into real production. And there are still lots of challenges, both from a physics perspective and from a technology and engineering perspective, right? Not so fast ... and this brings me back to the piece of research I mentioned at the beginning. One of the big challenges is the distance between the individual particles representing the qubits, which in the designs so far was limited to something like 15 nanometers - any larger separation causes entanglement to fail, and that's the end of the quantum computer. This minuscule distance puts a high technical barrier on actually manufacturing such quantum chips. The research published by Guilherme Tosi, Andrea Morello, and their colleagues at the University of New South Wales in Australia, however, has produced an approach called "flip-flop qubits" which allows for separation distances of about 500 nanometers whilst retaining entanglement. The novel design uses both the nucleus and the electron of a phosphorus atom as a new type of qubit, where the binary states of '0' and '1' are defined via the spins of the electron and the nucleus relative to each other. This much larger separation distance could make large-scale manufacturing of such chips considerably easier and cheaper.
Advances like these show that there can be unexpected "quantum" leaps (pun intended) that drive technology forward at a surprising pace and allow us to overcome seemingly massive hurdles. Over time, such systems can be expected to hit the "real" world ... and since the Australian researchers envision being able to "put a million qubits on a square millimeter" with that design, such compute power would go beyond any comprehension.
Graphic by pixabay.com - published under CC0 Creative Commons license
Latest update: Google’s new 72-qubit quantum processor is the largest yet. Called Bristlecone, it was presented on 5 March at a meeting of the American Physical Society in Los Angeles. https://www.newscientist.com/article/2162894-googles-72-qubit-chip-is-the-largest-yet/
An eloquently succinct article with enlightening brevity. I enjoyed reading your summary of the latest developments in quantum computing. Thanks for sharing.
Thanks Peter! There is a lot of material out there already on quantum programming, and several specialized languages have been created - Microsoft even recently released Q# for quantum programming :-) It will require a new way of thinking and of breaking down algorithms to fit the quantum computing model, but I'm interested to see how this field evolves as more and more really hard problems are approached with quantum computing. I'm not sure whether we already know of classes of problems that will not be a great fit due to their very nature.
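To give a feel for that model: quantum programs are essentially sequences of gates acting on a vector of amplitudes, and entanglement falls out of just a couple of such gates. Here's a toy state-vector sketch in plain Python (purely an illustration of the gate model those languages expose - not Q#, and not how a real quantum device is programmed):

```python
# Build the two-qubit Bell state (|00> + |11>)/sqrt(2) by applying a
# Hadamard gate and then a CNOT, and inspect the measurement probabilities.
# The state is a list of four amplitudes [a00, a01, a10, a11].
import math

def apply_h_first_qubit(state):
    """Hadamard on qubit 0: mixes the |0x> and |1x> amplitudes."""
    s = 1 / math.sqrt(2)
    a00, a01, a10, a11 = state
    return [s * (a00 + a10), s * (a01 + a11),
            s * (a00 - a10), s * (a01 - a11)]

def apply_cnot(state):
    """CNOT with qubit 0 as control: swaps the |10> and |11> amplitudes."""
    a00, a01, a10, a11 = state
    return [a00, a01, a11, a10]

state = [1.0, 0.0, 0.0, 0.0]                   # start in |00>
state = apply_cnot(apply_h_first_qubit(state)) # H then CNOT
probs = [round(a * a, 3) for a in state]       # Born rule: |amplitude|^2
print(probs)  # [0.5, 0.0, 0.0, 0.5] - only 00 and 11, each 50%
```

The two qubits end up perfectly correlated - measuring one fixes the other - which is exactly the entanglement the article describes, and also why classical simulation blows up: the amplitude list doubles with every added qubit.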
Nice article Matthias, thanks. Looks like some great progress for this exciting technology - at least on the level of physics. Good news for all areas that require massive computing power, such as AI. It will be interesting to see how programming will be affected by this new way of computing - it will probably add a whole host of new challenges and complexities for software developers.