A quantum leap for open data?
We’re all familiar with the phrase “a quantum leap”, which in non-physics terms is usually defined as “an abrupt change, sudden increase, or dramatic advance”. In other words, we typically use it to describe dramatic improvements in the capability of an existing system.
We are now on the cusp of just such a quantum leap - in terms of computing power itself. The advent of quantum computing will be upon us in the coming years; exactly how soon is still up for debate. A simple definition of a quantum computer is that it “uses quantum mechanical properties to perform its calculations”. Quantum computing and quantum mechanics are very complex subjects, so I recommend checking out some of the many primers online, such as this one. The bottom line is that by harnessing the power of quantum bits (qubits) when performing calculations, quantum computers are expected to be many times more powerful than the most powerful supercomputers we have in operation today.
Recent investments in quantum computing research
Investment in quantum computing research has been ramping up, and we can expect this to increase significantly in the next few years. Recently the US government passed the National Quantum Initiative Act into law. This act “provides $1.2 billion over the next five years to establish a coordinated framework between federal research labs, academia and the private sector to advance QIS technologies”. Other countries, such as China, France and the UK, have been developing their own quantum computing research capabilities.
Much of the research to date has been performed by the private sector. Several major technology companies, such as Microsoft, IBM and Google, have advanced quantum research programs. Other companies, such as D-Wave Systems, focus solely on quantum computing. A useful overview of the main private sector research efforts can be found here.
What quantum applications could government avail of?
So where will quantum computers deliver those quantum leaps in capability? Several key scenarios are currently envisioned: tackling issues related to cybersecurity, building advanced machine learning capabilities, improving logistics, accelerating the search for new drugs and cures for disease, and tackling climate change and energy consumption, to name but a few.
How might government leverage the power of quantum computing? At the Federal level, much of the research focus appears to be centered on cybersecurity and encryption concerns related to national security.
However, there is another key application that could prove extremely useful to government at all levels: solving optimization problems. Quantum computers are expected to have remarkable capabilities when it comes to using machine learning to solve incredibly complex optimization problems, far in excess of what is possible today in classical computing environments.
Another important distinction between classical and quantum computing environments is that the output from a quantum system is probabilistic, not deterministic like that of a classical computer. By this we mean that if you run a program with the same inputs on a classical computer, the output will always be the same; with a quantum computer, this will not always be the case. For optimization problems, this could in fact be a very desirable property. A new class of quantum algorithms will need to be developed to take advantage of this probabilistic capability.
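To make the probabilistic point concrete, here is a minimal classical simulation (plain Python, no quantum hardware or library assumed) of measuring a qubit held in an equal superposition. The "program" and its inputs are identical on every run, yet the measured output varies from shot to shot, following the probabilities given by the qubit's amplitudes:

```python
import random

def measure_superposed_qubit(rng):
    """Simulate measuring a qubit in the equal superposition
    (|0> + |1>) / sqrt(2). By the Born rule, each outcome occurs
    with probability |amplitude|^2 = 0.5."""
    amplitude = 2 ** -0.5          # 1/sqrt(2) for both |0> and |1>
    p_zero = abs(amplitude) ** 2   # = 0.5
    return 0 if rng.random() < p_zero else 1

rng = random.Random(42)  # seeded only to make the demo repeatable
# Same "program", same inputs -- but the outputs form a distribution:
shots = [measure_superposed_qubit(rng) for _ in range(1000)]
print(shots.count(0), shots.count(1))  # roughly a 50/50 split
```

Real quantum hardware behaves this way natively; here the randomness is only simulated, but the point stands: a quantum program is characterized by the distribution of its outputs, not a single deterministic answer.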
What role could Open Data play?
So where does open data fit into all of this? Let’s approach that question by instead asking “what optimization problems might governments have?”. At the Federal level, a government of the scale of the US government constantly faces highly complex optimization and logistical issues – from national defense to humanitarian relief to service provision on a massive scale. Instead, let’s think about scenarios much closer to our everyday lives, and how local governments might tap into the power of quantum computing. Municipal governments across the US are implementing ‘smart cities’ technologies to a greater or lesser extent. This trend will likely continue as new IoT technologies, combined with the rollout of 5G networks, spur more real-time data collection on conditions within our urban environments, such as traffic and hyper-local weather conditions.
Since we’re talking about quantum mechanics, let’s perform a thought experiment. Imagine a single computing environment powerful enough to consume real-time data from every vehicle traveling through a metropolitan region, combined with weather data and data on other independent variables that impact traffic (accidents, planned road closures, etc.), so that every traffic signal sequence in the city and its surrounding area could be automatically adjusted to keep traffic moving optimally. This hypothetical scenario could potentially be achievable within a quantum computing environment using quantum algorithms.
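As a toy illustration of how such a problem might be posed for a quantum annealer (a sketch with invented numbers, not a real traffic model), signal coordination can be written as a QUBO – a quadratic objective over binary variables, which is the native input format of annealing hardware like D-Wave’s. Here each variable says whether a signal gives the corridor a green in the current cycle, and the weights crudely reward adjacent signals agreeing (a “green wave”). A classical brute-force solver over the same objective:

```python
from itertools import product

# Hypothetical example: 4 intersections along one corridor.
# x[i] = 1 means signal i gives green to the corridor this cycle.
# QUBO objective: minimize sum(Q[(i, j)] * x[i] * x[j]).
# Negative off-diagonal weights reward adjacent signals agreeing;
# diagonal terms model the cost of holding cross-traffic.
# All numbers are invented for illustration.
Q = {
    (0, 0): 1, (1, 1): 1, (2, 2): 1, (3, 3): 1,
    (0, 1): -3, (1, 2): -3, (2, 3): -3,
}

def qubo_energy(x, Q):
    """Energy of a binary assignment x under QUBO matrix Q."""
    return sum(w * x[i] * x[j] for (i, j), w in Q.items())

# Classical brute force is fine for 4 variables (16 assignments);
# quantum annealers target this same problem form at scales where
# exhaustive search becomes infeasible.
best = min(product([0, 1], repeat=4), key=lambda x: qubo_energy(x, Q))
print(best, qubo_energy(best, Q))  # -> (1, 1, 1, 1) -5
```

The city-scale version of this – thousands of signals, real-time inputs – is exactly the kind of combinatorial explosion where quantum optimization is hoped to help.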
Governments across the US are sitting on massive troves of data, much of it yet to be shared with the public via open data. I have firsthand experience of this from my time running a large municipal open data program. Apart from the sheer volume of data, governments typically have data going back many years, which can improve models and future projections. Government handles some of the most complex optimization problems there are, so leveraging this new quantum capability could be a natural next step in the evolution of open data.
Open data is not limited to government; the private sector has a crucial role to play here. A great example is the Amazon Open Data team, which works with governments and private research agencies across the world to release very large open datasets on critically important topics, enabling researchers to avail of both the datasets and the underlying advanced analytics computing environments. Analyzing massive datasets like these within a quantum computing environment seems like a natural extension. Indeed, there could be interesting public-private partnership opportunities around open data and quantum.
But quantum is still years away…
If actual physical quantum computers are still years away from becoming a mainstream reality, why should we care now? It’s a fair question. Like all emerging technologies, and especially ones that have the promise of providing breakthrough capabilities, it is worthwhile to assess what impacts they could have on the systems and capabilities we have today.
Here are a couple of reasons why I think government at all levels could start to include quantum computing in their future thinking today.
First, since quantum computers don’t yet have storage that can take advantage of quantum mechanical properties (so-called qRAM), current cloud computing platforms will act as a storage bridge between classical and quantum computing for some time. This provides yet another reason for governments to continue to push their data into the cloud (publicly as open data, or hosted for internal purposes), positioning their data assets to take advantage of new capabilities like quantum in the coming years. This is an opportunity for government to be in the vanguard of this quantum technological advance.
Second, governments can factor these future developments into long-term decisions regarding investments in data analysis, the implementation of ‘smart cities’ technologies, and so on. The ability to better manage very difficult optimization problems could provide the impetus to design and release a new generation of very large open datasets targeted at those scenarios, and help governments think about how to better leverage the huge volumes of ‘smart cities’ data that could be analyzed in real-time. Let’s think of it as ‘Quantum Open Data’.
This is something government can begin to experiment with today. Microsoft offers a Quantum Development Kit (and a new quantum-focused programming language, Q#) that lets developers run programs in a simulated quantum environment in the cloud. Similarly, IBM, Google and D-Wave all offer access to their quantum systems and simulators.
As with all massive technological advances, with great power comes great responsibility. Quantum algorithms will add another layer of complexity to the existing issues around the use of algorithms in our decision-making processes. Similarly, experts in privacy and other concerns related to data collection at scale will need to factor quantum into their thinking in the years to come. Acknowledging these challenges, State and Local governments could begin by focusing on a narrow set of optimization problems, harnessing the power of quantum computing to better solve everyday logistics and mobility problems. Learning from this narrow set of applications, other exciting quantum applications will be invented to help our government and society, with open data at the core.
Will quantum computing result in a quantum leap in how we will leverage the power of open data? Like Fox Mulder, I want to believe…
Note: Special thanks to Kate Garman (Tech Policy Advisor to Mayor Durkan @ City of Seattle) and Richard Todd (Data Scientist @ City of Seattle Innovation Team) for their input into this article.