Cryptography - The Hidden Nightmare
Introduction
If we reflect on the pace of evolution of the technical estate over the last three years, we see a tremendous shift of emphasis towards hyperconnected business models that are increasingly reliant upon, and enabled by, XaaS. We have seen great performance enhancements across the estate: virtualisation in the server room, and virtualisation in the networks with the adoption of SDN. As a consequence, we see ever greater adoption of services delivered and consumed across hybrid on-prem and multi-cloud environments. However, amid this melee of innovation in the estate and in the business models leveraging value from it, we have not seen the same degree of evolution, or indeed innovation, in one of the critical enabling constructs: the cryptographic model.
Our increasingly hyperconnected world relies on cryptography. We need it to ensure trust in our digital assets, to assure the confidentiality and integrity of the assets that underpin the transactions in our daily lives. Cryptography is at the heart of the security of what we do today and central to what we will rely upon tomorrow in financial transactions, distributed blockchains, connected vehicles, critical infrastructure, IoT devices and those many things we haven’t thought of yet.
However, the cryptographic model upon which we rely was defined and designed decades ago to support a digital world that was largely located on premises. Since the cryptographic model was first established, we have seen the evolution of the internet era; indeed, we have probably evolved beyond the internet and are now living in the inter-cloud era, with complex individual cloud dependencies needed to support processes. As this hyperconnectivity continues to evolve with ever greater IoT adoption and enablement, we will move into a multi-cloud era that depends heavily on the logical allocation of network assets and cloud assets to deliver and consume services. In short, things are about to get a lot more complex, fast. Sudden and unpredictable cryptographic compromises or failures can leave organisations at risk; this is a major resilience issue, and risk leaders need to understand the problem. In most organisations, however, it is a hidden nightmare.
Discussion of the Problem Space
This hyperconnectivity evolution means that digital transactions that were historically simple to secure cryptographically have become far harder to protect. What was once a simple query against the in-house Oracle database has evolved into a dynamic, context-changing journey of data aggregation, disaggregation and reaggregation across multiple virtual environments, some owned by the entity and many by third parties. What's more, the dynamic nature of our own environment, with constant version and release challenges across devices, applications and storage, as well as across network and communications media, is a problem multiplied N times by the number of third-party environments and communities with whom we engage to deliver and consume services.
This dynamic nature of multiple third-party environments, and our increasing dependency upon them, creates a major problem for organisations seeking to provide surety, internally and externally, of the processes that underpin the products and services they deliver to their clients as X-as-a-Service. Take for example the delivery of remote condition monitoring services to a pipeline operator. As our service offering evolves to a cloud delivery model, our ability to provide surety of service delivery depends upon the predictability of cryptographic continuity for the process being delivered. However, we as the service provider must first understand the precise version and release states of the software, devices and infrastructure in the value chain delivering those services, in order to ensure that the cryptographic version and release state at each handshake is compatible.
The simple truth is that we have no sight of the version and release status of the cryptographic assets in use in that process; this gap in cryptographic lifecycle management represents a major resilience issue for all organisations. Imagine adopting a new version of Microsoft Office in only one of the five cloud environments required to deliver a service, where the new version requires configuration changes to a series of firewalls everywhere the data travels for that service. Failure to make the configuration changes could expose the process and client to exploitation, or the configuration misalignment could prevent the data transaction taking place at all. A plausible outcome is that the organisation is prevented from delivering the service to the client.
Here the problem gets worse, because resolving this use case requires us to identify quickly all the crypto assets associated with the digital assets involved in the service delivery. It demands that we understand the interoperability of the version and release (crypto lifecycle) states of the different crypto assets, acknowledging that such an asset could be a third-party device with obsolescent embedded cryptographic firmware, and then effect the appropriate configuration change or update roll-out. The obvious challenge is that, if we thought it was difficult to catalogue our digital assets in a meaningful way, it is even harder for cryptographic assets, where the algorithm is long-lived but the deployment specifics in the hyperconnected world are extremely short-lived. The scale of this cryptographic cataloguing challenge is very great: as our environments become ever more connected, the lifecycle-management problem is growing in a non-linear fashion, at speed.
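To make the cataloguing idea concrete, here is a minimal sketch of what a crypto asset inventory record and a lifecycle check might look like. The schema is entirely hypothetical: the field names, the example assets and the end-of-support rule are illustrative assumptions, not a standard.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical record for one cryptographic asset in a catalogue.
# Field names are illustrative assumptions, not a standard schema.
@dataclass
class CryptoAsset:
    name: str             # e.g. a service endpoint or device
    algorithm: str        # e.g. "RSA-1024", "AES-256-GCM"
    library: str          # the implementation carrying the algorithm
    owner: str            # internal team or third party
    end_of_support: date  # lifecycle horizon for this version/release

def flag_lifecycle_risks(assets, horizon):
    """Return assets whose version/release support ends before `horizon`."""
    return [a for a in assets if a.end_of_support < horizon]

# Invented example inventory: one obsolescent third-party device,
# one internally owned, currently supported endpoint.
inventory = [
    CryptoAsset("pipeline-telemetry gateway", "RSA-1024",
                "embedded firmware", "third party", date(2020, 1, 1)),
    CryptoAsset("payments-api endpoint", "AES-256-GCM",
                "OpenSSL 1.1.1", "internal", date(2023, 9, 11)),
]

at_risk = flag_lifecycle_risks(inventory, date(2022, 1, 1))
for asset in at_risk:
    print(asset.name, asset.algorithm)
```

Even this toy version shows the hard part: the value is not in the lookup, it is in populating and maintaining the records across environments you do not own.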
So far, we’ve looked at the prospect of technical failure, but the challenge doesn’t end there. Clear management of the relationships between public key management and distribution, asset and individual identity management, and digital certificate management is a core capability of crypto operations. The recent Microsoft Patch Tuesday worries about digital certificates bring a spotlight onto the crypto environment, not just from the technical-failure perspective discussed so far, but by reminding us of the malevolent intent of bad actors and their continuous efforts to find the next big thing.
There are a number of facets to the Crypto problem space:
· Cryptography is a scary subject for most people: hard maths and complex technology in complex environments. Crypto is hard, and we take an approach of “If it ain’t broke, don’t fix it”. The trouble is that we cannot be confident it ain’t broke.
· Ageing cryptographic algorithms can break suddenly, leaving applications at risk
· In many organisations, there is a low level of awareness of the cryptography upon which they rely: which applications are using what, how it is used, and how release statuses and configuration requirements are changing, broadly and rapidly. It’s difficult to manage digital certificates if you haven’t identified your digital assets
· New quantum-safe crypto standards are unlikely to be in place before 2022-2024. This introduces an added layer of complexity for organisations developing new XaaS service propositions, with architectural ramifications for decisions on quantum-safe infrastructure and applications
· Development communities are often blind to the details of cryptography. They may select unsafe algorithms or libraries, or use an inappropriate set of cryptographic parameters
· Hard-coded dependencies can make patching expensive and make embedded firmware opaque to the product development community
· Open-source crypto libraries used in implementations frequently lack genuine security review, overlooking countermeasures for more advanced threat vectors (e.g. side-channel attacks), because their extensive public exposure creates a false sense of security
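As a small illustration of the digital certificate point above, the sketch below (Python standard library only) flags certificates approaching expiry. The certificate records are hard-coded in the shape that `ssl.SSLSocket.getpeercert()` returns; in practice they would come from a live scan of the estate, and the hostnames and dates here are invented.

```python
import ssl
from datetime import datetime, timedelta, timezone

# Invented certificate records, in the dict shape getpeercert() returns.
# In a real catalogue these would be harvested from the live estate.
certs = [
    {"subject": ((("commonName", "api.example.internal"),),),
     "notAfter": "Jan 15 12:00:00 2022 GMT"},
    {"subject": ((("commonName", "portal.example.internal"),),),
     "notAfter": "Jun 30 12:00:00 2030 GMT"},
]

def expiring(certs, now, within_days=30):
    """Flag certificates whose notAfter falls within `within_days` of `now`."""
    horizon = now + timedelta(days=within_days)
    flagged = []
    for cert in certs:
        # cert_time_to_seconds parses the OpenSSL-style date string.
        expires = datetime.fromtimestamp(
            ssl.cert_time_to_seconds(cert["notAfter"]), tz=timezone.utc)
        if expires <= horizon:
            flagged.append((cert["subject"][0][0][1], expires))
    return flagged

now = datetime(2022, 1, 1, tzinfo=timezone.utc)
for name, when in expiring(certs, now):
    print(f"{name} expires {when:%Y-%m-%d}")
```

The check itself is trivial; the hard organisational problem, as the bullet list argues, is knowing which certificates (and which digital assets behind them) exist in the first place.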
Unfortunate but necessary technical detour
We have a further problem to address: the advent of quantum computing. This is a classic example of a known unknown: it is KNOWN that it is coming; it is UNKNOWN exactly when it will arrive. Quantum computers are likely to undermine elements of the cryptographic landscape in common use today. The quantum characteristic of superposition, harnessed in the computer’s qubit architecture, means that a qubit can represent zero and one simultaneously rather than one definite value.
We don’t need to go further into this: (a) it’s not needed, and (b) I’m approaching the limit of my knowledge. But this characteristic makes it possible to harness new quantum algorithms (notably Shor’s algorithm) whose polynomial-time acceleration of analysis can break much of the existing cryptographic landscape applied to very large user communities.
Most of us know a bit, namely, that there are two main families of cryptographic algorithms in use today, Symmetric Cryptography and Asymmetric Cryptography, (Public Key Cryptography).
Symmetric cryptography, of which AES is the best-known example, is less vulnerable to quantum attack, although Grover’s algorithm effectively halves its key strength. However, symmetric cryptography has scaling issues: it is less easily used in very large communities, such as for multiple transactions across a network.
It is in large use communities that asymmetric cryptography is widely deployed. All such algorithms rest on the assumed intractability, in a reasonable timescale, of three hard maths problems: the factorisation problem (RSA is the best-known example); the discrete logarithm problem (Diffie-Hellman is the original example); and the elliptic curve discrete logarithm problem (ECDSA is the best-known example). Shor’s quantum algorithm busts these open.
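A toy example makes the stakes concrete. The snippet below builds a textbook RSA key from tiny primes and then “breaks” it by factoring the modulus, which is exactly the step Shor’s algorithm makes tractable at real key sizes. The numbers are classic textbook values, chosen purely for illustration.

```python
# Toy RSA with tiny primes, purely to illustrate why Shor's algorithm
# matters: anyone who can factor n recovers the private key. Real keys
# use 2048-bit moduli, infeasible to factor classically but within
# polynomial-time reach of a large quantum computer running Shor.
p, q = 61, 53
n = p * q                       # public modulus: 3233
phi = (p - 1) * (q - 1)         # 3120, secret once p and q are discarded
e = 17                          # public exponent
d = pow(e, -1, phi)             # private exponent (modular inverse): 2753

msg = 65
cipher = pow(msg, e, n)           # encrypt with the public key
assert pow(cipher, d, n) == msg   # decrypt with the private key

# The "attack": factor n. Trivial trial division here, intractable
# classically at real sizes -- and precisely what Shor accelerates.
pf = next(i for i in range(2, n) if n % i == 0)
qf = n // pf
d_recovered = pow(e, -1, (pf - 1) * (qf - 1))
print(pow(cipher, d_recovered, n))  # recovers the plaintext: 65
```

The same logic applies to the discrete-log-based schemes: the public key hides a secret behind a hard maths problem, and Shor’s algorithm solves that problem efficiently.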
The image in Figure 1 below gives a view of the landscape.
The need to catalogue cryptographic assets, as outlined in the discussion of the problem space above, therefore becomes amplified. It’s critical to understand your critical crypto assets in order to prioritise their lifecycle management and so meet the quantum computing challenge.
Figure 1 Post Quantum Cryptography Problem-space
Tomorrow’s problem today
There is a temptation to ignore the crypto issue, particularly if quantum is being used as the catalyst for discussion, because there are clearly crocodiles closer to the canoe than the advent of quantum. It’s equally attractive to kick it into the long grass because crypto is scary and hard, and we might expose a lack of knowledge. But truly, crypto is tomorrow’s problem that has to be addressed today.
Notwithstanding the immediate imperative of crypto lifecycle management needed to support the increasingly dynamic multi-cloud service delivery environment, preparation for quantum-proofing will take a long time to accomplish. It took years for organisations to catalogue their data assets rigorously and to develop data models to support modern environments; many still haven’t achieved it. The cryptography cataloguing challenge is every bit as hard, possibly more so, because fewer of us know what we are doing. We all need to recognise it and get on with the job.
Organisations will have to prioritise the critical crypto assets to be quantum-proofed, and this can’t be done without a crypto catalogue and a new crypto model that addresses XaaS and quantum. Quantum-resistant crypto standards are unlikely to be available and established much before a 2022-2024 window. This means that organisations designing XaaS service propositions now, for deployment in a quantum world, face additional layers of complexity as they develop their software services. Intimate understanding and knowledge of appropriate crypto versions and crypto libraries will be needed in order to anticipate the new environment.
Why this matters more broadly
In an environment where many companies are seeing their markets become more and more cost-sensitive, and where their organisations are evolving to less centralised models, divisions in turn become more cost-sensitive, both to the relative cost of IS and security services and to the discretionary nature of that spend in relation to return on capital.
The cryptographic problem is ubiquitous across all businesses and divisions. It is also likely that almost all businesses have not yet started to address it. This means they have not quantified the incremental exposure, either across internal cross-divisional boundaries or, as their value propositions become more aligned to the delivery of X-as-a-Service, across the cross-entity boundaries of the exposure they face.
Together with the evolution of the service delivery model to multi-cloud (as discussed above), in an organisational phase where divisions have responsibility for provisioning their own risk capital, every division needs to reflect upon the implications of this hidden cryptographic nightmare for the exposure in its existing risk portfolio.
As an example, if a division has a service model that delivers remote condition monitoring services to a global oil and gas company, it is reasonable to assume that one of the critical divisional risks in the risk matrix is the inability to deliver the service for a period in excess of N days, with an assumed financial impact or risk capital provision of X. It is almost certain that every division underestimates the probability of such an interruption, because the likelihood of a failure consequent upon either malevolent crypto interference or crypto configuration (or other) failure has not been factored in.
This is really important because it highlights the material nature of the treatment of cyber risks: not as discrete cyber risks, but as the quantification of how much worse (how much larger the exposure) the risks an organisation is already worried about would become, were cyber vulnerabilities like this crypto problem exploited through error, omission or malevolent intent.
Opportunity for CISOs
In this increasingly cost-sensitive, decentralised divisional structure, there is a constant challenge of maintaining the perception, and the reality, of the relevance and value of central services during a period when the very act of decentralisation can create potential for factional rivalries and friction. Just as the organisation evolves, and just as the external value proposition evolves to an XaaS delivery model, the nature of central services, and the representation of their value, needs to evolve in order to maintain relevance. The key is to find a common theme or issue, in this case crypto lifecycle management, and to link it to a current and critical business outcome measure. For all businesses, ultimately, that is going to be return on capital and the management and reduction of risk capital provision.
This crypto challenge represents an opportunity to demonstrate the value of a central service that is “sold” to the internal customer community on the basis of its benefit to the divisional P&L from a risk-capital perspective, and directly upon the return on capital employed. It establishes the precedent of valuing services on the basis of risk capital and ROCE rather than price per user seat, and it highlights to the whole enterprise the benefits and economies of scale of adopting certain central services, which in this case would reduce the overall cyber risk profile of the broader enterprise. It would also establish a precedent, and potentially an appetite, for building an indigenous capability at divisional level for quantifying cyber risk in terms of its impact upon the risk profile of existing critical risks, and therefore upon risk capital, with the associated impact of divisional decisions upon capital returns. This would support the achievement of central security and capital-performance improvement goals, executed at divisional level. Genuinely an opportunity for a win-win.
Thought-provoking piece, thank you Peter Armstrong! Adding cryptography enablers to an organisation's IT asset register would certainly go a long way toward future preparedness. In the near term, I can see that would help ensure SSL certificates are properly in place and renewed on time!
“...valuing services on the basis of risk capital and ROCE rather than price to the user seat...” - nicely put, Peter.
Exponential connectivity, aka hyperconnectivity, is brittle in that every interface for XaaS is a potential failure point; one source of failure is the cryptographic model, another might be the transaction protocol version. Service assurance of this end-to-end, or rather meshed, hyperconnectivity is a challenge. Traditional network monitoring doesn't give you the visibility needed, as you are observing single points in the mesh. Incremental models try to give visibility of the cloud from an edge, e.g. with Cloud Access Security Brokers, but this isn't pervasive to the mesh of connectivity. Some vendors have models with many observation points in the cloud or mesh, but these will always be a partial view. Some style of agent that is part of the mesh connectivity itself is a possible solution, but then you end up in a circular argument of how to secure the agent ... oh yes, trying to use cryptography ... back to square one.
Maybe we need to shift gears, open up the systems and use smart agents instead (much like real-life spies) to detect early intrusions.