2-speed data strategy
Two-speed architecture has been a much-discussed concept in recent years. Its genesis lies in the rapid expansion of the digital space and the need to adopt agile methods and innovations for digital channels while keeping the underlying technologies and infrastructure stable and insulated from the same level of disruption. In essence, it boils down to bringing quick changes and enhancements to the front end of the technology food chain (the things visible to end users and customers) while allowing the back end to adapt and evolve more slowly. It is somewhat like the evolution of cars: we see rapid changes and innovations in gadgetry, design, cameras, and seating comfort, but the internal combustion engine that powers the car has remained pretty much the same for decades (only now gradually giving way to full electric :-)
The 2-speed framework can be applied to other aspects of technology management as well, for example in the world of data. Until a few years ago, data was viewed as merely a useful asset for reporting and customer analysis, and the focus was on accumulating it efficiently for MIS, trend analysis, credit scoring, and the like. The enterprise data warehouses (EDWs) that emerged in those days represented an effort to build a scalable data infrastructure to support these traditional needs, and were usually defined as strategic long-term programs spanning months or years. The world has changed tremendously since then. Data has become a critical and complex asset, the management of which is now a specialized subject in its own right, attracting ever richer levels of talent, technology, and professionalism. The demands on data come from multiple directions and carry varying degrees of urgency, criticality, and strategic application.

Addressing these challenges requires a 2-speed thinking style as well. Solving for it needs an overarching strategy, but not a single enterprise-wide data program; instead, it calls for a variety of short-, medium-, and long-term efforts to solve for different kinds of asks. For example, a data quality strategy for a corporate-wide regulatory requirement will differ from the approach needed for a short-term customer survey, which in turn will differ from what's required to train a machine learning algorithm for segmentation. Similarly, a program to install a state-of-the-art reporting tool need not get in the way of providing quick visual analytics on recent sales data, which may be delivered without going through many layers of integration and testing. The same applies to master data management programs, which can be tailored to varying levels of rigor depending on the use case.
Having said that, we must not lose sight of the fact that even as we pursue different, sometimes overlapping, data projects, they should all be part of one strategy with a clearly defined convergence goal.
Data practitioners and managers can no longer afford to be blinded by the traditional view of data, which masks the evolving demands from several fronts: digitization, compliance, marketing, and risk, to name a few. They should adapt their governance practices, including tools and technology, to address the enterprise's needs more effectively. Binary tool choices (one database vs. another, one reporting tool vs. another, SQL vs. NoSQL) no longer apply; instead, the co-existence of seemingly overlapping capabilities and programs with converging goals will be the order of the day. In short, the data strategy should be able to deliver capabilities at varying speeds, depending on the needs of the digital and data world we live in.
I look forward to your feedback and comments. Happy holidays!