Top 10 things to consider when delivering a Big Data programme

Having delivered this topic at TM Forum Digital Transformation Asia, a few people have asked me to share the material again. I recently provided a very high-level view of what big data is and its purpose - available here. This time I'd like to share some of my experiences with Big Data delivery programmes. Below are the top 10 things you must consider when you or your business is delivering a big data transformation. Regardless of where you are in your journey - about to start, mid-way through delivery, or finished with your Big Data implementation and moving into the next phase - these 10 tips will help put you well on your way to a successful outcome.

1. Planning, Planning, Planning

The age-old adage "if you fail to plan, you plan to fail" holds true for your Big Data transformation - in fact, for any type of transformation, project or change. A key practice is to truly understand and visualise what your end goal looks like and work backwards, to ensure all of the big-ticket items have been considered. Stephen Covey highlights this point in "The 7 Habits of Highly Effective People": begin with the end in mind. Without understanding the end goal, it is difficult to get to where you want to go. Another good practice is to maintain different views of your overall plan for different audiences, from very detailed to very high-level.



2. The Agile vs Waterfall conundrum

As the world adopts a more agile way of working in every area of business, no single approach is the right one for everyone. For your particular environment, a blend of the two may be more fitting, and the degree of that blend will be specific to your organisation and industry. Waterfall suits linear activities such as infrastructure builds, while Agile suits products and outcomes that need to be developed iteratively, such as software or process automation. That said, there are real benefits to ensuring your teams work with an agile mindset, so that stakeholders see and receive the benefits of working products sooner rather than later.

 

3. The Input, process, and output machine 

When building a Big Data machine, the inputs are just as important as the outputs: if you don't put good-quality data in, you will not get good-quality data out. Moreover, if you do not diversify your data input types, your output will never be an extended 360 view - that is, an ever-growing single view of the customer built from every source that touches them.
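The garbage-in, garbage-out point can be sketched in a few lines. This is purely illustrative - the field names and the `validate` and `merge_into_360` helpers are invented for the example - but it shows the shape of the idea: reject poor-quality input records up front, then fold records from multiple sources into a single, growing customer view.

```python
# Illustrative sketch only: guard the inputs before building a 360 view.
REQUIRED_FIELDS = {"customer_id", "source", "timestamp"}

def validate(record: dict) -> bool:
    """Reject records missing required fields or carrying empty values."""
    return REQUIRED_FIELDS <= record.keys() and all(record[f] for f in REQUIRED_FIELDS)

def merge_into_360(views: dict, record: dict) -> None:
    """Fold a validated record from any source into one customer view."""
    view = views.setdefault(record["customer_id"], {"sources": set()})
    view["sources"].add(record["source"])
    view.update({k: v for k, v in record.items() if k not in ("customer_id", "source")})

views: dict = {}
records = [
    {"customer_id": "c1", "source": "crm", "timestamp": "2019-01-01", "email": "a@b.com"},
    {"customer_id": "c1", "source": "billing", "timestamp": "2019-01-02", "plan": "gold"},
    {"customer_id": "", "source": "web", "timestamp": "2019-01-03"},  # bad input, dropped
]
for r in records:
    if validate(r):
        merge_into_360(views, r)
```

The more diverse the validated sources feeding `merge_into_360`, the richer each customer's single view becomes - and the validation gate is what keeps bad inputs from poisoning it.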


4. Look after both your internal and external customers 

Without wider business and stakeholder buy-in, there is no adoption - and without adoption, you don't have a business case that justifies your big data build. Ensure all stakeholders are involved and that challenges are addressed early, with continuous engagement. With fluid communication, there is little room for assumptions to creep in.


5. Quality Assurance

Ensure you are building something that is future-proof and of high quality. Quality is not an act, it's a habit. Delivering Big Data can be very complex, and the checks and balances need to be set up from the outset. The much-anticipated end goal will not be reached if what is being built lacks the correct quality control at each stage of the build.


6. Data Governance 

It has become ever more important to govern the data that comes into your custody, ownership and enterprise data lake environment. Without the proper controls, your data is open to unauthorised usage from the inside. For that reason, the availability, usability, integrity and security of the data held inside your Big Data ecosystem must be properly governed. Peter Drucker rightly stated that "what gets measured, gets managed". Ultimately your Big Data machine must be held to a high standard of quality, as it will be the central point of your enterprise's information and intelligence. Data is an asset that you'll use to support and build your business further; if you don't properly manage it, it will very quickly become a liability.


7. Cyber & Information Security 

The number of cyber attacks is growing; a simple Google search will show you the most recent ones. As we move more and more data into enterprise data warehouses and data lakes, whether on premises or in the cloud, security becomes an ever greater challenge. Data protection laws around the world are moving to hold not only companies but also individuals accountable. Customer data must be protected - after all, it is the customer who has allowed us to collect their data so that we can ultimately serve them better. Cyber security is an area that demands a great deal of attention and focus.


8. Analysis, Analytics & Advanced Analytics 

Today more and more people are talking about data science and analytics, and it seems as though this field has come out of nowhere to take the world by storm. The truth is that it has been around for a very long time. Descriptive analytics describes everything that has happened up to the current point in time; predictive analytics estimates what is likely to occur based on past behaviour; and prescriptive analytics recommends the actions to take based on those predictions. This is of course very simplified, but the aim is to take the most advantageous course of action, balancing time, cost and quality.


9. Business As Usual

When delivering your big data programme and its subsidiary components, there will come a time when you complete your planned activities. At that point you must make sure that the business and all stakeholders are ready to take control of the big data machine you have built. What has worked time and time again is a phased handover, i.e. handing over small morsels of responsibility over time. This helps to reduce the risk of BAU teams becoming overwhelmed. Ultimately, the success or failure of a business programme such as a Big Data transformation is judged by its acceptance and ongoing use by users across the business and ecosystem.


10. Maintain the machine well … 

Just like anything else, an enterprise data lake decays over time if you do not maintain a certain level of hygiene. It is a living, breathing machine that needs to be well oiled, regularly maintained and optimised. If you look after it, it will look after you. After all, it is the asset from which you will drive a data-driven culture and capability, and with which you will ultimately navigate your business through an ever-expanding and increasingly competitive world.

If you are currently going through a Big Data transformation programme and need some guidance, feel free to reach out to me directly here on LinkedIn.

#bigdata #ai #machinelearning #iot #artificialintelligence #technology #datascience #business #analytics #deeplearning #data #programmemanagement #managementconsultancy #consulting



More articles by Sunny Nirala, CMgr FCMI
