Conquering Data Overload

We are drowning in new data, produced at a rapidly increasing rate every second. Every interaction between human and machine, or between machines, gives birth to new data. Organisations have also shown increasing willingness to leverage data from outside their enterprise firewalls, seeing it as a significant differentiator for competitive advantage.

With more information coming from more sources, many organisations struggle to manage data’s exploding volume, diversity and complexity. Not only is storing data costly, but conventional data management practices can strain IT resources.

Organisations need to adopt the right data strategy to conquer data overload.

1. Managing the Data Deluge (MDD): The data explosion is characterised by the four Vs: volume, variety, velocity and veracity. Technologies like HDFS have largely solved the storage and cost problem, but IT teams still spend a great deal of time on data collection, preparation, cleansing, harmonising and blending. The deluge has also created the need for a robust data management platform that supports heterogeneous data stores, and given the complexity involved there is no single-tool answer. The key topics below need to be carefully addressed to architect a robust and scalable data analytics platform.
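To ground the preparation problem before diving into those topics, here is a minimal sketch in Python, using pandas, of a typical cleansing and harmonising step; the file name and column mappings are hypothetical:

import pandas as pd

def harmonise_customers(path):
    df = pd.read_csv(path)
    # Standardise column names so downstream blends share one schema.
    df = df.rename(columns={"cust_id": "customer_id", "dt": "signup_date"})
    # Basic cleansing: drop exact duplicates and rows missing the key.
    df = df.drop_duplicates().dropna(subset=["customer_id"])
    # Harmonise types: parse dates, strip stray whitespace from the key.
    df["signup_date"] = pd.to_datetime(df["signup_date"], errors="coerce")
    df["customer_id"] = df["customer_id"].astype(str).str.strip()
    return df

clean = harmonise_customers("customers_raw.csv")
print(clean.head())

Even a step this small tends to be rewritten for every new source, which is exactly where IT time disappears.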

Architecture: A forward-looking reference architecture that not only addresses today's business problems but also provides the flexibility and capabilities to solve tomorrow's.

Data processing: Consider diverse ingestion methods so that disparate data can be sourced in a loosely coupled fashion. On one hand, traditional ETL is needed to handle structured bulk data; on the other, tools supporting enrichment processes such as NLP and ontology modelling are needed to process social, device and interaction data. A sketch of both paths follows below.
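As a small illustration of the two paths, the sketch below pairs a bulk, ETL-style loader with a lightweight enrichment step for unstructured text; the regex tagger is only a stand-in for a real NLP or ontology-modelling component:

import csv
import re

def ingest_bulk(path):
    # Traditional ETL-style path: structured rows, loaded in bulk.
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

PRODUCT_TERMS = re.compile(r"\b(mortgage|card|loan)\b", re.IGNORECASE)

def enrich_text(record):
    # Enrichment path for social posts, chat logs and device events;
    # stands in for NLP steps such as entity extraction or ontology tagging.
    text = record.get("text", "")
    record["tags"] = sorted({m.lower() for m in PRODUCT_TERMS.findall(text)})
    return record

print(enrich_text({"source": "social", "text": "Great rate on the new mortgage!"}))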

Data storage: The platform should be able to store hybrid data types in the same repository so they can interact with one another for maximum value creation. How do we govern usage, access, processing and consumption patterns? How do we address the differing security needs of external versus internal data? One toy illustration follows below.
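A toy sketch of that governance idea, using SQLite purely for illustration: every record carries origin and sensitivity metadata, and a simple rule decides who may read what. The schema, labels and access rule are all invented:

import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE records (
        id INTEGER PRIMARY KEY,
        payload TEXT NOT NULL,       -- JSON blob: structured or semi-structured
        origin TEXT NOT NULL,        -- 'internal' vs 'external'
        sensitivity TEXT NOT NULL    -- e.g. 'public', 'restricted'
    )
""")
conn.execute(
    "INSERT INTO records (payload, origin, sensitivity) VALUES (?, ?, ?)",
    (json.dumps({"customer_id": "42", "churn_score": 0.8}), "internal", "restricted"),
)

def query_for(role):
    # Toy access rule: only analysts may read restricted records.
    allowed = ("public", "restricted") if role == "analyst" else ("public",)
    marks = ",".join("?" * len(allowed))
    rows = conn.execute(
        "SELECT payload FROM records WHERE sensitivity IN (%s)" % marks, allowed
    ).fetchall()
    return [json.loads(r[0]) for r in rows]

print(query_for("analyst"))  # sees the restricted record
print(query_for("guest"))    # sees nothing here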

2. Minimum Viable Insight (MVI): Minimum viable insight is the ability to extract maximum value from existing datasets in the shortest possible turnaround time. This is only possible with a test-and-learn sandbox and the right data science skills. Data scientists who can discover and produce new business insights on a short turnaround (as short as a few hours) are hard to find, and organisations struggle to assemble the right technology and domain expertise to unlock hidden opportunities and insights. Below are two key building blocks for better MVI.

Data & insights: Integrating and blending disparate data sources is a must-have capability for gaining new insights. The business needs a quick sandbox setup for test and learn so that insights can be turned into actions faster. Continuously orchestrating insight generation at the speed of business hypotheses is also a challenge, so analytics delivery methods deserve careful thought. A minimal sandbox sketch appears below.
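A minimal sandbox sketch in pandas, blending a hypothetical internal CRM extract with an external sentiment feed to test a first hypothesis; all columns and values are invented:

import pandas as pd

crm = pd.DataFrame({
    "customer_id": ["a1", "b2", "c3"],
    "monthly_spend": [120, 45, 300],
})
sentiment = pd.DataFrame({
    "customer_id": ["a1", "c3"],
    "sentiment": [-0.6, 0.9],
})

# Blend on the shared key; keep every CRM customer even without sentiment.
blended = crm.merge(sentiment, on="customer_id", how="left")

# A first-pass hypothesis check: do unhappy customers spend less?
print(blended.sort_values("sentiment").to_string(index=False))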

Consumption framework: It is not only about getting the right insights but also about presenting them in a simple, intuitive way so that they get converted into actions. A multiple-visualisation-tool approach is needed, depending on the maturity of the business community and the target audience. Traditional BI reporting tools need to be supplemented with API enablement and advanced visualisation tools for enhanced self-service BI and visual data discovery, promoting a culture of data experimentation. One API sketch follows below.
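As one illustration of API enablement, the sketch below serves a computed insight as JSON so any visualisation tool can pull it; Flask is just one common choice here, and the endpoint and figures are invented:

from flask import Flask, jsonify

app = Flask(__name__)

# In practice this would be read from the analytics store, not hard-coded.
CHURN_BY_SEGMENT = {"premium": 0.04, "standard": 0.11, "basic": 0.19}

@app.route("/insights/churn")
def churn_insight():
    # Any BI dashboard or advanced visualisation tool can poll this endpoint.
    return jsonify({"metric": "monthly_churn_rate", "by_segment": CHURN_BY_SEGMENT})

app.run(port=5000)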

Organisations need to enhance their capabilities to manage the data deluge and, at the same time, continuously raise their analytics maturity to deliver meaningful and actionable insights, turning the information explosion to maximum advantage.

Image source: Wales Higher Education Libraries Forum
