Why Analytics Fail!
What does data analytics really mean, and how can it help you today? A Google search for the term “data analytics” returns many millions of results.
To put it simply, data analytics can help you anticipate what's coming and make decisions with relative certainty. It provides a competitive advantage; today, analytics-based decisions are the norm across industries. Yet analytics has its challenges and, done wrong, can quickly drain your resources without leading to actionable results.
"Many spend years reflecting—often out of a combination of organizational inertia, competing priorities, and skepticism of business analytics—before taking the first step toward embracing these methods. Challenges often stem from misconceptions about analytics, and from fruitless quests for data and statistical faultlessness."
1. Fallacies around analytics:
It's not uncommon for some stakeholders to be skeptical of the value of analytics. Their skepticism frequently stems from misapprehensions about analytics and what it can and can't do. Business analytics is sometimes mistaken for off-the-shelf software that somehow purports to "predict the future," and some dismiss it as a business fad. Others erroneously believe that analytics solutions will provide them with a kind of absolute truth.
Both notions about analytics software are detrimental to analytics projects, says Guszcza, national predictive analytics lead for Deloitte Consulting LLP's Advanced Analytics & Modeling practice. The former notion makes business analytics seem too good to be true, which makes it difficult to sell the concept within the organization, he says. The latter can lead to unrealistic expectations—and disappointment—when predictive analytical models fail to provide absolute truth. It can also lead companies to rely on analytics alone to make complicated decisions when, in fact, they should bring in the professional judgment and domain knowledge of skilled employees who can provide proper checks and balances, he adds.
"The goal of any business analytics is to convert raw data into insights, inferences or predictive models that can lead to better decisions. As we know from the Law of Large Numbers, you don't need absolute truth. You just need 'true enough to be useful.'"
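The Law of Large Numbers point can be made concrete with a quick simulation, a minimal sketch in Python (the "true" conversion rate and sample sizes below are illustrative assumptions, not figures from the article):

```python
import random

random.seed(42)

# Assume a hypothetical "true" conversion rate of 24% (illustrative only).
TRUE_RATE = 0.24

def estimated_rate(n_visitors: int) -> float:
    """Estimate the conversion rate from a random sample of visitors."""
    conversions = sum(random.random() < TRUE_RATE for _ in range(n_visitors))
    return conversions / n_visitors

# As the sample grows, the estimate settles ever closer to the true rate.
for n in (100, 10_000, 1_000_000):
    print(n, round(estimated_rate(n), 3))
```

The estimate is never exactly right, but with enough observations it is "true enough to be useful" for, say, telling a 23% funnel apart from a 26% one.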
The most common misunderstanding about analytics is that if you look at data hard enough, you will find insights. Staring at daily dashboards in the hope that insights will miraculously reveal themselves is often overwhelming, confusing and unsuccessful. Successful analytics start by identifying the question you’re trying to answer from the data. For example, if site conversion is an issue, narrow down your efforts to a specific question. In this case, it might be “How can we increase conversion from 23% to 26%?” This approach allows you to focus on finding actionable drivers of conversion that can have impact.
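As a sketch of what narrowing down to a specific question might look like in practice, the snippet below breaks conversion out by traffic source to hunt for actionable drivers (the session records, segment names, and rates are all invented for illustration):

```python
# Hypothetical session records: each has a traffic source and whether it converted.
sessions = [
    {"source": "email",  "converted": True},
    {"source": "email",  "converted": True},
    {"source": "email",  "converted": False},
    {"source": "search", "converted": True},
    {"source": "search", "converted": False},
    {"source": "search", "converted": False},
    {"source": "ads",    "converted": False},
    {"source": "ads",    "converted": False},
]

def conversion_by_segment(rows, key):
    """Return {segment: conversion rate} for the given grouping key."""
    totals, wins = {}, {}
    for row in rows:
        seg = row[key]
        totals[seg] = totals.get(seg, 0) + 1
        wins[seg] = wins.get(seg, 0) + int(row["converted"])
    return {seg: wins[seg] / totals[seg] for seg in totals}

rates = conversion_by_segment(sessions, "source")
# A large gap between segments (here, email vs. ads) suggests where a
# three-percentage-point lift might realistically come from.
print(rates)
```

The point is not the code but the framing: instead of staring at a dashboard, you compute exactly the comparison your question demands.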
2. Concerns about data quality, type of data:
Many organizations think their data needs to be in flawlessly clean, reliable shape to begin an analytics program.
Even an analytics effort that follows a hypothesis-driven, structured approach with engaged stakeholders can fail if the organization lacks easy access to clean and reliable data. The data needn't be perfect for successful analytics, just clean enough, with few enough data issues to support the analysis. Data maturity is thus a prerequisite for analytics maturity.
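"Clean enough" can be assessed cheaply before any modeling starts. Below is a minimal sketch of such a pre-flight audit (the record fields and the 5% completeness threshold are illustrative assumptions):

```python
def missing_rates(records, fields):
    """Fraction of records where each field is None or absent."""
    n = len(records)
    return {
        f: sum(1 for r in records if r.get(f) is None) / n
        for f in fields
    }

# Hypothetical customer records with some gaps.
customers = [
    {"id": 1, "region": "west", "spend": 120.0},
    {"id": 2, "region": None,   "spend": 80.0},
    {"id": 3, "region": "east", "spend": None},
    {"id": 4, "region": "east", "spend": 95.0},
]

rates = missing_rates(customers, ["region", "spend"])

# Flag fields too incomplete to rely on; the 5% cutoff is a judgment call.
too_dirty = [f for f, r in rates.items() if r > 0.05]
print(rates, too_dirty)
```

A report like this turns the vague worry "our data isn't clean" into a concrete, prioritized list of fields to fix or work around.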
A related pitfall is the pursuit of "perfect" statistical models. While some organizations get hung up on having perfect data, some statisticians lose sight of the practical, business context of their modeling projects and get caught up in developing theoretically ideal statistical models.
"By engaging in a hunt for the perfect model and by striving for impractical degrees of accuracy, statisticians sometimes sacrifice the benefits that could result from models that are imperfect but still useful."
3. Lack of communication between data people and decision people:
Data modelers and analysts can do more effective work when they maintain an ongoing interchange with the decision makers for whom their work is intended. Two-way communication helps reduce the risk of unfortunate downstream surprises, expensive implementation snags, and unmet expectations that manifest only at the close of a project.
For example, Lucker and Guszcza say they have been privy to a number of predictive modeling projects that ended badly because the business people "outsourced" the required critical thinking entirely to analytics personnel. While skilled, the analytics employees did not have the appropriate perspective to properly design the analysis and interpret the results.
An analytics project can still fail even when it begins with a business question and a structured approach to analysis if the hypotheses used to narrow down the scope of the problem are weak. Weak hypotheses result from a failure to follow due process with the right stakeholders.
4. Over-confident analysts:
Skilled analysts can be overly confident in their abilities and in the accuracy of their judgments. Though they possess uncommon skills that are sometimes viewed as arcane, business analytics experts are human, just like the decision makers for whom their models are intended. Any team embarking on data mining, predictive analytics, or predictive modeling must be well versed in math and statistics, but technical mastery alone is no safeguard against misjudgment.
It’s imperative, but often overlooked, that analysis is done early on to reconcile what business leaders think they want, versus what they need, versus what they can achieve.
Beyond perfectionism over data and models, some analysts come to regard their mathematical models as repositories of truth rather than as useful approximations of reality.
5. The Old School Mindset:
Moving up the internal organizational hierarchy, the leadership team also plays a role in why analytics projects fail. Many business leaders have a "green bar report" mindset, meaning they trust the old way of doing things. Instead of basing their decisions on actionable results, they want to review every report line by line, a time-consuming process.
In addition, there’s a lack of continuous involvement or sponsorship by executives in the analytics process. It’s the very antithesis of Agile IT.
Consider a typical analytics project. During the kick-off meeting, everyone is in the room. Yet a meeting with the same leadership involvement may never reconvene. Leaders later ask for an update, only to find that the outcomes are not what they were looking for or that priorities have shifted, the natural result of such lackluster engagement.
Conclusion:
The Age of Analysis is here, and it is here to stay. These are truly revolutionary times, provided both business and technology professionals continue to work together and deliver on the promise.
Ashish Chutke, brilliant analysis. If we are to make this a success, collaboration and communication between the right people is the key.
Well articulated Ashish!
Very well written Ashish.. I liked the fact that the article stresses the usage of analytics first and then the benefits one can draw. HR professionals I have come across (including me) are not so digit savvy, hence many of the parameters that the data can throw up get ignored easily. It requires a specialised skill and an instant liking for one to look through data and then relate it with organisational trends. As you have written, we shouldn't be looking out or waiting for perfect data.. there isn't any such perfection.. data will always show imperfection and it is upon us to correlate it with the irregularities and lacunae in the system, processes and policies.. I liked the para where you write: *"The goal of any business analytics is to convert raw data into insights, inferences or predictive models that can lead to better decisions... You don't need absolute truth. You just need 'true enough to be useful.'"* Very insightful and a must-read for all of us.. well written