I've spent over 4,000 hours in stakeholder requirement-gathering meetings! Save hours of your life by asking these questions:

1. What do they plan to use the data for?
   - What initiative are they working on?
   - How will this initiative impact the business?
   - Is this for reporting or optimizing existing workflows?
   Understanding the purpose of the data helps you define its impact.
2. How do they plan to use the data? Will they access it via SQL, BI tools, APIs, or another method?
   - Do they have a workflow to pull data from your dataset?
   - Do they just run a `SELECT *` against your dataset?
   - Do they perform further computations on your dataset?
   This determines the schema, partitions, and data-accessibility needs.
3. Is this data already present in another report or UI?
   - Is this data already available in another location?
   - Do they have parts of this data (e.g., a few required columns) elsewhere?
   Ensuring you're not recreating work saves time and avoids redundancy.
4. How frequently do they need this data?
   - How frequently does the data actually need to be refreshed: monthly, weekly, daily, or hourly?
   - Is the upstream data changing fast enough to justify the required latency?
   Understanding frequency helps you determine the pipeline schedule.
5. What are the key metrics they monitor in this dataset?
   - Define variance checks for these metrics.
   - Do these metrics need to be 100% accurate (e.g., revenue) or directionally correct (e.g., impressions)?
   - How do these metrics tie into company-level KPIs?
   Memorize average values for these metrics; they're invaluable during debugging and discussions.
6. What will each row in the dataset represent?
   - Ensure one consistent grain per dataset, as applicable.
7. How much historical data will they need?
   - Does the stakeholder need data for the last few years?
   - Is the historical data available somewhere?
Ask these questions upfront, and you'll save countless hours while delivering exactly what stakeholders need.
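Two of the checks above (one consistent grain per dataset, variance checks on key metrics) can be automated early in a pipeline. A minimal pandas sketch, where the table, grain columns, and 50% threshold are all illustrative assumptions, not part of any specific stack:

```python
import pandas as pd

# Hypothetical daily revenue extract; column names are illustrative.
df = pd.DataFrame({
    "date": ["2024-01-01", "2024-01-01", "2024-01-02"],
    "region": ["NA", "EU", "NA"],
    "revenue": [100.0, 80.0, 95.0],
})

# Grain check: each row should be exactly one (date, region) pair.
grain_cols = ["date", "region"]
assert not df.duplicated(subset=grain_cols).any(), "grain violated: duplicate rows"

# Variance check: flag a key metric that moves more than 50% day over day.
daily = df.groupby("date")["revenue"].sum().sort_index()
alerts = daily.pct_change().abs().loc[lambda s: s > 0.5]
print(alerts.empty)  # True -> metric within expected variance
```

Checks like these are cheap to run on every load and turn two of the interview answers (grain, expected metric range) directly into tests.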
Questions to Ask for Data-Driven Decisions
Summary
Making data-driven decisions means using facts and analysis instead of guesses or opinions when choosing a direction for your business or project. Asking the right questions before starting any analysis ensures that your efforts actually lead to clear, helpful actions and support real business outcomes.
- Clarify the purpose: Always ask what decision the data will influence and why this question matters now, so you can focus your analysis on what’s truly important.
- Check data readiness: Find out what data is available, how reliable it is, and whether it matches the needs of your project before diving into analysis.
- Define success: Decide upfront what a good result would look like and who will use your insights, so everyone measures progress the same way and can take meaningful action.
-
I've watched 3-week analyses get ignored in 3-minute meetings. You can build the cleanest dashboard. Run the most advanced analysis. And still… nothing changes. Not because the numbers are wrong, but because the right business questions were never asked.

**Before you start:**
- What decision will this influence? If no decision changes, the analysis adds no real value.
- Who is the decision maker? Insights need an owner; otherwise, they die in slides.
- Why does this problem matter now? Timing defines relevance more than sophistication.
- What would success look like? Without clear success criteria, results are meaningless.

**During analysis:**
- What action will be taken if the numbers change? Metrics without actions create dashboards, not outcomes.
- What assumptions are we making? Unchallenged assumptions lead to confident mistakes.
- What is the cost of being wrong? False positives and false negatives are not equal.
- What data do we actually need? More data often adds noise, not clarity.

**Before presenting:**
- What context is missing from the numbers? Seasonality, market shifts, and policy changes change everything.
- How will this insight be communicated? If it can't be explained simply, it won't be used.

Good analysts don't start with data. They start with decisions, actions, and consequences. For me, the most skipped question is "What decision will this influence?" I've seen entire projects die because no one asked it upfront. Which one do you see skipped most often?
-
Most leaders trust the dashboard. Few can explain why it deserves that trust. That gap is where data programs quietly fail. Every KPI, forecast, or AI output is the result of choices made long before a chart appears. If leaders don't understand those choices, they're managing outcomes, not systems. Here's what's actually happening underneath, and what leaders should ask:

1. **Where data starts.** Ask: Does this data reflect how the business really operates, or just what systems happen to capture?
2. **How data gets in.** Ask: What breaks if this arrives late, incomplete, or duplicated? Who owns that risk?
3. **Raw data storage.** Ask: Can we re-answer questions when strategy changes, or are we locked into old assumptions?
4. **Shaping the data (ETL/ELT).** Ask: Which business decisions were hard-coded into the platform without anyone realizing it?
5. **Cleaning & enrichment.** Ask: Where are assumptions being added, and who is accountable for them?
6. **Fast, trusted storage.** Ask: Do teams actually use this, or do they export data to spreadsheets to get work done?
7. **Defining metrics.** Ask: If two leaders reference the same KPI, do they mean the same thing?
8. **Quality checks.** Ask: How do we know numbers are wrong before decisions are made?
9. **Dashboards & reports.** Ask: What decision should change because of this view?
10. **AI & advanced use cases.** Ask: What weaknesses will AI expose the moment we try to scale it?
11. **Monitoring the system.** Ask: How do we know the platform is drifting or degrading before results suffer?
12. **Governance & access.** Ask: Where are we protecting the business, and where are we unintentionally slowing it down?
13. **Continuous improvement.** Ask: Who owns evolution once the system is considered "live"?

The best data platforms don't feel complex. They make confident decisions easier and repeatable. If leaders can't trace how insight becomes action, AI will never move beyond pilots.
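Points 2 and 8 above (ingestion risk and quality checks) are the ones most easily made concrete. A minimal sketch of landing-time checks in pandas; the `orders` table, column names, and staleness threshold are hypothetical stand-ins for whatever the warehouse actually receives:

```python
import pandas as pd

# Hypothetical orders extract; in practice this would come from the warehouse.
orders = pd.DataFrame({
    "order_id": [1, 2, 3],
    "loaded_at": pd.to_datetime(["2024-03-01", "2024-03-02", "2024-03-02"]),
})

def check_batch(df: pd.DataFrame, now: pd.Timestamp, max_staleness_days: int = 2) -> dict:
    """Return pass/fail results for basic landing checks: late, incomplete, duplicated."""
    return {
        "no_duplicates": not df["order_id"].duplicated().any(),
        "fresh": (now - df["loaded_at"].max()).days <= max_staleness_days,
        "non_empty": len(df) > 0,
    }

results = check_batch(orders, now=pd.Timestamp("2024-03-03"))
print(results)  # all True for this toy batch
```

The point of the leader-level question isn't the code itself, but that someone can show checks like these exist and name who gets paged when one fails.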
-
People usually start data analysis with dashboards. Good analysts start with questions. Data doesn't create insights on its own. The quality of analysis depends on the clarity of thinking before any query is written or chart is built. This framework highlights the key questions experienced analysts ask before analyzing any dataset, ensuring analysis leads to decisions, not just reports. 👇

- Define the real business problem before touching the data, because unclear decisions lead to meaningless analysis.
- Clearly understand what success looks like by identifying metrics, benchmarks, and expected outcomes.
- Verify what data is actually available to avoid building analysis on incomplete or misunderstood sources.
- Assess data reliability early, since poor data quality weakens even the best analytical models.
- Challenge assumptions continuously to prevent bias, false correlations, and misleading conclusions.
- Choose the right dimensions for segmentation to uncover patterns hidden inside aggregated numbers.
- Identify the target audience so insights match the level of technical depth and business context required.
- Decide the output format intentionally, because how insights are presented shapes how they are used.
- Focus on the action the analysis should drive, because analysis without decisions creates no impact.

Great analysis isn't about tools or dashboards. It's about asking better questions before searching for answers. What's the first question you ask before starting a data analysis project? 👇
-
The Baker's Dozen of Questions to Ask Before Beginning Any Significant Analysis Project

Preparation is vital to success in any analysis project. As highlighted by Babette Bensoussan, MBA, in our book, Business and Competitive Analysis 2e (2015, Pearson), here are 13 essential questions to ask before starting a complex and high-value project for others:

1. Why is this project being proposed?
2. Has anyone attempted it before? If so, what were the results?
3. Are there any barriers to performing the analysis process that I should be aware of? Could any of these barriers halt progress?
4. What data or information has already been gathered on this topic? What additional information is needed?
5. What analysis processes and systems will be required?
6. Who else in the organization has a stake in the outcomes influenced by this insight work?
7. What Plans, Choices, Actions, and Decisions (PCADs) will be made based on my requested insights?
8. How quickly is an answer needed or desired? What critical factors might impact this timing?
9. What are the client's or customer's expectations of me? How will they define my success?
10. What does the customer want, need, or not want/need to hear?
11. What resources are available to support me?
12. Can I accomplish what is being asked of me?
13. Is the potential PCAD more valuable than the effort and resources required for analysis?

By effectively managing expectations, analysts can foster mutual respect and trust with decision-makers, leading to a clearer understanding of the challenges involved. This proactive approach can help prevent disconnects between the analysis planning process and the decision-making that follows, ultimately benefiting the enterprise.
-
📌 PART 1: How Data Analysts Collect Company Data (Beginner-Friendly Guide)

Lately I've been talking to a lot of new analysts who feel confused about what to actually do when a company says: "Help us analyze our data." I've been there, unsure of where to start, what questions to ask, or how real analysts collect business data. So today, I want to break it down in the simplest way possible. If you're a beginner, read this carefully and save it.

1️⃣ Start With a Requirements Discovery Call
Before touching Excel, SQL, or Power BI, your first job is to understand the business, not the dataset. Ask the company questions around:

Business Goals
- What problem are we solving?
- What decisions will this analysis improve?
- What does success look like?

Data Needs
- What data sources exist?
- Where is the data stored (Excel, SQL, CRM, POS)?
- What time period should be analyzed?

Output Expectations
- Dashboard, report, or cleaned dataset?
- Which KPIs matter the most?
- Should the report update weekly or monthly?

Access & Security
- Will you need login access?
- Any sensitive columns to anonymize?

This is how professionals avoid confusion and build trust early.

2️⃣ Ask for the Right Data Files
Depending on the industry, request the correct tables:

- Retail / E-commerce: Orders, Customers, Products, Inventory, Returns.
- Finance: Transactions, Ledger, Forecasts, Budgets.
- Healthcare: Appointments, Billing, Encounters, Lab results.
- HR: Employees, Payroll, Hiring funnel, Performance.

Always request the files in Excel or CSV, with proper column names. And ask for a data dictionary; it explains what each column means.

👉 If this helped you, watch out for Part 2, where I'll break down cleaning, analyzing, and delivering insights like a pro.
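Once the files and the data dictionary arrive, the first practical step is to confirm they agree. A minimal sketch of that validation in pandas; the dictionary entries, table, and column names are all hypothetical examples, not a standard format:

```python
import pandas as pd

# Hypothetical data dictionary supplied by the client: column -> expected dtype.
data_dictionary = {
    "order_id": "int64",
    "order_date": "datetime64[ns]",
    "amount": "float64",
}

# Toy stand-in for the client's CSV export (normally pd.read_csv("orders.csv")).
orders = pd.DataFrame({
    "order_id": [101, 102],
    "order_date": pd.to_datetime(["2024-05-01", "2024-05-02"]),
    "amount": [19.99, 5.50],
})

# Which documented columns never arrived, and which arrived with the wrong type?
missing = set(data_dictionary) - set(orders.columns)
mismatched = {c: str(orders[c].dtype) for c in data_dictionary
              if c in orders.columns and str(orders[c].dtype) != data_dictionary[c]}

print(missing)     # empty set -> every documented column arrived
print(mismatched)  # empty dict -> dtypes match the dictionary
```

Running a check like this before any analysis surfaces the "we thought that column meant something else" conversations on day one instead of in the final review.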
-
🚀 A Life-Changing Lesson I Learned at Google That Every Analyst Needs to Hear

At Google, I learned that the fastest way to generate impact isn't writing code. It's mastering conceptual reasoning before you touch a tool. Let's take Exploratory Data Analysis (EDA). 🙅♀️ Most analysts treat it as a technical race: a checklist of commands to run. 💡 But EDA isn't a coding competition. It's a framework for thinking. It's not about the commands you run; it's about the questions you ask. Here's the framework we used 👇 Notice how the "So what?" is built in from the very beginning.

1. Find the Shape (Observe, Don't Analyze)
Before you run a single command, get the 30,000-foot view.
Ask: What's the scale (thousands or millions)? What are the extremes? Is the data skewed by a few massive values?
Purpose: To understand the landscape before you get lost in the details.

2. Understand the Components (Univariate)
Now, zoom in on one variable at a time.
Ask: How is this metric distributed? Is it stable, volatile, or clustered? Are outliers mistakes, or are they your most valuable insights?
Purpose: To understand the behavior of each individual character in the story.

3. Connect the Dots (Bivariate)
Step back and see how the characters interact.
Ask: When one metric goes up, what does another do? Which relationships are worth paying attention to, and which are noise? Are you seeing signs of dependency (e.g., engagement rises, then conversions follow)?
Purpose: To identify potential cause-and-effect patterns, not to prove them, but to know where to look deeper.

4. Add Context (Time & Segments)
Data doesn't exist in a vacuum.
Ask: How has this changed over time? What's driving it (seasonality, a product launch)? Which segments (geographies, demographics) behave differently?
Purpose: To connect abstract patterns to real-world business decisions.

5. Deliver the "So What" (The Decision)
This is the only step that matters. An analysis is useless until it forces a decision.
Ask: What does this mean for the business? What should we do next?
Purpose: To move from description ("what") → interpretation ("so what") → action ("now what").

💬 The Takeaway: You don't need a complex tool to master analytics. You need to learn how to observe, connect, and reason. Tools can compute. Analysts must interpret.
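Steps 1-4 of the framework map onto a handful of pandas calls. A minimal sketch on a toy dataset; the metrics, segments, and values are invented purely to show where each question lands in code:

```python
import pandas as pd

# Toy dataset standing in for real product metrics; names are illustrative.
df = pd.DataFrame({
    "date": pd.date_range("2024-01-01", periods=6, freq="D"),
    "region": ["NA", "EU", "NA", "EU", "NA", "EU"],
    "sessions": [120, 80, 130, 85, 300, 90],
    "conversions": [12, 8, 13, 9, 31, 9],
})

# 1. Find the shape: scale and extremes before any detail.
print(df.shape)
print(df[["sessions", "conversions"]].describe())

# 2. Univariate: is the metric skewed by a few massive values?
print(df["sessions"].skew())  # positive -> a long right tail (the 300 spike)

# 3. Bivariate: when sessions move, do conversions follow?
print(df["sessions"].corr(df["conversions"]))

# 4. Context: do segments behave differently?
print(df.groupby("region")["conversions"].mean())
```

Step 5 has no pandas call on purpose: the "so what" is the sentence you write after reading these outputs, not another command.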
-
Focusing on how AI decisions are made, by whom, and whether an entity has exercised judgment rather than delegation can help insurers meaningfully assess an insured's AI risk potential. I've put together this list of questions to help with that:

1. Governance and Accountability
- Who inside the organization is accountable for AI or automated decision risk today?
- Is AI risk formally included in the enterprise risk or compliance framework? ☐ Yes ☐ No ☐ In progress
- Has AI risk been discussed at the executive or board level in the last 12 months? ☐ Yes ☐ No If yes, what came out of that discussion?
Why: find out whether AI risk lives somewhere real or nowhere at all.

2. Awareness of Use
- How does the entity identify where AI or automated decision tools are in use?
- Does it maintain an internal inventory or register of AI-enabled tools? ☐ Yes ☐ No ☐ Partial
- Is there a defined approval or intake process before a new AI tool is deployed? ☐ Yes ☐ No
Why: you have to know about it to control it.

3. Change Management and Oversight
- How are material changes to AI tools or models reviewed and approved?
- What events or thresholds would trigger reassessment, rollback, or suspension of a tool?
- Are outputs or performance monitored after deployment? ☐ Yes ☐ No ☐ Informally
Why: is AI treated as a living system or a one-time implementation?

4. Data Use and Protection
- Are any categories of data expressly prohibited from being used in AI tools?
- What controls exist to prevent sensitive or regulated data from being introduced into external models?
- Has the entity ever declined or limited a tool because of data concerns? ☐ Yes ☐ No
Why: disciplined processes > assurances.

5. Third-Party and Vendor Risk
- Is AI-related risk assessed as part of vendor onboarding? ☐ Yes ☐ No
- Do vendor contracts address any of the following? ☐ Data usage limits ☐ Audit or transparency rights ☐ Representations regarding model training ☐ AI-related indemnification
- Has the entity ever renegotiated or exited a vendor relationship due to AI risk? ☐ Yes ☐ No
Why: who has leverage when something goes wrong?

6. Incident Readiness
- What would trigger treating an AI failure or error as a reportable incident?
- Who would be involved in responding to an AI-driven incident?
- Have AI failure scenarios been included in tabletop exercises? ☐ Yes ☐ No ☐ Planned
Why: rehearsing failure > not thinking about it.

7. Culture and Escalation
- Are employees trained to escalate concerns about automated decisions? ☐ Yes ☐ No
- Can the organization point to an example where deployment was paused due to internal concern?
- Are there protected channels for raising AI-related risk issues? ☐ Yes ☐ No
Why: issues surfaced early > after damage is done.

8. Looking Ahead
- How does the entity track emerging AI legal and regulatory obligations?
- What changes to AI governance or controls are planned in the next 12 months?
Why: AI is fluid, so roadmaps matter.
-
Way too many data projects fail. Not because the analysis was wrong, but because the goal was never clear to begin with. Before you dive into the data, make sure you understand what problem you're actually trying to solve, and for whom.

**Ask these 8 questions before starting any data project:**

1. What is the actual business question?
2. Who are the stakeholders?
3. What decisions will this analysis support?
4. What data is available?
5. What pieces are missing?
6. What format is expected?
7. What does success look like?
8. What is the timeline and urgency?

Answering these upfront can save hours of rework and ensure your results will get used. What's the one question you wish you had asked before your last data project?
-
AI feels like a magic wand. But it can also lead you off a cliff. Fast answers ≠ good answers. Before you act on AI output, ask these 6 questions. They'll protect your judgment and steer your team toward real outcomes. Let's go:

1. What problem am I really solving? → Is this even the right question?
Why it matters: AI answers whatever you ask, even if it's irrelevant.
What it does: Keeps your focus on real business goals.

2. What assumptions are part of this output? → What's hiding beneath the surface?
Why it matters: AI is built on data, and that data has blind spots.
What it does: Uncovers risks from hidden bias.

3. Do you have any evidence to support this? → Where's the proof?
Why it matters: AI sounds confident, even when it's wrong.
What it does: Promotes evidence-based decisions.

4. What are at least 2 credible alternatives? → Are we settling too early?
Why it matters: AI often gives the "first" answer, not the best one.
What it does: Pushes your team to explore.

5. What are the risks if we act on this? → What could go wrong?
Why it matters: We skip second-order thinking far too often.
What it does: Enables better fallback plans.

6. What does my judgment tell me? → Does this align with real-world experience?
Why it matters: AI has data, but you have context.
What it does: Brings back human judgment.

AI is powerful. But it's not perfect. Use these 6 questions to stay in the driver's seat. What's one time AI gave you a bad answer? ⬇️ Let me know in the comments.