Science-Based Decision Making

Explore top LinkedIn content from expert professionals.

  • View profile for Robert Dur

    Professor of Economics, Erasmus University Rotterdam; President Royal Dutch Economic Association (KVS)

    24,722 followers

Great seeing our paper out in Science! Stefano Carattini, John List and I argue that policy evaluation should be combined with a causal analysis of public support. The starting point of our argument is that policies generally considered socially desirable by the scientific community are not always popular among voters, because of a lack of understanding or biased beliefs. Congestion charges and carbon taxes are a case in point. However, recent empirical studies have shown that, in cases like these, experiencing the policy may lead voters to correct their beliefs and increase their support. A credible policy evaluation may further help voters learn about the policy's effects. Our article describes how credible policy evaluation can be fruitfully combined with a causal analysis of public support. If it becomes more widely documented that opposition to sound policies dissipates when voters experience a policy, then policy-makers may be more inclined to experiment with such policies. Learning when and why public support does not increase after policy implementation would be very important as well. Indeed, this may even lead to a change in the consensus about the policy's desirability, for instance when scientists learn that they overlooked some negative aspects of the policy that voters strongly care about. Read the full article here: https://lnkd.in/ed2EAj9G Science Magazine

  • View profile for Warren Powell
    Warren Powell is an Influencer

    Professor Emeritus, Princeton University/ Co-Founder, Optimal Dynamics/ Executive-in-Residence Rutgers Business School

    53,413 followers

    Running simulations: base model vs. lookahead model

    I see people posting on the use of “simulations” for planning inventory policies. If you are using a lookahead model (which is typical for most real-world inventory problems), there are two models where simulation can be used:

    1. The base model, which can be a simulator or the real world.
    2. The lookahead model, which is used within the policy to plan the future in order to make a decision now.

    See the figure below. I use the same notational style for both models, but the lookahead model uses tildes on each variable, which also carry two time subscripts: the point in time at which we are making the decision, and the time period within the lookahead model.

    The base model is used to evaluate the policy and is needed to perform any parameter tuning. It can be based on history or on a simulation of what you think the future could be.

    When simulating inventory policies, special care is needed because we do not have historical data on market demand – we typically have only sales, which can be “censored” (a topic recognized in the inventory literature for over 60 years). For example, if we run out of product (and there is no back-ordering), we lose the sales, which typically means we do not see (or record) them.

    I find it is generally best to run simulations using mathematical models of uncertainty so that we can run many simulations and test different policies. Stockouts depend on properly simulating the tails of distributions, along with market shifts, price changes and supply chain disruptions. There are, of course, settings where you have no choice but to test your ideas in the field. It is expensive, risky, and slow, but sometimes you have no choice, especially when you have to capture human behavior.

    If your policy requires planning into the future, you really need a stochastic (probabilistic) model of the future that properly captures the tails of distributions. With long lead times, you should also plan for the possibility of significant disruptions, which can mean that you also have to capture the decisions you might make in the future. See chapter 19 of https://lnkd.in/dB99tHtM (or tinyurl.com/RLandSO) for an in-depth treatment of direct lookahead policies. #supplychain #inventory Nicolas Vandeput Joannes Vermorel
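The censored-demand point above can be made concrete with a toy base-model simulation. This is a minimal sketch under assumed settings (single product, zero lead time, exponential demand, an order-up-to policy), not the post's notation or any production code; the lost-sales quantity it tracks is exactly what censored sales records would hide.

```python
import random

def simulate_base_stock(order_up_to, n_periods=10_000, mean_demand=20.0, seed=0):
    """Base-model simulation of an order-up-to inventory policy with lost sales.

    Demand is exponential with the given mean (a stand-in for whatever
    distribution you fit, tails included). Returns average lost sales per
    period -- the quantity that is censored in historical sales data.
    """
    rng = random.Random(seed)
    lost = 0.0
    for _ in range(n_periods):
        inventory = order_up_to            # replenish instantly (zero lead time)
        demand = rng.expovariate(1.0 / mean_demand)
        sold = min(inventory, demand)      # what the sales data would record
        lost += demand - sold              # what it would NOT record
    return lost / n_periods

# Tune the policy parameter by evaluating each candidate in the base model:
for s in (20, 40, 60):
    print(s, round(simulate_base_stock(s), 2))
```

Swapping the exponential draw for a heavier-tailed distribution, demand shifts, or disruption scenarios changes the tuned parameter, which is the point of simulating tails rather than averages.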

  • View profile for Mohammed Abdul Gaffar, CFA

    Group Treasury Manager | CFA | Strategic Finance, Capital Allocation, Corp Dev | Valuation and Decision Memos

    12,309 followers

    🛑 𝐒𝐭𝐨𝐩 𝐬𝐚𝐲𝐢𝐧𝐠 𝐯𝐚𝐥𝐮𝐚𝐭𝐢𝐨𝐧 𝐢𝐬 “𝐚𝐫𝐭 𝐚𝐧𝐝 𝐬𝐜𝐢𝐞𝐧𝐜𝐞.” It’s the line analysts use when they don’t want to choose. Here’s the truth: Valuation is math filtered through judgment.

    𝐏𝐚𝐫𝐭 𝐈: 𝐓𝐡𝐞 𝐄𝐚𝐬𝐲 𝐇𝐚𝐥𝐟
    The science keeps you honest. Build the DCF. Plug in the discount rate. Read the multiple. That’s just accounting for the past. It explains, but it doesn’t decide.

    𝐏𝐚𝐫𝐭 𝐈𝐈: 𝐓𝐡𝐞 𝐆𝐫𝐚𝐲 𝐙𝐨𝐧𝐞
    The judgment is where value is created or destroyed.
    𝐆𝐫𝐨𝐰𝐭𝐡: Story - credible or wishful?
    𝐌𝐨𝐚𝐭: Fortress or sandcastle?
    𝐂𝐨𝐧𝐯𝐢𝐜𝐭𝐢𝐨𝐧: Discipline or optimism?
    This is where a spreadsheet stops being a calculator and starts being a mirror. It doesn’t reflect truth; it reflects belief. The model isn’t the output. It’s the vessel for conviction. I learned that the hard way after mis-pricing risk once. I’ve seen perfect models miss markets and rough ones make fortunes. The difference was never Excel skill; it was judgment.

    𝐓𝐡𝐞 𝐭𝐞𝐱𝐭𝐛𝐨𝐨𝐤𝐬 𝐧𝐞𝐯𝐞𝐫 𝐭𝐞𝐚𝐜𝐡 𝐭𝐡𝐢𝐬 𝐡𝐚𝐥𝐟. Valuation lives between math and meaning. Be the analyst who owns the story, not just the spreadsheet. Which DCF assumption do you think misleads analysts most? #Valuation #CFA #InvestmentBanking Parth Verma
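The "easy half" fits in a few lines; the judgment lives in the inputs. A minimal, hypothetical DCF sketch (Gordon-growth terminal value; all cash flows and rates invented) shows how far the output swings between a base and an optimistic set of assumptions:

```python
def dcf_value(cash_flows, discount_rate, terminal_growth):
    """Present value of explicit cash flows plus a Gordon-growth terminal value."""
    pv = sum(cf / (1 + discount_rate) ** t
             for t, cf in enumerate(cash_flows, start=1))
    terminal = cash_flows[-1] * (1 + terminal_growth) / (discount_rate - terminal_growth)
    return pv + terminal / (1 + discount_rate) ** len(cash_flows)

# Identical mechanics, different judgment calls on growth and risk:
base = dcf_value([100, 110, 121], discount_rate=0.10, terminal_growth=0.02)
optimistic = dcf_value([100, 120, 144], discount_rate=0.09, terminal_growth=0.03)
print(round(base), round(optimistic))  # small input changes, large value gap
```

The spread between the two numbers is the "gray zone" the post describes: the formula is settled, the inputs are belief.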

  • View profile for Pan Wu
    Pan Wu is an Influencer

    Senior Data Science Manager at Meta

    51,375 followers

    A "sampled success metric" is a performance measure calculated from a sample or subset of data rather than the entire population. Its calculation often involves higher costs per sample, such as manual review, leading to a trade-off between sample size and metric accuracy/sensitivity. In this tech blog, written by the data science team at Shopify, the discussion revolves around how the team leverages Monte Carlo simulation to understand metric variability under various scenarios and make the right trade-offs.

    Initially, the team defines simulation metrics to describe the variability of the sampled success metric. For instance, if the actual success metric is decreasing over time, a metric could count how many months of the sampled success metric show a decrease, termed "1-month decreases observed". Then the team defines the distribution used to run the Monte Carlo simulation. Monte Carlo simulation, a computational technique that uses random sampling to estimate outcomes of complex systems or processes with uncertain inputs, draws samples from a distribution chosen to match the business need; based on past observations, the team’s application follows a Poisson distribution. Next comes the massive simulation phase, where the team runs multiple simulations for one parameter setting and then varies the parameters to simulate different scenarios. The goal is to quantify how much the sample mean will differ from the underlying population mean under realistic assumptions. The final result gives a clear statistical picture of how much additional sample size would reduce metric variability and increase accuracy.

    This case study demonstrates that Monte Carlo simulation can be a valuable addition to your decision-making and data science toolkit.
    #datascience #analytics #metrics #algorithms #simulation #montecarlo #decisionmaking
    – – –
    Check out the "Snacks Weekly on Data Science" podcast and subscribe, where I explain in more detail the concepts discussed in this and future posts:
    -- Spotify: https://lnkd.in/gKgaMvbh
    -- Apple Podcast: https://lnkd.in/gj6aPBBY
    -- Youtube: https://lnkd.in/gcwPeBmR
    https://lnkd.in/dKnrZzzV
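The sample-size trade-off described above can be sketched with a small Monte Carlo simulation. This is an illustrative stand-in, not Shopify's code: it draws binomial review batches (rather than the Poisson setup the team used) and asks how much the sampled rate wobbles around the true rate at each sample size; every name and number here is an assumption.

```python
import random
import statistics

def sampled_metric_sd(true_rate, sample_size, n_sims=2000, seed=42):
    """Std. dev. of a sampled success rate across simulated review batches."""
    rng = random.Random(seed)
    rates = []
    for _ in range(n_sims):
        # One "month": manually review `sample_size` items, count successes.
        successes = sum(rng.random() < true_rate for _ in range(sample_size))
        rates.append(successes / sample_size)
    return statistics.stdev(rates)

# Quadrupling the manual-review sample roughly halves the metric's noise
# (the usual 1/sqrt(n) scaling), which prices the cost/accuracy trade-off:
for n in (100, 400):
    print(n, round(sampled_metric_sd(0.9, n), 4))
```

Running the same loop over different true rates or trend scenarios is the "massive simulation phase" in miniature.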

  • View profile for Yuval Passov
    Yuval Passov is an Influencer

    Helping Leaders Stay Relevant (AI) and Resilient (Health) | Global Founder Advocate | Startup Mentor | Certified Coach | Keynote Speaker

    40,198 followers

    Successful startup founders think like scientists. As an entrepreneur, relying on intuition and gut feelings can be tempting. But if you want to increase your chances of success, you might need to think like a scientist. I recently read a Harvard Business Review article titled "Why Entrepreneurs Should Think Like Scientists." The article highlights a study showing that startups using the scientific method generated significantly more revenue and were more likely to pivot away from unviable ideas. For the top 5%, this meant earning an additional €492,000 compared to those who didn’t apply this approach. So, how can you integrate the scientific method into your startup?

    1️⃣ Test your assumptions
    Don’t just assume your idea will work. Test it with real customers and gather feedback. At Google for Startups, we create small pilot programs to avoid costly mistakes by learning early what works and what doesn’t.

    2️⃣ Be ready to pivot
    Flexibility is key. If something isn’t working, be prepared to change direction. I’ve experienced this firsthand: by pivoting based on user feedback, we’ve turned potential failures into successes.

    3️⃣ Use the scientific method
    Follow a structured process of observation, hypothesis, experimentation, and analysis. This methodical approach helps you make informed decisions and drive continuous improvement.

    For practical application:
    👉 Create an MVP: develop a basic version of your product to test your assumptions with real users.
    👉 Run A/B tests: compare different versions of a feature to determine what performs best.
    👉 Track your results: monitor your metrics to understand what’s working and what needs adjustment.

    The bottom line? Experimentation isn’t just a safety net; it’s a path to discovering what truly works. Whether you’re just starting out or looking to refine your approach, integrating the scientific method can be transformative for your startup. What’s your experience with using the scientific method in business?
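The A/B-testing step can be sketched with a standard two-proportion z-test (an illustrative choice of test, not one the HBR article prescribes; the conversion counts below are invented):

```python
import math

def ab_test_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z statistic; |z| > 1.96 ~ significant at the 5% level."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Variant B converts 17% vs. A's 13%, with 1,000 users in each arm:
z = ab_test_z(conv_a=130, n_a=1000, conv_b=170, n_b=1000)
print(round(z, 2))  # comfortably above 1.96, so the difference is unlikely to be noise
```

The point of the hypothesis-experiment-analysis loop is exactly this: a number that can say "you were wrong" before you scale the feature.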

  • View profile for Alicia McKay
    Alicia McKay is an Influencer

    Writer. Speaker. Strategist. Honing your bullshit radar 🎯

    43,959 followers

    The world's most valuable skill is critical thinking. Here are 3 decision-making frameworks that will save you dozens of painful hours trying to learn critical thinking for yourself:

    1. Chip and Dan Heath's WRAP Framework
    The measure of a good decision isn't the outcome you produce, but the process you use to make it. Learning this completely changed the way I thought about decision-making, and the importance I placed on process. According to the Heath brothers, you can overcome common decision biases like narrow framing, confirmation bias, short-term emotions and over-confidence by using these four steps for every significant choice you make:
    W - Widen your Options
    R - Reality-Test Your Assumptions
    A - Attain Distance
    P - Prepare for the Worst

    2. Greg McKeown's Essentialism Framework
    Hang this up in your room somewhere and stare at it every day. Greg McKeown, in his book Essentialism, makes the case that the highest point of frustration occurs when we're trying to do everything, now, because we feel like we should. In order to reach the highest point of contribution, we need to do: The Right Thing, at The Right Time, for The Right Reason. When we focus on these three variables, we don't waste time and energy on activities and decisions that aren't the right fit.

    3. Tim Ferriss' Fear-Setting Framework
    I consider this the gold standard of strategic risk management and contingency planning. Important decisions will always come with risks, consequences and unforeseen problems. Instead of trying to eliminate the negative and plan for the best, Ferriss advises people to complete a pre-mortem that simulates potential responses. Draw up a three-column table with:
    - The worst things that might happen
    - The steps you can take to prevent them
    - The ways you will respond if they do happen
    You're then able to prepare for a more pragmatic future, rather than being thrown off course at the first unexpected obstacle.

    For more information on fear-setting, and some useful downloads, check out Tim's blog. These three frameworks completely changed the way I thought about decision-making, and the support I was able to offer leaders in developing the skills they needed to keep tricky programmes on track. I hope they're useful for you. #leadership #decisions #NotAnMBA

  • View profile for Maya Moufarek
    Maya Moufarek is an Influencer

    Full-Stack Fractional CMO for Tech Startups | Exited Founder, Angel Investor & Board Member

    25,346 followers

    Stop studying what other marketers are doing. Start importing these 6 mental models from entirely different fields.

    1. Inversion (Mathematics)
    Origin: Mathematician Carl Jacobi's "Invert, always invert" approach
    Stop obsessing over how to gain customers and instead ask: "What makes customers leave?"
    → Real-world win: Slack transformed their retention by analysing drop-off patterns and discovering information overload was driving teams away. Using data to address the problem, they created powerful notification filtering and channel organisation tools.

    2. Second-Order Thinking (Systems Theory)
    Origin: Systems theorist Donella Meadows' cascading effects principle
    Amateur marketers optimise for immediate outcomes. Professionals anticipate the domino effects.
    → Real-world win: Airbnb knew investing in professional photography would both increase immediate bookings (first-order effect) and establish higher visual standards across their platform as hosts competed to match the quality (second-order effect).

    3. Opportunity Cost (Economics)
    Origin: Friedrich von Wieser's theory of alternative uses
    Every "yes" to a marketing activity comes with an invisible cost: what you're NOT doing instead.
    → Real-world win: HubSpot's decision to sunset 3,000+ underperforming content pieces allowed them to redirect resources to high-performers, resulting in traffic growth despite publishing less.

    4. Falsifiability (Science)
    Origin: Karl Popper's scientific method
    If your marketing hypothesis can't be proven wrong, it's not a strategy; it's a hope.
    → Real-world win: Spotify's rigorous testing framework requires every new feature idea to include specific metrics that would indicate failure, ensuring objective decision-making rather than emotional attachment.

    5. The Pareto Principle (Economics)
    Origin: Vilfredo Pareto's 80/20 distribution observation
    Your marketing breakthrough isn't hiding in more activity but in identifying the vital few inputs creating the majority of results.
    → Real-world win: Walmart's supplier strategy focused intensively on deepening relationships with the 20% of suppliers driving 80% of sales, creating growth opportunities instead of spreading attention.

    6. Antifragility (Risk Theory)
    Origin: Nassim Taleb's concept of systems that gain from disorder
    Great marketing doesn't just survive disruption; it's designed to benefit from it.
    → Real-world win: Oreo's famous Super Bowl blackout tweet ("You can still dunk in the dark") showcased how agile teams with decision-making autonomy can capitalise on unexpected events better than competitors trapped in approval processes.

    The marketers who will thrive tomorrow aren't just studying what other marketers are doing; they're importing brilliance from mathematics, economics, science, and risk theory. Which of these mental models will you apply to your next marketing challenge? ♻️ Found this helpful? Repost to share with your network. ⚡ Want more content like this? Hit follow Maya Moufarek.
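Of the six models, the Pareto principle reduces to a small computation: sort contributions, accumulate until you cross the threshold. A sketch with invented supplier figures:

```python
def vital_few(contributions, threshold=0.8):
    """Smallest set of items (largest first) covering `threshold` of the total."""
    total = sum(contributions.values())
    running, chosen = 0.0, []
    for name, value in sorted(contributions.items(), key=lambda kv: -kv[1]):
        chosen.append(name)
        running += value
        if running >= threshold * total:
            break
    return chosen

sales = {"A": 500, "B": 300, "C": 80, "D": 60, "E": 40, "F": 20}
print(vital_few(sales))  # two of six suppliers cover 80% of sales
```

Run it over campaigns, channels, or SKUs instead of suppliers and the same loop surfaces the "vital few" inputs.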

  • View profile for Naveen Bhati

    Head of Engineering & AI, ex-Meta | AI Strategist & Builder | Helping businesses generate revenue, save money, and free up time using AI

    7,938 followers

    𝟱 𝗗𝗲𝗰𝗶𝘀𝗶𝗼𝗻-𝗠𝗮𝗸𝗶𝗻𝗴 𝗙𝗿𝗮𝗺𝗲𝘄𝗼𝗿𝗸𝘀
    Decision-making frameworks provide leaders with structured approaches to tackle complex problems, improve team alignment, and drive better outcomes. By using these tools, leaders can enhance their decision-making process, save time, and increase the likelihood of making successful choices. Here are 5 powerful frameworks every leader should know:

    𝗧𝗵𝗲 𝗖𝘆𝗻𝗲𝗳𝗶𝗻 𝗙𝗿𝗮𝗺𝗲𝘄𝗼𝗿𝗸
    ↳ Description: Helps leaders identify the context of a situation (simple, complicated, complex, or chaotic) and choose appropriate actions.
    ↳ Used for: Adapting leadership style and decision-making approach based on the nature of the problem.

    𝗧𝗵𝗲 𝗚𝗼𝗹𝗱𝗲𝗻 𝗖𝗶𝗿𝗰𝗹𝗲
    ↳ Description: Focuses on the "Why," "How," and "What" of decision-making, emphasising the importance of purpose.
    ↳ Used for: Aligning decisions with core values and organisational mission.

    𝗖𝗦𝗗 𝗠𝗮𝘁𝗿𝗶𝘅
    ↳ Description: Organises information into Certainties, Suppositions, and Doubts.
    ↳ Used for: Clarifying knowledge gaps and guiding further investigation before making decisions.

    𝗥𝗜𝗖𝗘/𝗜𝗖𝗘 𝗙𝗿𝗮𝗺𝗲𝘄𝗼𝗿𝗸
    ↳ Description: Prioritises options based on Reach, Impact, Confidence, and Ease (or Effort).
    ↳ Used for: Objectively evaluating and ranking multiple options or initiatives.

    𝗘𝗶𝘀𝗲𝗻𝗵𝗼𝘄𝗲𝗿 𝗠𝗮𝘁𝗿𝗶𝘅
    ↳ Description: Categorises decisions based on importance and urgency (or impact and reversibility).
    ↳ Used for: Prioritising tasks and allocating appropriate time and resources to decisions.

    By incorporating these frameworks into your leadership toolkit, you can enhance your decision-making process, foster better team collaboration, and drive more successful outcomes for your organisation.

    𝗤𝘂𝗲𝘀𝘁𝗶𝗼𝗻 𝗳𝗼𝗿 𝘆𝗼𝘂: Which of these decision-making frameworks resonates most with your leadership style, and why? Share your thoughts in the comments! #LeadershipSkills #DecisionMaking #BusinessStrategy
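Of the five frameworks, RICE is the one that is literally a formula, so it can be sketched directly (the initiatives and numbers below are hypothetical):

```python
def rice_score(reach, impact, confidence, effort):
    """RICE priority: (Reach x Impact x Confidence) / Effort."""
    return reach * impact * confidence / effort

initiatives = {
    "revamped onboarding": rice_score(reach=5000, impact=2, confidence=0.8, effort=4),
    "dark mode":           rice_score(reach=2000, impact=1, confidence=0.9, effort=2),
}
ranked = sorted(initiatives, key=initiatives.get, reverse=True)
print(ranked)  # onboarding first: score 2000 vs. 900
```

ICE drops the Reach term; either way, writing the scores down forces the explicit estimates that make the ranking debatable rather than intuitive.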

  • View profile for Abhayjeet Kumar Lal

    | Do What Makes you feel Alive | 5L+ Impressions | Like to Explore & Hustle |

    17,001 followers

    𝐌𝐲 𝐃𝐞𝐜𝐢𝐬𝐢𝐨𝐧 𝐌𝐚𝐤𝐢𝐧𝐠 𝐅𝐫𝐚𝐦𝐞𝐰𝐨𝐫𝐤 🧭
    Ever found yourself stuck at a career/decision crossroads, paralyzed by indecision? 🤔 Here's my strategic approach to making choices that transform dilemmas into opportunities: The Decision Making Compass, "From Confusion to Clarity".

    1️⃣ Gain & Loss Ledger
    Create 2 columns -> Potential Gains vs. Potential Losses
    * Be brutally honest and comprehensive
    * Quantify impact wherever possible (financial, career growth, personal development)

    2️⃣ Professional Growth Mapping
    * Visualize each option's trajectory
    * Ask yourself: "Where does this path lead me in 1, 3, and 5 years?"
    * Evaluate skill acquisition, network expansion, and learning opportunities

    3️⃣ Alignment Check
    * Does this choice align with your core professional values?
    * Assess emotional and intellectual satisfaction, not just monetary benefits
    * Trust your intuition, but back it with rational analysis

    4️⃣ Future Proofing
    1) Consider long-term impact over short-term comfort
    2) Embrace choices that challenge you and push boundaries
    Remember that growth happens outside our comfort zone!

    "No decision is permanently irreversible. Every choice is a learning experience that shapes your unique professional journey💪" What's your decision making strategy? Share in the comments below! 👇 #CareerGrowth #ProfessionalDevelopment #StrategicThinking #CareerAdvice

  • View profile for Hari Rastogi

    CEO at RiseUpp.com – India’s #1 Platform for Online Higher Education, Upskilling, & Career Growth | Author of ‘ZERO to CEO’ | IIM Trichy Rank #2 🏅 | Speaker at IIMs/IITs | Featured in CNBC, ET, Business Today

    32,415 followers

    5 Decision-Making Frameworks That Transformed How I Lead RiseUpp.com
    Have you ever faced a crucial business decision that kept you up at night? Last week, while deciding on a major partnership, I reflected on how my decision-making process has evolved since founding RiseUpp. Here are the frameworks that guide me:

    The 10/10/10 Rule
    What will the impact be in 10 minutes, 10 months, and 10 years? This helped me prioritize long-term partnerships over quick wins.

    The Regret Minimization Framework
    Instead of asking "What's the best choice?", I ask "Which choice will I regret the least?" This led us to invest heavily in user experience over rapid expansion.

    Second-Order Thinking
    Looking beyond immediate consequences. When we made our course comparison tool free, we lost short-term revenue but gained massive user trust and market leadership.

    The Eisenhower Matrix
    Urgent vs. Important. This saved me from countless "urgent" meetings that weren't moving us toward our vision of democratizing education.

    The Jeff Bezos "70% Rule"
    If you have 70% of the information needed, make the decision. Waiting for 100% certainty cost us early opportunities. Now we move faster.

    The most valuable lesson? These frameworks aren't rigid rules – they're tools. Sometimes you need to combine them or trust your instinct. What decision-making frameworks do you rely on? Share your experiences below. #Leadership #DecisionMaking #CEOLife #StartupGrowth #BusinessStrategy #EdTech #RiseUpp #OnlineEducation #CareerGrowth #ExecutiveDecisions #StrategicThinking #BusinessLeadership #StartupLife #EntrepreneurMindset #ProfessionalDevelopment
