McKinsey taught me that brilliant people fail when they answer the wrong question.

Don't just answer questions. Frame them. Because a brilliant answer to the wrong question is still wrong.

Ask, "How do we make customer support more efficient?" and everyone races to cut headcount or automate. You might save dollars and bleed trust. Try this instead: "What service approach builds loyalty while balancing cost?" Now you are designing for humans, not just a spreadsheet.

How you frame a question shapes what you notice, what you measure and what you ship. Daniel Kahneman and Amos Tversky called this the framing effect. It's one of the most underrated leadership skills.

I learnt the value of spending time framing the question in my 10 years at McKinsey. At first it felt forced. But projects where we invested serious time up front to define the question led to sharper insights, faster decisions and happier teams and clients. When we didn't take the time, chaos reigned.

Put it into practice this week:

1. Question the question.
↳ What assumptions are baked in? What if you flipped it on its head?

2. Start at the finish line.
↳ Define the outcome or experience you want, then trace back the decisions and actions that create it.

3. Make space for the devil's advocate.
↳ Assign someone to challenge whether you're even solving the right problem.

If you work with data or roll out new tech, your analysis is already shaping outcomes. Make sure you're shaping the right ones.

Have you ever felt like you've missed the mark on the question you're answering? What's one question your team has been wrestling with that might need a reframe?

♻️ Repost to help someone get their question right.
🔔 Follow Clare Kitching for insights on unlocking value with data & AI.
Design Thinking In Customer Experience
Explore top LinkedIn content from expert professionals.
-
Teams often implement solutions that do not fix the problem they were trying to address. That's because the issue wasn't framed correctly in the first place. This is especially true in complex or unfamiliar situations, where quick conclusions feel comforting but are often wrong.

When I work with teams on decision-making, I turn to a framework developed by Julia Binder and Michael Watkins. Their E5 approach helps leaders define the right problem before trying to solve it.

Phase 1: EXPAND
Suspend early judgments and deliberately broaden how the challenge is understood. By exploring multiple interpretations of the issue, teams uncover hidden assumptions, surface blind spots, and create the conditions for more original thinking before jumping to answers.

Phase 2: EXAMINE
Shift from scope to depth. Teams analyze the problem rigorously, moving beyond visible symptoms to identify behavioral patterns, structural drivers, and underlying beliefs that reveal what is truly at play.

Phase 3: EMPATHIZE
Center on the perspectives of those most affected by the issue. Through (real) listening and reflection, teams gain insight into stakeholders' motivations, emotions, concerns, and behaviors, often uncovering needs that data alone cannot reveal.

Phase 4: ELEVATE
Step back to see how the challenge fits within the broader organization. Viewing it through lenses such as structure, people, power, and culture exposes interdependencies and systemic tensions that shape outcomes.

Phase 5: ENVISION
Articulate a clear future state and map a path to reach it. Working backward from a shared definition of success, teams prioritize initiatives, sequence efforts, and align resources to move from understanding to execution.

I've found that when leaders take the time to frame problems well, they increase the likelihood that their solutions will actually matter.
#decisionMaking #leadership #perspective #learning #problems

Source: The model is described in more detail in this Harvard Business Review article: https://lnkd.in/gAeBb5uT
-
Why Every Product Manager Needs A/B Testing 🚀

Imagine cooking up a recipe for the perfect product feature. Would you trust your instincts blindly, or would you test different ingredients to get the best taste? That's where A/B testing comes in. It's the secret sauce that helps Product Managers make data-driven decisions with confidence. Here's everything you need to know to master A/B testing:

❓ What is A/B Testing ❓
A/B testing is the process of comparing two or more versions of a product to determine which one performs better. The versions might differ in small ways - a new button design, a revamped landing page, or an updated pricing structure - but the impact on user behaviour can be monumental. This method helps you validate assumptions, optimize user experiences, and ensure every product decision adds value.

⚙️ How to Conduct a Successful A/B Test ⚙️

🔹 Set Clear Goals
Ask yourself: what are you trying to improve? It could be anything from conversion rates to user satisfaction. Your goal is your North Star.

🔹 Choose the Right Metrics
Metrics like click-through rate (CTR), time spent on a page, or purchase frequency will guide you in evaluating success.

🔹 Hypothesize
Frame your test with a simple prediction. Example: "I believe changing the CTA button color from blue to green will increase clicks by 15%."

🔹 Design Your Experiment
Define your control group (current version) and treatment group (variant to test), ensuring a large enough sample size for reliable results. Run the test for a sufficient duration to capture meaningful patterns in user behaviour.

🔹 Analyze & Implement
Use tools like Google Optimize or Optimizely to analyze results and determine statistical significance. Roll out the winning variant confidently, or refine your hypothesis for future iterations if results are inconclusive.

♻️ Four Types of A/B Tests Every PM Should Know ♻️

1️⃣ Feature Testing: Validate hypotheses for new features pre-launch.
2️⃣ Live Testing: Fine-tune existing features already in the wild.
3️⃣ Trapdoor (Fake Door) Testing: Gauge demand for a feature before building it by exposing its entry point and measuring interest.
4️⃣ Multi-Armed Bandit (MAB): Let machine learning allocate traffic to better-performing variants in real time.

❌ Common Pitfalls to Avoid ❌

1️⃣ Testing trivial changes that won't move the needle.
2️⃣ Ignoring sample size requirements: small audiences lead to unreliable conclusions.
3️⃣ Treating A/B testing as a one-off exercise. Optimization is an ongoing journey.

What's been your most surprising A/B testing discovery? Let's discuss in the comments! 👇

Ready to embark on an exhilarating journey into the heart of product management? I've recently launched a cohort that is focused on teaching end-to-end product management as well as providing career placement opportunities! 🧠

Fill in the form in the comments to register your interest in the cohort and I'll reach out to you with further details. ✍️

#ProductManagement #ABTesting #PMTools #ContinuousOptimization
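The "Analyze & Implement" step above boils down to a statistical-significance check. As a minimal sketch of what the tools do under the hood, here is a two-proportion z-test in plain Python; the function is generic, and the traffic and conversion numbers are hypothetical, not from any real experiment:

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both variants convert equally
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value via the standard normal CDF (expressed with erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical test: blue CTA (control) vs green CTA (variant), 5,000 users each
z, p = two_proportion_z_test(conv_a=200, n_a=5000, conv_b=260, n_b=5000)
print(f"z = {z:.2f}, p = {p:.4f}")
# Declare significance only if p is below your pre-chosen threshold (commonly 0.05)
```

Note how sensitive the result is to sample size: the same 4% vs 5.2% lift with only 500 users per arm would not clear the 0.05 bar, which is exactly the "small audiences lead to unreliable conclusions" pitfall above.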
-
#100RulesofThumb — Rule 12

Never state a problem to yourself in the terms it was brought to you.

TL;DR: It is difficult to see the whole picture when you are inside the frame. If you are in love with an idea, you can't be a good judge of its value.

I find this rule to be a powerful tool for intellectual clarity. Here's the breakdown:

1. Don't state problems as presented: When someone brings you a problem or an idea, their perspective might be skewed and biased. So, step one is not to accept their framing of the issue blindly.

2. Think of the problem as a frame: Any problem, in the way it is proposed, is like a picture frame. If you keep standing inside that frame, you see only what's inside it, and you miss the opportunity to explore the surrounding context and arrive at better frames.

3. Don't fall in love with your framing: If you're emotionally attached to a particular perspective, your judgment gets clouded. You can't see the flaws or merits clearly because your emotions are in the way.

Imagine you're a product manager and your lead designer suggests a particular approach to improve the UX. The first step here would be to recognize that your designer might be attached to this idea because they've invested significant time and effort into it. So, you don't accept the UX problem as it was presented to you. Instead of succumbing to the frame, you step back and ask: "If I weren't attached to this approach, how would I evaluate it objectively?"

You see, smartphone cameras were initially seen as an easy way to click photos on the go. The problem as presented was to improve image quality and resolution. However, the tech ecosystem reframed this problem. They realized that people were using their phones not just to "click photos" but to "capture moments", share experiences, and connect with others. So, rather than focusing solely on improving image quality, they integrated cameras with social media apps. This reframing allowed users to instantly share photos and videos, changing the very nature of media.

Likewise in public transport, the conventional frame might be to make buses and trains faster. But what if we reframe the problem? Instead of focusing on speed, we could focus on making the wait time more enjoyable. This could lead to better-designed bus stops with Wi-Fi and entertainment, making travel from A to B more pleasant.

Or consider a struggling bookstore. The traditional frame is to compete on price with online retailers. But what if we reframe the problem? Instead of just selling books, the store could become a community for book lovers. It could host author events, offer writing-related workshops, and provide a unique experience that online retailers can't match.

So you see, not accepting the frame you are presented with is a powerful tool for thinking. If you employ it regularly, you will 10x your decision-making and judgment for senior business roles.
-
Most reps think hitting pain points is enough. It's not. Because pain without urgency doesn't close.

Think about it...

"Save 2 hours per week on reporting."

That's nice. But it's not moving the CFO to sign tomorrow.

Now compare it to this:

"Your board meeting is tomorrow and you still don't have clean numbers."

One is "meh." The other is signed.

Same story in pipeline deals:

"I want to improve pipeline visibility." = someday
"My biggest customer just went dark and my CRO wants an update at 9 AM." = today

Got the lesson? It's not about finding pain. It's about tying that pain to a time-sensitive trigger your buyer can't ignore. The closer you anchor your message to a deadline, a meeting, or a career risk, the faster the deal moves.

Buyers don't act on abstract problems. They act when the clock is ticking.
-
The idea that A/B tests are autonomous deployment decisions is disingenuous. It ignores the most important reality: someone is 𝘢𝘭𝘸𝘢𝘺𝘴 making a decision. How are they making that decision?

Experimentation is mostly seen as the final stage in a linear process. You conduct some research, conceive an idea, and then test it. In this view, the test IS the decision. If it wins, deploy it without further consideration (and forecast your exact revenue for a year). If it loses, definitely do not implement it and move on to something else.

However, the idea that an A/B test represents a decision is a dangerous illusion:

> There are many potential issues with experiments that you may never observe in the data. You can never be completely certain that the result you see is correct, regardless of what 'significance' indicates. The outcome is just a piece of data and information, not definitive proof of anything, and certainly not a decision.

> 'Data' is only one way of assessing the benefits and outcomes of a change; there are many other factors you may need to take into account, such as broader customer experience, brand perception, alignment with wider strategic initiatives, etc. These aspects cannot be reduced to simple metrics, and if you ignore them you risk damaging your business.

> Experimentation is not just a binary way of deciding whether to deploy something; it is a way to test theories that might support strategy or give rise to bigger ideas. A test is a way to learn something about customer behaviour and develop theories based on that behaviour. By limiting it to a deployment decision, you lose this potential value. Experimentation is just one form of research among others, all of which should be used in parallel to support the entire process of innovation.

More importantly, recognise that YOU are making the decisions, not data. In that case, how are you making decisions? What is your PROCESS & SYSTEM for making effective and efficient decisions?

#ecommerce #retail #digitalexperience #cro #experimentation
-
We have all heard it... AI initiatives failing left and right. Same story, different company.

After running AI Readiness Assessments across industries, the pattern is crystal clear: leaders usually skip a critical step. They don't identify the right problems to solve. Instead? They chase shiny AI solutions without understanding their operational pain points. And the result is what we've been reading about... impressive demos that don't deliver real business value.

We can fix that. We can cut through the noise and identify AI use cases that actually deliver ROI.

→ Start with pain, not possibility
Map your biggest operational bottlenecks first. Where do processes consistently break down? Where do teams burn hours on repetitive tasks that add zero value?

→ Measure the human cost
Calculate time spent, error rates, and opportunity costs. If a process isn't costing you significant resources, AI won't create significant value.

→ Test the AI-human fit
The best AI implementations humanize technology: they make people more effective, not redundant. Ask yourself: will this AI solution work with your team or against it?

→ Validate with small bets
Before building enterprise-wide systems, prove value with contained pilots that have clear success metrics.

Want to see ROI? Follow these steps. 👆 They will help you solve actual business problems with technology that aligns people and processes. Not the other way around.

What's the biggest operational pain point costing your business right now?

♻️ Repost if you found this useful!
______________
𝗙𝗼𝗿 𝗺𝗼𝗿𝗲 𝘁𝗶𝗽𝘀, 𝗳𝗼𝗹𝗹𝗼𝘄 me: @𝗻𝗮𝘀𝘀𝗶𝗮𝘀𝗸𝗼𝘂𝗹𝗶𝗸𝗮𝗿𝗶𝘁𝗶
-
Ever feel stuck trying to pick the "right" design problem to solve? You're not alone. Most designers rush to solutions before they even know if they're solving the right thing.

Here's how I find the best design problems, and how you can too:

• 𝗦𝘁𝗮𝗿𝘁 𝘄𝗶𝘁𝗵 𝗿𝗲𝗮𝗹 𝘂𝘀𝗲𝗿𝘀, 𝗻𝗼𝘁 𝗴𝘂𝗲𝘀𝘀𝗲𝘀.
Watch how people actually use your product. Don't just listen to what they say; see what they do. Dive deep into how they do things right now. You'll spot hidden pain points and strange shortcuts surveys miss.

• 𝗞𝗲𝗲𝗽 𝗮𝘀𝗸𝗶𝗻𝗴 "𝘄𝗵𝘆".
Don't settle for the first answer. Dig deeper. The best problems hide beneath surface complaints. Asking "why" helps you identify the real barriers.

• 𝗗𝗲𝗳𝗶𝗻𝗲 𝘁𝗵𝗲 𝗽𝗿𝗼𝗯𝗹𝗲𝗺 𝗰𝗹𝗲𝗮𝗿𝗹𝘆 𝗯𝗲𝗳𝗼𝗿𝗲 𝗷𝘂𝗺𝗽𝗶𝗻𝗴 𝘁𝗼 𝘀𝗼𝗹𝘂𝘁𝗶𝗼𝗻𝘀.
Write it simply. No jargon, no features. Just what's broken and for whom. If anyone can understand your problem statement, you're on the right track.

• 𝗟𝗼𝗼𝗸 𝗳𝗼𝗿 𝗽𝗮𝘁𝘁𝗲𝗿𝗻𝘀, 𝗻𝗼𝘁 𝗼𝗻𝗲-𝗼𝗳𝗳𝘀.
A good design problem isn't just a bug; it's a pattern. If the same struggle shows up across different places or users, you've found something worth fixing.

• 𝗧𝗵𝗶𝗻𝗸 𝗯𝗲𝘆𝗼𝗻𝗱 𝘁𝗵𝗲 𝗯𝗿𝗶𝗲𝗳.
Sometimes the client's request is just part of the story. Step back. Is there a deeper, bigger problem you can solve? The best designers create solutions people didn't even know they needed.

Solving small, obvious problems is easy. Spotting the invisible problems, the ones that change the whole experience, is what makes you stand out. When you focus on finding the right problems, not just any problem, that's when you start to create real impact.

Follow for more practical design insights you can use every day.
-
founder learnings! part 8: A/B test math interpretation. I love stuff like this.

Two members of our team (Fletcher Ehlers and Marie-Louise Brunet) ran a test recently that decreased click-through rate (CTR) by over 10%: they added a warning telling users they'd need to log in if they clicked. However, instead of hurting conversions like you'd think, it actually increased them. Fewer users clicked through, but overall, more users ended up finishing the flow.

Why? Selection bias and signal vs. noise. By adding friction, we filtered out low-intent users: those who would have clicked but bounced at the next step. The ones who still clicked knew what they were getting into, making them far more likely to convert. Fewer clicks, but higher-quality clicks.

Here's a visual representation of the A/B test results. You can see how the click-through rate (CTR) dropped after adding friction (fewer clicks), but the total number of conversions increased. This highlights the power of understanding selection bias: removing low-intent users improved the quality of clicks, leading to better overall results.
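The arithmetic behind this is worth making concrete. Here is a minimal sketch with made-up funnel numbers (not the actual test data) showing how a lower CTR can coexist with more total conversions once low-intent clicks are filtered out:

```python
def funnel(visitors, ctr, post_click_cvr):
    """Total conversions = visitors x click-through rate x post-click conversion rate."""
    clicks = visitors * ctr
    conversions = clicks * post_click_cvr
    return clicks, conversions

# Hypothetical numbers: the login warning cuts CTR by 12.5%, but the
# remaining clickers are higher intent, so post-click conversion jumps.
clicks_a, conv_a = funnel(10_000, ctr=0.40, post_click_cvr=0.10)  # control
clicks_b, conv_b = funnel(10_000, ctr=0.35, post_click_cvr=0.15)  # with warning

print(f"control: {clicks_a:.0f} clicks, {conv_a:.0f} conversions")
print(f"variant: {clicks_b:.0f} clicks, {conv_b:.0f} conversions")
# The variant gets fewer clicks yet more total conversions,
# because the post-click lift outweighs the CTR drop.
```

The general rule falls out of the multiplication: the variant wins whenever the relative gain in post-click conversion exceeds the relative loss in CTR.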
-
Most project failures aren't execution errors. They're upstream misunderstandings. Your Gantt chart is already in trouble if the problem isn't framed right.

In matrix environments, the pressure to move often overrides the need to understand. So projects get scoped before anyone agrees on what's actually broken. That's why top-performing PMs use something called Phase Zero: a short, high-leverage pre-kickoff moment focused on problem framing, not just planning.

This isn't fluffy. It's structured. Here's how you know problem framing is working:

✔️ Context is documented: why this problem matters now
✔️ Success is defined: what done looks like, clearly and measurably
✔️ Constraints are visible: time, tech, political, or data limitations
✔️ Assumptions are surfaced: what's being taken for granted and tested early
✔️ Stakeholder perspectives are aligned: you've validated that everyone sees the same issue

Skipping this feels faster. But it costs you alignment, momentum, and team trust when change hits mid-execution.

Execution doesn't start at kickoff. It starts with shared clarity. And problem framing is how you get there.

→ Found this helpful? Repost and follow Jesus Romero for frameworks that make execution smarter, not just faster.