User Experience for Educational Tools

Explore top LinkedIn content from expert professionals.

  • View profile for Sim Shagaya

    Founder of Konga, uLesson/Miva, and Myka — building enduring consumer businesses across Africa.

    11,542 followers

Education technology is easy to build in theory. The real challenge is making it work in the hands of a student whose internet drops mid-lesson, or a working mum logging into university for the first time on a shared device. The test is not in creating EdTech tools but in making them work for the people who need them most.

When we started uLesson in 2019, we built a platform with high-quality video lessons, quizzes, and practice tests. Everything worked perfectly in our offices in Jos and, later, Abuja. But that changed when we tried to get them into the hands of students in towns and villages where electricity was unreliable, data was expensive, and smartphones were often shared among siblings.

The same lessons appeared when we launched Miva Open University, an affordable, accessible university that delivers quality education with the same rigour as a physical campus. Creating the platform was one challenge; helping working adults adapt to digital learning for the first time was another. Some of our students had never studied without the structure of a physical classroom. Many were logging in from places where network connectivity was patchy at best.

These challenges sit against a larger backdrop: according to Quartz, only 1 in 4 students applying to university will be accepted. Not because they didn't study hard enough, but, in many cases, because there simply isn't enough room for all of them.

From these experiences, I've learnt that successful EdTech implementation requires:
- Designing for context: Tools must work offline or in low-bandwidth environments.
- Investing in people: Teachers, facilitators, and students need training, support, and trust to use technology effectively.
- Patience in adoption: Communities don't adopt new systems overnight. Value has to be proven, and trust earned, over time.

I remain convinced that EdTech will play a central role in the future of African learning. But for it to truly work, it must be built not just for ambition, but for reality. It has to be built for students walking kilometres to school, for families sharing a single device, and for communities learning to trust digital tools for the first time.

We're still learning. We'll keep improving. And with each iteration, we get closer to delivering not just access, but quality learning wherever a student lives.
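The "designing for context" principle above can be sketched in code. This is a minimal, hypothetical illustration of an offline-first lesson loader, not uLesson's actual implementation; the cache directory and the `fetch_remote` callback are assumptions for the example:

```python
import json
import os

CACHE_DIR = "lesson_cache"  # hypothetical local cache location

def get_lesson(lesson_id, fetch_remote):
    """Offline-first loader: refresh the local copy when the network
    fetch succeeds, and serve the cached copy when it fails."""
    os.makedirs(CACHE_DIR, exist_ok=True)
    path = os.path.join(CACHE_DIR, f"{lesson_id}.json")
    try:
        lesson = fetch_remote(lesson_id)   # may raise on a bad connection
        with open(path, "w") as f:
            json.dump(lesson, f)           # cache for the next outage
        return lesson
    except OSError:
        if os.path.exists(path):
            with open(path) as f:
                return json.load(f)        # fall back to the cached lesson
        raise                              # nothing cached yet
```

The design choice is that a dropped connection degrades to yesterday's lesson rather than a blank screen, which is the difference between a demo that works in the office and a product that works in a village.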

  • View profile for Rishav Gupta

    The “Why” behind the “How” | Product @ ETS

    12,358 followers

I manage a product where if we mess up, someone doesn't get into their dream university. Their life trajectory changes. This changes how you think about product decisions.

We can't A/B test critical flows. Half the users can't be in a control group that gets the worst experience when it's their one shot at the TOEFL. We can't "move fast and break things" when breaking things means a student in country X loses their test slot and has to wait another month to apply. We can't "ship it and iterate" when the iteration window is someone's grad school application deadline.

You need to develop a different risk calculus entirely.

Low-stakes product PM: "Let's try it and see what happens."
High-stakes product PM: "Let's map every scenario where this could go wrong."

Every edge case is someone's only case. Every error is someone's nightmare. Every downtime window is someone's missed opportunity.

And you can't explain this to people who've only built low-stakes products. They think you're being slow. Being overcautious. Being bureaucratic. They don't understand that your user doesn't get a second chance.

This changes how you prioritize:
- Reliability beats features.
- Error handling beats new capabilities.
- Support infrastructure beats growth experiments.

Growth means nothing if the core experience fails when it matters most. Somewhere right now, someone is taking their test. If something breaks, I can't give them their hour back. You either build for this reality, or you don't. There's no middle ground.

#ProductManagement #EdTech #PMLife

  • View profile for Joao Santos

    Expert in education and training policy

    31,685 followers

📚 New EU Report: Promoting Well-being in Digital Education 🇪🇺

The European Commission's Joint Research Centre just released a comprehensive study on well-being in digital education across EU schools. This matters for VET because our learners are digital natives navigating increasingly tech-driven learning environments, and their well-being directly impacts skill acquisition and career readiness.

🧩 The Narrative Flip: We are moving from "Digital First" to "Well-being First." The report argues that digital competence is useless if it comes at the cost of physical health, social connection, or mental stability. It proposes a Model of Emerging Practices that places the human being back at the centre of the digital ecosystem.

🔑 Key insights for VET:

The challenge:
▪️ Digital tech enhances learning BUT creates risks: eye strain, disrupted sleep, cyberbullying, anxiety, and digital divides.
▪️ VET learners face unique pressures: balancing practical skills with digital competence while managing digital fatigue.

The opportunity:
▪️ Whole-school approach works: when leaders, teachers, learners, parents, and EdTech providers collaborate, well-being improves.
▪️ Pedagogical balance is critical: mix digital and analogue methods; use age-appropriate content; build in movement breaks.
▪️ Safety-first design: EdTech must prioritize data privacy, accessibility, and mental health considerations.

What VET can do:
▪️ Train educators on balanced tech use and digital risks, not just digital skills.
▪️ Co-design learning tools with students to ensure they're fit for purpose.
▪️ Establish clear guidelines for device use, screen time, and online communication.
▪️ Address infrastructure gaps: reliable connectivity and devices remain barriers for vulnerable learners.

💡 My take:
▪️ We often treat "digital skills" as a technical box to check. This report shows that true digital competence includes the ability to disconnect, self-regulate, and stay safe.
▪️ If we want a healthy workforce, we must stop treating well-being as an "add-on" to digital education. It is the foundation.
▪️ In VET, we prepare young people for real-world jobs increasingly shaped by AI, automation, and digital collaboration. But if we don't prioritize their well-being in digital learning environments, we risk burnout before they even enter the workforce.

#DigitalEducation #EdTech #SkillsDevelopment #WellBeing

  • View profile for Tamer Sabry

    Chief Product Officer | AI & SaaS Expert | Digital Transformation Leader | Ecommerce & Logistics Specialist | Startup Builder | AI Instructor | Prompt Engineer | Former Amazon VP | Led Multiple Successful Exits

    21,997 followers

Most product managers prioritize features the wrong way. AI can fix that. Here are 3 AI prompts that will change how you rank features based on user needs and business impact:

1️⃣ Comprehensive Feature Analysis: a deep dive into each feature's potential impact and alignment with goals.

💡 Prompt: "Analyze the following features: {feature_list}. For each feature, provide a detailed assessment of its potential impact on user satisfaction, retention, and revenue growth. Consider our current user base demographics, market trends, and competitive landscape. Prioritize these features based on their alignment with our Q4 goal of improving user retention by 15%. Finally, rank the features in order of priority and explain the rationale behind this ranking."

2️⃣ User Feedback Synthesizer: AI-powered analysis of user pain points and feature requests.

💡 Prompt: "Aggregate and analyze customer feedback from the following sources: {feedback_sources} (e.g., app store reviews, customer support tickets, user interviews, NPS surveys). Identify the top 5 recurring themes or pain points mentioned by users. For each theme, provide specific examples of user quotes or data points. Rank these themes based on frequency of mention and severity of impact on user experience. Then, map each theme to potential feature improvements or new feature ideas. Prioritize these feature ideas based on their potential to address user pain points, estimated development effort, and alignment with our product strategy. Share a detailed rationale for your prioritization, including any potential risks or trade-offs to consider."

3️⃣ Development Effort Estimator: a comprehensive analysis of resource requirements.

💡 Prompt: "Estimate the development effort for implementing {feature_name} in our {product_type}, considering our team of 10 engineers and 8-week timeline. Break down the implementation into key components or stages (e.g., design, frontend development, backend development, testing, deployment). For each component, estimate the number of engineer-days required, potential technical challenges, and any dependencies on other systems or third-party integrations. Consider our team's expertise and any learning curve associated with new technologies. Identify any potential bottlenecks or risks that could impact the timeline. Suggest strategies to mitigate these risks, such as parallel development tracks or phased rollout approaches. Provide a confidence level (low, medium, high) for each estimate and explain the reasoning. Finally, give a range estimate for the total development time (best case, expected case, worst case) and suggest any features or scope that could be adjusted to fit within the 8-week timeline if necessary."

Product Managers, these AI prompts are designed to enhance your decision-making, not replace it. Use them to gain data-driven insights, then apply your expertise to make the final call.
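The `{feature_list}`-style placeholders in prompts like these are usually filled programmatically before being sent to a model. A minimal sketch of that templating step, with an illustrative function name (the prompt text follows the first prompt above; nothing here calls a real API):

```python
def build_feature_prompt(feature_list, goal):
    """Fill the feature-analysis prompt template with concrete inputs.
    `feature_list` is a list of feature names; `goal` is the quarter's
    target, e.g. "improving user retention by 15%"."""
    features = ", ".join(feature_list)
    return (
        f"Analyze the following features: {features}. "
        "For each feature, provide a detailed assessment of its potential "
        "impact on user satisfaction, retention, and revenue growth. "
        f"Prioritize these features based on their alignment with our goal of {goal}. "
        "Finally, rank the features in order of priority and explain the "
        "rationale behind this ranking."
    )
```

Keeping the template in one function means the same prompt can be re-run each quarter with an updated feature list and goal, which makes the ranking reproducible rather than ad hoc.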

  • View profile for Maxim Poulsen

    GTM stuff @Contrast | #1 webinar platform for HubSpot | Growth & Automation Nerd

    54,509 followers

6mo ago → 70% of users skipped onboarding. How we reduced this to 22%:

A bit of context:
- Our activation sucked (~25%)
- People got to the 'Aha moment' too late
- Some never got to it at all
- Others gave up before they got to it

So we asked ourselves: "How can we front-load the 'Aha moment' and get people to experience it as early as possible?"

So we launched the "Playground":
→ Fully interactive demo on homepage
→ All features from the Studio
→ No signup needed

Early data showed it was working:
→ Prospects familiar with Studio before demos
→ Website traffic (+100% on release week)
→ Signup % (2% → 5%)

But this didn't improve activation: people needed to get value from our product.

We're a playful team. But we know some people have sh*t to get done. We have 2 different "types" of signups:
1. Explore the product (to see if it's worth switching)
2. Set up their first event and landing page

Everyone thinks of onboarding as linear. But by definition, that won't work for everyone.

Introducing: flexible funnels.

We have a lot of gamers on the team. So that's where we looked for inspiration. Games let you roam free (even through tutorials). This gives users a sense of control. And discovery.

So we created a new onboarding flow (see gif below), and the first results are promising:
→ 55% of people explore the Studio
→ 22% of people set up a webinar

Only 22% of people skip (mostly post-demo signups).

What prompted us to make the change? We thought about what would help users, instead of pushing them down the path we wanted them to follow.
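The "flexible funnels" idea, routing each signup to a track that matches their intent instead of one fixed sequence, can be sketched as data plus a tiny lookup. The track names and steps below are invented for illustration, not Contrast's actual flow:

```python
# Hypothetical non-linear onboarding: each intent gets its own track,
# and steps are suggestions the user can complete in any order.
ONBOARDING_TRACKS = {
    "explore": ["open_playground", "tour_studio", "invite_team"],
    "set_up_event": ["create_event", "build_landing_page", "send_invites"],
}

def next_steps(intent, completed):
    """Return the remaining steps for this user's chosen track.
    Unknown intents fall back to the low-commitment explore track."""
    track = ONBOARDING_TRACKS.get(intent, ONBOARDING_TRACKS["explore"])
    return [step for step in track if step not in completed]
```

Modeling the funnel as data rather than hard-coded screens is what makes it cheap to add a third track later when a new signup "type" shows up in the numbers.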

  • View profile for Rupika Taneja

    Co-Founder at Codeyoung | Ex-Flipkart | IIT Delhi

    6,520 followers

One of the biggest mistakes we can make in EdTech? Thinking that dashboards and reports tell the full story of learning. (We're guilty of this too; here's what we did to navigate it.)

At Codeyoung, we decided to challenge that. We ran a customer feedback initiative and involved the leadership team, not just the support team. Instead of relying solely on numbers, we spoke directly to our customers: the parents. We reached out to parents and personally connected with them on 1:1 calls. No filters, just real conversations about their child's learning journey. And what we learned was eye-opening.

1/ What's working: Parents shared how their kids are growing in confidence, logical thinking, and problem-solving: skills that go beyond the classroom.

2/ What needs work: Some parents pointed out areas we could improve, insights we would never have spotted in a dashboard. For example, parents wanted more involvement in their child's learning journey, so we started sending daily summaries to address that, and now they are loving it!

3/ What surprised us: One parent had an idea so valuable, we're already working on bringing it to life! Coming soon :)

𝐖𝐡𝐲 𝐓𝐡𝐢𝐬 𝐌𝐚𝐭𝐭𝐞𝐫𝐬

📍 Proximity to Customers = Better Decisions
A spreadsheet can tell you completion rates, but a conversation tells you how a child feels while learning. That's where the real insights are.

📍 Leadership Needs to Be Involved
This wasn't just a task for our customer support team. Our leadership was in these conversations because building a great learning experience starts with truly understanding the learner.

📍 Data vs. Stories: You Need Both
Metrics tell you what's happening, but parents and students tell you why it matters. And sometimes, that one insight can redefine an entire strategy.

At Codeyoung, we're not just teaching kids; we're shaping how they learn, think, and grow. And that's why listening to parents and students will always be at the heart of what we do. If you're building in EdTech, how often do you talk to your customers?

#founder #customers #edtech

  • View profile for Bahareh Jozranjbar, PhD

    UX Researcher at PUX Lab | Human-AI Interaction Researcher at UALR

    10,022 followers

User experience surveys are often underestimated. Too many teams reduce them to a checkbox exercise: a few questions thrown in post-launch, a quick look at average scores, and then back to development. But that approach leaves immense value on the table. A UX survey is not just a feedback form; it's a structured method for learning what users think, feel, and need at scale, a design artifact in its own right.

Designing an effective UX survey starts with a deeper commitment to methodology. Every question must serve a specific purpose aligned with research and product objectives. This means writing questions with cognitive clarity and neutrality, minimizing effort while maximizing insight. Whether you're measuring satisfaction, engagement, feature prioritization, or behavioral intent, the wording, order, and format of your questions matter. Even small design choices, like using semantic differential scales instead of Likert items, can significantly reduce bias and enhance the authenticity of user responses.

When we ask users, "How satisfied are you with this feature?" we might assume we're getting a clear answer. But subtle framing, mode of delivery, and even time of day can skew responses. Research shows that midweek deployment, especially on Wednesdays and Thursdays, significantly boosts both response rate and data quality. In-app micro-surveys work best for contextual feedback after specific actions, while email campaigns are better for longer, reflective questions, if properly timed and personalized.

Sampling and segmentation are not just statistical details; they're strategy. Voluntary surveys often over-represent highly engaged users, so proactively reaching less vocal segments is crucial. Carefully designed incentive structures (that don't distort motivation) and multi-modal distribution (like combining in-product, email, and social channels) offer more balanced and complete data.

Survey analysis should also go beyond averages. Tracking distributions over time, comparing segments, and integrating open-ended insights lets you uncover both patterns and outliers that drive deeper understanding. One-off surveys are helpful, but longitudinal tracking and transactional pulse surveys provide trend data that allows teams to act on real user sentiment changes over time.

The richest insights emerge when we synthesize qualitative and quantitative data. An open comment field that surfaces friction points, layered with behavioral analytics and sentiment analysis, can highlight not just what users feel, but why.

Done well, UX surveys are not a support function; they are core to user-centered design. They can help prioritize features, flag usability breakdowns, and measure engagement in a way that's scalable and repeatable. But this only works when we elevate surveys from a technical task to a strategic discipline.
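"Go beyond averages" and "compare segments" can be made concrete with a few lines of analysis code. A minimal sketch (the segment labels and 1-5 scale are illustrative): it reports, per segment, both the mean score and the full response distribution, so a bimodal split of delighted and frustrated users is not flattened into a misleading middle number.

```python
from collections import Counter
from statistics import mean

def summarize_by_segment(responses):
    """Summarize survey scores per segment.
    `responses` is a list of (segment, score) pairs, scores on a 1-5 scale.
    Returns each segment's mean AND its score distribution."""
    by_segment = {}
    for segment, score in responses:
        by_segment.setdefault(segment, []).append(score)
    return {
        seg: {
            "mean": round(mean(scores), 2),
            "distribution": dict(Counter(scores)),  # score -> count
        }
        for seg, scores in by_segment.items()
    }
```

A segment whose mean is 3.0 but whose distribution is all 1s and 5s needs a very different response from one whose users genuinely feel neutral, which is exactly the kind of pattern an averages-only readout hides.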

  • View profile for Hasanga Abeyaratne

    Create something new while fully preserving what is familiar.

    13,810 followers

Before you write a single requirement, consider this: are you solving the right problems?

To ensure your product aligns with user needs and supports your business goals, start with a problem framing session and design thinking workshop. Why? By involving users early and identifying relevant problems, you can:

1. Identify which problems and feature requests are truly relevant.
2. Uncover pain points users experience.
3. Align features with your business goals to maximize impact.

The benefit? Designers gain clarity on user priorities, while diverse perspectives uncover fresh insights into overlooked challenges, ensuring solutions that align with both user needs and business objectives.

The result:
• A more user-centric product.
• No wasted development resources on irrelevant features.
• A stronger competitive edge.

Start by framing the problem to uncover what will have the most impact, and include designers and user testing to build smarter, more effective products.

  • View profile for Chris Agnew

    ⚡️Future Focused Learning | AI Research | Applied & Experiential Learning Evangelist 🌱

    7,811 followers

Everywhere you look, there's a survey asking students if they're using AI for learning. The problem is, we don't have real user data on how much they're using it and, more importantly, how they're using it.

Last month, Anthropic released a paper on CLIO (Claude Insights and Observations), a tool that analyzes usage patterns of Claude while protecting user privacy. Think of it as Google Trends, but for LLMs. It didn't get nearly the attention it deserved. (link in the comments)

Imagine applying this idea to education: CLIO for learning. Instead of relying on surveys, real anonymized data would help education leaders understand how high school and college students are engaging with AI tools like LLMs in their coursework.

1️⃣ How often are students using AI tools?
2️⃣ Are they using them as "answer engines" or for deeper exploration of topics?
3️⃣ What drives brief, one-and-done interactions versus extended, curiosity-driven engagement in a topic?

Right now, we have no real data points for teachers or school leaders to understand how students are interacting with these tools. Banning AI doesn't work. AI detection tools are ineffective at best. Arming school and district leaders with data on the volume of use, the types of use, and what contributes to use that furthers learning would give the millions of gifted educators across the country the information they need to evolve learning environments that keep rigor, improve engagement, and help young people thrive in the future.
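The core move behind a CLIO-style system, aggregating interaction patterns while never retaining conversation content, can be sketched very simply. This is a toy illustration, not Anthropic's method: here a session is reduced to its turn count before aggregation, and the thresholds separating "answer engine" use from deeper exploration are invented for the example.

```python
from collections import Counter

def aggregate_usage(session_turn_counts):
    """Privacy-preserving aggregate: keep only counts of interaction
    patterns, never the text of any conversation.
    `session_turn_counts` is a list of per-session turn counts."""
    def pattern(turns):
        if turns <= 2:
            return "answer_engine"      # quick lookup, one-and-done
        elif turns <= 6:
            return "guided_practice"
        return "deep_exploration"       # extended, curiosity-driven

    return Counter(pattern(t) for t in session_turn_counts)
```

Because only the histogram leaves the pipeline, a school leader could see the share of one-and-done lookups versus extended exploration without anyone ever reading a student's chat.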

  • View profile for Vartika Mishra

    Marketer | Building in AI | Sharing What I Learn Every Day

    40,585 followers

If your onboarding feels clunky, confusing, or last-minute, your client can feel it too. The work doesn't begin after the payment. It begins the moment someone says "yes." And this is where most people drop the ball. I've been there too, until I started using AI to simplify, personalize, and hold space for my onboarding flow, without losing the human in the process. Here's what that looks like:

Step 1: Welcome, with intention
As soon as a client signs up, I feed their context to ChatGPT: "Write a warm welcome email to a new client who just signed up for [X service]. Acknowledge their goals, set the tone for our work together, and share what to expect this week." It helps me start the relationship right, with presence, not a template.

Step 2: Kickoff kit, custom to them
Instead of sending a generic Notion board or onboarding doc, I use AI to create a personalized one-pager:
- Their name, goals, timeline
- Pre-work checklist
- Tools we'll use
- Access links
- FAQs based on their niche
It makes them feel seen.

Step 3: Pre-call prep that's actually useful
If I've collected form answers or voice notes, I prompt: "Summarize this client's challenges and suggest 3 angles I should explore in our kickoff call." I walk into the call aligned and calm. They feel it.

Step 4: Clarity recap, fast
After the call, I feed my notes to ChatGPT: "Turn this into a call recap email with clear next steps and aligned expectations. Keep it real, not robotic." It saves 30 minutes of staring at the screen and helps me build trust in the tiny details.

Step 5: Ongoing onboarding, quietly handled
Need reminders? Nudges? Status updates? I'll set up small AI workflows that keep things moving without nagging or micro-managing.

Because onboarding isn't a task. It's the first chapter of your client experience. You don't need AI to replace the way you work. But you can use it to hold the edges, so you show up more fully in the middle. That's what onboarding should feel like. Intentional. Warm. Clear. And deeply human.

If you want the actual AI stack I use to support this flow (without feeling cold or corporate), comment "ONBOARD" or DM me and I'll send it over. Follow Vartika Mishra!
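The Step 1 prompt above becomes repeatable once the client's context is filled in programmatically. A small sketch, with illustrative field names standing in for whatever an intake form actually collects:

```python
def welcome_email_prompt(client):
    """Assemble the Step 1 welcome-email prompt from a client's context.
    `client` is a dict with hypothetical keys: name, service, goal."""
    return (
        f"Write a warm welcome email to a new client named {client['name']} "
        f"who just signed up for {client['service']}. "
        f"Acknowledge their goal of {client['goal']}, set the tone for our "
        "work together, and share what to expect this week."
    )
```

The prompt stays the same; only the context changes per client, which is how the flow scales without becoming the generic template the post argues against.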
