I manage a product where if we mess up, someone doesn't get into their dream university. Their life trajectory changes. That changes how you think about product decisions.

We can't A/B test critical flows. Half the users can't be in a control group that gets the worse experience when it's their one shot at the TOEFL. We can't "move fast and break things" when breaking things means a student in country X loses their test slot and has to wait another month to apply. We can't "ship it and iterate" when the iteration window is someone's grad school application deadline.

You need to develop a different risk calculus entirely. Low-stakes product PM: "Let's try it and see what happens." High-stakes product PM: "Let's map every scenario where this could go wrong."

Every edge case is someone's only case. Every error is someone's nightmare. Every downtime window is someone's missed opportunity.

And you can't explain this to people who've only built low-stakes products. They think you're being slow. Being overcautious. Being bureaucratic. They don't understand that your user doesn't get a second chance.

This changes how you prioritize:
- Reliability beats features.
- Error handling beats new capabilities.
- Support infrastructure beats growth experiments.

Growth means nothing if the core experience fails when it matters most. Somewhere right now, someone is taking their test. If something breaks, I can't give them their hour back. You either build for this reality, or you don't. There's no middle ground.

#ProductManagement #EdTech #PMLife
Prioritizing User Experience in EdTech Development
Summary
Prioritizing user experience in EdTech development means designing educational technology so that real students and educators can easily use it and benefit from it, especially when their learning opportunities—and sometimes their futures—are at stake. This approach focuses on making digital tools reliable, accessible, and relevant to the unique challenges faced by learners and teachers in diverse environments.
- Build for reliability: Always ensure your platform works dependably, even in situations where technical failures could disrupt critical learning moments or deadlines.
- Design with empathy: Take time to understand the context and needs of students and educators, including those with limited access to devices or internet and those experiencing digital learning for the first time.
- Listen and adapt: Gather real feedback directly from users—parents, students, and teachers—so you can keep improving the platform and address concerns that dashboards and analytics might miss.
Education technology is easy to build in theory. The real challenge is making it work in the hands of a student whose internet drops mid-lesson, or a working mum logging into university for the first time on a shared device. The test is not in creating EdTech tools but in making them work for the people who need them most.

When we started uLesson in 2019, we built a platform with high-quality video lessons, quizzes, and practice tests. Everything worked perfectly in our offices in Jos and, later, Abuja. But that changed when we tried to get those lessons into the hands of students in towns and villages where electricity was unreliable, data was expensive, and smartphones were often shared among siblings.

The same lessons appeared when we launched Miva Open University, an affordable, accessible university that delivers quality education with the same rigour as a physical campus. Creating the platform was one challenge; helping working adults adapt to digital learning for the first time was another. Some of our students had never studied without the structure of a physical classroom. Many were logging in from places where network connectivity was patchy at best.

These challenges sit against a larger backdrop: according to Quartz, only 1 in 4 students applying to university will get accepted. Not because they didn't study hard enough; in many cases, it is simply because there isn't enough room for all of them.

From these experiences, I've learnt that successful EdTech implementation requires:
- Designing for context: Tools must work offline or in low-bandwidth environments (one possible pattern is sketched below).
- Investing in people: Teachers, facilitators, and students need training, support, and trust to use technology effectively.
- Patience in adoption: Communities don't adopt new systems overnight. Value has to be proven, and trust earned, over time.

I remain convinced that EdTech will play a central role in the future of African learning. But for it to truly work, it must be built not just for ambition, but for reality. It has to be built for students walking kilometres to school, for families sharing a single device, and for communities learning to trust digital tools for the first time.

We're still learning. We'll keep improving. And with each iteration, we get closer to delivering not just access, but quality learning wherever a student lives.
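To make the "designing for context" point concrete, here is a minimal, hypothetical sketch of one common pattern: a client that probes effective bandwidth, picks the richest video rendition it can afford, and falls back to a local cache when offline. This is not uLesson's actual implementation; the URLs, rendition table, and function names are all illustrative assumptions.

```python
import os
import time
import urllib.request

CACHE_DIR = "lesson_cache"  # local fallback store for offline playback

# Available renditions of the same lesson, smallest first (hypothetical URLs).
RENDITIONS = [
    {"label": "audio_only", "kbps": 64,   "url": "https://cdn.example.com/lesson1_audio.mp4"},
    {"label": "240p",       "kbps": 300,  "url": "https://cdn.example.com/lesson1_240p.mp4"},
    {"label": "720p",       "kbps": 1500, "url": "https://cdn.example.com/lesson1_720p.mp4"},
]

def measure_bandwidth_kbps(probe_url: str, probe_bytes: int = 65536) -> float:
    """Estimate effective bandwidth by timing a small ranged download."""
    req = urllib.request.Request(probe_url, headers={"Range": f"bytes=0-{probe_bytes - 1}"})
    start = time.monotonic()
    with urllib.request.urlopen(req, timeout=5) as resp:
        data = resp.read()
    elapsed = max(time.monotonic() - start, 1e-3)
    return (len(data) * 8 / 1000) / elapsed  # kilobits per second

def pick_rendition(bandwidth_kbps: float) -> dict:
    """Choose the richest rendition that fits in ~70% of measured bandwidth."""
    budget = bandwidth_kbps * 0.7
    viable = [r for r in RENDITIONS if r["kbps"] <= budget]
    return viable[-1] if viable else RENDITIONS[0]

def get_lesson(lesson_id: str) -> str:
    """Return a playable path: cached copy when offline, else adaptive download."""
    cached = os.path.join(CACHE_DIR, f"{lesson_id}.mp4")
    try:
        kbps = measure_bandwidth_kbps(RENDITIONS[0]["url"])
    except OSError:
        # No connectivity: serve whatever was cached earlier, if anything.
        if os.path.exists(cached):
            return cached
        raise RuntimeError("Offline and no cached copy of this lesson")
    rendition = pick_rendition(kbps)
    os.makedirs(CACHE_DIR, exist_ok=True)
    urllib.request.urlretrieve(rendition["url"], cached)  # cache for next time
    return cached
```

The design choice worth noting: the cache is written on every successful download, so a student who watched a lesson on a good connection can replay it later with no connection at all.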
-
📚 New EU Report: Promoting Well-being in Digital Education 🇪🇺

The European Commission's Joint Research Centre just released a comprehensive study on well-being in digital education across EU schools. This matters for VET because our learners are digital natives navigating increasingly tech-driven learning environments—and their well-being directly impacts skill acquisition and career readiness.

🧩 The narrative flip: we are moving from "Digital First" to "Well-being First." The document argues that digital competence is useless if it comes at the cost of physical health, social connection, or mental stability. It proposes a Model of Emerging Practices that places the human being back at the centre of the digital ecosystem.

🔑 Key insights for VET:

The challenge:
▪️ Digital tech enhances learning BUT creates risks: eye strain, disrupted sleep, cyberbullying, anxiety, and digital divides
▪️ VET learners face unique pressures—balancing practical skills with digital competence while managing digital fatigue

The opportunity:
▪️ Whole-school approach works: when leaders, teachers, learners, parents, and EdTech providers collaborate, well-being improves
▪️ Pedagogical balance is critical: mix digital and analogue methods; use age-appropriate content; build in movement breaks
▪️ Safety-first design: EdTech must prioritize data privacy, accessibility, and mental health considerations

What VET can do:
▪️ Train educators on balanced tech use and digital risks—not just digital skills
▪️ Co-design learning tools with students to ensure they're fit-for-purpose
▪️ Establish clear guidelines for device use, screen time, and online communication
▪️ Address infrastructure gaps—reliable connectivity and devices remain barriers for vulnerable learners

💡 My take:
▪️ We often treat "digital skills" as a technical box to check. This report proves that true digital competence includes the ability to disconnect, self-regulate, and stay safe.
▪️ If we want a healthy workforce, we must stop treating well-being as an "add-on" to digital education. It is the foundation.
▪️ In VET, we prepare young people for real-world jobs increasingly shaped by AI, automation, and digital collaboration. But if we don't prioritize their well-being in digital learning environments, we risk burnout before they even enter the workforce.

#DigitalEducation #EdTech #SkillsDevelopment #WellBeing
-
One of the biggest mistakes we can make in EdTech? (We're guilty of this too, plus what we did to navigate it.) Thinking that dashboards and reports tell the full story of learning.

At Codeyoung, we decided to challenge that. We ran a customer feedback initiative and involved the leadership team, not just the support team. Instead of relying solely on numbers, we spoke directly to our customers—the parents. We reached out to parents and personally connected with them on 1:1 calls. No filters, just real conversations about their child's learning journey. And what we learned was eye-opening.

1/ What's working: Parents shared how their kids are growing in confidence, logical thinking, and problem-solving—skills that go beyond the classroom.
2/ What needs work: Some parents pointed out areas we could improve: insights we would never have spotted in a dashboard. For example, parents wanted more involvement in their child's learning journey, so we started sending daily summaries to address that, and now they are loving it!
3/ What surprised us: One parent had an idea so valuable, we're already working on bringing it to life! Coming soon :)

Why This Matters

📍 Proximity to customers = better decisions. A spreadsheet can tell you completion rates, but a conversation tells you how a child feels while learning. That's where the real insights are.
📍 Leadership needs to be involved. This wasn't just a task for our customer support team. Our leadership was in these conversations because building a great learning experience starts with truly understanding the learner.
📍 Data vs. stories: you need both. Metrics tell you what's happening, but parents and students tell you why it matters. And sometimes, that one insight can redefine an entire strategy.

At Codeyoung, we're not just teaching kids; we're shaping how they learn, think, and grow. And that's why listening to parents and students will always be at the heart of what we do.

If you're building in EdTech, how often do you talk to your customers?

#founder #customers #edtech
-
User experience surveys are often underestimated. Too many teams reduce them to a checkbox exercise: a few questions thrown in post-launch, a quick look at average scores, and then back to development. But that approach leaves immense value on the table. A UX survey is not just a feedback form; it's a structured method for learning what users think, feel, and need at scale, and a design artifact in its own right.

Designing an effective UX survey starts with a deeper commitment to methodology. Every question must serve a specific purpose aligned with research and product objectives. This means writing questions with cognitive clarity and neutrality, minimizing effort while maximizing insight. Whether you're measuring satisfaction, engagement, feature prioritization, or behavioral intent, the wording, order, and format of your questions matter. Even small design choices, like using semantic differential scales instead of Likert items, can significantly reduce bias and enhance the authenticity of user responses.

When we ask users, "How satisfied are you with this feature?" we might assume we're getting a clear answer. But subtle framing, mode of delivery, and even time of day can skew responses. Research shows that midweek deployment, especially on Wednesdays and Thursdays, significantly boosts both response rate and data quality. In-app micro-surveys work best for contextual feedback after specific actions, while email campaigns are better for longer, reflective questions, if properly timed and personalized.

Sampling and segmentation are not just statistical details; they're strategy. Voluntary surveys often over-represent highly engaged users, so proactively reaching less vocal segments is crucial. Carefully designed incentive structures (that don't distort motivation) and multi-modal distribution (like combining in-product, email, and social channels) offer more balanced and complete data.

Survey analysis should also go beyond averages. Tracking distributions over time, comparing segments, and integrating open-ended insights lets you uncover both patterns and outliers that drive deeper understanding (a minimal example of segment comparison follows below). One-off surveys are helpful, but longitudinal tracking and transactional pulse surveys provide trend data that allows teams to act on real changes in user sentiment over time.

The richest insights emerge when we synthesize qualitative and quantitative data. An open comment field that surfaces friction points, layered with behavioral analytics and sentiment analysis, can highlight not just what users feel, but why.

Done well, UX surveys are not a support function; they are core to user-centered design. They can help prioritize features, flag usability breakdowns, and measure engagement in a way that's scalable and repeatable. But this only works when we elevate surveys from a technical task to a strategic discipline.
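Here is a small, self-contained sketch of the "go beyond averages" idea: comparing full score distributions across two hypothetical user segments instead of comparing means. The data is invented for illustration; pandas and SciPy are assumed to be available.

```python
import pandas as pd
from scipy.stats import mannwhitneyu

# Toy survey responses: 1-5 satisfaction scores tagged by user segment.
df = pd.DataFrame({
    "segment": ["power"] * 8 + ["casual"] * 8,
    "score":   [5, 4, 5, 4, 5, 3, 5, 4,
                3, 2, 4, 1, 3, 2, 3, 5],
})

# Means hide shape: report the full distribution per segment instead.
dist = df.groupby("segment")["score"].value_counts(normalize=True).unstack(fill_value=0)
print(dist.round(2))

# Compare segments with a rank-based test, which suits ordinal Likert-style
# data better than a t-test on means.
power = df.loc[df.segment == "power", "score"]
casual = df.loc[df.segment == "casual", "score"]
stat, p = mannwhitneyu(power, casual, alternative="two-sided")
print(f"Mann-Whitney U = {stat:.1f}, p = {p:.3f}")
```

The same two-step habit, look at the distribution first, then test with a method that respects ordinal data, also scales to tracking shifts in the distribution release over release.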
-
Before you write a single requirement, consider this: are you solving the right problems?

To ensure your product aligns with user needs and supports your business goals, start with a problem framing session and a design thinking workshop.

Why? By involving users early and identifying relevant problems, you can:
1. Identify which problems and feature requests are truly relevant.
2. Uncover the pain points users experience.
3. Align features with your business goals to maximize impact.

The benefit? Designers gain clarity on user priorities, while diverse perspectives uncover fresh insights into overlooked challenges—ensuring solutions that align with both user needs and business objectives.

The result:
• A more user-centric product.
• No development resources wasted on irrelevant features.
• A stronger competitive edge.

Start by framing the problem to uncover what will have the most impact, and include designers and user testing to build smarter, more effective products.
-
Today, during a call with a customer, we heard a perspective that deserves broader attention in the world of online proctoring and assessments.

"With 10,000+ candidates, all with very different levels of tech skill, our biggest hurdle isn't just security—it's making sure every single test-taker can easily get set up, log in, and feel confident on exam day."

In an era where AI and automation are transforming proctoring, this conversation reminds us that the candidate journey is just as important as security. For this customer, the real differentiator isn't just locking down browsers or monitoring for misconduct; it's reducing the time and support needed for candidates to get comfortable with the technology, especially when proficiency varies so widely.

Key takeaways for EdTech and proctoring:
- Candidate tech support can consume more human hours than live proctoring itself.
- Streamlined onboarding and integrated, user-friendly platforms are vital for both integrity and inclusivity.
- AI's most valuable role may be in guiding and reassuring candidates, not just watching for rule-breakers.

This reminder to design for empathy, not just compliance, is a lesson every EdTech leader should internalize.

https://lnkd.in/gXxY6BWY
-
I've been thinking about this recent Inside Higher Ed article about the mental load students carry and the impact it has on their ability to learn. As a father, I'm acutely aware of helping my children "balance life" and how that balance impacts their mental wellbeing. As a technologist, understanding this balance is critical to building learning tools for students.

Beyond building tools to improve learning and mastery of concepts, we must build with our end user in mind: the student. We owe it to students to build products that consider and support their mental wellbeing by reducing stress, encouraging stronger engagement, and ultimately helping them feel in control of their learning pace.

The IHE article revealed that improving students' study habits (54%) and having more time to study (47%) are the top ways to alleviate some of the mental anxiety they face. And edtech can help in those areas—especially if products are built with mental wellbeing in mind.

Some areas that must be considered throughout development:
⏱️ Accessibility and versatility: meeting students with options to engage the product when, how, and where they want
🔐 Safety: ensuring comfortable, secure, inclusive tech environments that support individual learning styles
💡 Engagement and community: creating learning moments that inspire curiosity, spark new ideas, and encourage open collaboration
🏆 Recognition: encouraging and motivating students to engage and celebrate learning moments

What other considerations must be made in developing tools that support greater mental wellbeing and make learning a "lighter lift"?

#EdTech #HigherEducation #MentalWellness

https://lnkd.in/gmz_T7pF
-
When teams need to move fast but still prioritize UX, flexible in-app elements can be a way to iterate without waiting on engineering bandwidth.

For Eyeball Football Technologies, a platform built for football players, the goal was to make the homepage experience dynamic and responsive without long development cycles. Instead of building everything from scratch, they used interactive elements that could be updated instantly without engineering effort:

1/ Smart activation nudges to encourage players to sign up and unlock the full experience
2/ In-app stories to launch educational content—like key English phrases to help players communicate better across multilingual teams
3/ Season-based placeholders to keep players engaged while waiting for match data, ensuring a seamless transition when the season starts

This approach let Eyeball refine their user experience on the go, ensuring players always had relevant, engaging content—without waiting on app updates.

#appgrowth #userexperience #userengagement #nocode
-
Most product managers prioritize features the wrong way. AI can fix that. Here are 3 AI prompts that will change how you rank features based on user needs and business impact:

1️⃣ Comprehensive Feature Analysis: a deep dive into each feature's potential impact and alignment with goals.

💡 Prompt: "Analyze the following features: {feature_list}. For each feature, provide a detailed assessment of its potential impact on user satisfaction, retention, and revenue growth. Consider our current user base demographics, market trends, and competitive landscape. Prioritize these features based on their alignment with our Q4 goal of improving user retention by 15%. Finally, rank the features in order of priority and explain the rationale behind this ranking."

2️⃣ User Feedback Synthesizer: AI-powered analysis of user pain points and feature requests.

💡 Prompt: "Aggregate and analyze customer feedback from the following sources: {feedback_sources} (e.g., app store reviews, customer support tickets, user interviews, NPS surveys). Identify the top 5 recurring themes or pain points mentioned by users. For each theme, provide specific examples of user quotes or data points. Rank these themes based on frequency of mention and severity of impact on user experience. Then, map each theme to potential feature improvements or new feature ideas. Prioritize these feature ideas based on their potential to address user pain points, estimated development effort, and alignment with our product strategy. Share a detailed rationale for your prioritization, including any potential risks or trade-offs to consider."

3️⃣ Development Effort Estimator: a comprehensive analysis of resource requirements.

💡 Prompt: "Estimate the development effort for implementing {feature_name} in our {product_type}, considering our team of 10 engineers and 8-week timeline. Break down the implementation into key components or stages (e.g., design, frontend development, backend development, testing, deployment). For each component, estimate the number of engineer-days required, potential technical challenges, and any dependencies on other systems or third-party integrations. Consider our team's expertise and any learning curve associated with new technologies. Identify any potential bottlenecks or risks that could impact the timeline. Suggest strategies to mitigate these risks, such as parallel development tracks or phased rollout approaches. Provide a confidence level (low, medium, high) for each estimate and explain the reasoning. Finally, give a range estimate for the total development time (best case, expected case, worst case) and suggest any features or scope that could be adjusted to fit within the 8-week timeline if necessary."

Product Managers, these AI prompts are designed to enhance your decision making, not replace it. Use them to gain data-driven insights, then apply your expertise to make the final call. (A minimal sketch of running one of these prompts programmatically follows below.)
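For teams who want to run prompts like these repeatedly rather than paste them into a chat window, here is a minimal sketch that fills the {feature_list} placeholder of the first prompt (abridged) and sends it to a chat-completion endpoint. It assumes the OpenAI Python SDK (v1+) and an OPENAI_API_KEY environment variable; the model name and feature list are illustrative placeholders, and any comparable LLM API would work the same way.

```python
from openai import OpenAI

# Abridged version of the first prompt above, with its placeholder kept intact.
PROMPT_TEMPLATE = (
    "Analyze the following features: {feature_list}. For each feature, provide a "
    "detailed assessment of its potential impact on user satisfaction, retention, "
    "and revenue growth. Prioritize these features based on their alignment with "
    "our Q4 goal of improving user retention by 15%. Finally, rank the features "
    "in order of priority and explain the rationale behind this ranking."
)

# Hypothetical feature list for illustration only.
features = ["offline lesson downloads", "parent progress digests", "live tutor chat"]

client = OpenAI()  # reads OPENAI_API_KEY from the environment
response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; use whatever model your org has access to
    messages=[
        {"role": "system", "content": "You are a product strategy analyst."},
        {"role": "user", "content": PROMPT_TEMPLATE.format(feature_list=", ".join(features))},
    ],
)
print(response.choices[0].message.content)
```

Keeping the template and the feature list separate makes it easy to re-run the same analysis as the backlog changes, which is where the consistency of a scripted prompt beats ad hoc chat sessions.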