Medical students across seven Shanghai schools reported using AI tools about five times per week on average, with more than 90% using at least two platforms and over 60% using three or more. Mainstream models such as DeepSeek, Doubao, and ChatGPT dominate, with students deploying them for theoretical learning, exam question analysis, information retrieval, literature interpretation, and research design. Despite this heavy use, only about one-fifth reported access to a dedicated institutional AI education platform, and satisfaction with those platforms was mixed, underscoring a gap between organic student demand and formal educational infrastructure.

Key takeaways
- AI adoption is high and multipolar: 94% use DeepSeek, 58% Doubao, and 65% ChatGPT, and 91% use two or more platforms, reflecting a "tool stack" approach rather than reliance on a single model.
- Usage patterns differ by stage and role: undergraduates lean on AI for exam prep, while master's and doctoral students use it more for research tasks like study design and data analysis; part-time students show especially strong needs in research and exam analysis due to work–study pressures.
- Disciplines shape use cases: clinical and nursing students rely more on AI for question analysis, whereas basic medicine, public health, and pharmacy students emphasize research support; traditional Chinese medicine students have lower demand for literature translation and interpretation.
- Only 20% reported an institutional AI education platform, and satisfaction among users was modest (mean ~72/100, with ~21% dissatisfied), with no strong predictors of satisfaction, suggesting current platforms are generic rather than tightly aligned with task–technology fit.
- Students overwhelmingly expect future platforms to go beyond generic chatbots toward tailored functions such as literature translation and interpretation, exam analysis, clinical trial support, basic lab assistance, knowledge mapping, frontier knowledge curation, virtual simulation, and even intelligent emotional support, with priorities varying by discipline and educational stage.

Dipu's Take
Medical schools are no longer deciding whether students will use AI; they are deciding whether that use will be intentional, equitable, and well-governed. The signal here is clear: students want institutionally supported, discipline-aware, and stage-specific platforms that integrate AI into core workflows. The real opportunity is to treat AI platforms as part of the educational architecture: co-designed with students, differentiated for undergrads vs. postgrads and clinical vs. basic sciences, and evaluated on how well they actually fit tasks like exam prep, research, and skills training, rather than as a one-size-fits-all pilot project.
Analyzing User Behavior in Educational Platforms
Explore top LinkedIn content from expert professionals.
Summary
Analyzing user behavior in educational platforms means studying how students interact with online learning tools to understand their learning patterns, preferences, and challenges. By examining user actions, like which features they use or how long they stay engaged, educators and technologists can improve educational experiences and outcomes for diverse groups of learners.
- Track meaningful actions: Look beyond simple metrics like logins and monitor deeper activities such as discussions, quiz completions, and resource usage to gain a clearer view of student engagement (a minimal scoring sketch follows this list).
- Pair data with insight: Combine user data with behavioral observations and student feedback to understand not just what learners do, but why they make certain choices or face obstacles.
- Adapt for every learner: Use behavior patterns to design features and support systems that match the unique needs of different user groups, improving the platform experience for all.
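As a concrete illustration of the first point, here is a minimal sketch in Python with pandas that scores engagement from a raw event log, weighting deeper actions more heavily than logins. The file name, event names, and weights are illustrative assumptions, not details from any of the posts below.

```python
# Minimal sketch: score engagement from a raw event log, weighting
# deeper actions above logins. Event names and weights are assumptions.
import pandas as pd

# Hypothetical event log: one row per student action
events = pd.read_csv("events.csv")  # columns: student_id, event_type, timestamp

weights = {
    "login": 0,              # logins alone say little about engagement
    "page_view": 1,
    "resource_download": 2,
    "quiz_completed": 4,
    "discussion_post": 5,    # deep, social actions weigh most
}

events["weight"] = events["event_type"].map(weights).fillna(1)
engagement = (
    events.groupby("student_id")["weight"]
    .sum()
    .rename("engagement_score")
    .sort_values(ascending=False)
)
print(engagement.head(10))
```

A weighted score like this is a crude proxy, but it already separates students who merely log in from those who actually participate.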
-
A few years ago, I worked with an online education platform facing challenges with student engagement. While they had a significant number of users enrolling in courses, they struggled with low participation rates in course discussions and activities, leading to a decline in course completion rates. The platform needed to identify the causes behind low engagement and implement strategies to encourage more active participation.

Improving Student Engagement Using Data Analytics

1️⃣ Analyzing Engagement Data
We began by analyzing user interaction data, focusing on metrics such as time spent on the platform, participation in discussions, video completion rates, and quiz scores. Using SQL, we aggregated the data to identify patterns and pinpoint where students were losing interest.

```sql
SELECT
    student_id,
    course_id,
    AVG(time_spent) AS avg_time_spent,
    COUNT(discussion_post_id) AS posts_made,
    AVG(quiz_score) AS avg_quiz_score
FROM student_activity
GROUP BY student_id, course_id;
```

🔹 Insight: Students who interacted with course discussions and quizzes had higher completion rates, while others dropped off quickly.

2️⃣ Building a Predictive Model
We then created a predictive model to determine which students were at risk of disengaging based on their activity patterns. The model incorporated features such as time spent on the platform, participation in discussions, and progress through the course material.

```python
# Pseudocode for the predictive model: train on labeled historical
# activity, then score current students (training and scoring on the
# same data would leak labels).
def predict_student_engagement(historical_data, current_data):
    model = train_engagement_model(historical_data)
    return model.predict(current_data)
```

🔹 Insight: This model helped us flag students who were likely to disengage early, allowing for timely interventions.

3️⃣ Implementing Engagement Strategies
Based on insights from the model, we implemented strategies such as personalized reminder emails, incentives for completing activities, and more interaction opportunities through live Q&A sessions.

```python
# Pseudocode for the engagement follow-up; the trained model is
# passed in explicitly rather than read from a global.
def send_engagement_reminder(model, student_data):
    if model.predict(student_data) == "at_risk":
        send_email_reminder(student_data)
```

🔹 Insight: Personalized engagement and incentives led to an increase in student participation.

Challenges Faced
- Identifying meaningful engagement metrics that were predictive of success.
- Finding the right balance: engaging students without overwhelming them.

Business Impact
✔ Student engagement improved, leading to higher completion rates.
✔ Retention rates increased, as more students continued with courses.
✔ Revenue grew, driven by more active and satisfied students.

Key Takeaway: By analyzing user activity and leveraging predictive analytics, businesses can identify disengaged customers early and implement strategies to improve engagement and retention.
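For readers who want to see what the pseudocode above could look like in practice, here is a minimal sketch using scikit-learn. The file name, column names, the dropped_out label, and the 0.7 risk threshold are all assumptions for illustration; the original post does not specify the model or features beyond those listed.

```python
# Minimal sketch of an engagement-risk model like the one described above.
# File name, column names, the dropped_out label, and the 0.7 threshold
# are all illustrative assumptions.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Hypothetical per-student features, aggregated as in the SQL query above
df = pd.read_csv("student_activity_aggregated.csv")
features = ["avg_time_spent", "posts_made", "avg_quiz_score"]

# Assumed label: 1 if the student later abandoned the course
X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["dropped_out"], test_size=0.2, random_state=42
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))

# Score students; anyone above the threshold gets an intervention
df["risk_score"] = model.predict_proba(df[features])[:, 1]
at_risk = df.loc[df["risk_score"] > 0.7, "student_id"]
```

A gradient-boosted or logistic model would work just as well here; the important design choice is training on historical cohorts whose outcomes are known, then scoring current students.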
-
"Beta dhokha dega, data nahi." ("Your son may deceive you, but data never will.") Sounds reassuring, right? But in education, and especially in online courses, this belief can quietly mislead us.

Yes, data analytics in education helps us track logins, completion rates, drop-offs, and quiz scores. It tells us what happened. But through a behavioural science lens, data rarely tells us WHY it happened.

📉 A learner drops out of a MOOC. The data says: low engagement after Week 3. The behavioural reality may be:
👉 Cognitive overload
👉 Loss of identity ("people like me don't finish MOOCs")
👉 Present bias ("I'll do it later")
👉 Lack of social accountability

None of this shows up cleanly on a dashboard. When we become obsessed with metrics, we risk:
❌ Designing for completion rates, not learning
❌ Nudging clicks instead of shaping habits
❌ Treating learners as datapoints, not humans with context, emotion, and constraints

In #MOOCs, more data ≠ better decisions unless it is paired with:
🧠 Behavioural diagnostics
🧪 Experimentation (A/B tests grounded in theory; a sketch follows below)
💬 Qualitative insight

So maybe the wiser mantra is: "Beta bhi dhokha de sakta hai, data bhi ..... agar behaviour ko samjhe bina dekha." ("Even your son can deceive you, and so can data... if you read it without understanding behaviour.")

Data is a tool. #Behaviour is the truth behind it.
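To make the "A/B tests with theory" point concrete, here is a minimal sketch of how such an experiment might be evaluated, assuming a behaviourally motivated nudge (say, a social-accountability reminder) was randomized across two arms. The counts below are placeholders, not data from the post.

```python
# Minimal sketch: did a behaviourally motivated nudge improve completion?
# Counts are placeholders, not data from this post.
from statsmodels.stats.proportion import proportions_ztest

completions = [412, 365]   # treatment (nudge) vs. control
enrolled = [2000, 2000]    # learners randomized into each arm

stat, pvalue = proportions_ztest(completions, enrolled)
print(f"z = {stat:.2f}, p = {pvalue:.4f}")
```

The behavioural theory earns its keep in choosing what to test; the statistics only tell you whether the nudge moved the metric.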
-
🚀 Steal Our Customer Education Strategy: Link Learning Directly to Behavior 🚀

Customer education metrics shouldn't stop at course completions or CSAT scores. Those metrics are useful, but they don't tell you whether your training actually changes what customers do. That's why our team set out to connect training directly to product behavior. Here's how we approached it:

1️⃣ Export learner data from our LMS (who took which courses, certifications, or live trainings).
2️⃣ Upload those cohorts into Gainsight PX as Segments.
3️⃣ Run retention and feature adoption analyses comparing trained vs. all users.

Some key results:
📈 Admins who attended ILT logged in 41% more often than those who didn't.
📊 Accounts with trained end users showed a 36% higher product retention rate.
🚀 Certified users consistently demonstrated higher adoption of key features.

Why this matters: when we can show that training changes behavior in the product, it becomes crystal clear that education is a business lever and not just a nice-to-have. It also helps us prioritize the content that drives the biggest impact.

💡 Pro tip: You don't need a giant data team to get started. Even a simple analysis on 1–2 key features can tell a powerful story.

#CustomerEducation #DigitalAdoption #LearningAnalytics
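Gainsight PX handles the cohort comparison in-product, but the same analysis can be sketched in a few lines of pandas for teams without that tooling. The file and column names below are assumptions for illustration.

```python
# Minimal sketch: compare retention and usage for trained vs. untrained
# users from two exports. File and column names are assumptions.
import pandas as pd

lms = pd.read_csv("lms_completions.csv")   # user_id of trained users
usage = pd.read_csv("product_usage.csv")   # user_id, retained_90d, logins_per_week

usage["trained"] = usage["user_id"].isin(lms["user_id"])

summary = usage.groupby("trained").agg(
    users=("user_id", "nunique"),
    retention_rate=("retained_90d", "mean"),
    avg_weekly_logins=("logins_per_week", "mean"),
)
print(summary)
```

One caveat worth noting: comparing a trained cohort against everyone else is not a randomized experiment, since motivated users may self-select into training, so lifts like these are best read as correlational.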
-
Ever since launching Shikho AI, I have been diving into how learners interact with it across different demographics. I previously shared how actively students across different cities were using Shikho AI. Instead of just tracking usage, we focused on how rural and urban female students learn differently.

By analyzing 20,000 recent sessions, we identified six learning behavior patterns, including:
- Quick Clarification: single or few questions for quick answers, minimal follow-up
- Deep Exploration: multiple questions showing progressively deeper understanding of a topic
- Struggling Learner: repeated similar questions, difficulty grasping concepts
- Exam Focused: concentrated preparation, practice problems, exam strategies
- Homework Session: focused problem-solving for specific assignments

These patterns became our new lens for understanding learning, not just activity, and the rural-urban comparison was one of the most revealing findings. The data revealed fascinating contrasts: rural female learners showed more quick clarification (62% vs 58%), while urban learners showed more deep exploration (19% vs 17%) and homework sessions (11% vs 8%). Struggling learner patterns were twice as common in rural areas (4% vs 2%). Each percentage point reflects a real story of access, context, and learning style.

Insights like these are helping us design more context-aware AI features inside Shikho AI that adapt to each learner's needs. This is just the beginning of understanding how AI can truly democratize education in Bangladesh.

What patterns have you noticed in your product data that completely changed how you thought about your users?

#AIinEducation #BangladeshEdTech
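The post does not describe how the behavior patterns were derived, but a common approach is to cluster session-level features. Here is a minimal sketch along those lines, assuming hypothetical feature names and k = 6 to match the six patterns; none of this comes from the original post.

```python
# Minimal sketch: cluster session-level features into behavior patterns.
# Feature names and k=6 are assumptions; the post does not say how the
# patterns were actually derived.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

sessions = pd.read_csv("tutor_sessions.csv")  # hypothetical session export
features = [
    "num_questions",
    "followup_ratio",
    "repeat_question_ratio",
    "session_minutes",
    "practice_problem_count",
]

X = StandardScaler().fit_transform(sessions[features])
sessions["pattern"] = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(X)

# Cluster centers suggest human-readable labels for each pattern
print(sessions.groupby("pattern")[features].mean().round(2))
```

Inspecting the cluster centers is how labels like "Quick Clarification" or "Struggling Learner" would typically be assigned by hand afterward.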