✅ Solved a SQL problem on LeetCode — Day 48 of my SQL Journey 💪

Not all active users are valuable… some just look busy 👀

Today’s problem was about identifying zombie sessions — users who scroll endlessly but never click, never buy, never engage. Active on paper, dead in reality.

The approach:
• Calculated session duration using MIN() and MAX() timestamps
• Counted scrolls and clicks with conditional aggregation
• Computed the click-to-scroll ratio inside HAVING
• Filtered sessions: duration > 30 min, scrolls ≥ 5, ratio < 0.20, zero purchases

What I practised:
• Conditional aggregation using CASE WHEN
• Session-level grouping
• Ratio logic inside HAVING
• Translating behaviour into SQL filters

What stood out — scrolling is passive, clicking is intent, buying is commitment. A session full of scrolls but no clicks is just a ghost. That’s where the real insight lies.

SQL doesn’t just count events. It reads the story behind them.

Consistent learning, one query at a time 🚀

#SQL #LeetCode #DataAnalytics #LearningInPublic #SQLPractice
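A minimal sketch of the zombie-session filter described above. The `events` schema, the thresholds, and the minutes-since-start `event_time` column are all illustrative assumptions (not the actual LeetCode schema), and it runs here on SQLite via Python's `sqlite3`; on MySQL you would compute the duration from real timestamps with TIMESTAMPDIFF instead.

```python
import sqlite3

# Hypothetical schema: event_time is minutes since session start,
# so duration is simply MAX - MIN. Thresholds mirror the post.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE events (session_id INT, event_type TEXT, event_time INT);
INSERT INTO events VALUES
 (1,'scroll',0),(1,'scroll',5),(1,'scroll',10),(1,'scroll',15),
 (1,'scroll',20),(1,'scroll',25),(1,'click',30),(1,'scroll',40),
 (2,'scroll',0),(2,'click',5),(2,'purchase',10);
""")

zombies = conn.execute("""
SELECT session_id,
       MAX(event_time) - MIN(event_time)                      AS duration_min,
       SUM(CASE WHEN event_type = 'scroll' THEN 1 ELSE 0 END) AS scrolls,
       SUM(CASE WHEN event_type = 'click'  THEN 1 ELSE 0 END) AS clicks
FROM events
GROUP BY session_id
HAVING duration_min > 30
   AND scrolls >= 5
   AND clicks * 1.0 / scrolls < 0.20     -- * 1.0 forces real division
   AND SUM(CASE WHEN event_type = 'purchase' THEN 1 ELSE 0 END) = 0
""").fetchall()
print(zombies)   # session 1 only: long, scroll-heavy, click-poor, no purchase
```

Session 2 drops out on every condition at once, which is the point of stacking the filters in a single HAVING clause.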
✅ Solved a SQL problem on LeetCode — Day 49 of my SQL Journey 💪

Consistency isn’t always visible… but patterns reveal it over time 📈

Today’s problem was about identifying users with persistent behaviour — not just active users, but those who show the same action consistently across days.

I worked on analysing behaviour patterns to:
• Track user actions across consecutive days
• Group continuous activity using ROW_NUMBER() logic
• Identify streaks by adjusting date gaps
• Filter users with meaningful streak length (≥ 5 days)
• Select the strongest pattern per user using RANK()

What I practised:
• Window functions for streak detection
• Handling sequential data with date logic
• GROUP BY + HAVING for filtering patterns
• Translating consistency into measurable conditions

What stood out — activity can be random… consistency is never random. When actions repeat over time, they stop being noise and become behaviour. That’s where real insights start.

SQL doesn’t just analyse data. It uncovers patterns hidden in time.

Consistent learning, one query at a time 🚀

#SQL #LeetCode #SQLPractice #DataAnalytics #LearningInPublic
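The ROW_NUMBER() streak trick above is the classic gaps-and-islands pattern: consecutive days share the same value of date minus row number. A minimal sketch under an assumed `activity(user_id, action, day)` schema, run on SQLite (window functions need SQLite ≥ 3.25); MySQL would use DATEDIFF instead of julianday.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE activity (user_id INT, action TEXT, day TEXT);
INSERT INTO activity VALUES
 (1,'post','2024-01-01'),(1,'post','2024-01-02'),(1,'post','2024-01-03'),
 (1,'post','2024-01-04'),(1,'post','2024-01-05'),
 (1,'post','2024-01-10'),
 (2,'post','2024-01-01'),(2,'post','2024-01-02');
""")

streaks = conn.execute("""
WITH grouped AS (
  SELECT user_id, day,
         -- consecutive days share the same (date - row_number) key,
         -- so a gap in dates starts a new group
         julianday(day) - ROW_NUMBER() OVER (
           PARTITION BY user_id ORDER BY day) AS grp
  FROM activity
)
SELECT user_id, MIN(day) AS start_day, COUNT(*) AS streak_len
FROM grouped
GROUP BY user_id, grp
HAVING streak_len >= 5          -- keep only meaningful streaks
""").fetchall()
print(streaks)   # user 1's five-day run; the isolated Jan 10 day is excluded
```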
✅ Solved a SQL problem on LeetCode — Day 39 of my SQL Journey 💪

Some customers order anytime… but some follow patterns you can actually measure 📊

Today’s problem: identify “Golden Hour Customers” — people who consistently order during peak hours and leave high ratings.

I used a single-pass aggregation query to:
• Count total orders per customer
• Flag orders placed during peak hours ⏰
• Calculate the peak-hour percentage
• Compute the average rating while handling NULLs
• Filter customers across all conditions at once

What I practised:
• CASE WHEN inside aggregations
• Applying multiple conditions with HAVING
• Time-based filtering using HOUR()
• Combining behavioural and quality metrics in one query

What stood out — good customers aren’t just active… they’re consistent in when and how they engage. That’s where the real insight lies.

SQL isn’t just about breaking problems into steps. Sometimes it’s about bringing everything together — cleanly.

Consistent learning, one query at a time 🚀

#SQL #LeetCode #SQLPractice #DataAnalytics #LearningInPublic
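One possible shape for that single-pass query. The `orders` schema, the 18:00–21:00 peak window, and the thresholds are invented for illustration; it runs on SQLite, where hour extraction is `strftime('%H', ...)` rather than MySQL's HOUR(). Note how AVG over a 0/1 CASE gives the peak-hour percentage directly, and how AVG(rating) silently skips NULL ratings.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (customer_id INT, order_time TEXT, rating REAL);
INSERT INTO orders VALUES
 (1,'2024-01-01 19:00',5),(1,'2024-01-02 20:00',4),
 (1,'2024-01-03 18:30',5),(1,'2024-01-04 12:00',NULL),
 (2,'2024-01-01 09:00',3),(2,'2024-01-02 10:00',4);
""")

golden = conn.execute("""
SELECT customer_id,
       COUNT(*) AS total_orders,
       -- AVG of a 0/1 flag = fraction of orders in the peak window
       AVG(CASE WHEN CAST(strftime('%H', order_time) AS INT) BETWEEN 18 AND 21
                THEN 1.0 ELSE 0.0 END) AS peak_pct,
       AVG(rating) AS avg_rating       -- AVG ignores NULL ratings
FROM orders
GROUP BY customer_id
HAVING total_orders >= 3
   AND peak_pct >= 0.7
   AND avg_rating >= 4
""").fetchall()
print(golden)
```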
🚀 Day 34/100 – SQL Practice (LeetCode)

Today I solved the “Daily Leads and Partners” problem.

🔹 Approach:
- Identified that the dataset contains duplicate records
- Goal: calculate unique leads and unique partners for each date_id and make_name
- Used GROUP BY to organize the data and COUNT(DISTINCT ...) to count only unique values

🔹 SQL Solution:

SELECT date_id,
       make_name,
       COUNT(DISTINCT lead_id) AS unique_leads,
       COUNT(DISTINCT partner_id) AS unique_partners
FROM DailySales
GROUP BY date_id, make_name;

🔹 What I Learned:
✔️ Difference between DISTINCT and COUNT(DISTINCT)
✔️ Handling duplicate data efficiently
✔️ Importance of grouping before aggregation
✔️ Writing cleaner and more optimized SQL queries

💡 Key Takeaway:
👉 When you need unique counts from duplicated data, COUNT(DISTINCT column) is your best friend.

Consistency + Practice = Growth 📈

#Day34 #100DaysOfCode #SQL #LeetCode #DataAnalytics #LearningJourney
🚀 Day 50 of My SQL Learning Journey

Today I solved a medium-level SQL problem involving user retention analysis 🔥

🔹 Problem: Find the fraction of players who logged in again the day after their first login
🔗 Problem Link: https://lnkd.in/gsSuBeg5

🔹 Solution:

SELECT ROUND(
         COUNT(DISTINCT a.player_id)
         / (SELECT COUNT(DISTINCT player_id) FROM Activity), 2) AS fraction
FROM Activity a
JOIN (
  SELECT player_id, MIN(event_date) AS first_login
  FROM Activity
  GROUP BY player_id
) f ON a.player_id = f.player_id
   AND a.event_date = DATE_ADD(f.first_login, INTERVAL 1 DAY);

🔹 Key Learnings:
- Using a subquery to find each player’s first login
- Applying date functions (DATE_ADD)
- Calculating ratios using aggregation
- Real-world concept: user retention analysis

💡 SQL is powerful for analyzing user behavior and retention patterns!

Consistency continues 🚀

#SQL #LeetCode #90DaysOfCode #CodingJourney #DataAnalytics #InterviewPreparation
💡 𝐂𝐓𝐄 𝐦𝐚𝐝𝐞 𝐦𝐲 𝐒𝐐𝐋 𝟏𝟎𝐱 𝐜𝐥𝐞𝐚𝐧𝐞𝐫

Earlier, my SQL queries looked like this:
• Nested queries inside nested queries 😵💫
• Hard to read
• Even harder to debug

Then I discovered CTEs (Common Table Expressions)… and everything changed.

👉 Instead of writing one giant messy query, I started breaking it into steps.

𝐋𝐢𝐤𝐞 𝐭𝐡𝐢𝐬:
• Step 1: Get base data
• Step 2: Apply filters
• Step 3: Do aggregations
• Step 4: Final output

All in a clean, readable flow.

💻 𝐄𝐱𝐚𝐦𝐩𝐥𝐞 𝐦𝐢𝐧𝐝𝐬𝐞𝐭:

WITH step1 AS (...),
     step2 AS (...),
     step3 AS (...)
SELECT * FROM step3;

✨ 𝐖𝐡𝐚𝐭 𝐈 𝐥𝐨𝐯𝐞 𝐚𝐛𝐨𝐮𝐭 𝐂𝐓𝐄:
• Makes queries readable
• Easy to debug step-by-step
• Reusable logic in the same query
• Feels like writing code, not chaos

💡 Write SQL in a way that anyone can easily understand.

#SQL #DataAnalytics #CTE #DataAnalyst #LearningSQL #TechGrowth 🚀
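The step1/step2/step3 mindset above, made concrete. The `sales` table and the paid-orders logic are invented for illustration; the chain of named steps is the actual CTE technique, shown here running on SQLite through Python's `sqlite3`.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (region TEXT, amount INT, status TEXT);
INSERT INTO sales VALUES
 ('north',100,'paid'),('north',50,'refunded'),
 ('south',200,'paid'),('south',300,'paid');
""")

result = conn.execute("""
WITH base AS (                     -- step 1: get base data
  SELECT region, amount, status FROM sales
),
filtered AS (                      -- step 2: apply filters
  SELECT region, amount FROM base WHERE status = 'paid'
),
aggregated AS (                    -- step 3: do aggregations
  SELECT region, SUM(amount) AS total FROM filtered GROUP BY region
)
SELECT region, total               -- step 4: final output
FROM aggregated
ORDER BY total DESC
""").fetchall()
print(result)   # [('south', 500), ('north', 100)]
```

Each step can be tested on its own by swapping the final SELECT to read from an earlier CTE, which is exactly the debug-step-by-step benefit the post describes.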
✅ Solved a SQL problem on LeetCode — Day 42 of my SQL Journey 💪

Reactions look random… but behaviour usually isn’t 🎯

Today’s problem was about identifying emotionally consistent users — people who tend to react the same way across different content, not just react more.

The approach:
• Counted total reactions per user and per reaction type
• Used a window function on top of the aggregation to get totals in one pass
• Calculated the ratio of the dominant reaction to total reactions
• Filtered users where the ratio crossed 60%
• Used RANK() to identify the most frequent reaction per user

What I practised:
• Layering window functions over aggregations
• Applying RANK() to capture dominant behaviour
• Ratio-based filtering instead of raw counts
• Translating consistency into a measurable SQL condition

What stood out — consistency isn’t about reacting more… it’s about reacting the same way, repeatedly. Once you frame it as a ratio problem, the pattern becomes clearly visible. That’s where the real insight lies.

Also noticed — SQL syntax isn’t universal. Things like ::numeric or QUALIFY don’t work in MySQL. Small difference, but important to keep in mind.

Consistent learning, one query at a time 🚀

#SQL #LeetCode #SQLPractice #DataAnalytics #WindowFunctions #LearningInPublic
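A sketch of the dominant-reaction ratio under an assumed `reactions(user_id, reaction)` table, with the 60% threshold from the post. It runs on SQLite (window functions need ≥ 3.25); since SQLite, like MySQL, has no QUALIFY, the windowed rank is filtered in an outer query instead.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE reactions (user_id INT, reaction TEXT);
INSERT INTO reactions VALUES
 (1,'like'),(1,'like'),(1,'like'),(1,'like'),(1,'like'),(1,'like'),(1,'like'),
 (1,'wow'),(1,'wow'),(1,'sad'),
 (2,'like'),(2,'like'),(2,'wow'),(2,'wow');
""")

dominant = conn.execute("""
WITH counts AS (
  SELECT user_id, reaction, COUNT(*) AS cnt
  FROM reactions
  GROUP BY user_id, reaction
),
ranked AS (
  SELECT user_id, reaction, cnt,
         SUM(cnt) OVER (PARTITION BY user_id)                   AS total,
         RANK()   OVER (PARTITION BY user_id ORDER BY cnt DESC) AS rk
  FROM counts
)
SELECT user_id, reaction, ROUND(cnt * 1.0 / total, 2) AS ratio
FROM ranked
WHERE rk = 1                      -- the user's most frequent reaction
  AND cnt * 1.0 / total > 0.60    -- dominance threshold from the post
""").fetchall()
print(dominant)   # user 1 likes 7/10 of the time; user 2 is split 50/50
```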
✅ Solved a SQL problem on LeetCode — Day 38 of my SQL Journey 💪

Most queries give you answers… but some force you to define the question first ❓

Today’s problem was about identifying loyal customers — not just frequent buyers, but those who show consistent behaviour over time.

I worked on analysing customer patterns to:
• Ensure at least 3 meaningful purchase actions
• Check long-term engagement (30+ days of activity)
• Filter out unstable behaviour through the refund ratio
• Combine all conditions cleanly using GROUP BY + HAVING

What stood out to me — loyalty isn’t a single metric. A customer can purchase often and still not be reliable, because behaviour is not about volume… it’s about consistency. That’s where the real insight lies.

Also noticed something important — boolean logic inside aggregations (like SUM(condition)) works smoothly in MySQL, but small differences like integer division can break results in other SQL environments. Tiny detail. Big impact.

SQL isn’t just about querying data. It’s about defining reality through logic.

Consistent learning, one query at a time 🚀

#SQL #LeetCode #SQLPractice #DataAnalytics #LearningInPublic
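The conditions above might combine like this. The `events` schema and the 0.5 refund-ratio cutoff are assumptions; the SUM(condition) boolean trick and the integer-division caveat from the post are both shown explicitly. Runs on SQLite, which happens to share MySQL's 0/1 comparison semantics but, unlike MySQL, does integer division on integers, so the `* 1.0` actually matters here.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE events (customer_id INT, event TEXT, day TEXT);
INSERT INTO events VALUES
 (1,'purchase','2024-01-01'),(1,'purchase','2024-01-15'),
 (1,'purchase','2024-02-05'),(1,'refund','2024-02-06'),
 (2,'purchase','2024-01-01'),(2,'refund','2024-01-02'),
 (2,'refund','2024-01-03'),(2,'purchase','2024-01-04');
""")

loyal = conn.execute("""
SELECT customer_id,
       SUM(event = 'purchase') AS purchases,        -- boolean sum: 0/1 per row
       julianday(MAX(day)) - julianday(MIN(day)) AS active_days,
       SUM(event = 'refund') * 1.0                  -- * 1.0 avoids int division
         / SUM(event = 'purchase') AS refund_ratio
FROM events
GROUP BY customer_id
HAVING purchases >= 3
   AND active_days >= 30
   AND refund_ratio < 0.5
""").fetchall()
print(loyal)   # only customer 1 passes all three conditions
```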
✅ Solved a SQL problem on LeetCode — Day 40 of my SQL Journey 💪

Not every customer leaves suddenly… some show signs before they churn ⚠️

Today’s problem was about identifying churn-risk customers — users who are still active but showing warning behaviour.

I used aggregation and event analysis to:
• Track each user’s latest subscription status
• Identify downgrade behaviour over time
• Compare current spend with the historical maximum
• Measure total subscription duration
• Filter users based on combined risk signals

What I practised:
• Working with event-based data using GROUP BY
• Using CASE WHEN to capture behavioural signals
• Extracting latest values with ordered aggregation
• Applying multiple conditions to detect patterns

What stood out — churn doesn’t happen instantly… it builds up through small changes. A downgrade here, a drop in spending there. That’s where the real insight lies.

SQL isn’t just about analysing what happened. It’s about spotting what might happen next.

Consistent learning, one query at a time 🚀

#SQL #LeetCode #SQLPractice #DataAnalytics #LearningInPublic
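One way those signals might combine. The `subs` schema, the 90-day tenure floor, and the price-downgrade test are all assumptions rather than the problem's actual definition; the pattern shown (ROW_NUMBER for the latest row per user, joined against per-user history) is a common alternative to the ordered aggregation the post mentions. Runs on SQLite ≥ 3.25.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE subs (user_id INT, plan_price INT, event_day TEXT);
INSERT INTO subs VALUES
 (1,50,'2024-01-01'),(1,30,'2024-03-01'),(1,10,'2024-05-01'),
 (2,20,'2024-01-01'),(2,40,'2024-04-01');
""")

at_risk = conn.execute("""
WITH latest AS (
  SELECT user_id, plan_price AS current_price,
         ROW_NUMBER() OVER (PARTITION BY user_id
                            ORDER BY event_day DESC) AS rn
  FROM subs
),
history AS (
  SELECT user_id,
         MAX(plan_price) AS max_price,
         julianday(MAX(event_day)) - julianday(MIN(event_day)) AS tenure_days
  FROM subs
  GROUP BY user_id
)
SELECT l.user_id, l.current_price, h.max_price
FROM latest l
JOIN history h ON l.user_id = h.user_id
WHERE l.rn = 1                          -- latest subscription row per user
  AND l.current_price < h.max_price     -- downgraded from historical peak
  AND h.tenure_days >= 90               -- long-tenured users only
""").fetchall()
print(at_risk)   # user 1 downgraded from 50 to 10; user 2 upgraded, so excluded
```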
✅ Solved a SQL problem on LeetCode — Day 46 of my SQL Journey 💪

Customers don’t just buy products… they buy patterns together 🛒

Today’s problem was about finding product categories that are frequently bought together — the logic behind “you might also like.”

The approach:
• Mapped each user to their purchased categories
• Used a self-join to create category pairs
• Applied category1 < category2 to avoid duplicate pairs
• Counted shared users per pair and filtered the meaningful ones

What I practised:
• Self-joins to relate rows within the same dataset
• Using DISTINCT for accurate counting
• Applying GROUP BY with HAVING
• Translating real behaviour into SQL logic

What stood out — recommendations aren’t random… they’re built on repeated patterns. When enough users repeat the same combination, it becomes a signal. That’s where the real insight lies.

SQL doesn’t just store transactions. It uncovers relationships between them.

Consistent learning, one query at a time 🚀

#SQL #LeetCode #DataAnalytics #LearningInPublic #SQLPractice
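The self-join pair trick can be sketched in a few lines. The `purchases` schema and the minimum of 2 shared users are assumptions; the `a.category < b.category` condition is the piece from the post, keeping exactly one ordering of each pair so (shoes, socks) and (socks, shoes) aren't counted twice.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE purchases (user_id INT, category TEXT);
INSERT INTO purchases VALUES
 (1,'shoes'),(1,'socks'),(2,'shoes'),(2,'socks'),
 (3,'shoes'),(3,'socks'),(3,'hats'),(4,'hats');
""")

pairs = conn.execute("""
SELECT a.category AS category1,
       b.category AS category2,
       COUNT(DISTINCT a.user_id) AS shared_users
FROM purchases a
JOIN purchases b
  ON a.user_id = b.user_id
 AND a.category < b.category     -- one ordering per pair, no self-pairs
GROUP BY a.category, b.category
HAVING shared_users >= 2         -- keep only meaningful co-occurrence
""").fetchall()
print(pairs)   # shoes+socks co-occur for three users; hats pairs are too rare
```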
🚀 Day 41/100 – LeetCode SQL Challenge

📌 Problem: 1407. Top Travellers

Today’s problem focused on calculating the total distance traveled by each user and sorting the results based on multiple conditions.

💡 My Approach:
- Started by understanding the relationship between the Users and Rides tables
- Used a LEFT JOIN to ensure all users are included (even those with no rides)
- Applied SUM(distance) to calculate the total travel distance per user
- Handled NULL values using IFNULL(..., 0) to show 0 for users with no rides
- Grouped data using GROUP BY u.id, u.name to aggregate per user
- Sorted results using ORDER BY travelled_distance DESC, u.name ASC

🧠 What I Learned Today:
- Importance of choosing the correct columns in GROUP BY (group by user, not ride)
- How LEFT JOIN affects NULL values in aggregation
- Different ways to handle NULL (COALESCE, IFNULL, CASE)
- How to apply multi-column sorting effectively
- Real-world thinking: handling missing data is as important as the calculations

💻 Final Query:

SELECT u.name,
       IFNULL(SUM(r.distance), 0) AS travelled_distance
FROM Users u
LEFT JOIN Rides r ON u.id = r.user_id
GROUP BY u.id, u.name
ORDER BY travelled_distance DESC, u.name ASC;

🔥 Consistency is the key — learning something new every day!

#Day41 #100DaysOfCode #LeetCode #SQL #DataAnalytics #CodingJourney