🗂️ 5 𝐓𝐲𝐩𝐞𝐬 𝐨𝐟 𝐏𝐚𝐠𝐢𝐧𝐚𝐭𝐢𝐨𝐧 𝐄𝐯𝐞𝐫𝐲 𝐃𝐞𝐯𝐞𝐥𝐨𝐩𝐞𝐫 𝐒𝐡𝐨𝐮𝐥𝐝 𝐊𝐧𝐨𝐰

Pagination isn't just about splitting data; it's about doing it efficiently. The wrong approach can kill your API performance at scale. Here are the 5 most important pagination types:

1️⃣ 𝐎𝐟𝐟𝐬𝐞𝐭-𝐁𝐚𝐬𝐞𝐝 𝐏𝐚𝐠𝐢𝐧𝐚𝐭𝐢𝐨𝐧: The classic approach (skip N records, take M). Simple to implement, but gets slow on large datasets because the database still has to scan past every skipped row.
?page=3&limit=10
⚠️ Avoid on tables with millions of rows.

2️⃣ 𝐂𝐮𝐫𝐬𝐨𝐫-𝐁𝐚𝐬𝐞𝐝 𝐏𝐚𝐠𝐢𝐧𝐚𝐭𝐢𝐨𝐧: Uses an opaque pointer to the last seen item instead of a page number. Efficient, consistent even while rows are inserted or deleted, and perfect for real-time data.
?after=eyJpZCI6MTIzfQ==
✅ Used by Twitter/X and Instagram APIs.

3️⃣ 𝐊𝐞𝐲𝐬𝐞𝐭 𝐏𝐚𝐠𝐢𝐧𝐚𝐭𝐢𝐨𝐧: Uses a unique column value (ID or timestamp) as the anchor. Blazing fast on indexed columns and scales beautifully.
?last_id=500&limit=10
✅ Best choice for high-performance backends.

4️⃣ 𝐏𝐚𝐠𝐞 𝐍𝐮𝐦𝐛𝐞𝐫 𝐏𝐚𝐠𝐢𝐧𝐚𝐭𝐢𝐨𝐧: The classic UI pattern (pages 1, 2, 3…). Easy for users, but needs proper indexing server-side.
📌 Great for search results and admin dashboards.

5️⃣ 𝐓𝐢𝐦𝐞-𝐁𝐚𝐬𝐞𝐝 𝐏𝐚𝐠𝐢𝐧𝐚𝐭𝐢𝐨𝐧: Fetches records within a specific time range. Perfect for feeds, logs, and event streams.
?from=2024-01-01&to=2024-01-31
📌 Common in analytics and reporting systems.

💡 Pro Tip: Most production apps combine strategies: cursor-based for feeds, offset for search, time-based for reports.

Which pagination type do you use most in your projects? Drop it in the comments 👇

#WebDevelopment #BackendDevelopment #SoftwareEngineering #API #Programming #DatabaseOptimization #SystemDesign #CleanCode #100DaysOfCode #CodingTips #Developer #TechCommunity #Flutter #Python #JavaScript #mitprogrammer
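To make the offset-vs-keyset difference concrete, here is a minimal keyset sketch in Python with SQLite (the `items` table and its schema are illustrative): instead of `OFFSET`, the query anchors on the last seen id, so the database can seek straight through the primary-key index.

```python
import sqlite3

# In-memory table standing in for a large production table (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany(
    "INSERT INTO items (name) VALUES (?)",
    [(f"item-{i}",) for i in range(1, 101)],
)

def keyset_page(conn, last_id=0, limit=10):
    """Fetch the next page strictly after `last_id` (keyset pagination)."""
    return conn.execute(
        "SELECT id, name FROM items WHERE id > ? ORDER BY id LIMIT ?",
        (last_id, limit),
    ).fetchall()

page1 = keyset_page(conn, last_id=0)
page2 = keyset_page(conn, last_id=page1[-1][0])  # anchor on the last seen id
```

The `?after=`/`?last_id=` parameters in the examples above map directly onto the `last_id` argument here.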
More Relevant Posts
Your webhook handler is probably slowing down your system without you realizing it. ⚡

I used to process webhook events directly inside the request cycle: receive request → process logic → update DB → return response. It worked… until traffic increased. Multiple webhook events started hitting at the same time, external services kept retrying whenever a response was slow, and suddenly the system was under pressure.

The problem: webhook providers don't care about your processing time. If you're slow, they retry. If they retry, you get duplicate load.

The fix: stop processing inside the request.
Receive → validate → acknowledge fast.
Then push the actual work to a background queue.

Example approach (Django REST Framework view handing off to a Celery task):

# views.py
from rest_framework.response import Response
from .tasks import process_webhook  # Celery task

def webhook_handler(request):
    data = request.data
    process_webhook.delay(data)  # enqueue; runs asynchronously in a worker
    return Response({"status": "received"})

What changed:
• Faster response to the webhook provider
• No blocked request threads
• Better handling of traffic spikes
• System stayed stable under load

Important detail: async alone is not enough. You still need idempotency to handle retries safely.

The insight: webhooks should be received fast, not processed fast.

#SoftwareEngineering #BackendDevelopment #Django #Python #SystemDesign #Webhooks #Scalability #Performance #Developers
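The idempotency point deserves a sketch of its own. One common pattern is to record each provider event id before doing any work; a minimal in-memory version (the `"id"` field name is illustrative and varies by provider, and a real system would use a database unique constraint or Redis `SETNX` instead of a Python set):

```python
processed_ids = set()  # stand-in for a durable store (DB unique index, Redis SETNX)

def handle_event(event: dict) -> str:
    """Process a webhook event at most once, keyed by its provider event id."""
    event_id = event["id"]          # hypothetical field; real name varies by provider
    if event_id in processed_ids:
        return "duplicate-ignored"  # a provider retry: acknowledge, skip the work
    processed_ids.add(event_id)
    # ... actual side effects (DB writes, emails) would go here ...
    return "processed"

first = handle_event({"id": "evt_123"})
retry = handle_event({"id": "evt_123"})  # provider retried the same event
```

With this guard in front of the queued task, provider retries become harmless no-ops instead of duplicate charges or emails.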
Built something a bit different this evening. I wanted a better way to search through emails (especially job alerts), so I put together a small system that:
- indexes my email archive
- lets me search it instantly via a web interface
- pulls results from a FastAPI backend
- and displays full emails on click (a bit like a lightweight Outlook)

It started as a quick idea and turned into a fully working tool with a frontend, API, and database behind it. Still lots I could add (UI polish, smarter filtering, tagging etc.), but I'm really pleased with how it came together.

A nice reminder that sometimes the best way to solve a problem is just to build the thing you wish existed 👍

#Python #FastAPI #ITSupport #Homelab #Learning #Automation
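The "index then search instantly" idea at the heart of a tool like this can be sketched with a tiny inverted index in plain Python (the email fields and tokenizer are illustrative; a real build would lean on a database full-text index):

```python
from collections import defaultdict

emails = [
    {"id": 1, "subject": "Job alert: Python developer", "body": "New roles this week"},
    {"id": 2, "subject": "Your invoice", "body": "Payment received"},
    {"id": 3, "subject": "Job alert: IT support", "body": "Roles near you"},
]

# Build an inverted index: token -> set of email ids containing that token.
index = defaultdict(set)
for email in emails:
    for token in (email["subject"] + " " + email["body"]).lower().split():
        index[token.strip(":,.")].add(email["id"])

def search(query: str) -> set:
    """Return ids of emails containing every query token (AND semantics)."""
    tokens = [t.strip(":,.").lower() for t in query.split()]
    if not tokens:
        return set()
    results = index.get(tokens[0], set()).copy()
    for t in tokens[1:]:
        results &= index.get(t, set())
    return results
```

Queries then resolve by set intersection rather than scanning every message, which is what makes the search feel instant.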
Running heavy AI media tasks locally is easy. But deploying a heavy Python backend to the cloud just to show off a UI gets expensive fast. Instead, I built a fully decoupled AI processing engine and engineered a way to deploy just the interactive UI to the edge.

Meet Cliply.

The Architecture (Local Engine): to prevent long ElevenLabs API calls and FFmpeg rendering from freezing the browser, I decoupled the architecture:
- The Producer: a Flask web server that handles user sessions and instantly offloads tasks to a queue.
- The Consumer: a background Python daemon that quietly monitors the queue and handles the heavy rendering.
- The Bridge: vanilla JavaScript async polling that pings the server every 2 seconds to drive a live progress bar without blocking the main thread.

The Deployment Hustle: I wanted to share the UI/UX, but I didn't want to pay for a dedicated GPU server just for a demo. So I created a parallel Git branch (vercel-demo) to bypass the heavy backend and deploy only the frontend to Vercel's serverless edge.

Deploying a Python app to a serverless JS platform is a battle. My commit history today tells the real story:
- "Added vercel.json...": forcing Vercel to map and execute a Python Flask routing tree.
- "Fix: Added requirements.txt and exposed app": squashing the dreaded 500 Internal Server Error so the serverless workers could find the global environment.
- "Fix: Bypassed read-only file system": the final boss (Error 30). Serverless functions don't give you a hard drive, so I had to bypass my local os.makedirs storage entirely just to get the frontend UI to boot up.

The Result: the full heavy-lifting engine is built and running locally, while the lightweight asynchronous UI is live on Vercel for anyone to test.

Next up: pivoting into Data Science and ML model training to eventually plug my own models into this pipeline!
GitHub Repo & Docs: https://lnkd.in/gHHjYWnF #SoftwareEngineering #Python #Flask #Microservices #DataEngineering #SystemArchitecture #Vercel #AI #ML #AIMODELS
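The producer/consumer split described above can be sketched with Python's standard library (the payloads and the "rendering" step are illustrative stand-ins for the Flask handlers and FFmpeg work):

```python
import queue
import threading

task_queue = queue.Queue()
results = {}

def worker():
    """Consumer: drain the queue and do the heavy work off the request path."""
    while True:
        job = task_queue.get()
        if job is None:                           # sentinel: shut the worker down
            task_queue.task_done()
            break
        job_id, payload = job
        results[job_id] = f"rendered:{payload}"   # stand-in for FFmpeg rendering
        task_queue.task_done()

threading.Thread(target=worker, daemon=True).start()

# Producer: a web handler would enqueue like this and return immediately.
for i, clip in enumerate(["intro", "outro"]):
    task_queue.put((i, clip))
task_queue.put(None)   # signal shutdown once all jobs are queued
task_queue.join()      # wait for the consumer to drain (for this demo only)
```

In the real system the producer never calls `join()`; the JavaScript polling loop checks for the result instead, which is exactly what keeps the request path non-blocking.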
JavaScript was getting messy. Our team was building a complex data visualization tool, and the codebase was quickly becoming a tangled mess. State management was a nightmare, and we were constantly worried about data integrity. It was painful.

Then we decided to go all-in on TypeScript for the core logic. We started modeling our complex data structures using generics and mapped types. It felt like overkill at first, but suddenly operations started *making sense*. Type safety wasn't just a buzzword; it was a safety net.

The result? The component shipped on time. More importantly, we had unprecedented confidence in its correctness. Future updates and feature additions are now dramatically faster and far less prone to bugs.

It turns out strong typing isn't just for preventing errors; it's a massive accelerant for development. What's your experience with TypeScript in complex projects?

#typescript #frontenddevelopment #javascript #datascience #softwareengineering
I just engineered a decoupled AI video processing factory from scratch.

A lot of AI projects right now are just simple API wrappers that block the main thread and freeze the browser while waiting for a response. I wanted to build something the right way: the way real media platforms handle heavy rendering.

Meet Cliply. Instead of a monolithic app, I built a Decoupled Worker Architecture:
1. The Producer: a Flask web layer that captures user inputs, manages secure UUID sandbox environments, and writes the media manifests.
2. The Consumer: a background Python daemon that monitors the queue, fetches the AI voiceovers, and orchestrates the FFmpeg rendering.
3. The Bridge: asynchronous JavaScript polling that pings a REST API every 2 seconds to update the UI progress bar without ever freezing the client.

Tech Stack: Python, Flask, Vanilla JS, FFmpeg, ElevenLabs.

You can check out the source code and the architecture breakdown on my GitHub here: https://lnkd.in/gHHjYWnF

I had to solve some wild file I/O race conditions to get the daemon to wait for the API streams to finish writing to disk before triggering FFmpeg.

I'm curious, for the engineers out there: what is your go-to strategy for handling heavy background tasks? Do you build custom daemon workers like this, or do you reach straight for Celery/Redis? Let me know.

#SoftwareEngineering #Python #Flask #Architecture #FFmpeg #Microservices #Tech #BuildingInPublic #AI #SaaS
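One common way to handle that kind of write-then-consume race (the post doesn't say which fix was used, so this is a generic sketch) is to poll until the file's size stops changing before handing it to the next stage; the interval and check count here are illustrative:

```python
import os
import time

def wait_until_stable(path, checks=3, interval=0.05, timeout=5.0):
    """Block until `path` exists and its size is unchanged for `checks` polls.

    Returns True once the file looks fully written, False on timeout.
    """
    deadline = time.monotonic() + timeout
    last_size, stable = -1, 0
    while time.monotonic() < deadline:
        if os.path.exists(path):
            size = os.path.getsize(path)
            if size == last_size:
                stable += 1
                if stable >= checks:
                    return True
            else:
                last_size, stable = size, 0   # still growing: reset the counter
        time.sleep(interval)
    return False
```

A more robust variant is to have the writer emit to a temporary name and `os.rename` it into place when done, since the rename is atomic on POSIX filesystems.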
📊 Day 6: Now it actually feels like a real application…

This was a big step for me. Until now, everything was just UI and routes. Today, I added authentication and a database. Now the app actually knows:
👉 Who the user is
👉 What data belongs to them

This is the flow I built:
user → login → Flask-Login → session → dashboard → database

What I implemented:
✔ User model (for storing login details)
✔ Task model (linked to each user)
✔ Flask-Login for authentication
✔ Login & register system
✔ Protected routes using @login_required

What I understood today: when a user logs in, Flask-Login creates a session, so instead of asking "who are you?" again and again, the app remembers the user across requests.

Also learned:
→ How SQLAlchemy maps Python classes to DB tables
→ How each task is linked to a specific user (user_id)

This made me realize: backend is not just about logic. It's about managing users, sessions, and data together. Before this, everything felt disconnected; now it feels like a real system.

Next: connecting frontend with backend using JavaScript (fetch API). Things are finally coming together.

Have you ever implemented authentication from scratch? 👀

#flask #sqlalchemy #authentication #backenddeveloper #buildinpublic
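The "remembers the user across requests" part boils down to a server-side map from a session token to a user. A toy sketch of the idea (Flask-Login actually uses signed cookies and a user loader; this dict-and-token version, with a made-up user "asha", just shows the mechanism):

```python
import secrets

users = {"asha": "hunter2"}   # toy credential store (never store plain passwords!)
sessions = {}                 # session token -> username

def login(username: str, password: str):
    """Check credentials and mint a session token the client sends back later."""
    if users.get(username) != password:
        return None
    token = secrets.token_hex(16)
    sessions[token] = username
    return token

def current_user(token):
    """What @login_required does conceptually: resolve token -> user, or reject."""
    return sessions.get(token)

token = login("asha", "hunter2")
```

Every later request carries the token (in a cookie), and the app looks the user up instead of re-authenticating, which is exactly the "who are you?" shortcut described above.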
💼 I specialize in Web Scraping & Automation systems.

I help businesses automatically extract data from websites (leads, products, competitors) and convert it into structured formats like Excel, APIs, or dashboards. 🚀 My goal is to save companies time and help them make better decisions using clean, reliable data instead of manual work.

Python | JavaScript | Automation

If you need a system for data extraction or lead generation, feel free to reach out 👍

#webscraping #python #javascript #automation #data #leadgeneration #datascience #programming #business #tech
My not-so-hidden engine behind my latest dashboard.

A few weeks back, I "finished" version 1 of a full-stack financial dashboard that tracks real-time stocks and weather data. While the UI is nice and built with React, the real story is what's happening behind the scenes in the backend.

Essentially, I didn't want the frontend to do its own "shopping", if you call API fetching "shopping". Most apps have the user's browser call five different APIs at once. Instead, I built a BFF (Backend-for-Frontend) using FastAPI.

The BFF acts like a personal assistant. Instead of the frontend running around to the "Weather Store" and the "Stock Market Store" itself, it just asks the assistant. The assistant (FastAPI) fetches everything, hides my private API keys, cleans up the data, and hands the frontend one perfectly organized package.

That's great and all, but then I started to ask myself: what about the analytics? I needed real-time analytics that is fast. For the heavy math (calculating stock indicators like RSI or moving averages across massive datasets) Python can sometimes be the bottleneck. So I wrote a custom Compute Engine in C++ for the raw math and used pybind11 to bridge it directly into my Python code.

Why bother with all this?
- Speed: the C++ engine handles the math in a fraction of the time Python would take.
- UI/UX: the frontend React app stays snappy because it isn't bogged down by data processing.

Building this was a great reminder that "full stack" isn't just about making things look pretty; it's about making sure the engine under the hood is built for the job.

#WebDev #SoftwareArchitecture #Python #Cpp #FullStack
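The "personal assistant" pattern is easy to sketch: one endpoint fans out to the upstream sources, strips what the client doesn't need, and returns a single payload. The fetchers below are stubs standing in for real weather and market-data API calls (names, fields, and values are all illustrative):

```python
def fetch_weather(city: str) -> dict:
    # Stub: a real BFF would call the weather API here, using a server-side key.
    return {"city": city, "temp_c": 21, "provider_internal_id": "w-99"}

def fetch_quote(symbol: str) -> dict:
    # Stub: a real BFF would call the market-data API here.
    return {"symbol": symbol, "price": 187.5, "raw_feed": "..."}

def dashboard_payload(city: str, symbol: str) -> dict:
    """The BFF endpoint: fan out, drop provider internals, return one package."""
    weather = fetch_weather(city)
    quote = fetch_quote(symbol)
    return {
        "weather": {"city": weather["city"], "temp_c": weather["temp_c"]},
        "stock": {"symbol": quote["symbol"], "price": quote["price"]},
    }

payload = dashboard_payload("Lisbon", "AAPL")
```

Note that provider-specific fields like `provider_internal_id` never reach the frontend: the BFF owns both the API keys and the response shape, so the browser makes exactly one call.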
𝗧𝘂𝗿𝗻𝗶𝗻𝗴 𝗠𝗲𝘀𝘀𝘆 𝗗𝗮𝘁𝗮 𝗶𝗻𝘁𝗼 𝗩𝗶𝘀𝘂𝗮𝗹 𝗦𝘁𝗿𝗮𝘁𝗲𝗴𝘆 📈

Handling raw sales data is time-consuming and complicated. I built 𝗘𝘅𝗚𝗿𝗼𝘄𝘁𝗵 to bridge that gap: a full-stack web app that 𝘁𝗿𝗮𝗻𝘀𝗳𝗼𝗿𝗺𝘀 𝗿𝗮𝘄 𝗖𝗦𝗩/𝗘𝘅𝗰𝗲𝗹 𝗳𝗶𝗹𝗲𝘀 𝗶𝗻𝘁𝗼 𝗶𝗻𝘀𝘁𝗮𝗻𝘁, 𝘃𝗶𝘀𝘂𝗮𝗹 𝗶𝗻𝘁𝗲𝗹𝗹𝗶𝗴𝗲𝗻𝗰𝗲.

𝗪𝗵𝗮𝘁 𝗘𝘅𝗚𝗿𝗼𝘄𝘁𝗵 𝗵𝗮𝗻𝗱𝗹𝗲𝘀:
- 𝗥𝗲𝗮𝗹-𝘁𝗶𝗺𝗲 𝗞𝗣𝗜𝘀: Revenue, Profit, and Average Order Value (AOV).
- 𝗗𝗲𝗲𝗽 𝗦𝗲𝗴𝗺𝗲𝗻𝘁𝗮𝘁𝗶𝗼𝗻: performance by Product Category, Region, and Top Customers.
- 𝗧𝗿𝗲𝗻𝗱 𝗧𝗿𝗮𝗰𝗸𝗶𝗻𝗴: monthly charts for sales and customer acquisition.
- 𝗦𝗲𝗰𝘂𝗿𝗲 𝗠𝗮𝗻𝗮𝗴𝗲𝗺𝗲𝗻𝘁: file history with interactive previews and cloud deletion.

𝗧𝗵𝗲 𝗧𝗲𝗰𝗵𝗻𝗶𝗰𝗮𝗹 𝗕𝗮𝗰𝗸𝗯𝗼𝗻𝗲: I prioritized a modular architecture using the Service Layer Pattern to keep logic decoupled from the UI.
- 𝗦𝘁𝗮𝗰𝗸: Python, Django 6.0.4, Pandas, and NumPy.
- 𝗙𝘂𝘇𝘇𝘆 𝗠𝗮𝘁𝗰𝗵𝗶𝗻𝗴: integrated RapidFuzz to intelligently map inconsistent column headers.
- 𝗜𝗻𝗳𝗿𝗮𝘀𝘁𝗿𝘂𝗰𝘁𝘂𝗿𝗲: Supabase (Postgres & Storage) with a local fallback for resilience.

𝗧𝗵𝗲 𝗘𝘃𝗼𝗹𝘂𝘁𝗶𝗼𝗻 & 𝗣𝗿𝗼𝗰𝗲𝘀𝘀: This project is a complete V1 overhaul of a Flask prototype I built in my first year. I migrated to Django to implement JSON caching for near-instant dashboard loads. To move faster, I used AI as a "technical partner" to troubleshoot edge cases while I maintained control over the architecture.

𝗕𝗼𝗻𝘂𝘀 𝗦𝗸𝗶𝗹𝗹: This was also my first time editing a software demo video! It was a great challenge to align the visual flow with the app's logic. It's not perfect, but it was a massive deep dive into building cloud-native data pipelines.

𝗜'𝗱 𝗹𝗼𝘃𝗲 𝘆𝗼𝘂𝗿 𝗳𝗲𝗲𝗱𝗯𝗮𝗰𝗸!
🔗 𝗚𝗶𝘁𝗛𝘂𝗯: https://lnkd.in/dAtZu2MV
🔗 𝗣𝗿𝗼𝗳𝗶𝗹𝗲: https://lnkd.in/d-vXp97X

#Python #Django #DataScience #FullStack #ExGrowth #BuildInPublic #SoftwareEngineering #WebDevelopment #Supabase
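The fuzzy header mapping is the neatest trick here. The project uses RapidFuzz; to keep this sketch dependency-free it uses the standard library's difflib instead, and the canonical column list and cutoff are made up for illustration:

```python
import difflib

# Hypothetical canonical schema the dashboard expects.
CANONICAL = ["revenue", "profit", "order_date", "customer_name", "region"]

def map_headers(raw_headers, cutoff=0.6):
    """Map messy spreadsheet headers onto canonical column names.

    Headers with no close-enough match map to None for manual review.
    """
    mapping = {}
    for raw in raw_headers:
        normalized = raw.strip().lower().replace(" ", "_")
        match = difflib.get_close_matches(normalized, CANONICAL, n=1, cutoff=cutoff)
        mapping[raw] = match[0] if match else None
    return mapping

mapping = map_headers(["Revenue ", "profitt", "Order Date", "Cust Name"])
```

The same shape works with RapidFuzz by swapping `difflib.get_close_matches` for `rapidfuzz.process.extractOne`, which is faster and offers better scorers for this kind of typo-tolerant matching.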