cache.get('user_123') never touches Redis directly. Something else runs first.

The assumption: Django's cache API is a thin wrapper. Call get, read from storage, done. The reality: every cache call passes through a pipeline before a single byte reaches the backend.

Here's what happens:

1. Every cache backend in Django inherits from BaseCache. BaseCache owns the entire caching contract: get, set, delete, incr, get_or_set.
2. The concrete backend (RedisCache, MemcachedCache) implements only the storage-specific parts. The abstraction layer runs first. Always.
3. The first thing BaseCache does is transform the key.
4. Every key passes through make_key(), which prepends KEY_PREFIX and VERSION from settings. By default, 'user_123' becomes ':1:user_123' in storage.

Debugging a cache miss on a key that exists? Check the actual key in storage first.

Django's cache versioning lets you invalidate the entire cache by bumping VERSION in settings. Old keys still exist in storage until eviction clears them.

The cache API feels simple because BaseCache is doing the hard work invisibly.

Have you ever debugged a cache miss only to find the key was there, just under a different name?

#Python #Django #BackendDevelopment #SoftwareEngineering
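The key transformation in step 4 is small enough to sketch in full. This mirrors Django's default key function (the behavior described above), evaluated with the default settings KEY_PREFIX = '' and VERSION = 1:

```python
# A minimal sketch of Django's default key transformation -
# KEY_PREFIX and VERSION are prepended, colon-separated.
def default_key_func(key, key_prefix, version):
    """'user_123' with prefix '' and version 1 -> ':1:user_123'"""
    return "%s:%s:%s" % (key_prefix, version, key)

# With default settings (KEY_PREFIX = '', VERSION = 1):
stored = default_key_func("user_123", "", 1)
print(stored)  # :1:user_123
```

This is why the raw key you pass to cache.get() is not the key you'll find when you inspect Redis directly.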
-
I reduced my API response time from 2.3s to 140ms. No Redis. No CDN. No caching layer. Just 4 changes to my Django REST Framework setup that most tutorials never mention.

1. N+1 queries everywhere. My serializer accessed post.author.name on every row. 100 posts = 101 database queries. One select_related('author') brought it down to 1. Response time: 2.3s to 800ms instantly.

2. Using ModelSerializer for read endpoints. ModelSerializer builds fields dynamically on every request. It's up to 377x slower than raw Python dicts. Switched read-only endpoints to serializers.Serializer with explicit fields. Another 40% gone.

3. No pagination on list endpoints. Returning the entire table. 10,000 rows. Every request. Added CursorPagination: constant-time queries regardless of dataset size. OFFSET-based pagination breaks down at high page numbers. Cursor doesn't.

4. Fetching fields I never used. Serializer returned 15 fields. Frontend used 6. Added .only() and trimmed the serializer.

2.3s to 140ms. Same server. Same database. Same $12/month VPS. The bottleneck was never my infrastructure. It was my code.

Run queryset.explain(analyze=True) on your slowest endpoint. You'll probably find the same mistakes.

Which of these have you tried?

#Django #Python #API #WebPerformance #BuildInPublic
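The N+1 fix is the biggest win, and the pattern is easy to reproduce outside Django. A minimal sqlite3 sketch of the problem and of the single JOIN that select_related generates (table and column names are illustrative, not from the original post):

```python
# N+1 queries vs one JOIN, with sqlite3 standing in for PostgreSQL.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE author (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE post (id INTEGER PRIMARY KEY, title TEXT,
                       author_id INTEGER REFERENCES author(id));
    INSERT INTO author VALUES (1, 'Ada'), (2, 'Linus');
    INSERT INTO post VALUES (1, 'A', 1), (2, 'B', 2), (3, 'C', 1);
""")

# The N+1 pattern: one query for posts, then one query per post
# when the serializer touches post.author.name.
posts = conn.execute("SELECT id, title, author_id FROM post").fetchall()
n_plus_1 = 1 + len(posts)  # 3 posts -> 4 queries

# What select_related('author') emits instead: a single JOIN.
rows = conn.execute("""
    SELECT post.title, author.name
    FROM post JOIN author ON author.id = post.author_id
""").fetchall()
print(n_plus_1, len(rows))  # 4 queries collapse into 1; same 3 rows
```

With 100 posts the same arithmetic gives 101 queries, which is exactly the shape explain(analyze=True) will surface.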
-
Swapping Django's cache backend is one settings change. The behavioral differences are not that simple.

The catch: BaseCache defines the contract (get, set, incr, delete, get_or_set). Every backend implements this contract. Not every backend can fulfill it with the same guarantees.

1. incr() - where the divergence is most dangerous
- cache.incr('counter') on Redis is a single INCR command. Redis processes it atomically. One operation. Safe under any concurrency.
- cache.incr('counter') on DatabaseCache: Django reads the current value, increments it in Python, and writes it back. Three steps. No lock between them. Not safe under concurrency.

2. get_or_set() - the race condition most of us miss
- get_or_set('key', default, timeout) looks atomic. On no backend is it truly atomic.
- Two concurrent requests both find the key missing. Both compute the default. Both set it.
- This is the cache stampede problem. get_or_set() does not prevent it.

3. clear() - scope is bigger than your app's keys
- cache.clear() on a shared Redis instance flushes the entire Redis database, not just the keys belonging to this application.
- Multiple applications sharing one Redis instance? One clear() call wipes everything.
- KEY_PREFIX only prevents key collisions. It does not scope clear().

The cache API is an abstraction over storage. Abstractions leak at the worst possible moment.

Have you ever discovered a backend-specific behavior difference under load rather than in testing?

#Python #Django #BackendDevelopment #SoftwareEngineering
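The read-modify-write danger in point 1 can be shown deterministically. Here a plain dict stands in for DatabaseCache storage, and the two clients' steps are interleaved by hand, which is exactly the schedule a real race can produce:

```python
# Two clients incrementing without a lock - a deterministic replay
# of the DatabaseCache-style incr() race (dict stands in for storage).
store = {"counter": 0}

a = store["counter"]        # client A reads 0
b = store["counter"]        # client B reads 0, before A writes back
store["counter"] = a + 1    # client A writes 1
store["counter"] = b + 1    # client B writes 1 - A's increment is lost
print(store["counter"])     # 1, not 2
```

Redis's INCR never exposes this window because read, increment, and write happen as one server-side operation.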
-
You don't always need Redis. Here's a rate limiter I built in 40 lines of pure Python. No django-ratelimit. No external dependencies. Just a sliding window algorithm, a dictionary, and timestamps. The core idea: → Every request logs a timestamp against a client key → On each new request, prune anything older than your window → If the remaining count hits your limit — return 429 → Otherwise, log it and let it through That's the entire algorithm. I plugged it into Django middleware and a per-view decorator so you can control limits at both levels. I've used Redis-backed rate limiters in production. They're great when you need them. But for a personal project, an internal tool, or a lightweight API — this does the job without the infrastructure overhead. Full implementation + Django integration on my Medium. Also covers the one IP extraction detail that will silently break your limiter if you're behind a proxy — worth checking even if you're using a library. #Python #Django #BackendDevelopment #SoftwareEngineering #WebDevelopment
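The four steps above fit in a few lines. A minimal sketch of the sliding-window idea (class and method names are mine, not the author's 40-line version, which also handles middleware wiring and proxy-aware IP extraction):

```python
# Sliding-window rate limiter: timestamps per client key, pruned on
# each request, capped at `limit` within the window.
import time
from collections import defaultdict

class SlidingWindowLimiter:
    def __init__(self, limit, window_seconds):
        self.limit = limit
        self.window = window_seconds
        self.hits = defaultdict(list)  # client key -> request timestamps

    def allow(self, client_key, now=None):
        now = time.monotonic() if now is None else now
        window_start = now - self.window
        # Prune anything older than the window
        self.hits[client_key] = [t for t in self.hits[client_key]
                                 if t > window_start]
        if len(self.hits[client_key]) >= self.limit:
            return False  # caller responds with 429
        self.hits[client_key].append(now)
        return True

limiter = SlidingWindowLimiter(limit=3, window_seconds=60)
results = [limiter.allow("1.2.3.4", now=i) for i in range(5)]
print(results)  # [True, True, True, False, False]
```

Note this in-process dict only works for a single worker; once you run multiple processes, shared state is exactly the problem Redis-backed limiters solve.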
-
Django doesn't store what's passed to cache.set(). It stores a transformed version of it.

The reality is Django serializes everything before storage, and that serialization has consequences most engineers never consider.

Here's what actually happens:

1. Before any value reaches the backend, Django serializes it using pickle.
2. Not JSON. Not a string representation. Python's pickle: a binary serialization of the entire object graph.
3. This is why caching a model instance, a queryset result, or a complex nested object just works.

But pickle is also why a network-exposed cache is a critical security vulnerability. Here's how:

-> Pickle deserialization can execute arbitrary Python code. That's not a bug; it's how pickle reconstructs complex objects.
-> An attacker who can write to the cache can craft a malicious pickle payload.
-> When Django deserializes it, that code runs. On the server. With full application privileges.

Precautions for avoiding this attack:

- Redis and Memcached should never be publicly accessible. Bind to localhost or a private network only.
- Use Redis AUTH or TLS for any cache that travels over a network. They're not enabled by default.
- django-redis supports pluggable serializers. Replace pickle with msgpack or a custom JSON encoder: safer, often faster.

The cache feels invisible until it isn't. Treat it like any other network service that touches application data.

Has cache security ever been part of a security review in your stack, or is it assumed safe by default?

#Python #Django #BackendDevelopment #SoftwareEngineering
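The arbitrary-code-execution point is easy to demonstrate harmlessly. Pickle reconstructs objects by calling whatever __reduce__ names; here that's the built-in len, but an attacker writing to your cache would name os.system instead:

```python
# A harmless demonstration of why unpickling untrusted bytes is
# dangerous: __reduce__ can name ANY callable, and pickle.loads
# invokes it during deserialization.
import pickle

class Payload:
    def __reduce__(self):
        # (callable, args): run len("...") when this is unpickled
        return (len, ("executed during unpickling",))

result = pickle.loads(pickle.dumps(Payload()))
print(result)  # 26 - len() ran inside loads(), not in our code
```

Nothing in the payload bytes says "Payload"; they just instruct the unpickler to call a function. Swap len for a shell command and you have the attack described above.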
-
designed a PostgreSQL schema for an e-commerce backend. 4 tables. clean relationships. production-ready structure. users place orders → orders contain items → items reference products every foreign key is intentional. every index exists for a reason. next step — connecting this to my FastAPI microservices and adding real data. this is how you build backends that scale. #PostgreSQL #Backend #DataEngineering #FastAPI #Python
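The post doesn't include the DDL, so here is a hypothetical sketch of that 4-table shape: users place orders, orders contain items, items reference products. Column names are guesses, and sqlite3 stands in for PostgreSQL so the snippet runs anywhere:

```python
# Hypothetical 4-table e-commerce layout (sqlite3 standing in for
# PostgreSQL; real DDL would use SERIAL/IDENTITY and NUMERIC types).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE users    (id INTEGER PRIMARY KEY, email TEXT UNIQUE);
    CREATE TABLE products (id INTEGER PRIMARY KEY, name TEXT,
                           price_cents INTEGER NOT NULL);
    CREATE TABLE orders   (id INTEGER PRIMARY KEY,
                           user_id INTEGER NOT NULL REFERENCES users(id));
    CREATE TABLE order_items (
        id INTEGER PRIMARY KEY,
        order_id   INTEGER NOT NULL REFERENCES orders(id),
        product_id INTEGER NOT NULL REFERENCES products(id),
        quantity   INTEGER NOT NULL DEFAULT 1
    );
    -- index the foreign keys that order lookups will join on
    CREATE INDEX idx_orders_user ON orders(user_id);
    CREATE INDEX idx_items_order ON order_items(order_id);
""")
tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]
print(tables)
```

The order_items join table is what lets one order hold many products and one product appear in many orders.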
-
This post makes me think about PostgreSQL schema adaptability. Even with a single tenant, users and products can live in a separate schema from orders, which lets each be represented by multiple tables within its own schema. For example, products could be split into per-store tables if the company owns multiple store brands with exclusive products, or into per-region tables if some products can only legally be sold, or only be produced, in certain regions. Users could have tables for additional info that can be changed more adaptably inside a dedicated users schema. For example, a table tracking users' linked universities could serve an online course site like Coursera, where a university such as Stanford wants to track its linked students to award credit for online course completion, with updates telling users the credit value of a course before they take it. https://lnkd.in/gTHZPvt6
-
You don't need Node.js for real-time applications. This is the assumption I challenged when I needed WebSocket support in a Django project.

The common advice is to add a Node service for the real-time components, which introduces a second language, a second deployment pipeline, and an additional point of failure. Django Channels combined with Redis showed that this is unnecessary.

Here's the setup:

- Daphne replaces Gunicorn as the ASGI server, handling both HTTP and WebSocket in the same process.
- ProtocolTypeRouter splits traffic: HTTP requests are directed to Django views, while WebSocket connections are handled by Channels consumers.
- Redis serves as the channel layer, a message broker enabling pub/sub across all connected consumers.
- Consumers are async Python classes with methods like receive(), group_send(), and disconnect().

The outcome: a message sent by one client reaches Redis, propagates to every consumer in the group, and is delivered to every connected client, all without leaving the Python ecosystem.

No Node. No socket.io. No separate service to maintain. Everything runs in the same Docker container as the rest of the backend, with the same codebase, deployments, and logs.

Sometimes the seemingly boring choice is actually the smartest one.

#django #python #webdev #backend #software #architecture
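The group fan-out is the core of the pattern. A toy in-process sketch, with a plain dict standing in for the Redis channel layer and callbacks standing in for consumers (real Channels code would use channel_layer.group_send and async consumer classes):

```python
# Toy pub/sub fan-out: every consumer registered to a group receives
# every message sent to that group - the shape Channels + Redis gives you.
groups = {}  # group name -> list of consumer callbacks

def group_add(group, consumer):
    groups.setdefault(group, []).append(consumer)

def group_send(group, message):
    # deliver to every consumer in the group, like a channel layer does
    for consumer in groups.get(group, []):
        consumer(message)

received = []
group_add("chat_room", lambda m: received.append(("client_a", m)))
group_add("chat_room", lambda m: received.append(("client_b", m)))
group_send("chat_room", "hello")
print(received)  # both clients got the message
```

The reason Redis is needed in production is that this dict only exists inside one process; the channel layer makes the same fan-out work across every worker.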
-
Day 9 - "It works on my machine." Docker fixes that sentence permanently. Here's how — with a real 3-service app, not a toy tutorial. 🚀TechFromZero Series - DockerFromZero This isn't a Hello World. It's a real multi-container application: 📐 Client → FastAPI (Python) → MongoDB (Database) → Redis (Cache) — all orchestrated with Docker Compose 🔗 The full code (with step-by-step commits you can follow): https://lnkd.in/dtnykq35 🧱 What I built (step by step): 1️⃣ Project scaffold — FastAPI app with async endpoints and health check 2️⃣ Dockerfile — FROM, COPY, RUN, EXPOSE, CMD with layer caching explained 3️⃣ .dockerignore — keep secrets and junk out of images 4️⃣ MongoDB connection — async motor driver, Docker DNS (service names, not localhost) 5️⃣ Weather API client — httpx async calls to OpenWeatherMap from inside a container 6️⃣ Full CRUD endpoints — log, list, stats, filter, delete weather data 7️⃣ Docker Compose — 3 services, health checks, depends_on, named volumes, custom network 8️⃣ Redis caching — 5-minute TTL, sub-millisecond cache hits vs 300ms API calls 9️⃣ README — architecture diagram, Docker cheat sheet, step-by-step guide 💡 Every file has detailed comments explaining WHY, not just what. Written for any beginner who wants to learn Docker by reading real code — with full clarity on each step. 👉 If you're a beginner learning Docker, clone it and read the commits one by one. Each commit = one concept. Each file = one lesson. Built from scratch, so nothing is hidden. 🔥 This is Day 9 of a 50-day series. A new technology every day. Follow along! 🌐 See all days: https://lnkd.in/dhDN6Z3F #TechFromZero #Day9 #Docker #DockerCompose #FastAPI #MongoDB #Redis #Python #Containers #DevOps #LearnByDoing #OpenSource #BeginnerGuide #100DaysOfCode #CodingFromScratch
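The 3-service wiring in step 7 looks roughly like this hypothetical Compose sketch (service names, image tags, and env vars are illustrative; the real file is in the linked repo):

```yaml
# Hypothetical docker-compose.yml for the FastAPI + MongoDB + Redis stack.
services:
  api:
    build: .
    ports: ["8000:8000"]
    environment:
      MONGO_URL: mongodb://mongo:27017   # Docker DNS: service name, not localhost
      REDIS_URL: redis://redis:6379
    depends_on:
      mongo:
        condition: service_healthy       # wait for the health check below
      redis:
        condition: service_started
  mongo:
    image: mongo:7
    volumes: ["mongo_data:/data/db"]     # named volume persists data
    healthcheck:
      test: ["CMD", "mongosh", "--eval", "db.adminCommand('ping')"]
      interval: 10s
  redis:
    image: redis:7
volumes:
  mongo_data:
```

The service-name DNS entries (mongo, redis) are what replace localhost once the app runs inside the Compose network.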
-
🚀 Built my first multi-container application using Docker! Over the past few days, I worked through building a simple Flask web app connected to a Redis database — all containerised and orchestrated with Docker Compose. 🔧 What I built: A Flask app with multiple routes: • / → welcome page • /count → increments visit count • /reset → resets the counter Redis used as a key-value store for tracking visits Multi-container setup using Docker Compose 🧠 What I learned: How to containerise a Python application using a Dockerfile Running multiple services with Docker Compose Container-to-container communication using service names (no localhost!) Using environment variables instead of hardcoding configuration Persisting data with Docker volumes 📌 Key takeaway: Understanding how services communicate inside containers (and debugging when they don’t!) was the biggest learning moment. Next step: exploring scaling and load balancing 👀 #Docker #DevOps #Python #Flask #Redis #LearningInPublic CoderCo
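A containerised Flask app like the one described usually starts from a Dockerfile along these lines (a hypothetical sketch; file names, port, and env var are illustrative, not from the original project):

```dockerfile
# Hypothetical Dockerfile for the Flask + Redis counter app.
FROM python:3.12-slim
WORKDIR /app

# Copy requirements first so this layer caches across code changes
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .

# Read the Redis host from the environment instead of hardcoding
# "localhost" - Compose points this at the redis service name.
ENV REDIS_HOST=redis
EXPOSE 5000
CMD ["python", "app.py"]
```

The ENV line is the "no localhost!" lesson from the post: inside the Compose network, the Redis container is reachable by its service name.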
-
We all use Redis libraries to work with Redis, but what actually happens under the hood when you call redis_client.get("key")?

To find out, I built a simple Redis client in Python from scratch using raw TCP sockets. It implements a subset of Redis commands and the basic features of a client library.

Here's what happens:

1. Whenever you call a command, it gets converted into a list of strings, similar to what you'd type in redis-cli. redis_client.get("key") → ["GET", "key"]
2. This list is then encoded into bytes using the RESP protocol. ["GET", "key"] → "*2\r\n$3\r\nGET\r\n$3\r\nkey\r\n"
3. These bytes are transmitted over a TCP socket, and the Redis server responds with the result of the command.
4. Finally, the response is parsed back from bytes into a Python value and returned to the caller, or an error is raised.

What I found most interesting is how simple and thoughtful the RESP protocol is. A simple protocol means less time on the network and more time doing actual work. For Redis, which promises speed, it makes sense to keep everything around it as lightweight as possible.

I'd recommend reading the RESP spec; you can understand it in an hour. Start with RESP2, as it's the most widely used.

Code: https://lnkd.in/gXp7ZM5p

#Python #Redis #BackendDevelopment #LearningByBuilding
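Step 2 of the pipeline above can be sketched in a few lines: a command is a RESP array (*N) of bulk strings ($len), each terminated by CRLF. Function name is mine, not from the linked repo:

```python
# Encode a command as a RESP2 array of bulk strings - the bytes a
# client writes to the TCP socket.
def encode_resp(parts):
    # ["GET", "key"] -> b"*2\r\n$3\r\nGET\r\n$3\r\nkey\r\n"
    out = [f"*{len(parts)}\r\n".encode()]
    for part in parts:
        data = part.encode()
        out.append(f"${len(data)}\r\n".encode() + data + b"\r\n")
    return b"".join(out)

encoded = encode_resp(["GET", "key"])
print(encoded)  # b'*2\r\n$3\r\nGET\r\n$3\r\nkey\r\n'
```

Length-prefixed bulk strings are why parsing is cheap: the reader always knows exactly how many bytes to consume next, with no escaping or scanning.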