💡 Django tip, Subqueries: When you need more complex filtering or calculations, Django's ORM supports subqueries and conditional expressions, letting you take your queries to the next level. A subquery embeds one query inside another, which is useful when you need to fetch related data conditionally. For example, a subquery can annotate each customer with a latest_order_date field containing the date of their most recent order. #python #subqueries #orm #tip #tips #tipoftheday #djv_mo #100daysofcode
💡 Django tip, debugging Django queries inside templates: Ever wondered how many database queries your Django views are executing? Drop a small snippet at the end of your base template to see all the SQL queries generated for each page. #python #html #template #tip #tips #tipoftheday #djv_mo #100daysofcode
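The post's snippet isn't shown; one common way to do this uses the `sql_queries` variable that Django's `debug` context processor exposes when `DEBUG=True` and the client IP is in `INTERNAL_IPS` (a sketch, not necessarily the author's exact snippet):

```html
{# base.html, just before </body>; requires DEBUG=True and your IP in INTERNAL_IPS #}
{% if debug %}
  <pre>
    {% for query in sql_queries %}
      {{ query.time }}s: {{ query.sql }}
    {% endfor %}
  </pre>
{% endif %}
```

Each entry in `sql_queries` is a dict with `sql` and `time` keys, so you can spot N+1 query patterns at a glance while developing.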
One of the things I really like about Django is how cleanly it handles CRUD: from database models to forms, views, and templates, everything fits together in a very natural way. It's a great framework for building real, production-level backends without overcomplicating things. Fast to develop, structured by default, and powerful when you need to scale. #Django #Python #WebDevelopment #BackendDevelopment #SoftwareEngineering #BuildInPublic
Built a Python-based web scraper that collects top news headlines from a public website and stores them in a text file. The project uses HTTP requests to fetch HTML content and BeautifulSoup to parse and extract headline data automatically. This helped me practice web scraping, HTML parsing, and basic file handling in Python. GitHub: https://lnkd.in/gi56cKgZ #Python #WebScraping #BeautifulSoup #Automation #Learning
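A minimal sketch of that pipeline, assuming `requests` plus BeautifulSoup and `<h2>` headline tags; the real URL and selectors live in the linked repo:

```python
import requests
from bs4 import BeautifulSoup

def extract_headlines(html: str) -> list[str]:
    """Parse HTML and return the text of each <h2> headline tag."""
    soup = BeautifulSoup(html, "html.parser")
    return [h.get_text(strip=True) for h in soup.find_all("h2")]

def scrape_to_file(url: str, path: str) -> None:
    """Fetch the page and write one headline per line to a text file."""
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()
    with open(path, "w", encoding="utf-8") as f:
        f.write("\n".join(extract_headlines(resp.text)))

# Usage (hits the network; URL is a placeholder):
# scrape_to_file("https://news.example.com", "headlines.txt")
```

Keeping the parsing in its own function makes it easy to test against saved HTML without touching the network.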
Web Scraping Project using Python: Scraped real-world tabular data from Wikipedia using Requests and BeautifulSoup, and transformed it into structured datasets using Pandas.
📌 Learned how to handle multiple HTML tables
📌 Converted raw web data into clean DataFrames
📌 Exported data for further analysis
Tools: Python | BeautifulSoup | Pandas | Jupyter Notebook
🔗 Check out the full project here: https://lnkd.in/gS8bxAeN
#WebScraping #Python #DataAnalytics #BeautifulSoup #Pandas #LearningByDoing
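A sketch of the core step, turning one HTML table into a DataFrame with BeautifulSoup and Pandas (the actual tables and column names are in the linked project; this assumes a simple header row of `<th>` cells):

```python
from bs4 import BeautifulSoup
import pandas as pd

def table_to_dataframe(html: str) -> pd.DataFrame:
    """Parse the first <table> in the HTML into a pandas DataFrame."""
    soup = BeautifulSoup(html, "html.parser")
    table = soup.find("table")
    headers = [th.get_text(strip=True) for th in table.find_all("th")]
    rows = []
    for tr in table.find_all("tr"):
        cells = [td.get_text(strip=True) for td in tr.find_all("td")]
        if cells:  # skip the header row, which has no <td> cells
            rows.append(cells)
    return pd.DataFrame(rows, columns=headers)
```

For many pages, `pd.read_html(url)` does this in one call and returns every table on the page as a list of DataFrames.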
A quick Python readability tip: Use `dataclasses` instead of regular classes for data containers.

Before:

```python
class User:
    def __init__(self, name, email, age):
        self.name = name
        self.email = email
        self.age = age

    def __repr__(self):
        return f"User({self.name}, {self.email}, {self.age})"
```

After:

```python
from dataclasses import dataclass

@dataclass
class User:
    name: str
    email: str
    age: int
```

You get:
- `__init__` automatically
- `__repr__` automatically
- `__eq__` automatically
- Type hints built-in
- Less boilerplate

Small optimization, but a great improvement in code readability. What's your favorite Python feature? #Python #SoftwareEngineering #CleanCode #Developer #perpetualsquared
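A quick check of what the decorator generates, with illustrative sample values:

```python
from dataclasses import dataclass

@dataclass
class User:
    name: str
    email: str
    age: int

a = User("Ada", "ada@example.com", 36)
b = User("Ada", "ada@example.com", 36)

print(a)       # generated __repr__: User(name='Ada', email='ada@example.com', age=36)
print(a == b)  # generated __eq__ compares field values: True
```

With a plain class you would get `<__main__.User object at 0x...>` and `a == b` would be False, since default equality compares identity, not contents.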
This is one of the simplest examples of web scraping: using Python to fetch a webpage and extract its title and headings. A small example, but a powerful concept. Here I used punchng.com; you can use any other public site. I needed to extract news headlines, and rather than copy-pasting, Python did it. Package used: BeautifulSoup (bs4). Part 2 of a simple Python web scraping series. #Python #Automation #WebScraping
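The extraction step can be sketched like this (a minimal version assuming BeautifulSoup; the original's exact tags may differ):

```python
from bs4 import BeautifulSoup

def title_and_headings(html: str) -> tuple[str, list[str]]:
    """Return the page <title> text and the text of all h1-h3 headings."""
    soup = BeautifulSoup(html, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else ""
    headings = [h.get_text(strip=True) for h in soup.find_all(["h1", "h2", "h3"])]
    return title, headings
```

Pair it with `requests.get(url).text` (or `urllib.request`) to run it against a live page.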
My Python RAG pipeline choked at 50 concurrent users. So I ripped out the orchestration layer and rebuilt it in Node.js.
Unpopular opinion: Python is the king of training. But for serving? It's too heavy. When you move from a Jupyter notebook to real-world WebSockets, things break.
I didn't just need inference. I needed:
• To handle 1,000+ concurrent embeddings.
• Non-blocking streams.
• Zero serialization headaches.
Python's GIL (Global Interpreter Lock) fought me every step. Node's event loop ate the load for breakfast.
The new stack:
1. Training: Python (obviously).
2. API/Orchestration: Node.js + TypeScript.
3. Vector DB: Pinecone.
The result? 40% lower latency and no thread-blocking nightmares. Use the right tool for the layer, not just the language you learned first.
What is the biggest bottleneck in your current stack? #VectorDatabase #RAG #Javascript
Python dictionaries are great. Until they aren't. Passing raw dictionaries through your codebase is fast, but it creates a mess. You have to memorize keys, guess value types, and pray the API didn't change the schema silently. This is why #Pydantic is standard equipment for modern Python stacks. It forces you to treat your data as a contract, not a suggestion.
The immediate ROI:
- IDE #Autocomplete: Because Pydantic uses standard type hints, VS Code and PyCharm actually know what attributes exist. No more tabbing back to the documentation.
- Precise #Debugging: Instead of a generic KeyError deep in your logic, Pydantic catches the error at the entry point and tells you exactly which field failed and why.
- JSON #Serialization: It handles the heavy lifting of converting complex types (like datetime objects) to JSON automatically.
Stop guessing what's inside the dictionary. Define the model and let the code document itself. #Python #SoftwareDevelopment #Pydantic
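A minimal sketch of the contract idea, with a hypothetical `Order` model (field names invented for illustration):

```python
from datetime import datetime
from pydantic import BaseModel, ValidationError

class Order(BaseModel):
    order_id: int
    customer: str
    created_at: datetime  # ISO-format strings are coerced to real datetimes

# Valid payload: the string timestamp becomes a datetime object.
order = Order(order_id=1, customer="Ada", created_at="2024-05-01T10:30:00")
print(type(order.created_at).__name__)  # datetime

# Invalid payload fails loudly at the boundary, naming the bad field.
try:
    Order(order_id="not-a-number", customer="Ada", created_at="2024-05-01T10:30:00")
except ValidationError as exc:
    print(exc)  # points at order_id, not a KeyError deep in your logic
```

Compare that with `payload["created_at"]`, which hands you a raw string and a KeyError waiting to happen three layers down.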