Day 5

When I was a junior dev, this line of code confused the hell out of me:

const response = await fetch(url)
const data = await response.json()

I kept asking — why TWO awaits? Why can't fetch just give me the data directly?

So I stopped copy-pasting and went back to first principles. Here's what I learned:

→ 200 OK does NOT mean the data arrived. It just means the server is saying, "I got your request, here comes the response." The connection is still open. The body is still travelling through the wire.

→ fetch() returns a promise for the headers first. That's the first await — waiting for the server to respond and say "200 OK."

→ response.json() returns a second promise for the body. That's the second await — waiting for all the actual data to arrive and parse.

Think of it like a phone call. When someone picks up and says "hello" — that's the 200. But you haven't heard the actual message yet. You wait. They speak. Now you have the data.

Once I understood THAT — promises stopped feeling scary. I stopped seeing async/await as magic syntax. I started seeing it as: "wait here until the data actually arrives."

First principles thinking didn't just teach me promises. It changed how I debug, how I read docs, and how I learn anything new in tech.

Stop memorising patterns. Start asking WHY they exist. That one question will make you a better developer faster than any tutorial.

What concept finally clicked for you when you went back to first principles? Drop it in the comments 👇

#JavaScript #WebDevelopment #Promises #AsyncAwait #JuniorDeveloper #FirstPrinciples #Programming #SoftwareEngineering #TechCommunity #CodingTips #LearnToCode #NodeJS #Frontend #Backend #Developer
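To see the two promises without touching the network, here is a minimal sketch where fakeFetch stands in for the real fetch(); the name and payload are made up for illustration, but the shape mirrors the Response API.

```javascript
// fakeFetch: a stand-in for fetch(), so this runs offline.
const fakeFetch = (url) =>
  Promise.resolve({
    ok: true,                 // first promise resolves once headers arrive ("200 OK")
    status: 200,
    json: () =>               // second promise resolves once the body is fully parsed
      Promise.resolve({ id: 1, name: "Ada" }),
  });

async function load(url) {
  const response = await fakeFetch(url); // await #1: the "hello" on the phone
  if (!response.ok) throw new Error(`HTTP ${response.status}`);
  const data = await response.json();    // await #2: the actual message
  return data;
}

load("https://example.com/user").then((data) => console.log(data.name)); // logs "Ada"
```

The real fetch() behaves the same way: the first promise can resolve while the body is still in flight, which is exactly why the second await exists.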
Jay Patel’s Post
Last week, a client project hit a wall. The feature was architected perfectly in Next.js, and the Supabase queries were lightning-fast. But the client was ready to pull the plug because they didn't understand the trade-offs we were making.

I didn't need to invert a binary tree to fix it. I needed to translate technical debt into business risk.

In my years as a lead, I’ve never seen a Senior Engineer get fired for not knowing a niche algorithm. I’ve seen plenty get sidelined for being unable to communicate.

LeetCode gets you the interview. Soft skills get you the promotion.

Technical excellence is just the entry fee. The real work is managing expectations, handling friction between stakeholders, and knowing when to say "no" to a feature that hurts the product long-term.

If you can’t explain why you’re choosing FastAPI over Django, or why we’re sticking to React Native instead of a web-view, your code doesn't matter. At the end of the day, we’re building solutions for people, not compilers.

What’s the one "soft" skill that leveled up your career more than any coding framework?

#SoftwareEngineering #TechLeadership #CareerGrowth #EngineeringManagement
Your Django REST API is not slow… it's bloated.

You think you're building a clean API… but you're actually sending a data truck 🚛 when the frontend asked for a bike 🚲

Every request: “Give me username”

You reply with:
• username
• email
• last_login
• permissions
• groups
• everything…

WHY???

And then you say: “Why is my API slow?”

Here’s the real problem:

fields = "__all__"

Looks harmless. Destroys performance silently. 🔥

Fix: send only what matters.

fields = ["id", "username"]

That’s it. Clean. Fast. Efficient. ⚡

Real impact:
• Smaller responses
• Faster APIs
• Better frontend performance

📌 Lesson: Users don’t care how much data you send. They care how FAST you respond.

Stop sending everything. Start sending what’s needed.

Most devs ship data. Top devs ship precision.

#Django #DjangoRESTFramework #API #Backend #WebDevelopment #Performance
𝗛𝗼𝘄 𝗚𝗮𝗿𝗯𝗮𝗴𝗲 𝗖𝗼𝗹𝗹𝗲𝗰𝘁𝗶𝗼𝗻 𝗪𝗼𝗿𝗸𝘀 𝗶𝗻 𝗡𝗼𝗱𝗲.𝗷𝘀

As developers, we often focus on writing efficient code — but what about memory management behind the scenes?

In 𝗡𝗼𝗱𝗲.𝗷𝘀, garbage collection (GC) is handled automatically by the 𝗩𝟴 𝗝𝗮𝘃𝗮𝗦𝗰𝗿𝗶𝗽𝘁 𝗲𝗻𝗴𝗶𝗻𝗲, so you don’t need to manually free memory like in languages such as C or C++. But understanding how it works can help you write more optimized and scalable applications.

𝗞𝗲𝘆 𝗖𝗼𝗻𝗰𝗲𝗽𝘁𝘀:

𝟭. 𝗠𝗲𝗺𝗼𝗿𝘆 𝗔𝗹𝗹𝗼𝗰𝗮𝘁𝗶𝗼𝗻
Whenever you create variables, objects, or functions, memory is allocated in two main areas:
Stack → stores primitive values and references
Heap → stores objects and complex data

𝟮. 𝗚𝗮𝗿𝗯𝗮𝗴𝗲 𝗖𝗼𝗹𝗹𝗲𝗰𝘁𝗶𝗼𝗻 (𝗠𝗮𝗿𝗸-𝗮𝗻𝗱-𝗦𝘄𝗲𝗲𝗽)
V8 uses a technique called Mark-and-Sweep:
* It starts from “root” objects (global scope)
* Marks all reachable objects
* Unreachable objects are considered garbage
* Then it sweeps (removes) them from memory

𝟯. 𝗚𝗲𝗻𝗲𝗿𝗮𝘁𝗶𝗼𝗻𝗮𝗹 𝗚𝗮𝗿𝗯𝗮𝗴𝗲 𝗖𝗼𝗹𝗹𝗲𝗰𝘁𝗶𝗼𝗻
Not all objects live the same lifespan:
Young Generation (New Space) → short-lived objects
Old Generation (Old Space) → long-lived objects
Objects that survive multiple GC cycles get promoted to the Old Generation.

𝟰. 𝗠𝗶𝗻𝗼𝗿 & 𝗠𝗮𝗷𝗼𝗿 𝗚𝗖
Minor GC (Scavenge) → fast cleanup of short-lived objects
Major GC (Mark-Sweep / Mark-Compact) → handles long-lived objects but is more expensive

𝟱. 𝗦𝘁𝗼𝗽-𝘁𝗵𝗲-𝗪𝗼𝗿𝗹𝗱
During GC, execution pauses briefly. Modern V8 minimizes this with optimizations like incremental and concurrent GC.

𝗖𝗼𝗺𝗺𝗼𝗻 𝗠𝗲𝗺𝗼𝗿𝘆 𝗜𝘀𝘀𝘂𝗲𝘀:
* Memory leaks due to unused references
* Global variables holding data unnecessarily
* Closures retaining large objects

𝗕𝗲𝘀𝘁 𝗣𝗿𝗮𝗰𝘁𝗶𝗰𝗲𝘀:
* Avoid global variables
* Clean up event listeners and timers
* Use streams for large data processing
* Monitor memory using tools like Chrome DevTools or `--inspect`

Understanding GC = writing better, faster, and scalable applications

#NodeJS #JavaScript #BackendDevelopment #V8 #Performance #WebDevelopment
I used to think backend = writing APIs.

Create route. Connect DB. Return response.

But while building my project, I realized: writing APIs is easy. Designing how everything works together is hard.

Handling data flow, edge cases, failures — that’s the real backend.

Now I think before coding: “What can go wrong here?”

Small shift, big difference.

How do you approach backend — code first or thinking first?

#NodeJS #BackendDevelopment #LearnInPublic
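The "what can go wrong here?" habit applied to a single function might look like this; getUserById and the in-memory db object are hypothetical stand-ins, not a real framework API.

```javascript
// Sketch: enumerate failure modes before writing the happy path.
async function getUserById(db, id) {
  // 1) Bad input from the caller
  if (!Number.isInteger(id) || id <= 0) {
    return { ok: false, error: "invalid id" };
  }
  try {
    const user = await db.findUser(id);
    // 2) The happy path returns nothing
    if (!user) return { ok: false, error: "not found" };
    return { ok: true, user };
  } catch (err) {
    // 3) The dependency itself fails (network, timeout, ...)
    return { ok: false, error: "db unavailable" };
  }
}

// Tiny in-memory stand-in for a real database client.
const db = {
  findUser: async (id) => (id === 1 ? { id: 1, name: "Ada" } : null),
};

getUserById(db, 1).then((r) => console.log(r.ok));     // true
getUserById(db, -5).then((r) => console.log(r.error)); // "invalid id"
```

Three lines of routing, but three distinct failure modes; that ratio is usually the real backend work.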
3 years ago, I wrote my first API.

It worked. Barely. No error handling. No input validation. Hardcoded values everywhere. I was just happy it returned a 200.

Fast forward to today - I've shipped APIs in production that handled real client data, prevented revenue losses, and one API that directly convinced a client to onboard.

Here's what I wish someone had told me at the start:

1. "It works on my machine" is not done. Done means it works under load, with bad inputs, with network failures, with edge cases you didn't think of. I learned this the hard way.

2. Naming things well is a superpower. The biggest time sink in early code isn't logic - it's trying to understand what past-you was thinking. Write for the next developer, not the compiler.

3. You will touch the database in production. And it will be terrifying the first time. Learn SQL properly. Understand indexes. Respect transactions. I've fixed bugs at the DB level that would have taken down a live client system.

4. Pick boring technology first. I chased new tools early. Then I spent a week building a document processing POC under a tight deadline - and the tools that saved me were the ones I already knew deeply: NestJS and solid API design. Familiarity under pressure is an unfair advantage.

5. Ship something real as fast as you can. Side projects are great. But nothing teaches you faster than code that actual users depend on. The feedback loop is brutal and honest.

The gap between "it works" and "it's production-ready" is where most of the real learning happens.

Still learning. Always will be.

What's one thing you wish you knew when you wrote your first API? Drop it below 👇

#softwaredevelopment #webdevelopment #reactjs #nodejs #apidesign #fullstackdeveloper #devjourney #programming
Today I used Postman for the first time… and realized I was testing my backend completely wrong.

While building APIs for user authentication and house listings in my MERN project, I hit a question that actually made me pause:

👉 “If the frontend isn’t even connected yet… how do I know my backend is actually working?”

That’s when I tried Postman. And suddenly… things started making sense.

✅ Tested my signup API → user stored in DB
✅ Sent request for house upload → verified data persistence
✅ Understood status codes (200, 400, 500) in real scenarios
✅ Caught silly mistakes (wrong JSON, missing fields 😅)

Basically, Postman became my backend testing ground before touching the frontend.

Instead of jumping into UI and guessing where things break, I could:
→ send clean requests
→ inspect exact responses
→ verify data flow
→ catch issues early

That gave me confidence that my APIs were working before writing a single line of frontend code.

💡 Biggest realization: don’t connect your frontend until your API is proven to work.

Day 1 with Postman… and I already see why every backend dev likes it.

What do you use for API testing? Any better alternatives I should try?

#mernstack #backenddevelopment #postman #webdevelopment #learninginpublic #fullstackdeveloper #nodejs #expressjs #mongodb #api #restapi #softwaredevelopment #codinglife #developerlife #programming #techcommunity #buildinpublic #devjourney
I promised — and I delivered.

Here's usePromise: a custom React hook I built that I genuinely believe should be in every developer's project from day one. Let me explain why.

The problem nobody talks about openly: every React developer has written the exact block of code in the image below 👇 hundreds of times. It works. It's familiar. And it's been silently violating the DRY principle across every codebase you've ever touched.

usePromise replaces all of that with a single hook that handles:

✅ Loading, data, and error state — managed via useReducer to prevent async race conditions
✅ Real request cancellation via AbortController (not just ignoring the response — actually aborting the request)
✅ Data transformation at the configuration level with dataMapper
✅ Lifecycle callbacks — onSuccess, onError, onComplete, and isRequestAbortionComplete
✅ executeOnMount support — fire on render without a single useEffect in your component
✅ Full reset capability — return to initial state cleanly

Why not just React Query? React Query is excellent for caching, deduplication, and large-scale data orchestration. But sometimes you want something you fully own — no black boxes, no magic, no dependency debates in code review. usePromise gives you that. It's a foundation you understand end-to-end and can extend however you need.

Why should this be standard? The DRY principle tells us: don't repeat yourself. Async data fetching is the most repeated pattern in every React application in existence. The framework gives us the primitives — useReducer, useCallback, useEffect — but leaves the wiring entirely to us. Every team solves this problem. Most teams solve it inconsistently. This hook is the consistent answer.

Three years in, and the thing I keep coming back to is this: the first few years of your career build the developer you'll be. The habits, the patterns, the defaults you reach for. Reach for clean ones.

Full deep-dive article on Medium, including the complete implementation, the Promise lifecycle explained from first principles, and an honest breakdown of trade-offs: https://lnkd.in/gJWZhQXk

#React #JavaScript #WebDevelopment #Frontend #OpenSource #ReactHooks #CleanCode
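As a rough sketch of the idea (not the article's actual implementation), here is the kind of pure reducer such a hook could sit on. The action names and state shape are assumptions for illustration; the React wiring (useReducer, AbortController) would live around it.

```javascript
// Pure state machine for an async request: idle -> pending -> resolved/rejected.
const initialState = { status: "idle", data: null, error: null };

function promiseReducer(state, action) {
  switch (action.type) {
    case "pending":
      return { status: "pending", data: null, error: null };
    case "resolved":
      // A dataMapper-style transform could be applied to the payload here.
      return { status: "resolved", data: action.payload, error: null };
    case "rejected":
      return { status: "rejected", data: null, error: action.error };
    case "reset":
      return initialState;
    default:
      return state;
  }
}

// Each transition replaces the whole state atomically, so the dispatching
// layer can simply drop a late "resolved" that arrives after an abort:
// that is one way to get the race-condition safety the post describes.
let s = promiseReducer(initialState, { type: "pending" });
s = promiseReducer(s, { type: "resolved", payload: { id: 1 } });
console.log(s.status); // "resolved"
```

Keeping the reducer pure also makes it trivially unit-testable without rendering a component.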
🧵 Buffer vs Stream in Node.js — a concept every developer should understand!

When dealing with data in programming, two fundamental approaches define HOW and WHEN data is processed:

━━━━━━━━━━━━━━━━━━━━━
📦 BUFFER
━━━━━━━━━━━━━━━━━━━━━
A buffer collects ALL data in memory FIRST, then processes it.
Think of it like filling a bucket of water completely before using it.

✅ Simple to use
✅ Easier to manipulate (slice, copy, transform)
❌ High memory usage for large data
❌ Latency — user waits until everything is loaded

Example: reading a full video file into memory before playing it.

━━━━━━━━━━━━━━━━━━━━━
🌊 STREAM
━━━━━━━━━━━━━━━━━━━━━
A stream processes data CHUNK by CHUNK as it arrives.
Think of it like drinking water directly from a tap — no waiting, no bucket.

✅ Low memory footprint
✅ Faster response time (start processing immediately)
✅ Ideal for large files & real-time data
❌ Slightly more complex to implement

Example: Netflix streaming — you watch while it loads.

━━━━━━━━━━━━━━━━━━━━━
🔑 KEY DIFFERENCES
━━━━━━━━━━━━━━━━━━━━━

|                   | Buffer       | Stream                 |
|-------------------|--------------|------------------------|
| Data availability | All at once  | Chunk by chunk         |
| Memory usage      | High         | Low                    |
| Speed             | Slower start | Faster start           |
| Best for          | Small data   | Large / real-time data |

💡 Pro Tip: In Node.js, streams are first-class citizens. Use them when reading/writing large files, handling HTTP requests, or working with real-time pipelines!

Which one do you use more in your day-to-day work? Drop a comment below! 👇

#NodeJS #JavaScript #WebDevelopment #Programming #SoftwareEngineering #BackendDevelopment #CodingTips #TechLearning #Developer #OpenSourceDev
Shipped a major update to LeetTrack -- my full-stack coding progress tracker

Just pushed a big feature drop for LeetTrack, a platform I'm building to help developers track their coding journey across LeetCode, GitHub, GFG, and Codeforces.

What's new:

Interactive Roadmaps -- built a visual, node-graph based roadmap system (think roadmap.sh but integrated with your progress tracker). Currently covering:
- Java Developer Path (75 topics across 18 modules)
- Data Structures & Algorithms (52 topics across 14 modules)
- System Design (53 topics across 12 modules)

Each topic has curated resources (docs, articles, videos, practice problems) and a detail panel. Your progress is tracked per-topic and persists to your account -- check off what you've learned and watch the graph light up.

Under the hood:
- React Flow for interactive node graphs with live edge animations based on progress
- Spring Boot backend with MongoDB for persistent per-user progress tracking
- Auto-seeding roadmap data on server startup (no manual config needed)
- Full mobile + tablet responsive design
- Security hardened -- JWT secret management, OAuth code exchange flow, rate limiting, granular endpoint authorization

Tech Stack: React 19 | Spring Boot 3 | MongoDB | Redis | Tailwind CSS | React Flow | Docker | Render

Building in public. More roadmaps and features coming soon.

Check it out: https://lnkd.in/dVhEJMPC

#buildinpublic #webdevelopment #java #springboot #react #fullstack #leetcode #dsa #systemdesign #mongodb #opensource
🚀 Day 38 – Node.js Core Modules Deep Dive (fs & http)

Today I explored the core building blocks of Node.js by working directly with the File System (fs) and HTTP (http) modules — without using any frameworks. This helped me understand how backend systems actually work behind the scenes.

📁 fs – File System Module
Worked with both asynchronous and synchronous operations.

🔹 Implemented:
• Read, write, append, and delete files
• Create and remove directories
• Sync vs async execution
• Callbacks vs promises (fs.promises)
• Error handling in file operations
• Streams (createReadStream) for large files

🔹 Key Insight: streams process data in chunks, improving performance and memory efficiency.

Real-time use cases:
• Logging systems
• File upload/download
• Config management
• Data processing (CSV/JSON)

🌐 http – Server Creation from Scratch
Built a server using the native http module to understand the request-response lifecycle.

🔹 Explored:
• http.createServer()
• req & res objects
• Manual routing using req.url
• Handling GET & POST methods
• Sending JSON responses
• Setting headers & status codes
• Handling request body using streams

🔹 Key Insight: frameworks like Express are built on top of this.

⚡ Core Concepts Strengthened
✔ Non-blocking I/O → no waiting for file/network operations
✔ Event Loop → efficient handling of concurrent requests
✔ Single-threaded architecture with async capabilities
✔ Streaming & buffering → performance optimization

Real-World Understandings
• How client requests are processed
• How Node.js handles multiple requests
• What happens behind APIs
• Better debugging of backend issues

Challenges Faced
• Managing async flow
• Handling request body streams
• Writing scalable routing without frameworks

🚀 Mini Implementation
✔ File handling using fs
✔ Basic HTTP server
✔ Routing (/home, /about)
✔ JSON response handling

Interview Takeaways
• Sync vs Async in fs
• Streams in Node.js
• Event Loop concept
• req & res usage

#NodeJS #BackendDevelopment #JavaScript #LearningJourney #WebDevelopment #TechGrowth 🚀