Day 92 of me reading random but important dev topics..... Today I read about the modern Fetch API.

If you are still reaching for XMLHttpRequest, or bundling heavy external libraries for simple network calls, it's time to leverage the native power of fetch(). It's modern, versatile, and built directly into all modern browsers. Here is everything every dev needs to know about the anatomy of a fetch request.

1. The Two-Stage Process
Getting a response with fetch() isn't a single step; it's a two-stage promise resolution.

Stage 1: The Headers Arrive
The promise returned by fetch(url) resolves with a Response object the moment the server responds with headers, before the full body downloads. This is where you check the HTTP status.
Note: a 404 or 500 does NOT reject the promise. A fetch promise only rejects on network failures. Always check response.ok (true for statuses 200-299) or response.status!

Stage 2: Reading the Body
To actually get the data, you call an additional promise-based method on the response. Fetch gives you several ways to parse the body:
* response.json() - parses as JSON (most common)
* response.text() - returns raw text
* response.blob() - binary data with a MIME type (e.g. downloading an image)
* response.formData() - multipart/form-data
* response.arrayBuffer() - low-level binary data

2. The "Already Consumed" Trap
Here is a classic gotcha that trips up many developers: you can only read the body once. If you call await response.text() to debug or log the output, and then call await response.json(), your code will fail. The stream has already been consumed!

Summary of a standard GET request:

let response = await fetch('https://lnkd.in/e4utYKVK');

if (response.ok) {
  let data = await response.json();
  console.log(data);
} else {
  console.error("HTTP-Error: " + response.status);
}

Keep Learning!!!!

#JavaScript #WebDevelopment #SoftwareEngineering #FetchAPI #FrontendDev
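Both stages (and the "already consumed" trap) can be reproduced without any server, by building a fake reply with the standard Response constructor. A minimal sketch, assuming a modern browser or Node 18+; the payload is made up:

```javascript
// Simulate a server reply with the standard Response constructor so the
// two-stage flow runs without a network call (the payload is made up).
const response = new Response(JSON.stringify({ user: "ada" }), {
  status: 200,
  headers: { "Content-Type": "application/json" },
});

// Stage 1: status and headers are available before the body is read.
console.log(response.ok, response.status); // true 200

// Stage 2: reading the body is a second, separate promise.
const data = await response.json();
console.log(data.user); // "ada"

// The "already consumed" trap: the body stream is now used up,
// and any further .text()/.json() call on it will reject.
console.log(response.bodyUsed); // true
```

The same two checks (response.ok first, then one body-reading call) apply unchanged to a real fetch().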
Excited to announce Typedrift v1.0.0! 🎉

A new kind of React data-fetching library that eliminates the most painful part of modern full-stack development: "type drift" between your frontend components and backend queries.

What's new in v1.0.0:
- Props = Query: define your component's data needs via TypeScript interfaces; everything else is derived automatically.
- Zero-boilerplate resolvers with view(), batch.one(), batch.many()
- First-class mutations using action(), with Zod validation, guards, and built-in redirects
- Production-ready: middleware, caching, OpenTelemetry, rate limiting, audit logs
- Full React 19 + RSC compatibility
- No codegen, no separate query hooks, no manual data wiring

Why Typedrift? Because the biggest source of bugs in React/Next.js apps isn't bad code, it's out-of-sync types between what your component expects and what the server actually returns. Typedrift makes the component prop shape the single source of truth. The server must conform. Type safety becomes structural.

If you're tired of maintaining query files, fighting stale types, or dealing with the boilerplate of TanStack Query + Zod + manual wiring… this one's for you.

Check it out: https://lnkd.in/e5HRgUgA

Would love your feedback, stars, or contributions!

#React #TypeScript #NextJS #FullStack #DeveloperTools
Coming in Typedrift v1.1.0

Exactly one week ago, I shipped the first version of Typedrift because I got tired of the same old headaches in data fetching for React-driven frameworks.

Data fetching has always felt broken. You define exactly what data a component needs in its props… then duplicate that logic in a separate query. Over time, they drift. You only discover the mismatch at runtime.

Typedrift fixes that. Instead of writing queries, you define a typed view from your model. That view becomes the single source of truth for both fetching on the server and the props your component receives. No more duplication. No drift.

What's new in v1.1.0:
- TanStack Adapter: seamless integration with TanStack Start
- Next.js Adapter: first-class support for App Router and Server Components

Clean, type-safe data fetching with zero boilerplate and zero codegen.

If you've felt the pain of maintaining queries that drift away from your components (especially if you've used Relay, tRPC, or TanStack Query), I'd love your feedback.

Check it out:
- NPM: https://lnkd.in/drSZji_9
- GitHub: https://lnkd.in/e5HRgUgA

What do you think: does this approach solve a real problem for you?

#React #TypeScript #DataFetching #WebDev #NextJS #TanStack
Day 98 of me reading random but important dev topics.... Today I read about aborting requests in fetch.

By default, JavaScript promises don't have a built-in "cancel" button. Once a fetch() is fired, it wants to run to completion, which can eat up bandwidth, cause memory leaks, or create race conditions in your UI if the data arrives after the user has moved on.

Enter the native Web API superhero: AbortController.

Here's how it works under the hood: an AbortController is a simple object with a single method (abort()) and a single property (signal). When you call controller.abort(), the signal object emits an "abort" event and its aborted property becomes true. Because modern fetch is designed to integrate seamlessly with this API, it actively listens for that exact signal!

The standard recipe:
1. Create a new controller instance.
2. Pass its signal as an option to your fetch.
3. Call controller.abort() when the request is no longer needed (e.g. the component unmounts, the user hits a "Cancel" button, or a timeout is reached).

The implementation:

// 1. Initialize the controller
const controller = new AbortController();

// Cancel the request after 1 second
setTimeout(() => controller.abort(), 1000);

try {
  // 2. Pass the signal to fetch
  const response = await fetch('/api/heavy-data', {
    signal: controller.signal
  });
  console.log("Data loaded!", await response.json());
} catch (err) {
  // 3. Handle the AbortError specifically
  if (err.name === 'AbortError') {
    console.log("Request was successfully aborted!");
  } else {
    console.error("Fetch failed:", err);
  }
}

Note: always handle that AbortError in your catch block! When fetch aborts, it intentionally rejects the promise. Catching it specifically ensures it doesn't get logged as a false positive in your error-tracking software (like Sentry or Datadog).

Keep Learning!!!!

#JavaScript #WebDevelopment #FrontendEngineering #SoftwareDevelopment #FetchAPI
Found a fascinating behavior of the Fetch API while debugging a production streaming issue today. If you work with LLMs or real-time data, you need to know this.

The Scenario
I had a backend endpoint that was "polymorphic":
1. If credentials were low, it returned a JSON error.
2. If everything was fine, it returned a text stream (for that snappy AI typing effect).

The Bug
I tried to parse the response as JSON first to check for errors. If that failed, I assumed it was a stream and tried to read it. Result? The stream was empty, and the browser threw a "locked or disturbed" error.

Why does this happen?
A fetch response body isn't just a variable; it's a ReadableStream. Think of it like a one-way conveyor belt:

A. The "lock": once you call .json(), .text(), or .blob(), the browser attaches a "reader" to that conveyor belt. To protect memory, a stream can only have one reader at a time.
B. No rewind: once those bytes are pulled off the belt and turned into a JavaScript object, they are gone. You can't "rewind" the belt to read them again as a stream. The stream is now "disturbed."

The Fix: .clone()
If you need to "peek" at the data without destroying the original stream, use the .clone() method:

const response = await fetch('/api/stream');

// Create an identical twin of the response
const clone = response.clone();

try {
  // Use the clone to "peek" for JSON errors
  const data = await clone.json();
  handleError(data);
} catch {
  // If parsing the clone fails, the ORIGINAL response is still
  // "undisturbed" and ready to be streamed to the UI!
  return response.body.getReader();
}

The Lesson: streams are built for performance and memory efficiency, not flexibility. If you need to read a response body twice, .clone() is your best friend.

#WebDev #JavaScript #Frontend #CodingTips #SoftwareEngineering #Fetch #BrowserAPI
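The lock-and-clone behavior can be reproduced without any server, using the standard Response constructor. A minimal sketch (the JSON payload is made up):

```javascript
// Build a fake "polymorphic" reply locally, then peek at it via a clone.
const original = new Response('{"error":"bad credentials"}');
const clone = original.clone();

// Reading the clone consumes only the clone's copy of the stream...
const peeked = await clone.json();
console.log(peeked.error); // "bad credentials"

// ...so the original body is still unread and fully usable.
console.log(original.bodyUsed); // false
const rawText = await original.text();
console.log(rawText); // '{"error":"bad credentials"}'
```

One caveat from the spec: .clone() must be called before either body has been read, which is exactly why it comes first in the fix above.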
Why my API was slow (and what actually fixed it)

I recently noticed one of my APIs was taking way too long to respond, sometimes 3-4 seconds per request. At first, I thought it was just my code being "messy," but digging deeper taught me a lot.

Here's what I found:
- Too many unnecessary DB calls: I was fetching the same data multiple times instead of reusing it.
- Unoptimized queries: some queries were scanning entire collections instead of using indexes.
- Synchronous loops: I was waiting for each call to finish one by one, instead of running them in parallel.

After making a few changes:
- Added proper indexes
- Used Promise.all for parallel DB calls
- Cached repeated data where possible

Response time went from 3-4 seconds to under 300ms.

The biggest takeaway? Sometimes it's not your code logic, it's how your code talks to the database and handles tasks. Small adjustments can make a huge difference.

#FullStackDeveloper #WebDevelopment #APIDevelopment #BackendDevelopment #NestJS #NextJS #JavaScript #PerformanceOptimization #SoftwareDevelopment
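The sequential-vs-parallel fix is the easiest one to demonstrate. A sketch with setTimeout standing in for the database driver (the query names and delays are made up):

```javascript
// Fake DB call: resolves with `result` after `ms` milliseconds.
const fakeQuery = (result, ms) =>
  new Promise((resolve) => setTimeout(() => resolve(result), ms));

// Sequential awaits: each call blocks the next, total time is the SUM.
let t0 = Date.now();
const a = await fakeQuery("users", 50);
const b = await fakeQuery("orders", 50);
const sequentialMs = Date.now() - t0;

// Promise.all: both calls run at once, total time is roughly the MAX.
t0 = Date.now();
const [users, orders] = await Promise.all([
  fakeQuery("users", 50),
  fakeQuery("orders", 50),
]);
const parallelMs = Date.now() - t0;

console.log({ sequentialMs, parallelMs }); // parallel is roughly half
```

The caveat: Promise.all only helps when the calls are independent; if query B needs query A's result, they are inherently sequential.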
Just published a new video in my Web Development Series 🚀

This time I covered Request Data Handling in Express.js in a very simple and beginner-friendly way. If you are learning backend development, this is a must-know topic.

I explained:
• req.params
• req.query
• req.body
• And built a simple CRUD API

Everything with easy examples so you can understand clearly, even if you're starting for the first time.

Check it out and let me know your feedback 👇
https://lnkd.in/g76RMnBt

#webdevelopment #nodejs #expressjs #backenddevelopment #javascript #coding #learnprogramming
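For readers who want a feel for where this data comes from before watching: req.query is essentially the parsed search string of the request URL, and req.params comes from matching the path against the route pattern. A framework-free sketch using the standard URL class (the route, path, and values are made up):

```javascript
// For a request like GET /users/42?sort=name&limit=10 hitting the
// (hypothetical) Express route "/users/:id":
const url = new URL("http://localhost:3000/users/42?sort=name&limit=10");

// req.query is derived from the search string; values are always strings.
const query = Object.fromEntries(url.searchParams);
console.log(query); // { sort: 'name', limit: '10' }

// req.params is derived from matching the path against the route pattern;
// a naive version of that match for "/users/:id":
const match = url.pathname.match(/^\/users\/([^/]+)$/);
const params = { id: match[1] };
console.log(params); // { id: '42' }
```

req.body is different: it only exists after a body-parsing middleware (express.json(), express.urlencoded()) has read the request stream, which is the usual reason it shows up as undefined.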
Request Data Handling in Express.js | req.params, req.query, req.body Explained | CRUD API
Stop treating Multer as just a "file uploader." 📦

Ever wondered why req.body is empty when you send a file and data, even though the frontend is perfect? 🤔

Most developers add upload.single() to the route because the tutorial said to. But the real work happens in the parsing, not just the storage.

The Technical Reality ⚙️
- The parser gap: standard JSON parsers cannot decode multipart/form-data. This format sends data in a boundary-separated stream that Express doesn't naturally read.
- The "unlock" mechanism: Multer's main job is to intercept that stream 🔓 It is the middleware that fills req.body for your text fields and req.file for your binary data.
- The dependency: without this middleware, your server is essentially blind 👀❌ Not just to the file, but to the entire request payload.

Don't just say you're uploading a file. You are parsing a multipart request to reconstruct data that would otherwise be unavailable to the server.

Understanding the request-response cycle at this level is what separates someone who just codes… from someone who truly builds 🚀

#NodeJS #Backend #ExpressJS #SoftwareEngineering #WebDev
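You can see that boundary-separated stream for yourself without any server: build a FormData body and serialize it with the standard Response class (available in modern browsers and Node 18+; the field name and value are made up). This is the raw wire format Multer has to parse:

```javascript
// Serialize a FormData body the same way a browser would send it.
const form = new FormData();
form.set("username", "ada");

const raw = await new Response(form).text();
console.log(raw);
// Something like (boundary string varies by runtime):
// ----<boundary>
// Content-Disposition: form-data; name="username"
//
// ada
// ----<boundary>--
```

express.json() sees this and gives up, which is exactly why req.body stays empty until a multipart parser like Multer walks the boundaries and reconstructs the fields.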
Count the lines of React 18 code you write for every single form submission:

const [isPending, setIsPending] = useState(false)
const [error, setError] = useState(null)

async function handleSubmit(e) {
  e.preventDefault()
  setIsPending(true)
  try {
    await save()
  } catch (e) {
    setError(e.message)
  } finally {
    setIsPending(false)
  }
}

Now count the React 19 version:

const [state, formAction, isPending] = useActionState(saveAction, null)

One line. Same behavior. Automatic pending state, error handling, and reset. That's what React 19 is: the same React, with the boilerplate removed.

Here's everything that changed:
⚡ Actions + useActionState: async mutations without manual loading state
🌐 Server Actions: call server functions from client components. No custom API routes. Just 'use server'.
🪜 Server Components: render on the server, ship zero JS. Default in Next.js 15.
❤️‍🔥 useOptimistic: instant UI updates before the server responds. Auto-rollback on failure.
⚙️ use() hook: unwrap promises and read context inside loops, conditions, and early returns.
🏠 Native metadata: <title> and <meta> tags from any component. No react-helmet.
❌ No more forwardRef: ref is just a prop in React 19. forwardRef is deprecated.
🔍 Better hydration errors: actual diffs instead of "tree will be regenerated".
🤖 React Compiler: automatic memoization at build time. No more useMemo busywork.

I wrote the complete guide: every new API with real before/after examples, a Server Actions deep dive, and the React 18 → 19 migration steps.

Still on React 18? 👇

#React #JavaScript #Frontend #WebDev #ReactJS #100DaysOfBlogging
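For anyone curious what useActionState is actually doing for you, here is a toy, framework-free sketch of the same pending/error bookkeeping in plain JavaScript. This is not React and not the real implementation; every name in it is made up, it just shows the mechanics the hook automates:

```javascript
// Toy version of the pending/error state machine around an async action.
function wrapAction(action) {
  const state = { isPending: false, error: null, result: null };

  async function run(...args) {
    // Flip the pending flag synchronously, like setIsPending(true).
    state.isPending = true;
    state.error = null;
    try {
      state.result = await action(...args);
    } catch (e) {
      state.error = e.message; // like setError(e.message)
    } finally {
      state.isPending = false; // like the finally block in React 18
    }
    return state;
  }

  return { state, run };
}

// Usage with a stand-in for save():
const saver = wrapAction(async () => 42);
const pending = saver.run(); // state.isPending is true here
await pending;               // now false, result is 42
```

React 19 additionally ties this state machine into rendering and form submission, which is what the plain-JS version cannot show.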
Your API returns JSON and you just JSON.parse() it straight into your app. Congrats, you've just imported a bug you didn't write.

Two issues kill production apps silently: unvalidated payload shapes and big-integer precision loss.

When a backend sends { "id": 9007199254740993 }, JavaScript quietly rounds it. You never notice until a wrong record gets updated.

The fix? Use a library like json-bigint to handle numeric precision at parse time (a plain JSON.parse reviver is too late: it receives the value after precision is already lost). Note that you parse the raw text, not the Response object:

import JSONbig from "json-bigint";

const rawText = await response.text();
const data = JSONbig.parse(rawText);
console.log(data.id.toString()); // "9007199254740993" - exact

For shape validation, run your data through a schema validator like Zod immediately after parsing. Never trust the shape just because it parsed without throwing. JSON.parse only tells you the string is valid JSON; it says nothing about whether the data is what your code expects.

Practical takeaway: treat every JSON.parse call as an untrusted boundary. Validate shape, handle large numbers explicitly, and fail loudly at the edge, not deep inside your business logic.

How are you currently handling big integers or payload validation in production?

#JavaScript #WebDevelopment #Frontend #NodeJS #SoftwareEngineering #CodeQuality
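The rounding itself is one line to reproduce, no library needed: 9007199254740993 is 2^53 + 1, one past Number.MAX_SAFE_INTEGER, and JSON.parse silently snaps it to the nearest representable double:

```javascript
// 2^53 + 1 cannot be represented exactly as a JS number, so JSON.parse
// rounds it down to 2^53 without any warning.
const parsed = JSON.parse('{"id": 9007199254740993}');
console.log(parsed.id); // 9007199254740992, silently off by one

// A cheap guard at the boundary: any integer past MAX_SAFE_INTEGER
// (9007199254740991) fails this check, so you can fail loudly.
console.log(Number.isSafeInteger(parsed.id)); // false
```

The guard catches that a value is untrustworthy but cannot recover the original digits; for that you need string IDs from the backend or a big-number-aware parser as above.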
💡 The "Just a Small Fix" Trap

It's Friday… Project is almost complete… Everything is stable… Weekend is already planned 😌🔥

Then at 4:00 PM 💥 "Just a small fix and a few improvements before release 🙂"

At first: "No problem, quick update." But reality disagrees 😭 That "small fix" becomes a chain reaction…
- API changes because it "can't handle this case"
- Query rewritten because it wasn't ready for real data
- DB indexing added "just in case"
- Frontend becomes "dynamic" 😭
- Security suddenly becomes important
- Edge cases appear everywhere

By the end… nothing is small anymore 😅 And trust in the system is gone.

You deploy it… You go home… 🌙

3:00 AM 💀 Brain: "What if the filter is broken in production?" Then another thought: "What if that query is still slow under load…" 💭

Even after testing 100 times in production… your brain still whispers: "But what if you missed something?" 😭

💡 There is no "small fix"… On Fridays, "small fix" = full module rewrite in disguise 🔥

#SoftwareEngineering #DeveloperLife #ProgrammerHumor #CodingLife #WebDevelopment #BackendDevelopment #Laravel #PHP #FullStackDeveloper #SoftwareDeveloper #Debugging #ProductionIssues #TechHumor #FridayFeeling #WorkLifeBalance #DevLife #CleanCode #SystemDesign #ProgrammingMemes #TechCommunity